Monday, October 28, 2013

Chapter 3, part 5 of n: other term structures

Hello again.

Back to chapter 3 of my book after the detour of last post (the series on chapter 3 started here). This post covers two sections: default-probability term structures and inflation term structures. They are both short enough that splitting them into two posts would have felt like stretching the content a bit too much.

Follow me on Twitter if you want to be notified of new posts, or add me to your circles, or subscribe via RSS: the widgets for that are in the sidebar, at the top right of the page. Also, make sure to check my Training page.

Term structures

Other term structures

So far, the focus of this chapter has been on yield term structures. Of course, other kinds of term structure are implemented in the library. In this section, I'll review them briefly: mostly, I'll point out how they differ from yield term structures and what particular features they sport.

Default-probability term structures

Default-probability term structures are the most similar in design to yield term structures. They can be expressed in terms of default probability, survival probability, default density, or hazard rate; any one of the four quantities can be obtained from any other, much like zero rates and discount factors.

Unlike yield term structures (in which all methods are implemented in terms of the discountImpl method), the base default-probability structure has no single method for the others to build upon. Instead, as shown in listing 3.13, it declares two abstract methods, survivalProbabilityImpl and defaultDensityImpl. It's left to derived classes to decide which one should be written in terms of the other; the base class implements survivalProbability and defaultDensity based on the respective implementation methods (their implementation is not as simple as shown, of course; here I omitted range checking and extrapolation for clarity), defaultProbability trivially based on survivalProbability, and hazardRate in terms of both survival probability and default density.

Listing 3.13: Sketch of the DefaultProbabilityTermStructure class.
    class DefaultProbabilityTermStructure : public TermStructure {
      public:
        // ...constructors...

        Probability survivalProbability(Time t) const {
            return survivalProbabilityImpl(t);
        }

        Probability defaultProbability(Time t) const {
            return 1.0 - survivalProbability(t);
        }
        Probability defaultProbability(Time t1,
                                       Time t2) const {
            Probability p1 = defaultProbability(t1),
                        p2 = defaultProbability(t2);
            return p2 - p1;
        }

        Real defaultDensity(Time t) const {
            return defaultDensityImpl(t);
        }

        Rate hazardRate(Time t) const {
            Probability S = survivalProbability(t);
            return S == 0.0 ? 0.0 : defaultDensity(t)/S;
        }

        // ...other methods...
      protected:
        virtual Probability survivalProbabilityImpl(Time) const = 0;
        virtual Real defaultDensityImpl(Time) const = 0;
      private:
        // ...data members...
    };
Listing 3.14 sketches the adapter classes that, as for yield term structures, allow one to define a new default-probability structure in terms of a single quantity of one's choice—either survival probability, default density, or hazard rate (the default probability is so closely related to survival probability that we didn't think it necessary to provide a corresponding adapter). The first one, SurvivalProbabilityStructure, defines defaultDensityImpl in terms of the implementation of survivalProbabilityImpl, which is left purely virtual and must be provided in derived classes; the second one, DefaultDensityStructure, does the opposite; and the last one, HazardRateStructure, defines both survival probability and default density in terms of a newly-declared purely abstract hazardRateImpl method.

Unfortunately, some of the adapters rely on numerical integration in order to provide conversions among the desired quantities. The provided implementations use dark magic in both maths and coding (namely, Gaussian quadratures and boost::bind) to perform the integrations efficiently; but when inheriting from such classes, you should consider overriding the adapter methods if a closed formula is available for the integral.

Listing 3.14: Adapter classes for default-probability term structures.
    class SurvivalProbabilityStructure
        : public DefaultProbabilityTermStructure {
      public:
        // ...constructors...
      protected:
        Real defaultDensityImpl(Time t) const {
            // returns minus the derivative of the survival probability at t
        }
    };

    class DefaultDensityStructure
        : public DefaultProbabilityTermStructure {
      public:
        // ...constructors...
      protected:
        Probability survivalProbabilityImpl(Time t) const {
            // 1 minus the integral of the default density from 0 to t
        }
    };

    class HazardRateStructure
        : public DefaultProbabilityTermStructure {
      public:
        // ...constructors...
      protected:
        virtual Real hazardRateImpl(Time) const = 0;
        Probability survivalProbabilityImpl(Time t) const {
            // exp(-I); I is the integral of the hazard rate from 0 to t
        }
        Real defaultDensityImpl(Time t) const {
            return hazardRateImpl(t)*survivalProbabilityImpl(t);
        }
    };
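As an example of the advice above about overriding the adapter methods when a closed formula is available, here is a minimal sketch of a curve with a constant hazard rate (the class and member names are illustrative, not an actual library class). Its survival probability has the closed form exp(-lambda*t), so there's no need to fall back on numerical integration:
    class FlatHazardRateCurve : public HazardRateStructure {
      public:
        // ...constructors storing lambda_...
      protected:
        Real hazardRateImpl(Time) const { return lambda_; }
        // override the adapter's numerical integration with the
        // closed-form survival probability exp(-lambda*t)
        Probability survivalProbabilityImpl(Time t) const {
            return std::exp(-lambda_*t);
        }
      private:
        Rate lambda_;
    };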
Like for yield curves, the library provides a few template classes that implement the adapter interfaces by interpolating discrete data, as well as a generic piecewise default-probability curve template and the traits required to select its underlying quantity. Together with the existing interpolation traits, this allows one to instantiate classes such as PiecewiseDefaultCurve<DefaultDensity,Linear>. The implementation is quite similar to the one described in this post; in fact, so similar that it's not worth describing here. The only thing worth noting is that the default-probability structure is not self-sufficient: in order to price the instruments required for its bootstrap, a discount curve is needed (then again, the same is true of LIBOR curves in today's multiple-curve settings). You'll have to be consistent and use the same curve for your pricing engines; otherwise, you might suddenly find out that your CDS are no longer at the money.
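
As a usage sketch (hedged, since the setup is not shown here: helpers stands for a vector of CDS helpers built against a given discount curve, and referenceDate for the curve date), the instantiation might look like this:
    shared_ptr<DefaultProbabilityTermStructure> probabilities(
        new PiecewiseDefaultCurve<HazardRate,BackwardFlat>(
                           referenceDate, helpers, Actual365Fixed()));
The same discount curve used to build the helpers should then be passed to the pricing engines, for the consistency reasons mentioned above.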

Aside: Cinderella method

In the implementation of DefaultProbabilityTermStructure, you've probably noticed another symmetry break like the one discussed in a previous post. There's a difference, though: in that case, discount factors were singled out to be given a privileged role. In this case, hazard rates are singled out to play the mistreated stepsister; there's no hazardRateImpl alongside the similar methods declared for survival probability and default density. Again, a look at past versions of the code shows that it was once symmetric; and again, I can give no reason for the change. I'm sure it looked like a good idea at the time.

The effect is that classes deriving from the HazardRateStructure adapter must go through some hoops to return hazard rates: since the public hazardRate method has no hazardRateImpl to call, it falls back on the default implementation and returns the ratio of default density and survival probability (possibly performing an integration along the way). Unfortunately, even our fairy godmother can't change this now without the risk of breaking existing code.

Inflation term structures

Inflation term structures have a number of features that set them apart from the term structures I described so far. Not surprisingly, most such features add complexity to the provided classes.

The most noticeable difference is that we have two separate kinds of inflation term structures (and two different interfaces) instead of a single one. The library does provide a single base class, InflationTermStructure, which contains some inspectors and some common behavior; however, the interfaces returning the actual inflation rates are declared in two separate child classes, leading to the hierarchy sketched in listing 3.15. The two subclasses model zero-coupon and year-on-year inflation rates, which are not easily converted into one another and thus foil our usual multiple-adapter scheme.

Listing 3.15: Sketch of the InflationTermStructure class and its children.
    class InflationTermStructure : public TermStructure {
      public:
        // ...constructors...

        virtual Date baseDate() const = 0;
        virtual Rate baseRate() const;
        virtual Period observationLag() const;

        Handle<YieldTermStructure> nominalTermStructure() const;

        void setSeasonality(const shared_ptr<Seasonality>&);
      protected:
        Handle<YieldTermStructure> nominalTermStructure_;
        Period observationLag_;
        // ...other data members...
    };

    class ZeroInflationTermStructure
        : public InflationTermStructure {
      public:
        // ...constructors...
        Rate zeroRate(const Date &d,
                      const Period& instObsLag = Period(-1,Days),
                      bool forceLinearInterpolation = false,
                      bool extrapolate = false) const;
      protected:
        virtual Rate zeroRateImpl(Time t) const = 0;
    };

    class YoYInflationTermStructure
        : public InflationTermStructure {
      public:
        // ...constructors...
        Rate yoyRate(const Date &d,
                     const Period& instObsLag = Period(-1,Days),
                     bool forceLinearInterpolation = false,
                     bool extrapolate = false) const;
      protected:
        virtual Rate yoyRateImpl(Time time) const = 0;
    };
This state of things has both advantages and disadvantages; possibly more of the latter. On the one hand, it leads to a pair of duplicated sub-hierarchies, which is obviously a smell. (It can get worse. Until now, we haven't considered period-on-period rates with a frequency other than annual. Hopefully, they will only lead to a generalization of the year-on-year curve.) On the other hand, it simplifies the sub-hierarchies a bit; for instance, there are no adapter classes, since each kind of term structure has only one underlying quantity (that is, either zero-coupon rates or year-on-year rates).

Other differences are due to the specific quirks of inflation fixings. Since inflation figures for a given month are announced after an observation lag, inflation term structures have a base date, as well as a reference date; the base date is the one corresponding to the latest announced fixing. If an inflation figure is needed for a date in the past with respect to today's date but after the base date, it must be forecast. (You might remember that, at the beginning of this chapter, I obscurely suggested that the future doesn't always begin at the reference date. This is the exception I was referring to.) Also, since inflation fixings are affected by seasonality, inflation term structures provide the means to store an instance of the polymorphic Seasonality class (which for brevity I won't describe). If given, such an instance models the desired seasonal correction for the returned inflation rates, in a plain implementation of the Strategy pattern.

Unlike other kinds of term structure, the inflation interface doesn't provide methods taking a time value instead of a date; the whole fixing machinery depends on a number of date calculations (what month we're in, the corresponding period for the fixing, and whatnot) and there's simply no reliable way to convert from times to dates, so we called the whole thing off.

Yet another difference is that inflation term structures store a discount curve. Yes, I made the same face as you when I realized it—and shortly before the 1.0 release, too, when the interfaces would have to be frozen. No, let me rephrase that: I made a worse face than you. In the end, though, I left it alone. It's not the design we used for default-probability structures; but at least it has the advantage that a bootstrapped inflation term structure carries around the discount curve it used, so one doesn't have to keep track of their association to be consistent. We can choose the best design for release 2.0, when we have used both for a few years; so all's well that ends well, I guess. Still, I made a mental note to perform code reviews more often.

The remaining machinery (interpolated curves, piecewise bootstrap etc.) is similar to what I described for other kinds of curves. Due to the separate sub-hierarchies for zero-coupon and year-on-year rates, though, there are a couple of differences. On the one hand, there are no adapter classes that implement the whole interface based on a single underlying quantity—such as, say, ZeroStructure for discounts or HazardRateStructure for default probabilities. On the other hand, piecewise curves don't use traits to select the underlying quantity (as, for instance, in PiecewiseYieldCurve<Discount,LogLinear>). To select one or the other, you'll have to choose the appropriate class and write something like PiecewiseZeroInflationCurve<Linear> instead.


Monday, October 21, 2013

Odds and ends: interest rates

Welcome back.

This post marks a break in the series on chapter 3 which started in this post and ran for the past few weeks. It covers interest rates, which are relevant to its subject but are not technically part of the chapter.

The idea is that a number of basic classes are glossed over in the main part of the book and are described in an appendix instead, so they don't interrupt the flow of the chapters. As pointed out by Douglas Adams in the fourth book of his Hitchhiker trilogy (no, that's right: it's an inaccurately-named trilogy of five books; it's a long story):
"[An excessive amount of detail] is guff. It doesn't advance the action. It makes for nice fat books such as the American market thrives on, but it doesn't actually get you anywhere."
I'm not sure that this is still the case with the blog form. I'm already cutting up every chapter into several posts, so the additional content might follow one of those. We'll see: I'm still new at this blogging thing, so I'm likely to experiment in the future.

Follow me on Twitter if you want to be notified of new posts, or add me to your circles, or subscribe via RSS: the widgets for that are in the sidebar, at the top right of the page. Also, make sure to check my Training page.

Odds and ends: interest rates

The InterestRate class (shown in listing A.6) encapsulates general interest-rate calculations. Instances of this class are built from a rate, a day-count convention, a compounding convention, and a compounding frequency (note, though, that the value of the rate is always annualized, whatever the frequency). This allows one to specify rates such as "5%, actual/365, continuously compounded" or "2.5%, actual/360, semiannually compounded." As can be seen, the frequency is not always needed. I'll return to this later.

Listing A.6: Outline of the InterestRate class.
    enum Compounding { Simple,              // 1+rT
                       Compounded,          // (1+r)^T
                       Continuous,          // e^{rT}
                       SimpleThenCompounded
    };

    class InterestRate {
      public:
        InterestRate(Rate r,
                     const DayCounter&,
                     Compounding,
                     Frequency);
        // inspectors
        Rate rate() const;
        const DayCounter& dayCounter() const;
        Compounding compounding() const;
        Frequency frequency() const;
        // automatic conversion
        operator Rate() const;
        // implied discount factor and compounding after a given time
        // (or between two given dates)
        DiscountFactor discountFactor(Time t) const;
        DiscountFactor discountFactor(const Date& d1,
                                      const Date& d2) const;
        Real compoundFactor(Time t) const;
        Real compoundFactor(const Date& d1,
                            const Date& d2) const;
        // other calculations
        static InterestRate impliedRate(Real compound,
                                        const DayCounter&,
                                        Compounding,
                                        Frequency,
                                        Time t);
        ... // same with dates
        InterestRate equivalentRate(Compounding,
                                    Frequency,
                                    Time t) const;
        ... // same with dates
    };
Besides the obvious inspectors, the class provides a number of methods. One is the conversion operator to Rate, i.e., to double. In hindsight, this is kind of risky, as the converted value loses any day-count and compounding information; this might allow, say, a simply-compounded rate to slip undetected where a continuously-compounded one was expected. The conversion was added for backward compatibility when the InterestRate class was first introduced; it might be removed in a future revision of the library, depending on the level of safety we want to force on users. (There are different views on safety among the core developers, ranging from "babysit the user and don't let him hurt himself" to "give him his part of the inheritance, pat him on his back, and send him to find his place in the world.")

Other methods complete a basic set of calculations. The compoundFactor method returns the unit amount compounded for a time t (or equivalently, between two dates d1 and d2) according to the given interest rate; the discountFactor method returns the discount factor between two dates or for a time, i.e., the reciprocal of the compound factor; the impliedRate method returns a rate that, given a set of conventions, yields a given compound factor over a given time; and the equivalentRate method converts a rate to an equivalent one with different conventions (that is, one that results in the same compounded amount).
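
As a quick usage sketch based on the interface above (the comments spell out the formulas being applied):
    InterestRate r(0.05, Actual365Fixed(), Compounded, Semiannual);
    Real c = r.compoundFactor(2.0);            // (1 + 0.05/2)^(2*2)
    DiscountFactor d = r.discountFactor(2.0);  // 1/c
    // the continuously-compounded rate giving the same compound
    // factor over the same two years, i.e., log(c)/2
    InterestRate rc = r.equivalentRate(Continuous, NoFrequency, 2.0);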

Like the InterestRate constructor, some of these methods take a compounding frequency. As I mentioned, this doesn't always make sense; and in fact, the Frequency enumeration has a NoFrequency item just to cover this case.

Obviously, this is a bit of a smell. Ideally, the frequency should be associated only with those compounding conventions that need it, and left out entirely for those (such as Simple and Continuous) that don't. If C++ supported it, we would write something like
    enum Compounding { Simple,
                       Compounded(Frequency),
                       Continuous,
                       SimpleThenCompounded(Frequency)
    };
which would be similar to algebraic data types in functional languages, or case classes in Scala (both support pattern matching on an object, which is like a neater switch on steroids; go have a look when you have some time) but unfortunately that's not an option. To have something of this kind, we'd have to go for a full-featured Strategy pattern and turn Compounding into a class hierarchy. That would probably be overkill for the needs of this class, so we're keeping both the enumeration and the smell.
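
Just to give an idea of what that would entail, here is a purely hypothetical sketch of a Strategy-based Compounding hierarchy; none of these classes exist in the library, and the names are made up:
    class CompoundingConvention {
      public:
        virtual ~CompoundingConvention() {}
        virtual Real compoundFactor(Rate r, Time t) const = 0;
    };

    class SimpleCompounding : public CompoundingConvention {
      public:
        // 1 + rT; no frequency needed
        Real compoundFactor(Rate r, Time t) const { return 1.0 + r*t; }
    };

    class PeriodicCompounding : public CompoundingConvention {
      public:
        // (1 + r/f)^(fT); the frequency lives only where it's needed
        explicit PeriodicCompounding(Frequency f) : f_(f) {}
        Real compoundFactor(Rate r, Time t) const {
            return std::pow(1.0 + r/Real(f_), Real(f_)*t);
        }
      private:
        Frequency f_;
    };
Each InterestRate instance would then hold a pointer to one of these strategies, which is rather a lot of scaffolding for what the enumeration does in a couple of lines.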


Monday, October 14, 2013

Chapter 3, part 4 of n: adding z-spread to an interest-rate curve

Hello again.

This post is the fourth in an ongoing series covering chapter 3 of my book and starting here. It's a short one, to make up for the monster post of last week.

As I already mentioned, IKB, Quaternion and d-fine are organizing a two-day QuantLib workshop in Düsseldorf on November 13th and 14th. The agenda is here; if you're interested, send an email to the contacts listed there. Registration is free; there are just 35 places, though.

Follow me on Twitter if you want to be notified of new posts, or add me to your circles, or subscribe via RSS: the widgets for that are in the sidebar, at the top right of the page. Also, make sure to check my Training page.

Term structures

Example: adding z-spread to an interest-rate curve

This example (a lot simpler than the previous one) shows how to build a term structure based on another one. We'll take an existing risk-free curve and modify it to include credit risk. The risk is expressed as a z-spread, i.e., a constant spread to be added to the zero-yield rates. For the pattern-savvy, this is an application of the Decorator pattern; we'll wrap an existing object, adding some behavior and delegating the rest to the original instance.

Listing 3.12: Implementation of the ZeroSpreadedTermStructure class.
  class ZeroSpreadedTermStructure : public ZeroYieldStructure {
    public:
      ZeroSpreadedTermStructure(
                        const Handle<YieldTermStructure>& h,
                        const Handle<Quote>& spread)
      : originalCurve_(h), spread_(spread) {
          registerWith(originalCurve_);
          registerWith(spread_);
      }
      const Date& referenceDate() const {
          return originalCurve_->referenceDate();
      }
      DayCounter dayCounter() const {
          return originalCurve_->dayCounter();
      }
      Calendar calendar() const {
          return originalCurve_->calendar();
      }
      Natural settlementDays() const {
          return originalCurve_->settlementDays();
      }
      Date maxDate() const {
          return originalCurve_->maxDate();
      }
    protected:
      Rate zeroYieldImpl(Time t) const {
          InterestRate zeroRate =
              originalCurve_->zeroRate(t, Continuous,
                                       NoFrequency, true);
          return zeroRate + spread_->value();
      }
    private:
      Handle<YieldTermStructure> originalCurve_;
      Handle<Quote> spread_;
  };
The implementation of the ZeroSpreadedTermStructure is shown in listing 3.12. As previously mentioned, it is based on zero-yield rates; therefore, it inherits from the ZeroYieldStructure adapter described here and will have to implement the required zeroYieldImpl method. Not surprisingly, its constructor takes as arguments the risk-free curve to be modified and the z-spread to be applied; to allow switching data sources, both are passed as handles. The arguments are stored in the corresponding data members, and the curve registers with both of them as an observer. The update method inherited from the base class will take care of forwarding any received notifications.

Note that none of the base-class constructors was called explicitly. As you might remember from this post, this means that instances of our curve store no data that can be used by the TermStructure machinery; therefore, the class must provide its own implementation of the methods related to reference-date calculation. In true Decorator fashion, this is done by delegating behavior to the wrapped object; each of the referenceDate, dayCounter, calendar, settlementDays, and maxDate methods forwards to the corresponding method in the risk-free curve.

Finally, we can implement our own specific behavior—namely, adding the z-spread. This is done in the zeroYieldImpl method; we ask the risk-free curve for the zero-yield rate at the required time (continuously compounded, since that's what our method must return), add the value of the z-spread, and return the result as the new zero-yield rate. The machinery of the ZeroYieldStructure adapter will take care of the rest, giving us the desired risky curve.
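
Using the decorated curve then takes just a few lines; a minimal sketch, assuming riskFreeCurve is an existing Handle<YieldTermStructure>:
  Handle<Quote> spread(shared_ptr<Quote>(new SimpleQuote(0.002)));  // 20 bps
  shared_ptr<YieldTermStructure> riskyCurve(
      new ZeroSpreadedTermStructure(riskFreeCurve, spread));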


Monday, October 7, 2013

Chapter 3, part 3 of n: bootstrapping an interest-rate curve

Welcome back.

This post is the third in a series covering chapter 3 of my book and starting here. It's a rather large example in which I dissect the code used for bootstrapping an interest-rate curve.

A bit of news: IKB, Quaternion and d-fine are organizing a QuantLib workshop in Düsseldorf on November 13th and 14th, and they were kind enough to ask me to give the keynote on the 13th. As you can guess, it's flattering and scary at the same time. Possibly more scary than flattering.

The perks of this will be, first, that I get to meet in person for the first or second time a couple of people that I've worked with on the library, sometimes for years; and second, that I have the opportunity to collect a few data points about the way QuantLib is used in the industry. I'll try to share with you some of the goodies (not the beers, of course: those will stay in Düsseldorf). I'll report on this blog on what I see, and I'll ask the speakers to put their material on the QuantLib site.

Follow me on Twitter if you want to be notified of new posts, or add me to your circles, or subscribe via RSS: the widgets for that are in the sidebar, at the top right of the page. Also, make sure to check my Training page.

Term structures

Example: bootstrapping an interpolated yield curve

In this section, we'll build an all-purpose yield-curve template. Building upon the classes described in the previous subsections, we'll give it the ability to interpolate in a number of ways on either discount factors, zero yields, or instantaneous forward rates. The nodes of the curve will be bootstrapped from quoted—and possibly varying—market rates.

Needless to say, this example is a fairly complex one. The class template I'll describe (PiecewiseYieldCurve) makes use of a few helper classes, as well as a few template tricks. I'll try and explain all of them as needed.

Listing 3.7: Implementation of the PiecewiseYieldCurve class template.
template <class Traits, class Interpolator,
          template <class> class Bootstrap = IterativeBootstrap>
class PiecewiseYieldCurve
    : public Traits::template curve<Interpolator>::type,
      public LazyObject {
  private:
    typedef typename Traits::template curve<Interpolator>::type base_curve;
    typedef PiecewiseYieldCurve<Traits,Interpolator,Bootstrap> this_curve;
    typedef typename Traits::helper helper;
  public:
    typedef Traits traits_type;
    typedef Interpolator interpolator_type;
    PiecewiseYieldCurve(
           Natural settlementDays,
           const Calendar& calendar,
           const std::vector<shared_ptr<helper> >& instruments,
           const DayCounter& dayCounter,
           Real accuracy = 1.0e-12,
           const Interpolator& i = Interpolator());

    // inspectors not shown

    void update();
  private:
    void performCalculations() const;
    DiscountFactor discountImpl(Time) const;
    std::vector<shared_ptr<helper> > instruments_;
    Real accuracy_;

    friend class Bootstrap<this_curve>;
    friend class BootstrapError<this_curve>;
    Bootstrap<this_curve> bootstrap_;
};
The implementation of our class template is shown in listing 3.7. The class takes three template arguments. The first two determine the choice of the underlying data and the interpolation method, respectively; our goal is to instantiate the template as, say,
    PiecewiseYieldCurve<Discount,LogLinear>
or some such combination. I'll refer to the first parameter as the bootstrap traits; the second is the interpolator already described in a previous post. The third, and seemingly ungainly, parameter specifies a class which implements the bootstrapping algorithm. The parameter has a default value (the IterativeBootstrap class, which I'll describe later) so you can happily forget about it most of the time; it is provided so that interested developers can replace the bootstrapping algorithm with another one, either provided by the library or of their own creation. If you're not familiar with its syntax, it's a template template parameter [1]. The repetition is not an error, whatever my spell checker suspects as I write these lines; it means that the template parameter should, in turn, be an uninstantiated class template (in this case, one taking a single template argument) rather than a typename. (An uninstantiated template is something like std::vector, as opposed to an instantiated one like std::vector<double>. The second names a type; the first doesn't.)
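
If the syntax is new to you, here is a minimal, QuantLib-free illustration of a template template parameter:
    template <class T>
    class MyContainer { /* ... */ };

    template <template <class> class Container>
    class Holder {
        Container<int> data_;     // the template is instantiated here
    };

    Holder<MyContainer> h;        // pass the uninstantiated template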

Before declaring the class interface, we need another bit of template programming to determine the parent class of the curve. On the one hand, we inherit our term structure from the LazyObject class; as described here, this will enable the curve to re-bootstrap itself when needed. On the other hand, we want to inherit it from one of the interpolated curves described here; depending on the choice of underlying data (discount, zero yields, or forward rates), we must select one of the available class templates and instantiate it with the chosen interpolator class. This is done by storing the class template in the bootstrap traits. Unluckily, C++ didn't allow template typedefs until recently (template aliases were introduced in the 2011 revision of the C++ standard). Therefore, the traits define an inner class template curve which takes the interpolator as its template parameter and defines the instantiated parent class as a typedef; this can be seen in the definition of the Discount traits, partially shown in listing 3.8.

Listing 3.8: Sketch of the Discount bootstrap traits.
    struct Discount {
        template <class Interpolator>
        struct curve {
            typedef InterpolatedDiscountCurve<Interpolator> type;
        };
        typedef BootstrapHelper<YieldTermStructure> helper;

        // other static methods
    };
The described machinery allows us to finally refer to the chosen class by using the expression
    Traits::template curve<Interpolator>::type
that we can add to the list of parent classes. (For those unfamiliar with the dark corners of template syntax, the template keyword in the expression is a hint for the compiler. When reading this expression, the compiler doesn't know what Traits is and has no means to determine that Traits::curve is a class template. Adding the keyword gives it the information, required for processing the rest of the expression correctly.) For the instantiation shown above as an example, with Discount as bootstrap traits and LogLinear as interpolator, the above works out as
    Discount::curve<LogLinear>::type
which in turn corresponds—as desired—to
    InterpolatedDiscountCurve<LogLinear>
as can be seen from the definition of the Discount class.

We make a last short stop before we finally implement the curve; in order to avoid long template expressions, we define a few typedefs, namely, base_curve, this_curve, and helper. The first one refers to the parent class; the second one refers to the very class we're declaring; and the third one extracts from the bootstrap traits the type of a helper class. This class will be described later in the example; for the time being, I'll just say that it provides the quoted value of an instrument, as well as the means to evaluate such instrument on the term structure being bootstrapped. The aim of the bootstrap will be to modify the curve until the two values coincide. Finally, two other typedefs (traits_type and interpolator_type) store the two corresponding template arguments so that they can be retrieved later.

The actual interface, at last. The constructors (of which only one is shown in the listing) take the arguments required for instantiating the parent interpolated curve, as well as a vector of helpers and the target accuracy for the bootstrap. Next, a number of inspectors (such as times or data) are defined, overriding the versions in the parent class; as we'll see, this allows them to ensure that the curve is fully built before returning the required data. The public interface is completed by the update method.

The private methods include performCalculations, needed to implement the LazyObject interface; and discountImpl, which (like the public inspectors) overrides the parent-class version. The Bootstrap class template is instantiated with the type of the curve being defined. Since it will need access to the internals of the curve, the resulting class is declared as a friend of the PiecewiseYieldCurve class; the same is done for the BootstrapError class, used in the bootstrap algorithm and described later. Finally, we store an instance of the bootstrap class, as well as the required curve accuracy and the helpers.

Listing 3.7 (continued)
template <class T, class I,
          template <class> class B>
PiecewiseYieldCurve<T,I,B>::PiecewiseYieldCurve(
           Natural settlementDays,
           const Calendar& calendar,
           const std::vector<shared_ptr<helper> >& instruments,
           const DayCounter& dayCounter,
           Real accuracy,
           const I& i)
: base_curve(settlementDays, calendar, dayCounter, i),
  instruments_(instruments), accuracy_(accuracy) {
    bootstrap_.setup(this);
}

template <class T, class I,
          template <class> class B>
void PiecewiseYieldCurve<T,I,B>::update() {
    base_curve::update();
    LazyObject::update();
}

template <class T, class I,
          template <class> class B>
void PiecewiseYieldCurve<T,I,B>::performCalculations() const {
    bootstrap_.calculate();
}

template <class T, class I,
          template <class> class B>
DiscountFactor PiecewiseYieldCurve<T,I,B>::discountImpl(Time t) const {
    calculate();
    return base_curve::discountImpl(t);
}
Let's now have a look at the implementation. The constructor holds no surprises: it passes the needed arguments to the base class and stores the others in this class. Finally, it passes the curve itself to the stored Bootstrap instance and makes it perform some preliminary work—more on this later. The update method is needed for disambiguation; we are inheriting from two classes (LazyObject and TermStructure) which both define their implementation of the method. The compiler justly refuses to second-guess us, so we have to explicitly call both parent implementations.

The performCalculations method simply delegates its work to the Bootstrap instance. Lastly, a look at the discountImpl method shows us why it had to be overridden: before calling the parent-class implementation, it has to ensure that the data were bootstrapped by calling the calculate method. This also holds for the other overridden inspectors, all following the same pattern. Instead, there's no need to override the zeroYieldImpl and forwardImpl methods, even if the curve inherits from ZeroYieldStructure or ForwardRateStructure; if you look back at listing 3.5 here, you'll see that those methods are only ever called by discountImpl. Thus, overriding the latter is enough.

At this point, I need to describe the BootstrapHelper class template; its interface is sketched in listing 3.9. For our curve, we'll instantiate it (as you can see in the Discount traits shown earlier) as BootstrapHelper<YieldTermStructure>; for convenience, the library provides an alias to this class called RateHelper that can be used in place of the more verbose type name.
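
In code, the alias amounts to something like the following typedef:
    typedef BootstrapHelper<YieldTermStructure> RateHelper;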

Listing 3.9: Interface of the BootstrapHelper class template.
    template <class TS>
    class BootstrapHelper : public Observer, public Observable {
      public:
        BootstrapHelper(const Handle<Quote>& quote);
        virtual ~BootstrapHelper() {}

        Real quoteError() const;
        const Handle<Quote>& quote() const;
        virtual Real impliedQuote() const = 0;

        virtual void setTermStructure(TS*);

        virtual Date latestDate() const;

        virtual void update();
      protected:
        Handle<Quote> quote_;
        TS* termStructure_;
        Date latestDate_;
    };
Each instance of this class—or rather, of derived classes; the class itself is an abstract one—will help bootstrapping a single node on the curve. The input datum for the node is the quoted value of an instrument; this is provided as a Handle to a Quote instance, since the value will change in time. For a yield curve, such instruments might be deposits or swaps, quoted as the corresponding market rates. For each kind of instrument, a derived class must be provided.

The functionality that is common to all helpers is implemented in the base class. BootstrapHelper inherits from both Observer and Observable; the double role allows it to register with the market value and notify changes to the curve, signaling the need to perform a new bootstrap. Its constructor takes a Handle<Quote> providing the input market value, stores it as a data member, and registers itself as an observer. Three methods deal with the underlying instrument value. The quote method returns the handle containing the quoted value; the abstract impliedQuote method returns the value as calculated on the curve being bootstrapped; and the convenience method quoteError returns the signed difference between the two values.

Two more methods are used for setting up the bootstrap algorithm. The setTermStructure method links the helper with the curve being built. The latestDate method returns the latest date for which curve data are required in order to calculate the implied value of the market datum; such date will be used as the coordinate of the node being bootstrapped. (The latest required date does not necessarily correspond to the maturity of the instrument. For instance, if the instrument were a constant-maturity swap, the curve should extend a few years beyond the swap maturity in order to forecast the rate paid by the last coupon.) The last method, update, forwards notifications from the quote to the observers of the helper.

The library provides a few concrete helper classes inherited from RateHelper. In the interest of brevity, allow me to do a little hand-waving here instead of showing you the actual code. Each of the helper classes implements impliedQuote for a specific instrument and includes code for returning the proper latest date. For instance, the DepositRateHelper class forecasts a quoted deposit rate by asking the curve being bootstrapped for the forward rate between its start and maturity dates; whereas the SwapRateHelper class forecasts a swap rate by instantiating a Swap object, pricing it on the curve, and implying its fair rate. In a multi-curve setting, it's also possible to use the bootstrapped curve for forecasting forward rates and an external curve for discounting. If you're interested, more details on this (and a discussion on what helpers to choose for a given curve) are available in Ametrano and Bianchetti [2].
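
To give a flavor of the code I just hand-waved over, here is a simplified sketch of a deposit-like helper; the class and member names are made up, and the real DepositRateHelper also deals with fixing days, calendars and business-day conventions:
    class SimpleDepositHelper : public RateHelper {
      public:
        // ...constructor storing the dates and the day counter...
        Real impliedQuote() const {
            // the deposit rate as forecast by the curve being built;
            // extrapolation is allowed since the curve might not yet
            // extend up to the maturity date
            return termStructure_->forwardRate(start_, maturity_,
                                               dayCounter_, Simple,
                                               Annual, true).rate();
        }
        Date latestDate() const { return maturity_; }
      private:
        Date start_, maturity_;
        DayCounter dayCounter_;
    };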

We can finally dive into the bootstrap code. Listing 3.10 shows the interface of the IterativeBootstrap class, which is provided by the library and used by default.

Listing 3.10: Sketch of the IterativeBootstrap class template.
template <class Curve>
class IterativeBootstrap {
    typedef typename Curve::traits_type Traits;
    typedef typename Curve::interpolator_type Interpolator;
  public:
    IterativeBootstrap();
    void setup(Curve* ts);
    void calculate() const;
  private:
    Curve* ts_;
};

template <class Curve>
void IterativeBootstrap<Curve>::calculate() const {
    Size n = ts_->instruments_.size();

    // sort rate helpers by maturity
    // check that no two instruments have the same maturity
    // check that no instrument has an invalid quote

    for (Size i=0; i<n; ++i)
        ts_->instruments_[i]->setTermStructure(
                                       const_cast<Curve*>(ts_));

    ts_->dates_ = std::vector<Date>(n+1);
    // same for the other data vectors

    ts_->dates_[0] = Traits::initialDate(ts_);
    ts_->times_[0] = ts_->timeFromReference(ts_->dates_[0]);
    ts_->data_[0] = Traits::initialValue(ts_);

    for (Size i=0; i<n; ++i) {
        ts_->dates_[i+1] = ts_->instruments_[i]->latestDate();
        ts_->times_[i+1] =
            ts_->timeFromReference(ts_->dates_[i+1]);
    }

    Brent solver;

    for (Size iteration = 0; ; ++iteration) {
      for (Size i=1; i<n+1; ++i) {
          if (iteration == 0)   {
              // extend interpolation a point at a time
              ts_->interpolation_ =
                  ts_->interpolator_.interpolate(
                                        ts_->times_.begin(),
                                        ts_->times_.begin()+i+1,
                                        ts_->data_.begin());
              ts_->interpolation_.update();
          }

          Rate guess;
          // estimate guess by using the value at the previous iteration,
          // by extrapolating, or by asking the traits

          // bracket the solution
          Real min = Traits::minValueAfter(i, ts_->data_);
          Real max = Traits::maxValueAfter(i, ts_->data_);

          BootstrapError<Curve> error(ts_, ts_->instruments_[i-1], i);
          ts_->data_[i] = solver.solve(error, ts_->accuracy_,
                                       guess, min, max);
      }

      if (!Interpolator::global)
          break;      // no need for convergence loop

      // check convergence and break if tolerance is reached; bail out
      // if tolerance wasn't reached in the given number of iterations
    }
}
For convenience, typedefs are defined to extract from the curve the traits and interpolator types. The constructor and the setup method are not of particular interest. The first just initializes the contained term-structure pointer to a null one; the second stores the passed curve pointer, checks that we have enough helpers to bootstrap the curve, and registers the curve as an observer of each helper. The bootstrap algorithm is implemented by the calculate method. In the version shown here, I'll gloss over a few details and corner cases; if you're interested, you can peruse the full code in the library.

First, the current term structure is set on all the helpers. This is done each time (rather than in the setup method) to allow a set of helpers to be used with different curves. (Of course, the same helpers could not be passed safely to different curves in a multi-threaded environment, since they would compete for it. Then again, much of QuantLib is not thread-safe, so it's kind of a moot point.) Then, the data vectors are initialized. The date and value for the initial node are provided by the passed traits; for yield term structures, the initial date corresponds to the reference date of the curve. The initial value depends on the choice of the underlying data; it is 1 for discount factors and a dummy value (which will be overwritten during the bootstrap procedure) for zero or forward rates. The dates for the other nodes are the latest needed dates of the corresponding helpers; the times are obtained by using the available curve facilities.

At this point, we can instantiate the one-dimensional solver that we'll use at each node (more details on solvers in a future post) and start the actual bootstrap. The calculation is written as two nested loops; an inner one—the bootstrap proper—that walks over each node, and an outer one that repeats the process. Iterative bootstrap is needed when a non-local interpolation (such as cubic splines) is used. In this case, setting the value of a node modifies the whole curve, invalidating previous nodes; therefore, we must go over the nodes a number of times until convergence is reached.

As I mentioned, the inner loop walks over each node—starting, of course, from the one at the earliest date. During the first iteration (when iteration == 0) the interpolation is extended a point at a time; later iterations use the full data range, so that the previous results are used as a starting point and refined. After each node is added, a one-dimensional root-finding algorithm is used to reproduce the corresponding market quote. For the first iteration, a guess for the solution can be provided by the bootstrap traits or by extrapolating the curve built so far; for later iterations, the previous result is used as the guess. The maximum and minimum values are provided by the traits, possibly based on the nodes already bootstrapped. For instance, the traits for bootstrapping zero or forward rates can prevent negative values by setting the minimum to zero; the traits for discount factors can do the same by ensuring that the discounts are not increasing, i.e., by setting the maximum to the discount at the previous node.
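
For discount factors, that boils down to static methods along these lines, filling in some of the "other static methods" left out of listing 3.8 (a simplified sketch; the actual library code handles more corner cases):
    struct Discount {
        // ...the curve and helper typedefs shown in listing 3.8...
        static Real minValueAfter(Size, const std::vector<Real>&) {
            return QL_EPSILON;        // discount factors must stay positive
        }
        static Real maxValueAfter(Size i, const std::vector<Real>& data) {
            return data[i-1];         // discounts must not increase
        }
        static void updateGuess(std::vector<Real>& data,
                                Real discount, Size i) {
            data[i] = discount;
        }
    };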

The only missing ingredient for the root-finding algorithm is the function whose zero must be found. It is provided by the BootstrapError class template (shown in listing 3.11), that adapts the helper's quoteError calculation to a function-object interface. Its constructor takes the curve being built, the helper for the current node, and the node index, and stores them. Its operator() makes instances of this class usable as functions; it takes a guess for the node value, modifies the curve data accordingly, and returns the quote error.

Listing 3.11: Interface of the BootstrapError class template.
    template <class Curve>
    class BootstrapError {
        typedef typename Curve::traits_type Traits;
      public:
        BootstrapError(
            const Curve* curve,
            const shared_ptr<typename Traits::helper>& helper,
            Size segment);
        Real operator()(Rate guess) const {
            Traits::updateGuess(curve_->data_, guess, segment_);
            curve_->interpolation_.update();
            return helper_->quoteError();
        }
      private:
        const Curve* curve_;
        const shared_ptr<typename Traits::helper> helper_;
        const Size segment_;
    };
At this point, we're all set. The inner bootstrap loop creates a BootstrapError instance and passes it to the solver, which returns the node value for which the error is zero—i.e., for which the implied quote equals (within accuracy) the market quote. The curve data are then updated to include the returned value, and the loop turns to the next node.

When all nodes are bootstrapped, the outer loop checks whether another iteration is necessary. For local interpolations, this is not the case and we can break out of the loop. For non-local ones, the obtained accuracy is checked (I'll spare you the details here) and iterations are added until convergence is reached.

This concludes the bootstrap; and, as I don't want to further test your patience, it also concludes the example. Sample code using the PiecewiseYieldCurve class can be found in the QuantLib distribution, e.g., in the swap-valuation example.

Aside: a friend in need

"Wait a minute," you might have said upon looking at the PiecewiseYieldCurve declaration, "wasn't friend considered harmful?" Well, yes—friend declarations break encapsulation and force tight coupling of classes.

However, let's look at the alternatives. The Bootstrap class needs write access to the curve data. Besides declaring it as a friend, we might have given it access in three ways. On the one hand, we might have passed the data to the Bootstrap instance; but this would have coupled the two classes just as tightly (the curve internals couldn't be changed without changing the bootstrap code as well). On the other hand, we might have exposed the curve data through its public interface; but this would have been an even greater break of encapsulation (remember that we need write access). And on the gripping hand, we might have encapsulated the data in a separate class and used private inheritance to control access. At the time of release 1.0, we felt that the friend declaration was no worse than the first two alternatives and resulted in simpler code. The third (and best) alternative might be implemented in a future release.

Bibliography

[1] V. Simonis and R. Weiss, Exploring Template Template Parameters. In Perspectives of System Informatics, Lecture Notes in Computer Science 2244, Springer Berlin/Heidelberg, 2001.
[2] F. Ametrano and M. Bianchetti, Everything You Always Wanted to Know About Multiple Interest Rate Curve Bootstrapping but Were Afraid to Ask, SSRN working papers series n. 2219548, 2013.
