There are always the What-ifs we wish we knew
UPDATE: The GIFs and code snippets in this article are based on versions prior to atoti 0.7.0. They may have evolved or become obsolete. Check out the atoti documentation for the latest features. The updated example for this use case is available in the atoti notebook gallery.
Profit and Loss (or PnL) is a self-explanatory term that refers to the total profit or loss made by an individual or group over a period. While we won't know the exact PnL until it is realised, the PnL Explain technique seeks to estimate the daily PnL from changes in the underlying risk factors of financial instruments such as equities and derivatives. The returns on bonds, for instance, can be affected by risk factors such as duration, credit spreads and default risk, not to mention macroeconomic risk factors such as volatility and inflation.
In this case, the risk factors are represented by the yield curve rates for each tenor within a yield curve. A portfolio risk manager will usually monitor the risk factors that impact the portfolio instead of monitoring every position booked in it. To assess the future value of the portfolio, the manager performs what-if analysis based on scenarios of risk factor values.
The various dimensions of risk, called sensitivities, are represented by Greeks such as Delta, Gamma, Theta and Vega. In this article, however, we shall explore a simplified PnL Explained use case using only Delta (Δ) to represent sensitivity. Delta gives the rate of change in value against the price movement. For instance, in our dataset, against the EUR-Govt-NL curve we see that Delta is 0.01 for the currency EUR.
This means that if there is a 2,000,000 drop in market value, the impact of this change is 0.01 x -2,000,000, giving a theoretical PnL of -20,000.
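To make the arithmetic concrete, here is a minimal sketch in plain Python, using the delta and market-value figures quoted above:

```python
# Theoretical PnL from a delta sensitivity: PnL ≈ delta × change in value.
delta = 0.01               # sensitivity of the position to the risk factor
value_change = -2_000_000  # drop in market value

theoretical_pnl = delta * value_change
print(theoretical_pnl)  # -20000.0
```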
In this article, we shall explore PnL Explained using atoti library in a Jupyter notebook. Thereafter, we will utilise atoti’s simulation capability to create a few scenarios that stress the yield curves.
Getting started with atoti
Now that we understand PnL Explained a little more than before, let’s see how atoti comes into play in our use case.
atoti is a Python library that allows us to easily create a multi-dimensional data cube. Its in-memory data cube lets us process more data at an accelerated speed with minimal latency, making it a powerful aggregation engine.
atoti can natively consume CSV and Parquet files into data stores, for instance:
position_sensitivity_store = session.read_csv(
    "s3://data.atoti.io/notebooks/pnl-explained/position_sensitivities.csv",
    keys=["book_id", "instrument_code", "currency", "curve", "tenor"],
    store_name="Position Sensitivities Store",
)
Additionally, atoti also supports data objects from pandas and spark dataframes:
import pandas

dataframe = pandas.read_csv("data/position_sensitivities.csv")
position_sensitivity_store = session.read_pandas(
    dataframe,
    keys=["book_id", "instrument_code", "currency", "curve", "tenor"],
    store_name="Position Sensitivities Store",
)
This means we are not constrained by the format our data arrives in. Through pandas and Spark dataframes, we can integrate virtually any form of data, transform it, and then load it into atoti, ready to be translated into a cube.
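As a sketch of such preprocessing (the column values below are hypothetical, not from the actual dataset), one could clean a sensitivities frame with pandas before handing it to session.read_pandas:

```python
import pandas as pd

# Hypothetical raw data: some sensitivities are missing, tenors arrive as strings.
raw = pd.DataFrame(
    {
        "book_id": ["B1", "B1", "B2"],
        "tenor": ["1Y", "5Y", "10Y"],
        "sensi": [0.01, None, 0.02],
    }
)

# Drop rows with missing sensitivities and derive a numeric tenor column.
clean = raw.dropna(subset=["sensi"]).assign(
    tenor_years=lambda df: df["tenor"].str.rstrip("Y").astype(int)
)
print(clean)
```

The cleaned frame can then be ingested with session.read_pandas as shown above.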
After creating the data stores required for our use cases, we join them together to get the following setup:
Position Sensitivities Store is what we call a base store: for the cube to display a book, the book must be available in this store.
All we have to do next is to invoke create_cube on the base store:
position_sensitivity_cube = session.create_cube(
    position_sensitivity_store, "Position_Sensitivities"
)
There you go! Simple as that!
By default, a cube is created in auto mode, which means a hierarchy is created for every non-numeric column and a measure for every numeric column. Advanced users can set the mode to manual, in which case only a count measure is created. If you want the hierarchies but not the measures, set it to no_measures mode.
Diving into Theoretical PnL
If we take the difference between the rates at the start and end of day, we can calculate the daily change in yield:
Yield Change = (Last rate – Start rate) x Notional
We can then estimate the impact of Delta on this change simply by multiplying them:
Theoretical PnL = Delta x Yield Change
Using atoti, these translate to:
m["Theoretical PnL"] = tt.agg.sum(
    m["sensi.SUM"] * (m["last.VALUE"] - m["start_of_day.VALUE"]) * m["notional.SUM"],
    scope=tt.scope.origin("currency", "curve", "tenor", "book_id", "instrument_code"),
)
Note that we are aggregating within the scope of the data cube levels defined in the command above. We will inspect the yield curves at these levels in our later use cases.
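For intuition, the same aggregation can be sketched in plain Python over a couple of hypothetical positions (each row standing for one leaf of the cube at the levels listed above):

```python
# Each row is one (currency, curve, tenor, book, instrument) leaf of the cube.
positions = [
    {"tenor": "1Y", "sensi": 0.01, "start": 0.020, "last": 0.021, "notional": 1_000_000},
    {"tenor": "5Y", "sensi": 0.02, "start": 0.025, "last": 0.024, "notional": 2_000_000},
]

# Theoretical PnL = sum over leaves of sensi × (last − start) × notional.
theoretical_pnl = sum(
    p["sensi"] * (p["last"] - p["start"]) * p["notional"] for p in positions
)
print(round(theoretical_pnl, 2))  # -30.0
```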
Theoretical PnL by Investment Portfolio
The beauty of having a multi-dimensional cube is that we can quickly aggregate the PnL at different levels.
In the table on the left, we drill down the Investment Portfolio hierarchy and can see the aggregated subtotal at each level of the hierarchy.
In the table on the right, we look at a portfolio through a single-level hierarchy, which gives the total PnL of the portfolio regardless of the fund breakdown. In either case, we always have the flexibility to drill down to the funds, giving different perspectives on the same data.
The yield curve gives insights into future interest rate changes and economic activity.
A normal yield curve slopes upward: longer-term bonds have higher yields than short-term ones. Short-term interest rates are the lowest because they embed less inflation risk. This shape signals economic expansion, and it is what we have in our current use case.
An inverted curve, which slopes downward, is a sign of an upcoming recession. Investors believe interest rates will fall in the near future and therefore prefer longer-term bonds, which give lower yields but are safer investments. The resulting low demand for shorter-term bonds pushes their yields higher.
A flat or humped yield curve shows the yields of shorter-term and longer-term bonds very close to each other. Investors expect interest rates to remain about the same, likely signaling an economic transition.
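These three shapes can be told apart programmatically. Here is a simplified sketch, assuming the curve is summarised by its short and long ends (the flatness tolerance is an illustrative assumption):

```python
def curve_shape(rates_by_tenor, flat_tolerance=0.0005):
    """Classify a yield curve as 'normal', 'inverted' or 'flat'.

    rates_by_tenor: list of rates ordered from shortest to longest maturity.
    """
    short_rate, long_rate = rates_by_tenor[0], rates_by_tenor[-1]
    if abs(long_rate - short_rate) <= flat_tolerance:
        return "flat"
    return "normal" if long_rate > short_rate else "inverted"

print(curve_shape([0.010, 0.015, 0.022]))  # normal: long end above short end
print(curve_shape([0.020, 0.015, 0.012]))  # inverted
```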
Given the importance of yield curves, it would be interesting to see how changes in rates reshape the curve and, downstream, the Theoretical PnL.
Simulations – Trouble’s brewing
Plotting the sensitivity against Asset Class and Currency, we can see that the 3 peaks are, in order, "Rates & Credit, EUR", "Forex, EUR" and lastly "Equity, EUR".
The currency EUR therefore looks like a good candidate for running yield curve simulations. With this in mind, we shall set up a simulation and baseline our initial data as Last Curve. Under this simulation, we will reduce the last rate of the currency EUR for different maturities:
curve_simulation = position_sensitivity_cube.setup_simulation(
    "Curve Simulation",
    per=[lvl["tenor"], lvl["currency"]],
    add=[m["last.VALUE"]],
    base_scenario_name="Last Curve",
)
Immediately, we will see a new hierarchy called Curve Simulation under the Measure Simulations dimension in the cube.
We are now ready to stir up some trouble for the Euro!
Scenario 1 – Parallel Shift
We create a scenario under this simulation in which the last rate for the currency EUR is reduced by 0.1%, also known as 10 basis points (bps), regardless of maturity. This results in a parallel shift of the yield curve, which commonly happens when the yield curve is upward sloping.
This simulation is important for investors who might liquidate their positions before maturity, as the shift can cause bond prices to fluctuate substantially.
The statement below defines a 10 bps reduction in last.VALUE for EUR across all tenors (the first parameter, with the * wildcard value):
curve_shift = curve_simulation.scenarios["Curve Parallel Shift"]
curve_shift += ("*", "EUR", -0.001, tt.simulation.Priority.CRITICAL)
Putting both scenarios of the Curve Simulation hierarchy together on a chart, we observe that the curve retains its shape while every data point shifts by the same amount.
This shift in EUR rates causes the Theoretical PnL to go from positive to negative, with Rates & Credit suffering the biggest loss. In line with the risk map we saw earlier, Rates & Credit is the most sensitive of the three asset classes to changes in EUR.
In such a scenario, investors could mitigate the risk by reducing bond duration, thereby alleviating the volatility.
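Outside atoti, the effect of a parallel shift on the theoretical PnL can be sketched in plain Python. The positions below are hypothetical, chosen only to illustrate the mechanics:

```python
# Hypothetical positions with positive baseline PnL (last rates above start rates).
positions = [
    {"tenor": "1Y", "sensi": 0.02, "start": 0.0200, "last": 0.0205, "notional": 5_000_000},
    {"tenor": "5Y", "sensi": 0.03, "start": 0.0250, "last": 0.0255, "notional": 5_000_000},
]

def theoretical_pnl(rows, shift=0.0):
    # Apply a parallel shift to every last rate, then aggregate the PnL.
    return sum(
        r["sensi"] * ((r["last"] + shift) - r["start"]) * r["notional"] for r in rows
    )

base = theoretical_pnl(positions)             # Last Curve baseline
shifted = theoretical_pnl(positions, -0.001)  # 10 bps parallel shift down
print(base, shifted)
```

With these numbers, the baseline PnL is positive and flips negative under the shift, mirroring the behaviour described above.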
Scenario 2 – Inverse Curve
If a recession is coming, investors expect interest rates to be lowered as the economy slows down. They are therefore more willing to hold longer-term bonds as a safer investment, despite the lower yield. The drop in demand for short-term bonds sends their yields up, while the higher demand for longer-term bonds drives their yields down further. We will then see the inverse curve.
An inverted yield curve is therefore said to be a predictor of an incoming recession.
To simulate the beginning of a recession, we bring the rates down by 20 bps for the longer-term bonds, affecting only the currency EUR:
curve_inversion = curve_simulation.scenarios["Curve Inversion"]
curve_inversion += ("5Y", "EUR", -0.002, tt.simulation.Priority.CRITICAL)
curve_inversion += ("6Y", "EUR", -0.002, tt.simulation.Priority.CRITICAL)
curve_inversion += ("7Y", "EUR", -0.002, tt.simulation.Priority.CRITICAL)
curve_inversion += ("8Y", "EUR", -0.002, tt.simulation.Priority.CRITICAL)
curve_inversion += ("9Y", "EUR", -0.002, tt.simulation.Priority.CRITICAL)
We observe a slight inversion in the curve from tenor 4Y to 5Y, after which the curve shifts as in the parallel shift case.
We see the Theoretical PnL turn negative for tenors of 5 years and beyond.
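The tenor-specific bumps can be sketched the same way, applying the reduction only to selected maturities (the rates and tenors below are hypothetical):

```python
# Per-tenor shifts: only longer maturities are bumped, the short end is untouched.
shifts = {"5Y": -0.002, "7Y": -0.002, "9Y": -0.002}

last_rates = {"1Y": 0.010, "3Y": 0.014, "5Y": 0.018, "7Y": 0.020, "9Y": 0.021}

# Apply the shift where one is defined, leave other tenors unchanged.
inverted = {t: r + shifts.get(t, 0.0) for t, r in last_rates.items()}
print(inverted["1Y"], inverted["5Y"])  # short end unchanged, long end lowered
```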
Scenario 3 – Inverse Curve Stress
As recession looms, we stress the curve by reducing the rates further for maturity greater or equal to 7 years:
curve_inversion_stress = curve_simulation.scenarios["Curve Inversion Stress"]
curve_inversion_stress += ("5Y", "EUR", -0.002, tt.simulation.Priority.CRITICAL)
curve_inversion_stress += ("6Y", "EUR", -0.002, tt.simulation.Priority.CRITICAL)
curve_inversion_stress += ("7Y", "EUR", -0.004, tt.simulation.Priority.CRITICAL)
curve_inversion_stress += ("8Y", "EUR", -0.004, tt.simulation.Priority.CRITICAL)
curve_inversion_stress += ("9Y", "EUR", -0.004, tt.simulation.Priority.CRITICAL)
Comparing the Theoretical PnL across the 3 scenarios (Last Curve, Curve Inversion and Curve Inversion Stress), we see that the curve gradually gets inverted.
Unless we get all our analysts to learn Python and use Jupyter notebooks, this way of running simulations is not very user friendly. It also gets repetitive to write the simulation and visualize the analysis for every new scenario.
Luckily, there is a more efficient and user-friendly way to interact with the data: publishing the visuals we created earlier as widgets and building an interactive dashboard for our users.
Publishing a visual is straightforward. Right-click on the visual and select Publish widget in app.
Tip: it is easier to identify our widgets if we pass a name when visualizing, e.g. position_sensitivity_cube.visualize("Yield Curve Curve Parallel Shift")
Click on Open App after we have published all the widgets we are interested in. Alternatively, we can execute session.url to get the URL of the atoti UI.
We can quickly put together interesting dashboards, share them with our users, and let them run simulations directly from the atoti UI.
You can use the Measure Simulations widget to update your simulation or create new scenarios.
Keep your eyes on atoti.