Financial Mathematics and Applied
Probability Seminars 2008-2009
Unless otherwise indicated, all seminars take place at Lecture Theatre K2.31 (formerly known as 2C),
King's College London, The Strand, London WC2R 2LS.
|Tuesday 30 September,
|Professor Claudio Albanese
Department of Mathematics, King's College London
Calibration to Interest Rate Exotics in a Model Agnostic Framework
Calibrating financial models is a challenging optimization problem, and it is particularly difficult for interest rate exotics. One possibility is to design special models which are solvable in closed form for the derivatives used as calibration targets. An alternative is to find a brute-force engineering solution without restricting the underlying dynamics, thus remaining model agnostic. The added flexibility is beneficial and confers robustness and economic realism on the model. The model-agnostic alternative is now possible thanks to recent breakthroughs in computer engineering: special function evaluations can be replaced, in their role as the pivotal computational engine, by matrix-matrix multiplication routines accelerated on GPU coprocessors.
The required mathematics and numerical analysis change almost beyond recognition. The emphasis shifts away from stochastic calculus, used to evaluate drift restrictions and to price European swaptions by change of measure. Implicit finite-difference schemes for lattice models with long time steps and sparse node assignments are no longer optimal. Instead the emphasis is on functional analysis and operator methods [which leverage matrix algebra], explicit schemes, convergence criteria and smoothness estimates. Simple parallelism for grid computing on CPU clusters is replaced by complex multi-threading patterns on multicore architectures. Beneath the formal and technical differences, however, the basic postulate of arbitrage freedom still plays the role of linchpin of Financial Mathematics.
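The computational pattern can be illustrated with a toy sketch (my own construction, not the talk's model): backward induction on a discrete-state short-rate lattice, where each time step is one dense matrix product, exactly the operation a GPU-accelerated GEMM routine speeds up.

```python
import numpy as np

# Toy sketch: pricing by backward induction where each time step is a dense
# matrix product. All states, rates and parameters below are illustrative.

n = 200                             # number of lattice states
rates = np.linspace(0.0, 0.10, n)   # short-rate grid
dt = 0.25                           # time step in years

# Mean-reverting Gaussian transition kernel; rows normalized to sum to one.
K = np.exp(-0.5 * ((rates[None, :] - 0.9 * rates[:, None] - 0.005) / 0.01) ** 2)
K /= K.sum(axis=1, keepdims=True)

# Discounted one-step propagator: applying it is a matrix product, which on
# a GPU would be a single accelerated GEMM/GEMV call.
P = np.diag(np.exp(-rates * dt)) @ K

payoff = np.maximum(rates - 0.05, 0.0)   # caplet-like payoff on the grid
v = payoff.copy()
for _ in range(20):                      # roll back 20 quarterly steps
    v = P @ v                            # batching many payoffs turns this into GEMM

print(float(v[n // 2]))                  # value at the mid-grid state
```

Batching many payoff vectors into a matrix turns the inner loop into matrix-matrix multiplication, which is the point of the talk's engineering argument.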
As a concrete example, I discuss the calibration of a stochastic monetary policy interest rate model to the swaption volatility cube along with CMS spread options. In this model, exotics can safely be included in calibration baskets, as they are not much harder to price than vanilla swaptions, or even swaps themselves. I find that it is possible to fit market datasets well with a single global calibration [not instrument dependent]. A good-quality ab initio calibration takes less than one hour on multi-GPU hardware. Individual pricing of an exotic instrument takes around 4-5 seconds. Aggregate pricing of large real portfolios with hundreds of positions can take less than a minute if one synchronizes cash flows, thus enabling real-time VaR calculation for exotic portfolios without SABR mapping. The model is thus viable from the engineering standpoint. I also obtain a surprisingly tight agreement with BGM quotes on large portfolios of callable CMS spread options. Prof Albanese's presentation.
Prof Albanese's web page on agnostic modelling.
|Tuesday 7 October,
|Professor Chris Rogers
Statistical Laboratory, DPMMS, Cambridge
Contracting for optimal investment with risk control
The theory of risk measurement has been extensively developed over the
past ten years or so, but comparatively little effort has gone into
using this theory to inform portfolio choice. One theme of this paper is
how an investor in a conventional log-Brownian market would invest to
maximize expected utility of terminal wealth, when subjected to a bound
on his risk as measured by a coherent law-invariant risk measure. Results
of Kusuoka lead to remarkably complete expressions for the solution
to this problem.
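Schematically, and in notation of my own choosing rather than the paper's, the first problem is a risk-constrained utility maximization:

```latex
\sup_{X_T \in \mathcal{A}(x_0)} \mathbb{E}\left[U(X_T)\right]
\quad \text{subject to} \quad \rho(X_T) \le c ,
```

where \( \mathcal{A}(x_0) \) is the set of terminal wealths attainable from initial capital \( x_0 \), \( U \) is the utility function, \( \rho \) is a coherent law-invariant risk measure, and \( c \) is the risk budget.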
The second theme of the paper is to discuss how one would
actually manage (not just measure) risk. We study a principal/agent
problem, where the principal is required to satisfy some risk constraint.
The principal proposes a compensation package to the agent, who then
optimises selfishly, ignoring the risk constraint. The principal can pick a
compensation package that induces the agent to select the principal's
choice. Prof Rogers' paper associated with this talk.
|Tuesday 14 October,
|Special Financial Computing Event:
Developing grid solutions with Mathematica, a financial computing perspective
This special extended seminar will focus on the development of financial computing solutions with GridMathematica, and will feature contributions from Wolfram Research and examples from the work of the KCL Centre for Financial Grid Computing. Registration is required - register here.
Further details are available.
5.30: Conrad Wolfram, Director of Strategic and International Development, Wolfram Research:
Integrating key components of HPC
Abstract: Since its inception, Mathematica has integrated apparently disparate aspects
of technical computing. Now a complete high-performance workflow exists from
basic analysis, data provision and prototyping to application development
and final deployment. This talk will address what's possible and why it's
6.00: Tom Wickham-Jones, Director of Kernel Technology, Wolfram Research:
Advanced Technical Computing, grid and HPC applications.
Abstract: This talk will cover the advanced technical computing functionality of
Mathematica with an emphasis on grid and HPC techniques. It will cover
areas such as modelling, statistical analysis, and data processing, also
showing how these integrate with Mathematica's computable data feeds. The
talk will include current and forthcoming technologies.
6.30: Prof. William Shaw, King's College:
The "Map -> ParallelMap" paradigm for financial computing: Monte Carlo and other examples of parallelization minus pain. Mathematica notebook of this talk.
|Tuesday 21 October,
|Professor Nick Bingham
Mathematics Department, Imperial College London
Stationary time series and long memory
This is joint work with Akihiko Inoue and Yukio Kasahara of the University of Hokkaido, Sapporo.
This talk is about how to tell the difference between things whose influence is local (in space or time) and things whose influence is global. The big picture (inevitably these days, during a crisis) is the difference between systemic aspects and the normal ebb and flow. In more normal times, one might think of the difference between finance (which takes the prices of the underlying as given) and economics (largely about how prices are arrived at).
Those with a physics background may like to think of long-range dependence in space. This is relevant to ferromagnetism, and to other physical phenomena involving interactions in large systems. These are characterized by phase transitions (back to the current financial crisis again!).
In this talk, we focus on time, dealing for simplicity with stationary time series. Here the question is the relevance or otherwise of the remote past.
The relevant mathematics - prediction theory - goes back to Szego and Kolmogorov, and involves orthogonal polynomials on the unit circle (OPUC). The key result is Baxter's theorem. We use recent work of Inoue and Kasahara to give a new definition of long memory (in time - or long-range dependence in space).
Prof Bingham's presentation. We also heard his views on the recent crash [PDF].
|Tuesday 4 Nov,
|Dr Peter Carr
Bloomberg, New York
On the Information Content of Option Prices
It is well known that the market price of a standard option
reflects the risk-neutral mean of its path-independent payoff.
It is less well known that this same option price
also reflects the risk-neutral mean of various path-dependent
payoffs. We give several examples of such payoffs, which together
suggest that hockey-stick payoffs are a very good choice
for a listed payoff. Dr Carr's presentation.
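For context, the path-independent statement in the abstract is the familiar risk-neutral valuation formula; paired with the standard Breeden-Litzenberger observation (notation below is textbook, not specific to the talk), it shows what the strike continuum of option prices pins down about the risk-neutral law:

```latex
C(K,T) = e^{-rT}\,\mathbb{E}^{\mathbb{Q}}\!\left[(S_T - K)^{+}\right],
\qquad
\frac{\partial^2 C}{\partial K^2}(K,T) = e^{-rT}\, q_T(K),
```

where \( q_T \) is the risk-neutral density of \( S_T \).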
|Tuesday 11 November,
|Special Financial Computing Event: GPU computing for financial applications
Professor Claudio Albanese (KCL), Michael Giles (Oxford), the NAG team, Gernot Ziegler (NVIDIA)
This all day event is being held at King's in collaboration with NVIDIA and will explore the latest developments in GPU financial computing.
The workshop provides an introduction to GPU programming based on the NVIDIA CUDA language and reviews applications to Financial Engineering. The morning session covers hardware configurations of NVIDIA Tesla cards, kernel writing with CUDA, the CUBLAS and CUFFT libraries, asynchronous programming patterns and multi-GPU platforms. The afternoon and evening sessions focus on practical applications to Financial Engineering.
The official web site for this past workshop, with links to the presentations, is here. Presentations may also be downloaded from the KCL mirror site in the Grid Computing pages here.
|Tuesday 18 November,
|Professor David Hobson
Liquidation Strategies for Infinitely Divisible portfolios
We consider the problem facing a risk averse agent who seeks to
liquidate or exercise a portfolio of (infinitely divisible) American
style options. The optimal liquidation strategy is of threshold form and
can be characterised explicitly as the solution of a calculus of
variations problem. Apart from a possible initial exercise of a tranche
of options, the optimal behaviour involves liquidating the portfolio in
infinitesimal amounts, but at times which are singular with respect to
calendar time. We consider a number of illustrative examples involving
CRRA and CARA utility, stocks and portfolios of options with different
strikes, and a model where the act of exercising has an impact on the
underlying asset price.
Joint Work with Vicky Henderson.
Prof Hobson's presentation.
|Tuesday 2 December,
|Dr Emmanuel Acar
Directional Trading Limited
Correlation between trading models
This presentation investigates the correlation coefficient between returns
generated by forecasting strategies. It first reviews existing theoretical
formulae. Special attention is given to the cases where 1) different
strategies are applied to the same market, and 2) the same strategy is
applied to different markets. An empirical application to the foreign exchange markets
follows. Finally portfolio implications and challenges ahead are discussed.
Dr Acar's presentation.
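As a toy illustration of the question (my construction, not Dr Acar's): simulate one market and measure the correlation between the returns of two moving-average rules applied to it.

```python
import numpy as np

# Correlation between returns of two forecasting strategies on the SAME
# market. The market model and rule windows are illustrative choices.
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, 5_000)      # daily market log-returns
p = np.cumsum(r)                      # log-price path

def ma_signal(log_price, window):
    """Position (+1 long / -1 short): long when price is above its moving average."""
    ma = np.convolve(log_price, np.ones(window) / window, mode="valid")
    return np.sign(log_price[window - 1:] - ma)

s1, s2 = ma_signal(p, 10), ma_signal(p, 50)
m = min(len(s1), len(s2)) - 1
# Strategy return: yesterday's position times today's market return.
r1 = s1[-m - 1:-1] * r[-m:]
r2 = s2[-m - 1:-1] * r[-m:]
rho = float(np.corrcoef(r1, r2)[0, 1])
print(round(rho, 2))   # positive: both rules ride the same trends
```

The same harness with two markets and one rule covers the talk's second case.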
|Tuesday 9 December,
|Dr Michel Dacorogna
Risk Aggregation, Dependence Structure and Diversification Benefits
Work done with Roland Burgi and Roger Iles
Insurance and reinsurance companies live and die by the diversification benefits, or lack thereof, in
their risk portfolios. The new solvency regulations allow companies to include these benefits in
their computation of risk-based capital (RBC). The question is how to evaluate them properly.
To compute the total risk of a portfolio, it is important to establish the rules for aggregating the various risks that compose it. This can only be done by modelling their dependence. It is a well-known fact among traders in financial markets that "diversification works the worst when one needs it the most". In other words, in times of crisis the dependence between risks increases. Experience has shown that very large losses almost always affect multiple lines of business simultaneously. September 11, 2001, is an example of this: claims originated from lines of business usually as de-correlated as property and life, at the same time that company assets were depressed by the crisis in the stock markets.
In this paper, we explore various methods of modelling dependence and their influence on diversification benefits. We show that the latter depend strongly on the chosen method and that linear correlation grossly overestimates diversification. This has consequences for the RBC of the whole portfolio, which comes out smaller than it would be if tail correlation were correctly accounted for. However, the problem remains of calibrating the dependence for extreme events, which are rare by definition. We analyze this dilemma and propose possible ways out of it, arriving at reasonable estimates. Dr Dacorogna's presentation.
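A small simulation in the spirit of the paper's point (construction and parameters are mine, not Dr Dacorogna's): two risks with the same rank correlation, aggregated once under Gaussian dependence and once under a Student-t copula whose tail dependence makes extremes strike together. The 99% diversification benefit shrinks in the second case.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
n, rho, nu = 100_000, 0.5, 3
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
ppf = np.vectorize(NormalDist().inv_cdf)

def to_normal_margin(x):
    """Rank-transform a sample to standard-normal marginals, keeping its copula."""
    u = (np.argsort(np.argsort(x)) + 0.5) / len(x)
    return ppf(u)

gauss = L @ rng.standard_normal((2, n))               # Gaussian dependence
mix = rng.chisquare(nu, n) / nu
t = (L @ rng.standard_normal((2, n))) / np.sqrt(mix)  # Student-t dependence
t = np.vstack([to_normal_margin(t[0]), to_normal_margin(t[1])])

def benefit(x):
    q = lambda z: float(np.quantile(z, 0.99))         # 99% loss quantile
    return 1.0 - q(x[0] + x[1]) / (q(x[0]) + q(x[1]))

b_gauss, b_t = benefit(gauss), benefit(t)
print(round(b_gauss, 3), round(b_t, 3))   # tail dependence shrinks the benefit
```

Both pairs have identical (normal) marginals, so the gap between the two numbers is driven purely by the dependence structure, which is the paper's warning about linear correlation.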
|Tuesday 20 January,
|Dr Damiano Brigo
Fitch Solutions / Imperial College London
Credit Index Options: the no-armageddon pricing measure and the role of correlation after the subprime crisis
In this work we consider three problems of the standard market approach to credit index options pricing: the definition of the index spread is not valid in general, the payoff that is usually considered leads to a pricing methodology which is not always defined, and the candidate numeraire one would use to define a pricing measure is not strictly positive, which would lead to a non-equivalent pricing measure.
We give a general mathematical solution to the three problems, based on a novel way of modeling the flow of information through the definition of a new subfiltration. Using this subfiltration, we consistently take into account the possibility of default of all names in the portfolio, which is neglected in the standard market approach. We show that, while the related mispricing can be negligible for standard options in normal market conditions, it can become highly relevant for different options or in stressed market conditions. In particular, we show on 2007 market data that after the subprime credit crisis the mispricing of the market formula, compared with the no-arbitrage formula we propose, has become financially relevant even for the liquid Crossover Index Options.
|Tuesday 27 January,
|Professor Damir Filipovic
Vienna Institute of Finance
Title: Dynamic CDO Term Structure Modelling
This paper provides a unifying approach for valuing contingent claims
on a portfolio of credits, such as collateralized debt obligations (CDOs).
We introduce the defaultable (T,x)-bonds, which pay one if the aggregated loss process in the underlying pool of the CDO has not exceeded x at maturity T, and zero otherwise. Necessary and sufficient conditions on the stochastic term structure movements for the absence of arbitrage are given. Background market risk as well as feedback contagion effects of the loss process are taken into account. Moreover, we show that any exogenous specification of the volatility and contagion parameters actually yields a unique consistent loss process, and thus an arbitrage-free family of (T,x)-bond prices. For the sake of analytical and computational efficiency we then develop a tractable class of doubly stochastic affine term structure models.
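In schematic notation (mine, not necessarily the paper's, and abstracting from discounting), the (T,x)-bond described above is the claim

```latex
P(t,T,x) \;=\; \mathbb{E}\!\left[\left.\mathbf{1}_{\{L_T \le x\}}\,\right|\,\mathcal{F}_t\right],
```

where \( L \) is the aggregated loss process of the pool; CDO tranche payoffs can then be assembled as integrals of such bonds over the loss level \( x \).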
|Tuesday 10 February,
|Professor Stewart Hodges
Cass Business School, City University London
Title: Fixed Odds Bookmaking/Option Writing with Stochastic Speculative Demand
This paper provides a model of bookmaking in the market for bets in a British horse race. The bookmaker faces the risk of unbalanced liability exposures. Even random shocks in the noisy betting demands are costly to the bookmaker since his book could become less balanced. In our model, the bookmaker sets appropriate odds to influence the betting flow and thereby mitigate the risk. The stylized fact of the favorite-longshot bias arises from the model under some specific assumptions. The model offers insights into the complexity of managing a series of state contingent exposures such as options.
|Tuesday 17 February,
|Professor Sergei Levendorskiy
Title: Refined and enhanced FFT techniques, with applications to pricing barrier options and their sensitivities
Many mathematical methods of option pricing rely on one's
ability to calculate the action of certain integro-differential
operators and convolution operators quickly and efficiently. In
turn, the latter computations are based on FFT techniques. However,
in many important cases a straightforward application of the FFT and
iFFT leads to errors of several kinds, which cannot all be made
small simultaneously (an uncertainty principle) unless grids with too
many points are used. We explain an approach to using FFT techniques
that gives one more flexibility in controlling the aforementioned
errors and, at the same time, yields fast and efficient algorithms.
As applications, using Carr's randomization, we compute the prices
and sensitivities of barrier options and first-touch digital options
on stocks whose log-price follows a Levy process. The numerical
results obtained via our approach are demonstrated to be in good
agreement with the results obtained using other (sometimes
fundamentally different) approaches that exist in the literature.
However, our method is computationally much faster (often, dozens of
times faster). Moreover, our technique has the advantage that its
application does not entail a detailed analysis of the underlying
Levy process: one only needs an explicit analytic formula for the
characteristic exponent of the process. Thus our algorithm is very
easy to implement in practice. Finally, our method yields accurate
results for a wide range of values of the spot price, including
those that are very close to the barrier, regardless of whether the
maturity period of the option is long or short.
A natural extension of the method gives similar results for
|Tuesday 24 February,
|Dr Vicky Henderson
Oxford-Man Institute of Quantitative Finance
Title: Prospect Theory, Partial Liquidation and the Disposition Effect
We solve a liquidation problem for an agent with prospect theory preferences who seeks to sell off a portfolio of (divisible) claims. Our methodology enables us to consider different formulations of prospect preferences in the literature (piecewise exponential or piecewise power) and various price processes. We find that these differences in specification matter: for instance, with piecewise power functions, the agent
may liquidate at a loss relative to break-even, although the likelihood of liquidating at a (small) gain is much higher than that of liquidating at a (large) loss. This is consistent with the disposition effect documented in empirical and experimental studies. We find the agent does not choose to partially liquidate a position; rather, if liquidation occurs, the entire position is sold. This is in contrast to the partial liquidation that arises when agents have standard concave utilities.
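For reference, a standard piecewise power specification of prospect preferences over gains and losses relative to the break-even reference point is (the paper's exact parametrization may differ):

```latex
U(x) \;=\;
\begin{cases}
x^{\alpha}, & x \ge 0,\\[2pt]
-\lambda\,(-x)^{\beta}, & x < 0,
\end{cases}
\qquad \alpha,\beta \in (0,1),\ \lambda > 1,
```

so the value function is concave over gains, convex over losses, and steeper for losses (loss aversion \( \lambda \)).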
|Tuesday 3 March,
|Dr Riccardo Rebonato
The Royal Bank of Scotland
Title: The LMM-SABR Model: Pricing, Hedging and Empirical Evidence
This is joint work with Richard White. This talk is about a financially-motivated extension of the LIBOR market model (LMM) that reproduces, for all strikes and maturities, the prices of the plain-vanilla hedging instruments (swaptions and caplets) produced by the SABR model. We present a `philosophy' of option pricing that
takes into account the realities of industry needs (e.g., the need to calibrate as accurately as possible to the plain-vanilla reference hedging instruments, and the need to obtain prices and hedges in reasonable time) while reproducing a realistic future evolution of the smile surface (our `financial reality'). With our model we bring the dynamics of the various forward rates and stochastic volatilities under a single measure. To ensure absence of arbitrage we also derive `drift adjustments'. Not surprisingly, these have to be applied both to the forward rates and to their volatilities. When this is done, complex derivatives that depend on the joint realization of all the relevant forward rates can be priced. Comparisons with local-volatility modelling (see, e.g., Dupire (1994), Derman and Kani (1994)), the Variance Gamma model (see, e.g., Madan and Seneta (1990)) and its `stochastic volatility' extensions (see, e.g., Madan and Carr (1998)) are provided.
|Tuesday 17 March,
|Dr Tiziana Di Matteo
King's College London
Title: Econophysics: new tools to study financial markets
In this talk I will give a broad overview of the state of the art in
Econophysics: a relatively recent discipline that already has a rich history and even controversial trends. In particular, I will show results concerning the characterization and visualization of correlations in financial systems by means of network theory, and I will introduce a new tool to filter relevant information in these systems and to extract the hierarchical structure of the market [2,3]. I will discuss these results, the economic meaning of the financial market's hierarchical structure, and its dynamical evolution.
 "Topical Issue: Trends in Econophysics" in EPJB, Vol. 55, No. 2 (2007).
 M. Tumminello, T. Aste, T. Di Matteo, R. N. Mantegna, PNAS 102, n. 30
 M. Tumminello, T. Aste, T. Di Matteo, and R. N. Mantegna, EPJB 55 (2007)
 F. Pozzi, T. Di Matteo and T. Aste, Advances in Complex Systems 11
|Tuesday 24 March