Wednesday 4th December 2019 from 13:00 to 14:00 (GMT)
The computation of Dynamic Initial Margin requires an enormous amount of computational power.
Different approaches have been put forward, ranging from regressions and Deep Neural Networks to Chebyshev polynomial methods.
In this talk we introduce all of them and present how we can harness the power of Chebyshev polynomials to compute Dynamic IM very efficiently. We compare Chebyshev results to other available techniques:
- Review of available methods for Dynamic IM
- The power of Chebyshev polynomials: exponential convergence of Chebyshev methods, and why they are so fast
- Theoretical basis
- Application to simulation of Initial Margin inside Monte Carlo simulations
- Examples: swaps, swaptions and beyond
- Comparison to regression and Deep Neural Networks
- Numerical results
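As background to the agenda item on exponential convergence, the sketch below (not from the talk; the test function and degrees are illustrative assumptions) shows how the maximum interpolation error of a Chebyshev approximant to a smooth function collapses as the polynomial degree grows:

```python
# Illustrative sketch: exponential convergence of Chebyshev interpolation
# for a smooth function on [-1, 1]. The test function is an assumption,
# chosen only to demonstrate the convergence behaviour.
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

f = lambda x: np.exp(-x) * np.sin(5 * x)  # smooth (entire) test function
x = np.linspace(-1.0, 1.0, 2001)          # dense grid for measuring error

errors = {}
for deg in (4, 8, 16, 32):
    # Interpolate f at the Chebyshev points of degree `deg`
    p = Chebyshev.interpolate(f, deg)
    errors[deg] = float(np.max(np.abs(f(x) - p(x))))

for deg, err in errors.items():
    print(f"degree {deg:2d}: max error = {err:.2e}")
```

For an analytic function like this one, the error decays geometrically in the degree, which is why a modest number of Chebyshev points can reproduce a pricing function to near machine precision inside a Monte Carlo simulation.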
Presenter: Mariano Zeron, Head of Research and Development, MoCaX Intelligence