This paper compares the CPU effort and numerical biases of six Fourier-based option pricing implementations. Our analyses focus on two jump models that can consistently price options across different strikes and maturities: (i) the Bates jump-diffusion model, which combines jumps with stochastic volatility, and (ii) the Asymmetric Variance Gamma (AVG) model, a pure-jump process in which infinitely many jumps can occur in any time interval. We show that both truncation and discretization errors increase significantly as we move away from the diffusive Black-Scholes-Merton dynamics. While most pricing choices converge to the Bates reference values, Attari’s formula is the only Fourier-based method that avoids blowing up in any of the AVG model’s problematic regions. In terms of CPU speed, the strike-vector computations proposed by Zhu (2010) significantly reduce the computational burden, rendering fast Fourier transforms and plain delta-probability decompositions inefficient.
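The "plain delta-probability decomposition" mentioned above prices a call as C = S0·P1 − K·e^{−rT}·P2, recovering each probability by Gil-Pelaez Fourier inversion of the model's characteristic function. A minimal sketch under Black-Scholes-Merton dynamics follows; the function names and the truncation/grid parameters (`u_max`, `n`) are illustrative choices, not the paper's implementation, and they are exactly the knobs that control the truncation and discretization errors the abstract discusses.

```python
import numpy as np

def bsm_cf(u, S0, r, sigma, T):
    """Characteristic function of ln S_T under Black-Scholes-Merton."""
    mu = np.log(S0) + (r - 0.5 * sigma**2) * T
    return np.exp(1j * u * mu - 0.5 * sigma**2 * u**2 * T)

def _trapezoid(y, x):
    """Trapezoidal rule (kept explicit for portability across NumPy versions)."""
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

def call_delta_prob(S0, K, r, sigma, T, u_max=200.0, n=4000):
    """Plain delta-probability decomposition: C = S0*P1 - K*exp(-rT)*P2.

    u_max sets the truncation error (integral cut at a finite frequency);
    n sets the discretization error (trapezoidal grid spacing)."""
    u = np.linspace(1e-8, u_max, n)  # start just above the u = 0 singularity
    k = np.log(K)
    # P2: risk-neutral probability that S_T > K (Gil-Pelaez inversion)
    phi2 = bsm_cf(u, S0, r, sigma, T)
    P2 = 0.5 + _trapezoid(np.real(np.exp(-1j * u * k) * phi2 / (1j * u)), u) / np.pi
    # P1: same inversion under the share measure (tilted characteristic function)
    phi1 = bsm_cf(u - 1j, S0, r, sigma, T) / bsm_cf(-1j, S0, r, sigma, T)
    P1 = 0.5 + _trapezoid(np.real(np.exp(-1j * u * k) * phi1 / (1j * u)), u) / np.pi
    return S0 * P1 - K * np.exp(-r * T) * P2
```

Because the BSM characteristic function decays like exp(−σ²u²T/2), a modest `u_max` already gives near machine-precision results here; for the heavier-tailed AVG characteristic function the decay is only polynomial, which is one intuition for why truncation errors grow away from diffusive dynamics.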