There is a VaR Beyond Usual Approximations
Abstract
Basel II and Solvency 2 both use the Value-at-Risk (VaR) as the risk measure to compute capital requirements. In practice, to calibrate the VaR, a normal approximation is often chosen for the unknown distribution of the yearly log returns of financial assets. This is usually justified by the Central Limit Theorem (CLT), assuming aggregation of independent and identically distributed (iid) observations in the portfolio model. Such a modeling choice, in particular the use of light-tailed distributions, proved during the 2008/2009 crisis to be an inadequate approximation in the presence of extreme returns; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluation of the distribution of aggregated risks and of the associated risk measures when working with heavy-tailed financial or insurance data, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. We explore a new method, called Normex, based on properties of upper order statistics, to handle this problem both numerically and theoretically. Normex provides accurate results, only weakly dependent upon the sample size and the tail index. We compare it with existing methods.
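As a purely illustrative complement to the abstract (this is not the Normex method itself), the following Python sketch contrasts the CLT-based normal approximation of a high quantile of aggregated heavy-tailed losses with the empirical Monte Carlo quantile, showing the kind of underestimation the abstract describes. The Pareto tail index `alpha`, the aggregation size `n`, and the confidence `level` are assumed, illustrative values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

alpha = 2.5        # assumed Pareto tail index (heavy tail, finite variance)
n = 52             # assumed number of aggregated iid losses
level = 0.995      # assumed VaR confidence level
n_sim = 200_000    # Monte Carlo sample size

# Simulate aggregated losses S = X_1 + ... + X_n with Pareto(alpha) margins on [1, inf).
x = rng.pareto(alpha, size=(n_sim, n)) + 1.0
s = x.sum(axis=1)

# Normal (CLT) approximation of S: match the first two moments.
mean_x = alpha / (alpha - 1.0)
var_x = alpha / ((alpha - 1.0) ** 2 * (alpha - 2.0))
var_normal = stats.norm(loc=n * mean_x, scale=np.sqrt(n * var_x)).ppf(level)

# Empirical quantile of the simulated aggregate.
var_empirical = np.quantile(s, level)

print(f"Normal-approximation VaR at {level}: {var_normal:.2f}")
print(f"Empirical (Monte Carlo) VaR at {level}: {var_empirical:.2f}")
# At high confidence levels and heavy tails, the normal approximation typically
# falls below the empirical quantile, i.e. it underestimates the risk.
```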
Keywords
Stable Distribution
Conditional (Pareto) Distribution
Conditional (Pareto) Moment
Convolution
Expected Shortfall
Extreme Values
Rate of Convergence
Financial Data
Aggregated Risk
(generalized) Central Limit Theorem
(refined) Berry-Esséen Inequality
Order Statistics
Value-at-Risk
Market Risk
Pareto Distribution
Risk Measures
High Frequency Data
Origin: Publisher files authorized on an open archive