
Measurement of Risk Based on QR-GARCH-EVT Model



Introduction

To cope with financial market crises such as the Wall Street stock market crash of 1987 and the breakdown of the European monetary system in 1992, Value at Risk (VaR) emerged in the 1990s. VaR has since become a standard measure of market risk in risk management [1]. It measures the maximum potential loss of an investment portfolio over a given period at a given confidence level.

Although the concept behind the VaR model is simple, estimating it accurately remains a challenge [2]. Various methods have been developed to predict VaR, but because the distribution of a typical investment portfolio varies over time, no single method gives a fully satisfactory solution [3]. The most widely used approaches rely on fully parametric time series models, such as ARCH or GARCH models, to capture the dynamic volatility of the residual term [4]. The main weakness of this class of methods is that the distribution of the return residuals must be assumed [5]. Initially, ARCH and GARCH models assumed a normal distribution; later, to account for the leptokurtosis of financial returns, this was replaced by alternatives such as the Student-t distribution, which is widely considered effective [6, 7]. However, there is still no agreement on which distribution fits the residual term best. Among non-parametric methods, the most popular is historical simulation, in which the empirical quantile of historical returns is used as the VaR estimate. This method is easy to implement, but its VaR estimates can be unstable. In recent years, a new strand of work has used quantile regression to measure financial risk. Because quantile regression does not require a distributional assumption on the residual term and can reflect tail characteristics to some extent, it provides a very good statistical tool for fitting leptokurtic financial data. Scholars at home and abroad have therefore studied the method in depth [8, 9, 10, 11, 12] and found that quantile models perform better [13, 14].

However, when investors are concerned with the risk of extreme events, such as the Great Depression or the 2008 financial crisis, quantile methods alone cannot provide statistically sound risk control. Extreme value theory (EVT) entered the field of risk measurement in the 1990s. It models only the tail, without specifying the overall distribution, and the sparsity of tail data does not affect the robustness of its predictions [15, 16]. Experience has shown that evaluating risk with extreme value methods is more suitable than traditional models for predicting high quantiles of fat-tailed distributions, and the resulting predictions are relatively stable [17]. In practice, statistical inference on extreme distributions based on EVT has proved very effective, so it is used more and more to describe extreme risk [18]. However, many EVT-based VaR estimates combine GARCH-type or SV-type models under an assumed overall sample distribution, which gives insufficient attention to extreme events and underestimates risk [19, 20]. On the other hand, extreme value theory focuses only on observations in the upper or lower tail and ignores observations in the middle of the distribution. Wang et al. [21] were the first to use composite quantile regression to estimate intermediate quantiles and then extrapolate these estimates to EVT-based extreme quantiles, but they considered only random samples, not time series data.

This review shows that China's research on risk theory and quantile regression theory lags considerably behind. Abroad, financial risk theory and quantile regression theory are applied very widely and model innovation is advanced, leaving a large gap. Because quantile regression can be combined with many models, foreign scholars have had ample room to build improved models, apply them in empirical research, and distill many practical models, which has driven the development of quantile regression methods and also provides a rich theoretical basis for Chinese scholars studying financial risk theory and quantile regression theory.

Existing research on financial risk therefore mostly measures VaR by combining quantile methods with GARCH-family models, which leaves room for improvement. First, GARCH-family models capture the leptokurtosis of financial data but fail to reflect its tail characteristics effectively. Second, although risk has been studied with quantile regression, the related literature is still scarce, and work combining quantile regression with EVT models is even rarer. This paper adopts a new method to estimate extreme quantiles of financial time series by combining a quantile-regression GARCH model (hereinafter the QR-GARCH model) with an EVT model. Specifically, the quantile autoregression method proposed by Xiao and Koenker [22] is first used to estimate the global parameters and the latent volatility; EVT is then used to model the tail of the unknown innovation distribution, borrowing information from neighbouring quantiles to estimate extreme quantiles, so that the semiparametric setting imposes minimal restrictions on the innovation distribution. Concretely, this paper first fits the stylised facts of financial returns with the QR-GARCH model, then models the resulting residual series with EVT, thereby building a QR-GARCH-EVT risk measurement model, and finally uses HS300 index data to demonstrate and test the effectiveness of the model.

EVT-QR-GARCH model
VaR model

Assume that the value of a financial position at moment $t$ is $V_t$; the loss over the next period ($l = 1$) is:
$$L_t(l) = V_{t+1} - V_t \tag{1}$$

Assume that the cumulative distribution function of the random loss $L_t(l)$ is $F_L$. Under confidence level $p$, this paper defines $VaR_{1-p}$ as the maximum potential loss faced:
$$VaR_{1-p} = \inf\{x \mid F_L(x) \ge 1 - p\} \tag{2}$$

Equation (2) can be transformed into:
$$\int_{-\infty}^{VaR} f(x)\,dx = 1 - p \tag{3}$$

For the univariate case, VaR is the $q = 1 - p$ quantile of the loss distribution, so that:
$$\Pr\left[L_t(l) > VaR_{1-p}\right] \le p \tag{4}$$

Equation (4) can be transformed into:
$$\int_{VaR}^{\infty} f(x)\,dx = p \tag{5}$$

So, if the conditional mean and conditional standard deviation of the financial asset return are $\mu_t$ and $\sigma_t$, and the quantile corresponding to confidence level $p$ is $Q_{1-p}$, the general expression of VaR is:
$$VaR_{1-p} = \mu_t + Q_{1-p}\,\sigma_t \tag{6}$$
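A minimal Python sketch of equation (6), assuming the conditional mean, the conditional volatility and the innovation quantile are already available; the function name and the numerical example are purely illustrative, not results from the paper.

```python
# Sketch of equation (6): VaR_{1-p} = mu_t + Q_{1-p} * sigma_t.
def var_from_moments(mu_t: float, sigma_t: float, q_1p: float) -> float:
    """Return the VaR implied by a conditional mean, volatility and quantile."""
    return mu_t + q_1p * sigma_t

# Example: zero mean, daily volatility 0.85% and a 5% lower-tail quantile of a
# standard normal innovation (about -1.645) give a one-day VaR of roughly -1.4%.
print(var_from_moments(0.0, 0.85, -1.645))
```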

QR-GARCH model

Despite its simple definition, computing VaR is a challenging statistical problem. Statistically, VaR is a quantile of the expected return distribution of an investment portfolio, so VaR is closely related to the distribution of returns. However, assuming a distribution function for the returns is not enough, because returns do not satisfy the assumption of being independent and identically distributed. Xiao and Koenker [22] proposed a quantile-regression-based QR-GARCH model:
$$\begin{cases} r_t = \sigma_t \varepsilon_t, \quad \varepsilon_t \sim iid(0,1) \\ \sigma_t = \omega + \alpha x_{t-1} + \beta \sigma_{t-1}^2 \\ x_t = \lambda + \delta \sigma_t^2 + \eta_1 \varepsilon_t + \eta_2 \varepsilon_{t-1}^2 + u_t \end{cases} \tag{7}$$

Here $\omega$ is a constant greater than 0, $\alpha$ represents the speed at which the return responds to market shocks, and $\beta$ represents the persistence of volatility relative to the previous period; both $\alpha$ and $\beta$ are greater than 0 and $\alpha + \beta < 1$. Iterating gives the estimate of $\sigma_t$:
$$\sigma_t = \frac{\omega}{1-\beta} + \alpha\left(x_{t-1} + \beta x_{t-2} + \beta^2 x_{t-3} + \beta^3 x_{t-4} + \cdots\right) \tag{8}$$

If the ARCH($\infty$) process in equation (8) is stationary, this paper obtains:
$$\begin{cases} \sigma_t^2 = \alpha_0 + \sum_{j=1}^{\infty} \alpha_j x_{t-j} \\ r_t^2 = \left(\alpha_0 + \sum_{j=1}^{\infty} \alpha_j x_{t-j}\right)\varepsilon_t^2 \end{cases} \tag{9}$$

In equation (9), $\alpha_0 = \dfrac{\omega}{1-\beta}$ and $\alpha_j = \alpha\beta^{j-1}$.
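The following sketch illustrates the truncated ARCH($\infty$) expansion of equations (8)-(9), assuming `x` is a 1-D array of the $x_t$ series; the parameter values, the truncation length and the simulated input are illustrative choices rather than estimates from the paper.

```python
import numpy as np

def sigma_arch_inf(x: np.ndarray, omega: float, alpha: float,
                   beta: float, m: int = 200) -> np.ndarray:
    """sigma_t = omega/(1-beta) + alpha * sum_j beta^{j-1} x_{t-j}, truncated at m lags."""
    T = len(x)
    weights = alpha * beta ** np.arange(m)          # alpha * beta^{j-1}, j = 1..m
    sigma = np.full(T, omega / (1.0 - beta))
    for t in range(1, T):
        lags = x[max(0, t - m):t][::-1]             # x_{t-1}, x_{t-2}, ...
        sigma[t] += weights[:len(lags)] @ lags
    return sigma

# Example with simulated inputs
rng = np.random.default_rng(0)
x = np.abs(rng.standard_normal(500))
print(sigma_arch_inf(x, omega=0.05, alpha=0.1, beta=0.8)[:5])
```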

Let $F_{t-1}$ denote the information set at moment $t-1$; the quantile function of $r_t^2$ conditional on past information can then be expressed as:
$$Q_{r_t^2}(\tau \mid F_{t-1}) = \alpha_0(\tau) + \sum_{j=1}^{\infty} \alpha_j(\tau)\, x_{t-j} \tag{10}$$

where $\alpha_j(\tau) = \alpha_j Q_{\varepsilon^2}(\tau)$, $j = 0,1,2,\ldots$, and $Q_{\varepsilon^2}(\tau)$ is the $\tau$-quantile of $\varepsilon_t^2$.

Because the volatility process is unobservable, extreme quantile regression estimates are inaccurate when data are insufficient, so the method of Xiao and Koenker [22] is used to estimate $(\hat\omega, \hat\alpha_1, \hat\beta_1)'$. The specific procedure is as follows:

First, according to equations (9) and (10), quantile regression is used to estimate $\hat\gamma(\tau_k)$; the specific equation is:
$$\hat\gamma(\tau_k) = \arg\min \sum_{t=m+2}^{T} \rho_\tau\!\left(r_t^2 - a_0(\tau) - \sum_{j=1}^{m} a_j x_{t-j}\right) \tag{11}$$

Equation (11) yields $\tilde a(\tau_k)$, $k = 1,2,\ldots,K$, where $\rho_\tau$ is the check (loss) function. The minimum distance method [22] is then used to obtain $(\hat a_0, \hat a_1, \ldots, \hat a_m)'$, and at the same time the volatility $\sigma_t$ can be estimated from equation (9). Further, a quantile regression of $r_t^2$ is carried out on $(1, x_{t-1}, \hat\sigma_{t-1}^2)'$:
$$\hat\theta(\tau) = \arg\min \sum_{t=m+2}^{T} \rho_\tau\!\left(r_t^2 - \omega(\tau) - \alpha(\tau)\, x_{t-1} - \beta(\tau)\,\hat\sigma_{t-1}^2\right) \tag{12}$$

At the quantiles $(\tau_1, \tau_2, \ldots, \tau_K)'$ this gives the estimates
$$\hat\theta(\tau_k) = \left(\hat\omega(\tau_k), \hat\alpha(\tau_k), \hat\beta(\tau_k)\right)' \tag{13}$$

Finally, the estimates $(\hat\omega, \hat\alpha, \hat\beta)'$ are obtained, and the residual term $\varepsilon_t$ of the return series can be recovered.
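As a rough sketch of the idea behind equation (11) only, the code below runs the first-stage quantile regression of $r_t^2$ on a constant and $m$ lags of $x_t$ over a grid of quantiles $\tau_k$. It is not the full Xiao-Koenker estimator (which adds the minimum-distance step and the second-stage regression on the filtered volatility); `r2` and `x` are assumed to be aligned 1-D arrays, and the simulated data are illustrative.

```python
import numpy as np
import statsmodels.api as sm

def first_stage_qr(r2: np.ndarray, x: np.ndarray, m: int = 5,
                   taus=(0.1, 0.3, 0.5, 0.7, 0.9)) -> np.ndarray:
    """Quantile regression of r_t^2 on (1, x_{t-1}, ..., x_{t-m}) at each tau."""
    T = len(r2)
    lags = np.column_stack([x[m - j:T - j] for j in range(1, m + 1)])
    X = sm.add_constant(lags)
    y = r2[m:]
    coefs = [sm.QuantReg(y, X).fit(q=tau).params for tau in taus]
    return np.vstack(coefs)      # one row (a_0(tau), ..., a_m(tau)) per tau

# Example with simulated data
rng = np.random.default_rng(1)
x = np.abs(rng.standard_normal(600))
r2 = (0.1 + 0.3 * np.roll(x, 1)) * rng.standard_normal(600) ** 2
print(first_stage_qr(r2, x).shape)
```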

QR-GARCH-EVT model

After the residual series is obtained from the QR-GARCH model, $\varepsilon_t$ must be transformed into the standardised residual $Z_t$. Let $\hat\mu_t$ and $\hat\sigma_t$ be the conditional mean and conditional volatility of the return series; then:
$$\left(Z_{t-n+1}, \ldots, Z_t\right) = \left(\frac{r_{t-n+1} - \hat\mu_{t-n+1}}{\hat\sigma_{t-n+1}}, \ldots, \frac{r_t - \hat\mu_t}{\hat\sigma_t}\right) \tag{14}$$

Through the above equation, the VaR of $r_t$ can be obtained from the VaR of $Z_t$: the dynamic VaR of the asset return $r_t$ at confidence level $p$, denoted $VaR_{1-p}^t$, is calculated as:
$$VaR_{1-p}^t = \mu + \sigma_t\, VaR(z)_{1-p} \tag{15}$$
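A minimal sketch of equations (14)-(15), assuming the returns, fitted conditional means and fitted volatilities are aligned arrays; the numerical example is illustrative.

```python
import numpy as np

def standardise(returns, mu_hat, sigma_hat):
    """Equation (14): standardise residuals with the fitted mean and volatility."""
    return (np.asarray(returns) - np.asarray(mu_hat)) / np.asarray(sigma_hat)

def dynamic_var(mu_hat, sigma_hat, var_z):
    """Equation (15): scale the innovation-level VaR back to the return level."""
    return np.asarray(mu_hat) + np.asarray(sigma_hat) * var_z

# Example: zero mean, today's volatility 1.2 and an innovation VaR of -2.1
# give a one-day return VaR of about -2.52.
print(dynamic_var(0.0, 1.2, -2.1))
```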

Here $\mu$ is the expected return, $\sigma_t$ is the volatility estimate for the day, and $VaR(z)_{1-p}$ is the risk value of the standardised residual $Z_t$ at level $p$; equation (15) therefore expresses the return VaR in terms of the $p$-level VaR of the standardised extreme residual series $Z_t$. Solving for it amounts to fitting a POT model to $Z_t$. The key to POT modelling is determining the threshold; typical methods include the mean excess (excess expectation) plot and the Du Mouchel 10% principle [23]. In this paper both methods are used to determine the threshold. The distribution function of the standardised residuals beyond the threshold is:
$$F_u(y) = P(r - u \le y \mid r > u) = \frac{F(u+y) - F(u)}{1 - F(u)} \tag{16}$$

When the threshold is large enough, extreme value theory [24, 25] gives $F_u(y) \approx G_{\xi,\sigma}(y)$:
$$F_u(y) \approx G_{\xi,\sigma}(y) = 1 - \left(1 + \frac{\xi y}{\sigma}\right)^{-1/\xi}, \quad \xi \ne 0 \tag{17}$$

$G_{\xi,\sigma}(y)$ is the generalised Pareto distribution (GPD), where $\sigma$ and $\xi$ are the scale and shape parameters; the larger $\xi$ is, the fatter the tail, and the values of $\sigma$ and $\xi$ can be obtained by maximum likelihood estimation. Under the condition $F_u(y) \approx G_{\xi,\sigma}(y)$, combining equations (16) and (17) gives:
$$F(y) = F(u) + G_{\xi,\sigma}(y-u)\left[1 - F(u)\right] \tag{18}$$

For convenience, let $N$ denote the total number of samples and $n$ the number of samples exceeding the threshold; the threshold distribution function $F(u)$ is then estimated by $(N-n)/N$. Substituting this into equation (18), together with the estimates of $\sigma$ and $\xi$, yields:
$$\hat F(y) = 1 - \frac{n}{N}\left(1 + \hat\xi\,\frac{y-u}{\hat\sigma}\right)^{-1/\hat\xi} \tag{19}$$
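A sketch of the maximum likelihood GPD fit and the tail estimator of equation (19), assuming `z` is the standardised residual series and `u` the chosen threshold. SciPy's `genpareto` uses the same shape/scale parameterisation (shape $c = \xi$), and the location is fixed at 0 because the excesses $y - u$ are fitted directly; the simulated residuals are illustrative.

```python
import numpy as np
from scipy.stats import genpareto

def fit_gpd_tail(z: np.ndarray, u: float):
    """Fit a GPD to the excesses over u and return (xi, sigma, F_hat)."""
    excesses = z[z > u] - u
    xi, _, sigma = genpareto.fit(excesses, floc=0)      # MLE for (xi, sigma)
    N, n = len(z), len(excesses)

    def F_hat(y):
        """Estimated distribution function for y > u, equation (19)."""
        return 1.0 - (n / N) * (1.0 + xi * (y - u) / sigma) ** (-1.0 / xi)

    return xi, sigma, F_hat

# Example with simulated heavy-tailed residuals
rng = np.random.default_rng(2)
z = rng.standard_t(df=4, size=3000)
xi, sigma, F_hat = fit_gpd_tail(z, u=1.5)
print(round(xi, 3), round(sigma, 3), round(F_hat(3.0), 4))
```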

Combining this with equation (6), the risk value at confidence level $p$ is:
$$VaR_{1-p}^t = \hat F^{-1}(p) = u + \frac{\hat\sigma}{\hat\xi}\left\{\left[\frac{N}{n}(1-p)\right]^{-\hat\xi} - 1\right\} \tag{20}$$

For a given confidence level $p$, $VaR(Z)_{1-p}$ is obtained from equation (20); substituting it into equation (15) gives the dynamic VaR to be solved for. The dynamic VaR model based on QR-GARCH-EVT is therefore:
$$\begin{cases} r_t = \sigma_t \varepsilon_t, \quad \varepsilon_t \sim iid(0,1) \\ \sigma_t = \omega + \alpha x_{t-1} + \beta \sigma_{t-1}^2 \\ x_t = \lambda + \delta \sigma_t^2 + \eta_1 \varepsilon_t + \eta_2 \varepsilon_{t-1}^2 + u_t \\ VaR_{1-p}^t = \hat F^{-1}(p) = u + \dfrac{\hat\sigma}{\hat\xi}\left\{\left[\dfrac{N}{n}(1-p)\right]^{-\hat\xi} - 1\right\} \end{cases} \tag{21}$$
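A sketch of the POT quantile expression in equations (20)-(21). The parameters $u$, $\hat\sigma$, $\hat\xi$ would come from the GPD fit; the numerical example reuses the estimates reported later in the paper (u = 1.569, ξ = 0.054, σ = 0.561) with an illustrative, assumed exceedance count n = 277 out of N = 2767 (roughly the 10% rule).

```python
def pot_var(u: float, sigma_hat: float, xi_hat: float,
            N: int, n: int, p: float) -> float:
    """VaR_{1-p} = u + (sigma/xi) * { [ (N/n)(1-p) ]^{-xi} - 1 }, equation (20)."""
    return u + (sigma_hat / xi_hat) * (((N / n) * (1.0 - p)) ** (-xi_hat) - 1.0)

# Innovation-level VaR at the 99% confidence level (illustrative n, N)
print(round(pot_var(1.569, 0.561, 0.054, 2767, 277, 0.99), 3))
```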

Empirical analysis
Source and descriptive statistics of data

To test the model constructed above, this paper selects the HS300 index. Since the HS300 index was launched on April 4th, 2005, for convenience the sample spans April 8th, 2005 through March 19th, 2017, nearly 12 years, giving about 2,767 sample points. The data come from the Tonghuashun software, and the daily return is defined as $X_t = 100(\ln p_t - \ln p_{t-1})$.
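A minimal sketch of the return definition $X_t = 100(\ln p_t - \ln p_{t-1})$. In practice `prices` would be the HS300 daily closing prices exported from the Tonghuashun terminal; the short synthetic series below is only a stand-in.

```python
import numpy as np
import pandas as pd

def log_returns(prices: pd.Series) -> pd.Series:
    """Daily percentage log returns: 100 * (ln p_t - ln p_{t-1})."""
    return 100 * np.log(prices).diff().dropna()

prices = pd.Series([1000.0, 1012.5, 998.7, 1005.3, 1021.0])   # illustrative
print(log_returns(prices).round(4).tolist())
```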

Table 1 shows that the sample series is left-skewed (skewness < 0) and leptokurtic (kurtosis > 3), and the J-B test rejects normality at the 1% significance level. To further examine the distributional characteristics of the sample series, the frequency histogram and Q-Q plot (Fig. 2) were drawn; the upper and lower tails of the HS300 series clearly deviate from the normal distribution, showing fat tails, so the sample series exhibits the typical "peaked and fat-tailed" pattern. In general, the sample series is asymmetrically distributed, with the leptokurtic characteristics typical of financial returns and a certain degree of burstiness and volatility clustering. The Ljung-Box statistic Q(10) shows that at the 1% significance level the HS300 return has no serial correlation, the ARCH-LM test indicates that the HS300 return has an ARCH effect, and the ADF test shows that the HS300 return is stationary, so the QR-GARCH-EVT model can be used.
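A sketch of the battery of tests behind Table 1, assuming `returns` is a 1-D array or Series of daily HS300 log returns in percent; the simulated input at the bottom is illustrative, so its statistics will not match the paper.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.diagnostic import acorr_ljungbox, het_arch
from statsmodels.tsa.stattools import adfuller

def describe_returns(returns):
    r = np.asarray(returns)
    jb_stat, jb_p = stats.jarque_bera(r)                   # normality
    lb = acorr_ljungbox(r, lags=[10], return_df=True)      # serial correlation
    lm_stat, lm_p, _, _ = het_arch(r, nlags=10)            # ARCH effect
    adf_stat, adf_p, *_ = adfuller(r)                      # stationarity
    return {
        "mean": r.mean(), "sd": r.std(ddof=1),
        "skew": stats.skew(r), "kurtosis": stats.kurtosis(r, fisher=False),
        "JB": (jb_stat, jb_p),
        "Q(10)": (lb["lb_stat"].iloc[0], lb["lb_pvalue"].iloc[0]),
        "ARCH-LM(10)": (lm_stat, lm_p),
        "ADF": (adf_stat, adf_p),
    }

rng = np.random.default_rng(0)
print(describe_returns(rng.standard_t(df=5, size=2767)))
```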

Fig. 1

Sequence Diagram of Sample Closing Price and Rate of Return.

Fig. 2

Frequency Chart and Q-Q Plot of the Return Series.

Descriptive Statistics of Sample Data

Sample   Mean     Max      Min       SD       Skewness   Kurtosis   J-B             Q(10)            ARCH-LM(10)       ADF
HS300    0.0200   3.8786   −4.4987   0.8460   −0.4424    6.4561     1419 (0.0000)   17.56 (0.0022)   251.86 (0.0000)   −37.00 (0.0000)

Note: values in parentheses are p-values.

QR-GARCH-EVT model estimation

Given the good properties of the QR-GARCH model described above, it is used to fit the HS300 return series; the results are reported in Table 2.

Table 2 shows that $\alpha + \beta < 1$, indicating that the QR-GARCH model is stable; the overall risk of the HS300 index return is clearly positively correlated with its past volatility, and the return series displays persistent fluctuation. $\alpha$ and $\eta_1$ have opposite signs, indicating a leverage effect in which negative shocks have a greater influence than positive shocks. $\beta$ is 0.637, indicating that volatility in the QR-GARCH model is driven not only by current shocks but also by the previous period's volatility.

The Estimation Results of the QR-GARCH Model of the HS300 Return

Model Parameter       ω                α           β           η1               η2          δ           λ
Estimated Value (p)   −0.125 (0.021)   0.264 (0)   0.637 (0)   −0.053 (0.082)   0.085 (0)   1.025 (0)   0.211 (0.641)

Based on the QR-GARCH estimates, the residual series of the HS300 index return was obtained. After standardisation, tests showed that the standardised residuals were stationary and free of autocorrelation, so an EVT model could be fitted to them.

The model for the standardised residual series was built in three phases: model exploration, model fitting, and model diagnosis. In the exploration phase, the key is to determine the threshold. Popular threshold selection methods include the mean excess plot and the Du Mouchel 10% principle, but there is no unified standard yet. Given how important the threshold is to the model, this paper combines both methods to determine its value.

First, according to the Du Mouchel 10% principle, the calculated threshold for the HS300 series is 1.165. The mean residual life plot of the HS300 standardised residual series (Fig. 3) was also drawn to aid threshold selection; it begins to show a positive slope slightly above 1. Combining this with the value suggested by the Du Mouchel 10% principle, the threshold was finally set at 1.569.
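A sketch of the two threshold-selection tools used here, assuming `z` is the standardised residual series. The Du Mouchel rule simply keeps the largest 10% of observations as exceedances, and the mean-excess values can be plotted against candidate thresholds to look for the onset of a roughly linear trend; the simulated residuals are illustrative.

```python
import numpy as np

def du_mouchel_threshold(z: np.ndarray, frac: float = 0.10) -> float:
    """Threshold such that about `frac` of the sample lies above it."""
    return np.quantile(z, 1.0 - frac)

def mean_excess(z: np.ndarray, thresholds: np.ndarray) -> np.ndarray:
    """Mean excess e(u) = E[z - u | z > u] over a grid of thresholds."""
    return np.array([np.mean(z[z > u] - u) for u in thresholds])

# Example with simulated residuals
rng = np.random.default_rng(3)
z = rng.standard_t(df=4, size=3000)
grid = np.linspace(0.5, 3.0, 26)
print(round(du_mouchel_threshold(z), 3), mean_excess(z, grid)[:3].round(3))
```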

Fig. 3

Mean Residual Life Plot of the Residual Series.

Second, in the model fitting phase, the EVT model was estimated for the standardised residual series at the chosen threshold to obtain the scale parameter σ and the shape parameter ξ; the results are shown in Table 3. The shape parameter ξ is greater than 0, indicating that the tail of the standardised residual series is clearly fat.

POT Model Estimation Results

Parameter   HS300
u           1.569
ξ           0.054
σ           0.561

Finally, in the model testing phase, the fitting diagnostic plots of the standardised residual series were drawn to check the EVT estimates, as shown in Fig. 4. Most points fall on or near the fitted excess distribution and tail distribution curves, and only a few points deviate, which does not affect the fit. Overall, the EVT model fits well.

Fig. 4

HS300 Residual Sequence EVT Model Diagnosis Diagram.

Based on the QR-GARCH-EVT estimates, the risk values were computed according to the risk measurement model constructed above. Fig. 5 shows the VaR values for quantiles between 0.01 and 0.05.
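A short self-contained sketch of how such a quantile grid could be produced: equation (20) is evaluated at tail levels between 0.01 and 0.05 using the POT estimates reported in Table 3 (u = 1.569, ξ = 0.054, σ = 0.561) and an assumed exceedance ratio n/N of 10%. This gives only the innovation-level VaR(z); the dynamic return-level VaR then follows from equation (15).

```python
import numpy as np

u, xi, sigma, ratio = 1.569, 0.054, 0.561, 0.10   # ratio = n/N (assumed)
tail = np.linspace(0.01, 0.05, 5)                 # 1 - p
var_z = u + (sigma / xi) * ((tail / ratio) ** (-xi) - 1)   # equation (20)
print(dict(zip(tail.round(2), var_z.round(3))))
```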

Fig. 5

Risk Values at Quantiles between 0.01 and 0.05.

QR-GARCH-EVT model test

To test the accuracy of the model, this paper uses the back test proposed by Kupiec [26]. The Kupiec test is an unconditional coverage test that checks whether the frequency of VaR exceedances in the sample is statistically close to the chosen confidence level. For example, at a 99% confidence level one exceedance is expected every 100 days on average, while at a 95% confidence level five exceedances are expected over the same interval, and similarly for other confidence levels. First, a binary variable is defined that equals 1 when the VaR forecast fails and 0 otherwise. The values of this variable are recorded during the back test, and the significance of unconditional coverage is examined with a likelihood ratio test whose statistic is:
$$LR_{uc} = -2\ln\left[(1-p)^{n-m}p^{m}\right] + 2\ln\left[\left(1-\frac{m}{n}\right)^{n-m}\left(\frac{m}{n}\right)^{m}\right] \tag{22}$$

Here p is the target failure probability, n is the number of observations tested and m is the number of failures. In theory, at a given confidence level, the fewer failures occur, the more accurate and effective the model is. For comparison, this paper also computes the Kupiec test for the GARCH, GARCH-EVT and QR-GARCH models; the results are reported in Table 4.
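A sketch of the Kupiec unconditional-coverage test, assuming `violations` is a 0/1 array marking the days on which the realised loss exceeded the predicted VaR and `p` is the target failure probability (e.g. 0.05); the example counts are illustrative but chosen close to the 95% QR-GARCH-EVT row of Table 4.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_test(violations, p: float):
    """Return the LR_uc statistic of equation (22) and its chi2(1) p-value."""
    v = np.asarray(violations)
    n, m = len(v), int(v.sum())
    pi = m / n
    log_l0 = (n - m) * np.log(1 - p) + m * np.log(p)       # restricted likelihood
    log_l1 = (n - m) * np.log(1 - pi) + m * np.log(pi)     # unrestricted likelihood
    lr = -2.0 * (log_l0 - log_l1)
    return lr, chi2.sf(lr, df=1)

# Example: 132 violations in 2767 days against a 5% target
v = np.zeros(2767); v[:132] = 1
print(tuple(round(x, 3) for x in kupiec_test(v, 0.05)))
```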

Quantile Risk Value and Test Under Different Conditions

Model           Confidence Level   Failed Days   Failure Rate   LR
GARCH           95%                155           5.60%          2.034
GARCH-EVT       95%                136           4.92%          0.042
QR-GARCH        95%                149           5.38%          0.843
QR-GARCH-EVT    95%                132           4.77%          0.311
GARCH           99%                32            1.16%          0.390
GARCH-EVT       99%                30            1.08%          0.977
QR-GARCH        99%                35            1.26%          0.017
QR-GARCH-EVT    99%                31            1.12%          1.262

Comparing the models in Table 4, at the 95% confidence level the VaR estimated by the GARCH-type model alone differs only slightly but tends to underestimate risk, whereas combining GARCH-type models with EVT clearly improves performance. Compared with the GARCH-EVT model, the QR-GARCH-EVT model is, in terms of failure rate, the more accurate of the two.

At the 99% confidence level, the number of failures for every model is clearly greater than the theoretical level of about 28, so all models underestimate sample risk to some extent, show clear inadaptability, and predict risk less effectively. In terms of model comparison, the QR-GARCH-EVT model still performs better than the other models.

In summary, at the 95% confidence level the QR-GARCH-EVT model constructed in this paper is more accurate and more effective, while at the 99% confidence level it shows some inadaptability. Overall, the QR-GARCH-EVT model performs better than the other models.

Conclusion

Given the attractive property of quantile regression models that no distributional shape or parameters need to be assumed, this paper first used the QR-GARCH model to fit the characteristics of financial asset returns and then, on the basis of the estimated volatility and residuals, introduced the EVT model and constructed an extreme risk measurement model based on QR-GARCH-EVT. The model was tested on HS300 data. The results show that at the 5% significance level the QR-GARCH-EVT model measures the sample risk effectively, while at the 1% level it underestimates the sample risk to some extent; compared with other models, however, the QR-GARCH-EVT model improves the accuracy of risk measurement and enhances its effectiveness overall.

The improved model raises the accuracy of extreme risk estimation somewhat and its adaptability improves somewhat, but this work is only an exploratory study of combining quantile regression with extreme value theory. How to select models that fit the stylised characteristics of financial markets even better is therefore a direction for future research. In addition, the currently popular Du Mouchel 10% principle was applied when determining the threshold, but the academic community has no unified standard for threshold selection, even though the threshold directly affects the accuracy of extreme risk calculation. This is another direction for future research.
