August 7th, 2018, 3:36 am
One method tests the other. Compute a VaR based on 20-day overlapping periods. In addition, compute the VaR using the daily observations and scale it by the square root of time; comparing the two lets you test the mathematical integrity of the model and reinforces (or challenges) the square-root-of-time rule.

As regards time and efficiency: if your PnL vectors have significant non-linearity, then a full revaluation VaR is required for the historical simulation, based on the 1-day and 20-day perturbations. This has to be done at the risk factor level (the model choice, additive or multiplicative, is risk factor dependent). On the other hand, if your PnL is linear, then approximate the PnL with a first-order Taylor expansion and compute the VaR based on that.

In relation to data (very important): if this is an exercise to gain insight into the mechanics of VaR, then the quantitative and qualitative criteria for how reliable the data is are less critical. However, if you are doing this exercise in the real world, then your data must be robust and reliable, otherwise your VaR can be open to many onerous interpretations. If your data is not sufficiently liquid, then you need a separate framework to assess the materiality of the underlying risk factor and to quantify its impact using indirect risk metrics, such as sensitivities, scenarios and stress testing. You then need to think of a longer-term solution, for example a separate add-on on top of VaR, or an add-on to the market risk capital formula (this is part of your RNIV and NMRF framework), etc.
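The first point above can be sketched in a few lines. This is a minimal illustration, not a production model: the PnL series is simulated, the confidence level (99%) and 20-day horizon are assumptions, and in practice the daily PnL would come from your historical simulation vectors.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily PnL series (~5 years of trading days); in practice these
# would be the portfolio's historical-simulation PnL vectors.
daily_pnl = rng.normal(loc=0.0, scale=1.0, size=1250)

def hist_var(pnl, level=0.99):
    """Historical-simulation VaR: the loss at the (1 - level) quantile."""
    return -np.quantile(pnl, 1.0 - level)

# Method 1: 20-day VaR from overlapping 20-day PnL sums.
window = 20
overlapping_pnl = np.convolve(daily_pnl, np.ones(window), mode="valid")
var_overlap = hist_var(overlapping_pnl)

# Method 2: 1-day VaR scaled by the square root of time.
var_scaled = hist_var(daily_pnl) * np.sqrt(window)

print(f"20-day overlapping VaR : {var_overlap:.2f}")
print(f"sqrt-of-time scaled VaR: {var_scaled:.2f}")
```

If the two numbers diverge materially, that tells you something: overlapping observations are autocorrelated, and the square-root rule only holds under i.i.d. assumptions, so the gap itself is the diagnostic.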
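The full-revaluation versus first-order point can also be sketched. Everything here is illustrative: a single bond-like position with assumed duration, convexity and notional stands in for a real pricing function, and the yield shocks are simulated rather than historical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical position: value is a nonlinear function of its yield.
# duration, convexity and notional are made-up numbers for illustration.
def full_reval_pnl(dy, duration=5.0, convexity=30.0, notional=1e6):
    # "Full revaluation": apply the nonlinear pricing function to each shock.
    return notional * (-duration * dy + 0.5 * convexity * dy**2)

def delta_pnl(dy, duration=5.0, notional=1e6):
    # First-order Taylor approximation: PnL ~ sensitivity * shock.
    return notional * (-duration * dy)

# Simulated 1-day yield shocks (stand-in for historical perturbations).
shocks = rng.normal(0.0, 0.001, size=500)

var_full = -np.quantile(full_reval_pnl(shocks), 0.01)
var_delta = -np.quantile(delta_pnl(shocks), 0.01)
print(f"full revaluation VaR : {var_full:.0f}")
print(f"first-order VaR      : {var_delta:.0f}")
```

For small shocks the two agree closely, which is exactly when the Taylor shortcut is defensible; for options or other strongly convex books the gap widens and full revaluation becomes necessary.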