 
User avatar
DavidJN
Topic Author
Posts: 262
Joined: July 14th, 2002, 3:00 am

Model vetting

September 7th, 2006, 5:17 pm

Recently I was asked by someone to look at a model vetting report produced by a bank department. Apart from the fact that the report appeared two years after the model in question was put into production, the other interesting thing about it was that I could not discern any concrete evidence of vetting - no examples, no test numbers, etc. This got me thinking about what constitutes good model vetting. I have done such work before on a consulting basis and, drawing on my own experience, I have come up with the following short list.

First, any model needs to be, for lack of a better word, stressed to see if it will come up with pathological results under plausible and even not-so-plausible inputs. Consider a simple option model, for example - if we reduce volatility to zero, does the result converge to that of a forward? I guess what I mean here is whether the corner solutions or boundary conditions make sense when we tweak the inputs. A very simple but especially useful test is to see what happens on expiry day. I've seen many option models return only intrinsic value on the morning of expiry day, which is not particularly helpful for at-the-money options on high-volatility underlyings. Another dimension of this kind of stress testing is whether the model is robust enough not to crash and hold things up if it is somehow passed garbage. Mind you, some people might think that a crash is a good thing when a model is passed garbage as inputs!

Second, a decent vetting report should very clearly identify what was used to create the benchmark test numbers and what data and scenarios were investigated. Saying "it agrees with our own calculations" is surely not good enough. In a sense I am asking "Quis custodiet ipsos custodes?" - who guards the guardians? Why should they be trusted? Just because they say so?

And finally, some discussion of comparative results, differences and their materiality is warranted.

Ideas? Comments? Your experiences?
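For illustration, a minimal sketch of the two limiting-case checks just described, applied to a plain Black-Scholes call (the function names and parameter choices are illustrative, not taken from any particular production library):

```python
import math
from statistics import NormalDist

N = NormalDist().cdf

def bs_call(S, K, r, sigma, T):
    """Plain Black-Scholes call on a non-dividend-paying asset."""
    if sigma <= 0.0 or T <= 0.0:
        # Degenerate limit: discounted intrinsic value on the forward.
        return math.exp(-r * T) * max(S * math.exp(r * T) - K, 0.0)
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

S, K, r, T = 100.0, 100.0, 0.05, 1.0

# Check 1: as vol -> 0 the price should converge to that of a forward,
# i.e. exp(-rT) * max(F - K, 0).
limit = math.exp(-r * T) * max(S * math.exp(r * T) - K, 0.0)
for sigma in (0.20, 0.05, 0.01, 1e-6):
    print(f"sigma={sigma:<8} price={bs_call(S, K, r, sigma, T):.6f} (limit {limit:.6f})")

# Check 2: on expiry morning an ATM option on a high-vol underlying
# should still show meaningful time value, not bare intrinsic (zero here).
four_hours = 4.0 / (24.0 * 365.0)
print("ATM, 4h to expiry, 80% vol:", bs_call(S, K, r, 0.80, four_hours))
```

A model that prints zero for the second check is exhibiting exactly the expiry-morning failure described above.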
 
User avatar
nparaschos
Posts: 2
Joined: July 14th, 2002, 3:00 am

Model vetting

September 7th, 2006, 10:20 pm

Umm... how about using some other methodology to value the same scenario as the model under vetting, i.e. killing a cockroach with a steamroller, or Monte Carlo simulation as it's more commonly known.
 
User avatar
DavidJN
Topic Author
Posts: 262
Joined: July 14th, 2002, 3:00 am

Model vetting

September 8th, 2006, 12:09 pm

Hi N, greetings from Toronto! May I assume you are referring to using Monte Carlo simulation to generate sets of inputs (e.g. yield curves, vol surfaces) passed to the model, or are you talking about using MC as a benchmark valuation model for comparison? The appeal of the latter idea is that one can model pretty much anything using Monte Carlo techniques.
 
User avatar
nparaschos
Posts: 2
Joined: July 14th, 2002, 3:00 am

Model vetting

September 8th, 2006, 1:13 pm

Greetings from Boston, Mr. D.

Well, probably a bit of both. For instance, one could devise a Monte Carlo procedure to value path-dependent options by simulating the asset price over the specified time horizon, averaging the payoffs over the simulations, and taking the option price as the PV of the average payoff. Then one can compare the results obtained against the model under vetting. I suppose you could also use simulations to model the inputs passed to the model; this could be useful when vetting term structure models, or models of forward and spot prices in the energy sector.

As an alternative to Monte Carlo for vetting purposes, one could also use finite difference methods, as favoured by engineering grads.

What are your views?
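A minimal sketch of the benchmark procedure just described, priced here for an arithmetic-average (Asian) call under Black-Scholes dynamics; all names and parameters are illustrative:

```python
import numpy as np

def mc_asian_call(S0, K, r, sigma, T, n_steps=252, n_paths=50_000, seed=42):
    """Benchmark an arithmetic-average Asian call: simulate GBM paths,
    average the payoffs over the simulations, discount back."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    paths = S0 * np.exp(log_paths)
    payoffs = np.maximum(paths.mean(axis=1) - K, 0.0)  # average over monitoring dates
    price = np.exp(-r * T) * payoffs.mean()
    stderr = np.exp(-r * T) * payoffs.std(ddof=1) / np.sqrt(n_paths)
    return price, stderr

price, se = mc_asian_call(S0=100, K=100, r=0.05, sigma=0.20, T=1.0)
print(f"MC benchmark: {price:.4f} +/- {1.96 * se:.4f} (95% CI)")
```

The model under vetting should then price the same scenario inside (or very near) the benchmark's confidence band, and the vetting report records the variance either way.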
 
User avatar
DavidJN
Topic Author
Posts: 262
Joined: July 14th, 2002, 3:00 am

Model vetting

September 8th, 2006, 2:27 pm

Fair enough N, there are indeed a number of ways to skin a cat. I'm sure you would agree that a decent vetting report should include a description of what a proposed model was vetted against (MC, FD, whatever), what market data was used, what scenarios were investigated, a listing of comparative valuation outputs, an analysis of discrepancies and a discussion of materiality, etc.

In your experience, what has been the overall quality of vetting reports you have encountered, and have you ever produced any yourself? Were there standardized templates for vetting reports where you have worked, and did they include anything over and above the things previously mentioned in this thread? And finally (sorry for all the questions), what is your impression of the vetting report I described at the start of this thread? Would you send it back to the authors and say "try again, and this time prove you've actually vetted the model"?
 
User avatar
nparaschos
Posts: 2
Joined: July 14th, 2002, 3:00 am

Model vetting

September 8th, 2006, 2:57 pm

Last thing first - I would return the report and politely ask the authors to go back to the drawing board. And this time they should include a complete description of the methodology used to vet the model in question, as per your description.

In my previous life up north, my vetting reports included detailed descriptions of the model or models (and the underlying theory and assumptions) used for vetting purposes. I included a list of the scenarios used, from normal to extreme or stress events, the data used (i.e. how many years' worth of data and from where), and the valuation outputs for these scenarios from the different models. The results should be detailed enough to show dollar and percentage variances. A discussion of the variances then followed, including their materiality and an attempt to determine their causes. Other tests I have included in vetting reports examined how the model's sensitivities (exotic options and their Greeks) behaved under various scenarios - for example, how does the delta of a barrier option behave close to the barrier, or close to expiry, under the models used for vetting versus the model in question?

I don't recall having standardised templates. I do remember creating numerous vetting reports that had all the information mentioned above, though. Strict templates could limit the originality and imagination of the vetting team; a more generalized template using the above guidelines would serve as a good start, in my opinion.
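To make the barrier-delta test concrete, here is a minimal bump-and-revalue sketch for a discretely monitored up-and-out call under Black-Scholes dynamics, using a fixed seed (common random numbers) so the finite-difference delta is not swamped by Monte Carlo noise; all parameters are illustrative:

```python
import numpy as np

def uo_call_mc(S0, K, H, r, sigma, T, n_steps=252, n_paths=50_000, seed=7):
    """MC price of a discretely monitored up-and-out call under GBM."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    paths = S0 * np.exp(log_paths)
    alive = paths.max(axis=1) < H  # knocked out if the barrier is touched
    payoff = np.where(alive, np.maximum(paths[:, -1] - K, 0.0), 0.0)
    return np.exp(-r * T) * payoff.mean()

def fd_delta(S0, bump=0.5, **kw):
    """Central-difference delta; the fixed seed gives common random numbers,
    and a fairly large bump keeps the estimate stable across the knock-out."""
    return (uo_call_mc(S0 + bump, **kw) - uo_call_mc(S0 - bump, **kw)) / (2 * bump)

K, H, r, sigma, T = 100.0, 120.0, 0.05, 0.25, 1.0
for S0 in (100.0, 110.0, 115.0, 118.0, 119.5):
    print(f"spot={S0:6.1f}  delta={fd_delta(S0, K=K, H=H, r=r, sigma=sigma, T=T):+.3f}")
```

Expect the delta to turn sharply negative as spot approaches the barrier; the vetting question is whether the production model reproduces that shape, especially close to expiry.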
 
User avatar
RedAlert
Posts: 2
Joined: April 11th, 2002, 10:54 am

Model vetting

September 8th, 2006, 4:06 pm

David,

I guess your question really boils down to what tests should be conducted and documented as part of a model validation. My list would read as follows:

1. Start with a description of the product and the technical framework used to model/price it. This helps ensure everyone is on the same page, in that the model developers, traders etc. know that what has been implemented corresponds to what they require.

2. Standard base-case tests. Set up some standard base cases and compare the front-office model against an independent implementation - this could be a rough-and-ready implementation such as a Monte Carlo framework, a comparison against an external vendor model, or something else within the organisation (a sketch of such a comparison follows below).

3. Check that the model behaves as expected in the limiting cases - for example, as volatility -> 0 or as rates drop.

4. Stress test the model: examine the behaviour (and if possible compare to an independent model) in different rate and volatility environments, using different currencies or market data sets.

5. Examine the limitations of the model and document where you believe its use could be inappropriate. For example, a limitation may be that the model doesn't incorporate the volatility smile - this is a big problem and should be flagged so that users are aware of it. If the misvaluation can be quantified, so much the better.

6. Look at issues such as calibrating the model and determining the unobservable parameters. As everyone knows, any model is only as good as the inputs it is given!

7. Conduct a lifecycle test: examine how the model performs over time, on fixing dates, payment dates, expiry dates etc.

I think this is a reasonable starting point.

F.
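To make items 2-4 concrete, a minimal sketch of such a comparison: closed-form Black-Scholes stands in for the front-office model and a Cox-Ross-Rubinstein binomial tree serves as the independent implementation, revalued over a small scenario grid. The tolerance and all parameters are illustrative; in practice the materiality threshold is set per product.

```python
import math
import itertools
from statistics import NormalDist

N = NormalDist().cdf

def fo_call(S, K, r, sigma, T):
    """Stand-in 'front-office' model: closed-form Black-Scholes call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def benchmark_call(S, K, r, sigma, T, n=500):
    """Independent benchmark: Cox-Ross-Rubinstein binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)
    disc = math.exp(-r * dt)
    values = [max(S * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for _ in range(n):  # backward induction to the root node
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

K, T, tol = 100.0, 1.0, 0.01
for S, sigma, r in itertools.product((80.0, 100.0, 120.0), (0.05, 0.20, 0.60), (0.00, 0.05)):
    a, b = fo_call(S, K, r, sigma, T), benchmark_call(S, K, r, sigma, T)
    flag = "  <-- investigate" if abs(a - b) > tol else ""
    print(f"S={S:5.1f} vol={sigma:.2f} r={r:.2f}: FO={a:.4f} bench={b:.4f} diff={a - b:+.4f}{flag}")
```

Each flagged line then feeds the discrepancy and materiality analysis David asked for at the start of the thread.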
 
User avatar
KackToodles
Posts: 0
Joined: August 28th, 2005, 10:46 pm

Model vetting

September 9th, 2006, 5:06 am

What is the difference between model vetting and data mining?
 
User avatar
DavidJN
Topic Author
Posts: 262
Joined: July 14th, 2002, 3:00 am

Model vetting

September 10th, 2006, 7:24 pm

Useful comments, RedAlert. Thanks for your input.
 
User avatar
eredhuin
Posts: 3
Joined: July 14th, 2002, 3:00 am

Model vetting

September 20th, 2006, 1:19 am

I used to vet. In general it's not possible to test all states that could possibly give a model trouble. Think of an option price as a map from R^n to R^1; naturally you can't test all possible inputs. A good vetting report will include a market assumptions section (with the ranges employed). Outside of the tested combinations, you assume things are still OK - the report proves nothing more than "I looked here, and it was OK". For inspiration, think of the Pentium's FDIV bug. I bet you didn't know that every new processor ships with a 40-page list of errata nowadays. And those are just the known problems, on release date.

David's comments about testing corner cases are a good metric. I did this when I vetted and usually found tons of problems, and generally the line/the vendor never appreciated the feedback. If you've got the barrier option formulas from Hull's book in your library, try typing in 1% or 0.1% as a vol and see what happens. (It's the "exp" statement that causes the grief, btw - exp(+738) returns inf or nan or something in IEEE arithmetic.) Who's to blame? The coder? The user?

I wrote up a "numerical analysis for dummies" list at a prior workplace. Every "/" and "exp" statement should be considered potentially dangerous. Even subtraction can be dangerous, given the potential for catastrophic cancellation. Writing defensive code makes it ugly, but if you want to write production code you have to program defensively. BTW, I am firmly in the school that says the program should complain like hell and/or blow up when given illegal inputs rather than fall back on some "default behaviour". Banks are crufty places, and silently broken things have a habit of staying so for years on end. In my own code I usually have the "rotating knives / no safety" layer of library code, and overlay that with a whack of validation code when interfacing to, say, Excel.
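To see that failure mode directly, here is a minimal reproduction of the overflow pattern (the power term below has the shape of the (H/S)^(2*lambda) factors in the textbook barrier formulas, where lambda carries a 1/sigma^2 term; the full pricing formulas are not reproduced here):

```python
import numpy as np

# Barrier formulas contain power terms like (H/S)**(2*lam) with
# lam = (r - q + 0.5*sigma**2) / sigma**2, so lam blows up as sigma -> 0.
S, H, r, q = 100.0, 120.0, 0.05, 0.0

for sigma in (0.20, 0.05, 0.01, 0.001):
    lam = (r - q + 0.5 * sigma**2) / sigma**2
    with np.errstate(over="warn"):
        term = np.float64(H / S) ** (2.0 * lam)  # IEEE arithmetic: overflows to inf
    print(f"sigma={sigma:<6} lambda={lam:10.1f} (H/S)^(2*lambda) = {term}")

# One defensive style, per the "complain like hell" school: validate at
# the interface and fail loudly, rather than let inf/nan leak silently
# into a risk report. The bounds here are illustrative policy choices.
def checked_vol(sigma, lo=0.005, hi=5.0):
    if not (lo <= sigma <= hi):
        raise ValueError(f"vol {sigma} outside supported range [{lo}, {hi}]")
    return sigma
```

At 1% vol the power term is already around 1e79; at 0.1% it overflows to inf, which then propagates through every downstream number.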