1. Keith Shields,
Chief Analytics Officer – Magnify
SVP, CCO – Loan Science
The Ongoing Evolution of the Model Validation "Function":
Naiveté, Confusion, Control, and Regulation
2. Model Validation: Opening Remarks
Defining model validation loosely… the set of ongoing analyses and practices used to prove to internal and external audiences that your model is good (and better than immediately available alternatives), and that it contributes to decisions and actions that make the business more profitable.
Very little analysis to be shared - this discussion is less about diagnostics and more
about philosophy and organizational structure.
The examples offered are all related to Marketing or Financial Services – apologies to
those working in engineering.
Analytics professionals should pay careful attention to how their clients (internal or
external) view the importance of model validation.
If the business cares about a model being right, then they are likely using it. These are
the clients that analytics-driven organizations are built around.
The title: “Naiveté, Confusion, Control, and Regulation”, in some sense, is itself an
evolutionary scale.
3. Model Validation: Observations
Huge gap between:
What it takes for an analyst to be satisfied a model is good and
What it takes for the business to be satisfied a model is good and
What it takes for regulatory entities to be satisfied a model is good
Given the above bullets, conversations involving relevant parties about
model validation often become intractable. Example:
Analyst: “The model-derived credit loss forecast was only off 8% last
quarter.”
Client: “The corporate standard is 5% error.”
Analyst: “But the forecast derived from the old method was off 18%”.
Client: “We’re comfortable with the old method”.
4. Model Validation: More Observations
The conversation on the previous slide goes even worse in the case of models
where the dependent variable is binary.
In these cases we (analysts) like to see improvements in the C-stat, Gini coefficient, K-S statistic, and other non-parametric measures of how well the model distinguishes between the 1's and 0's (a sketch of these metrics appears at the end of this slide).
For business acceptance these improvements have to be stated in terms of reduced
credit losses, incremental vehicle sales, etc…
Translating model validation results into bottom-line business results isn't always easy, and it is sometimes treacherous.
Assumptions about exogenous and endogenous conditions have to be reasonably
accurate.
From what we've seen, the biggest danger to model validation results holding up in production is the "decision strategies" around the model.
Models aren’t strategies. Strategies aren’t models.
“Do I have to tell you that too?” – Kevin Cooper, FMCC
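For reference, a minimal sketch (not part of the original deck; toy data, assuming arrays of actual 0/1 outcomes and model-predicted probabilities are at hand) of how these discrimination metrics are typically computed with scikit-learn and SciPy:

```python
# Minimal sketch: C-stat (AUC), Gini, and K-S for a binary-outcome model.
# Assumes actual outcomes (0/1) and predicted probabilities; values here are toy data.
import numpy as np
from scipy.stats import ks_2samp
from sklearn.metrics import roc_auc_score

y_true = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 0])
y_score = np.array([0.10, 0.22, 0.81, 0.35, 0.68, 0.90, 0.41, 0.62, 0.18, 0.12])

c_stat = roc_auc_score(y_true, y_score)   # C-stat / AUC: how often a random "bad" outscores a random "good"
gini = 2 * c_stat - 1                     # Gini is a simple rescaling of the C-stat
ks = ks_2samp(y_score[y_true == 1], y_score[y_true == 0]).statistic  # max separation of the score CDFs
print(f"C-stat={c_stat:.3f}  Gini={gini:.3f}  K-S={ks:.3f}")
```

Because the Gini is just a rescaling of the C-stat, the two always move together; the K-S can tell a slightly different story when the separation is concentrated in one part of the score range.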
5. Model Validation: The Evolutionary Scale
So much of our ability to prove models work depends on where in my contrived
evolutionary scale your organization lies.
Naiveté -> Confusion -> Control -> Regulation
In hopes of creating a more tangible understanding, we offer the following
“profiles” (many thanks to Nicole Detterman, Manager of Analytic
Solutions, Magnify):
Naiveté: statistical models are not part of business decisions or processes; often the assumption is that business heuristics are better
Confusion: general understanding and modest use of models; sometimes confuse
models with strategies (models are often scapegoated for bad strategies); model
validation is sometimes confused with portfolio management
Control: understanding of both models and strategy; organizational roles clearly defined; general understanding of the metrics/analyses that point to model problems vs. strategy problems vs. portfolio problems
Regulation: banks fully endorse model validation and are willing to spend millions on
proving they are following every process to ensure their models work, mainly because
they are scared s***less of the OCC (Office of the Comptroller of the Currency) and
CFPB (Consumer Financial Protection Bureau)
6. Model Validation: The Evolutionary Scale…joking and cynicism aside
Like all profiles, what we offered on the previous slide is perhaps easy to digest but
doesn’t help at all with actions or next steps. So we offer the following:
Where there is a lack of experience with models:
Focus on the “exactness” of your validation
Offer micro-level detail
Decision trees can be a good validation tool (see the sketch at the end of this slide)
Where there is a confusion around models:
Always offer two sets of ongoing validation – one for models, one for champion /
challenger strategies
Stress the importance of business domain knowledge, and be willing to “call the cops on
yourself”
Where there is control:
As above, offer two sets of ongoing validation reports
Offer full “methodological transparency”
Show the rigor required in a regulatory environment
Where there is regulation:
Understand the OCC document on model risk management:
http://www.occ.treas.gov/news-issuances/bulletins/2011/bulletin-2011-12a.pdf
Further understand that model validation shops are really model development shops
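Returning to the "lack of experience" case above (the bullet that calls out decision trees): a minimal sketch on synthetic data, with hypothetical variable names, of how a shallow tree can serve as an "exact", micro-level validation aid by surfacing concrete score cut-points and the mix of goods and bads in each resulting group:

```python
# Minimal sketch (synthetic data): a shallow decision tree as an "exact" validation aid.
# The printed rules show concrete score cut-points and the counts of goods/bads per group.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
score = rng.integers(400, 901, size=5000)                         # hypothetical model score
default = (rng.random(5000) < (900 - score) / 1500).astype(int)   # toy default process

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=200)
tree.fit(score.reshape(-1, 1), default)
print(export_text(tree, feature_names=["score"], show_weights=True))
```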
7. Model Validation: Validating Models (Lack of Familiarity)
Example:
Small business lender with a credit scorecard developed through a subjective Bayesian methodology (thank
you Chas Gastineau for inventing this term)
A model developed through logistic regression improves the C-stat from .56 to .73; projected business
impact is that contract originations will double at constant losses
The new model doesn’t contain any of the variables the old one did; validations involving c-stat, ROC
curve, etc… do little to assuage fears; projected business impact is met with skepticism
Be simple and “exact” with the validation:
1,000 contracts scored between 400-500 on your SB Model
150 of them defaulted (that’s 15%)
200 of them scored between 701-900 on the new model. 10 of them defaulted (5%). More below…
SB SCORE   CONTRACTS   DEFAULTS   DEFAULT RATE   |   NEW SCORE   CONTRACTS   DEFAULTS   DEFAULT RATE
400-500    1,000       150        15.0%          |   400-500     600         120        20.0%
                                                 |   501-700     200         20         10.0%
                                                 |   701-900     200         10         5.0%
501-700    1,000       100        10.0%          |   400-500     250         45         18.0%
                                                 |   501-700     500         45         9.0%
                                                 |   701-900     250         10         4.0%
701-900    1,000       50         5.0%           |   400-500     100         17         17.0%
                                                 |   501-700     250         17         6.8%
                                                 |   701-900     650         16         2.5%
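A minimal sketch (hypothetical column names, toy records) of how a side-by-side cross-tab like the one above can be produced from contract-level data:

```python
# Minimal sketch (hypothetical column names): default rates by old ("SB") score band,
# broken out by the new model's score band, as in the table above.
import pandas as pd

# one row per contract: old band, new band, default flag (toy records)
df = pd.DataFrame({
    "sb_band":   ["400-500", "400-500", "400-500", "501-700", "501-700", "701-900"],
    "new_band":  ["400-500", "501-700", "701-900", "501-700", "701-900", "701-900"],
    "defaulted": [1, 0, 0, 1, 0, 0],
})

summary = (
    df.groupby(["sb_band", "new_band"])["defaulted"]
      .agg(contracts="size", defaults="sum")
      .assign(default_rate=lambda g: g["defaults"] / g["contracts"])
)
print(summary)
```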
8. Model Validation: Validating Models (Confusion)
Example:
Solicitations for 10% off domestic money transfer are mailed to the top 50% of customers
most likely to use this service. Top 50% is determined by a propensity model.
Response rates are the same for test and control groups.
The determination is made to redevelop the model.
Be willing to call the cops on yourself; set up triggers to determine when the model is
bad. Those triggers must of course be independent of strategies.
Model Problem: increments good, ranking bad
Propensity Decile   Test Response   Control Response   Increment
1                   5.54%           4.82%              0.72%
2                   5.40%           4.70%              0.70%
3                   5.31%           4.62%              0.69%
4                   5.03%           4.37%              0.66%
5                   4.59%           3.99%              0.60%
6                   4.34%           3.77%              0.57%
7                   5.05%           4.39%              0.66%
8                   3.86%           3.36%              0.50%
9                   4.63%           4.03%              0.60%
10                  4.25%           3.69%              0.55%
Above cutoff (deciles 1-5): response 5.17%, increment 0.67%
Below cutoff (deciles 6-10): response 4.43%, increment 0.58%

Strategy Problem: ranking good, increments bad
Propensity Decile   Test Response   Control Response   Increment
1                   10.86%          10.52%             0.34%
2                   9.12%           9.26%              -0.14%
3                   7.11%           7.10%              0.01%
4                   5.86%           5.97%              -0.11%
5                   4.88%           5.00%              -0.12%
6                   4.39%           4.63%              -0.24%
7                   2.79%           2.71%              0.08%
8                   1.54%           1.55%              -0.01%
9                   1.35%           1.34%              0.01%
10                  0.55%           0.58%              -0.03%
Above cutoff (deciles 1-5): response 7.56%, increment -0.01%
Below cutoff (deciles 6-10): response 2.12%, increment 0.01%
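A minimal sketch (hypothetical column names) of an ongoing trigger report along these lines, keeping the model diagnostic (ranking across deciles) separate from the strategy diagnostic (test-versus-control increments):

```python
# Minimal sketch (hypothetical column names): a trigger report that keeps the model
# diagnostic (ranking across deciles) separate from the strategy diagnostic (increments).
import pandas as pd

def decile_report(df: pd.DataFrame, cutoff_decile: int = 5) -> pd.DataFrame:
    """Expects columns: decile (1 = highest propensity), group ('test'/'control'), responded (0/1)."""
    rates = df.pivot_table(index="decile", columns="group", values="responded", aggfunc="mean")
    rates["increment"] = rates["test"] - rates["control"]
    above = rates.loc[rates.index <= cutoff_decile]
    below = rates.loc[rates.index > cutoff_decile]
    print(f"Above cutoff: response {above['test'].mean():.2%}, increment {above['increment'].mean():.2%}")
    print(f"Below cutoff: response {below['test'].mean():.2%}, increment {below['increment'].mean():.2%}")
    # Healthy increments but flat response across deciles -> model problem (redevelop the model).
    # Well-ranked deciles but near-zero increments everywhere -> strategy problem (rethink the offer).
    return rates
```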
9. Model Validation: Validating Models (Control)
Example:
An auto lender sees that portfolio performance vis-à-vis model expectations is deteriorating in its Western region (dominated by California).
Region-specific models are debated but it is determined through validation reports that the relationship
between revolving utilization and default probability has become much weaker since the 2008 crisis.
Revolving utilization is one of the most heavily-weighted variables in the scorecard.
The same set of validation reports indicates the relationship between mortgage utilization and default
probability has become stronger.
The variables are "swapped" and a new model (scorecard) fixes the problem in the Western region, because California customers generally had higher mortgage utilizations, which the old model did not penalize. The C-stat increased by 0.3 on a validation sample.
[Chart: "Swap out" (revolving utilization) vs. "Swap in" (mortgage utilization)]
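A minimal sketch (hypothetical DataFrame and column names) of how such a swap can be tested by refitting the scorecard with the candidate variable and comparing C-stats on a holdout validation sample:

```python
# Minimal sketch (hypothetical DataFrame and column names): swap a weakened variable
# (revolving utilization) for a strengthened one (mortgage utilization) and compare
# C-stats on a holdout validation sample.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def validation_c_stat(train: pd.DataFrame, valid: pd.DataFrame, features: list) -> float:
    model = LogisticRegression(max_iter=1000).fit(train[features], train["default"])
    return roc_auc_score(valid["default"], model.predict_proba(valid[features])[:, 1])

swap_out = ["revolving_utilization", "dti", "months_on_book"]  # current scorecard spec
swap_in  = ["mortgage_utilization", "dti", "months_on_book"]   # candidate replacement spec
# Assuming Western-region train/valid DataFrames exist:
# print(validation_c_stat(train, valid, swap_out), validation_c_stat(train, valid, swap_in))
```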
10. Model Validation: Validating Models (Regulation)
Regulated model validation is just controlled model validation with oversight.
“Building models is fun. Our jobs should be fun. It’s only our kind of people that could make our jobs not
fun”. -- Andy Dolan, Director of Analytics, Loan Science
In the example on the previous slide, the model validation function is acting very much like a
model development function. This is because one of the primary purposes of the model validation
“team” is to constantly challenge the specification and performance of the existing model. The
importance of this role is spelled out in the OCC document on Model Risk Management:
“Model risk management begins with robust model development, implementation, and use. Another
essential element is a sound model validation process.”
“Model validation is the set of processes and activities intended to verify that models are performing as
expected, in line with their design objectives and business uses. Effective validation helps ensure that
models are sound.”
“All model components, including input, processing, and reporting, should be subject to validation;
this applies equally to models developed in-house and to those purchased from or developed by vendors
or consultants.”
“Validation involves a degree of independence from model development and use. Generally, validation
should be done by people who are not responsible for development or use and do not have a stake in
whether a model is determined to be valid.”
“Staff doing validation should have the requisite knowledge, skills, and expertise. A high level of
technical expertise may be needed because of the complexity of many models, both in structure and in
application.”
11. Model Validation: Why Analytics is usually highly evolved in a regulated environment…
Many of the recommendations in the OCC document imply that effective "model risk management" is dependent upon trained Analytic professionals having an appropriate amount of influence…with support from top management…
States clearly that model risk management (and thus validation) should be part of what a bank is and does:
“Model risk should be managed like other types of risk. Banks should identify the sources of risk and assess the
magnitude.”
Recommends some organization and structure around validation:
“A guiding principle for managing model risk is ‘effective challenge’ of models, that is, critical analysis by objective,
informed parties who can identify model limitations and assumptions and produce appropriate changes. Effective
challenge depends on a combination of incentives, competence, and influence.”
Recommends influence for that organization:
"Influence comes from a combination of explicit authority, stature within the organization, and commitment and support from higher levels of management."
Recommends that the organization scale with the use and importance of models:
“Where models and model output have a material impact on business decisions, including decisions related to risk
management and capital and liquidity planning, and where model failure would have a particularly harmful impact
on a bank’s financial condition, a bank’s model risk management framework should be more extensive and
rigorous.”
12. Model Validation: Questions?
Thank you for your time. For inquiries about Model Validation or any other services offered through Magnify or its partners, please contact any of the following:
• Keith Shields: kshields@magnifyas.com, kshields@loanscience.com
• Susan Arnot: sarnot@magnifyas.com
• Mike Lyons: mlyons@magnifyas.com
• Nicole Detterman: ndetterman@magnifyas.com
• Sameer Desai: sdesai@marketingassociates.com