Is your risk policy getting that much-needed proactive health check?
Sure, your model is working well, or at least well enough, you say. And with all those upheavals making cash scarce, the old saw that “if it ain’t broke, don’t fix it” holds sway. Why spend scarce dollars validating, and possibly refining, a perfectly predictive model?
It can be a tough sell, we know. But the reality is that behaviors are constantly shifting. People change their behavior in response to changing circumstances and environments, and you have to move with them to give your company the best risk management guidance.
For example, in the past, people always paid the mortgage first. That was sacrosanct for generations. What do they pay first now? Bankcards, their lifeblood in hard times. Further heresy: the stigma of walking away from an underwater home is gone, and the ranks of ex-homeowners keep growing. A total erosion of an almost-sacred American value in the blink of an eye.
Macro-level changes like these are relatively easy to see now, but wouldn’t it have been great to have had a metaphorical tripwire out there a year ago? And who knows what lesser, more subtle changes are occurring? They are occurring, and they are affecting your risk analysis model.
You need to have a way to find these micro changes before they bite you.
Prudent risk management requires frequent validation that your policies and processes are still working as intended, and advanced predictive analytics make that validation practical. For example, suppose you have a policy of sending a letter to a group of consumers reminding them to pay their mortgage. If that process traditionally results in x% success, wouldn’t it be prudent, in light of changing behaviors, to validate whether the policy is still effective?
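As a rough sketch of what such a check might look like, the code below tests whether this quarter’s success rate still matches a historical baseline, using a simple normal-approximation z-test. All numbers (the 12% baseline, the 10,000 letters, the 1,040 responses) are hypothetical illustrations, not figures from this article.

```python
import math

def policy_still_effective(successes, n, baseline_rate, alpha=0.05):
    """Two-sided z-test: does the observed success rate differ from the
    historical baseline? Uses the normal approximation to the binomial."""
    p_hat = successes / n
    se = math.sqrt(baseline_rate * (1 - baseline_rate) / n)
    z = (p_hat - baseline_rate) / se
    # Two-sided p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_hat, z, p_value

# Hypothetical example: the letter historically converted at 12%;
# this quarter, 1,040 of 10,000 recipients paid on time.
rate, z, p = policy_still_effective(1040, 10_000, 0.12)
# A small p-value here would flag that the policy's effectiveness
# has drifted from its historical baseline.
```

A tracking process would simply rerun this comparison each period, so drift shows up as soon as it becomes statistically detectable.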
To us, tracking the effectiveness of a policy over time seems clearly in order. Ideally, those efforts lead to refinements that increase predictive power; failing to validate and refine your models risks significant negative consequences. We believe validating, tracking, and refining are the essential parts of any advanced analytical program, with quick refinement being the most important.
Fortunately, there are ways to go about it that won’t break your budget.
For example, we can help you test ideas against small sub-population samples of your portfolio: the equivalent of a medical trial in which some receive the treatment and some receive a placebo. This approach lets you keep your control group intact while trying out new strategies at minimal cost.
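A minimal sketch of that champion/challenger split is below. The account IDs, the 5% challenger fraction, and the fixed seed are illustrative assumptions; the point is simply that a small random sample receives the new strategy while the intact majority remains the control group.

```python
import random

def split_champion_challenger(account_ids, challenger_frac=0.05, seed=42):
    """Randomly carve a small challenger sample out of the portfolio,
    leaving the champion (control) group intact for comparison."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    ids = list(account_ids)
    rng.shuffle(ids)
    cut = int(len(ids) * challenger_frac)
    # challenger gets the new strategy; champion keeps the current one
    return ids[cut:], ids[:cut]

champion, challenger = split_champion_challenger(range(10_000))
```

After the test period, the two groups’ outcomes can be compared with the same kind of rate test described above, and a winning challenger strategy can then be rolled out portfolio-wide.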
When you hit a winner, flash updates can boost your model’s effectiveness almost instantly. No reason to reinvent the wheel.
A well-monitored portfolio that reflects current business intelligence delivers a better result.
If you want to talk to an Equifax specialist about validating your models, please send us an e-mail. If you are interested in learning more about Equifax technologies and analytical services, please sign up for our monthly newsletter. It summarizes the new articles in our blog.
This article was contributed by Tom Aliff.
Tom Aliff is the Sr. Director of Analytics at Equifax. He has worked in statistical modeling for seven years, with experience serving financial service providers, and holds bachelor’s and master’s degrees in statistics from Purdue University.