
Analyzing the Analytics

March 6, 2017   |   6:01 PM

by Yasmeen Ahmad, Teradata

So… the global financial meltdown.

With all of our sophisticated analytics and modelling, shouldn’t we have been able to avoid the crisis?

The license to print money – revoked

The financial industry had lauded the analytical model known as the Gaussian copula function as a breakthrough in the modelling of complex risk, and it fuelled unprecedented growth. Then, suddenly, in 2008 the financial system collapsed, creating losses and disruption on a scale that, even today, is hard to comprehend.

Was it really that sudden? Why didn’t the analytics see it coming?

The truth of the matter is that early-warning signs were ignored. Worse still, signals were not monitored and analytics were left unchecked for more than a decade while markets slow-marched towards the abyss.

How do we learn from this catastrophe?

By automating model execution with measurement and monitoring in place. By ensuring that slow-changing conditions are identified and acted upon. By being proactive.

And talking of proactivity, which would you rather have: a team of innovators or a team of mechanics? Mechanics respond to dashboard lights with diagnostics and repair programs. R&D teams, by contrast, innovate, designing engineering solutions that prevent the dashboard lights from glowing in the first place, reducing costs and maximizing performance. In the same way, analytical teams should be innovators, ensuring that analytical models are not outdated, underperforming or, in the worst case, wrong.

Gaining control and implementing governance is essential. With access to more granular data and sophisticated machine-learning analytics, it’s possible to have a model factory running simulations or forecasts for each customer, each product, and each store. Companies like eBay run thousands of tests on their digital platforms, concurrently, with every customer becoming a test subject. Wells Fargo can predict behavior within seconds and respond with a personalized offer within minutes. This opens up the possibility of modelling the world like never before, making predictions and acting on insights.

However, with the explosion in sophisticated machine learning, managing, tracking, and monitoring models is going to be our most formidable challenge to date.

Measure, monitor, and maintain

The basis for any management and monitoring is a clear understanding of the outcome the model is expected to predict (KPIs can help here). Control metrics are also needed to measure the validity and accuracy of predictions. Together, these measures provide a model interface: a communication channel that reports input-feed quality, algorithm performance and prediction accuracy, triggering proactive action and maintenance where required. This ensures models continue to deliver business results consistently.
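Such a model interface can be sketched in a few lines. The thresholds, field names, and the `ControlReport` structure below are illustrative assumptions, not a prescribed framework; the point is simply that input-feed quality and prediction accuracy are measured together and jointly decide whether the model needs attention.

```python
# A minimal sketch of a model "interface": control metrics that report
# input-feed quality and prediction accuracy, and flag when proactive
# maintenance is required. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class ControlReport:
    missing_rate: float   # input-feed quality: share of missing values
    accuracy: float       # prediction accuracy against observed actuals
    needs_attention: bool

def check_model(inputs, predictions, actuals,
                max_missing=0.05, min_accuracy=0.90) -> ControlReport:
    missing_rate = sum(x is None for x in inputs) / len(inputs)
    accuracy = sum(p == a for p, a in zip(predictions, actuals)) / len(actuals)
    return ControlReport(
        missing_rate,
        accuracy,
        needs_attention=(missing_rate > max_missing or accuracy < min_accuracy),
    )

# One bad input feed and one wrong prediction are enough to raise the flag.
report = check_model([1, None, 3, 4], [1, 0, 1, 1], [1, 0, 0, 1])
print(report.needs_attention)  # True
```

Run on a schedule, a report like this becomes the trigger for the proactive action described above, rather than waiting for a downstream business failure.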

The trigger for remedial action is rarely a massive failure. More often, slowly changing business and environmental circumstances erode the assumptions that went into building the original model. Dramatic changes, such as the introduction of a new product or service, attract enough attention to initiate immediate corrective action. Subtle changes do not. Shifts in customer behavior and market trends reveal themselves slowly (sometimes over years), and these small degradations produce below-par results that lose the company money.
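This kind of slow erosion can be caught mechanically. A rough sketch, under the assumption that we track a single model input against its training-time baseline (the variable names and the 10% tolerance are illustrative):

```python
# Illustrative drift check: compare a recent window of a model input
# against the baseline it was trained on. A shift too gradual to cause
# any hard failure still shows up as a drifting mean.
def drifted(baseline, recent, tolerance=0.1):
    base_mean = sum(baseline) / len(baseline)
    recent_mean = sum(recent) / len(recent)
    # relative shift in the feature's mean since the model was built
    return abs(recent_mean - base_mean) > tolerance * abs(base_mean)

training_spend = [100, 105, 98, 102]   # customer behaviour at build time
current_spend = [120, 125, 118, 122]   # customer behaviour today
print(drifted(training_spend, current_spend))  # True
```

A production version would use a proper statistical test over many features, but even this crude comparison, run weekly, surfaces the slow-marching shifts the article warns about.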

Constant review and recalibration keep models fresh: outdated variables are removed and new datasets are introduced, generating variables that increase the knowledge available to a model and improve its predictions.

Weekly, monthly, or quarterly report automation allows constant comparison against a baseline, enabling experimentation: the process of trying hundreds or thousands of models to find the most accurate algorithm. Comparison with models from the past may also show that previous algorithms now perform better, thanks to increased data granularity and new variables. Earlier models that were regarded as too far ahead of their time can be dusted off and reintroduced.
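The baseline comparison above is essentially a champion-challenger loop. A minimal sketch, assuming models are plain callables scored on a shared holdout set (function names and the toy data are hypothetical):

```python
# Hedged sketch of automated baseline comparison: score every challenger
# model against a holdout set and keep whichever beats the champion.
def accuracy(model, X, y):
    return sum(model(x) == target for x, target in zip(X, y)) / len(y)

def select_champion(champion, challengers, X, y):
    best, best_score = champion, accuracy(champion, X, y)
    for candidate in challengers:
        score = accuracy(candidate, X, y)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

X, y = [1, 2, 3, 4], [1, 0, 1, 0]
champion = lambda x: 0                        # current production model
challengers = [lambda x: 1, lambda x: x % 2]  # candidate replacements
best, score = select_champion(champion, challengers, X, y)
print(score)  # 1.0 -- the parity model displaces the champion
```

In a real model factory the loop would run over hundreds or thousands of candidates, including yesterday's retired models, which is exactly how a once-premature algorithm gets dusted off and reintroduced.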

Agile campaign management and a commitment to continual trial & error are essential. Letting models run feral can be bad news, though. Take Target, for example. The retailer identified a teenage pregnancy through predictive analytics and sent the girl targeted marketing material, leaving an unwitting father to discover the truth from a coupon and deal with the consequences. Or Knight Capital's $440 million loss, the result of mistakenly deploying an incorrect version of a trading algorithm. A measurement and monitoring framework should have intercepted these models, raising alerts and modifying business actions.

Managing models through automation can prove enormously beneficial, but strict process and governance must be observed to maintain control. The lineage of a model, and an understanding of how it was developed, explain the impact that model has on the bottom line.

Rather than being an afterthought, these activities should always be front of mind: repeatable processes, not one-time exercises performed at model construction. They are just as important as insight generation, so analytical teams should focus on validation to ensure that model execution delivers better business outcomes.