Predictive Analytics Interview-Client Deployment Example

Today’s topic is Predictive Analytics, and this is a continuation of a series of audio podcast interviews on the subject. Dr. Eric Siegel is a former computer science professor at Columbia University and the President of Prediction Impact.

Dr. Siegel, good afternoon and thanks again for agreeing to this interview and for your time today. Could you describe a specific client deployment for Predictive Analytics?

Sure.  Last time, I illustrated the value of response modeling for direct marketing, which applies in very much the same way across companies.  As we saw, if there’s no method in place to score or segment customers, predictive analytics makes a tremendous impact on profit and response rates.  But let’s move to a more distinctive application of predictive analytics, one that competed against an existing legacy system.

A successful information portal in the educational sector, which is used by 1 in 3 college-bound high school seniors, wanted to increase advertisement response rates by predicting which promotion each user was most likely to respond to.  The existing system selected among hundreds of ads by way of “A-B testing” — or, more precisely, “A-B-C-D… testing.”  It measured which ad was most popular, universally across users, but separately for each web page, which, in this case, closely corresponds to the lifecycle of the user.  This legacy system was further improved by not showing the same ad to a user more than once, since, if the user were interested, he or she would likely have already responded, given that the ads are embedded prominently within the website’s content.
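The legacy selection logic described above can be sketched roughly as follows. The data structures and names here are hypothetical — the interview doesn’t describe the actual implementation — but the sketch captures the two rules: serve the universally most popular ad for the page, and never repeat an ad for the same user.

```python
# Sketch of the legacy "A-B-C-D..." selection: for each web page, track
# per-ad responses and impressions, then serve the most popular ad the
# user hasn't already seen. All names and figures are illustrative.

def pick_ad(page_stats, seen_by_user):
    """page_stats: {ad_id: (responses, impressions)} for one web page.
    seen_by_user: set of ad_ids already shown to this user."""
    popularity = {
        ad: responses / impressions
        for ad, (responses, impressions) in page_stats.items()
        if impressions > 0 and ad not in seen_by_user
    }
    if not popularity:
        return None  # every tested ad has already been shown
    # Most popular remaining ad on this page, the same for every user.
    return max(popularity, key=popularity.get)

stats = {"ad_a": (50, 1000), "ad_b": (90, 1000), "ad_c": (10, 1000)}
print(pick_ad(stats, set()))       # ad_b: highest response rate
print(pick_ad(stats, {"ad_b"}))    # ad_a: ad_b already shown
```

Note that popularity is the only signal: nothing about the individual user influences the choice, which is exactly the gap predictive analytics was brought in to close.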

This made for a formidable system to improve upon, given that a small number of the ads were much more popular than the rest.  It was therefore a challenge for predictive analytics to confidently say, “Hey, this user is much more likely to be interested in this relatively unpopular ad than in any of the really popular ones.”

We generated 291 predictive models, one per ad — ads that hadn’t been tested enough to learn what kind of user is more likely to respond to them were omitted.  These models were generated over tens of millions of web transaction logs, which encode which user was solicited with which ad, and whether that user responded.
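In rough outline, deploying one model per ad means scoring every eligible ad’s model on a user’s attributes and serving the top-scoring ad. The sketch below uses stand-in callables as models and made-up features; the interview doesn’t specify the modeling technique or the feature set, so all of that is assumed for illustration.

```python
# Sketch: one predictive model per ad estimates P(response | user), and
# the served ad is the one with the highest predicted probability among
# ads the user hasn't seen. Models are stand-in callables; real models
# would be trained on the web transaction logs described above.

def choose_ad(models, user_features, seen_by_user):
    """models: {ad_id: callable(user_features) -> predicted response prob}."""
    scores = {
        ad: model(user_features)
        for ad, model in models.items()
        if ad not in seen_by_user
    }
    return max(scores, key=scores.get) if scores else None

models = {
    "ad_a": lambda u: 0.02 + 0.05 * u["is_senior"],  # niche, user-sensitive
    "ad_b": lambda u: 0.04,                           # popular, flat across users
}
print(choose_ad(models, {"is_senior": 1}, set()))  # ad_a: 0.07 > 0.04
print(choose_ad(models, {"is_senior": 0}, set()))  # ad_b: 0.04 > 0.02
```

This is the key contrast with the legacy system: a “relatively unpopular” ad can win for a particular user when that user’s predicted response probability beats the universally popular choice.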

The results of deploying these predictive models were strong: they increased the revenue of the website by almost $1 million per year.  More specifically, response rates increased by 25%, and revenue increased by an observed 3.6%, later reported at 5% by our client.  (The gap between the response-rate and revenue improvements results from different pay levels for different ads.)  Since this business was already successful, earning monthly ad revenue of $1.5 million, this incremental but significant improvement amounted to a lot and provided a great ROI for the predictive analytics initiative.
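The annual dollar figure follows directly from the numbers quoted, as a quick check confirms:

```python
# Revenue arithmetic using only the figures quoted in the interview.
monthly_revenue = 1_500_000
annual_revenue = monthly_revenue * 12           # $18M per year

observed_lift = annual_revenue * 0.036          # 3.6% observed lift
client_reported_lift = annual_revenue * 0.05    # 5% per the client

print(f"observed:  ${observed_lift:,.0f}/yr")         # $648,000/yr
print(f"reported:  ${client_reported_lift:,.0f}/yr")  # $900,000/yr — "almost $1 million"
```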

This is the kind of impact that you get from predictive analytics even in many cases where you already have a well-engineered system in place.  For example, in the insurance sector, there’s a great win when predictive analytics successfully improves actuarial methods to reduce the loss ratio by, say, 2 to 5 points.
