Posted by Arijit Sengupta ● Feb 22, 2019 3:41:00 AM

Accurately Wrong.


Does your AI have high accuracy but still not achieve the results that you want? Are you training it on the results that you care about?

AI has long chased higher and higher levels of accuracy. The field needed a standardized way to compare models against one another to see which were best, and new models that determine their predictions in different ways are continually being developed. However, correctly predicting the outcome of a scenario, while helpful, is not the purpose of a business. Optimizing for a key business metric is the underlying goal of all management reporting today. Whether you are a company trying to make money for your investors, a manager trying to predict which employees will leave, or a non-profit trying to determine who is most likely to use your shelter, shouldn’t your AI be built for the results that matter to you?

For example, a bank targeting new customers for a mortgage may gain significantly more from a single new customer than it spends on a few mailers and a few phone calls. In contrast, the typical academic AI focuses all of its effort on correctly predicting which of those prospects will sign up for a loan and which will not, then recommends marketing only to those it is confident will convert into paying customers. A more aggressive AI will go after more prospects to ensure it captures everyone likely to be profitable. The savvy business user knows it is worth pursuing significantly more leads to capture the extra customer who helps the company hit its profit goals, and who helps the user earn an annual bonus along the way.
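The bank example boils down to an expected-value rule rather than an accuracy cutoff. The sketch below is purely illustrative and not from the article: the customer value, outreach cost, and the `worth_contacting` helper are all made-up assumptions.

```python
# Hypothetical illustration: choose who to contact by expected value,
# not by predicted label. All dollar figures are assumptions.
CUSTOMER_VALUE = 2000.0   # assumed profit from one new mortgage customer
OUTREACH_COST = 25.0      # assumed cost of mailers and phone calls

def worth_contacting(p_convert: float) -> bool:
    """Contact a prospect whenever expected profit exceeds outreach cost."""
    return p_convert * CUSTOMER_VALUE > OUTREACH_COST

# A 0.5 "most likely to convert" cutoff would skip this prospect,
# but expected value says go: 0.03 * 2000 = 60 > 25.
print(worth_contacting(0.03))
```

Under these assumed numbers, even a prospect with only a 3% chance of converting is worth a mailer, which is exactly why the aggressive AI chases far more leads than the accuracy-driven one.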

Other AIs out there go after recall. Recall is a data science term for the percentage of events of interest identified out of the total number of events of interest. Recall-hungry models chase so many deals to make sure they capture them all that they even pursue unprofitable opportunities to hit their goal. What a business wants is neither to go after only the customers it is sure will convert, nor to chase the last customer at any cost, but to pursue every opportunity that can be profitably pursued.
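Recall as defined here is straightforward to compute. The set names and contents below are hypothetical, chosen only to show how a recall-hungry model can score perfectly while still chasing unprofitable deals.

```python
# Sketch (not from the article): recall rewards chasing everything.
def recall(predicted_positive: set, actual_positive: set) -> float:
    """Share of the true events of interest that the model identified."""
    hits = len(predicted_positive & actual_positive)
    return hits / len(actual_positive)

actual = {"a", "b", "c", "d"}             # deals that really would convert
chased = {"a", "b", "c", "d", "e", "f"}   # recall-hungry model chases extras

# Perfect recall of 1.0, even though "e" and "f" may be unprofitable to pursue.
print(recall(chased, actual))
```

The metric never penalizes the model for the two extra, possibly money-losing pursuits, which is the gap between recall and the profit goal the business actually cares about.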

What is most important is pursuing the goal that business users are trying to achieve in the first place.

The problems facing businesses over the years have not changed. The information available, and the ability to make recommendations from that information, however, have improved significantly. These AI models can also improve lives beyond the black-and-white world of dollars and cents.

Healthcare providers are trying to reduce the rate of hospital readmissions. The impact on patient satisfaction and health can be significant if the patient unexpectedly needs to come back to the hospital because he or she was sent home prematurely. It is much better for the patient and the hospital if the patient stays in the hospital slightly longer or one additional test is run to help ensure that there is no readmission. Because the impact of readmission is so much bigger than a few additional days onsite, the AI needs to respect this imbalance in order to optimize for patient outcome. Academic AIs train on accurately predicting who will be readmitted; only after the fact, if ever, is the analysis done to see whether skipping additional monitoring was worth the risk of readmitting the patient.
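The imbalance described above can be written as a simple cost-sensitive rule. The dollar figures and the `keep_for_monitoring` helper below are illustrative assumptions for the sketch, not clinical guidance or the author's model.

```python
# Hypothetical cost-sensitive decision: extra monitoring vs. readmission risk.
# Both cost figures are made-up assumptions for illustration.
COST_EXTRA_DAY = 1500.0      # assumed cost of one more day of monitoring
COST_READMISSION = 15000.0   # assumed cost (and harm) of an unexpected readmission

def keep_for_monitoring(p_readmit: float) -> bool:
    """Keep the patient when expected readmission cost exceeds monitoring cost."""
    return p_readmit * COST_READMISSION > COST_EXTRA_DAY

# With readmission 10x as costly, even a 15% risk justifies the extra day:
# 0.15 * 15000 = 2250 > 1500.
print(keep_for_monitoring(0.15))
```

An accuracy-trained model would label that same patient "will not be readmitted" and send them home; a model that respects the cost imbalance keeps them one more day.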

Rarely in life or business are the payoffs for pursuing something exactly even. You take either an aggressive or a conservative stance so that you achieve an outcome more important than simply being right. Being correct is nice, but ultimately it is not the result that matters most. Your AI should understand the way you make decisions. Whether you are trying to maximize profitability or improve quality of life, that goal is more important than accuracy. Those results are ultimately what matter.