Drive smarter decision-making with explainable machine learning


This article was contributed by Burke Birand, CEO of Farrow Labs.

Is the hype surrounding AI finally cooling off?

Some recent surveys suggest as much: most executives now say the technology is more hype than reality, and 65% report zero value from their AI and machine learning investments.

However, these statements often reflect a fundamental misunderstanding. Many executives do not differentiate between generic black-box AI and related techniques such as explainable machine learning. As a result, they are missing out on a crucial way to make smarter, more efficient decisions that could drive real enterprise value.

Black boxes, software programs that produce answers without revealing how they arrived at them, are the algorithms that power the world's top tech companies. You have no way of knowing how a black box comes up with its result. Occasionally, the results are amusing, such as when Google's image recognition software mistakes a cat for guacamole, or when Netflix recommends a bad show. In those cases, the stakes are low. An error on Netflix's part, at most, wastes a few minutes.

But for complex, high-stakes sectors like healthcare, criminal justice, and manufacturing, it's a different story. If AI tells a steel engineer to add the wrong amount of an alloy, producing metal with the wrong density, buildings may collapse.

In areas such as healthcare, where a single decision can literally make the difference between life and death, professionals may be reluctant to rely on the recommendations of a mysterious black-box algorithm. Or, worse, they could adopt those recommendations blindly, with potentially catastrophic consequences.

Explainable machine learning

Unlike black-box software, any AI solution that can properly call itself "explainable" should disclose how different inputs affect its output. Take autopilot software, for example: the steering-control algorithm needs to know how much the aircraft will bank if the sensors detect northwest winds at 50 miles per hour, and the user should be able to understand how that information affects the algorithm's predictions. Without this capability, the software will fail to achieve its intended purpose and end up delivering negative value.
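One way to make the idea concrete is a model whose prediction decomposes exactly into per-input contributions, as a linear model does. The sketch below is purely illustrative: the feature names, weights, and the "bank-correction" framing are invented for this example, not taken from any real autopilot system.

```python
# Minimal sketch of an "explainable" prediction: a linear model whose
# output is the sum of per-input contributions, so each input's effect
# on the result is visible. Weights and inputs are hypothetical.

def explain_prediction(weights, inputs, bias=0.0):
    """Return the prediction plus each input's contribution to it."""
    contributions = {name: weights[name] * value for name, value in inputs.items()}
    prediction = bias + sum(contributions.values())
    return prediction, contributions

# Hypothetical steering model: output is a bank correction in degrees.
weights = {"crosswind_mph": 0.04, "airspeed_knots": -0.01, "altitude_kft": 0.002}
inputs = {"crosswind_mph": 50.0, "airspeed_knots": 250.0, "altitude_kft": 30.0}

correction, parts = explain_prediction(weights, inputs)
for name, contrib in sorted(parts.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:>15}: {contrib:+.3f}")
print(f"     prediction: {correction:+.3f}")
```

Real explainable-ML systems apply the same principle to more complex models, for instance via feature-attribution methods, but the contract is identical: the user can see which inputs drove the output, and by how much.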

In addition, explainable software should provide some measurement of its confidence in each prediction, enabling safe and accurate decision-making. In healthcare, for example, a doctor shouldn't simply be told to use a certain treatment. Instead, they should be told the probability of the desired outcome along with the model's level of confidence. In other words, is the software highly confident in its prediction, or is the prediction far more uncertain? Only with this kind of information can a doctor make informed, safe decisions.
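One common way to produce such a confidence signal, sketched below under simplifying assumptions, is to ask an ensemble of models for the same probability and treat their disagreement as uncertainty. The member probabilities and the confidence threshold here are invented for illustration.

```python
import statistics

# Sketch: report a predicted probability AND a confidence signal.
# "Confidence" here comes from the spread of a (hypothetical) ensemble's
# probability estimates: tight agreement suggests a trustworthy prediction.

def ensemble_prediction(member_probs):
    """member_probs: each ensemble member's probability of the desired outcome."""
    mean_p = statistics.mean(member_probs)
    spread = statistics.pstdev(member_probs)  # disagreement across members
    return {
        "probability": mean_p,
        "uncertainty": spread,
        "confident": spread < 0.05,  # illustrative threshold, not a standard
    }

# Three models agree closely: a decision-maker can lean on this prediction.
print(ensemble_prediction([0.81, 0.79, 0.80]))
# A similar mean probability, but wild disagreement: far less trustworthy.
print(ensemble_prediction([0.95, 0.40, 0.70]))
```

The second call illustrates the article's point: two predictions with similar probabilities can deserve very different levels of trust, and only software that surfaces that difference supports safe decisions.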

How can you apply explainable machine learning to make smarter decisions in your company?

If you want to build a tool internally, know that it is difficult. Explainable machine learning is complex and requires deep statistical knowledge to develop. One sector that has done this well is pharmaceuticals, where companies often have scores of PhDs doing explainable data science and analysis in-house.

If you want to buy software instead, you still need to do some homework. Look at the actual use cases the vendor provides, not just the tagline. Look into the background of the science and research team: are they proficient in explainable machine learning? What evidence do they show for their technology?

Most important of all: use your judgment. The great thing about explainable machine learning is that it can be explained. If you don't understand it, it probably won't add value to your company.

Burke Birand is the CEO of Farrow Labs, a New York-based industrial AI software company.


Welcome to the VentureBeat community!

DataDecisionMakers is where experts, including the technical people doing data work, can share data-related insights and innovation.

If you want to read about cutting-edge ideas, up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers.

You might even consider contributing an article of your own!

Read more from DataDecisionMakers
