Michael Ross, an executive fellow at London Business School, and DMS CEO James Taylor recently had an article published in the Harvard Business Review: Managing AI Decision-Making Tools.
“The nature of micro-decisions requires some level of automation, particularly for real-time and higher-volume decisions. Automation is enabled by algorithms (the rules, predictions, constraints, and logic that determine how a micro-decision is made). And these decision-making algorithms are often described as artificial intelligence (AI). The critical question is, how do human managers manage these types of algorithm-powered systems? An autonomous system is conceptually very easy. Imagine a driverless car without a steering wheel. The driver simply tells the car where to go and hopes for the best. But the moment there’s a steering wheel, you have a problem. You must inform the driver when they might want to intervene, how they can intervene, and how much notice you will give them when the need to intervene arises. You must think carefully about the information you will present to the driver to help them make an appropriate intervention.”
In this article they lay out four different ways in which humans can interact with AI-powered, automated decision-making systems. Each of these four approaches – Human in the Loop (HITL), Human in the Loop For Exceptions (HITLFE), Human on the Loop (HOTL), and Human out of the Loop (HOOTL) – has pros and cons, and each is appropriate in different situations.
As a complement to the article, this blog series will consider each of them in turn and illustrate them with examples from our experience helping clients adopt AI and decision automation. First, Human in the Loop (HITL).
HITL systems are often called decision support. Our work with clients has focused on two kinds of HITL systems: those where the automation operates at the same level as the human decision, and those where the automation creates new data, in the form of a simulation, for the human decision-maker.
Supplier Management Decision Support
Direct decision support can be illustrated with a supplier management example. The client had multiple service suppliers, each working on a portion of the business, and the supplier management team was responsible for making sure they all worked effectively. The decision support environment we developed enabled supplier management to decide which of the suppliers needed to be proactively managed each week, and why. We built a model of the decision-making by interviewing the supplier management team and identified various calculations and summaries that could be developed for each supplier. Several of the sub-decisions in this model were intended for a human decision-maker, so these were not automated; instead, a data visualization was developed for each to help the human decision-maker make their determination. The overall decision model structured these various elements so that supplier management could quickly decide who to call and what the agenda would be.
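To make the pattern concrete, here is a minimal sketch of how automated sub-decisions might feed a human-facing call list. Everything here is hypothetical – the supplier names, the metrics, and the thresholds are illustrative assumptions, not the client's actual model. The `trend_worsening` field stands in for a judgment the human makes from a visualization rather than something computed.

```python
from dataclasses import dataclass

@dataclass
class SupplierMetrics:
    """Hypothetical per-supplier summaries feeding the decision model."""
    name: str
    sla_breaches: int       # service-level misses this period (assumed metric)
    open_escalations: int   # unresolved issues raised by the business
    trend_worsening: bool   # judgment captured from a human-reviewed visualization

def needs_proactive_call(m: SupplierMetrics) -> tuple[bool, list[str]]:
    """Automated sub-decisions: decide whether to call, and build the agenda."""
    agenda = []
    if m.sla_breaches > 2:
        agenda.append(f"Review {m.sla_breaches} SLA breaches")
    if m.open_escalations > 0:
        agenda.append(f"Discuss {m.open_escalations} open escalations")
    if m.trend_worsening:
        agenda.append("Walk through worsening performance trend")
    return (len(agenda) > 0, agenda)

suppliers = [
    SupplierMetrics("Acme Services", sla_breaches=4, open_escalations=1, trend_worsening=False),
    SupplierMetrics("Beta Logistics", sla_breaches=0, open_escalations=0, trend_worsening=False),
]

for s in suppliers:
    call, agenda = needs_proactive_call(s)
    if call:
        print(s.name, "->", "; ".join(agenda))
```

The point of the structure is that each sub-decision stays small and inspectable: the human sees not just who to call, but the agenda items that drove the decision.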
Equipment Planning Simulation
Simulation-oriented decision support is often related to planning decisions. In another example, a client needed to make planning decisions about equipment by comparing the impact of different planning assumptions. The relative value of the options depended on the outcome of using the equipment in a variety of possible situations. We modeled how to decide on the value of a piece of equipment in a particular situation. Once this decision was automated, it could be made for different pieces of equipment in various situations. The client provided a set of situations and their likelihood of occurring, as well as the various equipment options. We could then run the “how effective is this equipment in this situation” decision many thousands of times for different options and summarize the results for the human decision-maker.
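A rough sketch of this simulation loop might look like the following. The situations, probabilities, capacities, and the capacity-over-demand effectiveness measure are all invented for illustration; the real point is the shape of the computation: an automated per-situation decision, run many times over weighted scenarios, summarized per option for the human.

```python
import random

# Hypothetical situations with likelihoods, and hypothetical equipment options.
situations = [
    {"name": "normal load",   "probability": 0.70, "demand": 100},
    {"name": "peak load",     "probability": 0.25, "demand": 180},
    {"name": "failure event", "probability": 0.05, "demand": 250},
]

equipment_options = {
    "Option A": {"capacity": 150, "cost": 10_000},
    "Option B": {"capacity": 220, "cost": 16_000},
}

def effectiveness(capacity: int, demand: int) -> float:
    """The automated 'how effective is this equipment in this situation' decision."""
    return min(capacity / demand, 1.0)

def simulate(option: dict, runs: int = 10_000, seed: int = 42) -> float:
    """Run the per-situation decision many times and average the results."""
    rng = random.Random(seed)
    weights = [s["probability"] for s in situations]
    total = 0.0
    for _ in range(runs):
        s = rng.choices(situations, weights=weights)[0]
        total += effectiveness(option["capacity"], s["demand"])
    return total / runs

for name, option in equipment_options.items():
    score = simulate(option)
    print(f"{name}: expected effectiveness {score:.2f} at cost {option['cost']}")
```

The summary the human sees – expected effectiveness against cost per option – is the new data the simulation creates; the human still owns the trade-off decision itself.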
Both these kinds of HITL decisions repay an investment in decision modeling and decision automation. However, once you start automating decisions there is generally a move toward making many decisions without human intervention and focusing humans on exceptions.
Stay tuned for next week’s installment, the second kind of system: Human in the Loop For Exceptions (HITLFE).