Dissertation: The computer must be able to explain the reasons for the trade-off decisions it makes

In his dissertation, M.Sc. Giovanni Misitano researched methods for making compromise decisions with the aid of artificial intelligence. Although nearly all decisions are ultimately compromises, finding the best compromise is not easy. The search can be supported by various computational methods. A key aspect of the methods Misitano studied is that the compromise solutions they provide should also be explainable. This means that the user can understand why the computer has arrived at a particular suggestion and can influence it, if necessary, by adjusting their own preferences.
M.Sc. Giovanni Misitano researches computer-assisted decision-making. Photo: Petteri Kivimäki.
Published
24.5.2024

In his dissertation, M.Sc. Giovanni Misitano investigated how the search for compromise solutions can be enhanced with artificial intelligence in decision-making tasks encountered both in everyday life and in industry.

"A good compromise does not mean the same thing to everyone. For example, Bob might be willing to compromise on the quality of a product if it means a lower price. Alice, on the other hand, does not consider the price but is very particular about the quality of the product. However, Bob will not settle for the lowest quality product, and Alice is not willing to pay any price," Misitano explains.

The compromises that Bob and Alice are willing to make ultimately determine which product is most suitable for each of them and what the final purchasing decision will be.

In Misitano's example, the compromise was the balance between price and quality. This trade-off depends on each individual's personal preferences. In the real world, these problems are much more complex, and compromises must sometimes be considered with respect to many, even dozens of, conflicting criteria. This is called multiobjective optimization.
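
To make the idea concrete, the short Python sketch below (an illustration with invented product names and numbers, not material from the dissertation) treats the purchase as a two-objective problem, minimizing price and maximizing quality, and keeps only the trade-off options that no other option beats on both criteria.

```python
# Illustration of a two-objective problem with invented data: minimize price, maximize quality.
# An option is kept only if no other option is at least as good on both criteria and strictly
# better on one; the survivors are the trade-off (Pareto-optimal) options.

products = {                 # (price in euros, quality score 0-10)
    "basic":      (20, 3),
    "standard":   (35, 6),
    "premium":    (60, 9),
    "overpriced": (70, 6),   # dominated: costs more than "standard" for the same quality
}

def dominates(a, b):
    """True if option a is at least as cheap and at least as good as b, and strictly better in one."""
    return a[0] <= b[0] and a[1] >= b[1] and (a[0] < b[0] or a[1] > b[1])

pareto_front = {
    name: values for name, values in products.items()
    if not any(dominates(other, values) for other in products.values() if other != values)
}

print(sorted(pareto_front))   # -> ['basic', 'premium', 'standard']; "overpriced" is ruled out
```

Among the remaining options, none is best on both criteria at once, so the final choice depends on the decision maker's preferences, which is exactly where Bob and Alice differ.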

Misitano explains that the interactive decision-support methods he studied can be utilized in many different areas and fields, such as improving forest management or enhancing car crash safety.

These methods help in finding the best possible compromise solution by means of optimization, that is, computation, and by utilizing the user's preferences.
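
One common way for interactive methods to use preferences, sketched below in hedged form (a general reference-point technique, not necessarily the specific methods studied in the dissertation), is to let the user state aspiration levels for each objective and then suggest the candidate whose worst weighted deviation from those aspirations is smallest. All names and numbers in the sketch are invented.

```python
# Hedged sketch of a reference-point based selection over a discrete, invented candidate set.
# The user states aspiration levels (a "reference point") for each objective; every candidate
# is scored by its worst weighted deviation from those aspirations, and the candidate with
# the smallest score is suggested as the compromise.

candidates = {                      # invented data: price in euros (minimize), quality 0-10 (maximize)
    "basic":    {"price": 20, "quality": 3},
    "standard": {"price": 35, "quality": 6},
    "premium":  {"price": 60, "quality": 9},
}

def achievement(candidate, reference, weights):
    """Worst weighted deviation from the reference point (quality is flipped so both objectives are minimized)."""
    deviations = [
        weights["price"] * (candidate["price"] - reference["price"]),
        weights["quality"] * (reference["quality"] - candidate["quality"]),
    ]
    # a tiny augmentation term breaks ties in favour of candidates that are better overall
    return max(deviations) + 1e-6 * sum(deviations)

def suggest(reference, weights):
    return min(candidates, key=lambda name: achievement(candidates[name], reference, weights))

weights = {"price": 1 / 40, "quality": 1 / 6}          # rough normalization by each objective's range
bob   = suggest({"price": 40, "quality": 6}, weights)  # willing to pay a bit more for decent quality
alice = suggest({"price": 55, "quality": 9}, weights)  # wants top quality but caps the price
print(bob, alice)                                      # -> standard premium
```

Changing the stated aspirations steers the computation toward different compromises, which is how Bob and Alice end up with different suggestions from the same set of products.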

The Computer Must Be Able to Explain to Humans How Certain Solutions Are Reached

According to Misitano, many of the existing methods function as black boxes from the human perspective, taking in expressed preferences and spitting out compromise solutions.

"The decision maker does not get to see how the computer's solution proposal is generated, nor can they necessarily influence it during the optimization process," Misitano clarifies.

In his research, Misitano seeks ways to make the methods that search for compromise solutions explainable. With explainability, the person using the method can see the connection between their own preferences and the compromises presented by the computer.

"Through this research, people can more easily find the most suitable compromises for various problems, while also understanding how different preferences have influenced the final solution. This also helps in justifying the compromise solutions to oneself and to others," Misitano explains.

The research can potentially lead to various breakthroughs in supporting decision-making and making the best possible decisions. A good compromise will still depend on the preferences of the individual making the decision, but the best compromises must also be justifiable.

Additional information

M.Sc. Giovanni Misitano defends his doctoral dissertation “Enhancing the decision-support capabilities of interactive multiobjective optimization with explainability”. The opponent is Professor Serpil Sayin (Koc University, Turkey) and the custos is Professor Kaisa Miettinen (University of Jyväskylä).

The language of the dissertation is English.

The public defence can be followed in lecture hall RUU D104 Helena (Ruusupuisto) or online. Link to the online event (Zoom application or the Google Chrome browser recommended):