One of the central assumptions of classical economic theory is that people make decisions through optimization. The "homo economicus" - or economic man, the term used for the agents representing humans in the theory - defines choices and actions rationally, aiming to maximize economic well-being. Another foundation of the theory is the concept of equilibrium, such as the demand-supply balance: prices fluctuate freely until an equilibrium is reached.
This classical economic theory has been the basis for many models and studies over the years. Financial markets, the insurance industry, pricing and marketing departments, and many others have assumed that humans act rationally when building investment portfolios, writing insurance policies, or setting product prices, for example. However, this assumption has not been enough to prevent economic bubbles and collapses such as the tech bubble at the end of the last century and the 2008 financial crisis. The gap between the decisions predicted by theory and the results observed in practice generated criticism and skepticism.
The main criticisms of the theory target its core assumptions. First, the optimization problems faced by an average person are usually extremely hard to solve: a simple visit to a grocery store can involve many decisions among hundreds of items that will never be examined in full, for example. Another flaw concerns the rationality of the agents: people's beliefs and emotions always influence real decisions, and countless biases have been documented as affecting them.
To better understand and adapt economic theory, researchers from other fields such as sociology and psychology joined economists in running experiments, giving rise to the field known as behavioral economics. By definition, it studies the effects of psychological, cognitive, emotional, cultural, and social factors on the decisions of individuals and institutions. It helps traditional theory adapt its models to reality, where agents' decisions are not consistent with rationality and problems are complex.
As mentioned earlier, given the complexity of the problems we face, and to save mental effort, we tend to make decisions with little deliberation, relying on intuition. This introduces cognitive shortcuts (also referred to as heuristics), biases, and noise into the process. Daniel Kahneman and Amos Tversky1, considered by many the fathers of behavioral economics, cluster these biases into three main cognitive shortcuts: representativeness (the degree to which an object appears representative of a particular type); availability (the subjective assessment of the probability of an event based on instances or occurrences that come to one's mind); and adjustment from an anchor (subjective estimates made by starting from an initial value and adjusting it to yield the final answer, with such adjustments typically proving insufficient).
We will go deeper into the representativeness heuristic to exemplify these biases. It is used when people judge the probability that an object or event A belongs to class B by looking at the degree to which A resembles B, neglecting information about the general probability (base rate) of B. For example, consider the following problem:
Bob is an opera fan who enjoys touring art museums when on holiday. Growing up, he enjoyed playing chess with family members and friends. Which situation is more likely?
A. Bob plays trumpet for a major symphony orchestra
B. Bob is a farmer
Most people choose option A because Bob's description better matches the stereotype of a classical musician. However, option B is far more likely to be true, since farmers make up a much larger share of the population.
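The base-rate effect can be made concrete with some arithmetic. All the numbers below are hypothetical, chosen only to illustrate the mechanism; they are not real statistics:

```python
# Base-rate illustration for the "Bob" problem.
# All figures are hypothetical, chosen only to show the effect.

symphony_trumpeters = 1_000   # illustrative count of trumpeters in major orchestras
farmers = 2_000_000           # illustrative count of farmers

# Suppose Bob's profile (opera, museums, chess) is 50x more common
# among orchestra musicians than among farmers.
p_profile_given_musician = 0.50
p_profile_given_farmer = 0.01

# Expected number of people matching the profile in each group
matching_musicians = symphony_trumpeters * p_profile_given_musician  # 500
matching_farmers = farmers * p_profile_given_farmer                  # 20,000

# Even with the stereotype strongly favoring musicians, far more
# farmers match Bob's description: the base rate dominates.
print(matching_musicians, matching_farmers)
```

Even granting the stereotype a 50-to-1 advantage, the sheer number of farmers means a person matching Bob's description is still far more likely to be a farmer.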
Another exercise demonstrating this heuristic is the Linda problem:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable?
A. Linda is a bank teller.
B. Linda is a bank teller and is active in the feminist movement.
Influenced by how well option B represents Linda's description, the majority choose it, even though the probability of two events occurring together is always less than or equal to the probability of either one occurring alone.
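The conjunction rule behind this can be checked with any numbers at all; the probabilities below are illustrative, not data from the original study:

```python
# Conjunction rule: P(A and B) <= P(A) for any events A and B.
# Illustrative probabilities only (not from the original study).

p_teller = 0.05                   # P(Linda is a bank teller)
p_feminist_given_teller = 0.30    # P(feminist | bank teller)

# Probability of the conjunction "bank teller AND feminist"
p_teller_and_feminist = p_teller * p_feminist_given_teller  # 0.015

# The conjunction can never exceed either of its parts, because it is
# the single-event probability scaled down by a factor <= 1.
assert p_teller_and_feminist <= p_teller
print(p_teller, p_teller_and_feminist)
```

Whatever values are plugged in, the conjunction ends up no larger than either event alone, which is exactly why choosing option B is a fallacy.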
Both of these small cases were presented by Kahneman and Tversky and serve only to illustrate the cognitive shortcuts, but it is possible to imagine the "damage" they might cause to decision-making. Other cognitive biases, with simple examples, can be found in the infographic below, created by Raconteur.net.
In addition to their impact on each person's choices, these "misbehaviors" can lead to many questionable decisions by institutions and organizations, such as governments and companies. One example of how real-world institutional agents do not follow economic theory was found in the studies made by Richard Lester2 in the 1940s.
He surveyed many managers and CEOs to understand how they decided on the production level of their firms and the number of workers they needed to hire or fire. A rational agent would take marginalism into account: it is worth producing additional units as long as marginal revenue exceeds marginal cost, and a worker should be hired if the marginal gain from their work exceeds the wage paid. However, Lester found that most managers seemed not to consider these factors, relying instead on simple ideas such as selling as much as possible. This raised the question of whether market agents were actually rational, at a time when the debate around behavioral economics had not yet begun.
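The marginalist rule that a rational firm would follow can be sketched in a few lines. The price and cost curve below are hypothetical, chosen only to show the stopping condition:

```python
# Marginalism sketch: keep producing while marginal revenue exceeds
# marginal cost. Price and cost function are hypothetical.

price = 10.0  # marginal revenue per unit (price-taking firm)

def total_cost(q: int) -> float:
    """Hypothetical convex cost curve: each extra unit costs more."""
    return 2.0 * q + 0.05 * q ** 2

def optimal_quantity(max_q: int = 1_000) -> int:
    """Produce one more unit only while its marginal cost is below price."""
    q = 0
    while q < max_q:
        marginal_cost = total_cost(q + 1) - total_cost(q)
        if marginal_cost >= price:  # stop when MC catches up with MR
            break
        q += 1
    return q

print(optimal_quantity())  # 80 for these illustrative parameters
```

With this cost curve, the marginal cost of unit 80 is 9.95 (still below the price of 10) while unit 81 would cost 10.05, so a marginalist firm stops at 80 units; "selling as much as possible" would push past that point and lose money on every extra unit.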
Another researcher, Fritz Machlup3, entered the discussion started by Lester, arguing that even if managers did not mention marginalism or the foundations of the theory of the firm, their decisions and actions would approximate the theory. Individuals act "as if" they balance marginal benefits and costs even when they do not explicitly do so in their own minds. In other words, even without knowing the reason, they would act as rational agents.
Although at the time Machlup was considered the "winner" of the debate (the "as-if" idea was reinforced by many other economists, such as Milton Friedman, and the debate was declared "closed"), after Kahneman and Tversky's discoveries of biases and cognitive shortcuts it became impossible to sustain the idea that human intuition would always approximate rational decisions.
To prevent the impact of biases and cognitive shortcuts on decision-making, Kahneman’s4 prescription is for organizations to temper human judgment with “disciplined thinking” through the use of algorithms. The indications from the research are unequivocal, he said: When it comes to decision-making, algorithms are superior to people. “Algorithms are noise-free. People are not,” he stated. “When you put some data in front of an algorithm, you will always get the same response at the other end.”
This is especially true in fields such as medicine (diagnosing diseases such as cancer) and financial markets (credit risk, portfolio allocation), among many others. In these cases, called "low-validity environments", there is a significant degree of complexity, uncertainty, and unpredictability, and too little feedback to transform experience into expertise. According to researchers, simple algorithms matched or outperformed humans and their "complex" decision-making criteria essentially every time.
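Kahneman's point that "algorithms are noise-free" can be illustrated with a minimal sketch. The scoring rule and its weights below are entirely hypothetical, in the spirit of the simple linear models that researchers compared against expert judgment:

```python
# Sketch of a noise-free decision rule: identical inputs always
# produce identical outputs. The weights are hypothetical and for
# illustration only, not a real credit-scoring model.

def credit_score(income: float, debt_ratio: float, late_payments: int) -> float:
    """Deterministic linear rule combining three applicant features."""
    return 0.4 * income / 1_000 - 30 * debt_ratio - 5 * late_payments

applicant = dict(income=50_000, debt_ratio=0.35, late_payments=1)

# Score the same applicant twice; unlike a human judge whose mood or
# fatigue adds noise, the rule never varies between evaluations.
first = credit_score(**applicant)
second = credit_score(**applicant)
assert first == second
print(first)
```

The contrast with human judgment is the point: two underwriters (or the same underwriter on two days) can score the same file differently, while the rule, however crude, is perfectly consistent.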
Cassotis' vision reflects this kind of solution: we want to bring rationality to decision-making in organizations. By using mathematical models and algorithms, we aim to increase the accuracy of decision-makers, complementing their expertise and intuition with easy-to-use tools that support their processes.
Do you face complex problems that may be subject to cognitive biases and noise when making decisions? If so, do not hesitate to reach out!
Author: Cassiano Lima - Senior Consultant at Cassotis Consulting
Co-author: Fabio Silva - Senior Manager at Cassotis Consulting