Bounded Rationality
How hard should we work to be certain?
I’ve always enjoyed thinking about the concept of bounded rationality, originally put forth by Herbert Simon, a professor at Carnegie Mellon, in the 1950s. Simon pointed out that due to limitations of time and access to complete information, we often make decisions that we think are “good enough” rather than taking the time and energy to pursue perfect solutions. He called this behavior satisficing, by which he meant that we are willing to give up some degree of certainty in favor of taking expeditious action. Faced with the difficulty of obtaining complete information, and aware of the opportunity costs of not acting, we place a bet that we understand enough of what we need to know to move forward rather than continuing to seek further clarity. The outcomes of our actions tell us whether we in fact knew enough.
Individuals and teams vary greatly in how much risk they will accept when satisficing, which is why we hear people complain about their bosses demanding that they collect more data before seizing what appear to be obvious opportunities. In war, satisficing decisions can get people killed unnecessarily, but so can the failure to make a preemptive strike. In business, the timing of venture capital investments greatly influences both risk and returns, so the art of knowing when to make an investment decision in the face of incomplete information is crucially important. In less extreme instances, everyday business decisions are almost always made in the context of bounded rationality. Selection, promotion, and product development decisions, along with thousands of others, get made based upon our best guess at the time, with fingers crossed behind our backs. The odds tell us that sometimes we will lose when we roll the dice, but acting too slowly or not deciding at all aren’t good options either. Depending on the context, a poor outcome from a satisficing decision can be career-limiting, but so can being perceived as someone who evades opportunities to seize the moment.
Personality plays a large part in influencing one’s comfort with taking risks, but so does the environment in which the decision is being made. From a personality standpoint, people who prefer to have more data before they are comfortable making a decision tend to score higher on Neuroticism in the Big Five personality model and to show a combination of high Prudence and low Adjustment on the Hogan Personality Inventory, along with a Cautious derailer on the companion Hogan Development Survey. While individuals with this profile are less likely to be selected for leadership roles in organizations that require entrepreneurial behaviors, they are not uncommon in organizations entrusted with protecting existing wealth or maintaining the status quo. Leaders of this latter ilk are less likely to respond quickly to shocks affecting their company or industry, turning the very conservatism they were entrusted to exercise into a liability.
Within companies, the influence of personality can be overridden by social pressures from those above, below, or surrounding the person making the decision. A leader who is by nature conservative can join in a decision that their riskier peers overwhelmingly support. Irving Janis labeled this phenomenon groupthink, pointing out how dangerous socially fueled satisficing can become. For a time, psychologists were enamored with the idea that groups make inherently riskier decisions than individuals acting alone, until the concept of the “risky shift,” as it was known, was largely debunked. Groups can be either riskier or more conservative than the average of their individual members’ tendencies, largely depending on the influence of the leader.
All of this is to say that there is no bright line indicating exactly when enough information exists to make a decision that balances risk and timeliness. It would be nice if such a line, like the ones superimposed on televised football fields to show viewers how far it is to the first-down marker, popped up whenever a team faced an ambiguous decision. So far, at least, we haven’t invented that technology for the boardroom. Artificial intelligence has certainly made it faster to acquire greater amounts of information, addressing one of the limitations Simon identified as the source of bounded rationality. However, it wasn’t just the quantity of available information that concerned Simon; it was also its reliability. To make better decisions, leaders in real organizations need not only more information about what exists or what has happened in the past, but also about what will happen in the future. Will the price of soybeans rise or fall, and what might determine that? It’s nice to know what the price is and how it has been trending, but much more important to project accurately into the future.
Moreover, organizations in the real world exist in what Fred Emery and Eric Trist, two 20th-century scholars from the Tavistock Institute in London, called a disturbed-reactive environment. By that, they meant that the actions taken by an organization will trigger reactions from its competitors and other stakeholders, which in turn will call for an additional response from the organization. Once you put things in motion, you can never be completely certain what the effect will be. Therefore, even AI will not eliminate the need to make decisions with less than perfect information about the future, which is to say, under bounded rationality.
For most leaders, concepts like bounded rationality are like water for fish. You simply accept that this is the way things are because you can’t really change it. At some point, you even forget that the water exists. That is, until the water’s resistance keeps you from swimming fast enough to catch the prey you are pursuing.
Just because bounded rationality will always be a given doesn’t grant license to ignore it. Sloppy decision making is a real thing, with real consequences. So is excessive risk avoidance that results in missed opportunities. Like many things about leadership, the goal isn’t to find the right answer or the perfect balance. It’s to develop objective awareness of what is going on so that it can be questioned and improved if necessary. By objective awareness, I mean the perspective that would be offered by a seasoned professional observing the process as a bystander rather than as a participant caught up in the social pressures and emotional demands of the moment. Achieving this kind of perspective is understandably difficult for those involved. Therefore, I’m suggesting that leaders need to impose rules and processes that prevent the ill effects of bounded rationality from clouding critically important decisions, making them either riskier than they need to be or too slow to seize opportunities. Personally, I don’t think the often-recommended designation of a devil’s advocate is adequate, given the authority dynamics that operate in hierarchical systems. Additional steps need to be taken, such as building in a pause for reflection, enabling individuals to anonymously speak truth to power, or inviting external reviews that can, when necessary, reverse the momentum behind the current direction.
Or perhaps we could see if those clever people who came up with the superimposed first-down line for television could do something to indicate when we have crossed the line in terms of bounded rationality.

