Are medical decisions based solely on facts? Should we continue developing procedures and algorithms, or should we put greater trust in our own experience and competence? Decision-making theory can help answer these questions.
Illustration: Ørjan Jensen/Superpop
Decision-making responsibility is a key part of the medical profession. All doctors have felt the uncertainty that comes with making challenging decisions. This is as it should be: it is a responsibility we should willingly embrace.
Some fields of medicine – such as anaesthesiology and traumatology – are characterised by time-critical decisions that must be made quickly, often on the basis of incomplete information. In other areas, such as rehabilitation, the uncertainty lies in complex and lengthy decision-making processes that can stretch over weeks and months.
Traditionally, we have been good at emphasising the importance of keeping up to date on diagnostics and treatment. When we encounter errors and deviations, the solution is often to seek more knowledge and/or to draw up new guidelines and procedures. These are important and appropriate prerequisites for good medical practice, but they form only part of the solution.
In the following, I will examine the matter from a different angle: how can insight into decision-making theory make us better decision-makers? Two researchers have been – and remain – key figures in the field. Nobel Prize-winning psychologist Daniel Kahneman is best known to many as the author of the book Thinking, Fast and Slow (1), and he has played an important part in developing the heuristics and biases approach. Psychologist and researcher Gary Klein has been a leading figure in developing the theory of recognition-primed decision making (2).
Kahneman’s cognitive biases
According to Kahneman, the brain’s decision-making activity relies on two separate systems (1). System 1 makes fast and intuitive decisions without involving conscious deliberation. We intuitively recognise a familiar voice, an aggressive face or simple numerical values on a monitor without the need for focused mental attention.
While system 1 makes automatic decisions based on familiar patterns, distances and sounds, system 2 works more slowly but more analytically. It enables us to reason and reflect, such as when we try to draw conclusions from large amounts of information. Using system 2 requires concentration and conscious deliberation.
Both systems have their strengths, weaknesses and sources of error. Kahneman (1), Lighthall (3) and Stiegler (4) present several thought-provoking examples of the fallibility of our decision-making processes. The following fictional story is based on an article by Stiegler:
An anaesthetist conducts the preoperative evaluation of a girl prior to an appendectomy. She is otherwise in good health, but the girl’s mother seems to remember that the anaesthetist may have mentioned intubation problems when the girl had her tonsils removed a few years earlier. The anaesthetist sees no clinical signs of a difficult airway, and there are no notes to this effect in her medical records. Several years have passed, and he thinks that circumstances may well have changed. The anaesthetist sticks to the plan, thinking that things tend to go well. The girl’s father – a lawyer – jokingly talks about making a fuss if anything goes wrong, mumbling something about recent newspaper headlines. The anaesthetist reconsiders the situation and asks a colleague for assistance during the induction of anaesthesia, but the colleague is otherwise engaged. The video laryngoscopes are in use. The surgeon is eager to get started, and the anaesthetist begins, a little reluctantly, while telling himself that he has dealt successfully with similar situations before. Luckily, everything goes well, and the anaesthetist is happy with his own assessment and work. His colleagues, however, express scepticism at his failure to put on the brakes and factor in potentially major airway challenges.
This story illustrates a number of well-documented psychological effects that influence decision-making processes (4): We often look for information that confirms what we already believe, while ignoring contradictory information (confirmation bias). We become anchored in our original plan and struggle to reconsider it (anchoring bias). Resourceful patients and relatives influence us more than others do (the halo effect). Probability and risk are assessed on the basis of how readily we recall other incidents (availability bias). We take unnecessary risks to avoid appearing incompetent (loss aversion), yet remain largely unaware of the cognitive biases that influence us (blind spot bias). And we often assess identical information differently in retrospect (hindsight bias).
A vast number of such cognitive biases have been identified, and various mental strategies have been proposed to counteract their effects (1–4). According to Stiegler, we therefore need to recognise that clinical decision-making processes are affected by cognitive biases, individual preferences and inaccurate probability estimates, even when the diagnosis is known and treatment guidelines are available. This means that while professional expertise is essential, it is also fallible – particularly when we are working under time pressure, uncertainty and fatigue (4). Kahneman and his supporters use this as an argument for basing our decision-making processes on procedures and guidelines, precisely because we are so easily deceived by these cognitive biases (1).
Klein’s intuition
Klein maintains that experienced professionals largely make decisions based on experience and pattern recognition (2). An expert recognises patterns and elements in a new situation and compares them with earlier experiences. Different solutions are mentally simulated until he or she identifies one that might work; there is little conscious comparison of competing hypotheses. This may explain why, in times of crisis, experienced professionals are often hardly aware of the thought processes that led them to a solution.
Klein’s model was developed by studying firefighters, military personnel and members of other professions whose decision-making is characterised by urgency, uncertainty and unpredictable circumstances (2).
Klein recognises the fallibility of human experience and judgement, as demonstrated by Kahneman’s research, but points out that these cognitive biases are also – or perhaps primarily – mental shortcuts that enable us to make quick decisions. He points to experience and pattern recognition as important factors in a decision-making process and warns against the belief that procedures can replace experience in complex, unclear situations (2).
In his book Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making (2), Klein describes how Captain Bob Pearson and his first officer sat at the controls of a Boeing 767 over Canada on 23 July 1983. Suddenly the fuel pump for the left engine stopped working. Captain Pearson did not become particularly anxious; he knew the procedures for this type of problem. Soon after, the left engine shut down. The captain kept his cool and started planning for an emergency landing – there were robust procedures in place for this type of situation too. Before he had time to open the checklist, the right engine also shut down, and as if to underline the gravity of the situation, all of the cockpit lights went out. The two pilots looked at each other and realised that the impossible had happened. They were cruising at an altitude of 41 000 feet in a large aeroplane carrying 61 passengers – and they had run out of fuel. The captain realised that he was in an entirely new predicament and that there were no procedures for him to follow. He did, however, have many years’ experience as a pilot, including flying gliders. After some intense minutes, the plane came safely to a standstill on the ground. No passengers were injured, and the aeroplane was repaired and back in service two days later. The plane and the incident were named after the airport where it landed: the Gimli Glider.
Klein and the supporters of his model use this and similar stories as dramatic and important reminders that it is impossible to draw up procedures that cover all conceivable eventualities. When the complexity of the situation increases, nothing can replace experience and a high level of competence (2).
Klein versus Kahneman
So how should we approach these two apparently contradictory models? Kahneman points out that we are heavily influenced by cognitive biases even when we are knowledgeable and experienced. Klein emphasises experience and pattern recognition as the very core of expert decision-making.
The two researchers provide the answer themselves in their joint article Conditions for Intuitive Expertise: A Failure to Disagree (5). Their principal message is that the two theories complement one another more than they contradict one another. A holistic approach to good decision-making includes procedures and guidelines as well as clinical experience. Intuitive and analytical strategies both have their strengths and weaknesses. The optimal decision-making strategy will depend on factors such as the doctor’s level of competence and experience, the nature of the situation, the degree of urgency and the equipment available. The challenge is to possess sufficient mental agility to choose the right strategy for the situation at hand.
Is it possible to boost your decision-making competence through practice?
It is difficult to investigate how people make decisions in real-life situations characterised by uncertainty, urgency and inadequate information. Attempts to deconstruct such complexity tend to oversimplify the reality we want to study. Consequently, there is no compelling evidence that decision-making competence can be improved through practice. Despite the lack of firm evidence, it nevertheless seems reasonable to assume that such practice has an effect. Airlines and operational military units, for example, have long regarded decision-making competence as a key skill (1–6).
It has been suggested that, in a medical context, such training might include theoretical teaching combined with simulation and, most importantly, implementation in day-to-day practice (3, 4).
A basic understanding of decision-making theory entails being familiar with the most important decision-making models, the sources of error that influence us, and how we can counteract the biases they cause. Theoretical understanding can be built through self-study of relevant literature, discussion with colleagues and/or various forms of training – preferably starting at medical school (3, 4). This provides a common framework of concepts and definitions.
Simulations and exercises have earned an important place in medical training. Some training models are designed, beyond their purely medical content, specifically to provide practice in employing cognitive strategies (6). The simulation scenario should emulate our normal working environment and involve the same people. Case histories and feedback should address issues such as how clinical information is perceived, ignored and prioritised. This stimulates conscious choices and metacognition: why am I thinking the way I do (6)?
Clinical implementation can be achieved in various ways, and the organisation of day-to-day work routines should facilitate it. Self-monitoring helps us to recognise sources of error in our own decision-making processes and to apply cognitive strategies that minimise their impact. This can prevent potentially dangerous situations from arising. Experiencing and understanding that urgency and incomplete information are the norm can help to reduce frustration and the feeling of inadequacy. Good routines for continual and direct feedback on the effects of one’s own assessments are a prerequisite for the development of expertise; Kahneman and Klein refer to such conditions as a high-validity environment (5).
Concepts from decision-making theory can contribute to precise communication in emergencies, and when we retrospectively assess each other’s decisions and try to analyse the basis on which they were made. This is especially true when dealing with rare and/or undesirable incidents. They are therefore important tools in our quality improvement work.
We should use and further develop procedures for situations that lend themselves to this type of approach. Procedures are highly appropriate when the patient’s symptoms are limited and the treatment goal is clearly defined. Such procedures can ensure that patients quickly receive the correct treatment for acute myocardial infarction, or have their blood sugar gradually corrected in diabetic ketoacidosis. Good algorithms contribute to efficient treatment of trauma patients, and checklists minimise oversights. Procedures are a good starting point for beginners, but should not constrain experienced professionals who can identify more flexible solutions.
At the same time, we should avoid procedure-based practice wherever it is unhelpful (5). Procedures are often inappropriate in complex situations. Strict procedural thinking can dull the caution and professional curiosity of colleagues, thereby undermining the purpose of the procedures over time (5). No collection of procedures can cover every eventuality. It is impossible to treat a seriously ill patient with a long case history, vague symptoms and poorly defined treatment goals by following a flow chart, just as it is impossible for a pilot to look up the correct procedure when faced with landing an aircraft that has run out of fuel.
If we combine sound procedures with well-developed clinical judgement, the result will be decisive doctors, robust decisions and better treatment of our patients.