Situation awareness: is it time to expand use-of-force training?

March 23, 2020 | By Lori Horne


Photo: Michael Blann/Digital Vision/Getty Images

We hear a lot about the importance of situation (or situational) awareness (SA) in law enforcement, but what do we know about how to train officers to monitor and regulate their SA in dynamic and uncertain environments?

To answer this question, we first need to understand what is meant by SA and why it is so important to decision-making and action. With this, we can begin to examine the central processes involved in developing and maintaining SA and explore how failures in SA can occur. We will then discuss the importance of monitoring and controlling (self-regulating) our internal conditions (prior knowledge, beliefs, expectations, assumptions, emotions, motivations) that influence SA accuracy and confidence. Lastly, we will highlight the need for evidence-based training that moves beyond simple performance outcomes.

Broadly stated, situation awareness is knowing what is going on in a given situation so you can act upon the right information at the right time. More technically, SA comprises three levels, best described as recognizing readily available cues (Level 1: perception), understanding the significance of those cues (Level 2: comprehension), and anticipating future events/states based on that understanding (Level 3: projection).1

The work of perception (Level 1) is accomplished through the processes of monitoring, cue detection and recognition, while the processes involved in comprehension (Level 2) include pattern matching and cue categorization. These processes allow officers to “make sense” of information from Level 1 SA and comprehend its significance in relation to individual goals and current mental models (i.e., your beliefs/representations about the situation at hand). Lastly, projection (Level 3) relies heavily on expectancies (defined as the state of thinking something will happen or be the case), enabling officers to readily anticipate future states/events.1

The central processes involved in the three levels of SA (e.g. cue recognition, pattern matching, and expectancies) allow for rapid cue-based categorization and automatic action selection, both of which are necessary in time-pressured dynamic situations.1,2,3 More specifically, cues enable us to generalize from prior knowledge and experience, while pattern matching guides expectancies and denotes a typical response.4

Pattern matching involves the ability to recognize key features in the environment that map to key features in mental models; the mental model then serves in the development of situation awareness.1 Expectancies are considered an important aspect of situation awareness in that they direct attention and interpretation and allow for the anticipation of future states. These processes support a police officer’s ability to draw on prior experience and training to quickly categorize situations and generate an appropriate response.

However, there is a dark side to all this: cue recognition, pattern matching and expectancies oversimplify the complex cognitive and motivational processes inherent in decision-making, and they are themselves prone to bias and error.5,6 This limitation is of particular importance in uncertain situations.

Let’s take a closer look at the central processes of SA described above to consider how they may actually limit response generation and cognitive adaptability. Doing so will highlight the need to continually regulate one’s own SA accuracy and confidence, thereby minimizing the risk of bias and error.

As an officer, you bring prior knowledge, experience and training to every situation, and these contribute to the formation of your expectancies. While this can serve you well, it is critical to understand how it can also catch you up, so to speak. Expectancies can be problematic in that they lead you to perceive certain cues more readily than others and may cause you to override or exclude cues that do not support your expectations.7,8,9 Seeking information that confirms expectations while discounting other available information is referred to as confirmation bias (a failure in Level 1 SA), and it results in faulty mental representations.8

When cues from Level 1 SA are misinterpreted because of faulty mental models, you are likely to miss the significance of critical cues. This misinterpretation of information is known as a representational error (a failure in Level 2 SA) and occurs when there is a mismatch between your understanding of the situation and the reality of that situation.10 When contextual cues are misinterpreted, overlooked or ignored due to unrecognized biases, officers are at risk of projecting the wrong future state (a failure in Level 3 SA). Although information may be readily available in the environment, it may go unused because it does not match prior experience and is therefore not mentally accessible to the officer. The result is decision and action based on inaccurate SA and misplaced confidence.

Research has shown that 75 per cent of errors in SA occur at Level 1 (perception), while 65 per cent of all representational errors made at Level 2 (comprehension) go undetected.10,11 Additionally, research has linked overconfidence to errors in Level 3 SA (projection).12 These findings highlight the importance of monitoring and controlling (regulating) one’s own SA confidence and accuracy, whether high or low.

Having a high level of SA confidence and high SA accuracy will likely lead to the adoption of different tactical strategies than low SA confidence and low SA accuracy.13,14 Similarly, high SA confidence combined with low SA accuracy may lead to different actions than low SA confidence combined with high SA accuracy. Being aware of these differences in the moment is critical to successful decision-making and performance.

Maintaining accurate situation awareness in dynamic and uncertain task environments requires flexibility and self-regulation in one’s cognitions and motivational states.15,16 Under conditions of uncertainty, higher-level cognitive processes help reduce judgment bias and decision error.7 In novel situations, often only one option is accessible to the decision maker, yet that option is not always the one best suited to the task conditions.17 Ill-defined, dynamic tasks require officers to be aware of their own cognitive and motivational states and to monitor and control how those states influence SA. A strategic response requires accurate SA in order to act upon the right information at the right time. Without it, officers are at risk of falling back on automatic reactions that may or may not suit the situation at hand.

When considering how to train SA, we need to develop training methodologies, measures and debriefing practices that not only explicitly distinguish between the three levels of SA but also identify SA confidence levels. Doing so will provide insight into where and how breakdowns occur and allow for investigation and diagnosis of the cognitive and motivational states that contribute to SA failure. Simple performance outcomes will not provide adequate assessment and analysis of the complex processes influencing SA accuracy and confidence. Adopting such an evidence-based approach to SA training will better enable officers to develop the cognitive skills required to successfully regulate their own SA in the moment.

Endnotes
  1. Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), 32-64.
  2. Klein, G. A. (1993). A recognition-primed decision (RPD) model of rapid decision making. In Klein, G. A., Orasanu, J., Calderwood, R., & Zsambok, C. E. (Eds.), Decision making in action: Models and methods (pp. 138-147). Norwood, NJ: Ablex.
  3. Klein, G. (1998). Sources of power. Cambridge, MA: MIT Press.
  4. Klein, G. (2008). Naturalistic decision making. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50(3), 456-460.
  5. Barrows, H. S., & Feltovich, P. J. (1987). The clinical reasoning process. Medical Education, 21, 86-91.
  6. Marcum, J. (2012). An integrated model of clinical reasoning: dual-process theory of cognition and metacognition. Journal of Evaluation in Clinical Practice, 18, 954-961.
  7. Kahneman, D., & Tversky, A. (1982). Variants of uncertainty. Cognition, 11, 143-157.
  8. Plant, K. L., & Stanton, N. A. (2012). Why did the pilots shut down the wrong engine? Explaining errors in context using Schema Theory and the Perceptual Cycle Model. Safety Science, 50(2), 300-315.
  9. Stanton, N. A., & Walker, G. H. (2011). Exploring the psychological factors involved in the Ladbroke Grove rail accident. Accident Analysis & Prevention, 43(3), 1117-1127.
  10. Jones, D. G., & Endsley, M. R. (2000). Overcoming representational errors in complex environments. Human Factors: The Journal of the Human Factors and Ergonomics Society, 42(3), 367-378.
  11. Shorrock, S. (2007). Errors of perception in air traffic control. Safety Science, 45, 890-904.
  12. Sulistyawati, K., Wickens, C. D., & Chui, Y. P. (2011). Prediction in situation awareness: Confidence bias and underlying cognitive abilities. The International Journal of Aviation Psychology, 21(2), 153-174.
  13. Endsley, M. R. (in press). The divergence of objective and subjective situation awareness: A meta-analysis. Journal of Cognitive Engineering and Decision Making.
  14. Minotra, D., & Burns, C. (2015). Finding common ground: Situation awareness and cognitive work analysis. Journal of Cognitive Engineering and Decision Making, 9(1), 87-89.
  15. Haynie, J. M., Shepherd, D., Mosakowski, E., & Earley, P. C. (2010). A situated metacognitive model of the entrepreneurial mindset. Journal of Business Venturing, 25(2), 217-229.
  16. Osman, M. (2010). Controlling uncertainty: A review of human behavior in complex dynamic environments. Psychological Bulletin, 136(1), 65-86.
  17. Kahneman, D. (2003). A perspective on judgment and choice: Mapping bounded rationality. American Psychologist, 58(9), 697-720.

Lori Horne founded SATR (a training and consulting business) out of her doctoral work, fuelled by her experience as a Toronto police officer. She can be reached at info@satr.co or through satr.co.

 

