By Apollo Kowalyk
A paradox is defined as a situation, person, or thing that combines contradictory features or qualities. Unfortunately, it also describes how records management systems work at cross purposes with the information needs of police investigations.
The architecture of conventional database systems is rooted in the industrial thinking of the World War II era. O.W. Wilson, an early voice of police reform, reflected this frame of mind when he wrote the book on police records classification in 1942.1 The influence of his prescriptive methods is still evident today, manifesting itself in the “tyranny of the case file,” which U.S. Senator Richard Shelby identified in a 2002 Senate Committee Report on the need for post-9/11 intelligence reform.2 This “case file mentality” characterizes the way in which law enforcement agencies gather and organize information according to a sequential case number and store it in file folders, electronic or otherwise, where the information quietly waits for somebody to find it through a precise search. Centralized record keeping was intended to provide greater control over police operations, but criminal intelligence was never part of the equation.
This is why law enforcement agencies continue to struggle with an intelligence paradox: the ability to generate investigative insight is hindered by the very systems that store the data we need to solve crimes.
“Bolt-on” solutions have been incorporated with little success. Hotspot maps are generated to show trends and patterns, but this postdictive approach, based on the post-mortem analysis of criminal activity, is of limited use to frontline police officers. The notion of predictive analytics has generated some attention in recent years, but it remains mere conjecture at this point. It is almost impossible to predict where criminal activity will occur in the near future, except perhaps at an aggregate level, such as anticipating an increase in assaults in an entertainment district on Friday and Saturday nights. Criminal activity might average out over time; however, science does not work in terms of averages. Just as you can drown in a lake that averages three feet deep, crime statistics can be misleading.
But the problem is deeper than this. Clearance rates are driven by arrests made by patrol officers who catch the perpetrator at the scene or identify a suspect through information provided by the victim or witness. This is not a new phenomenon. The startling difference in clearance rates, in a comparison of investigations involving named and unnamed suspects, was recognized in a 1966 study within the Los Angeles Police Department, which lay buried in the voluminous 1967 publication, The Challenge of Crime in a Free Society.3 Clearance rates were approximately seven times higher when a suspect was identified early in the investigation. This disparity still exists and has enormous implications for investigative practices today.
This can be explained by analogy to an unsolved problem in computer science, commonly known as the P versus NP problem. It asks whether every problem whose solution is easily verified can also be easily solved. For example, confirming a password is much easier than figuring it out in the first place. This speaks to the importance of having a named suspect at the start of an investigation: relevant information flows forth from various databases once you know where to look. Without a named suspect, however, that same information remains hidden from view, because the investigator may not know where to look for it, or that it even exists.
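The verify-versus-search gap described above can be sketched in a few lines of Python. This is an illustrative toy only; the password, alphabet, and length bound are arbitrary assumptions, not part of any real system. Checking a known candidate is a single comparison, while finding the password blind means enumerating the candidate space:

```python
import itertools
import string

def verify(candidate, password):
    """Verification: one cheap comparison."""
    return candidate == password

def search(password, alphabet=string.ascii_lowercase, max_len=4):
    """Search: brute-force every candidate until one verifies.

    Returns the number of candidates tried before the password is found.
    """
    tries = 0
    for length in range(1, max_len + 1):
        for combo in itertools.product(alphabet, repeat=length):
            tries += 1
            if verify("".join(combo), password):
                return tries
    return None

# Verifying a guess is one check; searching blind for a 4-letter,
# lowercase password can take up to 26 + 26**2 + 26**3 + 26**4 = 475,254 checks.
print(search("dog"))
```

The same asymmetry is why a named suspect transforms an investigation: verification (checking databases against a known name) is cheap, while open-ended search is not.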
The same asymmetry plays out in the real world: school shootings and terrorist attacks often occur without warning, with the attacker’s identity and motive becoming clear to investigators only in hindsight. Identifying a suspect before a spree killing or terrorist attack allows police to intervene and prevent a potentially catastrophic incident, but in the absence of sophisticated intelligence-gathering practices, which require a level of investment far beyond what the vast majority of police agencies can afford, this type of investigative success often depends on luck. The best solution, therefore, is one in which the records management system itself does this work. But how?
Resolving the Intelligence Paradox
Most analytical insight arrives unexpectedly, but we can’t rely on serendipity and happenstance to solve crime. Although records management systems are good at storing data, they are poor at matching related data points to generate leads. A task of this magnitude requires automated, intuitive algorithms, which are increasingly important to mission and enterprise needs in a post-9/11 world.
IBM’s Jeff Jonas refers to this as a process of “sense making,” in which “data talks to data” within a Context Computing model.4 Once entered into the system, each new datum automatically introduces itself to other data points and decides whether a connection exists, perhaps resulting in the unexpected discovery of certain relationships or patterns that raise a red flag. The ability to discover leads that can solve crimes or protect an officer’s life through enhanced situational awareness will one day become the gold standard for analytical algorithm design, measured by the ability to increase clearance rates by making use of information we didn’t even know we had. Only then will we overcome the intelligence paradox.
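The “data talks to data” idea can be illustrated with a minimal sketch in Python. This is not IBM’s actual G2 implementation; the record names, attribute fields, and index design here are all hypothetical, chosen only to show the principle: if every incoming record is indexed by its attribute values, then a shared value (a phone number, an address) surfaces a connection at the moment of ingest, with no analyst query required.

```python
from collections import defaultdict

class ContextStore:
    """Toy context-computing store: new data 'introduces itself' on arrival."""

    def __init__(self):
        self.index = defaultdict(set)   # attribute value -> record ids
        self.records = {}

    def add(self, record_id, attributes):
        """Ingest a record; return the ids of records it connects to."""
        matches = set()
        for value in attributes.values():
            matches |= self.index[value]   # who else has this value?
            self.index[value].add(record_id)
        self.records[record_id] = attributes
        return matches

store = ContextStore()
store.add("case-101", {"phone": "555-0100", "plate": "ABC123"})
hits = store.add("tip-7", {"phone": "555-0100", "address": "12 Elm St"})
print(hits)  # the new tip immediately connects to case-101 via the shared phone
```

The design choice that matters is that matching happens at write time, not at query time: the lead is generated when the datum arrives, rather than waiting for an investigator to think of the right search.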
1 – O.W. Wilson,
3 – Herbert H. Isaacs, “A Study of Communications, Crimes, and Arrests in a Metropolitan Police Department,”
4 – Jeff Jonas, “G2 Is 4,”
<<< BIO BOX >>>
Apollo Kowalyk is a Staff Sergeant with the Edmonton Police Service. He is also the former Director of the Alberta Gang Reduction Strategy, Justice and Attorney General, Safe Communities and Strategic Policy.
He may be reached by email at Apollo.Kowalyk@gov.ab.ca.