There was data to prove it

Data, data, data. At times I get the distinct impression that people are tired of hearing me rant on and on about the need for data.

April 11, 2017 | By Dorothy Cotton


I was recently asked to comment on some training that a particular police service had provided. “What do you think?” they asked. It looked really good. It ticked all the little boxes that this kind of training ought to tick. The officers said it was good. The trainers appeared to know their stuff. But when I asked if it actually worked, like out in the field, I got blank stares. There was no way of measuring that. It sounded good. It sounded like it would have the desired effect. However, I think we all know that just because something sounds good and we think it OUGHT to achieve a certain purpose, doesn’t mean it really does.

I ran across a research article the other day that made this point in spades. You may recall a column I wrote for the December 2014 issue of Blue Line Magazine that described a procedural justice intervention that was carried out in Australia. The gist of the project was that officers were given a script to use when conducting random breath tests on drivers. The script involved emphasizing a number of attributes related to procedural justice and the hypothesis was that emphasizing procedural justice would improve the public’s level of trust in police and their confidence in the police’s ability to do their job fairly. It worked. There was data to prove it.

Since this sounded like such a good idea, some people in Scotland decided to adopt the program. They made a few adjustments, ran a similar program, and the results were… bad. The intervention had the opposite effect to the one they expected and wanted. While the study was in progress, some officers were required to do the usual thing (whatever that was), while others were given scripts emphasizing procedural justice concepts, along with leaflets to give the drivers offering more explanation of what they were doing. The scripts were a little different from the ones the Australians used. In the Australian study, the officers were to use the script pretty well verbatim. In the Scottish study, the script was more a set of suggestions, and officers were to use their judgment about how and when to say what. The researchers measured the public’s attitudes before and after the study. The opinions of people who received the “business as usual” treatment from the police improved. The opinions of people who got the new improved (?) version focusing on procedural justice got worse.

What’s with that? My first thought was that maybe Australians are easier to please and Scots are just generally ornery (I have no data to support that belief), but the authors of a paper recently published in the Journal of Experimental Criminology¹ had a few other ideas. It is of course possible that the strategy simply does not work, and that the positive findings in Australia were a fluke. But there are other options.


In the Scottish program, as mentioned, the police also gave out a leaflet. It turns out that not everyone remembered getting one. It is not known whether these people never actually received the leaflet or simply did not remember it; in any case, people who did not remember getting a leaflet were more negative in their comments, and were less likely to feel that they had been given an opportunity to express their own point of view.

A second possibility is that the officers in the Scottish study really did not implement the program the way it was intended, and did not do what the Higher Ups thought and wanted them to do. It may be noteworthy that the Scottish officers objected to a script and thus were left with some discretion, as compared to the Australian group. Perhaps the Scots chose their words badly. Not all officers actually received any instruction about the program; many were simply handed a sheet saying “Do this.” Perhaps by the time the information and direction had filtered down from the research team to senior management to the inspector to the sergeant to the constable, some stuff got lost along the way. In other words, maybe the program did not work because the officers did it wrong, or at least did not do it as the program designers intended.

The third possible explanation is that the failure of the program might have had something to do with the organizational context within which it was carried out. I think we all know what happens when the Powers-That-Be think something is a good idea and the folks on the street either don’t agree or get their feathers ruffled about being told how to do their job. If the quotes in the cited article are any indication, the officers who were supposed to deliver the Scottish program were less than amused by the whole thing. Some ignored it, many thought it was a bad idea to start with, and some were insulted, considering that they knew better. Overall, the officers’ reaction to the program was negative. The program was also introduced during a time when many other organizational changes were going on, and it is well known that many of us do not exactly embrace change. There was, in addition, a great deal of tension between management and the frontline people at that time. The article examines these factors at some length. I will spare you the details, but I will be surprised if any of this surprises you.

So is it the case that the program itself does not work? Well, yes and no. Here is where I return to the subject of data and its importance. If you simply lifted the program from Australia and implemented it, you’d be puttering along thinking you were revolutionizing the public’s attitudes toward and confidence in police—while in fact you were making things worse. It would be handy to know that, regardless of the reasons. But if you don’t ask, you don’t know.

¹ Sara MacQueen and Ben Bradford (2016). Where did it all go wrong? Implementation failure—and more—in a field experiment of procedural justice policing. Journal of Experimental Criminology. DOI: 10.1007/s11292-016-9278-7

Dr. Dorothy Cotton is Blue Line’s psychology columnist. She can be contacted at: deepblue@blueline.ca.

