Survey says… enough already
February 27, 2015 By Dorothy Cotton
I just got back from ten days away – a combination of business and personal travel. Much as I like to travel, I really hate coming back. Not that my life is awful or anything – quite the contrary, but two things happen when one has the gall to go away:
1. Everything piles up on your desk awaiting your return, making you wish you had never left; and
2. Every single person that you so much as made eye contact with during your sojourns sends you extensive surveys and questionnaires to determine your experience with their hotel/restaurant/service/merchandise.
I have nothing to say about #1 but a whole lot to say about #2. The fact is that I am all data’d out. Done. Kaput. I do not want to tell one more person how I felt on a scale of 1 to 7 about the colour of the wait staff’s uniforms or whether the bathtub was clean.
It’s not just travel-related surveys that cross my desk. Psychology is a research profession, so everyone and their dog is trying to collect data about something. I returned to find my mailbox overflowing with requests to complete surveys about my supervision style, my way of selecting interns, my satisfaction with various psychology-type services, my view on someone’s new website… you get the picture.
Don’t get me wrong – I actually enjoy completing surveys. I even volunteered for an agency that does nothing but ask you to complete surveys – but quit after answering what looked to me to be the same set of questions five different times.
So I am done. DONE. But alas, while I say that today, tomorrow I will turn into a hypocrite and start collecting data for something we don’t seem to know enough about. There are lots of those sorts of things in police psychology. What is the most effective way of contributing to officer selection and promotion? Which model of joint police/mental health response works best and what do the various models achieve?
How does one design and implement a workplace-based mental health initiative that is accepted and works? What is the best way of re-integrating injured folks back into the workplace? I could go on… but every one of these questions requires collecting even more data! How does one get around the problem of surveying people to death?
I have a few suggestions:
1. Stop collecting data that you never use or look at. I do not know a single agency that doesn’t collect a package of “routine statistics.” Are you using them? Do they tell you anything new? If the question has already been answered, stop asking it. (Mind you, I do realize that you are mandated to collect a whole bunch of statistics, many of which are useless. Good luck with that.)
2. May I repeat… once you have answered a question, stop asking it. If you’ve done community satisfaction surveys every three years for the last few decades and they all say the same things, stop asking – or ask less often. Most questions can be answered through time sampling: you collect data for a month, or for the first week of several consecutive months, or once per season, then quit.
3. Before you ask for an answer, figure out what the question is. If there is no question in mind, you don’t need the data. Do you really need to know whether people who have never called the police think the response time is quick enough? If the answer is yes, then ask the question, but if it is no….
4. Focus your survey. Maybe a blunderbuss approach that involves asking everyone everything forever is not the way to go. Identify a specific subpopulation, ask them enough questions about a specific topic over a specified period of time, then see #2 above. If you have answered the question, quit.
5. Use the data that exist – whether they are yours or someone else’s. There is a ton of data out there, so there’s often no need to generate more. Likewise, if you have data, make them available to others.
6. Collect GOOD data. Make sure the people designing the surveys know what they are doing, and make sure that you can use the information you collect. Once collected, data take on a life of their own; this happens whether the data are good or bad, so make them good. If you don’t know the difference, find someone who does.
Also, if you are going to collect data, make sure you collect enough that you can stand behind your conclusions – because otherwise someone else has to do it all over again. The Internet has made it easy to collect bad data. If you send a questionnaire to every police officer in Canada and get responses from 237 of them, what on earth will that tell you?
7. Consider teaming up with other agencies to collect data so you all collect the same information, which makes it infinitely more usable. Data are hard to interpret when there is no reference point, and easy to dismiss if they seem to apply only to you and no one else.
8. Don’t make the survey any more complex than it needs to be. Not everything merits a 10-point Likert scale. If you want my opinion on whether your hotel bathtub was clean enough, the answer is pretty well “yes” or “no.” I don’t think my bathtub inspection skills are up to discerning whether the cleanliness was a 6 or a 7 on a 10-point scale.
9. Make it apparent to the people who have contributed data that you are actually using it. As noted above, if you do not plan to use it, don’t ask. If you do use the data, let me know. I might be more inclined to complete your next survey.
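To see why 237 responses from every police officer in Canada tells you so little, a quick back-of-the-envelope calculation helps. This is a minimal sketch in Python, assuming a population of roughly 70,000 officers (an illustrative figure, not from the post); note that the margin of error it computes is the best case, since it ignores the self-selection bias that dominates at a sub-1% response rate.

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion, with the
    finite-population correction. p=0.5 is the worst case; z=1.96
    corresponds to a 95% confidence level."""
    se = math.sqrt(p * (1 - p) / n)          # standard error of the proportion
    fpc = math.sqrt((N - n) / (N - 1))       # finite-population correction
    return z * se * fpc

N = 70_000   # assumed number of police officers in Canada (illustrative)
n = 237      # responses received

print(f"response rate:   {n / N:.1%}")           # well under 1%
print(f"margin of error: +/-{margin_of_error(n, N):.1%}")
```

Even on its own terms the estimate is coarse (a margin of error around six percentage points), and no correction term can account for who chose to answer and who didn’t, which is the real objection in the text above.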
Now that I have all that off my chest, I will wade through the many requests for my feedback. I will tell Via Rail that I hate their new meals – even though I’m sure they already know everyone hates them. They are cheaper and require less staff time to prepare so they are unlikely to heed my advice – and besides, I have told them this a zillion times already.
I will tell the university student that I do not provide the kind of services she is asking about – and do not even live in her country.
I will redirect the person doing the survey on a particular aspect of police psychology to the existing literature because I don’t see what his survey will add to what we already know.
I will point out to the local social service agency that its survey does not make sense and the numbers are not in order.
I will tell the latest conference organizer that asking if the chairs were comfortable and the speakers good really does not provide useful information to anyone.
I think I will design a study about bad surveys and useless data.