
December 2, 2014

By Christopher Chabris and Daniel Simons

Neil deGrasse Tyson, the astrophysicist and host of the TV series “Cosmos,” regularly speaks on the appalling state of science literacy. One of his staple stories hinges on a line from President George W. Bush’s speech to Congress after the 9/11 terrorist attacks. In a 2008 talk, for example, Dr. Tyson said that in order “to distinguish we from they” – meaning to divide Judeo-Christian Americans from fundamentalist Muslims – Mr. Bush uttered the words “Our God is the God who named the stars.”

Dr. Tyson implied that President Bush was prejudiced against Islam in order to make a broader point about scientific awareness: Two-thirds of the named stars actually have Arabic names, given to them at a time when Muslims led the world in astronomy – and Mr. Bush might not have said what he did if he had known this fact.

This is a powerful example of how our biases can blind us, but not in the way Dr. Tyson thought. Mr. Bush wasn’t blinded by religious bigotry. Instead, Dr. Tyson was fooled by his faith in the accuracy of his own memory.


In his post-9/11 speech, Mr. Bush actually said, “The enemy of America is not our many Muslim friends,” and he said nothing about the stars. Mr. Bush did once say something like what Dr. Tyson remembered; in 2003, in tribute to the astronauts lost in the Columbia space shuttle explosion, he said that “the same creator who names the stars also knows the names of the seven souls we mourn today.” Critics pointed these facts out; some accused Dr. Tyson of lying and argued that the episode should call into question his reliability as a scientist and a public advocate.

When he was first asked for the source of Mr. Bush’s quotation, Dr. Tyson insisted, “I have explicit memory of those words being spoken by the president. I reacted on the spot, making note for possible later reference in my public discourse. Odd that nobody seems to be able to find the quote anywhere.” He then added, “One of our mantras in science is that the absence of evidence is not the same as evidence of absence.”

That is how we all usually respond when our memory is challenged. We have an abstract understanding that people can remember the same event differently, but when our own memories are challenged, we may neglect that understanding and instead respond emotionally, acting as though we must be right and everyone else must be wrong.

Overconfidence in memory could emerge from our daily experience: We recall events easily and often, at least if they are important to us, but only rarely do we find our memories contradicted by evidence, much less take the initiative to check if they are right. We then rely on confidence as a signal of accuracy – in ourselves and in others. It’s no accident that Oprah Winfrey’s latest best seller is called “What I Know For Sure,” rather than “Some Things That Might Be True.”

Our lack of appreciation for the fallibility of our own memories can lead to much bigger problems than a misattributed quote. Memory failures that resemble Dr. Tyson’s mash-up of distinct experiences have led to false convictions, and even death sentences.

A critical concern about eyewitness memory is the sometimes tenuous relationship between the accuracy of a witness’s memory and his confidence in it. In general, if you have seen something before, your confidence that you have seen it and your accuracy in recalling it are linked: The more confident you are in your memory, the more likely you are to be right. New research reveals important nuances about this link.

In a paper published last year, cognitive psychologists Henry L. Roediger III and K. Andrew DeSoto tested how well people could recall words from lists they had studied, and how confident they were in their recollections. For words that were actually on the lists, when people were highly confident in their memory, they were also accurate; greater confidence was associated with greater accuracy. But when people mistakenly recalled words that were similar to those on the lists but not actually on the lists – a false memory – they also expressed high confidence. That is, for false memories, higher confidence was associated with lower accuracy.

To complicate matters further, the content of our memories can easily change over time. Nearly a century ago, psychologist Sir Frederic Charles Bartlett conducted a series of experiments that mimicked the “telephone” game, in which you whisper a message to the person next to you, who then passes it along to the person next to them, and so on. Over repeated tellings, the story becomes distorted, with some elements remaining, others vanishing, and entirely new details appearing.

When we recall our own memories, we are not extracting a perfect record of our experiences and playing it back verbatim. Most people believe that memory works this way, but it doesn’t. Instead, we are effectively whispering a message from our past to our present, reconstructing it on the fly each time. We get a lot of details right, but when our memories change, we only “hear” the most recent version of the message, and we may assume that what we believe now is what we always believed. Studies find that even our “flashbulb memories” of emotionally charged events can be distorted and inaccurate, but we cling to them with the greatest of confidence.

With each retrieval our memories can morph, and so can our confidence in them. This is why a National Academy of Sciences report strongly advises courts to rely on initial statements rather than courtroom proclamations: A witness who only tentatively identifies a suspect in a police station lineup can later claim – sincerely – to be absolutely certain that the defendant in the courtroom committed the crime. In fact, the mere act of describing a person’s appearance can change how likely you are to pick him out of a lineup later. This finding, known as “verbal overshadowing,” had been controversial, but was recently verified in a collective effort by more than 30 separate research labs.

The science of memory distortion has become rigorous and reliable enough to help guide public policy. It should also guide our personal attitudes and actions. In Dr. Tyson’s case, once the evidence of his error was undeniable, he recognized that the evidence outweighed his experience, and he publicly apologized.

Dr. Tyson’s decision is especially apt, coming from a scientist. Good scientists remain open to the possibility that they are wrong, and should question their own beliefs until the evidence is overwhelming. We would all be wise to do the same.


Christopher Chabris is a psychology professor at Union College; Daniel Simons is a psychology professor at the University of Illinois. This is an edited version of an article that ran in the Dec. 1, 2014, opinion pages.
