Debunking Hollywood: What Sci-Fi Movies Get Wrong About Memory and Learning

Angela Cabotaje

Imagine if you could learn kung fu in an instant. Or implant an idea into someone else’s mind. Or tap into dormant parts of your brain to gain superhuman abilities.

Sound familiar?

These are scenes and plotlines pulled from popular science-fiction movies. And, sure, they’re entertaining, but how many of them are rooted in real science and which ones are total fiction?

Enter Elizabeth Buffalo, a professor in the University of Washington Department of Physiology & Biophysics and an investigator at UW Medicine’s Alzheimer’s Disease Research Center.

What monkeys teach us about memory

While Buffalo isn’t a Hollywood scriptwriter, she does know a lot about the brain.

She studies the parts of the brain that are important for memory by recording neural activity from both humans and rhesus macaque monkeys.

“Monkeys make an excellent model because their memory structures are so similar to those in human brains, and we can train them to do complex memory tasks that are identical to tests we use in human subjects,” she explains.

In one such task, called the Wisconsin Card Sorting Test, monkeys are asked to sort a deck of cards by color, number or suit.

“They figure out by trial and error what to match on,” Buffalo says. “At some unknown point in the game, the rules switch. All of a sudden, they’re getting the sorting wrong, and they have to realize that and try to figure out which feature is now correct. We want to know, in terms of neural physiology, what is it that they’re doing during the course of trial and error as they learn the new rule.”
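For readers who like to see the logic spelled out, here is a minimal, hypothetical Python sketch of that rule-switch idea. It is not code from Buffalo's lab; the card features, key cards and the simple "switch your guess after an error" strategy are all illustrative assumptions. What it does capture is that the player only ever gets right-or-wrong feedback, so the hidden rule, and any unannounced change to it, has to be discovered by trial and error.

import random

# Toy sketch of a Wisconsin Card Sorting-style game (illustrative only).
# The "experimenter" secretly scores matches by one feature, silently switches
# the rule partway through, and a simple learner recovers it by trial and error.

FEATURES = ["color", "number", "suit"]

KEY_CARDS = [
    {"color": "red", "number": 1, "suit": "circle"},
    {"color": "green", "number": 2, "suit": "star"},
    {"color": "blue", "number": 3, "suit": "cross"},
]

def draw_card():
    # Each feature value appears on exactly one key card, so every drawn card
    # has a single "correct" key card under any given rule.
    return {
        "color": random.choice(["red", "green", "blue"]),
        "number": random.choice([1, 2, 3]),
        "suit": random.choice(["circle", "star", "cross"]),
    }

def play(trials=40, switch_at=20):
    hidden_rule = "color"              # the experimenter's secret sorting rule
    guess = random.choice(FEATURES)    # the learner's current hypothesis

    for t in range(trials):
        if t == switch_at:
            hidden_rule = "suit"       # unannounced mid-game rule switch

        card = draw_card()
        # The learner places the card on the key card that matches its guessed feature.
        chosen = next(k for k in KEY_CARDS if k[guess] == card[guess])
        correct = chosen[hidden_rule] == card[hidden_rule]
        print(f"trial {t:2d}: sorted by '{guess}' -> {'correct' if correct else 'wrong'}")

        if not correct:
            # Trial and error (lose-shift): abandon the hypothesis and try another feature.
            guess = random.choice([f for f in FEATURES if f != guess])

if __name__ == "__main__":
    play()

Run it and it will typically print a streak of wrong sorts right after the hidden rule changes, then settle back into correct sorting once the learner stumbles onto the new feature.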


By analyzing data from sessions like these, in which monkeys sort cards, play video games and navigate virtual mazes (yes, the prize is a digital banana), Buffalo hopes to better understand how the neural mechanisms that support learning and memory actually work.

The National Institutes of Health’s BRAIN Initiative recently awarded her a $12 million grant to further study rapid learning.

So where does all this research leave those sci-fi flicks in all their maybe-science, maybe-not glory?

Buffalo starts by sharing what researchers already know about the human brain, memory and learning — and plays movie critic for the science behind some notable films.

What we know about memory and Alzheimer’s

Much of the modern research about memory stems from British-Canadian neuroscientist Brenda Milner’s work with patient H.M., starting in the 1950s.

H.M. lived with debilitating seizures due to epilepsy and eventually had the medial temporal lobes — responsible for processing sensory information — removed from both sides of his brain. The surgical procedure relieved his seizures but left him with a profoundly impaired memory.

He had what’s called anterograde amnesia, or the inability to form new long-term memories. H.M. could, for example, remember stories from his childhood or repeat a phone number back to you immediately, but he couldn’t retain new information over the long term.

“This told us a couple of things: that memory is a separate cognitive function, because his intellect was still intact although his memory was impaired. And, because some forms of memory were spared, that different forms of memory depend on different brain structures,” Buffalo explains.

In essence, the medial temporal lobes are not where memories get stored for the long term. After all, H.M. could still recall moments from when he was young.

But our brains still need the medial temporal lobes — containing structures like the hippocampus and surrounding cortex — to form new memories and consolidate those memories into learned information.

Milner’s findings laid a foundation for what we have since discovered about brain disorders like Alzheimer’s and dementia.

“We know the pathology associated with Alzheimer’s starts in the medial temporal lobes, with the degeneration of neurons in the cortex just outside the hippocampus,” Buffalo says. “The first symptom that you might see in Alzheimer’s is difficulty forming new memories. You can’t remember what you had for lunch, but you can remember your kids and old family stories.”

Separating science from fiction in movies

While researchers like Buffalo are striving to uncover more about how memory and learning work in the human brain, Hollywood doesn’t seem quite as concerned with the science part of it.

How do these popular sci-fi films score with Buffalo?

“Lucy”

Humans only use 10 percent of their brain … well, that’s what this 2014 film headlined by Scarlett Johansson would like you to believe.

The movie follows main character Lucy, who ingests a synthetic drug and gains abilities like psychokinesis, mental time travel, telepathy and superhuman strength as she unlocks her full cerebral capacity.

Buffalo gives this one two cerebral thumbs down.

“The 10 percent brain myth is absolutely not true,” Buffalo says. “We know that the brain is active all the time, and it’s likely just that the whole brain is active across disparate regions.”

“The Matrix”

Whether you choose the red pill or the blue pill, we can probably all agree that instantly downloading information directly into our brains would be pretty cool.

In one of the most memorable scenes from this 1999 movie starring Keanu Reeves, a database of kung fu techniques is uploaded into his character Neo’s brain almost instantly, prompting the now-famous line: “I know kung fu.”

While it’s not possible to download years of knowledge into our brains as fast as loading a flash drive, Buffalo says there is evidence for ways we can improve our memory and learning.

“When we have a new experience, this activates the hippocampus and there’s activity in the cortex that involves sensory information,” she explains. “Studies in animals and humans have shown a replay of activity both in the hippocampus and the cortex that primarily happens during sleep. This replay of activity is thought to support memory consolidation.”

Sleep equals better learning? Sounds like it’s nap time.

“Eternal Sunshine of the Spotless Mind”

In this 2004 romantic comedy meets science-fiction movie, Jim Carrey and Kate Winslet portray two former lovers who erase their memories of each other after a bad breakup but unwittingly begin dating again.

Who wouldn’t want to get over a broken heart that easily?

While Buffalo says there aren’t any known cases of selective amnesia, there is research that shows some potential for erasing specific memories.

During one study with mice, scientists tagged brain cells that are active when a memory is formed.

The researchers then taught the mice to associate a specific room with a negative memory. Eventually, they destroyed just the tagged cells, effectively erasing that negative memory from the mice.

A follow-up study is being conducted to see if the same holds true for positive memories.

“Inception”

You have a great idea. But is it really your great idea? Or was it someone else’s great idea that they want you to think is your own great idea? Wait, does that mean it’s not a great idea?

These are the kind of mind-bending questions that await in this 2010 sci-fi action flick starring Leonardo DiCaprio, who plays a criminal trying to implant an idea into another person’s subconscious.

Scarily enough, Buffalo says there’s actually some real science behind this idea of implanting false memories.

The most well-known studies on the matter were conducted by psychologist and former University of Washington professor Elizabeth Loftus, who demonstrated that memory is malleable and susceptible to outside suggestion.

“She did experiments in which she was able to implant memories for events that absolutely didn’t happen,” Buffalo notes. “She showed that about 25 percent of her subjects could develop a vivid, detailed memory of getting lost in a mall when it didn’t happen.”

Yikes.

“Memento”

A man on a quest to catch his wife’s killer has a slight problem: he can’t remember any new information for longer than five minutes.

That’s the premise of this 2000 psychological thriller, where Guy Pearce plays someone with anterograde amnesia — what patient H.M. had — and relies on a patchwork of tattoos and photographs to remember things from previous days.

Buffalo’s take?

“This one gets a gold star,” she says. “The main character suffered damage to the medial temporal lobe, and the movie is actually pretty accurate regarding his memory loss.”

OK, maybe Hollywood doesn’t always get it wrong.