How Yale Scientists Are Trying to Read Minds

Credit: digitalbob8 / Flickr Creative Commons

New research out of Yale University is claiming clairvoyance. It's called "neuroimaging," a fancy way of saying scientists are reading your mind.


"If you see any kind of futuristic movie where they have these brain scanners, and they're reading out people's thoughts, that is kind of what we're doing," said Marvin Chun, professor of psychology and neurobiology at Yale.

Chun recently teamed up with Brice Kuhl and a former undergraduate student, Alan Cowen, who presented him with an interesting question: is it possible to reconstruct a face using nothing more than someone's memory?

To attempt this, the team did two things. First, they built a library of "brain data," showing test subjects 300 different "training faces" and recording the brain activity elicited by each.

An fMRI brain scan. Credit: Nathanial Burton-Bradford / Creative Commons

Then, they built up a database of "face data," projecting the "training faces" onto computer models called "eigenfaces."

Chun said, "They're basically mathematical descriptions, or summaries, of different facial features, [or] what we would call components."
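Eigenfaces come out of a standard technique called principal component analysis: each face image is treated as a long vector of pixels, and the "components" Chun describes are the directions along which faces vary most. A minimal sketch of the idea, using random pixel data as a stand-in for real face photos (the 64×64 image size is illustrative; the 300-face count matches the study's training set):

```python
import numpy as np

# Stand-in data: 300 grayscale "training faces", each flattened to a
# vector of pixels (64x64 here purely for illustration).
rng = np.random.default_rng(0)
faces = rng.random((300, 64 * 64))

# Center the data around the mean face.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# PCA via SVD: the rows of Vt are the eigenfaces, i.e. the
# "mathematical summaries" of facial variation Chun mentions.
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = Vt

# Any face reduces to a short vector of component scores...
scores = centered @ eigenfaces.T

# ...and can be approximately rebuilt from its top-k components.
k = 50
approx = scores[:, :k] @ eigenfaces[:k] + mean_face
```

The payoff is compression: instead of predicting thousands of pixels from brain activity, the model only has to predict a handful of component scores per face.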

A diagram shows how Marvin Chun's neuroimaging process works. Credit: NeuroImage

In the study, published in the March edition of the journal NeuroImage, Chun's team next showed test subjects new faces, then used the brain and face data to predict and reconstruct the image locked inside each viewer's mind.
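The two-step pipeline described here, learning how brain activity maps onto eigenface components and then inverting that map for an unseen face, can be sketched with a plain least-squares fit. Everything below is randomly generated stand-in data, and the study's actual estimator and dimensions differ; this only illustrates the shape of the computation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_voxels, n_components = 300, 500, 50

# Stand-ins: fMRI responses recorded while subjects viewed the 300
# training faces, and each face's eigenface component scores.
brain_train = rng.random((n_train, n_voxels))
scores_train = rng.random((n_train, n_components))

# Step 1: fit a linear map from brain activity to face components.
W, *_ = np.linalg.lstsq(brain_train, scores_train, rcond=None)

# Step 2: for a new, unseen face, predict its component scores from
# the measured brain response alone...
brain_new = rng.random((1, n_voxels))
predicted_scores = brain_new @ W

# ...then render the "guess" by mixing eigenfaces with those scores.
# (In the real pipeline these would come from the face database.)
eigenfaces = rng.random((n_components, 64 * 64))
mean_face = rng.random(64 * 64)
guessed_face = predicted_scores @ eigenfaces + mean_face
```

The output is exactly what Chun describes: not the real face, but the model's best guess at it, rendered as an image.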

Think of him as a really tech-savvy police sketch artist. "What people actually see are real faces," Chun said. "What we generate are our guesses -- our pictures of faces that we think are most closely matched to what we think people were looking at."

Chun said the study has many immediate practical implications, like helping scientists better understand how the brains of autistic children and people with face blindness work. There are also long-term possibilities: helping police develop sketches of suspects, or reconstructing an image seen only in a dream.

Then there's the stuff you see in Hollywood. "I watched the movie 'Divergent' recently, before our paper came out," Chun said. "They have this sequence where they're reading out people's thoughts using brain scanners. I think our work is a step toward that futuristic reality."

Patrick Skahill is a reporter and digital editor at Connecticut Public. Prior to becoming a reporter, he was the founding producer of Connecticut Public Radio's The Colin McEnroe Show, which began in 2009. Patrick's reporting has appeared on NPR's Morning Edition, Here & Now, and All Things Considered. He has also reported for the Marketplace Morning Report. He can be reached at pskahill@ctpublic.org.
