BERLIN – At a laboratory in Germany, volunteers slide into a donut-shaped MRI machine and perform simple tasks, such as deciding whether to add or subtract two numbers, or choosing which of two buttons to press.
They have no inkling that scientists in the next room are trying to read their minds — using a brain scan to figure out their intention before it is turned into action.
In the past, scientists were able to detect decisions to make physical movements before those movements were carried out.
But researchers at Berlin's Bernstein Center for Computational Neuroscience claim they have now, for the first time, identified people's decisions about how they would later do a high-level mental activity — in this case, adding versus subtracting.
While still in its initial stages, the techniques may eventually have wide-ranging implications for everything from criminal interrogations to airline security checks. And that alarms some ethicists who fear the technology could one day be abused by authorities, marketers, or employers.
Tanja Steinbach, a 21-year-old student in Leipzig who participated in the experiment, found it a bit spooky but wasn't overly concerned about the civil liberties implications.
"It's really weird," she said. "But since I know they're only able to do this if they have certain machines, I'm not worried that everybody else on the street can read my mind."
Researchers have long used MRI machines to identify different types of brain activity, and scientists in the United States have recently developed brain scans designed for lie detection.
But outside experts say the work led by Dr. John-Dylan Haynes at the Bernstein Center is groundbreaking.
"The fact that we can determine what intention a person is holding in their mind pushes the level of our understanding of subjective thought to a whole new level," said Dr. Paul Wolpe, a professor of psychiatry at the University of Pennsylvania, who was not connected to the study.
The research, which began in July 2005, has been of limited scope: only 21 people have been tested so far. And the 71 percent accuracy rate is only about 20 percentage points better than the 50 percent expected from random guessing between the two options.
Still, the research conducted at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, about 150 kilometers (90 miles) southwest of Berlin, has been generating strong interest in the scientific community.
"Haynes' experiment strikes at the heart of how good we will get at predicting behaviors," said Dr. Todd Braver, an associate professor in the department of psychology at Washington University, who was not connected with the research.
"The barriers that we assumed existed in reading our minds keep getting breached."
In one study, participants were told to decide whether to add or subtract two numbers a few seconds before the numbers were flashed on a screen. In the interim, a computer analyzed images of their brain activity to predict each subject's decision, with one pattern suggesting addition and another subtraction.
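The article does not describe the researchers' actual algorithm, but the general idea of this kind of pattern classification can be sketched as follows: learn a characteristic activity "template" for each intention from training scans, then label a new scan by whichever template it more closely resembles. The sketch below simulates voxel data and uses a simple nearest-centroid rule; all numbers and names are illustrative assumptions, not details from the study.

```python
import numpy as np

# Illustrative sketch only: simulated brain-activity patterns, not real fMRI data.
rng = np.random.default_rng(0)
n_voxels = 50  # hypothetical number of voxels in a region of interest

# Simulated underlying activity signatures for the two intentions
add_signature = rng.normal(0, 1, n_voxels)
sub_signature = rng.normal(0, 1, n_voxels)

def simulate_scan(signature, noise=1.0):
    """A noisy measurement of an underlying activity pattern."""
    return signature + rng.normal(0, noise, n_voxels)

# Training phase: average several noisy scans per intention into a template
add_template = np.mean([simulate_scan(add_signature) for _ in range(20)], axis=0)
sub_template = np.mean([simulate_scan(sub_signature) for _ in range(20)], axis=0)

def predict(scan):
    """Nearest-centroid rule: choose the template closer to the new scan."""
    d_add = np.linalg.norm(scan - add_template)
    d_sub = np.linalg.norm(scan - sub_template)
    return "add" if d_add < d_sub else "subtract"

# Test phase: classification accuracy on fresh simulated scans
trials = [("add", simulate_scan(add_signature)) for _ in range(100)] + \
         [("subtract", simulate_scan(sub_signature)) for _ in range(100)]
correct = sum(predict(scan) == label for label, scan in trials)
accuracy = correct / len(trials)
```

In this toy setting the two signatures are well separated relative to the noise, so accuracy comes out well above the 50 percent chance level; real fMRI classification is far harder, which is one reason the reported accuracy was only 71 percent.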
Haynes' team began its research by trying to identify which part of the brain stores intentions. By scanning for bursts of activity when subjects were given choices, they traced the signals to the prefrontal cortex.

Then they studied which patterns of activity were associated with different intentions.
"If you knew which thought signatures to look for, you could theoretically predict in more detail what people were going to do in the future," said Haynes.
For the moment, reading minds is a cumbersome process and there is no chance scientists could spy on decision-making surreptitiously. Haynes' studies focus on people who choose between just two alternatives, not the infinite number present in everyday life.
But scientists are making enough progress to make ethicists nervous, since the research has already progressed from identifying the regions of the brain where certain thoughts occur to identifying the very content of those thoughts.
"These technologies, for the first time, give us a real possibility of going straight to the source to see what somebody is thinking or feeling, without them having any ability to stop us," said Dr. Hank Greely, director of Stanford University's Center for Law and the Biosciences.
"The concept of keeping your thoughts private could be profoundly altered in the future," he said.
Civil libertarians are concerned that mind-reading technology may fit into a trend of pre-emptive security measures in which authorities could take action against individuals before they commit a crime — a scenario explored in the 2002 science fiction film "Minority Report."
Already, Britain is creating a national DNA database that would allow authorities to track people with violent predispositions. In addition, the government has floated the idea of locking up people with personality disorders that could lead to criminal behavior.
"We need to start thinking about how far we are going to allow these technologies to be used," said Wolpe.
Despite the fears, Haynes believes his research has more benign practical applications.
For example, he says it will contribute to the development of machines already in existence that respond to brain signals and allow the paralyzed to change TV channels, surf the Internet, and operate small robotic devices.
For now, the practical applications of Haynes' research are years if not decades away.
"We are making the first steps in reading out what the specific contents of people's thoughts are by trying to understand the language of the brain," Haynes said. "But it's not like we are going to have a machine tomorrow."