Reading intentions with magnetic resonance imaging

Every day we plan countless tasks, such as returning a book to a friend or making an appointment. How and where the brain stores these intentions has now been uncovered by John-Dylan Haynes of the Max Planck Institute for Human Cognitive and Brain Sciences, in collaboration with researchers from London and Tokyo.

For the first time, they were able to 'read' the intentions of study participants from their brain activity. This became possible through a new combination of functional magnetic resonance imaging (fMRI), which measures changes in blood flow associated with neural activity in the brain or spinal cord, and sophisticated computer algorithms.

Figure 1: The brain areas from which intentions can be read. In specific regions, fine-grained patterns of brain activity differ slightly depending on whether a person is preparing to add or subtract. From the patterns in the green regions, covert intentions can be read before the subjects begin the calculation; from the regions marked in red, intentions can be read while they are being carried out. (Credit: Bernstein Center for Computational Neuroscience, Berlin)

Our hidden intentions stay hidden until we act on them - or so we thought. Now researchers have been able to decode these covert intentions from patterns of brain activity. They let study participants freely and covertly choose one of two possible tasks: to add or to subtract two numbers. The participants were then asked to hold their intention in mind for a while, until the relevant numbers were displayed on a screen. The researchers could identify a participant's intention with 70% accuracy based solely on brain activity - even before the participants saw the numbers and began to calculate.

Participants made their decision covertly and did not initially know which two numbers they would add or subtract. Only a few seconds later did the numbers appear on the screen and the participants could carry out the calculation. This ensured that it was the intention itself that was read out, rather than brain activity involved in performing the calculation or in pressing buttons to indicate the answer. Haynes explained: 'It has long been assumed that freely chosen intentions are stored in medial areas of the prefrontal cortex, while intentions following external instructions are stored closer to the surface of the brain. We were able to confirm this theory in our experiments.'

The work of Haynes and his colleagues goes far beyond merely confirming previous assumptions. Until now, it had not been possible to read from brain activity how a person has decided to act in the future. The secret to making the invisible visible lies in a new method called 'multivariate pattern classification'. A computer is programmed to recognize characteristic patterns of brain activity that typically occur in conjunction with specific thoughts. Once this computer has been 'trained', it can be used to predict the decisions of subjects solely from their brain activity. An important technical improvement also lies in combining information across extended brain regions to increase sensitivity.
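The idea behind such pattern classification can be sketched in a few lines. The sketch below is illustrative only: it uses synthetic 'voxel' data rather than real fMRI recordings, and a simple nearest-mean classifier stands in for whatever classifier the researchers actually used. The point it demonstrates is the same, though - a decoder trained on spatial activity patterns can predict the 'add' versus 'subtract' label of unseen trials well above the 50% chance level.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for fMRI data: each trial is a pattern of activity
# across 50 "voxels"; label 0 = intending to add, 1 = intending to subtract.
n_trials, n_voxels = 200, 50
labels = rng.integers(0, 2, size=n_trials)
class_pattern = rng.normal(0, 1, n_voxels)  # subtle class-specific spatial pattern
data = 0.5 * np.outer(labels, class_pattern) + rng.normal(0, 1, (n_trials, n_voxels))

# "Training": learn the mean activity pattern for each intention.
train, test = slice(0, 150), slice(150, None)
centroids = np.array([data[train][labels[train] == c].mean(axis=0) for c in (0, 1)])

# "Reading out" intentions: assign each held-out trial to the nearest learned pattern.
dists = np.linalg.norm(data[test][:, None, :] - centroids[None, :, :], axis=2)
predicted = dists.argmin(axis=1)
accuracy = (predicted == labels[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")  # well above the 0.5 chance level
```

Note that no single voxel carries the decision on its own; only by pooling weak evidence across the whole pattern does the classifier become sensitive, which is exactly the improvement the researchers describe.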

The study also revealed basic principles of how the brain stores intentions. Haynes said the experiments show that intentions are not encoded in single neurons but in spatial patterns of brain activity. The researchers also found that different regions of the prefrontal cortex perform different operations: regions towards the front of the brain store an intention until it is carried out, while regions further back take over when the subjects become active and begin the calculation. As Haynes put it: 'Intentions for future actions that are encoded in one part of the brain need to be copied to a different region to be executed.'

These findings also raise hopes for improvements in clinical and technical applications. Today, the first steps are being taken to ease the lives of paralyzed patients with computer-assisted prosthetic devices and so-called brain-computer interfaces. These devices focus on 'reading' movements that the patient intends but cannot perform. Previous research has shown that patients can move prosthetic limbs or a cursor on a computer screen by willpower alone. The study by Haynes and his colleagues now opens up a whole new perspective.

In the future, it may even become possible to read abstract thoughts and intentions from a patient's brain. Someday, the intention to 'open the blue folder' or 'reply to that email' could be picked up by a brain scanner and translated into the appropriate action.

Thien Kim