Complex Made Simple

Scientists test mind-reading tech, including a device from Facebook

AI and ultrasound technologies are being tested to effectively read our minds, interpret our thoughts, and pre-empt our actions or intentions

- Ultrasound imaging works by emitting pulses of high-frequency sound
- Imaging could also be used to figure out intentions before they are carried out
- Facebook has unveiled its mind-reading wrist device

It’s not telepathy but close enough.

Brainy intentions  

A new, minimally invasive type of brain-machine interface (BMI) can read out the brain’s intentions using ultrasound technology. 

A collaborative team of researchers at Caltech developed a system that reads brain activity corresponding to the planning of movement. 

BMIs typically interpret brain activity and link it to a computer or machine, but they also require invasive brain surgery, which many patients are unwilling to undergo. 

What is new about this technology is that it uses functional ultrasound (fUS) imaging to accurately map neural activity from its source deep within the brain at a resolution of 100 micrometers.  

Ultrasound imaging works by emitting pulses of high-frequency sound, the researchers explain, and then measuring how those sound vibrations reverberate through a substance, such as human tissue. This type of imaging is already widely used to take images of a fetus in utero, for example. 
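As a rough illustration of that pulse-echo principle (a minimal sketch, not the Caltech team’s pipeline; the speed-of-sound figure is a standard approximation for soft tissue):

```python
# Minimal pulse-echo sketch: estimate the depth of a reflecting
# structure from the round-trip time of an ultrasound echo.
# Assumes the standard ~1540 m/s speed of sound in soft tissue.

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, typical value for soft tissue

def echo_depth_mm(round_trip_time_s: float) -> float:
    """Depth of the reflector, in millimetres.

    The pulse travels to the reflector and back, so the one-way
    distance is half of speed * time.
    """
    one_way_distance_m = SPEED_OF_SOUND_TISSUE * round_trip_time_s / 2.0
    return one_way_distance_m * 1000.0

# An echo returning after 65 microseconds corresponds to ~50 mm depth.
print(f"{echo_depth_mm(65e-6):.1f} mm")
```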

Mikhail Shapiro, professor of chemical engineering, Heritage Medical Research Institute Investigator, and part of the research team, said, “This technique produced detailed images of the dynamics of neural signals in our target region that could not be seen with other non-invasive techniques like fMRI.”

Imaging non-human primates’ brains

The team tested its method on non-human primates, teaching them to carry out simple tasks such as moving their eyes and arms in a certain direction after receiving cues.  

The team wanted to see if fUS imaging could also be used to figure out the primates’ intentions before they carried out their tasks.

The system was then trained with a machine-learning algorithm on ultrasound data collected in real time, and within a few seconds the algorithm was able to predict which movement the non-human primates would carry out next. 
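The article does not detail the study’s actual decoding pipeline, but the general idea of such a decoder can be sketched as a classifier over flattened imaging frames. In the toy sketch below, the data shapes, the random placeholder data, and the choice of logistic regression are all illustrative assumptions, not the team’s method:

```python
# Illustrative sketch of a movement-intention decoder: a linear
# classifier trained on flattened functional-ultrasound frames.
# Shapes, data, and classifier choice are assumptions for
# illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dataset: 200 trials, each a 64x64 fUS activity map,
# labelled 0 (planned leftward movement) or 1 (planned rightward).
X = rng.normal(size=(200, 64 * 64))
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# With real pre-movement imaging data, accuracy well above chance
# would indicate the intention is readable before the movement occurs.
print("held-out accuracy:", decoder.score(X_test, y_test))
```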

As Richard Andersen, who was part of the study, pointed out, for this method to work, “Only a small, ultrasound-transparent window needs to be implanted in the skull; this surgery is significantly less invasive than that required for implanting electrodes.” 

The next steps include human trials, to see whether these images and predictions also work for reading human brains.

Read: The Internet of Senses: Sci-fi or your brains wired with super wi-fi?

Read: Computerized brains are here: Great for medicine but haven for neuro-criminals

Mind-reading wrist device

Facebook has unveiled its mind-reading wrist device and an augmented-reality keyboard that could allow users to replace the mouse and keyboard in future hardware products.

The wrist device is capable of reading neurological signals sent from a user’s brain down to their hands. It could theoretically read these signals to get a sense of what a user wants to do and replicate the action in a virtual or augmented-reality environment.

“You actually have more of your brain dedicated to controlling your wrist than any other part of your body, probably twice as many neurons than is actually dedicated to your mouth for feeding and for speech,” said TR Reardon, director of research science at Facebook Reality Labs.

The Facebook researchers demonstrated “force” actions where a user could pinch with their fingers in real life to hold and control virtual, far-away objects in augmented reality.  

Additionally, the company demonstrated electromyography (EMG) wristbands that users could wear to type on any surface as though they were typing on a physical keyboard. Though there is no keyboard, the EMG wristbands would register a user’s intended finger strokes and jot down the letters and words.
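Conceptually, such a virtual keyboard is a stream classifier: windows of multi-channel EMG are mapped to key labels. Here is a minimal sketch under invented assumptions; the channel count, window size, feature, and nearest-centroid classifier are all placeholders, not Facebook’s system:

```python
# Illustrative EMG-to-keystroke sketch: slice a multi-channel EMG
# stream into windows, extract a simple amplitude feature per channel,
# and classify each window with a nearest-centroid decoder.
# All shapes, data, and the classifier choice are assumptions.
import numpy as np
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(1)

N_CHANNELS, WINDOW = 16, 100          # 16 electrodes, 100-sample windows
KEYS = list("abc")                    # toy alphabet of three keys

def features(window: np.ndarray) -> np.ndarray:
    """Mean absolute value per channel, a common simple EMG feature."""
    return np.abs(window).mean(axis=1)

# Hypothetical labelled training windows: 50 examples per key.
X = np.array([features(rng.normal(size=(N_CHANNELS, WINDOW)))
              for _ in range(len(KEYS) * 50)])
y = np.repeat(KEYS, 50)

decoder = NearestCentroid().fit(X, y)

# Decode a new window of muscle activity into a keystroke.
new_window = rng.normal(size=(N_CHANNELS, WINDOW))
print("decoded key:", decoder.predict([features(new_window)])[0])
```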

Reading our minds’ likes

A group of scientists has trained artificial intelligence to create the perfect face for every one of us.

The AI researchers, working at the University of Helsinki and the University of Copenhagen, used an electroencephalograph (EEG) to read the brainwaves of 30 volunteers while showing them a variety of faces on a screen.

By learning how the brain’s pleasure centers lit up for each image, the AI was able to create a synthetic face tailored to each participant, one closely matched to their preferences.
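The underlying loop can be pictured as preference-guided search in a generative model’s latent space: score candidate images with an EEG-derived preference signal, then move the latent vector toward higher-scoring candidates. In the toy sketch below, the latent dimension, the scoring function, and the update rule are invented stand-ins, not the published method:

```python
# Toy sketch of preference-guided latent search: nudge a generator's
# latent vector toward candidates that a stand-in preference score
# rates highly. In the real study, the score would come from EEG
# responses and the latent vector would drive a face-synthesis network.
import numpy as np

rng = np.random.default_rng(2)
LATENT_DIM = 32

# Stand-in for the participant's (unknown) preference direction.
true_preference = rng.normal(size=LATENT_DIM)

def preference_score(latent: np.ndarray) -> float:
    """Placeholder for an EEG-derived 'this face appeals to me' signal."""
    return float(latent @ true_preference)

latent = rng.normal(size=LATENT_DIM)
for step in range(200):
    # Propose nearby candidates and keep the best-scoring one.
    candidates = latent + 0.1 * rng.normal(size=(8, LATENT_DIM))
    latent = max(candidates, key=preference_score)

# After the search, the latent vector aligns with the preference.
cosine = latent @ true_preference / (
    np.linalg.norm(latent) * np.linalg.norm(true_preference))
print(f"alignment with preference: {cosine:.2f}")
```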

Validated against the participants’ brain activity, the AI’s matchmaking was over 80% on-target.

“A brain-computer interface such as this is able to interpret users’ opinions on the attractiveness of a range of images,” explained the project lead, Academy Research Fellow and Associate Professor Tuukka Ruotsalo. The team’s paper was published in the scientific journal IEEE Transactions on Affective Computing.