AI News, How to control robots with brainwaves and hand gestures

Getting robots to do things isn’t easy: Usually, scientists have to either explicitly program them or get them to understand how humans communicate via language.

Building off the team’s past work focused on simple binary-choice activities, the new work expands the scope to multiple-choice tasks, opening up new possibilities for how human workers could manage teams of robots.

“By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity,” says PhD candidate Joseph DelPreto, who was lead author on a paper about the project alongside Rus and former CSAIL postdoc Andres F. Salazar-Gomez.

To create the system, the team harnessed the power of electroencephalography (EEG) for brain activity and electromyography (EMG) for muscle activity, putting a series of electrodes on the users’ scalp and forearm.

Both metrics have some individual shortcomings: EEG signals are not always reliably detectable, while EMG signals can sometimes be difficult to map to motions that are any more specific than “move left or right.” Merging the two, however, allows for more robust bio-sensing and makes it possible for the system to work on new users without training.
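The merging of the two signals can be pictured as a simple decision-fusion step. The sketch below is purely illustrative, not the paper's actual method: the function name, choice labels, and weighting scheme are all assumptions, standing in for whatever classifiers produce per-choice confidences from the raw EEG and EMG streams.

```python
import numpy as np

# Hypothetical labels for a multiple-choice targeting task.
CHOICES = ["left", "center", "right"]

def fuse_scores(eeg_scores, emg_scores, eeg_weight=0.4):
    """Combine per-choice confidence scores from two noisy channels.

    eeg_scores, emg_scores: per-choice probabilities (each summing to 1).
    A weighted sum lets the weaker, less reliable EEG evidence bias the
    decision without overriding a clear EMG gesture.
    """
    eeg = np.asarray(eeg_scores, dtype=float)
    emg = np.asarray(emg_scores, dtype=float)
    fused = eeg_weight * eeg + (1.0 - eeg_weight) * emg
    return CHOICES[int(np.argmax(fused))], fused

# Example: EEG is ambiguous, but EMG clearly indicates a rightward gesture.
choice, fused = fuse_scores([0.4, 0.35, 0.25], [0.1, 0.2, 0.7])
print(choice)  # right
```

In this toy weighting, an ambiguous brain signal defers to a decisive muscle signal, which is one way a combined system can work for a new user without per-user training.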

“This helps make communicating with a robot more like communicating with another person,” says DelPreto. The team imagines the system one day being useful for the elderly, or for workers with language disorders or limited mobility.

Brain-controlled robots

“A streamlined approach like that would improve our abilities to supervise factory robots, driverless cars, and other technologies we haven’t even invented yet.” In the current study, the team used a humanoid robot named “Baxter” from Rethink Robotics, the company led by former CSAIL director and iRobot co-founder Rodney Brooks.

“You don’t have to train yourself to think in a certain way — the machine adapts to you, and not the other way around,” says DelPreto. ErrP signals are extremely faint, which means that the system has to be fine-tuned enough to both classify the signal and incorporate it into the feedback loop for the human operator. In addition to monitoring the initial ErrPs, the team also sought to detect “secondary errors” that occur when the system doesn’t notice the human’s original correction.

“These signals can dramatically improve accuracy, creating a continuous dialogue between human and robot in communicating their choices.” While the system cannot yet recognize secondary errors in real time, Gil expects the model to be able to improve to upwards of 90 percent accuracy once it can.
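The primary/secondary-error idea describes a closed feedback loop: the robot pauses when the observer's brain shows an ErrP, and a second evoked signal catches the cases the system missed the first time. A minimal sketch of that loop follows; the function name and the pre-labeled event strings are assumptions for illustration (the real system classifies raw EEG in real time rather than consuming labels).

```python
def run_feedback_loop(proposed_action, brain_events):
    """Hypothetical supervision loop for one robot decision.

    brain_events: ordered EEG classifications, one per control cycle,
    each 'ok', 'errp' (a primary error-related potential), or
    'secondary' (the potential evoked when the system missed the
    observer's first correction).
    """
    for event in brain_events:
        if event in ("errp", "secondary"):
            # The robot halts and waits for the observer to indicate a
            # different choice (e.g., via an EMG gesture).
            return "paused_for_correction"
    return proposed_action

# The robot keeps its plan only if no error signal appears.
print(run_feedback_loop("reach_left", ["ok", "ok"]))    # reach_left
print(run_feedback_loop("reach_left", ["ok", "errp"]))  # paused_for_correction
```

Treating the secondary signal exactly like the primary one, as here, is what gives the observer a second chance at the same correction — the "continuous dialogue" the quote describes.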

Salazar-Gomez notes that the system could even be useful for people who can’t communicate verbally: a task like spelling could be accomplished via a series of several discrete binary choices, which he likens to an advanced form of the blinking that allowed stroke victim Jean-Dominique Bauby to write his memoir “The Diving Bell and the Butterfly.”

“This work brings us closer to developing effective tools for brain-controlled robots and prostheses,” says Wolfram Burgard, a professor of computer science at the University of Freiburg who was not involved in the research.

This Robot Can Be Controlled by Brain Signals and Hand Gestures

Scientists from MIT have developed a new way for humans to train robots using brain signals and body gestures.

The team responsible for the breakthrough developed a way to harness brain signals called 'error-related potentials' (ErrPs), which unconsciously occur when people observe a mistake.

The system works by monitoring the brain activity of a person observing a robot at work. If an ErrP occurs because the robot made a mistake, the robot is notified and pauses to wait for a correction from its human observer.

“By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong,” says DelPreto.

DelPreto says the new system is particularly important because users don't need to be trained to think in a particular way: the brain signals happen unconsciously, and the gestures are intuitive, resembling what might happen if one human were training another. 'The machine adapts to you, and not the other way around,' he said, adding that the system 'makes communicating with a robot more like communicating with another person.'

Controlling robots with brainwaves and hand gestures

A new system spearheaded by researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) allows users to instantly correct robot mistakes with nothing more than brain signals and the flick of a finger.

'This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we've been able to do before using only EEG feedback,' says CSAIL director Daniela Rus, who supervised the work.

Intuitive human-robot interaction

In most previous work, systems could generally only recognize brain signals when people trained themselves to 'think' in very specific but arbitrary ways and when the system was trained on such signals.

Not surprisingly, such approaches are difficult for people to handle reliably, especially if they work in fields like construction or navigation that already require intense concentration.

Meanwhile, Rus' team harnessed the power of brain signals called 'error-related potentials' (ErrPs), which researchers have found to naturally occur when people notice mistakes.

MIT uses brain signals and hand gestures to control robots

The team harnessed the power of brain signals called 'error-related potentials' (ErrPs), which naturally occur when people notice a mistake.

The system monitors the brain activity of a person observing robotic work, and if an ErrP occurs -- because the robot has made an error -- the robot pauses its activity so the user can correct it.

Being able to control robots in this way opens up new possibilities for how humans could manage teams of robot workers, but longer term, it could be useful for the elderly, or workers with language disorders or limited mobility.

New system connects your mind to a machine to help stop mistakes

A scalp-EEG and forearm-EMG system is connected to a Baxter work robot and lets a human wave or gesture when the robot is doing something it shouldn’t be doing.
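The wave-or-gesture channel on the EMG side can be sketched as a simple amplitude detector over a short window of forearm-muscle samples. This is a toy illustration, not the system's actual gesture classifier: the function name, threshold value, and synthetic sample values are all assumptions.

```python
import numpy as np

def detect_gesture(emg_window, threshold=0.3):
    """Hypothetical forearm-EMG gesture detector.

    emg_window: short window of rectified EMG samples (arbitrary units).
    Returns True when the root-mean-square amplitude exceeds a
    calibrated threshold, i.e. the user flexed to flag the robot.
    """
    rms = float(np.sqrt(np.mean(np.square(np.asarray(emg_window, dtype=float)))))
    return rms > threshold

# Resting muscle vs. a deliberate flex (synthetic values).
print(detect_gesture([0.02, 0.03, 0.01, 0.02]))  # False
print(detect_gesture([0.5, 0.8, 0.6, 0.7]))      # True
```

A real system would need band-pass filtering and per-user calibration of the threshold, but the principle is the same: a clear muscle burst is a far less ambiguous signal than raw EEG, which is why it can carry the spatial "which way" part of the correction.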

Because the system picks up on nuances like gestures and emotional reactions, robots could be trained to interact with humans with disabilities, and could even prevent accidents by catching concern or alarm before it is communicated verbally.

In tests, the task changed occasionally, and a human standing nearby was able to gesture to the robot to change position before it drilled, essentially teaching it new behavior in the midst of its current task.
