AI News, Controlling robots with brainwaves and hand gestures

Controlling robots with brainwaves and hand gestures

A new system spearheaded by researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) aims to do exactly that, allowing users to instantly correct robot mistakes with nothing more than brain signals and the flick of a finger.

Building off the team's past work focused on simple binary-choice activities, the new work expands the scope to multiple-choice tasks, opening up new possibilities for how human workers could manage teams of robots.

'This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we've been able to do before using only EEG feedback,' says CSAIL director Daniela Rus, who supervised the work.

Intuitive human-robot interaction

In most previous work, systems could generally only recognize brain signals when people trained themselves to 'think' in very specific but arbitrary ways and when the system was trained on such signals.

Not surprisingly, such approaches are difficult for people to handle reliably, especially if they work in fields like construction or navigation that already require intense concentration.

Meanwhile, Rus' team harnessed the power of brain signals called 'error-related potentials' (ErrPs), which researchers have found to naturally occur when people notice mistakes.

To create the system the team harnessed the power of electroencephalography (EEG) for brain activity and electromyography (EMG) for muscle activity, putting a series of electrodes on the users' scalp and forearm.

Brain-controlled robots

“A streamlined approach like that would improve our abilities to supervise factory robots, driverless cars, and other technologies we haven’t even invented yet.”

In the current study the team used a humanoid robot named “Baxter” from Rethink Robotics, the company led by former CSAIL director and iRobot co-founder Rodney Brooks.

“You don’t have to train yourself to think in a certain way — the machine adapts to you, and not the other way around.”

ErrP signals are extremely faint, which means that the system has to be fine-tuned enough to both classify the signal and incorporate it into the feedback loop for the human operator. In addition to monitoring the initial ErrPs, the team also sought to detect “secondary errors” that occur when the system doesn’t notice the human’s original correction.
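
Neither article includes the classification code, but detecting a faint ErrP typically comes down to band-pass filtering the EEG, cutting out a short window after each robot action, and scoring that window against what an error response looks like. The sketch below follows that generic recipe; the sampling rate, band limits, window, and template-correlation scoring are assumptions chosen for illustration, not parameters from the MIT system.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256            # assumed EEG sampling rate (Hz)
EPOCH = (0.2, 0.8)  # assumed window after the robot acts, in seconds

def bandpass(eeg, lo=1.0, hi=10.0, fs=FS):
    """ErrPs are slow deflections, so keep only the low-frequency band."""
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)

def epoch(eeg, onset_sample, fs=FS):
    """Cut out the window that follows the moment the robot acted."""
    start = onset_sample + int(EPOCH[0] * fs)
    stop = onset_sample + int(EPOCH[1] * fs)
    return eeg[..., start:stop]

def errp_score(window, template):
    """Correlate the epoch with an average 'error' template learned offline.
    A real system would use a trained classifier; this is the simplest stand-in."""
    w = (window - window.mean()) / (window.std() + 1e-9)
    t = (template - template.mean()) / (template.std() + 1e-9)
    return float(np.mean(w * t))

def robot_should_stop(eeg, onset_sample, template, threshold=0.3):
    """Return True if the filtered epoch looks enough like an ErrP."""
    window = epoch(bandpass(eeg), onset_sample)
    return errp_score(window, template) > threshold
```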

“These signals can dramatically improve accuracy, creating a continuous dialogue between human and robot in communicating their choices.”

While the system cannot yet recognize secondary errors in real time, Gil expects the model to be able to improve to upwards of 90 percent accuracy once it can.

Salazar-Gomez notes that the system could even be useful for people who can’t communicate verbally: a task like spelling could be accomplished via a series of several discrete binary choices, which he likens to an advanced form of the blinking that allowed stroke victim Jean-Dominique Bauby to write his memoir “The Diving Bell and the Butterfly.”

“This work brings us closer to developing effective tools for brain-controlled robots and prostheses,” says Wolfram Burgard, a professor of computer science at the University of Freiburg who was not involved in the research.
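
The spelling idea Salazar-Gomez describes above, reaching any letter through a short series of yes/no responses, is essentially a binary search over the alphabet. A hypothetical illustration; the alphabet, the halving strategy, and the answer_yes_no callback are all invented for the example:

```python
ALPHABET = list("abcdefghijklmnopqrstuvwxyz_")

def spell_one_letter(answer_yes_no):
    """Narrow the alphabet by halves; each call to answer_yes_no() stands for one
    binary brain/gesture response ('is your letter in this half?')."""
    lo, hi = 0, len(ALPHABET)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if answer_yes_no(ALPHABET[lo:mid]):   # user signals yes or no
            hi = mid
        else:
            lo = mid
    return ALPHABET[lo]

# With 27 symbols, any letter is reachable in at most 5 binary choices.
```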

How to control robots with brainwaves and hand gestures

Getting robots to do things isn’t easy: Usually, scientists have to either explicitly program them or get them to understand how humans communicate via language.

Building off the team’s past work focused on simple binary-choice activities, the new work expands the scope to multiple-choice tasks, opening up new possibilities for how human workers could manage teams of robots.

“By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

PhD candidate Joseph DelPreto was lead author on a paper about the project alongside Rus and former CSAIL postdoc Andres F. Salazar-Gomez.

To create the system the team harnessed the power of electroencephalography (EEG) for brain activity and electromyography (EMG) for muscle activity, putting a series of electrodes on the users’ scalp and forearm.

Both metrics have some individual shortcomings: EEG signals are not always reliably detectable, while EMG signals can sometimes be difficult to map to motions that are any more specific than “move left or right.” Merging the two, however, allows for more robust bio-sensing and makes it possible for the system to work on new users without training.
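
The articles describe what each stream contributes, EEG flagging that something is wrong while EMG gestures indicate where to go, but not how the two are combined. One simple reading is a late fusion in which the brain signal gates the intervention and the muscle signal picks the option; the thresholds and the gating rule below are assumptions, not the published method.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Decision:
    halt: bool              # did the brain signal flag an error?
    target: Optional[str]   # which option did the gesture select?

def fuse(errp_probability: float, emg_left: float, emg_right: float,
         errp_threshold: float = 0.6, emg_threshold: float = 0.5) -> Decision:
    """Late fusion: the EEG stream decides *whether* to intervene,
    the EMG stream decides *where* to redirect the robot."""
    halt = errp_probability > errp_threshold
    target = None
    if emg_left > emg_threshold and emg_left > emg_right:
        target = "left"
    elif emg_right > emg_threshold:
        target = "right"
    return Decision(halt=halt, target=target)

# Example: the brain flags a likely mistake and the wrist flexes left.
print(fuse(0.8, emg_left=0.7, emg_right=0.1))   # Decision(halt=True, target='left')
```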

“This helps make communicating with a robot more like communicating with another person.”

The team says that they could imagine the system one day being useful for the elderly, or workers with language disorders or limited mobility.

This Robot Can Be Controlled by Brain Signals and Hand Gestures

Scientists from MIT have developed a new way for humans to train robots using brain signals and body gestures.

The team responsible for the breakthrough developed a way to harness brain signals called 'error-related potentials' (ErrPs), which unconsciously occur when people observe a mistake.

The system works by monitoring the brain activity of a person observing a robot at work; if an ErrP occurs because the robot made a mistake, the robot is notified and pauses to wait for a correction from its human observer.
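
The articles describe this pause-and-correct behaviour but not its implementation. Schematically it is a supervisory loop in which every robot action can be vetoed by a detected ErrP and redirected by a gesture; the robot and signal-stream methods below are invented placeholders, not the actual Baxter interface.

```python
def supervise(robot, eeg_stream, gesture_stream, plan):
    """Hypothetical outer loop: the robot works through its plan, but each action
    is interrupted if the observer's EEG shows an error-related potential."""
    for action in plan:
        robot.start(action)
        if eeg_stream.detected_errp(timeout_s=1.0):    # placeholder API
            robot.pause()
            correction = gesture_stream.read_choice()  # e.g. 'left' or 'right'
            robot.redo(action, correction)
        robot.finish(action)
```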

“By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong,” says DelPreto.

DelPreto says the new system is particularly important because users don't need to be trained to think in a particular way: the brain signals happen unconsciously, and the gestures are intuitive, resembling what might happen if one human were training another. 'The machine adapts to you, and not the other way around,' he said, adding that the system 'makes communicating with a robot more like communicating with another person.'

MIT uses brain signals and hand gestures to control robots

The team harnessed the power of brain signals called 'error-related potentials' (ErrPs), which naturally occur when people notice a mistake.

The system monitors the brain activity of a person observing robotic work, and if an ErrP occurs -- because the robot has made an error -- the robot pauses its activity so the user can correct it.

Being able to control robots in this way opens up new possibilities for how humans could manage teams of robot workers, but longer term, it could be useful for the elderly, or workers with language disorders or limited mobility.

Brain–computer interface

A brain–computer interface (BCI), sometimes called a neural-control interface (NCI), mind–machine interface (MMI), direct neural interface (DNI), or brain–machine interface (BMI), is a direct communication pathway between an enhanced or wired brain and an external device.

However, more sophisticated measuring devices, such as the Siemens double-coil recording galvanometer, which displayed electric voltages as small as one ten-thousandth of a volt, led to success.

To perform the piece one must produce alpha waves and thereby 'play' the various percussion instruments via loudspeakers which are placed near or directly on the instruments themselves.[5]

Neuroprosthetics is an area of neuroscience concerned with neural prostheses, that is, using artificial devices to replace the function of impaired nervous systems and brain related problems, or of sensory organs.

The difference between BCIs and neuroprosthetics is mostly in how the terms are used: neuroprosthetics typically connect the nervous system to a device, whereas BCIs usually connect the brain (or nervous system) with a computer system.

Practical neuroprosthetics can be linked to any part of the nervous system—for example, peripheral nerves—while the term 'BCI' usually designates a narrower class of systems which interface with the central nervous system.

Monkeys have navigated computer cursors on screen and commanded robotic arms to perform simple tasks simply by thinking about the task and seeing the visual feedback, but without any motor output.[15]

Similar work in the 1970s established that monkeys could quickly learn to voluntarily control the firing rates of individual and multiple neurons in the primary motor cortex if they were rewarded for generating appropriate patterns of neural activity.[18]

In the 1980s, Apostolos Georgopoulos at Johns Hopkins University found a mathematical relationship between the electrical responses of single motor cortex neurons in rhesus macaque monkeys and the direction in which they moved their arms (based on a cosine function).

He also found that dispersed groups of neurons, in different areas of the monkeys' brains, collectively controlled motor commands, but was able to record the firings of neurons in only one area at a time, because of the technical limitations imposed by his equipment.[19]
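
Georgopoulos's cosine-tuning finding is usually written as r(theta) = b + m * cos(theta - theta_pref): a neuron fires fastest for its preferred direction and falls off with the cosine of the angle away from it, and a "population vector" built from many such neurons points in the direction of the upcoming movement. The snippet below illustrates that decoding on synthetic neurons; the rates, neuron count, and noise-free tuning are invented for the illustration, not taken from his recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic population: each neuron has a preferred direction and shares one tuning depth.
n_neurons = 50
preferred = rng.uniform(0, 2 * np.pi, n_neurons)   # preferred directions (rad)
baseline = 10.0                                     # spikes/s
depth = 8.0                                         # modulation depth (spikes/s)

def firing_rates(theta):
    """Cosine tuning: rate_i = baseline + depth * cos(theta - preferred_i)."""
    return baseline + depth * np.cos(theta - preferred)

def population_vector(rates):
    """Each neuron 'votes' along its preferred direction, weighted by how far its
    rate sits above baseline; the summed vote points toward the movement direction."""
    w = rates - baseline
    x = np.sum(w * np.cos(preferred))
    y = np.sum(w * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

true_direction = np.deg2rad(135)
decoded = population_vector(firing_rates(true_direction))
print(np.rad2deg(decoded))   # close to 135 for a reasonably uniform population
```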

Several groups have been able to capture complex brain motor cortex signals by recording from neural ensembles (groups of neurons) and using these to control external devices.

Using mathematical filters, the researchers decoded the signals to generate movies of what the cats saw and were able to reconstruct recognizable scenes and moving objects.[21]

After conducting initial studies in rats during the 1990s, Nicolelis and his colleagues developed BCIs that decoded brain activity in owl monkeys and used the devices to reproduce monkey movements in robotic arms.

Later experiments by Nicolelis using rhesus monkeys succeeded in closing the feedback loop and reproduced monkey reaching and grasping movements in a robot arm.

The monkey controlled the position of an avatar arm with its brain while receiving sensory feedback through intracortical microstimulation (ICMS) in the arm representation area of the sensory cortex.[25]

These researchers have been able to produce working BCIs, even using recorded signals from far fewer neurons than did Nicolelis (15–30 neurons versus 50–200 neurons).

The same group also created headlines when they demonstrated that a monkey could feed itself pieces of fruit and marshmallows using a robotic arm controlled by the animal's own brain signals.[28][29][30]

Andersen's group used recordings of premovement activity from the posterior parietal cortex in their BCI, including signals created when experimental animals anticipated receiving a reward.[31]

In the context of a simple learning task, illumination of transfected cells in the somatosensory cortex influenced the decision making process of freely moving mice.[33]

Research has shown that despite the inclination of neuroscientists to believe that neurons have the most effect when working together, single neurons can be conditioned through the use of BMIs to fire at a pattern that allows primates to control motor outputs.

The use of BMIs has led to the development of the single-neuron insufficiency principle, which states that, even with a well-tuned firing rate, a single neuron can carry only a narrow amount of information; therefore, the highest level of accuracy is achieved by recording the firings of the collective ensemble.

Because they lie in the grey matter, invasive devices produce the highest quality signals of BCI devices but are prone to scar-tissue build-up, causing the signal to become weaker, or even non-existent, as the body reacts to a foreign object in the brain.[37]

This also required him to be hooked up to a mainframe computer, but shrinking electronics and faster computers made his artificial eye more portable and now enable him to perform simple tasks unassisted.[38]

In 2002, Jens Naumann, also blinded in adulthood, became the first in a series of 16 paying patients to receive Dobelle’s second generation implant, marking one of the earliest commercial uses of BCIs.

BCIs focusing on motor neuroprosthetics aim to either restore movement in individuals with paralysis or provide devices to assist them, such as interfaces with computers or robot arms.

Implanted in Nagle’s right precentral gyrus (area of the motor cortex for arm movement), the 96-electrode BrainGate implant allowed Nagle to control a robotic arm by thinking about moving his hand as well as a computer cursor, lights and TV.[43]

Two research groups, both in collaboration with the United States Department of Veterans Affairs, have demonstrated further success in direct control of robotic prosthetic limbs with many degrees of freedom, using direct connections to arrays of neurons in the motor cortex of patients with tetraplegia.

Partially invasive BCIs produce better-resolution signals than non-invasive BCIs, where the bone tissue of the cranium deflects and deforms signals, and have a lower risk of forming scar tissue in the brain than fully invasive BCIs.

Electrocorticography (ECoG) measures the electrical activity of the brain taken from beneath the skull in a similar way to non-invasive electroencephalography, but the electrodes are embedded in a thin plastic pad that is placed above the cortex, beneath the dura mater.[47]

ECoG is a very promising intermediate BCI modality because it has higher spatial resolution, better signal-to-noise ratio, wider frequency range, and lower training requirements than scalp-recorded EEG, and at the same time has lower technical difficulty, lower clinical risk, and probably superior long-term stability compared with intracortical single-neuron recording.

This feature profile and recent evidence of the high level of control with minimal training requirements shows potential for real world application for people with motor disabilities.[50][51]

Although EEG-based interfaces are easy to wear and do not require surgery, they have relatively poor spatial resolution and cannot effectively use higher-frequency signals because the skull dampens signals, dispersing and blurring the electromagnetic waves created by the neurons.

In a 2016 article, an entirely new communication device and non-EEG-based human-computer interface was developed, requiring no visual fixation or ability to move the eyes at all. The interface is based on covert interest in (i.e., without fixing the eyes on) a chosen letter on a virtual keyboard, where each letter has its own background circle that micro-oscillates in brightness with its own timing; letter selection is based on the best fit between, on one hand, the unintentional pupil-size oscillation pattern and, on the other, the brightness-oscillation pattern of the chosen letter's background circle.
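
In essence, the selection rule described there is "pick the letter whose background-brightness waveform best matches the involuntary pupil-size waveform." A toy version of that matching step, using plain correlation rather than the study's actual fitting procedure:

```python
import numpy as np

def select_letter(pupil_trace, brightness_traces):
    """brightness_traces: dict mapping letter -> brightness waveform sampled at the
    same rate as pupil_trace. Returns the letter whose brightness oscillation best
    correlates with the measured pupil-size oscillation."""
    z = lambda x: (x - np.mean(x)) / (np.std(x) + 1e-9)
    p = z(np.asarray(pupil_trace, dtype=float))
    scores = {letter: float(np.mean(p * z(np.asarray(w, dtype=float))))
              for letter, w in brightness_traces.items()}
    return max(scores, key=scores.get)
```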

In 2014 and 2017, a BCI using functional near-infrared spectroscopy for 'locked-in' patients with amyotrophic lateral sclerosis (ALS) was able to restore some basic ability of the patients to communicate with other people.[53][54]

For example, in experiments beginning in the mid-1990s, Niels Birbaumer at the University of Tübingen in Germany trained severely paralysed people to self-regulate the slow cortical potentials in their EEG to such an extent that these signals could be used as a binary signal to control a computer cursor.[55]

(Birbaumer had earlier trained epileptics to prevent impending fits by controlling this low voltage wave.) The experiment saw ten patients trained to move a computer cursor by controlling their brainwaves.

However, the slow cortical potential approach to BCIs has not been used in several years, since other approaches require little or no training, are faster and more accurate, and work for a greater proportion of users.

Birbaumer's later research with Jonathan Wolpaw at New York State University has focused on developing technology that would allow users to choose the brain signals they found easiest to operate a BCI, including mu and beta rhythms.

While EEG-based brain-computer interfaces have been pursued extensively by a number of research labs, recent advancements made by Bin He and his team at the University of Minnesota suggest the potential of an EEG-based brain-computer interface to accomplish tasks close to those of an invasive brain-computer interface.

Using advanced functional neuroimaging including BOLD functional MRI and EEG source imaging, Bin He and co-workers identified the co-variation and co-localization of electrophysiological and hemodynamic signals induced by motor imagination.[56]

Refined by a neuroimaging approach and by a training protocol, Bin He and co-workers demonstrated the ability of a non-invasive EEG-based brain-computer interface to control the flight of a virtual helicopter in 3-dimensional space, based upon motor imagination.[57]

In addition to a brain-computer interface based on brain waves, as recorded from scalp EEG electrodes, Bin He and co-workers explored a virtual EEG-signal-based brain-computer interface by first solving the EEG inverse problem and then using the resulting virtual EEG for brain-computer interface tasks.

The advantages of such electrodes are: (1) no electrolyte used, (2) no skin preparation, (3) significantly reduced sensor size, and (4) compatibility with EEG monitoring systems.

The active electrode array is an integrated system made of an array of capacitive sensors with local integrated circuitry housed in a package with batteries to power the circuitry.

The electrode was tested on an electrical test bench and on human subjects in four modalities of EEG activity, namely: (1) spontaneous EEG, (2) sensory event-related potentials, (3) brain stem potentials, and (4) cognitive event-related potentials.

The performance of the dry electrode compared favorably with that of the standard wet electrodes in terms of skin preparation, no gel requirements (dry), and higher signal-to-noise ratio.[62]

For example, Gert Pfurtscheller of Graz University of Technology and colleagues demonstrated a BCI-controlled functional electrical stimulation system to restore upper extremity movements in a person with tetraplegia due to spinal cord injury.[64]

In their spinal cord injury research study, a person with paraplegia was able to operate a BCI-robotic gait orthosis to regain basic brain-controlled ambulation.[65][66]

He then went on to make several demonstration mind-controlled wheelchairs and home automation systems that could be operated by people with limited or no motor control, such as those with paraplegia and cerebral palsy.

In a widely reported experiment, fMRI allowed two users being scanned to play Pong in real-time by altering their haemodynamic response or brain blood flow through biofeedback techniques.[72]

In 2008 research developed in the Advanced Telecommunications Research (ATR) Computational Neuroscience Laboratories in Kyoto, Japan, allowed the scientists to reconstruct images directly from the brain and display them on a computer in black and white at a resolution of 10x10 pixels.

This was achieved by creating a statistical model relating visual patterns in videos shown to the subjects, to the brain activity caused by watching the videos.

This model was then used to look up the 100 one-second video segments, in a database of 18 million seconds of random YouTube videos, whose visual patterns most closely matched the brain activity recorded when subjects watched a new video.
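
Stripped of the fMRI modelling, that lookup step is a nearest-neighbour search: score every library clip by how well its predicted brain response matches the measured one, and keep the best matches. An illustrative sketch with random vectors standing in for real model features:

```python
import numpy as np

def best_matching_clips(measured_response, predicted_responses, k=100):
    """predicted_responses: (n_clips, n_features) model-predicted activity per library clip.
    Returns indices of the k clips whose predicted activity best matches the measurement."""
    x = measured_response / (np.linalg.norm(measured_response) + 1e-9)
    library = predicted_responses / (np.linalg.norm(predicted_responses, axis=1, keepdims=True) + 1e-9)
    similarity = library @ x              # cosine similarity to each clip
    return np.argsort(similarity)[-k:][::-1]

# Toy usage with random stand-in data:
rng = np.random.default_rng(1)
library = rng.normal(size=(10_000, 64))
measurement = library[42] + 0.1 * rng.normal(size=64)
print(best_matching_clips(measurement, library, k=5)[0])   # -> 42
```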

Many biofeedback systems are used to treat certain disorders such as attention deficit hyperactivity disorder (ADHD), sleep problems in children, teeth grinding, and chronic pain.

Passive BCI involves using BCI to enrich human–machine interaction with implicit information on the actual user's state, for example, simulations to detect when users intend to push the brakes during an emergency car-stopping procedure.

This is due to several factors: the signal elicited is measurable in as large a population as the transient VEP, and blink movement and electrocardiographic artefacts do not affect the frequencies monitored.
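
The "frequencies monitored" here are those of flickering on-screen targets: EEG power over the visual cortex rises at the flicker rate of whichever target the user attends to (the steady-state visually evoked potential). A toy decoder for that kind of frequency-tagged choice might look like the following; the windowing and nearest-bin lookup are illustrative choices, not details of any cited system.

```python
import numpy as np

def detect_flicker_target(eeg, fs, candidate_hz):
    """Pick which flicker frequency dominates the EEG power spectrum.
    eeg: 1-D occipital-channel signal; candidate_hz: flicker rate of each target."""
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power_at = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_hz]
    return candidate_hz[int(np.argmax(power_at))]
```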

The P300 event-related potential is a positive peak in the EEG that occurs at roughly 300 ms after the appearance of a target stimulus (a stimulus for which the user is waiting or seeking) or oddball stimuli.

The P300 amplitude decreases as the target stimuli and the ignored stimuli grow more similar. The P300 is thought to be related to a higher-level attention process or an orienting response. Using P300 as a control scheme has the advantage that the participant only has to attend limited training sessions.
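
Because the P300 is small relative to ongoing EEG activity, it is usually made visible by averaging many stimulus-locked windows and comparing the averages for target and non-target stimuli. A bare-bones version of that averaging step follows; the window lengths and baseline correction are generic choices, not taken from a specific system.

```python
import numpy as np

def average_epochs(eeg, stimulus_samples, fs, pre_s=0.1, post_s=0.6):
    """Cut a window around each stimulus onset and average the windows.
    A P300 shows up as a positive bump roughly 300 ms after target stimuli."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    epochs = [eeg[s - pre:s + post] for s in stimulus_samples
              if s - pre >= 0 and s + post <= len(eeg)]
    avg = np.mean(epochs, axis=0)
    return avg - np.mean(avg[:pre])   # baseline-correct against the pre-stimulus interval

# Comparing average_epochs(eeg, target_onsets, fs) with
# average_epochs(eeg, nontarget_onsets, fs) around the 300 ms mark
# is the simplest possible P300 check.
```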

The advantage of P300 use within games is that the player does not have to teach himself/herself how to use a completely new control system and so only has to undertake short training instances, to learn the gameplay mechanics and basic use of the BCI paradigm.[79][82]

In a $6.3 million Army initiative to invent devices for telepathic communication, Gerwin Schalk, underwritten by a $2.2 million grant, found that the use of ECoG signals can discriminate the vowels and consonants embedded in spoken and imagined words, shedding light on the distinct mechanisms associated with the production of vowels and consonants, and could provide the basis for brain-based communication using imagined speech.[51][83]

On 27 February 2013 the group with Miguel Nicolelis at Duke University and IINN-ELS successfully connected the brains of two rats with electronic interfaces that allowed them to directly share information, in the first-ever direct brain-to-brain interface.[89][90][91]

A company and the Dipartimento di Ingegneria e Architettura – Università di Trieste showed confirmatory results analyzing the EEG activity of two human partners separated by approximately 190 km, where one member of the pair receives the stimulation and the second one is connected only mentally with the first.[94][95]

As well as furthering research on animal implantable devices, experiments on cultured neural tissue have focused on building problem-solving networks, constructing basic computers and manipulating robotic devices.

Researchers are well aware that sound ethical guidelines, appropriately moderated enthusiasm in media coverage and education about BCI systems will be of utmost importance for the societal acceptance of this technology.

Recently a number of companies have scaled back medical-grade EEG technology (and in one case, NeuroSky, rebuilt the technology from the ground up) to create inexpensive BCIs.

For example, this article reviewed work within this project that further defined BCIs and applications, explored recent trends, discussed ethical issues, and evaluated different directions for new BCIs.

That is, some persons who are diagnosed with DOC may in fact be able to process information and make important life decisions (such as whether to seek therapy, where to live, and their views on end-of-life decisions regarding them).

Given the new prospect of allowing these patients to provide their views on this decision, there would seem to be a strong ethical pressure to develop this research direction to guarantee that DOC patients are given an opportunity to decide whether they want to live.[127][128]

In addition, these patients could then be provided with BCI-based communication tools that could help them convey basic needs, adjust bed position and HVAC (heating, ventilation, and air conditioning), and otherwise empower them to make major life decisions and communicate.[129][130][131]

The VERE project included work on a new system for stroke rehabilitation focused on BCIs and advanced virtual environments designed to provide the patient with immersive feedback to foster recovery.

The flexible nature of the organic background materials allows the electronics created to bend, and the fabrication techniques used to create these devices resemble those used to create integrated circuits and microelectromechanical systems (MEMS).[143]

Brain-controlled Robots

Brain training a boon for stroke victims (30.3.2014)

A biomedical engineering team from Polytechnic University has developed an innovative device that is proving to be a godsend for patients recovering from a ...

MIT robot reads your mind

This is not a Jedi mind trick! A team of scientists at MIT has created a robot that can be controlled by a human brain. The scientists, working from MIT's Computer ...

EEG Brain-Robot Interface Project

Work by my team and Dr. Shaun Boe to teach people how to drive a robot using only brain activity. This fun project also taught us a lot about how measuring ...

Thought projection by neurons in the human brain

A team from California have shown that it's possible to control images on a screen using just the power of thought. Working with patients who had electrodes ...

BrainHQ Demonstration

BrainHQ is a brain-​training system built and tested by an international team of top neuroscientists and other brain experts. Unlike other brain-training programs, ...

A robot that runs and swims like a salamander | Auke Ijspeert

Roboticist Auke Ijspeert designs biorobots, machines modeled after real animals that are capable of handling complex terrain and would appear at home in the ...

Gaming The Brain: Integrating EEG into mainstream gaming

Peter LeBlanc from Humber College presents his team's EEG acquisition headset and details their work with the technology and the pros and cons of ...
