The European Union’s VERE project aims to dissolve the boundary between the human body and a surrogate. The system was tested with three volunteers, each of whom wore an electroencephalogram (EEG) cap and a head-mounted display that showed what a robot in Japan was seeing. The volunteers made the robot move by concentrating on arrows superimposed on the display, each flashing at a different frequency. A computer detected which arrow a participant was looking at from the EEG response each flicker frequency evoked, and it sent the corresponding movement command to the robot. The researchers found the system enabled the participants to control the robot in near-real time: they were able to make the robot pick up a drink, move across the room, and put the drink on a table.

The researchers then tried to improve the feeling of embodiment using auditory feedback. While controlling the robot, both able-bodied volunteers and those with spinal cord injuries placed the drink closer to a target location when they heard footsteps as the robot walked, rather than a beep or no sound at all. The improved control suggests users feel more attuned to the robot itself when there is auditory feedback, says University of Rome researcher Emmanuele Tidoni.
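The frequency-tagging scheme described above is the basis of SSVEP (steady-state visually evoked potential) brain-computer interfaces: each arrow flickers at its own frequency, and the frequency the user fixates dominates the EEG spectrum. A minimal sketch of such a detector follows; the sampling rate, flicker frequencies, and arrow-to-command mapping are illustrative assumptions, not details reported for the VERE system.

```python
import numpy as np

def classify_ssvep(signal, fs, candidate_freqs):
    """Return the candidate flicker frequency with the most spectral power.

    Simplified SSVEP detection: the arrow a user fixates flickers at a
    known frequency, which appears as a peak at that frequency (and its
    harmonics) in the EEG spectrum.
    """
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    def band_power(f, half_width=0.5):
        # Sum power in a narrow band around the candidate frequency.
        mask = (freqs >= f - half_width) & (freqs <= f + half_width)
        return spectrum[mask].sum()

    return max(candidate_freqs, key=band_power)

# Demo: synthetic "EEG" -- a 12 Hz flicker response buried in noise.
fs = 250                        # sample rate in Hz (assumed)
t = np.arange(0, 4, 1.0 / fs)   # 4-second analysis window
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 12 * t) + rng.normal(0.0, 1.0, t.size)

# Hypothetical mapping of flicker frequencies to movement commands.
arrows = {8.0: "left", 10.0: "right", 12.0: "forward", 15.0: "back"}
detected = classify_ssvep(eeg, fs, list(arrows))
print(arrows[detected])  # the movement command sent to the robot
```

Real systems refine this with longer calibration, harmonic analysis, and methods such as canonical correlation analysis, but the principle is the same: the EEG spectrum reveals which flickering target the user is attending to.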