AI News, MIT Computer Science and Artificial Intelligence Laboratory

MIT's new robot can identify things by sight and by touch

For humans, it's easy to tell how an object will feel by looking at it, or what an object looks like by touching it; for machines, this can be a major challenge.

"By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge," says Yunzhu Li, CSAIL PhD student and lead author on a new paper about the system.
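The article does not detail how the model maps vision to touch (the real system is learned from paired visual and tactile data). As a loudly hypothetical toy, the sketch below uses image gradients as a crude stand-in for a predicted "tactile sharpness" signal, so a flat patch scores lower than one containing a sharp edge; none of these names or formulas come from the CSAIL work.

```python
import numpy as np

# Toy illustration only -- NOT the CSAIL model. A learned system would
# train on paired vision/touch data; here we fake a tactile-edge score
# from visual gradients to show the flat-surface vs. sharp-edge idea.

def tactile_sharpness(patch: np.ndarray) -> float:
    """Mean gradient magnitude of an image patch as a stand-in
    for how 'sharp' the surface would feel to a touch sensor."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

flat = np.full((8, 8), 0.5)            # uniform patch: a flat surface
edge = np.zeros((8, 8))
edge[:, 4:] = 1.0                      # step edge: a sharp feature

# A flat surface should "feel" smoother than a sharp edge.
assert tactile_sharpness(flat) < tactile_sharpness(edge)
```

This only gestures at the cross-modal idea; the actual research learns the vision-to-touch mapping end to end rather than using hand-crafted gradients.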

"Methods like this have the potential to be very useful for robotics, where you need to answer questions like 'is this object hard or soft?' or 'if I lift this mug by its handle, how good will my grip be?'" says Andrew Owens, a postdoctoral researcher at the University of California at Berkeley.

MIT develops a system to give robots more human senses

Researchers at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) have developed a new system that could equip robots with something we take for granted: the ability to link multiple senses.

You can imagine this being useful when a robot appendage reaches for a switch, a lever, or a part it's looking to pick up, verifying that it has grasped the right thing and not, for example, the hand of a human operator it's working with.

This type of AI could also help robots operate more efficiently and effectively in low-light environments without requiring advanced sensors, and serve as a component of more general systems when combined with other sensory-simulation technologies.