AI News, Shrewbot Uses Whiskers to Map Its Environment

Robots that make maps tend to rely heavily on vision of one sort or another, whether it’s a camera image or something just off the visible spectrum, like a laser scanner.

Using a combination of wheel odometry and contact detection by whisking (yes, the behavior really is called whisking), Shrewbot gradually builds a tactile map of an area by combining the hundreds (or thousands) of whisk contacts it feels when it encounters walls and other obstacles.
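The article doesn’t describe Shrewbot’s actual mapping algorithm, but the basic idea it outlines — projecting each whisk contact into world coordinates via the odometry pose and accumulating contacts into a map — can be sketched in a few lines. The sketch below is a simplified illustration, not the researchers’ method: it assumes a 2D pose `(x, y, heading)` from wheel odometry and a contact reported as a range and angle in the robot’s frame, and it filters out spurious single contacts by requiring a minimum hit count per grid cell.

```python
import math

def contact_to_world(pose, contact_range, whisker_angle):
    """Project a whisk contact (range + angle in the robot frame)
    into world coordinates using the odometry pose (x, y, heading).
    All angles are in radians; this is an illustrative assumption,
    not Shrewbot's actual sensor model."""
    x, y, theta = pose
    a = theta + whisker_angle
    return (x + contact_range * math.cos(a),
            y + contact_range * math.sin(a))

class TactileGrid:
    """Toy occupancy grid that accumulates whisk-contact counts per cell."""
    def __init__(self, cell_size=0.25):
        self.cell_size = cell_size
        self.counts = {}  # (i, j) cell index -> number of contacts

    def add_contact(self, world_xy):
        # Bin the contact point into a grid cell and count it.
        i = int(world_xy[0] // self.cell_size)
        j = int(world_xy[1] // self.cell_size)
        self.counts[(i, j)] = self.counts.get((i, j), 0) + 1

    def occupied(self, min_hits=3):
        """Cells touched at least min_hits times are treated as obstacles,
        filtering out one-off spurious contacts."""
        return {cell for cell, n in self.counts.items() if n >= min_hits}
```

For example, a robot at the origin facing along the x-axis that whisks an obstacle one meter ahead would register contacts in the cell around `(1.0, 0.0)`; after several hits, that cell crosses the threshold and is marked occupied.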

Robots like Shrewbot are ideal for exploring and mapping spaces where laser, acoustic, or visual sensors don’t work well: dark spaces, spaces filled with dust or smoke, or even turbid water. Future research will investigate how well this technique works at larger scales, with an eye toward practical deployment, and perhaps even an implementation of texture detection with whiskers as well.

Active Touch Laboratory at Sheffield

I investigate active touch sensing in animals, humans and robots, focusing on (i) the evolution of the mammalian somatosensory system, (ii) multisensory perception and memory, (iii) haptic interaction and emotional touch in biomimetic robots, and (iv) haptic interfaces for sensory augmentation and telepresence.

My research on active touch sensing focuses on its role in generating functional spatial percepts that drive local motion planning through spatial attention and global motion planning through mapping and navigation systems.