Startup Spotlight: SynTouch Seeks to Enable Robots With 'Machine Touch'

This is the fourth post in our Startup Spotlight series featuring new robotics companies from around the world.

It took more than 50 years for the artificial eye (the video camera) to mature into the widespread 'machine vision' now used in autonomous robots, assembly lines, quality control, surveillance, and many other systems.

At SynTouch we have developed an advanced tactile sensor modeled after the human fingertip, and our hope is that this technology and its applications can launch the field of 'machine touch.'

When Professor Gerald Loeb, also a SynTouch co-founder, and his research team at the University of Southern California were asked to participate in DARPA's Revolutionizing Prosthetics project a few years ago, he soon realized that even the best mechatronics couldn't compensate for the inadequate and fragile technologies then available for tactile sensing.

The company has completed a number of successful SBIR grants and has sold over 100 of its BioTac sensors to a wide range of academic and industrial researchers throughout the world.

(The photo above shows a Shadow hand equipped with BioTac fingertips, one of which is pictured below.) The technology itself is based on several biomimetic principles that allow the device to sense point of contact, normal forces, shear forces, vibrations, and thermal gradients just like the human fingertip.
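
As a rough, hypothetical illustration (not SynTouch's actual data format), a single reading from such a sensor could be bundled into a record like the one below; the field names and units are assumptions for clarity only.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TactileSample:
    """Hypothetical single reading from a biomimetic fingertip sensor (illustrative only)."""
    contact_point: Tuple[float, float, float]  # estimated contact location on the fingertip surface
    normal_force: float                        # force pressing into the fingertip, in newtons
    shear_force: Tuple[float, float]           # tangential (sliding) force components, in newtons
    vibration: List[float]                     # recent high-frequency micro-vibration samples
    thermal_gradient: float                    # rate of heat flow into or out of the touched object
```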

Jeremy Fishel, one of the original USC graduate students and a co-founder, developed an algorithm called Bayesian exploration, which mimics how humans explore and classify objects based on their prior experience.
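
As a hedged sketch of the general idea (not SynTouch's implementation): the robot keeps a belief over candidate objects, repeatedly picks the exploratory movement expected to discriminate best among them, and updates the belief with Bayes' rule after each touch. The Gaussian observation models, scalar observations, and pairwise discriminability score below are simplifying assumptions.

```python
import math

def gaussian_pdf(x, mean, std):
    """Likelihood of observing x under a Gaussian observation model."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def bayesian_exploration(models, perform_movement, confidence=0.99, max_steps=20):
    """Simplified sketch of Bayesian exploration over a set of known objects.

    models[movement][obj] = (mean, std): the tactile observation expected when
    applying that exploratory movement (e.g. a slide or a press) to that object,
    summarising prior experience. perform_movement(movement) executes the
    movement on the robot and returns a scalar tactile feature.
    """
    movements = list(models)
    objects = list(next(iter(models.values())))
    belief = {o: 1.0 / len(objects) for o in objects}  # uniform prior over objects

    def discriminability(movement):
        # Score a movement by how far apart its predicted observations are for
        # the objects currently believed to be likely: the intuition behind
        # choosing the most informative exploratory movement.
        score = 0.0
        for a in objects:
            for b in objects:
                mean_a, std_a = models[movement][a]
                mean_b, std_b = models[movement][b]
                score += belief[a] * belief[b] * abs(mean_a - mean_b) / (std_a + std_b)
        return score

    for _ in range(max_steps):
        movement = max(movements, key=discriminability)
        observation = perform_movement(movement)

        # Bayes update: reweight each candidate object by how well it explains
        # what was just felt, then renormalise.
        for o in objects:
            mean, std = models[movement][o]
            belief[o] *= gaussian_pdf(observation, mean, std)
        total = sum(belief.values()) or 1e-12
        belief = {o: p / total for o, p in belief.items()}

        if max(belief.values()) >= confidence:
            break

    best_guess = max(belief, key=belief.get)
    return best_guess, belief
```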

Augury’s Gadget Lets Machines Hear When They’re About to Die

About four years ago software developer Gal Shaul boarded a flight from Tel Aviv, Israel, to Delhi, India.

So he called a college friend, Saar Yoskovitz, an expert in analog signal processing, the mathematics involved in processing non-digital signals such as sound.

That might sound like a privacy and security nightmare, but Yoskovitz says Augury isn’t recording the full audio of the entire space in which its hardware is installed, just the vibration patterns produced by the monitored machine, along with various inaudible frequencies.

For example, if Augury’s software had never heard the sound of a clogged vacuum hose, it would first alert a machine’s owners or technicians that it was making an unusual sound so they could check to see if there was a problem.

Then, after the software has heard a few clogged hoses ahead of device failures at different customer sites and someone has labeled those sounds as such, it will learn the sound of a clogged hose, and Augury will be able to send more specific alerts to its customers, including those who have never had a clogged hose problem before.

And since a clogged hose will make similar sounds whether it’s part of a commercial refrigerator or an oil pump or a car, the software will be able to generalize that sound across many different types of equipment.
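
Augury has not published how its software works; the class below is only a generic sketch of the pipeline the paragraphs above imply: flag spectra that deviate from a machine's learned baseline, then recognize faults once technicians label examples. All names, thresholds, and the nearest-example matching are assumptions.

```python
import numpy as np

class VibrationMonitor:
    """Generic sketch: flag unusual vibration spectra, then recognize labeled faults."""

    def __init__(self, anomaly_threshold=3.0):
        self.baseline_mean = None
        self.baseline_std = None
        self.labeled_faults = {}              # fault name -> list of example spectra
        self.anomaly_threshold = anomaly_threshold

    def fit_baseline(self, healthy_spectra):
        # Learn what "normal" looks like for this particular machine.
        spectra = np.asarray(healthy_spectra, dtype=float)
        self.baseline_mean = spectra.mean(axis=0)
        self.baseline_std = spectra.std(axis=0) + 1e-9

    def add_labeled_fault(self, name, spectrum):
        # A technician confirmed the cause (e.g. "clogged hose"); keep the example
        # so the same signature can be recognized at other sites and machines.
        self.labeled_faults.setdefault(name, []).append(np.asarray(spectrum, dtype=float))

    def check(self, spectrum):
        spectrum = np.asarray(spectrum, dtype=float)
        z = np.abs(spectrum - self.baseline_mean) / self.baseline_std
        if z.max() < self.anomaly_threshold:
            return "normal"
        # Unusual sound: see whether it matches a previously labeled fault signature.
        best_name, best_dist = None, float("inf")
        for name, examples in self.labeled_faults.items():
            dist = min(float(np.linalg.norm(spectrum - e)) for e in examples)
            if dist < best_dist:
                best_name, best_dist = name, dist
        return f"possible {best_name}" if best_name else "unknown anomaly: please inspect"
```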

One of the big problems common to robot grippers is that they usually cannot feel what they are handling.

The technology inside the tactile sensors themselves isn't exotic: they contain flexible circuit boards, pressure sensors, electrodes, thermistors, and injection molded parts.

MB: The goal of tactile sensing is to endow robotic hands with human-like capabilities such as identifying and manipulating objects to perform useful tasks.

If dexterous hands needed high-resolution taxel arrays to be useful, humans wouldn't be tool users because the human hand doesn’t have high-resolution taxel arrays.

It requires an understanding of the key mechanical features of the human fingertip: the shape of the bone, the compliance of the skin, the presence of fingerprints, and other seemingly unimportant features that are all critical for the identification and manipulation of objects.

Our CEO, Dr. Gerald Loeb, has expanded on this, saying, 'Unlike other sensing modalities, tactile sensing is predicated on the occurrence of a specific event: the collision between the object to be sensed and the sensors themselves.

The sensors can be designed to be sensitive to various mechanical and/or thermal aspects of the collision, but what they sense is necessarily determined by the mechanical dynamics of this interaction.'

Creating useful data with these tactile sensors requires that human or robotic fingertips explore objects using strategies gained over a lifetime of bringing hands into contact with objects to create useful percepts.

Our understanding of the fingertip's mechanical features and sensory capabilities, and of the exploration strategies of hands that generate useful percepts, has enabled a new field that we call Machine Touch®.

In both cases, the robots need to handle mechanically complicated, unpredictable, and fragile parts, which gives these seemingly disparate types of robots very similar needs.

In autonomous robots, a computer would act as the overseer, selecting and modulating these reflexes to accomplish manipulation objectives, much as the human brain controls the spinal cord.
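
The article does not describe what such reflexes look like in software; as a hedged example, assuming a simple slip-prevention reflex (a common pattern in tactile manipulation, not necessarily SynTouch's approach), the low-level loop below adjusts grip force every cycle while a higher-level controller only supplies the target force.

```python
def grip_reflex_step(target_force, measured_force, slip_vibration, grip_force,
                     gain=0.5, slip_threshold=0.2, slip_boost=0.3, max_force=20.0):
    """One cycle of a hypothetical low-level grip reflex.

    The higher-level controller (the 'overseer') only chooses target_force;
    this loop reacts to tactile feedback every cycle, much faster than the
    planner, analogous to a spinal reflex supervised by the brain.
    """
    # Proportional correction toward the grip force the overseer requested.
    grip_force += gain * (target_force - measured_force)

    # Reflexive response: slip-like micro-vibrations trigger an immediate extra squeeze.
    if slip_vibration > slip_threshold:
        grip_force += slip_boost

    # Clamp to a safe actuator range.
    return min(max(grip_force, 0.0), max_force)
```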

Microphones and speakers, video cameras and monitors, and now tactile sensors and tactile displays are pairs of recording and display devices that allow operators of telerobots to experience the world remotely.

Designing tactile displays requires an understanding of human tactile sensing, the information provided by our BioTac, and the ability to bring these together into a device that will withstand the rigors of daily use.

This will allow otherwise brittle robots to cause less damage to themselves, and it will also allow currently clumsy robots to cause less damage to the world around them.

MB: SynTouch makes the most widely integrated tactile sensor: beyond our integration with Robotiq, we have developed interface kits for robotic hand manufacturers such as Shadow Robot Company, Kinova, Schunk, Simlab, and Barrett Technology; for full mobile robot manufacturers such as Willow Garage and Rainbow Robot, who make the PR2 and HUBO robots; and for prosthetic hand companies such as Motion Control and OttoBock.

The applications have enormous breadth, but in short, they're being used to do everything you expect your own fingers to do: they enable dexterity, prevent damage, provide awareness, protect humans, and improve product quality.
