Deep Learning AI Listens to Machines For Signs of Trouble

Driving your car until it breaks down on the road is no one’s favorite way to learn the importance of routine maintenance.

The service of 3DSignals, a startup based in Kefar Sava, Israel, relies on the artificial intelligence technique known as deep learning to understand the noise patterns of troubled machines and predict problems in advance.

3DSignals has already begun talking with leading European automakers about using the deep learning service to detect possible trouble both in auto factory machinery and in the cars themselves.

“If you’re a passenger in a driverless taxi, you only care about getting to your destination and you’re not reporting maintenance problems,” says Yair Lavi, a co-founder and head of algorithms for 3DSignals. “So actually having the 3DSignals solution in autonomous taxis is very interesting to the owners of taxi fleets.”

Deep learning relies on artificial neural networks, which can learn to become better at specific tasks by filtering relevant data through multiple (deep) layers of artificial neurons. Many companies such as Google and Facebook have used deep learning to develop AI systems that can swiftly find that one face in a million online images or perform millions of Chinese-to-English translations per day.
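To make “multiple (deep) layers” concrete, here is a minimal sketch, assuming nothing about 3DSignals’ or anyone else’s actual models, of a tiny feed-forward network in NumPy that passes a vector of audio features through a few layers and emits a single fault-likelihood score. Every layer size, weight, and the feature vector itself are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical layer sizes: 64 audio features in, two hidden layers, one fault score out.
sizes = [64, 32, 16, 1]
weights = [rng.normal(scale=0.1, size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(features):
    """Filter a feature vector through each (deep) layer in turn."""
    activation = features
    for w, b in zip(weights[:-1], biases[:-1]):
        activation = relu(activation @ w + b)               # hidden layers transform the signal
    score = sigmoid(activation @ weights[-1] + biases[-1])  # last layer: a 0-to-1 fault score
    return float(score[0])

# Untrained weights, so the score is meaningless until the network is trained on labeled sound data.
print(forward(rng.normal(size=64)))
```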

Many tech giants have also applied deep learning to make their services better at automatically recognizing the spoken sounds of different human languages. But few companies have bothered to use deep learning to develop AI that is good at listening to other acoustic signals, such as the sounds of machines or music.

The first tier of the 3DSignals service does not rely on deep learning at all. Instead, it uses software built on basic physics modeling of certain machine parts, such as circular cutting saws, to predict when those parts may start to wear out.
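The article does not spell that physics model out, so the following is only a rough sketch of the general idea, with the wear law and every number in it being assumptions: track cumulative blade use against a wear budget and warn operators before the budget is predicted to run out.

```python
# Hypothetical first-tier style model: no listening to the machine, just physics-flavored bookkeeping.
# Assumption: blade wear accumulates roughly in proportion to metres of material cut,
# and the blade should be replaced before a fixed wear budget is exhausted.

WEAR_PER_METRE = 0.0004      # hypothetical wear units per metre of cut
WEAR_BUDGET = 1.0            # blade considered worn out at 1.0
ALERT_FRACTION = 0.8         # warn operators at 80% of the budget

def remaining_metres(metres_cut_so_far: float) -> float:
    """Metres of cutting left before the hypothetical wear budget is used up."""
    used = metres_cut_so_far * WEAR_PER_METRE
    return max(WEAR_BUDGET - used, 0.0) / WEAR_PER_METRE

def should_alert(metres_cut_so_far: float) -> bool:
    """True once cumulative wear crosses the alert threshold."""
    return metres_cut_so_far * WEAR_PER_METRE >= ALERT_FRACTION * WEAR_BUDGET

if __name__ == "__main__":
    for metres in (500, 1500, 2100):
        print(metres, "m cut ->", "ALERT" if should_alert(metres) else "ok",
              f"({remaining_metres(metres):.0f} m remaining)")
```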

The one-year-old startup has just 15 employees, but it has grown fairly fast and has raised $3.3 million so far from investors such as Dov Moran, the Israeli entrepreneur credited as one of the inventors of the USB flash drive.

Deep Learning AI Pinpoints Mechanical Breakdowns by Listening to Machines

Most of us aren’t nimble mechanics when it comes to diagnosing whatever troubles our cars may be having: we know those weird knocking sounds mean something is up, but we’re not sure what.

But while we hear plenty about companies developing more human-oriented forms of artificial intelligence, in everything from image recognition and disease diagnosis to software that can compose passable pop music, 3DSignals is taking another tack: using those same tools to help diagnose what might be wrong with machines themselves, with a long-term view toward developing a more well-rounded artificial general intelligence that is proficient at more than one task.

The second level of the service involves installing ultrasonic sensors capable of registering sounds up to 100 kilohertz, well above the human hearing range of 20 hertz to 20 kilohertz.
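That 100-kilohertz ceiling has a direct consequence for data capture: the Nyquist criterion requires sampling at more than twice the highest frequency of interest, which ordinary audio hardware does not come close to. A quick back-of-the-envelope check (the anti-aliasing margin below is just an assumption):

```python
# Nyquist check: representing a 100 kHz component requires sampling at more than 2 x 100 kHz.
HIGHEST_FREQ_HZ = 100_000          # upper end of the ultrasonic band mentioned above
HUMAN_RANGE_HZ = (20, 20_000)      # approximate range of human hearing

def minimum_sample_rate(highest_freq_hz: float, margin: float = 1.25) -> float:
    """Nyquist rate (2 x f_max) padded by an assumed anti-aliasing margin."""
    return 2 * highest_freq_hz * margin

print("Human hearing spans roughly", HUMAN_RANGE_HZ, "Hz")
print("Standard audio capture at 44.1 kHz covers content up to ~", 44_100 / 2, "Hz")
print("Ultrasonic monitoring up to 100 kHz needs at least",
      minimum_sample_rate(HIGHEST_FREQ_HZ), "Hz sampling")
```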

Either way, the company claims that its system can pinpoint or predict issues with 98 percent accuracy, and machine operators are sent real-time alerts on their devices in the event of a problem.

The result is a service that could be used in industry, power plants, and even consumer products. A factory that uses a saw blade, for example, could monitor any changes in the sound of the blade in operation.

Minor discrepancies in the emitted sound due to moderate wear and tear could trigger the system to automatically signal operators to change the blade before a major failure, saving money and eliminating potential downtime.
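One hedged way to picture that trigger, making no claim about 3DSignals’ actual algorithm, is to compare the spectrum of each new recording against an average “healthy blade” spectrum and raise an alert once the drift passes a threshold. The distance measure, threshold, and sample rate below are all assumptions.

```python
import numpy as np
from scipy.signal import welch

FS = 48_000           # assumed sample rate of the monitoring microphone
THRESHOLD_DB = 2.0    # hypothetical average spectral drift (in dB) that counts as "worn"

def log_spectrum(signal, fs=FS):
    """Welch power spectrum on a dB scale."""
    _, pxx = welch(signal, fs=fs, nperseg=4096)
    return 10 * np.log10(pxx + 1e-12)

def baseline_from_healthy(recordings):
    """Average log spectrum over recordings of a blade known to be in good shape."""
    return np.mean([log_spectrum(r) for r in recordings], axis=0)

def blade_alert(new_recording, baseline):
    """True when the new recording's spectrum drifts too far from the healthy baseline."""
    drift_db = np.mean(np.abs(log_spectrum(new_recording) - baseline))
    return drift_db > THRESHOLD_DB, round(float(drift_db), 2)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    healthy = [rng.normal(size=FS) for _ in range(5)]   # stand-ins for 1 s healthy recordings
    worn = 2.0 * rng.normal(size=FS)                    # louder broadband noise as fake wear
    baseline = baseline_from_healthy(healthy)
    print("healthy blade:", blade_alert(healthy[0], baseline))
    print("worn blade:   ", blade_alert(worn, baseline))
```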

Did You Hear That? Robots Are Learning The Subtle Sounds Of Mechanical Breakdown

Brakes squeal, hard drives crunch, air conditioners rattle, and their owners know it’s time for a service call.

But some of the most valuable machinery in the world often operates with nobody around to hear the mechanical breakdowns, from the chillers and pumps that drive big-building climate control systems to the massive turbines at hydroelectric power plants.

And while most efforts are currently focused on large-scale machinery, Shenfeld says the same sort of technology might one day help detect failures in home appliances, or in devices like self-driving cars or rental vehicles that don’t spend much time in the hands of an owner who’s used to their usual sounds.

The sound of impending failure

We could predict the failure of engines, rail infrastructure, oil drills, and power plants in real time, notifying humans the moment an acoustic anomaly appears.
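A minimal sketch of “notify the moment an anomaly appears”, assuming nothing more than a stream of audio frames and a rolling z-score on frame energy (the frame size, history length, and threshold are arbitrary choices for illustration, not anyone’s production settings):

```python
import numpy as np
from collections import deque

FRAME = 2048        # samples per frame (assumed)
HISTORY = 200       # frames of recent history treated as "normal" (assumed)
Z_THRESHOLD = 4.0   # standard deviations from normal that count as an anomaly (assumed)

def frame_energy(frame: np.ndarray) -> float:
    return float(np.sqrt(np.mean(frame ** 2)))   # RMS energy of one frame

def monitor(frames):
    """Yield (frame_index, z_score, is_anomaly) the moment each frame arrives."""
    history = deque(maxlen=HISTORY)
    for i, frame in enumerate(frames):
        e = frame_energy(frame)
        if len(history) >= 20:                   # wait for a little context first
            mu, sigma = np.mean(history), np.std(history) + 1e-9
            z = (e - mu) / sigma
            yield i, z, abs(z) > Z_THRESHOLD
        history.append(e)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    stream = [rng.normal(scale=1.0, size=FRAME) for _ in range(300)]
    stream[250] = rng.normal(scale=3.0, size=FRAME)   # inject a sudden change at frame 250
    for i, z, alarm in monitor(stream):
        if alarm:
            print(f"anomaly at frame {i}: z = {z:.1f}")
```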

And while these problems plague even single-purpose acoustic classifiers, the holy grail of the space is a generalizable tool for identifying all sounds, not simply a model built to tell apart one narrow, pre-defined set of them.

The tracks were then cut into 20-second segments to create spectrograms. Trained on those segments together with spectrograms of fully mixed songs, the model was able to separate vocals from backing instruments in new songs.
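As a hedged illustration of that preprocessing step, assuming ordinary mono audio and SciPy rather than whatever tooling the researchers actually used, cutting a track into 20-second segments and turning each into a spectrogram looks roughly like this:

```python
import numpy as np
from scipy.signal import spectrogram

FS = 22_050              # assumed sample rate
SEGMENT_SECONDS = 20     # segment length mentioned above

def segment_spectrograms(track: np.ndarray, fs: int = FS):
    """Cut a mono track into 20 s chunks and return one spectrogram per chunk."""
    seg_len = SEGMENT_SECONDS * fs
    specs = []
    for start in range(0, len(track) - seg_len + 1, seg_len):
        chunk = track[start:start + seg_len]
        freqs, times, sxx = spectrogram(chunk, fs=fs, nperseg=1024, noverlap=512)
        specs.append(np.log1p(sxx))              # log scale is what models usually see
    return specs

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    fake_track = rng.normal(size=FS * 65)        # ~65 s of noise standing in for a song
    specs = segment_spectrograms(fake_track)
    print(len(specs), "segments, each spectrogram shaped", specs[0].shape)
```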

But it’s one thing to divide up a five-piece song with easily identifiable components; it’s another to record the sound of a nearly 60-foot-high MAN B&W 12S90ME-C Mark 9.2 diesel engine and ask a machine learning model to chop its acoustic signature into component parts.

Spotify is one of the more ambitious companies toying with the applications of machine learning to audio signals. Though Spotify still relies on heaps of other data, the signals held within songs themselves are a factor in what gets recommended on its popular Discover feature.

The first approach I’m going to call the “custom solutions” model: a company collects data from a client with the sole purpose of identifying a pre-set range of sounds.
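A bare-bones sketch of that pattern, with scikit-learn and entirely synthetic clips standing in for the client’s labeled recordings (the label names, feature scheme, and classifier choice are all assumptions), might extract a spectral feature vector per clip and fit a classifier over the pre-set label set:

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 16_000
LABELS = ["normal_hum", "bearing_whine", "belt_slap"]   # hypothetical pre-set range of sounds

def features(clip: np.ndarray) -> np.ndarray:
    """Coarse spectral-band energies as a feature vector for one clip."""
    _, pxx = welch(clip, fs=FS, nperseg=1024)
    bands = np.array_split(pxx, 16)                     # 16 coarse frequency bands
    return np.log1p(np.array([b.sum() for b in bands]))

def synthetic_clip(label: str, rng) -> np.ndarray:
    """Fake one-second clips whose spectra differ by label (training-data stand-in)."""
    t = np.arange(FS) / FS
    tone = {"normal_hum": 120, "bearing_whine": 2400, "belt_slap": 600}[label]
    return np.sin(2 * np.pi * tone * t) + 0.3 * rng.normal(size=FS)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    X, y = [], []
    for label in LABELS:
        for _ in range(30):
            X.append(features(synthetic_clip(label, rng)))
            y.append(label)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    test = features(synthetic_clip("bearing_whine", rng))
    print("predicted:", clf.predict([test])[0])
```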

“We built a very scaled architecture to connect huge fleets of distributed machines to our monitoring platform where the algorithms will highlight whenever any of these machines start misbehaving,” said company CEO Amnon Shenfeld.

“There’s a strong push from the Federal Transit Administration to do condition assessments for Transit Asset Management,” said Shannon McKenna, an engineer at ATS Consulting, a firm working on noise and vibration analysis. “We see this as one way to help transit agencies come up with a condition assessment metric for their rail systems.”

Beyond short-tail indicators like wheel squeal, in the case of rail monitoring, engineers start to run into a pretty gnarly needle-in-a-haystack problem. McKenna explains that common acoustic signals only represent about 50 percent of the problems that a complex rail system can face. As opposed to checking boxes for compliance, true risk management requires a generalized system: you don’t want an outlier case to result in disaster.
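Wheel squeal is a good example of a short-tail indicator precisely because it shows up as a strong narrowband tone. A minimal detector sketch, with the frequency band and tonality threshold below being assumptions rather than anything ATS Consulting has described, could simply compare peak narrowband power against the broadband average:

```python
import numpy as np
from scipy.signal import welch

FS = 44_100
SQUEAL_BAND = (1_000, 10_000)   # assumed band where wheel-squeal tones tend to sit
TONALITY_RATIO = 30.0           # assumed peak-to-average power ratio that counts as squeal

def squeal_detected(clip: np.ndarray, fs: int = FS) -> bool:
    """Flag a clip whose spectrum has a sharp tonal peak inside the squeal band."""
    freqs, pxx = welch(clip, fs=fs, nperseg=4096)
    in_band = (freqs >= SQUEAL_BAND[0]) & (freqs <= SQUEAL_BAND[1])
    peak = pxx[in_band].max()
    broadband = pxx[in_band].mean()
    return peak / broadband > TONALITY_RATIO

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    t = np.arange(FS) / FS
    rolling_noise = rng.normal(size=FS)                          # ordinary wheel/rail noise stand-in
    squeal = rolling_noise + 2.0 * np.sin(2 * np.pi * 4200 * t)  # add a 4.2 kHz tone
    print("plain rolling noise:", squeal_detected(rolling_noise))
    print("with tonal squeal:  ", squeal_detected(squeal))
```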

We will need researchers and founders alike building classifiers for the sounds of underground subway systems, the human respiratory system, and critical energy infrastructure to help prevent tomorrow’s failures.