AI News

Revolutionizing everyday products with artificial intelligence

"The ability to replicate a human brain's ability to learn is incredibly difficult," Kim says. He specializes in machine learning, which relies on algorithms to teach computers how to learn like a human brain.

While the phrase "machine learning" often conjures up the science fiction typified by shows like 'Westworld' or 'Battlestar Galactica,' smart systems and devices are already pervasive in the fabric of our daily lives.

Rather than building the sentient robots romanticized in popular culture, these researchers are working on projects that improve everyday life and make humans safer, more efficient, and better informed.

Making portable devices smarter

“So, you build artificial neurons and synapses on a small-scale wafer.” The result is a so-called ‘brain-on-a-chip.’ Rather than computing information through binary signaling, Kim’s neural network processes information like an analog device.

In a Nature Materials study published earlier this year, Kim found that when his team made a chip out of silicon germanium, they were able to control the current flowing out of the synapse and reduce variability to 1 percent.
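The article stays at the level of description, but a toy numerical sketch can make the contrast concrete: an analog synapse holds a continuous conductance, so its output is a graded weighted sum rather than an on/off signal, and small device-to-device variability perturbs that sum. Everything below is an invented illustration in Python, not a model of Kim's actual hardware.

    import numpy as np

    # Toy illustration (not Kim's hardware model): an "analog" synapse stores a
    # continuous conductance, so the output current is a graded weighted sum of
    # the inputs rather than a binary on/off response.
    rng = np.random.default_rng(0)

    inputs = rng.random(4)                   # presynaptic activations in [0, 1)
    analog_weights = rng.uniform(0, 1, 4)    # continuous conductances
    binary_weights = (analog_weights > 0.5).astype(float)  # crude binary version

    analog_out = inputs @ analog_weights     # graded current, analog-style
    binary_out = inputs @ binary_weights     # coarser, quantized response

    # The study reports reducing device variability to about 1 percent; here that
    # variability is mimicked as multiplicative noise on the conductances.
    variability = 0.01
    noisy_weights = analog_weights * (1 + rng.normal(0, variability, 4))
    print(analog_out, binary_out, inputs @ noisy_weights)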

“The potential is limitless – we can integrate this technology in our phones, computers, and robots to make them substantially smarter.”

Making homes smarter

While Kim is working on making our portable products more intelligent, Professor Sanjay Sarma and Research Scientist Josh Siegel hope to integrate smart devices into the biggest product we own: our homes. One evening, Sarma was at home when one of his circuit breakers kept tripping.

If he could embed the arc-fault circuit interrupter (AFCI) with smart technologies and connect it to the ‘internet of things,’ he could teach the circuit breaker to learn when a product is safe and when it actually poses a fire risk.

“Virus scanners are connected to a system that updates them with new virus definitions over time.” If Sarma and Siegel could embed similar technology into AFCIs, the circuit breakers could detect exactly what product is being plugged in and learn new object definitions over time.

If, for example, a new vacuum cleaner is plugged in and the power shuts off even though there is no real hazard, the smart AFCI can learn that the vacuum is safe and add it to its list of known safe objects.
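The article describes this idea only conceptually. The sketch below is a hypothetical illustration (the class name, the feature vectors, and the nearest-neighbor rule are all invented here, not Sarma and Siegel's design) of a breaker that keeps a learned list of safe load signatures and stops tripping on devices it has confirmed as false alarms.

    import numpy as np

    class SmartAFCI:
        """Toy sketch of a 'learning' arc-fault circuit interrupter.

        Illustrative only: a real AFCI analyzes current waveforms; here a
        device's 'signature' is just a feature vector, and safety is decided by
        nearest-neighbor distance to previously learned safe signatures.
        """

        def __init__(self, tolerance=0.2):
            self.safe_signatures = []   # learned "object definitions"
            self.tolerance = tolerance

        def is_known_safe(self, signature):
            return any(np.linalg.norm(signature - s) < self.tolerance
                       for s in self.safe_signatures)

        def on_trip(self, signature, confirmed_false_alarm):
            # If the trip was a false alarm (e.g., a new vacuum cleaner that is
            # actually safe), learn its signature so it won't trip again.
            if confirmed_false_alarm:
                self.safe_signatures.append(np.asarray(signature, dtype=float))

        def should_interrupt(self, signature):
            # Interrupt only for signatures that match nothing known to be safe.
            return not self.is_known_safe(np.asarray(signature, dtype=float))

    afci = SmartAFCI()
    vacuum = [0.8, 0.3, 0.5]               # made-up waveform features
    print(afci.should_interrupt(vacuum))   # True: unknown device, breaker trips
    afci.on_trip(vacuum, confirmed_false_alarm=True)
    print(afci.should_interrupt(vacuum))   # False: now learned as safe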

“You don’t teach these devices all the rules, you teach them how to learn the rules.”

Making manufacturing and design smarter

Artificial intelligence can improve more than how users interact with products, devices, and environments; it can also improve how those products are designed and made.

“Having 3-D printers that learn how to create parts with fewer defects and inspect parts as they make them will be a really big deal — especially when the products you’re making have critical properties such as medical devices or parts for aircraft engines,” Hart explains. The very process of designing the structure of these parts can also benefit from intelligent software.
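Hart's point about printers that inspect parts as they make them implies a layer-by-layer feedback loop. The following is a rough, hypothetical sketch of what such an in-process inspection loop might look like; the function names, parameters, and thresholds are invented for illustration, not taken from any real system.

    import random

    # Hypothetical sketch of in-process inspection during a 3-D print: after each
    # layer, a learned defect detector scores the layer, and the printer either
    # adapts its process parameters or flags the part for review.

    def inspect_layer(layer_image):
        """Stand-in for a trained defect detector; returns a score in [0, 1)."""
        return random.random()          # a real system would score a camera image

    def adjust_parameters(params, score):
        """Nudge process parameters when defect risk rises (illustrative only)."""
        adjusted = dict(params)
        adjusted["laser_power"] *= 1.0 - 0.05 * score
        return adjusted

    params = {"laser_power": 200.0, "layer_height_mm": 0.05}
    for layer in range(10):
        layer_image = None              # would come from an in-situ camera
        score = inspect_layer(layer_image)
        if score > 0.8:                 # critical parts would use stricter thresholds
            print(f"layer {layer}: flagging part for review (defect score {score:.2f})")
        params = adjust_parameters(params, score)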

“The goal is to enable effective collaboration between intelligent tools and human designers.” In a recent study, Yang and graduate student Edward Burnell tested a design tool with varying levels of automation.

“You can think of all kinds of applications — medical, health care, factories.” Kim sees an opportunity to eventually connect his research with the physical neural network his colleague Jeehwan Kim is working on.

“Jeehwan’s neural network hardware could possibly enable that someday.” Combining the power of a portable neural network with a robot capable of skillfully navigating its surroundings could open up a new world of possibilities for human and AI interaction.

Whether it’s using face and handwriting recognition to protect our information, tapping into the internet of things to keep our homes safe, or helping engineers build and design more efficiently, the benefits of AI technologies are pervasive.

Direct Neural Interface & DARPA - Dr Justin Sanchez


Implants & Technology -- The Future of Healthcare? Kevin Warwick at TEDxWarwick

Kevin Warwick is Professor of Cybernetics at the University of Reading, where he carries out research in artificial intelligence, control, robotics and cyborgs.

WaveNet by Google DeepMind | Two Minute Papers #93

Let's talk about Google DeepMind's Wavenet! This piece of work is about generating audio waveforms for Text To Speech and more. Text To Speech basically ...

What Can We Learn From Deep Learning Programs? | Two Minute Papers #75

The paper "Model Compression" is available here: There is also a talk on it here: ..

Why you should make useless things | Simone Giertz

In this joyful, heartfelt talk featuring demos of her wonderfully wacky creations, Simone Giertz shares her craft: making useless robots. Her inventions -- designed ...

How to control someone else's arm with your brain | Greg Gage

Greg Gage is on a mission to make brain science accessible to all. In this fun, kind of creepy demo, the neuroscientist and TED Senior Fellow uses a simple, ...

Connecting devices to cookies via filtering, feature engineering, and boosting

By: Michael Sungjun Kim, Jiwei Liu, Xiaozhou Wang, Wei Yang. At IEEE's International Conference on Data Mining in 2015, Drawbridge hosted a cross-device ...
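The listing only names the approach, so here is a hedged sketch of that general pipeline: filter device-cookie candidate pairs by a shared key, describe each surviving pair with engineered features, then score the pairs with a gradient-boosted classifier. The data, feature names, and labels below are synthetic; this is not the authors' code.

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(42)

    # Step 1 (filtering): keep only device-cookie candidate pairs that share some
    # coarse key, such as an IP address, to avoid scoring every combination.
    # Step 2 (feature engineering): describe each surviving pair numerically,
    # e.g. count of shared IPs, overlap of active hours, geographic similarity.
    n_pairs = 1000
    X = rng.random((n_pairs, 3))                  # [shared_ips, hour_overlap, geo_sim]
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)     # synthetic "same user" labels

    # Step 3 (boosting): a gradient-boosted classifier scores each candidate pair.
    model = GradientBoostingClassifier().fit(X, y)
    scores = model.predict_proba(X)[:, 1]         # probability the pair matches
    print("top candidate pair score:", scores.max())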

Demystifying Machine and Deep Learning for Developers : Build 2018

To build the next set of personalized and engaging applications, more and more developers are adding ML to their applications. In this session, you'll learn the ...

Transforming medical diagnostics: Dr. Sylvia L. Asa at TEDxDistilleryDistrictWomen

Sylvia L. Asa, MD, PhD is the Pathologist-in-Chief and Medical Director of the Laboratory Medicine Program at the University Health Network and Professor in ...

Human Pose Estimation With Deep Learning | Two Minute Papers #106

The paper "Keep it SMPL: Automatic Estimation of 3D Human Pose and Shape from a Single Image" is available here: ...