Cloud Robotics: Connected to the Cloud, Robots Get Smarter

In the first “Matrix” movie, there’s a scene where Neo points to a helicopter on a rooftop and asks Trinity, “Can you fly that thing?” Her answer: “Not yet.” Then she gets a “pilot program” uploaded to her brain and they fly away.

This approach, which some are calling “cloud robotics,” would allow robots to offload compute-intensive tasks like image processing and voice recognition and even download new skills instantly, Matrix-style.

Confronted with an object it has never seen or used before, say a cup, the robot could simply send an image of it to the cloud and receive back the object’s name, a 3-D model, and instructions on how to use it, says James Kuffner, a professor at Carnegie Mellon currently working at Google, who coined the term “cloud robotics.”
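
What Kuffner describes is essentially a remote call: ship an image, get structured knowledge back. Below is a minimal sketch of such a client, assuming a hypothetical HTTP endpoint and JSON fields; the URL, field names, and protocol are illustrative, not taken from any system described here.

```python
# Hypothetical client for a cloud object-recognition service.
# The endpoint URL and response fields are illustrative assumptions,
# not part of any system described in the article.
import json
import urllib.request

CLOUD_ENDPOINT = "https://example.com/recognize"  # placeholder URL

def identify_object(image_path: str) -> dict:
    """Send a camera image to the cloud; return name, 3-D model, and usage hints."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=image_bytes,
        headers={"Content-Type": "application/octet-stream"},
    )
    with urllib.request.urlopen(request) as response:
        result = json.loads(response.read())
    # Assumed response fields: "name", "model_url" (3-D mesh), "usage"
    return {
        "name": result["name"],
        "model_url": result["model_url"],
        "usage": result["usage"],
    }

# Example: the robot photographs an unknown cup and asks the cloud about it.
# info = identify_object("cup.jpg")
# print(info["name"], info["usage"])
```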

“Cloud robotics could make that possible by expanding a robot’s knowledge beyond its physical body.” “Coupling robotics and distributed computing could bring about big changes in robot autonomy,” said Jean-Paul Laumond, director of research at France’s Laboratory of Analysis and Architecture of Systems, in Toulouse. He says that it’s not surprising that a company like Google, which develops core cloud technologies and services, is pushing the idea of cloud robotics.

Stefan Schaal, a robotics professor at the University of Southern California, says that a robot may solve a complex path-planning problem in the cloud, or possibly other optimization problems that do not require strict real-time performance, “but it will have to react to the world, balance on its feet, perceive, and control mostly out of local computation.”
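
Schaal’s division of labor can be sketched as two loops: a local control loop that never blocks, and a background request to a slow cloud planner. The planner below is a stub with made-up latency, not any particular service.

```python
# Sketch of the split Schaal describes: real-time control stays local,
# while a slow path-planning query runs in the cloud (stubbed here).
import queue
import threading
import time

plan_queue: "queue.Queue[list]" = queue.Queue()

def request_plan_from_cloud(start, goal):
    """Stand-in for a cloud planning service; assume seconds of latency."""
    time.sleep(2.0)                      # network + optimization time
    plan_queue.put([start, goal])        # a trivial two-waypoint "plan"

def local_control_loop(rate_hz=100, duration_s=3.0):
    """Balance, perception, and low-level control must not wait on the cloud."""
    dt = 1.0 / rate_hz
    current_plan = None
    t_end = time.time() + duration_s
    while time.time() < t_end:
        try:
            current_plan = plan_queue.get_nowait()   # adopt a new plan if ready
        except queue.Empty:
            pass                                     # keep using the old one
        # ... read sensors, balance, track current_plan waypoints here ...
        time.sleep(dt)
    return current_plan

threading.Thread(target=request_plan_from_cloud,
                 args=((0.0, 0.0), (1.0, 2.0)), daemon=True).start()
print("final plan:", local_control_loop())
```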

He envisions a future when robots will feed data into a “knowledge database,” where they’ll share their interactions with the world and learn about new objects, places, and behaviors.
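
One way to picture such a knowledge database is a shared store that any robot can write observations into and query later. The sketch below is a toy in-memory stand-in; the class, methods, and field names are invented for illustration.

```python
# Toy sketch of a shared "knowledge database": robots contribute what they
# learn about objects, and other robots can look it up later.
from collections import defaultdict

class KnowledgeBase:
    def __init__(self):
        self._entries = defaultdict(list)   # object name -> list of observations

    def share(self, robot_id: str, obj: str, observation: dict) -> None:
        """A robot uploads what it learned about an object."""
        self._entries[obj].append({"robot": robot_id, **observation})

    def lookup(self, obj: str) -> list:
        """Another robot asks what the fleet already knows."""
        return self._entries.get(obj, [])

kb = KnowledgeBase()
kb.share("robot_A", "plastic cup", {"grasp": "side pinch", "weight_kg": 0.05})
print(kb.lookup("plastic cup"))   # robot_B benefits from robot_A's experience
```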

Researchers at Singapore's ASORO laboratory have built a cloud computing infrastructure to generate 3-D models of environments, allowing robots to perform simultaneous localization and mapping, or SLAM, much faster than by relying on their onboard computers.
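
The article gives no implementation details for the ASORO system, but the general offloading pattern is straightforward to sketch: cheap dead-reckoning runs on board, and every few frames the robot asks a cloud mapping service for a corrected pose. Everything below, including the stubbed cloud call, is an illustrative assumption.

```python
# Generic sketch of cloud-assisted SLAM: light odometry on the robot,
# heavy map building and pose correction offloaded (stubbed here).
import random

def local_odometry_step(pose, velocity, dt=0.1):
    """Cheap dead-reckoning update that runs on the robot's own computer."""
    x, y = pose
    vx, vy = velocity
    return (x + vx * dt, y + vy * dt)

def cloud_slam_correction(keyframe_pose):
    """Stand-in for the cloud service: pretend it returns a refined pose."""
    x, y = keyframe_pose
    return (x + random.uniform(-0.01, 0.01), y + random.uniform(-0.01, 0.01))

pose = (0.0, 0.0)
for step in range(1, 51):
    pose = local_odometry_step(pose, velocity=(0.5, 0.2))
    if step % 10 == 0:                       # every Nth frame, ask the cloud
        pose = cloud_slam_correction(pose)   # drift-corrected estimate
print("final pose estimate:", pose)
```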

The approach tries to break down the computational complexity of manipulation tasks into simpler, decoupled parts: a simplified manipulation problem based on the object's “user manual,” and whole-body motion generation by an inverse kinematics solver, which the robot's computer can solve in real time. The manual would specify, for example, the position from which the robot should manipulate the object.
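
That decoupling can be made concrete: the “user manual” pins down where to grasp the object, and the robot solves only an inverse-kinematics problem on board. Here is a minimal worked example for a planar two-link arm, with made-up link lengths and grasp point.

```python
# Sketch of the decoupling: the object's "user manual" supplies a grasp
# position; the robot's onboard computer solves inverse kinematics for it.
# The arm model (planar, two links) and all numbers are illustrative.
import math

L1, L2 = 0.4, 0.3          # link lengths in meters (assumed)

def two_link_ik(x, y):
    """Analytic IK for a planar two-link arm; returns joint angles in radians."""
    d2 = x * x + y * y
    cos_q2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if abs(cos_q2) > 1.0:
        raise ValueError("grasp point out of reach")
    q2 = math.acos(cos_q2)                                  # elbow-down solution
    q1 = math.atan2(y, x) - math.atan2(L2 * math.sin(q2),
                                       L1 + L2 * math.cos(q2))
    return q1, q2

# The object's "user manual" says: grasp it at this point in the workspace.
grasp_point = (0.5, 0.2)
shoulder, elbow = two_link_ik(*grasp_point)
print(f"shoulder={math.degrees(shoulder):.1f} deg, elbow={math.degrees(elbow):.1f} deg")
```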

The iCub, an open child-sized humanoid platform, works as a “container of behaviors,” Sandini says. “Today we share simple behaviors, but in the same way we could develop more complex ones, like a pizza-making behavior, and our French collaborators could develop a crepe-making behavior.”
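
The “container of behaviors” idea amounts to a shared library of named skills that any robot of the same platform can fetch and run. Below is a toy registry sketch; the behavior names and the registry API are invented for illustration.

```python
# Toy sketch of a shared behavior repository: labs publish named behaviors,
# and any robot of the same platform can download and execute them.

SHARED_BEHAVIORS = {}   # stands in for a networked behavior repository

def publish(name):
    """Decorator a lab uses to contribute a behavior to the shared store."""
    def register(fn):
        SHARED_BEHAVIORS[name] = fn
        return fn
    return register

@publish("make_pizza")
def make_pizza(robot):
    print(f"{robot}: rolling dough, adding toppings, baking")

@publish("make_crepes")
def make_crepes(robot):
    print(f"{robot}: mixing batter, pouring, flipping")

# Another robot "downloads" a behavior it never learned locally.
behavior = SHARED_BEHAVIORS["make_crepes"]
behavior("iCub_02")
```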

Ken Goldberg: "Cloud Robotics" | Talks at Google

Ken Goldberg is the craigslist Distinguished Professor of New Media and Professor of Industrial Engineering and Operations Research (IEOR) at the University of California, Berkeley.