AI News: Virtual-reality testing ground for drones

Virtual-reality testing ground for drones

Training drones to fly fast, around even the simplest obstacles, is a crash-prone exercise that can have engineers repairing or replacing vehicles with frustrating regularity. Now MIT engineers have developed a new virtual-reality training system for drones that enables a vehicle to "see" a rich, virtual environment while flying in an empty physical space.

Pushing boundaries

Sertac Karaman, associate professor of aeronautics and astronautics at MIT, was initially motivated by a new, extreme robo-sport: competitive drone racing, in which remote-controlled drones, driven by human players, attempt to out-fly each other through an intricate maze of windows, doors, and other obstacles.

Currently, training autonomous drones is a physical task: researchers fly drones in large, enclosed testing grounds, where they often hang large nets to catch any careening vehicles. "If you want to push boundaries on how fast you can go and compute, you need some sort of virtual-reality environment," Karaman says.

Flight Goggles

The team's new virtual training system comprises a motion capture system, an image rendering program, and electronics that enable the team to quickly process images and transmit them to the drone.

The actual test space -- a hangar-like gymnasium in MIT's new drone-testing facility in Building 31 -- is lined with motion-capture cameras that track the orientation of the drone as it's flying.

With the image-rendering system, Karaman and his colleagues can draw up photorealistic scenes, such as a loft apartment or a living room, and beam these virtual images to the drone as it's flying through the empty facility.

The virtual images can be processed by the drone at a rate of about 90 frames per second -- around three times as fast as the human eye can see and process images.
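The article does not include the team's code, but the loop it describes (motion capture fixes the drone's pose, a renderer draws the virtual scene from that pose, and the frame streams back to the vehicle at roughly 90 Hz) can be sketched as follows. Every name and body below is an illustrative stand-in, not the actual Flight Goggles implementation.

```python
import time
from dataclasses import dataclass

@dataclass
class Pose:
    """6-DoF pose reported by the motion-capture system (stand-in)."""
    x: float
    y: float
    z: float      # position in the test space (meters)
    roll: float
    pitch: float
    yaw: float    # orientation (radians)

def get_mocap_pose() -> Pose:
    """Stand-in for querying the motion-capture cameras."""
    return Pose(0.0, 0.0, 1.5, 0.0, 0.0, 0.0)  # hypothetical fixed pose

def render_scene(pose: Pose) -> bytes:
    """Stand-in for the photorealistic renderer: draws the virtual room
    as seen from the given pose and returns an encoded image."""
    return b"<encoded frame>"

def transmit_to_drone(frame: bytes) -> None:
    """Stand-in for the link that beams rendered frames to the vehicle."""
    pass

FRAME_PERIOD = 1.0 / 90.0  # the ~90 frames per second quoted above

while True:
    start = time.monotonic()
    pose = get_mocap_pose()       # 1. where is the drone right now?
    frame = render_scene(pose)    # 2. what would it see in the virtual room?
    transmit_to_drone(frame)      # 3. beam that view back to the drone
    # Sleep off whatever remains of the ~11 ms frame budget.
    time.sleep(max(0.0, FRAME_PERIOD - (time.monotonic() - start)))
```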

In one experiment, the drone learned to fly through a virtual window about twice its size, set within a virtual living room. As the drone flew in the actual, empty testing facility, the researchers beamed images of the living room scene, from the drone's perspective, back to the vehicle.

Over 10 flights, the drone, flying at around 2.3 meters per second (5 miles per hour), successfully flew through the virtual window 361 times, only 'crashing' into the window three times, according to positioning information provided by the facility's motion-capture cameras.

Karaman points out that, even if the drone crashed thousands of times, it wouldn't make much of an impact on the cost or time of development, as it's crashing in a virtual environment and not making any physical contact with the real world.

Using the navigation algorithm that the researchers tuned in the virtual system, the drone, over eight flights, was able to fly through the real window 119 times, only crashing or requiring human intervention six times.
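Taken at face value, the trial counts above imply the following success rates (simple arithmetic on the reported numbers):

```python
# Success rates implied by the reported trial counts
virtual_passes, virtual_crashes = 361, 3   # virtual window, 10 flights
real_passes, real_failures = 119, 6        # real window, 8 flights

print(f"virtual: {virtual_passes / (virtual_passes + virtual_crashes):.1%}")  # 99.2%
print(f"real:    {real_passes / (real_passes + real_failures):.1%}")          # 95.2%
```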


MIT Researchers Use VR to Train Autonomous Drones

In computer science labs developing autonomous systems, the result is often regular breaks while expensive drone components are pieced back together.

Using a virtual-reality training platform, drones can be fooled into thinking they are seeing (and learning to navigate around) genuine obstacles, despite flying in a wide-open, empty space.

"If anything, the system can make autonomous vehicles more responsive, faster, and more efficient," Karaman says. "In the next two or three years, we want to enter a drone racing competition with an autonomous drone, and beat the best human player."

The platform is also useful because training drones to fly quickly through challenging environments requires them to learn to process visual information at speed.


Campus Technology News

Engineers at the Massachusetts Institute of Technology have created a virtual reality environment to train drones to fly fast around obstacles.

The conventional physical setup, flying drones in netted testing grounds, is fine for training slow-flying drones, like those used in mapping, but not suitable for training racers, whose flight can vary wildly from small changes in their code.

The training facility includes a motion-capture system that tracks the drone's position and orientation, an image rendering program that creates photorealistic images of physical environments, and tools for beaming those images into the drone's computer in flight, all to give the drone the illusion of physical obstacles as it flies through an empty room.

Karaman said the system could also be used to test new sensors, by mounting them on the drone and seeing whether performance improves or deteriorates in different environments, or to train drones to fly around humans.

For example, a human could walk around in a motion-capture suit in a remote location, and that data could be beamed into the drone's computer, making the person appear near the drone with no risk of physical harm.
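The article gives no implementation details for this scenario, but the idea (stream a person's pose from a motion-capture suit somewhere else and composite a virtual human into the drone's rendered view) might look roughly like the sketch below; the socket port, packet format, and function names are all assumptions for illustration.

```python
import json
import socket

def receive_human_pose(sock: socket.socket) -> dict:
    """Read one pose packet streamed from the remote motion-capture suit.
    Assumed packet format: {"x": ..., "y": ..., "z": ..., "joints": [...]}"""
    data, _addr = sock.recvfrom(65536)
    return json.loads(data)

def composite_scene(human_pose: dict) -> bytes:
    """Stand-in for the renderer: draws the virtual room from the drone's
    viewpoint with a virtual person placed at the streamed pose."""
    return b"<rendered frame with virtual human>"

# Hypothetical UDP listener for the remote motion-capture stream.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))

human_pose = receive_human_pose(sock)   # person walking in a suit elsewhere
frame = composite_scene(human_pose)     # the drone now "sees" someone nearby,
                                        # with no one physically at risk
```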

MIT Creates Virtual Reality Training Ground for Drones

Scientists have developed a new virtual training ground that can help fine-tune fast-flying drones, without a lot of messy cleanup or broken windows.

Flight Goggles comprises a motion capture system, an image rendering program, and electronics that enable the researchers to quickly process images and transmit them to the drone.

The drone can process the virtual images at a rate of about 90 frames per second—about three times faster than the human eye can see and process images—because of custom-built circuit boards that integrate a powerful embedded supercomputer, along with an inertial measurement unit and camera.
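Those figures set a hard real-time budget: at 90 frames per second, the whole capture-render-transmit-process cycle has to fit in about 11 milliseconds. A quick budget check, with made-up stage timings for illustration:

```python
FPS = 90
budget_ms = 1000 / FPS  # ~11.1 ms available per frame
print(f"per-frame budget: {budget_ms:.1f} ms")

# Hypothetical stage timings (illustrative only, not measured values):
stages_ms = {
    "motion capture": 1.0,
    "rendering": 6.0,
    "transmission": 2.0,
    "onboard processing": 2.0,
}
total = sum(stages_ms.values())
print(f"total: {total:.1f} ms -> "
      f"{'fits' if total <= budget_ms else 'misses'} the 90 Hz budget")
```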

In the new setup, the researchers conducted a set of experiments, including one in which the drone learned to fly through a virtual window about twice its size.

The window was set within a virtual living room, and as the drone flew in the actual, empty testing facility, the researchers beamed images of the living room scene, from the drone's perspective, back to the vehicle.

Using the navigation algorithm that the researchers tuned in the virtual system, the drone, over eight flights, was able to fly through the real window 119 times, only crashing or requiring human intervention six times.

MIT Develops Virtual Reality Training Ground for Drones

To date, training and testing drones has been a costly and time-consuming task, with engineers having to repair and replace parts before training can continue.

Dubbed Flight Goggles, the new system could reduce the number of crashes that drones normally experience in actual flight training.

The team's virtual training system comprises a motion capture system, an image rendering program, and electronics that enable the team to quickly process images and send them to the drone.

The image rendering system enables Karaman and his team to draw up photorealistic scenes, such as a loft apartment or a living room, and transmit the virtual images to the drone as it flies around the empty space.

MIT 6.S094: Deep Reinforcement Learning for Motion Planning

This is lecture 2 of course 6.S094: Deep Learning for Self-Driving Cars taught in Winter 2017. This lecture introduces types of machine learning, the neuron as a ...

MIT 6.S094: Deep Learning

This is lecture 1 of course 6.S094: Deep Learning for Self-Driving Cars (2018 version). This class is free and open to everyone. It is an introduction to the practice ...

MIT 6.S094: Introduction to Deep Learning and Self-Driving Cars

This is lecture 1 of course 6.S094: Deep Learning for Self-Driving Cars taught in Winter 2017. Course website: Lecture 1 slides: ..

FlightGoggles: Visual-inertial-odometry flight with photorealistic camera simulation in the loop

FlightGoggles: Visual-inertial-odometry flight with photorealistic camera simulation in the loop Thomas Sayre-McCord, Amado Antonini, Jasper Arneberg, Austin ...

Sparse 3D Topological Graphs for Micro-Aerial Vehicle Planning

Video accompanying the IROS 2018 submission "Sparse 3D Topological Graphs for Micro-Aerial Vehicle Planning", written by Helen Oleynikova, Zachary ...

From Perception to Decision: Deep End-to-end Motion Planning

In this work we present a model that is able to learn the complex mapping from raw 2D- laser range findings and a target position to the required steering ...