Drone With Event Camera Takes First Autonomous Flight

A few years ago, Davide Scaramuzza’s lab at the University of Zurich introduced us to a useful kind of dynamic vision sensor called an event camera.

As you start trying to do state estimation with robots that are smaller and simpler, the problem gets more and more difficult, especially if you’re trying to deal with highly dynamic platforms, like fast-moving quadrotors. In order to detect relative motion for accurate state estimation, the quadrotor’s cameras try to identify unique image features and track how those features move.

The two most significant issues are that camera images tend to blur when the motion of the sensor exceeds what can be “frozen” by the camera’s frame rate, and that cameras (just like our eyes) are very Goldilocks-y about light: It has to be just right, not too little or too much, and you can forget about moving between the two extremes.
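
To make that failure mode concrete, here is a minimal sketch of conventional frame-based feature tracking, the step that motion blur and bad exposure break. It uses OpenCV on synthetic frames; the frames, parameter values, and choice of routines are illustrative assumptions, not the authors’ pipeline.

```python
# A minimal sketch of frame-based feature tracking (the step that motion blur
# and bad exposure break). OpenCV on synthetic frames; an illustration only,
# not the authors' pipeline.
import cv2
import numpy as np

# Synthetic scene: a random texture, plus a second frame shifted by a few
# pixels to mimic camera motion between consecutive frames.
rng = np.random.default_rng(0)
frame0 = (rng.random((240, 320)) * 255).astype(np.uint8)
shift = np.float32([[1, 0, 3], [0, 1, 2]])   # 3 px right, 2 px down
frame1 = cv2.warpAffine(frame0, shift, (320, 240))

# 1) Detect distinctive corner features in the first frame.
corners = cv2.goodFeaturesToTrack(frame0, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)

# 2) Track those features into the next frame with pyramidal Lucas-Kanade
#    optical flow.
tracked, status, _err = cv2.calcOpticalFlowPyrLK(frame0, frame1, corners, None)

# 3) The per-feature displacement is the raw signal a visual-inertial state
#    estimator fuses with IMU measurements. Blur or over/under-exposure breaks
#    steps 1 and 2, which is exactly where an event camera helps.
ok = status.ravel() == 1
displacement = (tracked[ok] - corners[ok]).reshape(-1, 2)
print("tracked features:", ok.sum())
print("mean displacement (px):", displacement.mean(axis=0))
```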

From the researchers: “Furthermore, we use our pipeline to demonstrate—to the best of our knowledge—the first autonomous quadrotor flight using an event camera for state estimation, unlocking flight scenarios that were not reachable with traditional visual-inertial odometry, such as low-light environments and high-dynamic-range scenes. We demonstrate how we can even fly in low light (such as after switching the lights in a room completely off) or in scenes characterized by a very high dynamic range (one side of the room highly illuminated and the other side dark).”
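
For readers unfamiliar with the sensor, here is a minimal sketch of the data an event camera produces: asynchronous per-pixel events rather than full frames. The contrast threshold and the toy brightness images are illustrative assumptions, not values from the UZH system.

```python
# A minimal sketch of event-camera output: no frames, just a stream of
# (x, y, timestamp, polarity) events fired whenever a pixel's log-brightness
# changes by more than a contrast threshold. Threshold and images below are
# assumptions for illustration, not values from the UZH system.
import numpy as np

CONTRAST_THRESHOLD = 0.2   # assumed per-pixel log-intensity step

def events_between(log_prev, log_next, t):
    """Emit (x, y, t, polarity) for pixels whose log-brightness changed enough."""
    diff = log_next - log_prev
    ys, xs = np.nonzero(np.abs(diff) > CONTRAST_THRESHOLD)
    polarity = np.sign(diff[ys, xs]).astype(int)   # +1 brighter, -1 darker
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

# Toy scene: a bright square on a dim background moves one pixel to the right.
img0 = np.full((8, 8), 0.1)
img0[2:5, 2:5] = 1.0
img1 = np.full((8, 8), 0.1)
img1[2:5, 3:6] = 1.0
events = events_between(np.log(img0), np.log(img1), t=0.001)

# Only pixels at the leading and trailing edges of the moving square fire, and
# each pixel responds to relative (log) brightness change. That is why event
# cameras cope with fast motion, dim rooms, and scenes where one side is much
# brighter than the other.
print(len(events), "events, e.g.", events[:3])
```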

The researchers haven’t yet tested this outside (where rapid transitions between bright sunlight and dark shadow are particular challenges for robots), but based on what we’ve seen so far, it seems very promising, and we’re excited to see what kinds of fancy new capabilities sensors like these will enable.

Heterogeneous Sensor Fusion for Accurate State Estimation of Dynamic Legged Robots (RSS'17)

Heterogeneous Sensor Fusion for Accurate State Estimation of Dynamic Legged Robots Simona Nobili, Marco Camurri, Victor Barasuol, Michele Focchi, Darwin G. Caldwell, Claudio Semini, Maurice...

Your Life in 2027: A Look at the Future | Vivek Wadhwa (Full Video)

Between driverless cars and AI and gene modification technology, we're about to see a lot of very big changes in a short amount of time. But there's a danger of people rebelling because things...

Andrew Ng: Artificial Intelligence is the New Electricity

On Wednesday, January 25, 2017, Baidu chief scientist, Coursera co-founder, and Stanford adjunct professor Andrew Ng spoke at the Stanford MSx Future Forum. The Future Forum is a discussion...

D3: Pitch Challenge: New Technologies in Diplomacy, Defense, Development

The Summit's main event, the D3 Pitch Challenge, featured six finalist teams—selected from nearly 500 employee submissions across State, DoD, and USAID—who developed proposals on how...

Sidewalk Toronto Community Town Hall (11/1)

On November 1, 2017, we hosted a Community Town Hall where we invited Torontonians to join the conversation and learn more about Sidewalk Toronto. Moderator Denise Pinto asked questions of...

The Giant Robot Fight Was Totally LAME - Weekly Weird News

Winning The DARPA Grand Challenge

Google TechTalks August 2, 2006 Sebastian Thrun ABSTRACT The DARPA grand challenge, technical details enabling Sebastian Thrun's win, and an introduction to the next phase called "The Urban...

Towards Machines that Perceive and Communicate

Kevin Murphy (Google Research) Abstract: In this talk, I summarize some recent work in my group related to visual scene understanding and "grounded" language understanding. In particular,...

Stereo Dense SLAM - LDRMC 2011

Real-time demo of dense SLAM using a stereo pair with drastic lighting changes: Andrew I. Comport, Ezio Malis, Patrick Rives. Accurate Quadri-focal Tracking for Robust 3D Visual Odometry....

Combining Edge Images and Depth Maps for Robust Visual Odometry

This work is presented at the British Machine Vision Conference (BMVC) 2017. PDF: Code is available: