How Drive.ai Is Mastering Autonomous Driving With Deep Learning

Among all of the self-driving startups working toward Level 4 autonomy (a self-driving system that doesn’t require human intervention in most scenarios), Mountain View, Calif.-based Drive.ai’s scalable deep-learning approach and aggressive pace make it unique.

“There’s so much complication in driving, there are so many things that are nuanced and hard, that if you have to do this in ways that aren’t learned, then you’re never going to get these cars out there.” It’s only been about a year since Drive publicly launched, but already the company has a fleet of four vehicles navigating (mostly) autonomously around the San Francisco Bay Area, even in situations (such as darkness, rain, or hail) that are notoriously difficult for self-driving cars.

While a pedestrian in a camera image is a perceptual pattern, there are also patterns in decision making and motion planning (the right behavior at a four-way stop, or when turning right on red, to name two examples) to which deep learning can be applied.

A deep-learning system’s ability to recognize patterns is a powerful tool, but because this pattern recognition happens inside algorithms running on neural networks, a major concern is that the system is a “black box.” Once the system is trained, data can be fed to it and a useful interpretation of those data will come out.

“What we want to be able to do is to train deep-learning systems to help us with the perception and the decision making, but also incorporate some rules and some human knowledge to make sure that it’s safe.” While a fully realized deep-learning system would use a massive black box to ingest raw sensor data and translate that into, say, a turn of the steering wheel or activation of the accelerator or brakes, Drive has intentionally avoided implementing a complete end-to-end system like that, Tandon says.
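
To make that hybrid idea concrete, here is a minimal sketch of a learned planner whose proposal passes through hand-written safety rules before anything reaches the vehicle. The function names, the scene dictionary, and the rules themselves are illustrative assumptions, not Drive's actual interfaces.

```python
# Illustrative sketch only: a learned component proposes a maneuver, and
# explicit human-written rules get the final say on safety.

def learned_planner(scene):
    """Stand-in for a trained network that proposes a maneuver."""
    # A real system would run a neural network here.
    return {"maneuver": "proceed", "target_speed_mps": 12.0}

def apply_safety_rules(proposal, scene):
    """Hard rules and human knowledge that can override the learned output."""
    if scene["pedestrian_in_crosswalk"]:
        return {"maneuver": "stop", "target_speed_mps": 0.0}
    if proposal["target_speed_mps"] > scene["speed_limit_mps"]:
        proposal = dict(proposal, target_speed_mps=scene["speed_limit_mps"])
    return proposal

scene = {"pedestrian_in_crosswalk": True, "speed_limit_mps": 11.2}
print(apply_safety_rules(learned_planner(scene), scene))
# {'maneuver': 'stop', 'target_speed_mps': 0.0}
```

Because the rule layer is ordinary code, it can be validated exhaustively even when the learned component cannot.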

“If you break it down into parts where you’re using deep learning, and you understand that different parts can be validated in different ways, then you can be a lot more confident in how the system will behave.” There are a few tricks that can be used to peek into the black box a little bit, and then validate (or adjust) what goes on inside of it, say the Drive researchers.

We then augment the data set with synthetic examples, and with that, what you do is say, “Hey, system, tell me what you’re going to do on this overpass, and then I’m going to jitter it a little bit, and you’re going to do it again.” Over time, the system starts to work around overpasses, and then you can validate it on a systematic level.
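
Here is a minimal sketch of that jitter-and-revalidate idea, assuming some generic classifier callable as model(image); the toy model and the thresholds are stand-ins, not Drive's tooling.

```python
# Perturb an input slightly and check that the system's answer stays
# consistent; repeated over many synthetic variants, this becomes a
# systematic validation signal.

import numpy as np

def jitter(image, rng, max_shift=2):
    """Shift the image by a few pixels to create a synthetic variant."""
    dx, dy = rng.integers(-max_shift, max_shift + 1, size=2)
    return np.roll(np.roll(image, dx, axis=0), dy, axis=1)

def consistency_check(model, image, n_trials=10, seed=0):
    """Fraction of jittered copies on which the model's answer changes."""
    rng = np.random.default_rng(seed)
    baseline = model(image)
    flips = sum(model(jitter(image, rng)) != baseline for _ in range(n_trials))
    return flips / n_trials

# Toy "model": brightness thresholding stands in for a trained network.
toy_model = lambda img: int(img.mean() > 0.5)
image = np.random.default_rng(1).random((64, 64))
print(consistency_check(toy_model, image))  # 0.0 means fully consistent
```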

Annotation, while very simple, is also very tedious: A human annotator is presented with a data set, perhaps a short video clip or even just a few frames of video or lidar data, and is tasked with drawing and labeling boxes around every car, pedestrian, road sign, traffic light, or anything else that might possibly be relevant to an autonomous driving algorithm.
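
The product of all that tedium is structured training data. As a rough sketch (the schema below is an assumption for illustration, not Drive's format), one annotated frame might look like this:

```python
# One frame's worth of bounding-box annotations as plain data.

from dataclasses import dataclass

@dataclass
class BoxAnnotation:
    label: str          # e.g. "car", "pedestrian", "traffic_light"
    x: int              # left edge, in pixels
    y: int              # top edge, in pixels
    width: int
    height: int

frame_annotations = [
    BoxAnnotation("car", 412, 220, 96, 64),
    BoxAnnotation("pedestrian", 130, 240, 28, 70),
    BoxAnnotation("traffic_light", 610, 80, 14, 36),
]
# Many thousands of frames like this one become the training set
# for the perception network.
```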

“Already in many cases, our deep-learning systems perform better than our expert annotators.” Reiley adds: “Think about how mind-blowing that is.” It’s difficult for the Drive team to articulate exactly what prevents other companies from building their own deep-learning infrastructure and tools and doing the same kind of deep-learning-based annotation and training.
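
One plausible reading of that claim is a model-in-the-loop annotation pipeline along these lines; the detector callable and the confidence threshold are assumptions, not a description of Drive's internal tools.

```python
# A trained detector pre-labels each frame; humans only review the
# low-confidence boxes, so annotation throughput scales with the model.

def annotate(frames, detector, review_queue, confidence_threshold=0.9):
    labeled = []
    for frame in frames:
        boxes = detector(frame)  # [(label, bbox, confidence), ...]
        auto = [b for b in boxes if b[2] >= confidence_threshold]
        uncertain = [b for b in boxes if b[2] < confidence_threshold]
        if uncertain:
            review_queue.append((frame, uncertain))  # humans review these
        labeled.append((frame, auto))
    return labeled
```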

“There’s just so many components to get right throughout the entire stack that it becomes hard to say there’s one specific reason why this works well.” Reiley agrees: “Your decisions have to be software driven and optimized for deep learning, for software and hardware integration.

Autonomous driving is much more than just an algorithm; it’s a very complicated hardware-software problem that nobody has solved before.” The hardware in Drive’s four-car fleet is designed to be retrofitted onto most vehicles with a minimum of hassle, and is concentrated in an array of sensors, including cameras and lidar, located on the roof.

“We get some low-resolution depth data from the lidar and really high-resolution context information from the camera.” This kind of multimodal redundancy, with decisions made through deep learning on fused sensor data, has advantages in an autonomous-vehicle context.
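
A common fusion step, sketched minimally below, is to project sparse lidar returns into the camera image so that each detection can be tagged with a depth; the pinhole intrinsics are made-up values, and nothing here is specific to Drive's stack.

```python
# Project lidar points (in the camera frame, meters) into pixel
# coordinates, attaching low-resolution depth to high-resolution imagery.

import numpy as np

K = np.array([[1000.0,    0.0, 640.0],   # assumed pinhole intrinsics
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def project_lidar(points_xyz):
    """Map lidar points to pixel (u, v) plus per-point depth."""
    pts = np.asarray(points_xyz, dtype=float)
    pts = pts[pts[:, 2] > 0.1]           # keep points ahead of the camera
    uvw = (K @ pts.T).T
    uv = uvw[:, :2] / uvw[:, 2:3]        # perspective divide
    return uv, pts[:, 2]                 # pixel coords, depth in meters

uv, depth = project_lidar([[2.0, 0.5, 20.0], [-1.0, 0.2, 8.0]])
print(uv, depth)  # each camera detection can now be assigned a range
```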

Deep learning has a significant advantage over rules-based approaches here, since rule conflicts can lead to failures that can be, according to Pazhayampallil, “catastrophic.” And sensor failure is most often not a hardware or software issue, but rather a sensor that isn’t producing good data for some reason, like sun glare, darkness at night, or (more commonly) being occluded by water.
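
Catching bad data, as opposed to a dead sensor, can start with something as simple as per-frame statistics. A toy sketch, with thresholds that are purely illustrative:

```python
# Flag frames that are mostly black (night, occlusion) or blown out
# (sun glare) so a fusion step can lean on the other sensors instead.

import numpy as np

def camera_frame_ok(frame, max_bad_fraction=0.6):
    """Return False if too much of the frame is dark or saturated."""
    frame = np.asarray(frame, dtype=float)   # pixel values in [0, 1]
    dark = (frame < 0.02).mean()
    saturated = (frame > 0.98).mean()
    return max(dark, saturated) < max_bad_fraction

# A glare-like frame: nearly every pixel saturated.
print(camera_frame_ok(np.full((720, 1280), 0.99)))  # False
```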

“Our lidar sensors can only see about 50 to 75 meters reliably, and on a road like this, where you can have cross traffic at 45 or 50 miles per hour [72 or 80 km/h], you can’t yet with sufficient confidence detect cross traffic and know exactly what lane it’s going to be in.” When the light turned green, the car made the turn into the rightmost lane (which is legally how a right turn is supposed to be made).
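
The quoted numbers are easy to sanity-check: at 45 to 50 mph, cross traffic covers that entire 50-to-75-meter detection range in just a few seconds.

```python
# How much reaction time does 50-75 m of reliable lidar range buy
# against cross traffic at 45-50 mph?

MPH_TO_MPS = 0.44704

for range_m in (50, 75):
    for speed_mph in (45, 50):
        speed_mps = speed_mph * MPH_TO_MPS
        print(f"{range_m} m range, {speed_mph} mph traffic: "
              f"{range_m / speed_mps:.1f} s until it arrives")
# The worst case, 50 m at 50 mph, leaves only about 2.2 seconds to
# detect the vehicle, work out its lane, and commit to the turn.
```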

“In terms of path planning, having the vehicle compensate for obstructions on the fly like that is a place where we’re currently building in more capability.” This situation is more than just a path-planning problem, though: Waiting for the truck to move is the right call if the truck is active.
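
For the path-planning half, here is a toy sketch of compensating for an obstruction on the fly, using breadth-first search on a small occupancy grid as a stand-in for a real motion planner; the grid and its contents are invented for illustration.

```python
# Replan a route on an occupancy grid when blocked cells (the stopped
# truck) appear in the car's own lane.

from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path that avoids cells marked 1 (occupied)."""
    rows, cols = len(grid), len(grid[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no route around the obstruction

lane = [[0, 1, 1, 0],   # the stopped truck blocks the car's own lane
        [0, 0, 0, 0],   # the adjacent lane is clear
        [0, 0, 0, 0]]
print(bfs_path(lane, (0, 0), (0, 3)))  # detours through the open lane
```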

“Humans aren’t necessarily perfect at doing very precise things,” says Smith, “but they’re great at improvising and dealing with ambiguity, and that’s where the traditional robotics approach breaks down: when you have ambiguous situations like the one we just saw.”

These annotated data are fed into Drive’s deep-learning algorithms, and the system learns to recognize the general concept of a traffic light, much in the same way that humans do. As Smith explains: “The other thing about deep learning that’s really nice is that we get to use the context of the entire scene, not just the lights themselves.”
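
A minimal sketch of that whole-scene point, assuming PyTorch: the classifier consumes the full frame rather than a crop around the light, so surrounding cues (cross traffic moving, brake lights ahead) can influence the prediction. The tiny architecture is an assumption, not Drive's model.

```python
import torch
import torch.nn as nn

class SceneLightNet(nn.Module):
    """Toy classifier that sees the whole frame, not a cropped light."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 3)  # red / yellow / green

    def forward(self, full_frame):    # full_frame: (N, 3, H, W)
        x = self.features(full_frame).flatten(1)
        return self.head(x)

logits = SceneLightNet()(torch.rand(1, 3, 360, 640))
print(logits.shape)  # torch.Size([1, 3])
```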

For better or worse, Drive’s ability to maintain its aggressive, um, drive toward Level 4 autonomy (vehicles that operate in specific areas and conditions without a safety driver) likely depends, in the short term, on state and federal regulations.

What role does Deep Learning play in Self Driving Cars?

Deep learning and self-driving cars: autonomous driving is here, and this video takes a look at the intelligence behind it.

How Tesla's Self-Driving Autopilot Actually Works | WIRED

Exploring everything from the radars and cameras to the Mario Kart easter egg, our road trip shows Tesla's Autopilot works well, but it's no self-driving wunderkind.

Chris Urmson: How a driverless car sees the road

Statistically, the least reliable part of the car is ... the driver. Chris Urmson heads up Google's driverless car program, one of several efforts to remove humans from the driver's seat.

Cognitive Mobility: Olli the self-driving vehicle and Watson the cognitive system

Local Motors transforms the passenger experience with IBM Watson Internet of Things technology and the Watson IoT AutoLAB. Olli is on roads now in Washington, D.C., and soon in Miami-Dade County and Las Vegas.

Google autonomous vehicle: how do Google's self-driving cars work? - TomoNews

MOUNTAIN VIEW, CALIFORNIA — A Google self-driving car was finally the cause of a crash on Valentine's Day, after driving more than 1.3 million miles since 2009, reported Wired. No one was injured.

MIT 6.S094: Introduction to Deep Learning and Self-Driving Cars

This is lecture 1 of course 6.S094: Deep Learning for Self-Driving Cars, taught in Winter 2017. Contact: deepcars@mit.edu.

NVIDIA Drive PX2 self-driving car platform visualized

NVIDIA CEO Jen-Hsun Huang at GTC Europe showing off the DriveWorks platform for self-driving cars. All Tesla cars shipping today feature the NVIDIA Drive PX 2.

How Deep Learning Will Enable Self-Driving Cars

Deep learning is a branch of artificial intelligence that teaches machines to understand unstructured data, such as images, speech, and video, using algorithms (step-by-step data-crunching procedures).

[Vehicle Control 6] Automatic Steering Control for Autonomous Vehicle

An autonomous car (driverless car, self-driving car, robotic car) is a vehicle that is capable of sensing its environment and navigating without human input.

The self-healing map from HERE Technologies

HERE HD Live Map is paving the way to autonomous driving with its self-healing technology. Autonomous vehicles now have up-to-date and reliable data, critical for safe and proactive driving.