Quanergy Announces $250 Solid-State LIDAR for Cars, Robots, and More

Yesterday at CES, Quanergy, an automotive startup based in Sunnyvale, Calif., held a press conference to announce the S3, a solid-state LIDAR system designed primarily to bring versatile, comprehensive, and affordable sensing to autonomous cars.

Unlike conventional spinning units, Quanergy’s LIDAR uses an optical phased array as a transmitter, which can steer pulses of light by shifting the phase of a laser pulse as it’s projected through the array. Each pulse is sent out in about a microsecond, yielding about a million points of data per second.
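
As a rough sketch of the steering principle (not Quanergy’s actual design: the element count, spacing, and wavelength below are assumed values), the phase shift applied across the array elements sets the angle at which the combined beam leaves the chip:

```python
import numpy as np

# Illustrative 1-D optical phased array -- assumed parameters, not Quanergy's.
wavelength = 905e-9          # assumed laser wavelength (m)
spacing = wavelength / 2     # assumed emitter spacing (m)
n_elements = 64              # assumed number of array elements

def steering_phases(theta_deg):
    """Per-element phase shifts (radians) that tilt the wavefront so the
    combined beam exits the array at angle theta from broadside."""
    theta = np.radians(theta_deg)
    n = np.arange(n_elements)
    # Classic phased-array relation: phi_n = 2*pi*n*d*sin(theta)/lambda
    return (2 * np.pi * n * spacing * np.sin(theta) / wavelength) % (2 * np.pi)

print(steering_phases(10.0)[:4])  # first few element phases for a 10-degree steer
```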

This refocusing can happen dynamically and almost instantly: if your car is speeding down the road and spots a potential obstacle, it can concentrate most of its LIDAR resources on that obstacle to collect the information it needs to decide what to do next, while still collecting (slightly sparser) data from everywhere else.
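
A minimal sketch of that budgeting idea, assuming the announced million-point-per-second rate and a hypothetical region-of-interest split (the frame rate and fraction are invented for illustration):

```python
# Split a fixed point budget between a region of interest (ROI) and the
# rest of the field of view -- a sketch of the idea, not Quanergy's scheduler.
POINTS_PER_SECOND = 1_000_000  # from the announced spec
FRAME_RATE = 10                # assumed frames per second

def allocate(roi_fraction=0.6):
    """Return (roi_points, background_points) for one frame."""
    budget = POINTS_PER_SECOND // FRAME_RATE
    roi = int(budget * roi_fraction)
    return roi, budget - roi

roi_pts, bg_pts = allocate()
print(f"per frame: {roi_pts} points on the obstacle, {bg_pts} elsewhere")
```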

Frame rate is usually a big deal for spinning LIDAR systems, but with the S3 you simply get a million points per second and can process them at whatever frame rate you like, since it’s all software controlled.
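
The trade-off is plain arithmetic: at a fixed point rate, a higher frame rate simply means fewer points per frame.

```python
# Fixed throughput: points_per_frame = points_per_second / frame_rate.
POINTS_PER_SECOND = 1_000_000

for fps in (5, 10, 20, 30):
    print(f"{fps:>2} Hz -> {POINTS_PER_SECOND // fps:,} points per frame")
```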

At the same time, however, we’re obligated to point out that Quanergy has not yet demonstrated a version of the S3 that performs to the specifications announced at the press conference, which leaves us wondering whether a CES press conference of this kind might have been premature.

We’d have preferred an announcement more along the lines of, “Here it is, fully armed and operational.” And we also wouldn’t mind an “Oh, and if you want to come check it out, we’ll send a private jet.” Private jet or no private jet, Quanergy is planning to have a preproduction S3 sensor ready at the end of September 2016 so that it can spend Q4 ramping up production, with deliveries to OEMs scheduled for early 2017.

Optics & Photonics News

Beyond everyday use, self-driving cars could expand transportation options for the elderly and disabled and ease business travel by guiding drivers in unfamiliar locales.

Achieving automotive autonomy requires artificial intelligence to process and integrate data from a suite of sensors including cameras, microwave radar, ultrasonic sensors and laser radar, better known as lidar.

The difference between the SAE’s six levels of driving automation is more than academic, a fact that came into tragic focus on 7 May 2016, when a Tesla Model S with the company’s Autopilot engaged, traveling at 74 miles per hour, smashed into a tractor-trailer on a divided highway in Florida, killing the driver.

A lidar with 200-meter range could have spotted the truck in time for a suitably programmed car to apply the brakes, says Jason Eichenholz, co-founder and chief technology officer of Luminar Technologies.

At the time, car lidar ranges were limited to a few tens of meters, but in 2017 Luminar announced a 200-meter model, and this year it plans to deliver its first production run to automakers.

In fact, to achieve Level 4 or 5 autonomy, most vehicle developers envision a suite of sensors, including not just lidar but also microwave sensors, optical cameras and even ultrasound.

Laser ranging has been around since the early 1960s, when MIT’s Lincoln Laboratory measured the distance to the moon by firing 50-joule pulses from a ruby laser, with a return signal of only 12 photons.
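
The relation every pulsed rangefinder still uses is the same: distance is half the round-trip time multiplied by the speed of light. A quick check against the lunar figure:

```python
# Time-of-flight ranging: d = c * t / 2.
C = 299_792_458.0  # speed of light (m/s)

def distance_from_round_trip(t_seconds):
    return C * t_seconds / 2

# A pulse returning from the moon takes roughly 2.56 s round trip.
print(f"{distance_from_round_trip(2.56) / 1000:,.0f} km")  # ~384,000 km
```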

Velodyne founder David Hall’s vehicle didn’t finish the race, but he refined his spinning lidar design to include 64 lasers and sensors that recorded a cloud of points showing objects that reflected light in the surrounding area.

An enhanced version that scans more than two million points per second is still on the market, and Velodyne now offers smaller and less expensive versions with 32 and 16 lasers.
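
Each return from such a unit is an (azimuth, elevation, range) triple, and converting those to Cartesian coordinates is what builds the point cloud. A sketch with assumed angular steps (the vertical field loosely follows the 64-laser unit’s published figures):

```python
import numpy as np

def to_xyz(azimuth_deg, elevation_deg, r):
    """Convert spherical lidar returns to Cartesian points."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=-1)

# One revolution: 0.2-degree azimuth steps (assumed), 64 vertical channels.
az = np.repeat(np.arange(0, 360, 0.2), 64)
el = np.tile(np.linspace(-24.9, 2.0, 64), 1800)  # approximate vertical field
rng = np.full(az.shape, 50.0)                    # placeholder 50 m ranges
print(to_xyz(az, el, rng).shape)                 # (115200, 3) points per spin
```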

Eye-safety limits on the pulse power of their 905-nanometer lasers restrict their range to 100 meters for highly reflective objects and only 30 to 40 meters for dark objects with 10 percent reflectivity.
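
That falloff follows from the lidar range equation: for a diffusely reflecting target, returned power scales as reflectivity over range squared, so detection range scales with the square root of reflectivity. A worked check against the figures above (the 90 percent reference reflectivity is an assumption):

```python
import math

# Max range scales as sqrt(reflectivity) for a Lambertian target.
def range_for_reflectivity(r_ref, rho_ref, rho):
    return r_ref * math.sqrt(rho / rho_ref)

# If a ~90%-reflective object is detectable at 100 m (assumed reference),
# a 10%-reflective dark object drops to roughly a third of that range.
print(f"{range_for_reflectivity(100.0, 0.9, 0.1):.0f} m")  # ~33 m
```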

Google worked around the lidar’s short range by limiting its self-driving tests to 25 miles per hour—slow enough for cars to stop within 30 meters when they used the lidar data on fixed and moving objects for navigation and collision avoidance.
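
The arithmetic behind those margins, with assumed round-number deceleration and reaction time:

```python
# Stopping distance = reaction distance + braking distance (v^2 / 2a).
# Deceleration and reaction time are assumed round numbers.
def stopping_distance(speed_mph, reaction_s=0.5, decel=6.0):
    v = speed_mph * 0.44704  # mph -> m/s
    return v * reaction_s + v**2 / (2 * decel)

print(f"25 mph: {stopping_distance(25):.0f} m")  # ~16 m, inside a 30 m lidar range
print(f"70 mph: {stopping_distance(70):.0f} m")  # ~97 m, why 200 m range matters
```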

To avoid the high cost of spinning assemblies, the new solid-state lidars scan back and forth across fields of view typically limited to 120 degrees.

Covering a full 360 degrees would therefore require putting multiple stationary lidars pointing in different directions on the front, sides, and back of the vehicle to build a 3-D point cloud of the surroundings.

Quanergy’s S3, based on CMOS silicon, uses an optical phased array to scan half a million points a second with a spot size of 3.5 centimeters at 100 meters, says CEO Louay Eldada.
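
That spot size implies a beam divergence of roughly 0.35 milliradians, which is simple to verify:

```python
import math

# Divergence (small-angle approximation): spot size divided by range.
spot_m, range_m = 0.035, 100.0
divergence_rad = spot_m / range_m
print(f"{divergence_rad * 1000:.2f} mrad ({math.degrees(divergence_rad):.3f} degrees)")
```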

In April 2017, Velodyne Lidar announced plans for a compact 905-nm solid-state lidar that the company says will scan 120 degrees horizontally and 35 degrees vertically and have a 200-meter range, even for low reflectivity.

“We want to have higher spatial resolution, smaller angles between beams, tighter beams, and enough energy per pulse to confidently see a 10 percent reflective object at 200 meters,” Eichenholz says.

Luminar started at 905 nm and quickly realized that, because of the eye-safety issue, that wavelength didn’t allow a high enough photon budget to reach the desired range.

It moved instead to 1550 nm, where light is absorbed before it can reach the retina. That natural safety margin allows the higher pulse power needed for a 200-meter range for dark objects, far enough for cars going at highway speed to stop safely if the lidar spots a hazard.

For scanning the scene, Luminar turned to galvanometers, which, instead of spinning, bend back and forth to scan up to 120 degrees horizontally and 30 degrees vertically.
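
A triangle wave is a simple model of that back-and-forth motion (the field of view and sweep rates below are illustrative, not Luminar’s actual figures):

```python
import numpy as np

def triangle_wave(t, period):
    """Unit triangle wave: ramps -1 -> 1 -> -1 over one period."""
    phase = (t / period) % 1.0
    return 1 - 4 * np.abs(phase - 0.5)

t = np.linspace(0, 0.1, 1000)              # 100 ms of scanning
h_angle = 60 * triangle_wave(t, 0.01)      # +/-60 deg horizontal, 100 Hz sweep
v_angle = 15 * triangle_wave(t, 0.1)       # +/-15 deg vertical, one slow sweep
print(h_angle.min(), h_angle.max())        # approximately -60.0 ... 60.0
print(v_angle.min(), v_angle.max())        # approximately -15.0 ... 15.0
```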

Strobe’s lidar takes a frequency-modulated continuous-wave (FMCW) approach: when the reflected chirped signal returns to the lidar, it mixes with an outgoing chirped beam in a photodiode to produce a beat frequency, says Strobe founder Lute Maleki.
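
The beat frequency is proportional to range, so recovering distance is a matter of scaling. A sketch with assumed chirp parameters:

```python
# FMCW ranging: f_beat = 2 * R * B / (c * T) for chirp bandwidth B over
# sweep time T, so R = f_beat * c * T / (2 * B).  B and T are assumed values.
C = 299_792_458.0

def range_from_beat(f_beat_hz, bandwidth_hz=1e9, sweep_s=10e-6):
    return f_beat_hz * C * sweep_s / (2 * bandwidth_hz)

print(f"{range_from_beat(66.7e6):.1f} m")  # a 66.7 MHz beat -> ~100 m range
```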

What makes Strobe’s system possible is a self-injection-locked laser, which combines a diode laser with a millimeter-scale whispering-gallery-mode resonator to give very low noise and a very linear chirp.

The Strobe lidar tolerates interference from sunlight that can blind cameras and humans, such as a low sun reflecting off pavement, and can spot a person dressed in black walking at night on black pavement, wrote Kyle Vogt, CEO of GM’s Cruise division.

The long-range version might scan only 50 by 20 degrees, covering the road and adjoining areas out to 200 meters ahead, allowing for driving at highway speeds.

These lidars would form part of a sensor suite that interfaces with a powerful computerized navigation and control system—because, strong as lidar looks, no single technology can do all the sensing needed.

And artificial-intelligence software in the car will need to pull all of the input from these sensors together to create a single robotic sensory system—and a whole greater than the sum of its parts.

Fisker EMotion On Track For CES Debut, Will Feature 5 LiDAR Sensors

Quanergy, which supplies the EMotion’s sensors, claims industry leadership in all key areas, including power, efficiency, price, size, weight, and reliability.

According to Green Car Congress: Mechanical LiDAR units use a laser and sensor that are physically moved around to build up their view of 3D space (e.g., spinning in a circle).

Quanergy’s solid-state LiDAR uses an optical phased array as a transmitter, which can steer pulses of light by shifting the phase of a laser pulse as it is projected through the array.

With the ability to scan in every direction, the unit creates a live 3D view around a vehicle to detect, classify, and track objects in the scene.