AI News, Fatal Tesla Self-Driving Car Crash Reminds Us That Robots Aren't Perfect

The driver of the truck didn’t see the Tesla, nor did the self-driving Tesla and its human occupant notice the trailer. The Tesla collided with the truck without the human or the Autopilot system ever applying the brakes. The Tesla passed under the center of the trailer at windshield height and came to rest at the side of the road after hitting a fence and a pole.

The autopilot relies on cameras and radar to detect and avoid obstacles, and the cameras weren’t able to effectively differentiate “the white side of the tractor trailer against a brightly lit sky.” The radar should not have had any problems detecting the trailer, but according to Musk, “radar tunes out what looks like an overhead road sign to avoid false braking events.” We don’t know all the details of how the Tesla S’s radar works, but the fact that the radar could likely see underneath the trailer (between its front and rear wheels), coupled with a position that was perpendicular to the road (and mostly stationary) could easily lead to a situation where a computer could reasonably assume that it was looking at an overhead road sign.
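The reasoning above can be sketched as a toy filter. This is an illustration only: the function name, thresholds, and inputs are assumptions made up for this example, not Tesla's actual radar logic, but they show how a stationary, elevated, wide return can plausibly be binned with overhead road signs.

```python
# Illustrative sketch only: a simplified heuristic for why a radar might
# discard a stationary, elevated return as an "overhead sign." All names
# and thresholds here are assumptions for illustration, not Tesla's code.

def classify_radar_return(elevation_deg, ground_speed_mph, lateral_extent_m):
    """Classify a radar return as a braking threat or ignorable clutter."""
    is_stationary = abs(ground_speed_mph) < 2   # object barely moving over ground
    looks_elevated = elevation_deg > 3          # return arrives from above bumper height
    is_wide = lateral_extent_m > 2              # spans the lane, like a sign gantry

    # A high, wide, stationary object matches the signature of an overhead
    # road sign, so it is tuned out to avoid false braking events. A trailer
    # crossing the road, seen between its wheels, can match the same signature.
    if is_stationary and looks_elevated and is_wide:
        return "ignore: probable overhead structure"
    return "track: potential obstacle"
```

The point of the sketch is that the failure is not a sensor fault but a classification trade-off: any rule loose enough to suppress sign gantries risks suppressing a perpendicular, nearly stationary trailer seen at similar height.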

Tesla’s statement also emphasized that, despite being called “Autopilot,” the system is assistive only and is not intended to assume complete control over the vehicle: “It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.”

Vehicle-to-Vehicle Communication: The NHTSA is currently studying vehicle-to-vehicle (V2V) communication technology, which would allow vehicles “to communicate important safety and mobility information to one another that can help save lives, prevent injuries, ease traffic congestion, and improve the environment.” If (or hopefully when) vehicles are able to tell all other vehicles around them exactly where they are and where they’re going, accidents like these will become much less frequent.
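A V2V exchange of the kind described above can be sketched in a few lines. The message fields are loosely modeled on the SAE J2735 basic safety message that V2V research vehicles broadcast several times per second; the field names and the crude conflict check are simplifying assumptions for illustration, not a real protocol implementation.

```python
# Illustrative V2V sketch: each vehicle broadcasts a small safety message,
# and a receiver flags crossing paths, as with a truck turning left across
# an oncoming car's lane. Fields and logic are simplified assumptions.
from dataclasses import dataclass

@dataclass
class BasicSafetyMessage:
    vehicle_id: str
    latitude: float      # degrees
    longitude: float     # degrees
    heading_deg: float   # 0 = north, clockwise
    speed_mps: float     # meters per second

def paths_may_conflict(a: BasicSafetyMessage, b: BasicSafetyMessage) -> bool:
    """Crude check: headings roughly perpendicular and both vehicles moving."""
    angle = abs(a.heading_deg - b.heading_deg) % 360
    crossing = 45 < min(angle, 360 - angle) < 135
    return crossing and a.speed_mps > 1 and b.speed_mps > 1

car = BasicSafetyMessage("car-01", 29.63, -82.45, 270.0, 29.0)
truck = BasicSafetyMessage("truck-02", 29.63, -82.45, 180.0, 5.0)
print(paths_may_conflict(car, truck))  # headings 90 degrees apart -> True
```

The key property V2V adds is independence from line of sight and lighting: a white trailer against a bright sky is invisible to a camera, but its broadcast position and heading are not.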

This bypasses almost all front-impact safety systems on the passenger vehicle, and as Tesla points out, “had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.” If Tesla comes up with a software fix, which seems like the most likely scenario, all other Tesla Autopilot systems will immediately benefit from improved safety.

Similar mistakes are possible, but as Tesla says, “as more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing.” The near infinite variability of driving on real-world roads full of unpredictable humans means that it’s unrealistic to think that the probability of injury while driving, even if your car is fully autonomous, will ever reach zero.

This is Tesla’s first Autopilot-related fatality in 130 million miles [210 million km]: humans in the U.S. experience a driving fatality on average every 90 million miles [145 million km], and in the rest of the world, it’s every 60 million miles [100 million km].
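Normalizing those figures to a common denominator makes the comparison in the statistics above concrete:

```python
# Fatalities per 100 million miles, from the figures quoted above:
# one Autopilot fatality in 130M miles, one U.S. fatality per 90M miles,
# one worldwide fatality per 60M miles.
rates = {
    "Tesla Autopilot": 100 / 130,
    "U.S. average":    100 / 90,
    "World average":   100 / 60,
}
for name, rate in rates.items():
    print(f"{name}: {rate:.2f} fatalities per 100M miles")
```

One caveat worth keeping in mind: a single fatality is a very small sample, so this comparison is suggestive rather than statistically conclusive.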

Feds to investigate Tesla crash that driver blamed on Autopilot

The outcome could have been much worse if firefighters had been standing at the back of the truck, Battalion Chief Ken Powell told the San Jose Mercury News.

NTSB investigators said the accident resulted from a combination of factors, including the limitations of the system as well as the actions of both the Tesla driver and the driver of the semitrailer the car crashed into.

Driver in Tesla crash relied excessively on Autopilot, but Tesla shares some blame, federal panel finds

'We appreciate the NTSB's analysis of last year's tragic accident and we will evaluate their recommendations as we continue to evolve our technology,' the company said in a written statement.

Tesla Bears Some Blame for Self-Driving Crash Death, Feds Say

It's been nearly a year and a half since Joshua Brown became the first person to die in a car driving itself.

According to the car's driving manual and the disclaimer drivers accept before they can engage it, the system should only have been used on highways with clear lane markings, strict medians, and exit and entrance ramps.

So when a tractor trailer turning left crossed into the Model S's lane, the system did not recognize it—and the car crashed into its side, killing Brown instantly.

The National Highway Traffic Safety Administration, the government's vehicle safety watchdog, concluded in January that because Brown was supposed to be monitoring the car's driving, human error—not Tesla tech—caused the crash.

The automotive industry promises that fully driverless cars will sharply reduce the roughly 35,000 American road deaths per year, 94 percent of which result from human error.

But while roboticists wrangle with the complex problems that stand in the way of full self-driving, carmakers are rolling out semi-autonomous features, which help drivers perform some driving tasks.

“We are inherently imperfect beings, and automated systems can help compensate for that,” says David Friedman, who ran NHTSA for part of the Obama administration and now directs cars and product policy at Consumers Union.

They mostly just follow lane lines and stay clear of other vehicles, and rely on the human driver to take control if anything goes wrong, like if the lines disappear or the weather gets dicey.

Before the Florida crash, Tesla's Autopilot system was programmed to give repeated auditory and visual warnings when a driver went a few minutes without touching the steering wheel, but that's where its powers of persuasion stopped.
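The escalation behavior described here can be sketched as a small state function. The time thresholds and alert labels below are assumptions invented for illustration, not Tesla's actual parameters; what the sketch captures is the design the text describes, where warnings repeat but the system never forces the driver to retake control.

```python
# Illustrative sketch of a hands-off warning escalation, per the behavior
# described in the text. Thresholds and labels are assumed, not Tesla's.

def autopilot_alert(seconds_hands_off: float) -> str:
    """Map time since last steering-wheel touch to an alert level."""
    if seconds_hands_off < 120:
        return "none"
    if seconds_hands_off < 180:
        return "visual warning"
    # Pre-Florida-crash behavior per the text: warnings repeat indefinitely,
    # but the system takes no further action to regain driver attention.
    return "visual + auditory warning (repeating)"
```

Note there is no terminal state that disengages the system or slows the car; that absence is precisely where, as the text puts it, its powers of persuasion stopped.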

When NHTSA concluded its investigation in January, it found that Autopilot hadn’t malfunctioned: Because it was designed for driving on highways with exit and entrance ramps, it wasn't expected to detect a truck turning across the car's path.

“There has been a reluctance across the industry to deploy cameras in the vehicle because at the end of the day there are individual privacy concerns,” says Bryan Reimer, an engineer who studies driver behavior at MIT.

"It highlights that with the introduction of ever smarter technologies, the companies developing such systems have to carefully consider how to keep the human in the loop if they want to rely on the human operator as a 'backup.'"

Statistically, self-driving cars are about to kill someone. What happens next?

The first known death caused by a self-driving car was disclosed by Tesla Motors on Thursday, a development that is sure to cause consumers to second-guess the trust they put in the booming autonomous vehicle industry.

The 7 May accident occurred in Williston, Florida, after the driver, Joshua Brown, 40, of Ohio, put his Model S into Tesla’s autopilot mode, which is able to control the car during highway driving.

Against a bright spring sky, the car’s sensor system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said.

Owner video of Autopilot steering to avoid collision with a truck: https://t.co/FZUAXSjlR7

In its 537-word statement on the incident, the electric vehicle company repeatedly went out of its way to shift blame for the accident.

“He was a friend to Tesla and the broader [electric vehicle] community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission.”

Our condolences for the tragic loss https://t.co/zI2100zEGL

“Preliminary reports indicate the vehicle crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway.”

The car continued to travel after passing under the truck’s trailer, veered off the road, and then crashed through two fences and into a power pole, the local police report said.

Tesla has generated enormous fanfare with its autopilot mode and inspired consumers – despite the company’s warnings – to see just how much they can do while letting the car drive.

Tesla Autopilot Predicts Crash Compilation 2

Could Tesla's Autopilot system have predicted that the two vehicles in front of it would collide?

Tesla Driver Killed In Crash With Autopilot System Driving

Tesla's autopilot system was at the helm when a Model S driver was killed in a Florida crash. Kiet Do reports. (6/30/16)

See Motorists Play, Read and Relax In Self-Driving Cars As Second Tesla Crashes

From playing patty cake to taking part in arm wrestling matches, these motorists appear to be concentrating on anything but the road. The startling videos posted to YouTube show people lounging...

Tesla Autopilot predicts an accident caught on dashcam a second later


Testing Tesla's Autopilot System At 70mph

What's it really like trusting a car with your life at 70mph?

Tesla Autopilot Predicts Car Crash Before It Happens

Tesla's Autopilot system detects all vehicles around it and makes a beep sound if they are going to crash.

Watch This Grandmother Freak Out Behind The Wheel of Self-Driving Tesla

The Tesla electric car prides itself on being a marvel of technology but one recent driver wasn't exactly thrilled. A grandmother's reaction to the autopilot had her looking like she was on...

Tesla autopilot crash: Tesla Model S rear ends fire truck in Culver City, California - TomoNews

CULVER CITY, CALIFORNIA — The U.S. government is looking into an accident in which a Tesla Model S on Autopilot rear-ended a fire truck. Culver City firefighters were attending the...

Tesla crashes into concrete barrier....ON AUTO PILOT!

Video shows a Tesla Model S that didn't adjust for a construction zone lane change, the result......BOOM!! Now they want driverless 18-wheeler trucks at 80,000 lbs roaming the highways with...

What Happens If You Leave Tesla Autopilot On FOREVER? (Terrible Idea)

What happens if you turn on Autopilot, take your hands off the wheel, ignore all of Tesla's self-driving warnings, and leave it on while driving? Well, it does not end well. Don't try this!