Self-Driving Tesla Was Involved in Fatal Crash, U.S. Says
Source: NYTimes
The driver of a Tesla Model S electric car was killed in a crash that occurred while the vehicle was driving itself in autopilot mode, according to the National Highway Traffic Safety Administration, which has opened a formal investigation.
The accident, on May 7, is thought to be the first death resulting from a crash involving a self-driving car. Automakers and Silicon Valley companies like Tesla and Google are pushing to perfect automated vehicles and speed their introduction.
In a statement, the safety agency said it learned of the fatality from Tesla, and has sent an investigative team to examine the vehicle and the crash site in Williston, Fla., about 100 miles northwest of Orlando. The team is looking at the car's automated driving system and whether it played a role in the crash.
Preliminary reports indicate the crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection on a highway without controlled access, the agency said. The driver of the Tesla died of injuries sustained in the crash.
The safety agency is working with the Florida Highway Patrol in the inquiry. The agency cautioned that the opening of an investigation does not mean it believes there is a defect in the vehicle being examined.
Read more: http://www.nytimes.com/2016/07/01/business/self-driving-tesla-fatal-crash-investigation.html?_r=0
seabeckind
(1,957 posts)Sometimes stopping just causes a bigger accident with you scrunched in the middle. Much better to watch it from the berm.
forest444
(5,902 posts)As imperfect as we are, there is - as yet - no substitute for a human being and his/her pineal gland.
jtuck004
(15,882 posts)mike_c
(36,281 posts)I mean, the accident would have been much less likely if the same autonomous guidance system running in the Tesla were also running in the big rig. The solution isn't to "fix the other driver's errors." It's to replace the error-prone driver with something less dangerous.
jtuck004
(15,882 posts)a ways from that really being the solution.
Better people drivers would help, 'cause we aren't gonna get rid of them until we do some major infrastructure re-design and re-building. There will be a lot of retrofitting, but at some point the future will likely need to look different.
But before that, starting about 40 years ago or so, we really, really need to have a conversation about what life is gonna be like in the future, with a handful of robot owners and a world of hungry people.
Else the robots ain't gonna have any Ms. Daisies, or Mr. Daisies for that matter, to drive for.
AtheistCrusader
(33,982 posts)Zero evidence at this point that any human driver could have done better.
scscholar
(2,902 posts)This was not a driver error. They admitted to ignoring the white trailer against a white background, thus committing murder.
jtuck004
(15,882 posts)regardless of what software the other car has.
There is no evidence that a driver might have made the distinction.
Regardless, we kill off about 30-40K of ourselves, mostly voluntarily, every year while driving. The software still has a ways to go before it can be declared even close to how murderous we are.
jberryhill
(62,444 posts)LiberalLovinLug
(14,169 posts)Wait for the 2.0 or even 3.0 version.
A self driving car sharing the road with so many idiots...what could possibly go wrong?
Kashkakat v.2.0
(1,752 posts)TOLD THEM TO.... I refuse to drive or ride in any self driving car, period.
Figure I might give it a spin in year 2073 or so.
Darb
(2,807 posts)I can assure you of that.
progree
(10,901 posts)Last edited Sun Jul 3, 2016, 09:57 PM - Edit history (1)
The male driver died in a May 7 crash in Williston, Fla., when a big rig made a left turn in front of his Tesla.
In a blog post, Tesla Motors Inc. said the car passed under the trailer, with the bottom of the trailer hitting the Model S windshield.
Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied, Tesla said.
Sixty-two-year-old Frank Baressi, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him."
ErikJ
(6,335 posts)But it might have been too late anyway. Auto pilots are probably not paying too close attention half the time.
progree
(10,901 posts)and Tesla claims no differently
But many will be quick to remind that Tesla's Autopilot is not a fully self-driving system anyway; it's generally considered Level 2 on NHTSA's 0-4 scale of autonomy. At Level 2, there's still a lot that the driver is responsible for, which is really laid bare in an unusually long series of warnings printed in the Model S owner's manual:
And then page after page of warnings from the Tesla manual. They call it "traffic-aware cruise control."
Shankapotomus
(4,840 posts)A white background?
FrodosPet
(5,169 posts)Humans look at things, and based on a combination of appearance, context, and prior experience, are good at understanding what they are and how they may affect their moving vehicle.
Computers can make educated guesses based on those factors, but keep in mind that the world is made up of a lot of different things with different shapes and colors and lighting conditions. To be able to figure out what that constantly shifting blob ahead of you is going to do takes enormous processing power. First you have to find edges and key points. Remember, you are in constant motion, travelling at 102.667 feet per second (31.293 meters per second) at a freeway speed of 70 mph (112.654 kph). The car has to understand what in the world is going on in a wide cone ahead. It needs to understand the world, track potentially hazardous objects, such as vehicles, people, animals, and debris, and calculate an optimal path at least 20 times per second, which is 5.133 feet of travel.
Optical systems (stereoscopic cameras) face a lot of challenges in performing that task. Dust and dirt, either in the air or on the sensors; fog, darkness, shadows, lens flare; and washout from headlights, streetlights, and the sun all make the task of detecting shapes based on like colors and edge detection even more difficult.
The consequences of design flaws and inadequate resources are major. We need to get this right as much as humanly possible.
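The speed arithmetic in the post above can be sanity-checked with a few lines (a minimal sketch; the 70 mph freeway speed and 20-updates-per-second rate are the post's own figures, not anything confirmed about Tesla's hardware):

```python
# Convert a freeway speed into feet/meters per second and distance per update.
MPH_TO_FPS = 5280 / 3600      # feet per second in one mile per hour
FT_TO_M = 0.3048              # meters per foot

speed_mph = 70.0              # freeway speed from the post
updates_per_sec = 20          # claimed perception update rate

fps = speed_mph * MPH_TO_FPS            # feet traveled per second
mps = fps * FT_TO_M                     # same speed in meters per second
ft_per_update = fps / updates_per_sec   # distance covered between updates

print(f"{fps:.3f} ft/s = {mps:.3f} m/s; {ft_per_update:.3f} ft per update")
```

This reproduces the post's 102.667 ft/s, 31.293 m/s, and 5.133 ft figures, so the numbers quoted there are internally consistent.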
bucolic_frolic
(43,123 posts)self-driving cars will be such a target, almost a badge of honor
I doubt they will be around in 20 years
ErikJ
(6,335 posts)1 fatality per 140 million miles vs 1 per 90 million miles for human drivers.
bucolic_frolic
(43,123 posts)No robot is going to drive me anywhere
ErikJ
(6,335 posts)Saying they'd never give up their horse. It would be very scary the 1st few times out.
bucolic_frolic
(43,123 posts)but it will never happen for me
I like to drive, I like the feel of the wheel
I don't like to ride in a car when others drive
Darb
(2,807 posts)Your driving days are numbered. I'd say about 3,000, max.
OwlinAZ
(410 posts)I would go into debt for a Tesla.
mahatmakanejeeves
(57,378 posts)You have the car to take you to the grocery store and doctors' appointments.
Because of your deteriorating eyesight and reaction time, you no longer feel safe behind the wheel. Is someone going to take you where you want to go, day after day? I doubt it.
Your car does the job for you, and it never complains.
I wouldn't worry about "technology that thinks it's smarter than me but isn't." I would worry about you - or me - having a bad day, or not having gotten enough sleep last night, or trying to get home before too long, even though I really should pull over.
Hope that whatever caused this can be corrected.
snooper2
(30,151 posts)ErikJ
(6,335 posts)and just googled it again. I found one that cited 130 million miles vs 100 million.
840high
(17,196 posts)penndragon69
(788 posts)Will be the death of self-driving cars!
olddad56
(5,732 posts)if they can be hacked into and the hacker can take control. That could get a bit scary.
ErikJ
(6,335 posts)They get in and end up at the police HQ? lol
They just stand in the middle of the road to stop the car, and then mug you at gunpoint.
ErikJ
(6,335 posts)If they try to take the car they might not get far if they don't know how to operate it. But the car could be pre-programmed to drive to the nearest police station if it's carjacked somehow.
jberryhill
(62,444 posts)But when an ill-intentioned pedestrian can stop any car they want to, then there are new possibilities.
Imagine city traffic with no-risk jaywalking. What keeps most pedestrians in the crossings and with the lights is, fundamentally, the proposition that they might get run over if they walk out into traffic.
struggle4progress
(118,273 posts)Here are the current NHTSA statistics on fatalities per 100 million road miles: 1.08
That looks like a wash
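The two rates being traded in this thread can be put on the same footing with a quick calculation (a sketch; the 130-million-mile figure is Tesla's cited Autopilot mileage at one fatality, and 1.08 is the NHTSA human rate quoted above):

```python
# Express both fatality rates as deaths per 100 million vehicle miles.
autopilot_miles_per_fatality = 130e6   # Tesla's cited Autopilot miles, 1 fatality
human_rate_per_100m = 1.08             # NHTSA figure quoted in the post above

autopilot_rate_per_100m = (1 / autopilot_miles_per_fatality) * 100e6

print(f"Autopilot: {autopilot_rate_per_100m:.2f} per 100M miles "
      f"vs human: {human_rate_per_100m:.2f} per 100M miles")
```

With a single Autopilot fatality in the sample, the difference between roughly 0.77 and 1.08 is not statistically meaningful, which is the "wash" observation above.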
PoliticAverse
(26,366 posts)ErikJ
(6,335 posts)The radio waves bounce back above the Tesla, making high trucks invisible. This is the same problem motorhomes with clearance radars have: it's almost impossible to tell how high a bridge is going to be until you're under it.
It might be the self-driver's fatal flaw.
Hassin Bin Sober
(26,324 posts)GPS and transponders will have to tell other vehicles in proximity where they are and where they are going.
Radar is a necessary feature for now. Maybe GPS transponders will be required even on human-driven cars in the near future.
The FAA is moving away from radar and will soon rely on the aircraft's gps alone to report position.
Thor_MN
(11,843 posts)but the tractor would reflect just as much or more than any other vehicle. The notion that high vehicles are invisible is ridiculous.
leveymg
(36,418 posts)Not one of the better ideas. Autopilot is fine at 35,000 feet in air-traffic-controlled airspace, but not on a highway where other drivers can, and often do, make sudden lane changes and other stupid things.
ErikJ
(6,335 posts)for the car to "see". Therefore the higher truck bed is indeed invisible. The car is sensing nothing in front of it, because its beam passes UNDER the bed of the truck. It might if it were further away, but at close range it would not.
It's the same problem bridge-clearance radars have for trucks. It would be fantastic if the truck or RV could sense whether a bridge ahead is too high or too low before it got there, but the physics of the problem is impossible.
Thor_MN
(11,843 posts)Do it at a shallow angle. If you are right, you will not be able to see the side of the building, because the light is reflecting away from you at an angle you can't see. (Hint: objects are not perfect reflectors.)
I noted the issue with the space under a trailer.
What is a bridge that is too high?
I suggest you do actually perform the experiment that I suggested. It might help you understand the actual physics.
ErikJ
(6,335 posts)But the truck turned suddenly in front of the Tesla, so not enough of the radar signal could bounce back to the car.
The farther away you stand when you shine the flashlight straight ahead at the building, the more of the building you can see. Now shine the flashlight from 10 feet away straight ahead and you won't be able to see anything above the beam.
I would presume the radar can radiate to the sides pretty far to avoid car collisions, so maybe the radar has a shallower vertical beam than horizontal. I know my backup camera is like that: it can show far off to the sides much more than vertically.
Thor_MN
(11,843 posts)Surfaces are not perfect mirrors. RADAR will bounce back to the detector unless the car is UNDER the trailer. The RADAR on my boat will see seagulls flying higher than my boat until they get to an angle of about 20-30 degrees above "level".
Which is mostly irrelevant, as the autopilot relies more on cameras than RADAR.
ErikJ
(6,335 posts)As I said, my backup camera somehow has 2 or 3 times the horizontal field of view as vertical, which, since it's mounted at the top of my motorhome, makes it impossible to see when an object is about to hit my bumper.
Thor_MN
(11,843 posts)From the distance where the car could have braked to a stop to avoid the accident, there is no way the bottom of a semi trailer is at an angle that would put it out of view of the radar.
A Tesla S can brake 60-0 in 108 feet. A semi trailer's bed varies, but is around 4 to 5 feet off the ground. Subtract the height of the radar sensor and the depth of the bed and call it, very generously to your argument, 3 and a half feet in 108 feet. That's an angle of 1.86 degrees above horizontal.
If you think the long range RADAR on a Tesla has an 18 degree horizontal (side to side) cone but less than 2 degrees vertical, then I can see how you would believe that a semi trailer would be "too high", but you would still be wrong.
At your "2 to 3" ratio, the long distance RADAR would have a 6 degree to 9 degree vertical cone. Even if one made the ridiculous mistake of putting half that facing below horizontal, we are talking 3 to 4.5 degrees. The bottom of the trailer would have to be 5 feet 8 inches to 8 feet 6 inches off the ground to be "too high" to be in the cone.
And that is at minimum distance. Get farther back and your argument becomes even more ridiculous.
ErikJ
(6,335 posts)Thor_MN
(11,843 posts)Try running under one sometime when you see one parked.
Try not to hit any spare tires, gear mounted underneath, spoilers etc.
Not to mention that if the bottom of the trailer was over 6 feet off the ground, the Tesla would have passed cleanly under it.
The semi turned too close in front of the Tesla for either a human driver or a computer to stop the car. That the Tesla did not apparently brake itself is disturbing, almost as much as that the now dead guy behind the wheel was watching a movie.
whistler162
(11,155 posts)will occasionally turn off for no obvious reason and if there is a heavy storm, snow or rain, forget about using it. Love the system but it is still a work in progress.
FailureToCommunicate
(14,012 posts)uncle ray
(3,156 posts)7962
(11,841 posts)BootinUp
(47,138 posts)ANYONE that thought self-driving cars were a good idea is an idiot.
whatthehey
(3,660 posts)Tesla is very clear about what autopilot is, and a self-driving car it is not.
Note the bits about assistance, reducing driver workload, being alert and keeping your hands on the wheel at all times?
A self-driving car would make all those things irrelevant and in fact exclusionary.
https://www.teslamotors.com/presskit/autopilot
Darb
(2,807 posts)when they are the only game in town.
Hassin Bin Sober
(26,324 posts).... of a "save" he claims his car "Tessy" performed.
I hate to say it, but I'm getting the feeling this guy was a little too impressed with the car's "autopilot"
Is that what they call this system? If so, that's irresponsible.
most curves (though it doesn't seem to get ALL of them right yet). It also does NOT dive off exits any more; that used to be a 50/50 chance on whether it would follow the lane marking off an exit if you were driving down the right lane.
Things autopilot has trouble with:
1. Vertical hills. Meaning, if you crest over the top of a hill the camera seems to have a hard time figuring out if the lines continue straight or go to the right or left. I urge caution when you see yourself cresting over the top of a relatively steep hill.
2. Stopped vehicles. The car sees moving vehicles GREAT, even really slowly moving vehicles. It has a harder time with stopped ones. I suspect this is due to how radar works; I won't go into the physics of that here aside from saying it's because of Doppler shift. I believe the camera is typically what ends up picking up on the stopped vehicles. Educated guess on my part. I've never had it NOT see one and not stop, but I've been bold enough to let it really need to slam on the brakes pretty hard.
I don't see this as an overall issue. It's learning fast and getting better at those things. It's autopilot. You do need to pay some attention to the road. Though not a whole lot...
https://m.
He has several videos demonstrating the "autopilot "
https://m.
bananas
(27,509 posts)longship
(40,416 posts)Few paved roads. Almost all roads are two lanes with no shoulder. Lots of snow in winter when there is no road to be seen.
And somehow in this environment I am to somehow trust GPS? Hell, I'd end up in Lake Michigan! Or a deep snow bank. Or fucking Grass Lake!
No thank you. I will drive for myself.
Eric J in MN
(35,619 posts)..and the same thing would have happened if the driver were in regular mode.
csziggy
(34,135 posts)As Will Fealey commented on the article at this link http://electrek.co/2016/06/30/tesla-autopilot-fata-crash-nhtsa-investigation/ that PoliticAverse posted, the lack of any barrier under the sides of the tractor-trailer allowed the low-riding Tesla to go under the trailer, which ripped the roof off the car. He wrote an article of his own about this issue: Why the Tesla accident had nothing to do with the safety features of the Model S.
Florida used to have a law that required lower bumpers on jacked-up vehicles, but I don't know if that would have covered the sides of trailers. The law was passed after two friends of mine were killed when they hit the rear of a jacked-up mud truck. Both were decapitated. The mother of one of the men pushed for a law to require lowered bumpers as part of any modification to raise a vehicle's frame. Unfortunately that law expired, so those dangerous vehicles are allowed on the roads again - and I can't look up what provisions it had for trailers, if any.
MrsMatt
(1,660 posts)I read he was watching a movie at the time of the accident.
Eugene
(61,859 posts)Source: The Guardian
Sam Levin and Nicky Woolf in San Francisco
Friday 1 July 2016 18.43 BST
The Tesla driver killed in the first known fatal crash involving a self-driving car may have been watching a Harry Potter movie at the time of the collision in Florida, according to a truck driver involved in the crash.
The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver, Joshua Brown, 40, was "playing Harry Potter on the TV screen" during the collision and was driving so fast that "he went so fast through my trailer I didn't see him."
The disclosure raises further questions about the 7 May crash in Williston, Florida, which occurred after Brown put his Model S into Tesla's autopilot mode, which is able to control a car while it's driving on the highway.
The fatal crash, which federal highway safety regulators are now investigating, is a significant setback and a public relations disaster for the growing autonomous vehicle industry. Tesla Motors Inc's shares, however, were down less than 1% on Friday in early trading.
[font size=1]-snip-[/font]
Read more: https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter
UMTerp01
(1,048 posts)Seriously....one fatality in over 100 million miles driven vs the constant accidents and fatalities that happen with distracted, drunk, crappy drivers every day. Yeah no....I'll take a self driving Tesla any day rather than a lot of these fools out here I see who need not have a license....PERIOD!!!
ErikJ
(6,335 posts)...............................snip
The Tesla Model S in Autopilot mode seems to have a large, important blind spot above the car's hood. The primary forward-facing sensors used by autopilot are a radar emitter located on the front of the car, centrally and below the upper false grille area, and a camera mounted at the top of the windshield in front of the rear-view mirror assembly.
The camera provides the system with lane-keeping information and speed-limit data; the radar unit provides the ability to detect cars in front and determine how far away they are. These sensors combine with a dozen ultrasonic sensors around the car that give a limited-range (about 16 feet) view of the area around the car covering a full 360°.
While these do provide an impressive sense of the surrounding world to the car's electronic brains, they do seem to leave a large hole from, essentially, the hoodline of the car and up. The upper camera doesn't appear to be tasked with looking for obstacles in that volume of space, and the forward radar assembly is calmly unaware of what is happening less than a foot or so above it.
Tesla's own literature seems to confirm this blind area, as their Autopark/Autopilot instructions include this:
Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling.
This one sentence manages to give a pretty good idea of the range that the Tesla system is capable of seeing: a horizontal plane of reality that's about as thick as the car's face and hovering about six feet above the ground.
..............more
http://jalopnik.com/does-teslas-semi-autonomous-driving-system-suffer-from-1782935594
Odin2005
(53,521 posts)This guy had to know that the self-driving AI wasn't all there yet, and yet he still decided to watch Harry Potter?
Too bad all the luddites will start bleating about how this shows that self-driving cars are evil and must be banned.
TheFarseer
(9,319 posts)Last edited Tue Jul 5, 2016, 07:15 AM - Edit history (1)
Sorry to be a luddite but tens of millions of Americans will be without a job. And the damn thing will probably drive 35 in a 35 when everyone else is going 50.
ram2008
(1,238 posts)That is the price of progress. New jobs will be created, obsolete jobs removed. Also, self-driving cars would almost certainly reduce casualties, clear congestion, and keep the flow of traffic smooth. Overall, it would be a net positive for society.
TheFarseer
(9,319 posts)I'm not even a truck driver or anything.
ram2008
(1,238 posts)No need to prop up dead industries. Coal is another good example.
killbotfactory
(13,566 posts)Then yes, screw self driving cars.
Nihil
(13,508 posts)> Sorry to be a luddite but tens of millions of Americans will be without a job.
> And the damn thing will probably drive 35 in a 35 when everyone else is going 50.
On the other hand, tens of thousands of Americans will still be alive, hundreds of thousands will remain uninjured, and people will have to learn to allow enough time for their journey rather than putting their own impatience & incompetence above the safety (and equally valuable time) of other road users.
Sounds like a win to me.
TheFarseer
(9,319 posts)Night Watchman
(743 posts)Greg Gardner, Detroit Free Press 7:02 p.m. EDT July 5, 2016
A Southfield art gallery owner told police his 2016 Tesla Model X was in Autopilot mode when it crashed and rolled over on the Pennsylvania Turnpike last week. The crash came just one day after the National Highway Traffic Safety Administration issued a report on a fatal crash in May involving a Tesla that was in self-driving mode.
Albert Scaglione and his artist son-in-law, Tim Yanke, both survived Friday's crash near the Bedford exit, about 107 miles east of Pittsburgh.
The Free Press was not able to reach Scaglione, owner of Park West Gallery, or Yanke, but Dale Vukovich of the Pennsylvania State Police, who responded to the crash, said Scaglione told him that he had activated the Autopilot feature.
In his crash report, Vukovich stated that Scaglione's car was traveling east near mile marker 160, about 5 p.m. when it hit a guard rail "off the right side of the roadway. It then crossed over the eastbound lanes and hit the concrete median."
http://www.freep.com/story/money/cars/2016/07/05/southfield-art-gallery-owner-survives-tesla-crash/86712884/