Another Tesla Model S Blamed for Crash

Another Tesla Model S owner is blaming the car’s self-driving technology for a collision. 

According to Arianna Simpson, her Model S was operating in autopilot mode when the car in front of her came to a sudden halt and her car did not slow down automatically like it is designed to do.

“There was a decent amount of space so I figured that the car was going to brake as it is supposed to and didn’t brake immediately,” she told Ars Technica. “When it became apparent that the car was not slowing down at all, I slammed on the brakes but was probably still going 40 when I collided with the other car,” she said.

As in the other recent case, where a Model S driver blamed the car’s ‘summon’ feature for crashing into a parked trailer, Tesla says that the vehicle’s records don’t back up Simpson’s story. According to Tesla, the logs show that adaptive cruise control and automatic emergency braking were deactivated when Simpson hit the brake pedal, leading to the crash.

SEE ALSO: Did This Tesla Model S Crash Itself?

“Tesla Autopilot is designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable. Autopilot is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” said the company.

[Source: Ars Technica]
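
For readers unfamiliar with how driver-assist systems hand back control, the behavior described in Tesla’s account of the logs, where pressing the brake pedal instantly disengages Autopilot, adaptive cruise control, and automatic emergency braking, can be summarized in a minimal sketch. This is purely illustrative and uses hypothetical names; it is not Tesla’s software, just the general rule the article describes.

```python
# Illustrative sketch only: models the general rule the article describes,
# in which a brake-pedal press returns the car to manual control and
# disengages the driver-assist features. All names here are hypothetical.
from dataclasses import dataclass


@dataclass
class AssistState:
    autopilot_engaged: bool = True
    adaptive_cruise_engaged: bool = True
    emergency_braking_armed: bool = True


def on_brake_pedal_press(state: AssistState) -> AssistState:
    """A brake press hands control back to the driver, so every
    assist feature is disengaged at once."""
    return AssistState(
        autopilot_engaged=False,
        adaptive_cruise_engaged=False,
        emergency_braking_armed=False,
    )


if __name__ == "__main__":
    state = AssistState()
    print("Before brake press:", state)
    print("After brake press: ", on_brake_pedal_press(state))
```

Under a rule like this, the same brake press a driver makes to avoid a crash is also the event that turns off automatic emergency braking, which is the tension several commenters raise below.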

  • Domingos Sousa

    As a driver you’re responsible

  • smartacus

    how soon before they officially blame it on the driver?

  • Personal reponsibility

    Autopilot mode should be called driver assist mode. Pay attention behind the wheel, don’t forget that you could kill someone at any moment from inside your cocoon.

  • liuping

    When she tapped the brake she indicated to the car that she wanted control and was aware of the situation, so the car relinquished control to her. At that point she waited for the car to stop itself, but it was no longer driving, so, yes, it is her fault. She had plenty of time to stop, but chose to wait for the car to stop itself.

  • Ron

    I love the fact that the car keeps track of what’s going on so Tesla can determine whether what the owner is saying is true or not. OTOH, as others have pointed out, her Model S isn’t autonomous…

  • smartacus

    LOL,
    reading comprehension can be fun
    🙂

    -she did not have plenty of time to stop; she said the car in front of her came to a sudden halt.
    Too bad the tesla can’t haul down all 4,800 lbs fast enough.

  • AlphaWolf

    “There was a decent amount of space so I figured that the car was going to brake as it is supposed to…”

    It certainly seems that the space was there for one of the two intelligent beings in that car to safely stop it in time. Now it’s a case of “She said, It said”…

  • smartacus

    except the car in front came to a sudden halt.
    her tesla hit the other car at 40MPH even though she was slamming on the brakes.
    Even one extra second spent not braking will negate that “decent amount of space”.

  • Haggy

    She’s responsible for a number of reasons, and should have taken over much sooner. She agreed to it when she activated autopilot, not to mention that every state law says the driver is responsible.

    It’s not a question of who is responsible. It’s certainly not Tesla. The issue is whether Tesla can say that autopilot wasn’t a factor and act as if this was an accident where autopilot was off. Technically it was, but if she really slammed on the brakes, then had she not done so, it would have been an accident with autopilot on. Tesla shouldn’t be so quick to dismiss incidents such as this from the “autopilot was a factor” column when coming up with its statistics. If they classify every accident where autopilot was off at the moment of collision as a non-autopilot accident, including those where autopilot was disengaged moments before because the driver didn’t have enough time to recover, then their statistics on autopilot safety won’t mean much.

    If Tesla had acknowledged that autopilot wasn’t working as expected in this scenario, and that drivers are told they need to be prepared to take over because it may not work in all situations, that would have been reasonable. But it is downright absurd to acknowledge that autopilot was deactivated when she hit the brake pedal, which is exactly what a person is supposed to do when automatic emergency braking kicks in (it’s supposed to start braking and give the driver enough time to move a foot to the brake pedal, or at least try to), and then blame the driver for slamming on the brakes as if that was what caused the crash. If they had said that their records showed that she hit the brakes but didn’t apply the necessary force, I might agree with them. But what they are saying, aside from the fact that the driver is responsible, doesn’t show me that they are serious about coming up with meaningful information.

    Regardless of what happened, they should have been looking into why it was one of those situations that autopilot might miss in the first place. I can’t see anything in Tesla’s statement that disputes what she said. She said that she slammed on the brakes, and nobody is disputing that doing so turns off autopilot.

  • Haggy

    She had the ability to set the following distance. I don’t know if she set it to the minimum, the maximum or somewhere in between. If you set it so you have a three second following distance, you should have enough time to react if the car doesn’t. The worst that can happen is that you’d come to a smooth stop when the car might have done so anyway had you waited, and a quick double pull of a lever turns it back on.

  • Haggy

    She knows that and agreed to it when she enabled autopilot. Tesla made it clear that there might be circumstances in which it might not work. Had Tesla merely said that this is one of them, that data from these situations will be used to improve the software over time, and that she had a responsibility to take over as the instructions warned her, I wouldn’t have a problem with it. It’s Tesla not admitting that autopilot wasn’t handling it, and acting as if she caused the crash by slamming on the brakes, that I have a problem with. She caused it by not taking over from autopilot earlier, but she waited long enough to establish that autopilot wasn’t able to handle it.

  • DisruptiveChanges

    AutoPilot, SelfDriving, whatever you call it, is a misnomer. Don’t use these terms until you have a car without a steering wheel and pedals.

  • smartacus

    unfortunately, the tesla did not prevent the collision.

    autopilot should not be legally available unless the driver has an autopilot endorsement on their License and swipes the DL through a card reader.

    *because not everyone is a naturally adroit driver, some are tesla drivers.

  • smartacus

    ah yes of course it’s certainly not the tesla’s fault.

    Not seeing anything in tesla’s statement that disputes what she said?
    Why even look?

    Any statement after the fact is not going to be released without the legal department’s input (if not completely drawn up by them), so that’s not very reassuring.

    One of the reasons this happened is because tesla failed to take into design consideration that their cars are going to tesla buyers and not Porsche buyers.

  • One-Of-A-Kind

    Auto Pilot is not autonomous?

    What’s the point then? It’s a false (and dangerous) perception of liberty if the car can only sometimes operate itself.

    Volvo’s technology chief nailed this when he said this is a dangerous wannabe, and now Tesla is admitting it can’t be trusted. Then, what’s the point!?!?

  • One-Of-A-Kind

    Oh big brother, there art thou!

  • One-Of-A-Kind

    She’s responsible because she trusted an amateur car company that has more regard for hype than for safety.

  • JachinRivers

    It’s neat technology and I’m sure it has its place, but it’s being put in the hands of a population that’s increasingly reluctant to accept any responsibility. That, and the widening gap in intelligence between the developers of tech and the average consumer, makes me wonder where this trend is going. I could see applications for this tech in public transportation such as a city bus, but applying it to a high performance car is just asking for this kind of thing, and it will continue to increase.

  • Geo T

    I can see autopilot being used as an excuse for many crashes and/or causing them itself. Those data logs are critical but will also be the subject of conspiracy theories. Rational or not, I’d keep a good distance from these cars per Murphy’s Law. Summon mode should be banned because someone should at least be in the car to override it (people are already ignoring Tesla’s rules about only doing it at home). If one kid or beloved pet gets run over by a sidling Tesla, TSWHTF.

    I remember having a debate years ago with fools who thought automatic “air cars” would be perfectly safe due to their mindless faith in technology, especially GPS. With lives and property at stake you always need to err on the side of safety, not overconfidence. Tech fan-boys need to leave their egos at the door.

  • Geo T

    As electric cars become cheaper, an even dumber demographic will be driving more of them (harsh but true). Millions of people can’t even obey common-sense no-talking/texting laws, which are needed because of general overconfidence. I think self-driving cars are far too optimistic about human nature. If something just doesn’t feel safe on a gut level, you go with that, not blind optimism.

  • Geo T

    Yes, timing is everything! How can Tesla engineers ever foresee all wildcards? Brainiacs often get overconfident. Unexpected things like a flying piece of newspaper masking a sensor, ice buildup, or animals being hit render this technology inherently risky. I plan to keep a buffer zone between myself and these cars, and it’s not paranoia. Of course many will say I have a better chance of being hit by regular cars, but that misses the point of cumulative risks.

  • Geo T

    You think Porsche drivers are inherently safer? Many of them are just speed freaks who happen to be rich. Anyone obsessed with speed can be a bad luck time bomb. Google porsche crash speeding, Paul Walker (his driver), and so on.

  • Geo T

    How can the system account for the behavior of cars behind you, though? Or bad weather hitting, or debris from the roadway blocking sensors? It takes no seer to know that these gizmos are only foolproof in a perfect world.

  • JR250

    Hi. I believe there are endless disclaimers, warnings and risk-acceptance that Tesla owners must accept before they use the beta software.

    Tesla reports that their logs show the driver pressing the brake pedal at some point prior to the accident, which disengaged auto-pilot. The car was probably running on cruise control only, if anything.

    One thing is that crash avoidance should always be on until explicitly turned off by the driver. Had it been on, it would have outsmarted the driver. That said, the driver reported noticing that auto-pilot should have been slowing the car down, but decided not to brake herself because she had complete faith in it.

  • liuping

    She had plenty of time, but she chose to wait for the car to stop itself. Only when it did not (because she had deactivated autopilot) did she no longer have enough time to come to a complete stop.

  • smartacus

    oh i don’t doubt tesla owners have the tendency to put blind trust in an unproven technology because of their religious faith in social media telling them tesla so good tesla so good.

  • smartacus

    exactly, this is only foolproof in a perfect world.

    In a perfect world, imperfection like tesla autopilot engineers would not exist

  • RussellL

    I encourage readers to read the original article from Ars Technica. Important information has been left out.

    “Tesla says that the vehicle logs show that its adaptive cruise control system is not to blame. Data points to Simpson hitting the brake pedal and deactivating autopilot and traffic aware cruise control, returning the car to manual control instantly. (This has been industry-wide practice for cruise control systems for many years.) Simpson’s use of the brake also apparently disengaged the automatic emergency braking system, something that’s been standard across Tesla’s range since it rolled out firmware version 6.2 last year.”

  • Haggy

    There was a time when automatic transmission was new. Some states had special licenses. You didn’t need a special license to drive a car with an automatic transmission though. If you took a road test with a car with an automatic transmission, you got a special endorsement, and it meant that you couldn’t drive a car WITHOUT automatic transmission. If you took your road test with a car with a manual transmission, you could drive whatever you wanted.

    As things stand, any licensed driver should be able to drive a Tesla. It still has a steering wheel, an accelerator and a brake pedal. They are used the same as in any other car. They instantly override any automated features. Touching the brake pedal or turning the steering wheel will disable the automated feature until it’s turned back on. No special training is needed for that because drivers are already trained for that.

    Once it gets to the point that true autonomous cars are on the road, and they are far more reliable than human drivers, then perhaps states could consider special licenses for them that would be restrictive licenses and wouldn’t allow people to operate cars without those features. Your proposal is backwards though.

  • Haggy

    Yes, in a perfect world, all cars would have automated safety features that would prevent these things. In the mean time, it’s up to the driver to compensate for cars following too closely. It’s up to the driver to take over when the car gives a message that a sensor is blocked. It’s up to the driver to adjust for weather conditions. But that’s true without autopilot. You can increase your following distance when driving manually or you can turn the dial from 3 to 6 when using autopilot. Or you can tailgate when driving manually on a rainy day. It’s still up to drivers to do the right things. Autopilot is one more tool.

    We also have reality to deal with. If a driver looks away for a moment to adjust the radio, autopilot might prevent an accident. Without it, taking the driver’s eyes off the road for a second might have caused the accident. It’s easy to say that the driver should have been watching the road and that’s correct. But would you rather say that to the driver behind you whose car just plowed into yours or would you rather that that car had autopilot and prevented the accident?

    Nobody is suggesting that drivers shouldn’t be paying attention. But some are dealing with the reality that the road is full of drivers who are already not paying attention, and the solution isn’t to have them pay less attention but to have things in place to mitigate in cases where they don’t pay attention.

    The idea is to have a system that makes a trip safer with it than without it, both with respect to the amount of attention a driver is supposed to pay and the amount of attention a driver really does pay.

  • smartacus

    they have now modified the autopilot feature.
    they could have just left it as is, but even they are realizing tesla buyers need extra safeguards.
    tesla should not be allowed to use public roads as their own personal laboratory because it puts other vehicles at risk.

  • smartacus

    i did not say Porsche drivers are thrill seekers who happen to be rich. OTOH tesla drivers are overly relaxed-brained who happen to be rich

    “Porsche crash speeding Paul Walker (his driver)” did not rear-end anyone.

  • Haggy

    Statistically, their cars are safer with autopilot on than with it off. But if that’s how you feel, then maybe we should take cars off the roads that lack safeguards because they put other vehicles at risk. Let’s start with vehicles that lack lane keeping and lane departure warning, and blind spot warning and rear view cameras. Then we can get rid of those that lack automatic emergency braking. Then we can get cars with an internal combustion engine off the road and save 300,000 vehicle fires a year. Those are the vehicles that put the public at risk, and many times a minute there’s an accident because of the lack of one of those features or the presence of a dangerous engine that has constant explosions.

    Then we should get rid of cars that don’t automatically apply the parking brake when somebody puts the car in park, and ban those cars that don’t put themselves in park automatically when the car is stopped and the car detects no weight in the driver’s seat. Now we are starting to get safer. Let’s get rid of cars that don’t have five star crash test ratings across the board too.

    And since you mentioned that this accident could no longer happen because Tesla changed it, let’s get cars off the road that can’t be updated automatically over the air, because if anything goes wrong with them, it’s a disaster to try to get them fixed.

    Now that we have all those cars off the road, Tesla is the only one left. So I agree with you that we should get cars off the road that put others at risk. How about we start with yours?

  • smartacus

    no statistically they are NOT SAFER with autopilot.
    67.53% of all statistics are made up on the spot.

    if you want to indulge your desire of taking cars off the road to make them safer for the others around them, let’s start with tesla autopilot.
    Then further it to other vehicles with dangerous and unperfected technology.

    Right now tesla is the only one using the public roads out there as personal laboratories because whatever happens, they default to it never being their fault.

  • Haggy

    That makes no sense. The cars have steering wheels and brake pedals and accelerators. They have drivers. There are plenty of examples of times that autopilot prevented accidents that wouldn’t have been possible to prevent otherwise. In any circumstance, a driver is equally capable or more capable of driving it than if it didn’t have autopilot. If the car were driving itself, you might have a point. But that’s not how it works.

    Tesla isn’t using public roads as their personal laboratories, but plenty of other companies are. Google has self driving cars on the road. Teslas are all fully equipped for human drivers.

  • smartacus

    steering wheels, brake pedals, accelerators, and electronic driving aids all make the human element manageable to the point of 217MPH Brabus C-Class cars.

    Tesla fails at properly compensating for the human element because their autopilot is meant for a normal driver and not the relaxed-brain (alpha brain waves) tesla buyer.

    Please try to accept that engineers working at tesla are not the same as the tesla buyer.

  • One-Of-A-Kind

    So GM just needed to provide a disclaimer to people that their ignition switches may not hold, and it’s their responsibility to hold the key to the on position in the event of a crash, otherwise airbags may not go off…

    Sounds logical to me. I mean, who can’t be trusted during split second decision making?

  • One-Of-A-Kind

    You just care about protecting Tesla. The idea of auto-pilot controlling your car gives you the idea that you don’t have to. The name is a lie. It could have killed this person had the circumstances been slightly different. You don’t care about that, you just care about protecting Tesla’s ass, so they can keep this UNSAFE gimmick feature on the roads.

    I live in a city where I can’t be on my cellphone while I drive (talking, texting, browsing, etc.), yet these cars have a giant 19″ tablet in your face at all times with constant internet access. Some people have HACKED it to watch movies on. With all of the connectivity and electronic control over the vehicle’s functions, it’s only a matter of time before somebody hacks one and kills somebody in a car they are unable to control. Their blatant disregard for safety and common sense is going to come back and haunt them as more cars are on the road for longer and longer.

  • JR250

    You’re reaching a little too far, wouldn’t you say?

    In Tesla’s case, they show several disclaimers and locks that you must accept to continue. It clearly says it is beta software being tested. You accept that responsibility and any consequences stemming from it.

  • Haggy

    Please try to accept that if you don’t count auto steering, which wasn’t a factor in this accident, what you are left with is called Adaptive Cruise Control, which has been around for well over a decade and is common on many cars. This is hardly a Tesla-specific issue. You can find it on anything from a mid-grade Acura to a Mercedes, and even on a Subaru.

    If you want to take cars off the road that use ACC, don’t start with Tesla. You have a very long list of manufacturers that have far more vehicles on the road.

    While you are at it, perhaps with your expertise on physiology, you can cite some studies that compare the brain waves of Tesla buyers with those of anybody else, or is that simply one more part of the 97.2% of things you say that you make up on the spot?

  • smartacus

    i thank you for proving my point for me 🙂

    Adaptive Cruise Control seems to be working on hundreds of thousands of Mercedes, Acura, and FUBAR-u.

    it must be some conspiracy against tesla, right?

    i can easily prove tesla buyers are relaxed-brained.
    Don’t believe me, just believe your patron saint Jōvan Musk himself:
    It’s tesla owners’ fault here and tesla owners’ fault there.
    They obviously must be defective-brained somehow to be making all these unforced errors.