Tesla knew Autopilot caused death, but didn’t fix it
Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • Lucidlethargy@sh.itjust.works

    Didn’t, or couldn’t? Tesla uses a vastly inferior technology to run their “automated” driving protocols. It’s a hardware problem first and foremost.

It’s like trying to drive a car with a 720p camera mounted on the license plate holder versus a 4k monitor on top of the car. That’s not a perfect analogy, but it’s close enough for those not aware of how cheap these cars and their tech really are.

    • CmdrShepard@lemmy.one

It remains to be seen what hardware is required for autonomous driving, as no company has a fully functioning system, so there is no baseline to compare against. Cruise (the “4k monitor” in your analogy) just had to cut their fleet of geofenced vehicles after back-to-back crashes involving emergency vehicles, along with blocking traffic and attempting to run over things like fire hoses.

      • MrFagtron9000@lemmy.world

        Cruise and Waymo have self-driving cars, without safety drivers, driving around cities right now.

        We know what hardware it takes - More than just cameras and some premapping is required.

          • MrFagtron9000@lemmy.world

            Does cutting their fleet mean they stopped operating or they reduced the size of their fleet?

            One of the crashes was a car running a red light and hitting the Cruise vehicle.

            • CmdrShepard@lemmy.one

It means these vehicles are still crashing even when geofenced to operate in ‘perfect’ conditions with premapped destinations. This is not a fully functional system, as they can’t operate outside these city grids at low speeds in favorable weather conditions with a premapped route. Compare miles driven between Cruise’s and Tesla’s systems. One has several orders of magnitude more miles driven than the other, and they’re somehow doing it without all this extra hardware.

        • CmdrShepard@lemmy.one

Mercedes is level 3 only, and their system will only work in Nevada on a highway at speeds below 40 mph.

  • dub@lemmy.world

A times B times C equals X… I am Jack’s something something something

    • tool@lemmy.world

A times B times C equals X… I am Jack’s something something something

      Narrator: A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don’t do one.

      Woman on Plane: Are there a lot of these kinds of accidents?

      Narrator: You wouldn’t believe.

      Woman on Plane: Which car company do you work for?

      Narrator: A major one.
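The recall logic in the monologue above is just an expected-value comparison. A minimal sketch, where every number (fleet size, failure rate, settlement, recall cost) is a made-up placeholder rather than anything from the article:

```python
# Fight Club's recall formula: X = A * B * C, weighed against the recall cost.
# All figures below are hypothetical placeholders.
A = 1_000_000              # vehicles in the field
B = 0.0001                 # probable rate of failure (0.01%)
C = 2_000_000              # average out-of-court settlement, in dollars

X = A * B * C              # expected total settlement payout
recall_cost = 300_000_000  # assumed cost of a full recall, in dollars

# The cynical rule from the quote: only recall if X exceeds the recall cost.
initiate_recall = X > recall_cost
print(f"X = ${X:,.0f}, recall: {initiate_recall}")
```

With these placeholder numbers X comes out to $200,000,000, below the recall cost, so the recall is skipped, which is exactly the grim point of the quote.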

      • droans@lemmy.world

        When you’re selling a million cars, it’s guaranteed that some of them will have a missed defect, no matter how good your QC is.

        That’s why you have agencies like the NHTSA. You need someone who can decide at what point the issue is a major defect that constitutes a recall.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone

All of the major ones. On the other hand, the Pinto’s gas tank exploded less often than competing models of the era, and it wasn’t the only design with the lowered gas tank.

Look up the You’re Wrong About podcast episode on the Ford Pinto, which is a great deep dive into car development and investigative product reporting.

    • Ultraviolet@lemmy.world

      Here’s how to do self driving cars in a reliable way. First, instead of cameras that try to use road markings designed for human eyes, use specially designed roads with guide rails on them to ensure it follows a safe path. Second, for added convenience, these roads could also power the cars so you don’t need to stop to charge. Then we could even connect those cars together to increase efficiency. To mitigate the cost, no individual has to own them, they can stop at fixed points to pick up and drop off passengers, charging an affordable rate for each trip, or monthly/annual passes for frequent users. Maybe we could call them trains.

    • CmdrShepard@lemmy.one

      https://www.wardsauto.com/blog/my-somewhat-begrudging-apology-ford-pinto

      https://www.latimes.com/archives/la-xpm-1993-02-10-mn-1335-story.html

      Two examples of the media creating a frenzy that wound up being proven completely false later.

      In OP’s case, both of these drivers failed to see a semi crossing the road right in front of them even though they were sitting in the driver’s seat with their hands on the wheel. This technology certainly needs improvement, but this is like blaming every auto manufacturer when someone crashes their car while texting on their phone.

  • skymtf@pricefield.org

    I feel like some people are such Tesla fanboys that they will argue when I say Tesla FSD is not real and never has been.

    • _stranger_@lemmy.world

      Probably because calling something “not real” is infuriatingly vague.

      Feel free to expand on your position, I actually do want to know what “not real” means in this context.

      If you mean, from a semantics perspective, that FULL means it should be a completely independent and autonomous system, bravo, you’ve made and won the most uninteresting form of that argument.

      • Liz@midwest.social

        I mean, don’t call your service something it’s not? Words should have meaning? Tesla’s Autopilot is very impressive, but it’s not fully independent, and that’s okay. Honestly if it had an accurate name people wouldn’t attack it so much. Other manufacturers are gaining similar capabilities but no one is complaining that their cars aren’t perfect either.

        • CmdrShepard@lemmy.one

          Autopilot is an accurate name as it takes over the mundane portions of the task. Airline pilots don’t just hit a green button on the dash that says “fly” and the autopilot takes over until they hit a red “land” button. You can argue that people have a misconception about the word but the word itself is correct.

          • Liz@midwest.social

It’s my understanding that they actually could do that at this point; commercial flying is a controlled and predictable environment compared to driving on the road. Ten years ago I was hearing anecdotes from pilots saying the only things they do are takeoff and landing, and even then the computer could handle it just fine if they let it. Maybe the autopilot in a Cessna sucks, but it’s pretty much fully automated in an Airbus.

        • _stranger_@lemmy.world

          Yeah that’s a really tired argument. I agree, they should call it whatever will stop people from arguing about the name. Something super generic and meaningless and uncreative that doesn’t encourage conversation. Something like Blue Cruise, ooh, or super cruise!

          At least then 99% of the complaints wouldn’t be about the least possible interesting part of it

      • renohren@partizle.com

        [comment clarification: I confused Autopilot and FSD]

        Yeah they should have called it level 2 autonomous driving, like most other mass market car makers do (except Mercedes which have level 3 on the roads). People could then compare the different limits and clearly see what brands are or are not at the forefront of the tech.

        • Zink@programming.dev

          I did not know that about Mercedes, so I had to go read about it. Level 3 is huge because that’s when the system is approved to not have constant human monitoring. It’s the difference between being able to read a book or use your phone on a boring trip, even if it might not get you fully door to door on many trips.

          It can’t drive you home drunk, and you can’t sleep in your car (you have to be available to take over when requested) but it’s a huge jump in most practical usage.

          • renohren@partizle.com

Realistically, I think FSD has the potential to be level 3 officially, and probably some other car makers have the tech to do it too, BUT in the EU, if the car has level 3 autonomous driving, the car maker becomes legally responsible for accidents when the driving conditions are met (most EU states limit it to highways). For the time being, only Mercedes has had the courage to try it (probably because they have ample knowledge of driving assistance through their trucking production).

    • Ocelot@lemmies.world

I have nearly 20k miles on Tesla’s FSD platform, and it works amazingly well for something that’s “not real”. There are countless YouTube channels out there where people will mount a GoPro in their car and go for a drive. Some of them, like AIDRIVR and Whole Mars Catalog, pretty much never take over control of the car, without any drama. Especially in the past ~6 months or so of development it has been amazing.

  • harold@lemmy.world

But I’m saving the planet and making sure Elon gets a cut of my money

  • gamer@lemm.ee

    I remember reading about the ethical question about the hypothetical self driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It’s a question that doesn’t have a right answer, but it must be answered by anybody implementing a self driving car.

I non-sarcastically feel like Tesla would implement this system by trying to see which option kills the fewest paying Xitter subscribers.

    • Liz@midwest.social

      At the very least, they would prioritize the driver, because the driver is likely to buy another Tesla in the future if they do.

    • Ocelot@lemmies.world
      1 year ago

      Meanwhile hundreds of people are killed in auto accidents every single day in the US. Even if a self driving car is 1000x safer than a human driver there will still be accidents as long as other humans are also sharing the same road.

      • Oderus@lemmy.world

        When a human is found to be at fault, you can punish them.

        With automated driving, who’s to punish? The company? Great. They pay a small fine and keep making millions while your loved one is gone and you get no justice.

        • CmdrShepard@lemmy.one

          People generally aren’t punished for an accident unless they did it intentionally or negligently. The better and more prevalent these systems get, the fewer the families with lost loved ones. Are you really arguing that this is a bad thing because it isn’t absolutely perfect and you can’t take vengeance on it?

          • Oderus@lemmy.world

Generally, people are punished for causing an accident, purposeful or not. Their insurance will either raise their rates or drop them, leaving them unable to drive. That is a form of punishment you don’t get with automated driving.

            • int3ro@lemm.ee

Of course you get the same with automated driving. Accidents will cause either the insurance rate of the whole company to rise, or the company will have to pay out of pocket. In both cases accidents carry a direct financial “punishment”, and if a car company is seen as “unsafe” (see Cruise right now) they are not allowed to drive (or must drive “less”). I don’t see a big difference from normal people. After a while this is, in my opinion, even better, because “safer” companies will push out “less safe” companies… assuming, of course, that the government properly regulates that stuff so that a minimum of safety is required.

            • CmdrShepard@lemmy.one

              Increased rates aren’t a punishment they’re a risk calculation and insurance (outside of maybe property insurance in case a tree falls on the car for example) may not even be needed someday if everything is handled automatically without driver input. Why are you so stuck on the punishment aspect when these systems are already preventing needless death?

        • Meissnerscorpsucle@lemmy.world

Punish and justice are synonymous… edit: WOW, bad typo, should have read punish and justice are NOT synonymous.

    • CmdrShepard@lemmy.one

      I think the whole premise is flawed because the car would have had to have had numerous failures before ever reaching a point where it would need to make this decision. This applies to humans as we have free will. A computer does not.

  • fne8w2ah@lemmy.world

    Yet Phoney Stark keeps on whinging about the risks of AI but at the same time slags off humans who actually know their stuff especially regarding safety.

    • silvercove@lemdro.id

Then you should call it driver assist, not autopilot.

      Also Tesla’s advertisement is based on “having solved self driving”.

    • Ocelot@lemmies.world

      That is precisely why autopilot is called a driver assist system. Just like every other manufacturer’s LKAS.

        • Ocelot@lemmies.world

How is that confusing? If you look at what an airplane autopilot actually does, it maintains altitude and heading and makes turns at pre-determined points. Autopilot in an airplane does absolutely zero to avoid other airplanes or obstacles, and no airplane is equipped with any AP system that allows the pilot to leave the cockpit.

          Tesla autopilot maintains speed and distance from the car in front of you and keeps you in your lane. Nothing else. It is a perfect name for the system.

            • CmdrShepard@lemmy.one

              I’d hope they would before willfully getting behind the controls of one to operate it. Regardless of what we call it, these people still would have crashed. They both drove into the side of a semi while sitting in the driver’s seat with their hands on the wheel.

              • renohren@partizle.com

That’s because Tesla induced them to think it was level 4 or 5, while FSD is level 2 (like most Toyotas) but with a few extra options.

And as long as there is a need for a human to assume responsibility, it will remain at level 2.

                • CmdrShepard@lemmy.one

                  A) Autopilot and the FSD beta are two totally separate systems and FSD wasn’t even available as an option when one of these crashes occurred.

                  B) where’s the evidence that these drivers believed they were operating a level 4 or 5 system?

  • chakan2@lemmy.world
    link
    fedilink
    English
    arrow-up
    4
    arrow-down
    2
    ·
    1 year ago

    It’s time to give up the Tesla FSD dream. I loved the idea of it when it came out, and believed it would get better over time. FSD simply hasn’t. Worse, Musk has either fired or lost all the engineering talent Telsa had. FSD is only going to get worse from here and it’s time to put a stop to it.

    • NιƙƙιDιɱҽʂ@lemmy.world

The article isn’t talking about FSD; these accidents are from 2019 and 2016, before public availability of FSD. Of course, “Full Self Driving” ain’t great either…

The whole article is kind of FUD. It’s saying engineers didn’t “fix” the issue, when the issue is people using Autopilot, essentially advanced lane keep, on roads it shouldn’t be used on. It doesn’t give a shit about intersections, stop signs, or stop lights. It just keeps you in your lane and prevents you from rear ending someone. That’s it. It’s a super useful tool in its element, but shouldn’t be used outside of freeways or very simple roads at reasonable speeds. That said, it also shouldn’t be fucking called “autopilot”. That’s purely marketing and it’s extremely dangerous, as we can see.

  • Nogami@lemmy.world

    Calling it Autopilot was always a marketing decision. It’s a driver assistance feature, nothing more. When used “as intended”, it works great. I drove for 14 hours during a road trip using AP and arrived not dead tired and still alert. That’s awesome, and would never have happened in a conventional car.

    I have the “FSD” beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.

    At the end of the day, if the car makes a poor choice because of the automation, I’m still responsible as the driver, and I don’t want an accident, injury, or death on my conscience.

  • _stranger_@lemmy.world

There’s like three comments in here talking about the technology; everyone else is arguing about names, like people are magically absolved of personal responsibility when they believe advertising over common sense.

    • chakan2@lemmy.world

      Because the tech has inarguably failed. It’s all about the lawyers and how long they can extend Tesla’s irresponsibility.

      • _stranger_@lemmy.world

        See, I would much rather have this discussion vs another one about advertising and names.

        We’re seeing progress. Ford is expanding features on Blue Cruise (in-lane avoidance maneuvers I believe). I think Mercedes is expanding the area theirs works in. Tesla added off-highway navigation in the last year.

        No one’s reached full autonomy for more than a few minutes or a few miles, but I wouldn’t say there’s no argument there. In fact, I’d say they’re arguably making visible progress.

  • Ocelot@lemmies.world

Since when has Autopilot, especially in 2019, ever had the ability to deal with “cross-traffic” situations? It has always been a glorified adaptive cruise control with lane keeping, and has always been advertised as such, literally the same as any other car with LKAS. Tesla’s self-driving software wasn’t released to the public until 2021/2022.

    Meanwhile about 120 people died in traffic related accidents today in the US.
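That daily figure lines up with the annual totals. A quick back-of-envelope check, assuming roughly 42,500 US traffic deaths per year (a ballpark assumption, not a sourced number):

```python
# Back-of-envelope: annual US traffic fatalities spread evenly over a year.
annual_fatalities = 42_500          # assumed ballpark annual figure
per_day = annual_fatalities / 365   # average deaths per day
print(round(per_day))               # roughly 116, in line with "about 120"
```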

  • PatFusty@lemm.ee

The driver was also not even paying attention to the road, so the blame should be on him, not the car. People need to learn that Tesla’s version of Autopilot has a specific use case, and regular streets are not it.