• snooggums@lemmy.world · 113 points · 1 month ago

    Paraphrasing:

    “We only have the driver’s word they were in self driving mode…”

    “This isn’t the first time a Tesla has driven onto train tracks…”

    Since it isn’t the first time I’m gonna go ahead and believe the driver, thanks.

    • XeroxCool@lemmy.world · 7 points · 1 month ago

      The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems in less than half of the cases. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, outright lying to evade a speeding ticket, etc. were the cause in many of them.

      Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don’t like a company? I don’t think so. But if Tesla has proof FSD was off, we’ll know in a minute when they invade the driver’s privacy and release the driving events.

      • snooggums@lemmy.world · 31 points · 1 month ago

        Tesla has constantly lied about their FSD for a decade. We don’t trust them because they are untrustworthy, not because we don’t like them.

        • AA5B@lemmy.world · 2 points · 1 month ago

          They promote it in ways that lead people to trust it too much sometimes… But when it comes to releasing telemetry in particular, I don’t remember that ever being an accusation.

    • TheKingBee@lemmy.world · 2 points · edited · 1 month ago

      Maybe I’m missing something, but isn’t it trivial to take it out of their bullshit dangerous “FSD” mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?

      • snooggums@lemmy.world · 3 points · 1 month ago

        At some railroad crossings you might only need to go slightly off the crossing to get stuck on the tracks, unable to back out. Trying to get out is another 30-40 feet.

        Being caught off guard when the car isn’t supposed to do that is how you get stuck in the first place. Yeah, terrible driver for trusting shit technology.

  • XeroxCool@lemmy.world · 15 points · 1 month ago

    If only there was a way to avoid the place where trains drive.

    I checked first. They didn’t make a turn into a crossing; it turned onto the tracks. Jalopnik says there’s no official statement that it was actually driving under FSD(elusion), but if it was strictly under human control (or FSD turned itself off after driving onto the tracks), I guarantee Tesla will invade the driver’s privacy and slander them by the next day for the sake of the court of public opinion.

    • egrets@lemmy.world · 4 points · 1 month ago

      They didn’t make a turn into a crossing. It turned onto the tracks.

      Just to be clear for others, it did so at a crossing. That’s still obviously not what it should have done and it’s no defence of the self-driving feature, but I read your comment as suggesting it had found its way onto train tracks by some other route.

      • XeroxCool@lemmy.world · 3 points · 1 month ago

        Thanks. I could have clarified better myself. I meant “it didn’t turn from a rail-parallel road onto a crossing only to be met by a train it couldn’t reasonably detect due to bad road design.”

    • jj4211@lemmy.world · 2 points · 1 month ago

      The thing that strikes me about both this story and the thing you posted is that the people in the Tesla seem to be like “this is fine” as the car does some pretty terrible stuff.

      In that one, the Tesla fails to honor a forced left turn, instead opting to go straight into the oncoming lanes and waggle about while other cars honk at it, and the human just sits there without trying to intervene. Meanwhile they describe it as a “navigation issue/hesitation,” which really understates what happened there.

      The train one didn’t come with video, but I can’t imagine just letting my car turn itself onto tracks and going 40 feet without thinking.

      If my Ford so much as thinks about getting too close to another lane, I’m intervening, even if it was really going to be no big deal. I can’t imagine this level of “oh well”.

      Tesla drivers/riders are really nuts…

  • atlien51@lemm.ee · 8 points · 1 month ago

    Elongated Musketon: UM THAT WAS JUST 1 FAULTY MODEL STOP CHERRY PICKING GUYS JUST BUY IT!!!1

    • lsibilla@lemm.ee · 2 points · 1 month ago

      For as much as I’d like to see Tesla stock crash these days, and without passing judgment on the whole autonomous-car topic, this IS cherry-picking.

      Human drivers aren’t exactly flawless either, but we won’t ban human-driven cars because some act recklessly or others have had a seizure while driving.

      If self-driving cars are statistically safer, I’d rather have them and reduce the risk of coming across another reckless driver.

  • shaggyb@lemmy.world · 5 points · 1 month ago

    Driver failed to control their car and avoid a collision.

    FTFY.

    I’m sure the car did actually take the action. But there are TONS of unavoidable warnings and reminders to the driver to supervise and take control when FSD goes wrong.

    Which you can do by such super-technical means as “hitting the brake” or “steering the other way” or “flipping the right stalk up”. Rocket science, I know.

    Driver’s fault. Bad technology, yes. Worse driver.