“Translation: all the times Tesla has vowed that all of its vehicles would soon be capable of fully driving themselves may have been a convenient act of salesmanship that ultimately turned out not to be true.”

Another way to say that is that Tesla scammed all of its customers, since, you know, everyone saw this coming…

  • MajorHavoc@programming.dev · 12 days ago

    That out of the way, FSD sucks, and it’s getting worse, not better.

    It’s almost like they bet on the AI to teach the AI, rather than continuing to pay for skilled engineers.

    Buckle up folks, we’re going to see a lot more of this, across every industry, before the lawsuits go into high gear and anything gets better.

      • capital@lemmy.world · 12 days ago

      Since the first time I heard about FSD, I've been wondering why Tesla (or anyone else) doesn't set up a system where drivers opt in (not opted in by default) to sending anonymized driving data to help train the model. The vast majority of the time it would probably capture OK driving, or at least driving with no accidents. But the shitty driving and the accidents are useful data too, as examples of what to avoid.

      Maybe they're already doing this? But then I wonder why their FSD is getting shittier rather than improving. One would think that more driving data, with both good and bad examples, could only help. A rough sketch of the kind of opt-in pipeline I mean is below.
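      Something like this hypothetical Python sketch, just to make the idea concrete. The class names, fields, and the SHA-256 anonymization step are all made up for illustration; none of this reflects Tesla's actual telemetry system.

      ```python
      # Hypothetical opt-in driving-telemetry sketch. All names here
      # (DrivingClip, TelemetryUploader) are invented for illustration.
      import hashlib
      import uuid
      from dataclasses import dataclass


      @dataclass
      class DrivingClip:
          """A short, anonymized segment of driving telemetry."""
          clip_id: str
          speed_mps: list[float]           # sampled vehicle speed
          steering_angle_deg: list[float]  # sampled steering input
          had_disengagement: bool          # driver took over / incident flag
          label: str = "unlabeled"         # later tagged "good" or "avoid"


      class TelemetryUploader:
          """Collects clips only when the owner has explicitly opted in."""

          def __init__(self, vehicle_vin: str, opted_in: bool = False):
              # Opt-in is off by default, as the comment suggests it should be.
              self.opted_in = opted_in
              # One-way hash so the training set never stores the raw VIN.
              self._anon_id = hashlib.sha256(vehicle_vin.encode()).hexdigest()[:16]
              self._pending: list[DrivingClip] = []

          def record(self, speed_mps, steering_angle_deg, had_disengagement):
              if not self.opted_in:
                  return  # no data leaves the car without consent
              self._pending.append(DrivingClip(
                  clip_id=str(uuid.uuid4()),
                  speed_mps=list(speed_mps),
                  steering_angle_deg=list(steering_angle_deg),
                  had_disengagement=had_disengagement,
                  # Disengagements/incidents become "what to avoid" examples;
                  # uneventful clips become the "OK driving" examples.
                  label="avoid" if had_disengagement else "good",
              ))

          def flush(self):
              """Return the batch that would be uploaded for training, then clear it."""
              batch = {"source": self._anon_id, "clips": self._pending}
              self._pending = []
              return batch


      # Example: an opted-in car contributes one clean clip and one disengagement.
      uploader = TelemetryUploader(vehicle_vin="5YJ3E1EA7KF000000", opted_in=True)
      uploader.record([12.0, 12.5, 13.0], [0.0, 1.5, -0.5], had_disengagement=False)
      uploader.record([20.0, 5.0, 0.0], [0.0, -15.0, 0.0], had_disengagement=True)
      print(uploader.flush()["clips"][0].label)  # -> "good"
      ```

      The point is just that collection stays off unless the owner flips it on, the VIN never leaves the car in the clear, and disengagements get labeled as the "what to avoid" examples alongside the ordinary driving.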