Tesla’s recall of 2 million cars relies on a fix that may not even work::Tesla agreed to the recall last week after a federal investigation found that its system for monitoring drivers was defective and required a fix.

  • Voroxpete@sh.itjust.works
    11 months ago

    Tesla’s website says that Autopilot and more sophisticated “Full Self Driving” software cannot drive themselves

    Full self driving

    Cannot drive themselves

    Christ I can smell the bullshit all the way from Canada.

    • DreadPotato@sopuli.xyz
      11 months ago

The “we can do it by the end of this year” he’s been touting since 2016 wasn’t a giveaway?

  • serial_crusher@lemmy.basedcount.com
    11 months ago

    A better solution, experts say, would be to require Tesla to use cameras to monitor drivers’ eyes to make sure they’re watching the road. Some Teslas do have interior-facing cameras. But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.

    In case you were wondering who wrote the article

  • AutoTL;DR@lemmings.world
    11 months ago

    This is the best summary I could come up with:


    Tesla’s recall of more than 2 million of its electric vehicles — an effort to have drivers who use its Autopilot system pay closer attention to the road — relies on technology that research shows may not work as intended.

    But research conducted by NHTSA, the National Transportation Safety Board and other investigators shows that merely measuring torque on the steering wheel doesn’t ensure that drivers are paying sufficient attention.

    “I do have concerns about the solution,” said Jennifer Homendy, the chairwoman of the NTSB, which investigated two fatal Florida crashes involving Teslas on Autopilot in which neither the driver nor the system detected crossing tractor trailers.

    Missy Cummings, a professor of engineering and computing at George Mason University who studies automated vehicles, said it’s widely accepted by researchers that monitoring hands on the steering wheel is insufficient to ensure a driver’s attention to the road.

    But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.

    Kelly Funkhouser, associate director of vehicle technology for Consumer Reports, said she was able to use Autopilot on roads that weren’t controlled access highways while testing a Tesla Model S that received the software update.


    The original article contains 1,028 words, the summary contains 212 words. Saved 79%. I’m a bot and I’m open source!

  • PatFusty@lemm.ee
    11 months ago

    This is so fucking stupid it actually makes me mad. A tiny percentage of people died misusing the feature and now Tesla is forced to upgrade people to a technology that doesn’t exist yet??? For free??? Holy shit this is dumb. Tesla should just relabel it to auto assist or something.

  • Fapper_McFapper@lemmy.world
    11 months ago

    Here I am hoping that Tesla, Twitter, SpaceX, and any other brand associated with Elon Musk burn to the fucking ground. Burn baby burn, show this wannabe emperor that he’s wearing nothing at all.