Enticing though they are, such arguments conceal a logical flaw. As a classic 19th-century theory known as the Jevons paradox explains, even if autonomous vehicles eventually work perfectly — an enormous “if” — they are likely to increase total emissions and crash deaths, simply because people will use them so much.

  • alyaza [they/she]@beehaw.org (OP) · 3 months ago

    As long as cars exist, AVs will be better than human drivers,

    this is at obvious odds with the current state of self-driving technology itself–which is (as i noted in the other comment) subject to routine overhyping and also has rather minimal oversight and regulation generally. Tesla is only the most egregious example in both respects; even something like Waymo is almost entirely reliant on taking the company's word that the technology would be safer than humans (which meshes awkwardly with well-publicized problems and efforts to hide robotaxi safety records).

    • masterspace@lemmy.ca · edited · 3 months ago

      this is at obvious odds with the current state of self-driving technology itself–which is (as i noted in the other comment) subject to routine overhyping and also has rather minimal oversight and regulation generally

      All cool tech things are overhyped. If your judgement of whether a technology is going to be useful is “if it sounds at all overhyped, then it will flop,” you would never predict that any technology would change the world.

      And no, quite frankly, those assertions are objectively false. Waymo's and Cruise's driverless programs are both monitored by the DMV, which is why it revoked Cruise's license when it found the company hiding crash data. Waymo has never been found to do that, or even been accused of it. Notice that in the lawsuit you linked, Waymo was happy to publish accident and safety data but did not want to publish data about how its vehicles handle edge cases, which would give rivals information about how they operate, and the court agreed with Waymo.

      https://arstechnica.com/cars/2023/12/human-drivers-crash-a-lot-more-than-waymos-software-data-shows/

      Since their inception, Waymo vehicles have driven 5.3 million driverless miles in Phoenix, 1.8 million driverless miles in San Francisco, and a few thousand driverless miles in Los Angeles through the end of October 2023. And during all those miles, there were three crashes serious enough to cause injuries:

      • In July, a Waymo in Tempe, Arizona, braked to avoid hitting a downed branch, leading to a three-car pileup. A Waymo passenger was not wearing a seatbelt (they were sitting on the buckled seatbelt instead) and sustained injuries that Waymo described as minor.

      • In August, a Waymo at an intersection “began to proceed forward” but then “slowed to a stop” and was hit from behind by an SUV. The SUV left the scene without exchanging information, and a Waymo passenger reported minor injuries.

      • In October, a Waymo vehicle in Chandler, Arizona, was traveling in the left lane when it detected another vehicle approaching from behind at high speed. The Waymo tried to accelerate to avoid a collision but got hit from behind. Again, there was an injury, but Waymo described it as minor.

      The two Arizona injuries over 5.3 million miles work out to 0.38 injuries per million vehicle miles. One San Francisco injury over 1.75 million miles equals 0.57 injuries per million vehicle miles. An important question is whether that's more or less than you'd expect from a human-driven vehicle.

      After making certain adjustments—including the fact that driverless Waymo vehicles do not travel on freeways—Waymo calculates that comparable human drivers reported 1.29 injury crashes per million miles in Phoenix and 3.79 injury crashes per million miles in San Francisco. In other words, human drivers get into injury crashes three times as often as Waymo in the Phoenix area and six times as often in San Francisco.

      Waymo argues that these figures actually understate the gap because human drivers don’t report all crashes. Independent studies have estimated that about a third of injury crashes go unreported. After adjusting for these and other reporting biases, Waymo estimates that human-driven vehicles actually get into five times as many injury crashes in Phoenix and nine times as many in San Francisco.

      To help evaluate the study, I talked to David Zuby, the chief research officer at the Insurance Institute for Highway Safety. The IIHS is a well-respected nonprofit that is funded by the insurance industry, which has a strong interest in promoting automotive safety.

      While Zuby had some quibbles with some details of Waymo’s methodology, he was generally positive about the study. Zuby agrees with Waymo that human drivers underreport crashes relative to Waymo. But it’s hard to estimate this underreporting rate with any precision. Ultimately, Zuby believes that the true rate of crashes for human-driven vehicles lies somewhere between Waymo’s adjusted and unadjusted figures.
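
      For reference, here is a minimal Python sketch of the rate arithmetic in the excerpt above. The mileage, injury counts, and human-driver benchmarks are the figures quoted from the article; the flat “one third of crashes go unreported” correction is an assumption standing in for Waymo's fuller adjustment (the article notes it also corrected for “other reporting biases”), so the adjusted ratios only roughly match the quoted five-times and nine-times figures.

      ```python
      # Rough reproduction of the injury-rate arithmetic quoted above.
      # Figures come from the excerpted Ars Technica article; the 1.5x
      # underreporting factor is a simplified assumption, not Waymo's
      # exact methodology.

      def injuries_per_million_miles(injuries: int, miles: float) -> float:
          """Injury crashes per million vehicle miles traveled."""
          return injuries / (miles / 1_000_000)

      # Waymo's driverless record through October 2023
      waymo_phoenix = injuries_per_million_miles(2, 5_300_000)  # ~0.38
      waymo_sf = injuries_per_million_miles(1, 1_750_000)       # ~0.57

      # Human-driver benchmarks (reported injury crashes per million miles)
      human_phoenix = 1.29
      human_sf = 3.79

      # If roughly a third of human injury crashes go unreported, the
      # reported rate is only about two thirds of the true rate.
      adjust = 1 / (1 - 1 / 3)  # = 1.5

      for city, waymo, human in [("Phoenix", waymo_phoenix, human_phoenix),
                                 ("San Francisco", waymo_sf, human_sf)]:
          print(f"{city}: Waymo {waymo:.2f} vs human {human:.2f} reported "
                f"({human / waymo:.1f}x); adjusted human {human * adjust:.2f} "
                f"({human * adjust / waymo:.1f}x)")
      ```

      Running this gives roughly a 3.4x (reported) and 5.1x (adjusted) gap in Phoenix and a 6.6x and 10x gap in San Francisco, in line with the article's “three times”/“five times” and “six times”/“nine times” comparisons given the simplified adjustment.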