How safe are self-driving cars?

  • Published 24 March 2018
  • How safe are self-driving cars? And how safe should they be before they're widely used? Risk Bites dives into the safety of autonomous vehicles, and looks at how we make sense of their risks and benefits. With scientist and emerging technologies expert Andrew Maynard.
    Questions around self-driving car risks and safety are becoming increasingly relevant as companies like Uber and Waymo begin testing their autonomous vehicles on public roads. This technology could radically improve road safety, and increase mobility for people who currently cannot drive, or should not be driving. At the same time, it raises new safety issues.
    Is it OK if autonomous vehicles are just as safe as human drivers? Should they be safer? What safety standards should we be using? What happens when people are killed by these vehicles (as has already happened) - who's liable, and who decides what's OK, and what is not?
    As companies like Waymo, Uber, GM and Tesla continue to develop their self-driving technologies, there is more urgency than ever for a broader discussion about safety and risk, one that leads to smart policies and regulations ensuring the technology is as safe and as beneficial as possible.
    The video is part of the Risk Bites series on Public Interest Technology - technology in the service of the public good.
    #selfdriving #safety #risk #cars
    USEFUL LINKS
    Safety in a World of Driverless Cars (Rand Corporation) www.rand.org/m...
    Why Waiting for Perfect Autonomous Vehicles May Cost Lives www.rand.org/b...
    After Tempe fatality, self-driving car developers must engage with public now or risk rejection theconversatio...
    Arizona Motor Vehicle Crash Facts 2016 www.azdot.gov/...
    Pedestrian Traffic Fatalities by State. 2017 Preliminary Data www.ghsa.org/s...
    Redefining "Safety" for Self-Driving Cars www.scientific...
    Waymo Safety Report 2017 storage.google...
    GM Safety Report 2018 www.gm.com/con...
    RISK BITES LITE
    Risk Bites Lite videos are shorter and lighter than regular Risk Bites videos - perfect for an injection of fun thoughts when you're not in the mood for anything too heavy!
    RISK BITES
    Risk Bites videos are devised, created and produced by Andrew Maynard, in association with the Arizona State University School for the Future of Innovation in Society (sfis.asu.edu). They focus on issues ranging from risk assessment and evidence-based decision making, to the challenges associated with emerging technologies and opportunities presented by public interest technology.
    Risk Bites videos are produced under a Creative Commons License CC-BY-SA
    Backing tracks:
    Believe in Yourself by Olive Musique. www.premiumbea...
    Risk Bites is your guide to making sense of risk. We cover everything from understanding and balancing the risks and benefits of everyday products, to health science more broadly, to the potential impacts of emerging technologies, to making sense of risk perception. If you enjoy our videos, please subscribe, and spread the word!
  • Science & Technology

Comments • 11

  • @AllAboutEverythingTV
    @AllAboutEverythingTV 6 years ago +2

    Interesting. It's easy to forget how much testing needs to be done to collect a significant amount of data, especially for something involving public safety. Great video, thanks for making this.

    • @riskbites
      @riskbites  6 years ago +1

      Thanks. I also find it interesting that humans driving cars cause less harm than we might imagine on a miles driven basis!
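For a sense of scale, here is a rough back-of-envelope version of that per-mile point. The input figures are assumed approximations of recent US data, not numbers taken from the video or the linked reports:

```python
# Back-of-envelope estimate of human-driver harm per mile driven.
# The inputs are assumed, approximate US figures (circa 2016-2017),
# not numbers taken from the video or the linked reports.

us_road_deaths_per_year = 37_000      # assumed: approx. annual US road fatalities
us_vehicle_miles_per_year = 3.2e12    # assumed: approx. annual US vehicle miles travelled

deaths_per_100m_miles = us_road_deaths_per_year / (us_vehicle_miles_per_year / 1e8)
print(f"~{deaths_per_100m_miles:.1f} deaths per 100 million miles driven")
# -> roughly 1 death per 100 million miles, i.e. per mile driven,
#    human drivers are statistically safer than intuition often suggests.
```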

  • @jazzysyed8536
    @jazzysyed8536 5 years ago +2

    This really helped me on a HUGE project so thanks😁

  • @kamikazehound3243
    @kamikazehound3243 5 years ago

    Also, how busy are the places where they are testing, and how good are the drivers in those areas on average?

  • @richardconto9575
    @richardconto9575 6 years ago

    As a critique of the video, the (implicit) idea that testing 100 self-driving cars 24x7 is the same as testing 300 cars 8x7 is disingenuous. The risks vary depending on time of day (light, congestion, population).
    One thing that the video suggests is that monitoring would be a way to "catch" problems before they become serious - but that ignores the way computer algorithms tend to fail catastrophically without warning (making this a literal computer crash.)
    In order to make a quick transition (i.e.: for everyone, their replacement car would be a self-driving car), simply halving the risk of injuries and fatalities isn't going to be enough. Something much more compelling would be necessary.
    Of course, that's likely to be outside the area of safety. Reduced cost and greater convenience (or utility) would have to be significant too. All-electric cars offer the promise of lower regular maintenance costs (due to the simplified mechanical systems), although the replacement cost of the battery is hard for me to estimate. (And extensive adoption of electric cars is a problem for the current models of funding roads and highways through gasoline taxes.)
    And my description suggests that a quick transition to significant use of self-driving cars would take around half the lifetime of an existing car, which is now between 11 and 14 years.
    One benefit of self-driving cars is the prospect of easy-to-hire cars - and the consequent ability to reclaim the space currently being used for driveways (if not garages and carports). Of course, the drawback is that - given current work and school scheduling paradigms - easy-to-hire cars would have to be as badly over-provisioned as cars currently are.

    • @riskbites
      @riskbites  6 years ago

      Thanks Richard - yes, the estimates are crude and skip over what would be important details in a fuller analysis. They are, however, based on the Rand Corporation's modeling of safety testing (see links), which is still one of the most comprehensive analyses around.
      On the issue of catastrophic failure, yes, I also agree -- in part. What we do know from complex/non-linear systems is that it's possible to build in resilience while working out the bounds and probabilities of some failure modes, and in doing so identify early signals of possibly greater failures. By no means foolproof, but a good step in the right direction.
      And of course, this whole issue needs to be approached holistically, and not just as a narrow safety issue!
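For illustration, here is a minimal sketch of the kind of fleet-miles estimate referred to in this thread. The parameter values are assumptions chosen for illustration, not figures from the Rand analysis itself:

```python
import math

# A minimal sketch of the kind of fleet-testing estimate discussed above.
# All parameter values are assumptions for illustration; see the Rand links
# in the video description for the actual analysis.

human_fatality_rate = 1.1e-8   # assumed: ~1.1 deaths per 100 million miles
confidence = 0.95              # assumed statistical confidence level

# Fatality-free miles needed to show, at this confidence, that an autonomous
# fleet's fatality rate is no worse than the human rate (simple Poisson bound).
miles_needed = -math.log(1 - confidence) / human_fatality_rate

# How long a hypothetical test fleet would take to accumulate those miles.
fleet_size = 100        # assumed number of test vehicles
hours_per_day = 24      # assumed round-the-clock testing
avg_speed_mph = 25      # assumed average speed
miles_per_year = fleet_size * hours_per_day * 365 * avg_speed_mph

print(f"~{miles_needed / 1e6:.0f} million fatality-free miles needed")
print(f"fleet drives ~{miles_per_year / 1e6:.1f} million miles/year, "
      f"so ~{miles_needed / miles_per_year:.0f} years of testing")
```

With numbers in this ballpark the answer comes out at hundreds of millions of fatality-free miles, on the order of a decade of round-the-clock driving for a modest fleet, which is broadly why the linked Rand work argues that test driving alone cannot settle the safety question.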

    • @riskbites
      @riskbites  6 years ago

      I agree that it's an oversimplification, albeit one that Rand used in their analysis. But it is a starting point, and not an unusual approach to making a complex issue more tractable. That said, more nuance is definitely needed to tease apart the specifics of safe versus less safe conditions.

  • @NneonNTJ
    @NneonNTJ 6 years ago +1

    The problem is that in most traffic deaths, a human is at the root of it. If there are accidents with self-driving cars, who will be at fault? The company that made them? The person who didn't take over the steering wheel?
    Crashes by autonomous vehicles won't be as accepted by the population as human errors.

    • @riskbites
      @riskbites  6 years ago +2

      Really important point, and one that lawyers, regulators, insurance companies and others need to be grappling with now. Irrespective of how safe or otherwise the technology is, we don't have systems in place to handle the consequences that arise when things go wrong.

    • @riskbites
      @riskbites  6 years ago

      This is going to be one of the more interesting legal issues as completely driverless cars become more commonplace.