Self-driving cars will kill people and we need to accept that - article

Discussion in 'Petrol Heads Forum' started by Calliers, Jun 2, 2018.

  1. Calliers

    Calliers HH's MC Staff Member

    Joined:
    Oct 12, 2004
    Messages:
    35,639
    Likes Received:
    2,316
    Trophy Points:
    139
    Recently, headlines have been speculating about what we need to do about the risks of self-driving vehicles. After one of its self-driving vehicles was responsible for a fatal crash, Uber temporarily paused all autonomous vehicle testing in the state of Arizona. In the wake of the incident, Arizona Governor Douglas Ducey reiterated that public safety is his top priority and described the Uber accident as an “unquestionable failure” to uphold it.

    Tesla also confirmed that a recent highway crash, which killed the driver, happened while its semi-autonomous Autopilot system was controlling the car. This is the second fatal accident in which the Tesla Autopilot system was at least partially at fault.

    To many consumers, these incidents are a confirmation of something they suspected all along: trusting an AI system to handle driving is a mistake, and one that’s destined to kill people. Self-driving cars, they therefore conclude, need to be heavily regulated and scrutinized, and potentially delayed indefinitely, until we can be sure that they’ll bring no harm to their drivers and passengers.

    This is an inherently flawed view. It’s not a good thing that self-driving cars have killed people, but testing them in real-world situations is necessary if we want to keep moving toward a safer, brighter future. And unless we want to jeopardize that future, we need to get over our fears.
    ____________________
    Source: thenextweb
     
  2. Mr Cairo

    Mr Cairo Require backup .... NO

    Joined:
    Jul 1, 2002
    Messages:
    2,945
    Likes Received:
    371
    Trophy Points:
    108
    Oh, I accepted that the second companies said they were going to develop self-driving cars. "People will die," I thought. However, companies constantly reassured us that it would be safe... nice to see they've finally caught up with common opinion and agree that "people will die".
     
    Calliers likes this.
  3. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    37,904
    Likes Received:
    787
    Trophy Points:
    138
    I don't think any sane person would ever claim that no one would die. The problem is that people are unpredictable and humans are prone to error, and since humans are developing these vehicles, there are bound to be bugs or glitches. HOWEVER, a self-driving car even today is significantly less likely to kill someone, or even be involved in a fender bender, than the average competent driver. The moment automated cars become the majority on the road, deaths will drop off a cliff, eventually coming close to zero without ever reaching it.
     
    Calliers likes this.
  4. IvanV

    IvanV HH Assassin Guild Member

    Joined:
    Dec 18, 2004
    Messages:
    9,863
    Likes Received:
    1,257
    Trophy Points:
    123
    A lot of people seem to believe that once all cars are autonomous, no one will die. That includes the broader tech enthusiast crowd, who are slightly more knowledgeable than average. Meanwhile, for the media and the general, semi-luddite public, any such occurrence is a gift from heaven.

    There's a thought I've had for a few days: if an autonomous system is twice as safe as human drivers (i.e. accidents and fatalities occur twice as rarely as with average humans), but some of those happen in situations which 99+% of human drivers would have handled correctly, will the public ever allow them on the roads? Also, how high do we set the bar for such systems? As safe as humans? Better? Do we keep raising it, so that every five years the autonomous systems have to get, say, 50% better? How do we tell the difference between realistic limitations and the manufacturers' desire to limit costs?
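
    To make that concrete, here is a rough back-of-the-envelope sketch in Python; every number in it is made up purely for illustration, not taken from any real crash statistics.

        # Rough back-of-the-envelope sketch of the "twice as safe" scenario above.
        # Every number here is hypothetical, not a real statistic.

        human_fatalities_per_year = 40_000   # hypothetical human-driver baseline
        av_relative_risk = 0.5               # "twice as safe": half the fatality rate
        easy_case_share = 0.10               # hypothetical share of AV fatalities in
                                             # situations 99+% of humans would handle

        av_fatalities = human_fatalities_per_year * av_relative_risk
        easy_case_fatalities = av_fatalities * easy_case_share

        print(f"Human-driven fatalities per year:    {human_fatalities_per_year}")
        print(f"AV fatalities per year:              {av_fatalities:.0f}")
        print(f"Net lives saved:                     {human_fatalities_per_year - av_fatalities:.0f}")
        print(f"AV deaths humans would have avoided: {easy_case_fatalities:.0f}")

    Even with the fatality rate cut in half, that hypothetical split still leaves thousands of deaths in situations a human would likely have handled, which is exactly the kind of headline that shapes public acceptance.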
     
    Calliers likes this.
  5. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    37,904
    Likes Received:
    787
    Trophy Points:
    138
    Considering the plan is to make autonomous cars communicate with each other, it will be far easier for vehicles to avoid one another: each can know which direction the other is attempting to swerve and move the opposite way (as an example). The problem with humans at the wheel is that in many cases "you made the right choice, but you still died", because hey, that's life. One can make all the best possible choices and still lose, some even losing their lives, all because the other people around them made clearly poor or wrong choices, or perhaps the one they collided with made the right choice too, as that can happen. Circumstance is a kicker, and no one can read other people's minds; they can predict to a degree, but it's never 100% accurate. With computers running the vehicle, a lot of unknown variables can be eliminated instantly with 100% certainty. This is where automated vehicles will excel and drop the mortality and injury rates substantially, though again, never to zero...
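
    As a toy illustration of that coordination idea (this is purely hypothetical, not any real V2V protocol, message format, or vendor API):

        # Toy sketch of two vehicles coordinating opposite avoidance manoeuvres.
        # Purely illustrative: not a real V2V protocol or API.
        from dataclasses import dataclass

        @dataclass
        class AvoidanceIntent:
            vehicle_id: str
            direction: str  # "left" or "right"

        def choose_avoidance(own_id: str, other: AvoidanceIntent) -> AvoidanceIntent:
            """Swerve the opposite way from the direction the other car broadcast."""
            opposite = "left" if other.direction == "right" else "right"
            return AvoidanceIntent(vehicle_id=own_id, direction=opposite)

        # Car A broadcasts that it will swerve right; car B hears it and goes left.
        car_a = AvoidanceIntent(vehicle_id="A", direction="right")
        car_b = choose_avoidance(own_id="B", other=car_a)
        print(car_a)  # AvoidanceIntent(vehicle_id='A', direction='right')
        print(car_b)  # AvoidanceIntent(vehicle_id='B', direction='left')

    The point is simply that once intent is broadcast rather than guessed, the "read the other driver's mind" problem largely disappears.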

    If there were dedicated roadways where it was illegal for non-automated vehicles to drive, I'd be fairly confident we'd see a massive change in incident rates, and pretty much all the cases that did happen would involve people breaking the law by taking manual control and driving on such a roadway.
     
