Self-driving cars have the
potential to reduce accidents, but somebody will still need to pay for repairs.
Autonomous vehicles are currently the en
vogue topic for the media and the automotive industry. The question
is, do we really understand how they will actually impact our world, and when
are we likely to see them operating in significant numbers?
Beyond that, there’s also the question
of how they are programmed. You might have heard of the so-called “Trolley
Problem,” in which an AV is traveling the road when somebody or something is
suddenly thrust in front of it. The AV doesn’t have sufficient time or
distance to stop safely, so it must steer to one side of the street or
the other. The problem is that on one side is an elderly lady; on the other is
a group of children.
Not realistic
While such scenarios are great for grabbing headlines, industry experts concur that software capable of making such a moral choice simply does not exist, nor is it currently being programmed into future autonomous vehicles.
Recently, Gowling WLG (a Global 100 law
practice) and UK Autodrive (a consortium designed to facilitate the
introduction of autonomous vehicles in Britain) put together The Moral
Algorithm, a white paper that aims to look at moral decisions by
autonomous vehicles and the need for regulating them.
Collision Management talked with Andre Rivest, the Canadian leader of Gowling WLG’s Auto
Practice, regarding the adoption of AVs and the impact they are likely to have on
the current automotive ecosystem, including collision repairers.
Rivest acknowledges that while AV
technology is already at a stage where cars can drive themselves, it is
consumer acceptance and regulatory issues that will likely be deciding factors
as to how and when AVs start actually hitting the road.
Furthermore, he says there will need to
be a cohesive strategy when it comes to legislation and liability concerns.
Issues such as crossing national borders, developing safety nets against cyber
attacks and deciding how to program software will have to come down to collaboration
across different organizations, not rest in the hands of individual companies
or regulators.
And although the primary focus of AVs is
to save lives and decrease injury and property damage, as The Moral Algorithm
notes, it will likely be many decades before we see road
environments without human drivers.
Footing the bill
In the near term, it’s likely we will have a mixture of AVs and vehicles operated by human drivers. Given the potential for both human error and software failure, there’s every chance that collision repairers will still have plenty of work. The question then will likely be: who’s going to pay for it?
“Volvo has gone on the record to say,
‘As an automaker, if we build AVs then we are responsible for them’,” says
Rivest. “And if we are moving toward a scenario where we have product liability
that’s the responsibility of the manufacturer, then OEMs are going to have a
far more vested interest in controlling or impacting the repair process.”