A lot of research is being done on autonomous driving, and the day will soon come when we humans can just tell our car the destination and it will take us there. With the development of vehicle control technology, computer vision, map following, and inter-vehicle communication, getting a car to actually drive itself to a destination is already possible. The real challenge is how to do so safely in a crowded environment, where there are other vehicles and road users.
This decision-making aspect of driving is the difficult part. People have brought up the ethical issues, which will also become legal issues. The classic question has been: should the car swerve and kill pedestrians to save its passenger from a collision (and possible death) or should it collide to avoid killing pedestrians (and possibly killing its passenger instead)? Who to save, who to kill? This ethical aspect of decision-making will need to be part of the autonomous driving system, and I frankly don't think anyone is ever going to have a good answer for this. At the end of the day, whatever the final decision may be, the next question is: who is responsible for damage/injury/death caused by an autonomous vehicle? Is it the passenger, the owner, or the manufacturer?
Obviously, manufacturers are not going to want to take such responsibility. They make millions of cars. If they are held liable for every accident, it won't take long to bankrupt them. Yet at the same time, without some form of manufacturer liability, manufacturers may be more willing to take development and production risks, churning out products that fail to meet ethical standards. Why spend billions on developing a proper decision-making engine if you do not need to take responsibility for the poor decisions made by that engine? But if we do hold manufacturers responsible, it deters them from working on autonomous driving in the first place, since in a human-driven car it is the driver who takes responsibility.
Should the owner then be held responsible? You might say that it is the owner who chose to buy that car, so the owner will need to be responsible for that decision when the car kills or injures someone. But as cars become more advanced, I don't think owners will be able to fully understand how these autonomous cars make decisions, and it will be unrealistic to hold car owners responsible for the car they choose to buy.
Okay, then how about the passenger? The current method of development has a "driver" who is ready to respond in an emergency, taking over from the car's decision-making engine. In the future, we may expect passengers to fulfill that role--to take over in an emergency. There are two problems, though: reaction time and skill. Will the passenger be able to react in time to take over from the autonomous driving system? This requires the passenger to be fully focused on the road and capable of driving, which defeats the purpose of having autonomous cars. And even if the passenger is fully focused on the road, can we expect him or her to have the skills necessary to handle a car in an emergency? After all, driving is a skill, and the passenger is likely to be suffering from a severe lack of practice if cars drive themselves 99.99% of the time. It may be even more dangerous for an out-of-practice driver to try to handle the car in an emergency.
Responsibility must lie somewhere, else we will end up with substandard cars that endanger the lives of people and substandard passengers who cannot handle cars in an emergency. And when we have reached an ethical/legal conclusion, we need to remember the longer-term impact on society. When autonomous cars have entrenched themselves in our lives, we humans will no longer have the skill to drive. It is that simple. We will be relying on machines to get us around. While autonomous driving opens up possibilities for people who may not otherwise be able to drive (like the elderly or disabled) and lets them move around, reliance on autonomous driving means that human mobility in the future for everyone (young and old, disabled or not) will be limited by what machines can do.
So yes, autonomous driving sounds great in the short term. But the long-term impact is that human movement will be limited by what machines can provide. If your car refuses to drive off the paved road to follow a forest trail, you either walk or give up on going down that trail. Are we ready for such a future?