Wednesday, February 29, 2012

Driverless Car Ethics?

Google has been developing driverless car technology. The project is led by Sebastian Thrun, director of the Stanford Artificial Intelligence Laboratory and co-founder of Google Street View. The car combines map information from Google Street View with data gathered by a LIDAR sensor, video cameras, radar sensors, and a position sensor in order to stay on track. Their incredible driverless car has managed to safely navigate 1,000 miles without any human interaction and 140,000 miles with limited human interaction on public roads. Oddly enough, the only accident so far occurred while the car was being driven in manual mode by one of the engineers. So although this is a new technology, Google has made great strides toward the idea of driverless cars.

Google's Driverless Car

This exciting new technology seems great at first, but it opens up a lot of questions of ethics and law. After much lobbying from Google, the Nevada Legislature passed a law in June 2011 making Nevada the first state to allow autonomous vehicles on public roads. This immediately gave Nevada the responsibility of setting the bar for safety regulations and standards for driverless cars. The problem is that the technology is outpacing lawmakers' ability to keep up. Our current driving and vehicle laws are written under the assumption that a human is operating the vehicle.

So this opens up a lot of questions about responsibility and decision making. First of all, who would ultimately be responsible if and when an accident or traffic violation occurs? Do you blame the company that designed the artificial intelligence controlling the vehicle? That would surely turn any company away from pursuing this technology, since it would end up with its own personal chair on the defendant's side of courtrooms across the country. So does the blame fall on the human operating the vehicle, much the way cruise control is handled now? That scenario would be unfair if a glitch or malfunction occurred so quickly that the human driver had no time to respond and correct it.

Then there is the idea of an ethical car. Say you are sitting in your vehicle as it autonomously drives down a busy city street. A dog wanders right into the path of your vehicle with little time to react. At this point the artificial intelligence controlling the vehicle has to make an ethical decision. Should it brake as hard as possible to lessen the impact, or swerve into oncoming traffic, resulting in a head-on collision with another vehicle? The artificial intelligence will be asked to make a split-second ethical decision on whether the dog's safety is more important than that of the vehicle's occupants. Does the outcome of this scenario change if it is a person wandering into the street instead?
Driverless Audi TT: For those who agree that the only thing worse than driving a Prius is being driven by one.
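To make the dilemma concrete, you can caricature the decision the AI faces as a cost minimization: score every available maneuver by its expected harm and pick the lowest. The maneuvers and harm numbers below are entirely invented for illustration, and nothing here reflects how Google's actual system works. The uncomfortable part is the scoring itself, which is where a programmer's values get baked in.

```python
# Hypothetical sketch of an emergency-maneuver decision as cost
# minimization. All maneuvers and harm scores are made up for
# illustration; this is not Google's actual decision logic.

def choose_maneuver(options):
    """Return the candidate maneuver with the lowest expected-harm score."""
    return min(options, key=lambda o: o["expected_harm"])

# Each candidate folds harm to occupants, other road users, and the
# animal into a single number. Choosing those weights IS the ethical
# decision -- the code just picks the minimum.
options = [
    {"name": "brake_hard",          "expected_harm": 2.0},  # likely hits the dog
    {"name": "swerve_into_traffic", "expected_harm": 9.0},  # head-on collision risk
    {"name": "swerve_to_shoulder",  "expected_harm": 4.0},  # loss-of-control risk
]

print(choose_maneuver(options)["name"])  # prints "brake_hard"
```

Swap the dog for a pedestrian and the harm scores change, and with them the chosen maneuver; the algorithm stays trivial while the ethics live entirely in the numbers.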
It will also be asked to make decisions about comfort. Most people might be a little angry about not being able to take advantage of the infamous "5 mph cushion" when cruising on the interstate. Also, when passing a slow-moving semi on the interstate, would the driverless car accelerate to get out of the zone right next to the semi that makes many drivers uncomfortable? How about various weather conditions? The artificial intelligence might end up traveling frustratingly slowly through the rain or unnervingly fast through a thick fog, since its sensors can "see" farther than your own eyes.

Ultimately the original programmer of the artificial intelligence might have a lot of control over these issues, but is this disconnect from human reaction and decision making a major downfall of driverless cars? I personally love driving and would have trouble putting my life in the hands of a machine whose decision-making process might not be as transparent as it seems. There is a certain human aspect of driving that just can't be simulated through artificial intelligence. Then there are the difficulties with laws and restrictions on these vehicles. Who is responsible for setting guidelines and restrictions in a quickly evolving field? It's exciting to see the cars of tomorrow coming to life, but there are certainly a lot of hurdles to leap in the meantime. Are you ready for a world where cars drive themselves?

6 comments:

  1. I don't see driverless cars taking off for a long time. It will take generations before we will trust a car to drive itself.

  2. I love reading about technological advances like these. I think if self driving cars were out on the roads they would change how people commute completely. Currently, it's almost required for everyone to have a car to get to work and school. But what if a self driving car could pick you up and drop you off at your destination, then drive itself home to pick up someone else? Instead of having 2-3 cars per household a single vehicle would suffice. And who says a self driving car couldn't be shared between households? Just think, if the driver-less car was being used in this fashion it could reduce the amount of cars and ultimately create safer roadways and an entirely new travel ecosystem.

    On a side note: even though I'd love to see something like this in the future, it's highly unlikely. Just think of how car companies would oppose this if a single "smart" vehicle would replace 10 normal cars.

    On another side note; Insurance companies would have a heyday!

  3. I agree with Adam; while fully automated cars are a cool idea, I don't see them catching on in society very quickly. The issue I have is the fact that with this technology, some actions that would be under the driver's control in a regular situation become completely out of the passenger's control.

  4. This is a really interesting article. I think the ethical dilemmas you talked about in the article were very good questions. Who would you blame if the car got in an accident? I don't think these cars would be very efficient in ND, though. I would be too nervous to have a driverless car drive me in the winter months.

  5. I had actually heard about Google's automated car quite some time ago, and while I thought it was a really cool idea, I never took the time to consider the ethics involved with it. You made a very good point about laws not being able to keep up. It really is hard to say who would be in charge. I mean, it is the driver's car, thus they should be aware of what's going on even if they aren't driving, but at the same time that would defeat the point of the car being automated. Also, since the car is electronic, it is incapable of making an ethical decision. Instead you would have to hope that the engineers who designed the system thought of every possible scenario and ranked them in order of importance. Unfortunately, it is of course impossible to foresee every possible issue the car may encounter.

  6. I think some people need this technology! There are so many bad drivers out there, it is unbelievable. This would be great for long trips. But all in all, I hope this doesn't come out, because it will just be another excuse for people when things go wrong.
