We’ve talked about autonomous vehicles many times on our blog, and you may think we’re looking too far forward, but we’re not! Most newer cars have at least one semi-autonomous feature, whether it’s automatic emergency braking or blind-spot detection. In fact, the National Highway Traffic Safety Administration (NHTSA) is close to issuing guidelines for autonomous vehicles, so they are coming quickly!
But many people are begging the agency to hit the brakes on this process. Why? Because we still have serious concerns about autonomous vehicle technology. Google and Tesla have been testing fully autonomous cars with mostly successful results, but both have run into trouble out on the road. Meanwhile, some manufacturers of autonomous cars plan to remove the steering wheel, the brake and gas pedals, and everything else that would allow human intervention during a trip. So what problems do we still have with autonomous vehicles?
Adverse Weather
At the public meeting with the NHTSA regarding the guidelines, Mark Golden, Executive Director of the National Society of Professional Engineers, pointed out that weather can create hazards, and it can do so unexpectedly. Driving conditions change by the second. Some test vehicles have had trouble finding lane markings in California during adverse weather, so how do we know the technology will work anywhere else under those conditions? And with no way for a driver to take over, everyone on the road is put in danger. These systems must be tested in a wide range of conditions, by third-party testers, before we even consider releasing them en masse.
Traffic Light Malfunction
Often paired with adverse weather is traffic light malfunction. If an intersection loses power, traffic cops will typically take on the task of directing traffic. How will autonomous vehicles respond in this situation? Are they equipped to understand hand signals from the cops?
Ethical Choices
Will manufacturers program ethical choices into the vehicles? And if so, what will they program? Will an autonomous vehicle save a pedestrian’s life over the passengers’? When you’re driving, you make ethical decisions all the time. If you have to come to a screeching stop in the middle of an intersection because someone walked across the road without looking, or crossed when they weren’t supposed to, you’re making an ethical decision to save that person’s life. You may not realize it because it happens so quickly, but you do, and you make those split-second decisions daily. Will an autonomous car be programmed to handle every situation where drivers make these hard choices?