This past Saturday, while I was getting my hair cut, I overheard an interesting conversation about the problems with smart cars, government surveillance, and the things people should consider when using the latest technology. A handful of weekend customers explained why they wouldn't want to use a smart car, and one of the main reasons was privacy. One customer said, "If Google has all this information on us now with just our email, what could they do if they controlled our cars?"
One barber explained why he was against certain forms of technology and why he would never use a smart car. While his customer and a few others thought having a smart car would be a good thing, the barber disagreed because he believed the potential for the car to malfunction would be too high among first-time users.
After listening to the debate between the young barber and his customers about why they were against using Google's self-driving cars, I started to wonder if other tech-savvy young adults might feel the same. It seems like many people are in love with the idea of a specific technology making their lives easier, rather than doing something extra themselves to make a situation work in their favor. Maybe one day this technology can assist people the way its designers envisioned, but I know that as a society we tend to have a love-hate relationship with technology.
How many times have you typed something into your smartphone and autocorrect decided to "correct" your text for you? Or how many times have you tried to use a GPS, only to find it recalculating your directions so that you end up taking the "long way" to your destination? It's moments like these where technology can inconvenience you.
While I was researching stories on Google's latest endeavor, self-driving cars, I found an interesting TED Talk. In the video, Sebastian Thrun discusses how these cars could help reduce the number of deaths due to human error.
At first glance this latest technological wonder may seem too good to be true, but the question remains: do we really need self-driving cars? A recent article on Mashable explained how Google had improved its software to "detect hundreds of distinct objects simultaneously–pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn. A self-driving vehicle can pay attention to all of these things in a way that a human physically can't–and it never gets tired or distracted."
Would you feel safe knowing that the car behind you isn't being driven by a human, but by a computer program? And if a self-driving car were involved in an accident that cost someone their life, who would be at fault: the car manufacturer, the software company, or the owner of the self-driving car? Do you think we're ready for self-driving cars?