This project started as a solution to one of my biggest pet peeves about controls within vehicles. Automobiles have been evolving over time, and with 17-inch touchscreen displays now inside cars, the way humans interact with the vehicle has been changing too. A driver's primary focus is usually considered to be driving and navigating the car safely. Yet with features creeping into vehicles that let drivers even tweet while driving, there seems to be a drift away from what should be the priority behind the wheel. Self-driving cars are here, sure, but not entirely. Very few cars on the road drive themselves, and only for limited stretches; the majority are still driven by humans and are therefore prone to human mistakes. My intention was to explore the space of driver-machine interaction and find potential opportunities for improving existing systems.
To get a feel for the current in-vehicle experience, I decided to do a quick study of the existing technologies that address various aspects of driving.
Tesla's Model S currently boasts a 17-inch screen as the central console, along with a digital instrument cluster that displays information such as mileage, power consumption and efficiency, projected range, and stats over time as graphical visualizations. There seems to be an overload of information on the screen, and nearly every function of the vehicle can be modified from the central console. Yes, Tesla cars are going to drive themselves soon, but how much screen time do they demand from drivers now?