Imagine being seated in an intelligent car that takes off with a simple hand gesture, accelerates in sync with traffic, brakes on sensing obstructions and tackles the sharpest bends effortlessly. All this while you sit back and unwind, watch a movie, read the news, work on a presentation or catch up on sleep.

While such a scenario is not likely to play out in the immediate future, we are already on the road to partial automation, with the car capable of managing itself without human intervention for short distances in a controlled environment.

To drive itself, the car will need to see around the bend and beyond the horizon – a combination of radars, lidars and other sensors will make this possible. Further, a network of connected vehicles should be able to exchange information for real-time insights, with data processing and analytics in the back-end, and benefit from this collective intelligence.

How this information is gathered and presented will determine the car’s response to varying road conditions. Communication between the driver and the car will be vital in this scenario. And this is where the holistic human machine interface comes in.

One of the key challenges on the path to automated driving is the handover from the driver to the automated driving mode and vice versa. Since its introduction in the early 1900s, instrumentation has transformed dramatically to cater to the driver’s demand for better-presented information.

Taking us a step closer to partial automation, instrumentation is now becoming the source of structured information for safe and hassle-free driving. The car of the future will be able to drive itself with inputs from its digital and physical environment. To do that, it must be able to sense, learn, make decisions and communicate.

In a typical human machine interface cockpit, information from the instrument cluster is presented by an extended module, the head-up display. All elements of infotainment and connectivity are summarised in a head unit whose output area is mainly the centre console, showing the driver relevant information depending on the traffic situation, personal preferences and current state.

For safe navigation, a powerful back-end needs to support the system by providing accurate road information in real time, including data from surrounding vehicles. This is V2X (vehicle-to-everything) communication, which enables prompt information exchange with the back-end system; when the exchange happens between on-road vehicles, it is called V2V (vehicle-to-vehicle).
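As a rough illustration of the idea, the sketch below shows a much-simplified, hypothetical V2V status message and how a receiving car might react to it; real V2X stacks use standardised, far richer message sets, and the field names and threshold here are assumptions made only for the example.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical, simplified V2V status message; production V2X standards
# carry many more fields (heading, timestamps, path history, etc.).
@dataclass
class V2VMessage:
    vehicle_id: str
    x_m: float          # position east of a shared reference point, metres
    y_m: float          # position north of the reference point, metres
    speed_mps: float
    hard_braking: bool

def assess_hazard(own_x: float, own_y: float, msg: V2VMessage,
                  warn_radius_m: float = 150.0) -> bool:
    """Return True if a nearby vehicle reports hard braking within range."""
    distance = hypot(msg.x_m - own_x, msg.y_m - own_y)
    return msg.hard_braking and distance <= warn_radius_m

# Example: a vehicle 80 m ahead broadcasts that it is braking hard.
incoming = V2VMessage("KA01-AB-1234", x_m=80.0, y_m=0.0,
                      speed_mps=5.0, hard_braking=True)
if assess_hazard(0.0, 0.0, incoming):
    print("HMI alert: vehicle braking hard ahead")
```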

In sync with advanced driver assistance system technologies, a holistic human machine interface becomes an output unit for blind spot detection, lane departure warning and cloud-based information, which includes traffic data processed with appropriate big data algorithms. With the digital environment becoming more and more interactive, holistic human machine interfaces will go beyond head-up display indication to two-way communication.
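One way to picture this output role is a simple prioritisation step that merges local ADAS warnings with cloud-sourced traffic information and routes the most urgent items to the head-up display. The event names, priority scheme and display surfaces below are assumptions for the sketch, not any manufacturer's actual implementation.

```python
from typing import NamedTuple

class Alert(NamedTuple):
    source: str      # e.g. "blind_spot", "lane_departure", "cloud_traffic"
    message: str
    priority: int    # lower number = more urgent (assumed convention)

def route_alerts(alerts: list[Alert]) -> dict[str, list[str]]:
    """Send the most urgent, safety-critical alerts to the head-up display,
    the rest to the head unit in the centre console."""
    routed = {"head_up_display": [], "head_unit": []}
    for alert in sorted(alerts, key=lambda a: a.priority):
        surface = "head_up_display" if alert.priority <= 1 else "head_unit"
        routed[surface].append(alert.message)
    return routed

# Example: two local ADAS warnings plus a cloud-sourced traffic notice.
pending = [
    Alert("cloud_traffic", "Congestion in 2 km, consider alternate route", 3),
    Alert("lane_departure", "Drifting out of lane", 1),
    Alert("blind_spot", "Vehicle in right blind spot", 0),
]
print(route_alerts(pending))
```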

For example, when the driver receives a low-fuel indication, apart from spotting nearby fuel stations, the car will also interact with those stations to gather information on special offers.
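A minimal sketch of that interaction could look like the following; the low-fuel threshold, station data and offer fields are purely hypothetical, and in practice the information would come from a connected back-end service rather than a hard-coded list.

```python
from dataclasses import dataclass

@dataclass
class FuelStation:
    name: str
    distance_km: float
    price_per_litre: float
    offer: str | None = None   # promotional offer advertised by the station

def suggest_refuel_stop(fuel_level_pct: float,
                        stations: list[FuelStation]) -> FuelStation | None:
    """When fuel runs low, pick the nearest station and surface any offer."""
    if fuel_level_pct > 15.0:          # assumed low-fuel threshold
        return None
    return min(stations, key=lambda s: s.distance_km)

# Example data standing in for the connected back-end's response.
nearby = [
    FuelStation("Station A", 2.1, 102.5, offer="5% off on weekdays"),
    FuelStation("Station B", 4.8, 101.9),
]
choice = suggest_refuel_stop(fuel_level_pct=12.0, stations=nearby)
if choice:
    print(f"Low fuel: {choice.name}, {choice.distance_km} km away."
          + (f" Offer: {choice.offer}" if choice.offer else ""))
```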

Since the first appearance of secondary displays in cars in the early 1990s, in-car human machine interfaces have come a long way. As drivers and passengers demand more from their driving experience, human machine interfaces will evolve into the most significant link between driver and car.

The car of the future will be able to sense, remember and communicate, just like a human being. Human machine interfaces will cater not only to the growing need for safety, information and economy, but also fill the role of that perfect driving companion, while ensuring that the ultimate control, even in the most intelligent car, remains with the driver.

The writer is Head – Business Unit Instrumentation & Driver Human Machine Interface, Continental Automotive Components India
