You don’t speak Spanish and he doesn’t speak English. Yet you can still communicate through a translator: not a human being, but earphones paired to your phone. This isn’t science fiction; it’s reality in 2018. The Google Pixel Buds can translate languages in real time when paired with Pixel 2 phones, just as Siri can make bookings, Alexa can order groceries and Bixby can set appointments using voice commands.

In 2018, the voice assistants from Google, Amazon, Apple, Samsung and Microsoft are going to battle for your attention when you’re not looking at the screen. Powered by artificial intelligence (AI), these voice assistants can understand several languages. You can quickly execute tasks such as setting reminders or playing music without navigating the maze of apps on your phone.

Google’s latest Bluetooth earphones can translate 40 languages in real time, letting two people with no language in common hold a conversation more spontaneous than one mediated by a human interpreter. But the earphones aren’t where the magic happens. While Google Translate is a big part of it, the Google Assistant does much of the heavy lifting. You can summon this virtual assistant on an Android phone or in the Google app simply by saying “Okay Google” followed by a question or a task. And you’ll find yourself in a conversation where even follow-up questions are acknowledged and understood.

While recognising key phrases isn’t very hard, understanding a conversation is a completely different game, and that’s what all the tech giants are racing to get right. Using natural language processing, these voice assistants are trying to be more than just robotic voices to us. Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and Samsung’s Bixby are in the same race but have their own areas of excellence. Alexa has become a popular smart speaker, while Siri and Google Assistant are capturing the phone space. But this seems to be just the beginning.

At CES 2018 in Las Vegas, the annual consumer electronics show, the AI assistants seemed to have spawned an entire ecosystem of their own. Everything from locks to lights to fridges with built-in virtual assistants was on display. Companies such as Philips, GE and iDevices have launched lights that work with virtual assistants like Alexa, which means dimming the lights is just a matter of speaking to the AI in your speaker, phone or even smartwatch. Similarly, Onelink’s Safe & Sound is a smoke detector that, while fixed to your ceiling, takes requests to play music or remind you of chores.

While Samsung’s and Microsoft’s assistants have quite a bit of catching up to do when it comes to natural conversation, they have managed to make their presence felt. Cortana, as part of the Windows operating system, has made its way to millions of computers, while Samsung’s Bixby, besides being the default AI assistant on Samsung’s flagship phones, is also becoming a part of its refrigerators. With massive screens and cameras on these fridges, Samsung is hoping customers will use them as the hub for controlling smart devices spread across the home.

All the new devices with, or controlled by, voice assistants seem to be working towards one thing: automating your home. They give you the ability to unlock doors by speaking to your phone, or to switch off the lights in the kitchen while sitting in the office. Technology is already playing a big role in making homes smarter and more secure. Smart home appliances now sport features ranging from remote surveillance to home entertainment. And companies such as LG and Samsung have already started making devices that communicate with each other and can all be controlled from a single app.

These features might seem excessive if you take the sense of sight for granted. Devices ranging from earphones to fridges that can do more with voice commands will only push the possibilities further. Till then, earphones like the AirPods and Pixel Buds, or speakers such as the Amazon Echo and Google Home, have become an alternative way of interacting not just with apps and phones but also with home appliances. And while earphones are difficult to keep wearing all day, bone-conduction speakers might soon solve that problem: there are now spectacles and wristbands that send audio as vibrations through your bones while letting you hear ambient sounds as well.

Meanwhile, Facebook has been working on simplifying interactions for those who have trouble using screens. The company is already known for its blue colour scheme, which makes the site easier to use for people with colour blindness, such as its CEO, Mark Zuckerberg. And in 2016, Facebook introduced a feature that reads out descriptions of the images in a user’s personalised newsfeed. We reached out to Google, Apple and Amazon about their voice assistants as an accessibility option for the visually challenged. But going by the lack of responses and the way voice assistants are launched, accessibility is, at most, an afterthought. So, while it is yet to be seen whether voice input will be the new touchscreen, it will definitely take time to prioritise the users who need it the most.

Till then, it seems to be a race to build the smartest AI voice assistant, and it is only a matter of time till we get used to these features and they become common in gadgets. Voice search appears to be the top feature, propelled by devices such as smart earphones that let you talk to your virtual assistant without having to take the phone out of your pocket. Companies will also be looking to make the user experience richer. Amazon’s Alexa can already call an Uber, read you an e-book, order from Starbucks or track your packages. And with their ability to engage in conversation improving, voice seems to be gearing up to match our obsession with the touchscreen.

Sanshey Biswas is a Delhi-based journalist