Alphabet, Google's parent company, is combining two of its research projects, robotics and AI (artificial intelligence) language understanding, to help its robots better understand human language commands.


Google (@Google) tweeted on August 16, 2022: "Tasks that seem simple to humans — like cleaning up a spilled drink — are actually incredibly complex for helper robots. That's why Google Research and Everyday Robots are using language models to improve robot learning." (https://twitter.com/Google/status/1559664381448933379)

Alphabet has been carrying out research on robots since 2019. It is focusing on simple commands that are easier for robots to grasp, such as fetching a drink. The 'Everyday Robots' project is still a long way from being ready for everyday use.

Google says that if someone tells one of the Everyday Robots prototypes, "I spilled my drink, can you help?", the robot filters the instruction through an internal list of possible actions and interprets it as "fetch me the sponge from the kitchen."
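Google has not published the code behind this step, but the behaviour it describes, mapping a free-form request onto the most plausible entry in a fixed list of robot skills, can be sketched roughly as below. The skill names, the keyword table and the scoring function are illustrative assumptions, with a hand-written word-relevance table standing in for the language model that does the real ranking.

```python
# Minimal sketch of the described behaviour: map a free-form request to the
# most plausible entry in a fixed list of robot skills. In the real system a
# large language model ranks the candidates; the keyword table below is a
# hand-written stand-in, and the skill names are made up for the example.

import string

CANDIDATE_SKILLS = [
    "fetch the sponge from the kitchen",
    "bring a can of soda from the fridge",
    "throw the empty cup in the trash",
    "go to the charging dock",
]

# Toy replacement for a language model: which skill words are relevant to
# which request words.
RELATED_WORDS = {
    "spilled": {"sponge"},
    "drink": {"sponge", "soda"},
    "thirsty": {"soda"},
    "garbage": {"trash"},
}


def tokenize(text: str) -> set[str]:
    """Lower-case the text and strip punctuation, returning a set of words."""
    cleaned = text.lower().translate(str.maketrans("", "", string.punctuation))
    return set(cleaned.split())


def score(request: str, skill: str) -> int:
    """Count how many request words point at a word in the candidate skill."""
    skill_words = tokenize(skill)
    return sum(
        1
        for word in tokenize(request)
        for related in RELATED_WORDS.get(word, set())
        if related in skill_words
    )


def pick_skill(request: str) -> str:
    """Return the highest-scoring candidate skill for the request."""
    return max(CANDIDATE_SKILLS, key=lambda skill: score(request, skill))


if __name__ == "__main__":
    # Prints "fetch the sponge from the kitchen" for the article's example.
    print(pick_skill("I spilled my drink, can you help?"))
```

In the actual system the ranking is learned rather than hand-coded: the language model scores how helpful and how feasible each candidate skill is for the request, and the robot executes the top-scoring one.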

Google's robots are not yet available for sale, and they do not yet respond to the "OK, Google" command. For now they perform a small set of specific, simple actions. Google said it is keeping the robots' actions and commands narrowly scoped, partly to avoid turning them into surveillance machines or into chat technology that could give offensive responses.

Google said the robots' new AI language skills were made possible by equipping them with a language model that learned to understand words from Wikipedia, social media and other web pages.

According to reports, Amazon and Microsoft are also conducting robotics research.
