MIT researchers have developed a new voice-controlled app that may help obese people lose weight by logging calorie counts and other nutritional information after every meal.

The system parses the description and automatically retrieves the pertinent nutritional data from an online database maintained by the US Department of Agriculture (USDA), researchers said.

The data are displayed together with images of the corresponding foods and pull-down menus that allow the user to refine their descriptions - selecting, for instance, precise quantities of food. But those refinements can also be made verbally, researchers said.

A user who begins by saying, “for breakfast, I had a bowl of oatmeal, bananas, and a glass of orange juice” can then make the amendment, “I had half a banana,” and the system will update the data it displays about bananas while leaving the rest unchanged, they said.
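The update-in-place behaviour described above can be sketched as a simple log keyed by food item, so that an amendment touches only the named food. This is a hypothetical illustration, not the MIT system's actual data model; the food names and serving units are made up.

```python
# Hypothetical meal log: servings per food item from the spoken description.
meal = {"oatmeal": 1.0, "banana": 1.0, "orange juice": 1.0}

def amend(log, food, quantity):
    """Update the quantity for one food, leaving all other entries unchanged."""
    log[food] = quantity
    return log

# "I had half a banana" amends only the banana entry.
amend(meal, "banana", 0.5)
```

Because only the amended key is rewritten, the displayed data for the other foods stays as it was, matching the behaviour the researchers describe.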

Researchers at the Massachusetts Institute of Technology (MIT) concentrated on two problems. One is identifying words’ functional role - the system needs to recognise that if the user records the phrase “bowl of oatmeal,” nutritional information on oatmeal is pertinent, but if the phrase is “oatmeal cookie,” it is not.
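The MIT system learns this distinction with machine learning, but a toy rule-based tagger shows why "bowl of oatmeal" and "oatmeal cookie" must be treated differently: in the first phrase "oatmeal" is the food head, while in the second it is only a modifier of "cookie". The rules below are a made-up sketch, not the researchers' method.

```python
def head_food(phrase):
    """Return the word whose nutritional entry is pertinent.

    In a container phrase like "bowl of oatmeal", the food head follows
    "of"; in a compound like "oatmeal cookie", the last word is the head
    and earlier words are modifiers.
    """
    words = phrase.lower().split()
    if "of" in words:
        return words[words.index("of") + 1]  # "bowl of oatmeal" -> "oatmeal"
    return words[-1]                         # "oatmeal cookie" -> "cookie"
```

A learned tagger generalises far beyond these two patterns, but the example captures the core problem: the same word plays different functional roles in different phrases.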

The other problem is reconciling the user’s phrasing with the entries in the USDA database. For instance, the USDA data on oatmeal are recorded under the heading “oats”; the word “oatmeal” appears nowhere in the entry.

To address the first problem, researchers used machine learning. Through the Amazon Mechanical Turk crowd-sourcing platform, they recruited workers who simply described what they had eaten at recent meals, then labelled the pertinent words in the description as names of foods, quantities, brand names, or modifiers of the food names.
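A crowd-sourced training example of the kind the article describes might be represented as tokens paired with the four label categories it names (food, quantity, brand, modifier). The sentence and labels below are invented for illustration; the article does not publish the actual annotation format.

```python
# Hypothetical labelled example in the style of the Mechanical Turk task:
# each pertinent word is tagged as FOOD, QUANTITY, BRAND, or MODIFIER.
labelled = [
    ("a", None),
    ("bowl", "QUANTITY"),
    ("of", None),
    ("Quaker", "BRAND"),
    ("instant", "MODIFIER"),
    ("oatmeal", "FOOD"),
    ("and", None),
    ("half", "QUANTITY"),
    ("a", None),
    ("banana", "FOOD"),
]

# Collect just the food names, as a trained tagger would.
foods = [word for word, tag in labelled if tag == "FOOD"]
```

Examples like this give a supervised learner the signal it needs to decide which words name foods and which merely qualify them.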

To translate between users’ descriptions and the labels in the USDA database, the researchers used an open-source database called Freebase, which has entries on more than 8,000 common food items, many of which include synonyms.
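The synonym step reduces to a lookup from the user's word to the heading the nutrition database uses. The table below is a made-up stand-in for Freebase's food entries, shown only to illustrate the mapping; the "oatmeal" → "oats" pair comes from the article.

```python
# Hypothetical synonym table standing in for Freebase food entries.
SYNONYMS = {
    "oatmeal": "oats",  # USDA records oatmeal under the heading "oats"
}

def canonical_name(user_term):
    """Map a user's word to the database heading, defaulting to the word itself."""
    term = user_term.lower()
    return SYNONYMS.get(term, term)
```

With roughly 8,000 food entries carrying synonyms, the same lookup lets the system bridge everyday phrasing and the USDA's headings.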
