Researchers at the International Institute of Information Technology (IIIT-Hyderabad) have found that images taken from a smartphone could be used in the primary screening of oral cancer. Backed by artificial intelligence, the model can help in the early detection of abnormal lesions that could turn malignant.

The collaborative model was developed by IIIT-H’s iHub-Data and INAI, an applied AI research centre run by Intel, Telangana Government, Public Health Foundation of India, and IIIT-H. 

Besides being cost-effective, this method ensures consistency in imaging, as the device uses the camera's own light, eliminating the need for an external light source. It also requires little training and can be easily handled by healthcare workers in the field.

“Due to consumption of tobacco and areca nut, there’s a high prevalence of oral cancer and OPMDs (oral potentially malignant disorders). Early detection and treatment of OPMDs is needed to prevent the natural progression into cancer,” an IIIT-H spokesperson said.

“While the oral cavity, unlike internal organs, can be easily visualised without specialised instruments, such a clinical assessment is unfortunately subjective,” he said.

Though biopsies can help identify the problem, the absence of experts and diagnostic centres can leave many lesions undetected. A mobile, easy-to-operate solution could address this gap.

A study titled ‘AI-Assisted Screening of Potentially Malignant Disorders’, commissioned by INAI and iHub-Data, set the ball rolling, exploring how AI could be used to recognise OPMDs from images of oral cavities and lesions.

The idea was to test the efficacy of deep learning models and see whether they could identify suspicious lesions. The researchers used smartphone images taken during community outreach programmes for early cancer detection.

F1 score

The research team, led by Vivek Talwar and Pragya Singh, found that an AI solution could distinguish and label lesions as ‘suspicious’ or ‘non-suspicious’. “We saw an F1 score of over 70 per cent,” the team said.

The F1 score is a measure of a model’s accuracy that balances precision (how many flagged lesions are truly suspicious) against recall (how many truly suspicious lesions are flagged). “Early detection would help in attending to the problem sooner. A shorter treatment cycle and improved prognosis can be achieved through early detection,” said the spokesperson.
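For readers unfamiliar with the metric, the F1 score reported above is the harmonic mean of precision and recall. A minimal sketch of the calculation is shown below; the counts used are purely illustrative and are not figures from the study.

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """Compute F1 as the harmonic mean of precision and recall.

    tp: true positives  (suspicious lesions correctly flagged)
    fp: false positives (non-suspicious lesions wrongly flagged)
    fn: false negatives (suspicious lesions missed)
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)


# Hypothetical counts for a screening run (illustrative only):
score = f1_score(tp=70, fp=20, fn=25)
print(f"F1 = {score:.2f}")  # → F1 = 0.76
```

An F1 score above 0.70, as reported by the team, indicates that the model catches most suspicious lesions without flooding clinicians with false alarms, which matters in a screening setting where follow-up capacity is limited.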

The team suggests that further innovation in capturing quality images and transferring the data to the app would accelerate deployment of the solution.

“Innovation in the quality of screening and future deployment through mobile app will help improve the capturing of oral cancer incidence rate in community setups,” said Konala Varma, CEO, INAI, in an IIIT-H blog.
