Advertisers will have to disclose whenever a social issue, electoral or political ad contains a photorealistic image or video, or realistic-sounding audio, that was digitally created or altered, social media giant Meta said on Wednesday.

The development comes as the government asked all social media platforms, including X, Instagram and Facebook, to remove morphed images within 24 hours of receiving a complaint under the IT rules.

Meta said in its statement that it is introducing “a new policy to help people understand when a social issue, election or political advertisement on Facebook or Instagram has been digitally created or altered, including through the use of AI. This policy will go into effect in the new year and will be required globally.”

It said advertisers will have to disclose whenever an ad contains an image, video, or realistic-sounding audio that was digitally created or altered to:

- depict a real person as saying or doing something they did not say or do;
- depict a realistic-looking person that does not exist, or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
- depict a realistic event that allegedly occurred, but that is not a true image, video or audio recording of the event.

It also mentioned that advertisers running these ads do not need to disclose when content is digitally created or altered in ways that are inconsequential or immaterial to the claim, assertion or issue raised in the ad.

“This may include image size adjusting, cropping an image, colour correction, or image sharpening, unless such changes are consequential or material to the claim, assertion, or issue raised in the ad. Meta will add information on the ad when an advertiser discloses in the advertising flow that the content is digitally created or altered,” the social media company said.

The same information will also appear in the Ad Library, it said, adding that “If we determine that an advertiser doesn’t disclose as required, we will reject the ad and repeated failure to disclose may result in penalties against the advertiser.”

Meta said it will share additional details about the specific disclosure steps advertisers will go through during the ad creation process.

A deepfake video of actress Rashmika Mandanna has been circulating on social media platforms. Netizens claimed the video had been morphed and that the actual video was of an Indian-origin person living in the UK.

The Ministry of Electronics and Information Technology (MeitY) had also issued an advisory on Tuesday calling upon online platforms to take decisive actions against the spread of deepfakes. This was the second such advisory within the last six months.

As per the advisory, social media platforms should take all measures to remove or disable content that amounts to impersonation in an electronic form, including artificially morphed images of individuals, within 24 hours of receiving a complaint about such content.

“Deepfakes are a major violation and harm women in particular. Our government takes the responsibility of safety and trust of all nagriks [citizens] very very seriously, and more so about our children and women who are targeted by such content. It is a legal obligation for online platforms to prevent the spread of misinformation by any user under the IT Rules, 2021,” Rajeev Chandrasekhar, Minister of State for Electronics and IT, said.