Instagram is introducing a range of new features and updates in a bid to improve safety for young users on the platform.

It will limit the ability of adults on the platform to interact with teens who don’t follow them. Adult users will not be allowed to direct message users under 18 who do not follow them.

“To protect teens from unwanted contact from adults, we’re introducing a new feature that prevents adults from sending messages to people under 18 who don’t follow them,” the Facebook-owned photo sharing platform said in a blog post.

When an adult tries to message a teen who doesn’t follow them, they will receive a notification that DMing them isn’t an option.

“This feature relies on our work to predict peoples’ ages using machine learning technology, and the age people give us when they sign up. As we move to end-to-end encryption, we’re investing in features that protect privacy and keep people safe without accessing the content of DMs,” it said.

Apart from this, it will also start using prompts, or “safety notices”, to advise young users to remain cautious in conversations with adults they’re already connected to.

“Safety notices in DMs will notify young people when an adult who has been exhibiting potentially suspicious behaviour is interacting with them in DMs,” it said.

AI and machine learning tech

For instance, it will use this tool to alert young users in the DMs of an adult who has been sending a large number of friend or message requests to people under 18. The prompt will give young users the option to end the conversation, or to block, report, or restrict the adult.

“People will start seeing these in some countries this month, and we hope to have them available everywhere soon,” it said.

The platform has a minimum age requirement of 13. For some time, it has asked new users to provide their age when they sign up for an account.

“While many people are honest about their age, we know that young people can lie about their date of birth. We want to do more to stop this from happening, but verifying people's age online is complex and something many in our industry are grappling with,” it said.

“To address this challenge, we’re developing new artificial intelligence and machine learning technology to help us keep teens safer and apply new age-appropriate features, like those described,” it said.

It will also be exploring ways to make it more difficult for adults who display “suspicious behaviour” to find and interact with teens on the platform in the coming weeks.

“This may include things like restricting these adults from seeing teen accounts in 'Suggested Users', preventing them from discovering teen content in Reels or Explore, and automatically hiding their comments on public posts by teens,” it said.

It will also prompt young users to opt for a private account when they sign up.

“We’ve recently added a new step when someone under 18 signs up for an Instagram account that gives them the option to choose between a public or private account. Our aim is to encourage young people to opt for a private account by equipping them with information on what the different settings mean,” it said.

Young users can still choose a public account. If a teen doesn’t choose ‘private’ when signing up, Instagram will send them a notification later on highlighting the benefits of a private account and reminding them to check their settings.
