Instagram is adding safety measures designed to protect teenagers from unwanted direct messages from adults.
Adult users will be able to send private messages only to teenagers who follow them.
And messages will be overlaid with a notice reminding teenagers they do not have to respond to anything that makes them uncomfortable.
The measures will work only if accounts list users' correct ages, and young people sometimes lie about their age to avoid restrictions on what they can see.
Likewise, predators might pretend to be younger than they actually are.
Instagram said it was developing “new artificial intelligence and machine learning technology” to help tackle the challenge of age verification, especially in cases where account holders have not been honest.
Private account
The minimum age for using Instagram is officially 13.
The platform also said it now offered young account holders the option to make their accounts private when they created them.
“If the teen doesn’t choose ‘private’ when signing up, we send them a notification later on, highlighting the benefits of a private account and reminding them to check their settings,” it blogged.
In January, rival TikTok announced:
- under-16s' accounts would be made private by default
- 13- to 15-year-olds would be able to approve "friends" for comments and choose whether to make videos public
The UK’s proposed Online Harms Bill would give regulator Ofcom the power to block online services that fail to protect children – but it is unlikely to become law before 2022.