Instagram will blur explicit photos sent to under-18s in an effort to protect children from unsolicited images.
The social media app will automatically prevent young users from seeing images if its algorithms detect nudity being sent in private messages.
It comes amid growing concern about the negative impact of social media and smartphones on teenagers, which has prompted calls for younger teenagers to be banned from owning smartphones, and amid a rise in online blackmail using victims' intimate photos.
Instagram will also warn under-18s when they share explicit images of themselves that sending the pictures could leave them vulnerable to scams or bullying.
The app said it was introducing the change to protect children from sextortion scammers, who blackmail young people after encouraging them to share intimate images, either demanding money or pressuring them to obtain and share intimate images of other children.
In addition, the app will require users to confirm before forwarding explicit images they have received, warning them that doing so may be illegal.
The feature will be automatically turned on for under-18s and adults will be asked if they want to activate it.
However, Instagram will use an AI algorithm running on the user's device to detect whether a photo contains nudity, rather than scanning images as they pass through Instagram's servers.