Researchers say parent company Meta is failing to remove explicit images on the social media site
Meta is actively helping self-harm content to flourish on Instagram by failing to remove explicit images and encouraging those engaging with such content to befriend one another, according to a damning new study that found its moderation “extremely inadequate”.
Danish researchers created a private self-harm network on the platform, including fake profiles of people as young as 13, in which they shared 85 pieces of self-harm-related content of gradually increasing severity, including blood, razor blades and encouragement of self-harm.