Friday, 15 November 2024

Instagram’s Selective Blurring Of Nudity Falls Woefully Short Of Protecting Kids


    Instagram is finally taking action against sexual exploitation on its platform, just one day after being called out in the National Center on Sexual Exploitation’s (NCOSE) Dirty Dozen List. Instagram, which is owned by Meta, will use artificial intelligence to automatically blur images of nudity in the direct messages (DMs) of users under 18 years old.

    While the new policy may seem like a welcome step in the right direction, it’s far from enough. Minors may still click “view image anyway” and easily bypass the blurring on an explicit direct message, and many children will want to click on a blurred image precisely to see what it is. In fact, the change is little different from Instagram’s existing policy banning the posting of nude images, which users easily circumvent or override.
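    To see how thin this protection is, consider a minimal, hypothetical sketch of the blur gate in Python. Nothing here is Meta’s actual code: the classifier, the threshold, and every name below are assumptions made purely for illustration.

```python
# Hypothetical sketch of a DM nudity-blur gate; not Meta's implementation.
# detect_nudity() stands in for whatever on-device classifier Instagram uses.
from dataclasses import dataclass


@dataclass
class User:
    age: int


@dataclass
class ImageMessage:
    pixels: bytes
    blurred: bool = False


NUDITY_THRESHOLD = 0.8  # assumed cutoff, chosen arbitrarily for the sketch


def detect_nudity(pixels: bytes) -> float:
    """Stand-in classifier returning a nudity score in [0, 1].

    A constant is returned here purely so the sketch runs end to end.
    """
    return 0.95


def deliver(msg: ImageMessage, recipient: User) -> ImageMessage:
    # The blur is applied only for under-18 recipients on flagged images.
    if recipient.age < 18 and detect_nudity(msg.pixels) >= NUDITY_THRESHOLD:
        msg.blurred = True
    return msg


def view_anyway(msg: ImageMessage) -> ImageMessage:
    # One tap undoes the whole protection: the image was delivered to the
    # device regardless, merely hidden behind a blur overlay.
    msg.blurred = False
    return msg


if __name__ == "__main__":
    teen = User(age=15)
    msg = deliver(ImageMessage(pixels=b"..."), teen)
    print(msg.blurred)               # True: blurred on arrival
    print(view_anyway(msg).blurred)  # False: revealed with a single tap
```

    The point of the sketch is the last function: the explicit image still reaches the child’s device, and nothing stops a curious minor from tapping through.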

    “Why is Meta even putting the burden on children to make the choice about seeing sexually explicit content when their own policies expressly prohibit it?” asked Tori Rousay, corporate advocacy manager and analyst at NCOSE.

    Even TikTok simply disables all direct messaging for users aged 13 to 15, recognizing that this private channel exposes children to many forms of victimization even when no explicit imagery is shared. Disabling DMs for children under 18 would be an obvious next step for Meta to take in combating sexual exploitation on its platforms.

    “Disabling direct messaging for minors, like TikTok does … would be a significant step and something the National Center on Sexual Exploitation has also been asking Meta to do,” Rousay said. “Further, prohibiting minors from sending or receiving images and videos, as Meta does for message requests, is an additional way to protect minors.”

    Pedophiles and Predators

    Facebook, Messenger, Instagram, and WhatsApp, all platforms owned by Meta, are consistently among the top places where pedophiles and predators first contact their victims, according to a recent study by Protect Children. Instagram and Facebook alone account for a full 75 percent of offenders’ reported first contacts with children. Meta’s direct messages are end-to-end encrypted, so the feature has long drawn fire from groups that fight child sexual exploitation.

    Over one-quarter of underage users on Instagram receive message requests from adults they do not follow. It is hard to conjure an innocent reason for so many adult users to initiate contact with underage accounts, especially given that 13 percent of respondents to a 2021 survey reported receiving “unwanted advances” on Instagram within the span of a single week.

    Other Restrictions

    Meta has gradually enacted restrictions on underage users and the content they view in recent years. In March 2021, it barred users over 19 years old from sending direct messages to teens who do not follow them. In January 2024, it announced default settings that allow teens on the app to message only accounts they already follow. Various tools also let parents monitor or restrict the content their children view and how much time they spend on the app.
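    As a rough illustration, the two messaging rules described above can be expressed as a single policy check. This is a hypothetical sketch, not Meta’s code; the Account fields and the can_message() function are invented for the example.

```python
# Hypothetical sketch of the layered DM restrictions described above;
# all names are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class Account:
    id: str
    age: int
    following: set[str] = field(default_factory=set)  # IDs this user follows


def can_message(sender: Account, recipient: Account) -> bool:
    # March 2021 rule: users over 19 may not DM teens who do not follow them.
    if sender.age > 19 and recipient.age < 18 and sender.id not in recipient.following:
        return False
    # January 2024 default: teens may only message accounts they already follow.
    if sender.age < 18 and recipient.id not in sender.following:
        return False
    return True


teen = Account(id="teen", age=15, following={"bestie"})
adult = Account(id="stranger", age=35)
print(can_message(adult, teen))                         # False: 2021 rule
print(can_message(teen, Account(id="bestie", age=16)))  # True: already followed
```

    Note that both rules hinge entirely on the ages users report for themselves, which is exactly where the scheme breaks down, as discussed below.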

    But barring minors from DMs, from sending photos or videos, or from creating accounts at all would do far more to protect them.

    Age verification does little to curb child sexual exploitation if users simply lie about their age, and estimates indicate that about 1 in 3 children on social media do just that. Instagram announced a crackdown on this behavior in 2022: users who try to edit their age from under 18 to over 18 must upload an ID, submit a video selfie, or have three mutual followers confirm their age. But once again, the reform is good but not good enough; an active user can simply create a new account and lie about their age from the start.
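    The verification flow, and the loophole the paragraph identifies, can be made concrete with another hypothetical sketch; the proof names and functions below are assumptions, not Instagram’s actual API.

```python
# Hypothetical sketch of the 2022 age-edit crackdown described above;
# proof names and functions are assumptions for illustration only.
ACCEPTED_PROOFS = {"government_id", "video_selfie", "three_mutual_vouches"}


def verification_required(old_age: int, new_age: int) -> bool:
    # Only editing an existing age from under 18 to 18+ triggers a check.
    return old_age < 18 and new_age >= 18


def approve_age_edit(old_age: int, new_age: int, proof: str | None) -> bool:
    if not verification_required(old_age, new_age):
        return True
    return proof in ACCEPTED_PROOFS


print(approve_age_edit(16, 19, "video_selfie"))  # True: edit approved with proof
print(approve_age_edit(16, 19, None))            # False: edit blocked
# The loophole: a brand-new account states its age exactly once at signup,
# so approve_age_edit() is never called, which is the circumvention the
# article describes.
```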

    The larger problem of pornographic images circulating on social media often prompts libertarians and leftists to warn against bans that would erode “free speech.” Of course, Meta, as a private company, could enact such restrictions if it pleased without infringing on the First Amendment, and one thing we can usually all agree on is that children must be aggressively protected from sexual victimization. There is no free speech protection for pedophiles soliciting sexual favors via private messages.

    Social media is still a sort of Leviathan for which none of us were prepared. Instagram launched in 2010 and, in just 12 years, surpassed 2 billion users. A full 40 percent of kids under 13 already use apps like Instagram and Facebook. Older generations can rail against “kids these days and their phones,” but in many ways modern children were dealt a losing hand the moment they were handed an iPad as toddlers. It is precisely because these apps are so addictively popular that banning or heavily restricting them will be unpopular, both with underage users and with their checked-out parents.

    Meta needs to go beyond merely blurring explicit images. Children deserve far more protection, up to and including completely removing the massive Rolodex of innocent teens that is Instagram from the clutches of online predators.

