The United States now hosts more child pornography online than any other country



Other than “bad press,” there aren’t many penalties for platforms that fail to quickly remove CSAM, says Lloyd Richardson, chief technology officer at the Canadian Centre for Child Protection. “I think you’d be hard-pressed to find a country that fines an e-service provider for being slow or not removing CSAM,” he says.

CSAM volume has increased dramatically across the world during the pandemic as children and predators spend more time online than ever before. Child protection experts, including the anti-child-trafficking organization Thorn and INHOPE, a worldwide network of 50 CSAM hotlines, predict that the problem will only grow.

So what can be done to fix it? The Netherlands offers some guidance. The country still has a significant CSAM problem, due in part to its national infrastructure, geographic location, and status as an internet hub for global traffic. However, it has managed to make major progress: its share of global CSAM fell from 41% at the end of 2021 to 13% at the end of March 2022, according to the IWF.

This progress can largely be attributed to the fact that when a new government came to power in the Netherlands in 2017, it made the fight against CSAM a priority. In 2020, it released a report that named and shamed internet hosting providers that failed to remove CSAM within 24 hours of being alerted to its presence.

The approach seems to have worked, at least in the short term. The Dutch CSAM hotline EOKM found that, following the publication of the list, hosting providers were more willing to remove material quickly, with many committing to take down CSAM within 24 hours of discovery, and to adopt proactive CSAM detection measures.

However, Arda Gerkens, managing director of EOKM, believes that rather than eradicating the problem, the Netherlands has simply pushed it elsewhere. “It looks like a successful model, because the Netherlands has cleaned up. But it didn’t disappear; it moved. And that worries me,” she says.

According to child protection experts, the solution will come in the form of legislation. Congress is currently considering a new law called the EARN IT (Eliminating Abusive and Rampant Neglect of Interactive Technologies) Act, which, if passed, would make online services liable to prosecution for hosting CSAM on their networks and could compel service providers to scan user data for CSAM.

Privacy and human rights advocates vehemently oppose the bill, arguing that it threatens free speech and could usher in a ban on end-to-end encryption and other privacy protections. But the flip side of that argument, Shehan says, is that tech companies currently prioritize the privacy of those who distribute CSAM on their platforms over the privacy of those who are victimized.

Even if lawmakers fail to pass the EARN IT Act, upcoming legislation in the UK and Europe promises to hold tech platforms accountable for illegal content, including CSAM. Britain’s Online Safety Bill and the EU’s Digital Services Act could result in billions of dollars in fines for tech giants that fail to adequately tackle illegal content once the laws come into force.

The new laws will apply to social media networks, search engines, and video platforms that operate in the UK or Europe, meaning that US-based companies such as Facebook, Apple, and Google will have to comply in order to keep operating in those markets. “There’s a lot of global movement around this,” Shehan says. “It will have a ripple effect around the world.”

“I would prefer that we didn’t have to legislate,” says Farid. “But we’ve been waiting 20 years for them to find a moral compass. And that’s the last resort.”
