Not Swift Enough: Survivors and Children Wait for Government, Industry to Act


Late last month, sexually explicit deepfake images of Super Bowl-bound Taylor Swift exploded across the internet with tens of millions of views. One image was seen 47 million times before X removed it 17 hours later. While clearly not fast enough, the response was strong. X said it was “actively removing” the images and taking “appropriate actions” against the accounts involved in spreading them. The name “Taylor Swift” was temporarily unsearchable on X, alongside “Taylor Swift AI” and “Taylor AI.” U.S. politicians called for “new laws to criminalise the creation of deepfake images.” Microsoft reported that it strengthened “existing safety systems to further prevent our services from being misused to help generate images like them” in the first place. Ensuing media coverage was intense.

Yet, broadly speaking, industry action to address the rampant creation of images and videos, both recorded and live, of real children being sexually abused, with or without the use of AI, pales in comparison. For example, according to Australia’s eSafety Commissioner, tech companies reported that they do not detect, prevent, or address child sexual abuse material (CSAM) created and distributed “live” in video calls and livestreams.

In fact, a 2020 study by the International Justice Mission (IJM) found that children in the Philippines were sexually abused and exploited online for at least two years, on average. Another IJM study, released in 2023, found that nearly half a million children in the Philippines were exploited in this way in 2022 alone. These children are abused in their homes while men around the world, including Americans, pay to direct and consume the abuse live, using the same popular social media and video chat apps we all use every day.

IJM has a robust program that, to date, has supported Philippine authorities in bringing to safety over 1,200 victims and at-risk individuals, while supporting the government and key stakeholders to holistically strengthen the response to these crimes. And we will continue that critical work until Filipino children are protected from this violence. In fact, since 2022, through Project Boost, IJM’s Center to End Online Sexual Exploitation of Children has partnered with the U.S. National Center for Missing & Exploited Children (NCMEC) and Meta to train law enforcement in other countries, including Nigeria and Kenya, to investigate cases of online sexual exploitation of children. We’re sharing our proven model from the Philippines to support other governments in protecting their children too.

But to help reduce the massive scale of this harm, the private sector must do more to address these crimes happening on and through its apps and platforms. That “more” includes building platforms that are safe by design to prevent harm, while also supporting efforts by organizations like IJM to strengthen law enforcement capacity to successfully investigate priority reports, bring children to safety, and hold offenders accountable.

According to 2023 research by the Philippine Anti-Money Laundering Council, payments flagged by the financial sector as “suspicious transactions” for child sexual exploitation in the Philippines chiefly originated from the United States, followed by the U.K., Australia, and Canada. And because tech company platforms are global, creating clear rules of the road and accountability for the tech sector will translate to a safer internet for children everywhere.

What the recent U.S. Senate hearing on online child sexual exploitation made abundantly clear is that multinational tech platforms are not built safe by design at their core, and that to become safe, the industry as a whole needs laws now. Deepfake pornography that harms everyday citizens and celebrities like Taylor Swift, AI-generated CSAM, and livestreamed child sexual abuse are all prime examples of why governments must require companies to embed safety into their products before they roll out technology. Because safety by design is not mainstream across the tech sector, popular video call and livestreaming apps are easily weaponized by sex offenders to livestream child sexual abuse – and much more could be done to identify those offenders and hold them accountable.

In summary, all companies should use safety technology to prevent as much online exploitation as possible, while reporting harm promptly and robustly when it happens so that law enforcement can do its job.

Legislation to protect people – whether celebrities like Taylor Swift or ordinary citizens – from harmful deepfake images is simply a must. Legislation protecting all children and survivors from the ongoing trauma of sexual abuse and exploitation, including CSAM, is past due and urgently needed to ensure a strong industrywide response. The time for change is now.

By John Tanagho, Executive Director, IJM Center to End Online Sexual Exploitation of Children
Feb. 8, 2024

