SEOUL, SOUTH KOREA – In South Korea, a surge in deepfake porn crimes is wreaking havoc, with victims’ faces superimposed onto explicit images and shared online.
The worrying trend has seen more than 900 students, teachers, and staff fall victim within the last year alone. Despite new laws and an emergency task force, arrests remain low, prompting some victims to take matters into their own hands.
Ruma, the pseudonym of a 27-year-old university student, told CNN how she experienced this nightmare firsthand. She was shocked to find her face superimposed onto explicit images circulating on the messaging app Telegram.
The sender, who remains unknown to Ruma, threatened to spread these images even further and made it clear that they had access to her personal details.
South Korea has seen a significant rise in digital sex crimes recently. From hidden cameras in public places to blackmail in chat rooms, the growth of AI tools now allows anyone to become a victim of deepfake pornography, regardless of whether they’ve ever taken or sent a nude photo.
The crisis is particularly severe in schools, with more than 900 students, teachers, and staff reporting being victims of deepfake sex crimes from January to early November last year.
In response to this alarming trend, the country’s education ministry established an emergency task force. In September, legislators passed an amendment making possessing and viewing deepfake porn punishable by up to three years in prison or a fine of up to 30 million won (over $20,000).
Additionally, creating and distributing non-consensual deepfake explicit images now carries a maximum prison sentence of seven years, up from five.
Despite these measures, arrests remain rare. Of the 964 deepfake-related sex crime cases reported from January to October last year, only 23 resulted in arrests. This lack of action has led some victims, like Ruma, to take matters into their own hands.
With help from activist Won Eun-ji, Ruma pursued her case, and two former Seoul National University students were arrested last May for producing and distributing sexually exploitative material.
But not all victims receive such support. A high school teacher identified as Kim discovered she was being targeted when a student showed her photos taken of her in the classroom that had been edited onto explicit images.
Despite the trauma caused by these crimes, public empathy remains low, adding to the victims’ emotional burden.
Pressure is now mounting on social platforms like Telegram to act. The company recently announced that it would share more user data with authorities as part of a broader crackdown on illegal activity.
South Korea’s media regulator also said Telegram had agreed to establish a hotline to help wipe illegal content from the app.
Despite these steps, the victims interviewed by CNN called for more support from the police and courts. “There’s a long way to go,” Ruma said, highlighting the ongoing struggle victims face in seeking justice for these horrific crimes.
The surge in deepfake sex crimes in South Korea is part of a broader, global trend, highlighting the dark side of advancing technology. As AI tools become more sophisticated and accessible, so does the potential for misuse. In the wrong hands, these technologies can be used to invade privacy, exploit individuals and cause irreparable harm.
While South Korea’s efforts to combat these crimes through legislation and increased penalties are commendable, it’s clear that the country—and the world—is grappling with how to effectively address this problem. The low arrest rate, despite hundreds of reported cases, underscores the challenges law enforcement faces in tracking down anonymous perpetrators who hide behind these digital tools.
The victims’ stories illuminate not just the severity of the crime itself but also the societal apathy that often accompanies it. Comments asking why deepfake porn should be treated as a serious crime when it is “not even your real body” underscore a glaring misunderstanding of the profound psychological impact these crimes can have.
These digital sex crimes go beyond mere privacy invasion; they are a form of sexual exploitation and abuse that can leave victims feeling violated and humiliated.
Moreover, they raise important questions about consent, bodily autonomy, and the right to one’s image.
Social media platforms like Telegram are under growing pressure to play a more active role in preventing such crimes. Telegram’s recent commitment to share more user data with authorities marks a significant shift for a platform long known for its strong stance on privacy.
However, as activist Won Eun-ji points out, government interventions may be needed if platforms don’t show substantial progress soon. This raises critical questions about the balance between user privacy and safety, and how far tech companies should go in monitoring and moderating content.
Ultimately, though, these disturbing crimes highlight the urgent need for a multi-faceted response that includes not just tougher laws and better law enforcement but also greater public understanding of, and empathy for, victims. They also underscore the importance of tech companies taking responsibility for how their tools can be misused, and the role they must play in preventing such abuse.
The rise of deepfake sex crimes in South Korea is a stark reminder of the dual-edged nature of technology. While advancements in artificial intelligence have led to numerous benefits, they also present new avenues for harm and exploitation.
The battle against these crimes is not only a legal issue but a societal one, calling for a reevaluation of how we understand and address digital sex crimes.