The recent rise of artificial intelligence, or AI, has led to numerous breakthroughs in what is technologically possible. At the same time, AI's lax regulation and rapid evolution have enabled its exploitation for harmful purposes.
Recently, AI has fueled an increasingly popular and terrifying form of digital alteration: deepfakes. Deepfakes are AI-generated images in which a person's face or body is digitally altered to appear as someone else's. The deepfakes in this case involve altering pornography and nude images so that the person depicted appears to be someone else.
In South Korea, there is an ongoing case regarding the creation and sharing of deepfake porn targeting women across the country. The case is receiving global attention for the scale of its impact and for the main demographic of its victims. It is still under investigation, and all figures cited here are estimates, not proven facts.
Telegram Chat Rooms
In recent months, South Korea's deepfake crisis has come to public light, leading to mass outrage and fear. Women across the country have uncovered secret Telegram chat rooms where people have deepfaked their faces onto pornography and nude photos. These images and videos are shared across Telegram channels and often used to blackmail victims.
Digital sex crimes are not new to South Korea. A previous case in 2019, referred to as the "Nth Room Case," garnered mass global attention. In that case, perpetrators blackmailed and cyber sex-trafficked victims and spread illegal, sexually exploitative, and abusive videos through Telegram chat rooms.
Although the current case is similar to the "Nth Room Case," it differs in its specific use of deepfakes. Beyond the deepfake element, many of those affected were unknowingly victimized by people close to them.
How to Gain Access
To gain access to these Telegram chat rooms, it is not enough to simply know which room to join. You have to submit photos of a very specific kind of woman: someone you personally know, such as a classmate, teacher, friend, or even family member. Along with the photos, the rooms require you to send the woman's personal information, including her name, age, contact information, student ID, and, if possible, even her address.
Participants use this personal information to threaten, terrorize, and blackmail the victims. The photos are often taken from social media accounts and public images, such as graduation or school photos. Some men, though, also take these photos in person, using hidden cameras or other non-consensual means.
Participants must send multiple photos because these images are used to generate the deepfaked porn. Using AI, these criminals can turn submitted images into deepfaked porn almost instantaneously. They even customize it, altering the shape of the body and the content to suit their fetishistic desires. An estimated 200,000-plus Telegram users have produced and shared deepfaked porn in these rooms, with each image generated in mere seconds.
“Humiliation Rooms”
In addition to the automated deepfake creation channels, there are also chats that participants refer to as "humiliation rooms." In these rooms, users post photos of their own family members to gain entry and to offer them up for other men to use. The fact that these women and girls were their mothers, sisters, or even daughters did not deter them; in fact, other members encouraged and praised this behavior.
Some took hidden-camera photos and videos to submit. Others went a step further, recording themselves touching family members while they slept. In some rooms, men sent messages plotting to drug their own sisters. Rather than reporting such behavior, people in the chats responded with excitement and anticipation.
Cyber-Terror in Schools
This case first gained public attention after female students from two of South Korea's most prestigious universities uncovered deepfake porn rings at their schools. Some of these chat rooms have been active for three years and have only now been brought to light. Numerous students from these universities had been notifying the police about the rooms, which finally prompted an investigation.
As police looked into the case, though, they found that participants weren't deepfaking only college students. They had created designated rooms for specific schools, where they deepfaked and shared images of female students from those schools. Investigators speculate that such rooms exist for over 60 schools across the country.
These rooms included high schools and middle schools, revealing that the channels have also been targeting minors. The youngest recorded victims, in fact, have been elementary school children. This means that many of these deepfakes are not only illegal, sexually exploitative impersonations but also child pornography.
Perpetrators are not targeting only students, though. They have also been attacking teachers. As with the students, there are dedicated chat rooms that deepfake, sexualize, and dehumanize these teachers.
Further investigation revealed why students and teachers are so heavily targeted: a large portion of the criminals are themselves male students. Male peers, fellow classmates, and even teachers' own students have been producing deepfakes of these women. This also means that many of the criminals are underage boys.
Because of their age, many worry that these perpetrators will receive light punishment, if they are convicted at all. Although their crimes were extremely violating, the juvenile justice system may not reflect that.
Turning Victims into Aggressors
Participants in these chat rooms use highly degrading, offensive, and obscene language toward these girls. Many of the messages are misogynistic, attacking women and justifying the senders' actions in a twisted, victim-blaming manner.
As with content such as "revenge porn," many of these men feel that women have wronged them in some way. Some messages read, "I can't forgive women anymore." Others claim online, across numerous platforms, that only pretty women are targeted and that if you're "ugly," you're safe.
One viral post, written anonymously on a college forum, tells women not to be scared or take down their photos. According to this post, you don't have to worry about being a victim if you have never been confessed to as a joke, get little male attention, are not a feminist, or are above a certain weight. These reactions distort and downplay the true severity of the situation, placing blame and responsibility on the women rather than on the perpetrators.
Anonymity Doesn’t Equate to Safety
Even women who don't post on social media have described finding porn made with their faces. One woman recounted that although she never posts photos of herself, she was still deepfaked into porn: whenever she happened to change her Instagram profile photo, it would be saved and used to create porn.
Some women even had their accounts hacked solely for the purpose of creating deepfakes. The lengths to which these perpetrators go to humiliate and dehumanize these women clearly contradict their own self-justification and victim-blaming. Yet the deflection of responsibility onto women runs deeper.
As women report these men to the police, some men complain online that the women are misandrists, and they downplay the severity of the crimes. Some users have even floated conspiracy theories that the entire situation was manufactured by feminists to generate hatred against men. To these people, it seems, no matter what these women do, it is still partly, if not entirely, their own fault.
The Impact on Women
The extent of this crime's impact on women is far-reaching and long-lasting. Many victims have shown traumatic responses, stemming largely from their close relationships with the perpetrators. The way many of these deepfakes are produced, and their degrading nature, have deeply affected victims' perception of the world and the people around them.
Women have expressed that they no longer know whom to trust; they have lost faith in society as a whole. Even those who have not been targeted are forced to stay cautious in case they become victims themselves. The added weight of potentially being targeted further if they speak up only deepens their anxiety and fear.
Reactions and Outrage
With the publicity of this case, schools have been trying to find ways to prevent deepfake crimes. However, it is the girls who are taken into auditoriums and instructed to delete their photos. The boys, in this framing, simply do not understand that what they are doing is a crime; it is instead the girl's responsibility to prevent herself from being deepfaked. Women can no longer even enjoy posting photos of themselves online; to protect themselves, they have to remain faceless entirely.
A South Korean center for violence against women published a now-deleted poster warning against deepfakes. In the image, a male student is depicted as the victim while female students laugh with their phones out. This role reversal ignores the fact that the vast majority of these deepfake crimes have been committed by men against women.
There are rare instances of men deepfaking other men, or women deepfaking other women, for the purpose of embarrassment. However, no cases of women deepfaking men for humiliation have been noted or confirmed in this situation so far.
Officials Respond
The South Korean president addressed the situation, speaking on the rapid spread of deepfakes. He and the police promised to look into these crimes and bring perpetrators to justice. The Seoul Metropolitan Police Agency has also announced that it will examine Telegram and its role in the production and spread of child pornography.
A few politicians, though, believe that the case is being exaggerated and blown out of proportion. One politician even argued that the growing fear of deepfakes could lead to over-regulation.
On the other hand, women feel that the police and justice system do too little. They cite the 2019 "Nth Room Case" as an example of officials' negligence in investigating sexual abuse on Telegram. In that case, it was chiefly the main instigators who were sentenced, not the majority of participants. Some women feel this has fostered a sense of relative comfort and safety in committing digital sex crimes.
A question arises: if the "Nth Room Case" had been treated with more severity, would this 2024 deepfake case have become this terrible?
Deepfake crimes are not restricted to South Korea, although it is currently one of the countries with the highest rates of deepfake abuse. As AI rapidly grows more popular and more capable, deepfake sex crimes have surfaced all over the world; there have been cases in America involving deepfake pornography among students as well. This ongoing case in South Korea is thus highly important to the current discussion of deepfakes and AI.
The way that South Korea responds to this case may set a precedent for similar crimes in the future. What exactly that response may be, though, is unclear as of now.