In recent years, the rise of artificial intelligence has changed the way people create videos, music, and art. But along with its positive uses, a dark and disturbing trend has appeared — the creation of fake, sexually explicit images and videos using real people’s faces. This is known as KpopDeepfake, and it has become one of the most serious digital crimes in South Korea.
The term KpopDeepfake comes from the combination of “Kpop” — South Korea’s globally famous pop culture — and “deepfake,” a technology that uses AI to manipulate videos and images. This trend has led to a crisis that is now being called a digital emergency in the country.
What Is KpopDeepfake?
KpopDeepfakes are fake videos or images created with artificial intelligence that place the face of one person, usually a K-pop idol, a student, or even a private citizen, onto another person’s body in a sexually explicit or humiliating scenario. These fabricated videos are designed to look disturbingly real, and they spread quickly on platforms such as Telegram, Twitter, and Reddit.
The trend began with K-pop idols and celebrities as its main targets, but everyday women, high school students, and even middle school girls are now falling prey. The people who make these fake videos typically take photos shared publicly on social media and feed them into AI software that can produce realistic deepfakes in seconds.
For the victims, the outcome is devastating. One day they find fake videos of themselves circulating online, videos they never actually recorded.
The Story That Shocked South Korea
A South Korean student, referred to here as Heejin (not her real name), received a message on Telegram last Saturday. It read, “Your personal information and pictures were leaked. Let’s discuss.” When she opened the chat, she found fake sexual images of herself, created from her old school photos.
She was terrified. The images were not real, yet they looked real. She did not respond, but more fake pictures kept arriving.
Heejin’s story is not unique. Dozens of similar cases have emerged across South Korea in recent years. Even more alarming is how organized and structured these online deepfake networks have become.
How Telegram Became the Center of the KpopDeepfake Scandal
A journalist named Ko Narin investigated this issue and found hundreds of Telegram chat groups where users were sharing photos of women and creating deepfake pornographic content in real time.
These groups often had thousands of members. Some were focused on specific schools and universities, and many targeted underage girls. Victims were sometimes given their own “rooms,” where fake videos of them were shared and discussed.
Telegram has been criticized for not doing enough to monitor or remove illegal content. Although the app says it uses AI moderation and removes millions of posts every day, South Korean activists argue that it is not enough.
This is not the first time South Korea has faced such a scandal involving Telegram. Back in 2019, a massive online sex crime called the Nth Room Case shocked the nation when it was revealed that women were blackmailed into sharing explicit photos. That case led to multiple arrests and heavy prison sentences, but the app itself was never punished.
Now, history seems to be repeating itself.
A Systematic Digital Crime Network
Investigations have found that these KpopDeepfake chatrooms operate in a highly systematic way. Members are often asked to post multiple pictures of the same person, along with personal details such as her name, school, and location. The pictures are then turned into fake sexual videos using AI.
Some chatrooms are even labeled “humiliation rooms” or “friend of a friend” rooms, with victims selected from schools or workplaces.
A single chatroom can have thousands of members, and users even compete to create the most lifelike deepfakes. The whole process is treated like a game, which makes it all the more disturbing.
Public Outrage and Government Response
The KpopDeepfake scandal has sparked widespread outrage across South Korea. Demonstrators have rallied in Seoul, calling for tougher punishment of offenders and greater accountability from technology firms such as Telegram.
The Seoul National Police Agency has announced that it will launch a full investigation into Telegram’s role in the spread of fake pornographic imagery.
The South Korean government has also promised new laws with harsher penalties for anyone who creates or shares deepfake pornography. People who view such content will face punishment as well.
Nonetheless, women’s rights activists say this is not enough. They argue that South Korea has failed to confront structural sexism and has been unable to protect women from online sexual abuse for years.
The Role of Women’s Rights Organizations
Groups like the Advocacy Centre for Online Sexual Abuse Victims (ACOSAV) have been overwhelmed since the KpopDeepfake crisis began. They provide emotional support to victims and work with online platforms to remove fake content.
In 2023, ACOSAV handled 86 cases of teenage deepfake victims. But in just the first eight months of this year, that number jumped to 238 — and more than 60 new victims have come forward in the past week alone.
According to ACOSAV’s director, Park Seonghye, it now feels like a “digital war.” Her team is working day and night to support victims and report illegal content.
KpopDeepfake Statistics and Facts
Here is a table showing some key facts about the ongoing KpopDeepfake crisis:
| Category | Details |
| --- | --- |
| Primary Platform Used | Telegram |
| Victim Age Range | Mostly teenagers and young women |
| Number of Schools Targeted | Over 500 identified so far |
| Teen Victims in 2023 | 86 reported |
| Teen Victims in First 8 Months of 2025 | 238 reported |
| Recent New Victims (Past Week) | 64 new cases |
| Type of Content | AI-generated fake sexual images and videos |
| Common Victim Reaction | Fear, anxiety, deletion of social media |
| Government Action | Investigation into Telegram, stricter penalties, new AI crime laws |
Why the Problem Runs Deeper
Experts believe the KpopDeepfake crisis is not just about technology — it’s about a deeper social issue. South Korea has faced repeated waves of online sexual abuse, from hidden camera scandals to now AI-generated pornography.
According to activists, the core problem is structural sexism. Many women feel that authorities and society do not take digital sex crimes seriously.
President Yoon Suk Yeol has been criticized for denying the existence of structural sexism and for cutting funding to gender equality programs. Women’s organizations argue that without gender education and proper legal protection, the problem will only get worse.
The Importance of Education and Awareness
Counsellors and psychologists stress the importance of education — especially for young men. Many teenage offenders see deepfakes as a joke or a game without realizing they are committing serious crimes.
Lee Myung-hwa, a youth counsellor, explained that when offenders are properly educated about how their actions harm others, they are less likely to repeat the behavior. Schools and communities now need to teach about consent, respect, and digital ethics from an early age.
The Ongoing Fight Against KpopDeepfake
Many of the Telegram chatrooms have been shut down since they were exposed. However, new ones keep emerging, and stopping these crimes entirely has proved difficult. Some groups have even created their own “humiliation rooms” to retaliate against journalists and activists who criticize them.
Ko Narin, the journalist who first reported the story, says she now lives in fear of becoming a target of the same kind of fake images she uncovered. The situation has spread panic among women across South Korea.
Many young women no longer feel safe on the internet because they cannot know whom to trust. This has changed how they interact with people both online and offline, out of fear of becoming the next victim.
Final Thoughts
The KpopDeepfake crisis has exposed the dark side of modern technology. What began as a tool for creative innovation has become a weapon of harassment and abuse.
It shows that it is high time to strengthen regulation of the digital landscape, hold tech firms accountable, and educate people to treat others with respect and empathy online.
It is a wake-up call for South Korea and for the rest of the world. AI is a powerful technology, but in unethical hands it can do real harm. It is the world’s shared responsibility to prevent individuals, and young women in particular, from being victimized by deepfakes.
KpopDeepfake is not a problem exclusive to Korea; it is a warning to every society about what happens when technology is misused and human morality and responsibility are cast aside.