A wave of doctored sexual images of female students has shocked South Korea, with authorities announcing harsh punishments for those who create, distribute or view them.
“Your photos and personal information have been leaked. Let’s discuss it.” With this message, Heejin, a university student in South Korea, received a photograph of her face on a naked, sexually explicit body that was not hers.
The young woman was the victim of artificial intelligence used with malicious intent.
As she told BBC World, under a different name to protect her identity, the young woman continues to receive images of herself in pornographic scenarios, created by an AI application.
The case is similar to what happened at the Saint Georges school in the commune of Vitacura, Chile, where a group of male students created montages of female classmates’ faces in explicit scenes.
This is deepfake abuse, which happens when AI technology is used to mimic a person’s appearance and even their voice. Typically, malicious actors use this “tool” to place their victims in sexual situations, either to humiliate them or even to blackmail them.
And the most dangerous thing about it all is that some deepfake AI is so advanced that the results are quite convincing.
This is why, with the rise in cases of young AI victims, South Korea plans to take tough measures against the perpetrators of these acts.

South Korean students in crisis over deepfake cases
Heejin told the BBC that after receiving the photographs showing her face in sexual scenes, she felt petrified and very alone.
But victims like her of this kind of artificial intelligence abuse are growing in number, as South Korean journalist Ko Narin denounced: she revealed in a recent report that police were investigating fake porn rings at two of South Korea’s top universities.
The Korean journalist found dozens of chat groups on Telegram where users shared photos of women they knew and asked for AI software to be used to turn them into fake pornographic images.
All of this was done in just a few seconds.
But as Ko’s investigation, published in the newspaper Hankyoreh, went deeper, the situation became even darker: the victims were not only university students but also younger schoolgirls.
Additionally, if many photographs of a single woman were submitted, the men behind the groups created dedicated rooms for her, called “humiliation rooms” or “friend of friends rooms”, with access restricted to a select few.
“I was surprised to see how systematic and organized the process is. The most horrific thing I discovered was a group of underage students at one school, which had more than 2,000 members,” Ko said of her report.

The BBC confirmed that its journalists were also able to view these Telegram discussions and see the seriousness of the situation: in some groups, for example, members are asked to post more than four photos of a woman along with her name, age and the region where she lives.
Days after Ko’s shocking article was published, various women’s rights organizations began investigating these deepfake groups, and what they discovered was equally devastating.
More than 500 schools and universities in South Korea have been identified as “targets” by the perpetrators. It is believed that many of the victims may be under 16, and that the alleged perpetrators are mostly teenagers themselves.
Additionally, experiencing deepfake abuse is a difficult process for victims: Heejin said she feels great anxiety because she does not know how many people have seen her fake photos.
“I couldn’t help but think: if this happened because I had posted my photos on social media, should I have been more careful?” she said.
Like her, thousands of women have started hiding or deleting their photos from social media, fearing they will fall victim to this malicious AI.
“We are frustrated and angry that our behavior and use of social media are being censored when we have done nothing wrong,” said Ah-eun, a student.
But beyond this self-censorship, there is the question of reporting: is it even possible to find a culprit?

Can you report a deepfake to the police?
According to Ah-eun, a student victimized at her university contacted the Korean police to report that a photo of her had been used and modified to give it a sexual connotation.
However, the authorities’ response was simple and direct: they told her not to bother pursuing the case because it was “too difficult to catch the attacker” and because what happened to her “wasn’t really a crime”, since the photos were “fake”.
But now, with the impact of a report that has shocked not only public opinion but also the government, officials have vowed to impose “more severe sanctions” on those involved: those who create the content, those who share it and those who consume it.
In addition, the Seoul National Police Agency said it would investigate Telegram, because the app has allowed this type of content, which often includes fake pornographic images of minors, to be distributed all this time.
Meanwhile, South Korean President Yoon Suk-yeol has urged families to “better educate” young people.
On that point, Lee Myung-hwa, who treats young sex offenders at the Aha Seoul Youth Cultural Center, said deepfakes among teenagers “are now part of their culture; they see it as a game or a joke”.
For this reason, the expert said it is essential to educate young people so that this type of behavior, which is serious and amounts to sexual abuse, stops being normalized.
Source: La Tercera
