Isn’t it ironic that the human body has been the greatest instrument of exploitation in human history? Women, who form a precious half of humanity, have been the main target of this exploitation throughout history. The effort to exploit women by commodifying them now aims at exploiting not only a person’s physical existence but also their digital existence. Sometimes the person under attack does not even know about it, or by the time they become aware, it is already too late. This form of inhumane exploitation, legitimized under the label of adult content, targets not only women but also children and juveniles.
The emergence of deepfake porn sites
Thanks to synthetic media and deepfake technology powered by artificial intelligence (AI), people are unwittingly turned into the subjects of adult videos or photos, often by having their faces stolen from social media, and are exploited in this way. Revenge porn, the first deepfake video content, was published on Reddit.com in 2017 and is projected to account for 95% of deepfake content today; it is now published on specially prepared deepfake porn sites and goes viral on social media. Ordinary people, driven by psychological problems, passions or other bad intentions, are increasingly joining the ranks of amateur producers of adult deepfake content. Experts predict that the number of adult-content deepfake videos circulating online, with innocent people chosen as victims, will rise to 180,000 this year and 720,000 next year.
Without taking control of adult content, which has become the human mind’s greatest weakness, fighting the deepfake threat seems unlikely. As Deepware AI, we are leading an important development in this area by adding the Adult Scanner function to Deepware Scanner, the first online deepfake scanning and detection engine, which we developed.
Who is winning, who is losing?
According to the Online Porn Market 2019-2027 survey by the India-based market research organization Absolute Markets Insights, the global online adult content market was estimated at $35.17 billion for 2019. Thanks to the pandemic and cryptocurrency-based content subscriptions, the market is estimated to have exceeded $40 billion last year, growing by 15.12%. The pornography sector, which dominates such a giant economy, no longer confines itself to producing and selling content aimed at the instinctive gratification of humanity. By spreading the pornography epidemic through society, it encourages and drives amateur content production. Even those who are not paying customers of adult content now have the opportunity to produce pornographic deepfake content of a chosen victim with the help of easily accessible mobile applications and user-friendly web interfaces. When they try to harm someone else, they do not consider that they or someone close to them may become the victim of a similar attack in the future.
For the most part, the lives of abused victims are turned upside down when their faces are added to naked bodies in adult media. Once such images circulate online, whether they are real is rarely questioned, and increasingly it hardly matters. The chosen victims are in danger of losing, in an instant, their reputation, dignity, spouses, families, social environment, jobs, educational and career opportunities, their future and everything of value they have.
Revenge porn, the most dangerous method of blackmail
Deepfake media in the adult content genre is becoming a very dangerous weapon of assault, harassment and reputational attack directed at the victim. That attack becomes visible only once the deepfake media in question circulates online. At the stage before it enters online circulation, however, we may never know who has already succumbed to dark threats and blackmail made this way, or what dirty demands they will be unable to resist in the future. Fake synthetic media with sexual content can become a method of blackmail, especially in more conservative societies, whose consequences are too heavy to ever bear.
When women whose duties and missions require them to put the interests of society and nation above everything else, especially politicians, bureaucrats, diplomats, journalists, writers, activists and opinion leaders, are threatened with adult-content deepfake media, to what extent will they be able to resist? How can they withstand such threats and continue to defend the interests of society, especially in more conservative regions such as the Middle East and North Africa?
The price of investigative journalism: fake nude images served to half the country
Rana Ayyub is a famous Muslim investigative journalist who made her name through her reporting exposing political scandals such as human rights violations and corruption in India. The young journalist and writer described the revenge porn attack she suffered in her home country on the US-based news site HuffPost, only months into the history of deepfakes. In her article “I Was The Victim Of A Deepfake Porn Plot Intended To Silence Me”, published in the “Lives That Are Less Ordinary” section of huffingtonpost.com on 21 November 2018, Rana Ayyub recounted that, as an investigative journalist, she had long been the target of misogynistic harassment and hate messages on social media. For Ayyub, who had always tried to ignore the danger by convincing herself that it was only online hate, everything changed over two days in April 2018.
With polarization and tension gripping the country over the rape of an 8-year-old Kashmiri girl, Rana Ayyub was invited onto TV programs on the BBC and Al Jazeera to talk about how sexual abusers of children are protected in India. The next day, a series of fake tweets allegedly posted from her social media account began circulating. The fake tweets, carrying messages such as “I hate India” and “I love child rapists and I support them if they do it in the name of Islam”, were followed a day later by a 2.5-minute revenge porn video put into circulation as part of an online lynching campaign. Moreover, a source from the BJP party told her by email that the video using her face had been distributed to at least half of the mobile phones and social media accounts in the country via WhatsApp.
The law is not enough to ensure justice against deepfakes
Although the harassment campaign carried out via Twitter, Facebook and WhatsApp went beyond any limit of human dignity and tolerance, the police and the courts were allegedly indifferent to her complaints. Meanwhile, the video in question was shared tens of thousands of times and hundreds of thousands of abusive comments were posted. Only when the incident took on an international dimension and United Nations rapporteurs stepped in to warn the Indian government did the legal process begin to move. But given the great trauma the young journalist had already suffered, it was too late.
On October 30, 2020, Rana Ayyub appeared on NPR’s TED Radio Hour together with Professor Danielle Citron of Boston University School of Law, who studies cyber harassment. The journalist pointed out that an important minister in Modi’s government, who had been jailed in 2010 as a result of her corruption reporting, was the second most powerful man in India at the time of these events. Ayyub said that she was bedridden for five days because of the trauma she experienced and could not leave the house for six months. Prof. Citron, for her part, emphasized that provocative fake content spreads online ten times faster than truthful content.
A barrier against pornography, detection against deepfakes
“Deepfake pornography” is in fact a double-layered social hazard. There is no need to explain the damage that pornography, which exploits the human body and by its obscene nature encourages a propensity for violence, does to the psychology of the individual and the moral values of society. The trauma of this abuse on children and young people is even greater. When you add the trauma experienced by deepfake victims, who are dragged into this abasement without their knowledge or consent and without having anything to do with such media content, it becomes clearer how vital a two-stage measure is. Against this most dangerous and harmful type of deepfake, it is necessary both to prevent the uncontrolled availability of such media content and to expose the deceptive nature of deepfakes.
Deepware Adult Scanner ignites the technological fight against deepfake pornography
Deepware Scanner, a multi-layer deepfake scanning and detection engine developed by Deepware AI’s engineers at deepware.ai and the first of its kind in the world, offers online users the ability to scan videos for free through our website and receive a report on the probability that they are deepfakes. As Deepware AI, we have also ignited the fight against “deepfake pornography” by adding an Adult Scanner filter to our Deepware Scanner product, again a first in the world.
The Adult Scanner filter added to Deepware Scanner detects pornographic video content using advanced AI algorithms. When a video is scanned via a link or as an uploaded file, the filter blocks its display during scanning and reports whether it carries pornographic content. In this way, the system protects the scanning user from viewing obscene material in any scene of the scanned video. In addition, whether the video is a deepfake or not, the scan report output in Deepware Scanner prevents the pornographic video from being delivered to other users through search engines.
Although Deepware Scanner prevents such a video from being displayed because it carries pornographic content, it still reports to the scanning user the probability that the video is a deepfake. As a result, a user who runs a deepfake scan on a video in Deepware Scanner is protected from viewing pornographic images and from becoming an instrument for distributing the video during scanning and reporting, while still being informed of how likely the video is to be authentic or fake.
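To make the two-stage idea concrete, the sketch below shows, in simplified Python, how such a pipeline could be structured in principle: an adult-content check that decides whether any frame may be shown, followed by a deepfake check whose probability is reported either way. This is an illustrative outline only, not Deepware Scanner’s actual code or API; the classifier functions, threshold and names are hypothetical stand-ins.

from dataclasses import dataclass

@dataclass
class ScanReport:
    contains_adult_content: bool   # if True, the interface never renders the frames
    deepfake_probability: float    # reported to the scanning user in both cases

def adult_content_score(video_path: str) -> float:
    """Hypothetical stand-in for a frame-level adult-content classifier."""
    raise NotImplementedError("plug in a real adult-content model here")

def deepfake_score(video_path: str) -> float:
    """Hypothetical stand-in for a multi-layer deepfake detector."""
    raise NotImplementedError("plug in a real deepfake detection model here")

def scan(video_path: str, adult_threshold: float = 0.8) -> ScanReport:
    # Stage 1: adult-content gate. If the score crosses the (hypothetical)
    # threshold, the video is flagged so it is never displayed to the user
    # and never exposed to other users through the report output.
    is_adult = adult_content_score(video_path) >= adult_threshold

    # Stage 2: deepfake analysis runs regardless, so the user is still
    # told how likely the video is to be synthetic.
    fake_probability = deepfake_score(video_path)

    return ScanReport(contains_adult_content=is_adult,
                      deepfake_probability=fake_probability)

The point of the structure is simply that the two checks are independent: blocking the display of obscene material does not prevent the deepfake verdict from reaching the user.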
Whether pornographic deepfake images circulate online or victims submit to blackmail, the price to be paid will be too high. What if adult-content deepfake media, which can be produced so easily to serve dark intentions, soon gets out of control? What becomes of the world when social lynching attempts built on social engineering, aimed at abuse, harassment and blackmail, eliminate common sense, justice and conscience? And with this picture before us, how convincing can it be to talk about fighting deepfakes without first curbing the abasement of pornography?