Victims of nonconsensual deepfake porn are using the laws of copyright to take back ownership of their likenesses, according to a new investigation.
In an analysis of copyright claims against websites known to share nonconsensual, digitally altered videos, WIRED discovered thousands of women (including streamers, gamers, and other popular content creators) filing complaints with Google demanding the content be taken down.
The publication documented more than 13,000 copyright claims (covering almost 30,000 URLs) against dozens of sites that populate Google search results.
Victims are utilizing the Digital Millennium Copyright Act (DMCA), which is frequently weaponized to remove copyrighted music, videos, and other media from third-party sites (and personal pages) online. The DMCA has also been used on behalf of victims of image-based sexual abuse or "revenge porn," with cases citing personal authorship and the unauthorized use of images.
A deepfake creator's alteration or outright fabrication of original images does complicate the matter, imposing a higher burden of proof on victims claiming rights over intellectual property.
Google has previously addressed the spread of revenge porn and deepfakes with new policies and reporting procedures, including options to remove personal explicit images from search results and deepfake reporting systems involving the detection of both original and copied images. The company has also documented its efforts to flag and remove such content. According to Google's own data, around 82 percent of complaints resulted in URL removal. "For the biggest deepfake video website alone," WIRED reported, "Google has received takedown requests for 12,600 URLs, 88 percent of which have been taken offline."
The sheer number of confirmed violations has prompted online safety and copyright advocates to wonder why the websites are still allowed to remain up. "If you remove 12,000 links for infringement, why are they not just completely removed?" posed Dan Purcell, founder and CEO of piracy protection firm Ceartas, in a WIRED interview. "They should not be crawled. They're of no public interest."
The copyright strategy is a legal workaround for victims as government leaders crawl forward with proposed legislation that would criminalize the spread of "sexualized digital forgeries."
Known as the DEFIANCE (Disrupt Explicit Forged Images and Non-Consensual Edits) Act, the legislation also outlines a civil path for victims to sue the creators of deepfake images using their likeness.
"Victims of nonconsensual pornographic deepfakes have waited too long for federal legislation to hold perpetrators accountable. As deepfakes become easier to access and create — 96% of deepfake videos circulating online are nonconsensual pornography — Congress needs to act to show victims that they won’t be left behind,” wrote Congresswoman Alexandria Ocasio-Cortez upon the bill's introduction to the House.
In February, hundreds of AI leaders — joined by academics, researchers, artists, and even politicians — issued an open letter calling for the prioritization of deepfake legislation. The coalition called for a bill that would fully criminalize deepfake child pornography, establish criminal penalties for anyone knowingly involved in creating or spreading harmful deepfakes, and place requirements on software developers and distributors, Mashable's Meera Navlakha reported.
The letter cited the limits and inadequacies of current legislation to address deepfakes specifically, as well as the sheer increase in deepfake technologies and output. "Unprecedented AI progress is making deepfake creation fast, cheap, and easy. The total number of deepfakes has grown by 550 percent from 2019 to 2023," the coalition wrote.
Explicit deepfakes of celebrities are top of mind for many, following the spread of nonconsensual images of Taylor Swift on X and the recent discovery of deepfake porn ads using the likeness of actor Jenna Ortega.
But the problem is just as worrisome for non-famous individuals. Deepfake images are increasingly entering the social lives of young children and teens, prompting online child safety experts to call for preventative measures and heightened attention from parents.
In February, a group of California middle school students used deepfake technology to create and disseminate nude images of their classmates, just the latest instance among minors who seem to be getting younger and younger. Various court cases have laid down the verdict on victim recourse, with few to no laws to guide them.
"Deepfake pornography is a form of digital sexual violence. It violates victims' consent, autonomy, and privacy," wrote Sexual Violence Prevention Association (SVPA) founder Omny Miranda Martone in support of the DEFIANCE Act. "Victims face increased risk of stalking, domestic abuse, loss of employment, damaged reputation, and emotional trauma."
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.