Victims of nonconsensual deepfake porn are using copyright law to take back ownership of their likenesses, according to a new investigation.
In an analysis of copyright claims against websites known to share nonconsensual, digitally altered videos, WIRED discovered thousands of women (including streamers, gamers, and other popular content creators) filing complaints with Google demanding the content be taken down.
The publication documented more than 13,000 copyright claims, covering almost 30,000 URLs, against dozens of sites that surface in Google search results.
Victims are utilizing the Digital Millennium Copyright Act (DMCA), which is frequently weaponized to remove copyrighted music, videos, and other media from third-party sites (and personal pages) online. The DMCA has also been used on behalf of victims of image-based sexual abuse or "revenge porn," with cases citing personal authorship and the unauthorized use of images.
A deepfake creator's alteration or outright fabrication of original images does complicate the matter, imposing a higher burden of proof on victims claiming rights over the intellectual property.
Google has previously addressed the spread of revenge porn and deepfakes with new policies and reporting procedures, including options to remove personal explicit images from search results and deepfake reporting systems that detect both original and copied images. The company has also documented its efforts to flag and remove such content. According to Google's own data, around 82 percent of complaints resulted in URL removal. "For the biggest deepfake video website alone," WIRED reported, "Google has received takedown requests for 12,600 URLs, 88 percent of which have been taken offline."
The sheer number of confirmed violations has prompted online safety and copyright advocates to wonder why the websites are still allowed to remain up. "If you remove 12,000 links for infringement, why are they not just completely removed?" asked Dan Purcell, founder and CEO of piracy protection firm Ceartas, in a WIRED interview. "They should not be crawled. They're of no public interest."
The copyright strategy is a legal workaround for victims as government leaders crawl forward with proposed legislation that would criminalize the spread of "sexualized digital forgeries."
Known as the DEFIANCE (Disrupt Explicit Forged Images and Non-Consensual Edits) Act, the legislation also outlines a civil path for victims to sue the creators of deepfake images using their likeness.
"Victims of nonconsensual pornographic deepfakes have waited too long for federal legislation to hold perpetrators accountable. As deepfakes become easier to access and create — 96% of deepfake videos circulating online are nonconsensual pornography — Congress needs to act to show victims that they won’t be left behind,” wrote Congresswoman Alexandria Ocasio-Cortez upon the bill's introduction to the House.
In February, hundreds of AI leaders — joined by academics, researchers, artists, and even politicians — issued an open letter calling for the prioritization of deepfake legislation. The coalition called for a bill that would fully criminalize deepfake child pornography, establish criminal penalties for anyone knowingly involved in creating or spreading harmful deepfakes, and place requirements on software developers and distributors, Mashable's Meera Navlakha reported.
The letter cited the limits and inadequacies of current legislation to address deepfakes specifically, as well as the sheer increase in deepfake technologies and output. "Unprecedented AI progress is making deepfake creation fast, cheap, and easy. The total number of deepfakes has grown by 550 percent from 2019 to 2023," the coalition wrote.
Explicit deepfakes of celebrities are top of mind for many, following the spread of nonconsensual images of Taylor Swift on X and the recent discovery of deepfake porn ads using the likeness of actor Jenna Ortega.
But the problem is just as worrisome for non-famous individuals. Deepfake images are increasingly entering the social lives of young children and teens, prompting online child safety experts to call for preventative measures and heightened attention from parents.
In February, a group of California middle school students used deepfake technology to create and disseminate nude images of their classmates, the latest in a string of incidents involving increasingly young minors. Courts have been left to decide what recourse victims have, with few to no laws to guide them.
"Deepfake pornography is a form of digital sexual violence. It violates victims' consent, autonomy, and privacy," wrote Sexual Violence Prevention Association (SVPA)founder Omny Miranda Martone in support of the DEFIANCE Act. "Victims face increased risk of stalking, domestic abuse, loss of employment, damaged reputation, and emotional trauma."
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.