Abstract
2024 Louise A. Halper Award Winner for Best Student Note.
Deepfakes have become popular because they are accessible and easy to use: anyone can create one by installing deepfake software on a phone or laptop. These programs let creators produce hyper-realistic multimedia featuring anyone whose image they can find. Some industries have put deepfakes to positive uses; however, deepfakes also cause harms that can have detrimental effects on victims’ mental health, employment, and reputation. Women and children, including those without a large online presence, have become targets of nonconsensual pornographic deepfakes. Congress has yet to pass a federal bill that encourages online service providers to properly regulate their platforms and prevent the spread of nonconsensual pornographic deepfakes. In the absence of federal law, some states have attempted to address the issue by imposing criminal and civil penalties. But current state laws do not go far enough in protecting women and children because they are reactive, reaching deepfakes only after the harm has already occurred. This Note discusses federal bills and amendments as possible solutions, but further proposes that a public-private partnership with deepfake software program creators would offer the most effective solution.
A public-private partnership with deepfake software companies would serve as a preventative measure while also ensuring clear legal recourse for victims.
Recommended Citation
Rena Song, Faking It: A Proposed Solution to Counter Nonconsensual Pornographic Deepfakes, 31 Wash. & Lee J. Civ. Rts. & Soc. Just. 157 (2025).
Available at: https://scholarlycommons.law.wlu.edu/crsj/vol31/iss1/6
Included in
Civil Rights and Discrimination Commons, Computer Law Commons, Human Rights Law Commons, Law and Gender Commons, Privacy Law Commons, Science and Technology Law Commons, Sexuality and the Law Commons