LISTEN: LSI Director Josh Abbott Discusses Deepfakes on Regulatory Transparency Project’s Fourth Branch Podcast

The Center for Law Science and Innovation’s Executive Director Josh Abbott recently sat down for a discussion on The Federalist Society’s Fourth Branch Podcast for the Regulatory Transparency Project. Abbott joined leading industry experts to discuss the potential harms deepfakes could bring. The discussion was moderated by Kathryn Ciano Mauler, Product Counsel at Google.

Bobby Chesney, the James A. Baker III Chair in the Rule of Law and World Affairs and associate dean for Academic Affairs at the University of Texas at Austin School of Law, and Matthew Feeney, director of the Project on Emerging Technologies at the Cato Institute, joined Abbott for the deepfake episode.

Deepfakes, and their potential for abuse, are one of the hottest topics in technology, and the concern grows as the technology becomes more realistic. But what is a deepfake? Feeney describes the technology “as a set of tools that include generators and discriminators, so detectives and counterfeiters.”

“The technology is trying to develop material while another set of the technology is trying to detect it,” said Feeney. “And the result is very realistic-looking fake media.”
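Feeney’s “detectives and counterfeiters” framing describes an adversarial training loop, the idea behind generative adversarial networks. As a loose illustration only — the numbers and update rules below are invented for this sketch and are not drawn from any real deepfake system — the dynamic can be written as a toy in a few lines of Python:

```python
import random

random.seed(0)

# Toy sketch of the adversarial dynamic Feeney describes: a
# "counterfeiter" (generator) and a "detective" (discriminator)
# improve by competing. Real deepfake systems use deep neural
# networks; here each player is just a single number.

REAL_MEAN = 5.0   # the "real" data: samples centered near 5
gen_mean = 0.0    # the generator starts producing obvious fakes
threshold = 0.0   # the discriminator's decision boundary

for _ in range(1000):
    fake = random.gauss(gen_mean, 1.0)
    real = random.gauss(REAL_MEAN, 1.0)

    # Detective: shift the boundary toward the midpoint of what it sees,
    # so it keeps separating current fakes from real samples.
    threshold += 0.05 * ((fake + real) / 2 - threshold)

    # Counterfeiter: nudge output toward whatever the detective accepts.
    gen_mean += 0.05 * (threshold - fake)

print(f"generator mean ends near {gen_mean:.1f}")
```

The point is only the feedback loop: each update by one player moves the target the other is chasing, which is why the resulting fakes keep getting harder to detect.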

A recent, popular example of this technology is the series of Tom Cruise videos created with deepfake tools. In fact, many celebrities have had their likenesses used this way. While most of these videos are not a cause for concern, the technology could also produce videos that negatively impact the person whose likeness is being used.

Prof. Chesney cites specific examples of the harm the technology can inflict on unsuspecting victims. First writing about the subject in 2018, Prof. Chesney and co-author Danielle Citron “observed the emergence of sexually explicit deepfakes.”

“The term itself is a play on deep throat,” said Prof. Chesney, “originating from that period where there was a Reddit community that was beginning to use an early version of the technology to create pornography.”

Prof. Chesney and Citron sat down to map out the full range of benefits and harms this technology could bring as it became more accessible. One issue is the potential to frame people for things they never said. On the other hand, Prof. Chesney notes, there is the opposite problem: people who have been caught on camera saying damaging things can deny it by claiming the footage is fake. Prof. Chesney points to examples of this phenomenon from around the world.

Abbott emphasizes the power of this new technology and notes that, like all powerful tools, it can have harmful effects when used inappropriately.

“The point is that when more powerful tools come along, how does the law respond to their use in things that we want to prevent anyway?” said Abbott.

“So for example, the use of certain kinds of weapons in the commission of crimes like assault or something could be an aggravating factor. And the purpose of that is not so much that — it’s not that it’s a separate crime. It’s that what we’re really trying to do is prevent the assault. We’re trying to prevent the harm.”
