Release Date: November 20, 2023
BUFFALO, N.Y. – The last time University at Buffalo artificial intelligence expert David Doermann testified before Congress, in 2019, he warned lawmakers about the dangers of deepfakes and other synthetic media.
Since then, the threat has only grown, Doermann said during his latest appearance on Capitol Hill.
Speaking Nov. 8 to members of the House Oversight Subcommittee on Cybersecurity, Information Technology, and Government Innovation, Doermann again urged lawmakers to invest more resources into ensuring the technology is not misused.
“As these technologies advance at an unprecedented rate, it is crucial to recognize the potential for both positive and negative implications. Every week, we hear of its use at both ends of the spectrum,” he said.
“This week, we heard about AI being used to finish a new Beatles song on one hand and to generate nude images of classmates by a high schooler in New Jersey on the other. Despite our president's executive orders and the testimony of our thought and business leaders, we are not moving fast enough to curtail the continued damage this technology is doing and will continue to do as it evolves,” he said.
A SUNY Empire Innovation Professor and interim chair of the Department of Computer Science and Engineering, Doermann elaborated on the many ways in which synthetic or manipulated digital content can cause harm.
“Not only has it been used in non-consensual pornography, cyberbullying, and harassment, causing severe harm to individuals, the potential national security implications are grave. Deepfakes can be exploited to impersonate government officials, military personnel, or law enforcement, leading to misinformation and potentially dangerous situations,” said Doermann, who before arriving at UB worked for the Defense Advanced Research Projects Agency (DARPA), where he oversaw the agency’s media forensics program and other programs related to the use of human language technologies.
As deepfake technology becomes more sophisticated, Doermann advocated for federal policies that govern its use.
“I urge you to consider legislation and regulations to address the misuse of deepfake technology. Striking the right balance between free speech and safeguards to protect against malicious uses of deepfakes is essential,” he said. “First and foremost, public awareness and digital literacy programs are vital in helping individuals recognize deepfakes and false information.”
At UB, researchers are tackling this problem with federal support. Examples include the Center for Information Integrity, whose researchers are developing tools to help older adults and children spot online deceptions, as well as work by researchers on the DARPA Semantic Forensics program.
The challenges facing society, Doermann said, are complex and pervasive enough that federal resources alone will not alleviate them.
“Collaboration between Congress and technology companies is essential to address the challenges posed by deepfakes. Tech companies are responsible for developing and implementing policies to detect and mitigate deepfake content on their platforms,” he said. “More robust privacy and consent laws are needed to protect individuals from having their likeness and voice used in deepfake content without their permission. Continued research and development in AI and deepfake technology are necessary, as is funding for initiatives to counter deepfake misuse.”
Doermann’s full testimony is available on the House Oversight Subcommittee on Cybersecurity, Information Technology, and Government Innovation’s website.
The University at Buffalo has been a worldwide leader in artificial intelligence research and education for nearly 50 years. This includes pioneering work creating the world’s first autonomous handwriting recognition system, which the U.S. Postal Service and Royal Mail adopted to save billions of dollars. As New York’s flagship university, that legacy of innovation continues today. UB researchers are committed to using AI for social good, including developing new technologies that address the shortage of speech-language pathologists in K-12 education, detect deepfakes, improve medical imaging and more.
Cory Nealon
Director of Media Relations
Engineering, Computer Science
Tel: 716-645-4614
cmnealon@buffalo.edu