Deepfakes Casting Doubt: When Evidence’s Authenticity is Questioned 2023

Product ID: CA3567R
Presented By: State Bar of Wisconsin PINNACLE

Co-produced by the Center for Integrity in Forensic Sciences and State Bar of Wisconsin PINNACLE

Seeing isn’t always believing

In several recent high-profile cases, including the Kyle Rittenhouse trial in Kenosha, defendants have challenged the authenticity of video evidence by arguing that the videos could have been altered by artificial intelligence (AI).1 The increasing prevalence and sophistication of deepfake media opens the door for litigants to call into question traditionally accepted video and photo evidence. The existence of deepfakes makes it easier for parties to challenge the authenticity of genuine evidence, whether they have a reasonable basis for the challenge or not. Deepfakes also increase the chance that believable but falsified evidence could be relied upon to justify an unjust outcome.

At Deepfakes Casting Doubt: When Evidence’s Authenticity is Questioned, you’ll get an introduction to deepfake technology and hear about real cases where video evidence has been challenged on deepfake grounds. Our highly experienced panel will walk you through the rules of evidence and professional conduct that apply to authenticating evidence in any case where media evidence is challenged. You’ll also pick up best practices for handling challenges to the authenticity of media evidence.

Interested in sponsoring this program? Find out more.

Select a Format

OnDemand seminar

Pricing

Member $99.00

Non-Member $149.00

Credits

1 CLE

Date and Time

Friday, September 22, 2023, 12:00 PM - 1:00 PM CT

The Honorable Zia Faruqui has served as a federal magistrate judge for the District of Columbia since September 2020. During that time, he has presided over hundreds of cases related to the January 6th breach of the U.S. Capitol. Prior to his judicial appointment, Judge Faruqui was a federal prosecutor, first in the United States Attorney’s Office for the Eastern District of Missouri and then in the District of Columbia. Over twelve years of federal service as an AUSA, he prosecuted numerous cases, many of which centered on the nexus between financial crimes and national security. His casework included counter-terror-finance actions, including denial-of-service attacks against, and takeovers of, websites used by ISIS and Al Qaeda to collect cryptocurrency. He also led the takedown of the largest-ever darknet site dedicated to child exploitation, which was funded by cryptocurrency. Judge Faruqui is a board member of Jobs for Homeless People, a non-profit that provides housing and vocational training to people in the D.C. metropolitan area. He has also served as a Muslim-outreach coordinator for the Department of Justice and as an adjunct professor at Harris-Stowe State University, where he taught classes on criminal rehabilitation, and at Georgetown University, where he teaches constitutional law.

Matthew F. Ferraro was a counsel at WilmerHale until late August 2023, advising clients on matters related to defense and national security, cybersecurity, emerging technologies, and crisis management. In private practice, Mr. Ferraro counseled clients, wrote, and spoke on artificial intelligence issues and the threat that digital disinformation and deepfakes pose to corporations, brands, and markets. Following his service at WilmerHale, he joined the Department of Homeland Security as Senior Counselor to the Secretary for Cybersecurity and Emerging Technology.

Brent J. Gurney is a partner at WilmerHale. He is a first-chair trial lawyer with over 30 years of significant jury trial and other court experience in challenging civil and criminal cases throughout the United States. Mr. Gurney’s experience includes high-profile matters involving complex commercial disputes, intellectual property disputes, white collar criminal defense, and government investigations into a wide variety of matters, including the False Claims Act, government contracts, health care fraud, conspiracy, and wire and mail fraud. Mr. Gurney is a Fellow of the American College of Trial Lawyers and Vice Chair of its Complex Litigation Committee. He is regularly listed in Best Lawyers in America for Commercial Litigation and White Collar Criminal Defense and in Washington, D.C. Super Lawyers.

Natalie Li is a senior associate at WilmerHale. She focuses her practice on complex commercial litigation, with an emphasis on contractual and intellectual property disputes, including copyright, trademark, and trade secret claims. She has represented clients from a broad range of industries, including pharmaceuticals, technology, education, entertainment, and financial services, in federal court, state court, and arbitration. She also regularly counsels clients, writes, and speaks on the intersection of copyright and artificial intelligence.

Program Objectives

  • Understand how rapidly evolving deepfake technology can play a role in your cases
  • Hear about recent cases where parties have raised the potential of deepfake evidence
  • Be prepared to address allegations of deepfake evidence in court
  • Review the relevant rules of ethics and evidence applicable to authenticating evidence

Who Should Attend

  • Criminal defense lawyers
  • Prosecutors
  • Litigators
  • Judges
  • Paralegals

The Going Deep Seminar Series takes you inside the fascinating world of deepfake technology and offers insights into the legal implications for intellectual property, evidentiary issues, and other areas of law. You’ll learn to recognize, account for, and mitigate risks posed by deepfake technology at these informative webcasts.

Looking for similar content with a criminal law focus? Check out the 2024 Forensic Justice Institute in January. Produced by the Center for Integrity in Forensic Sciences and State Bar of Wisconsin PINNACLE, the Forensic Justice Institute gathers renowned experts in forensics and nationally known speakers to present an unbiased look at the current scientific methods and results often submitted as evidence. Understand the reliability of the methods used so you can ensure your next criminal case rests on science fact, not science fiction, and so you can help improve the use of forensic science in the criminal justice system.
