Is Seeing Still Believing? The Rise of Deepfakes Sheds Light on Evidence Authenticity

By Mallory Fernandes

Images have the remarkable ability to encapsulate a moment in time, providing compelling and almost irrefutable evidence. Accordingly, images can be powerful evidentiary tools because people tend to accept them at face value as a true reflection of what occurred the moment the image was captured. However, with the advent of deepfake technology and advances in artificial intelligence, it is now possible to manipulate and create realistic images or videos depicting events that never actually happened.1
Deepfake technology can be used to create images or videos of real people doing or saying things they never actually did or said.2 Of particular concern, this advanced deepfake software is readily accessible to the general public to download straight from the internet, and, in the absence of any regulatory control, people are free to circulate any deepfake-generated images and videos they create. Deepfake technology is advancing at an alarming rate, making it increasingly difficult to distinguish deepfake from genuine images or videos, which raises the question: Is seeing still believing?

Deepfakes already affect our daily lives, whether we are aware of them or not, because of their increasing prevalence in areas such as social media, politics, and art.3 Given the relentless pace of technological advancement, it is safe to say that deepfakes are here to stay, and we are left with no choice but to adapt or let the ramifications of deepfake technology wreak havoc in our world. Accordingly, it is only a matter of time before the implications of deepfake technology appear more frequently in the legal system.4

The foreseeable implications of deepfakes in the legal system are profound and multifaceted. On one hand, highly convincing deepfakes can be used to cast doubt on the authenticity of legitimate evidence, sowing confusion and discord in courtrooms. It is also possible that, as a last-ditch effort, a party may attempt to introduce deepfake evidence solely to prevail in the suit. Legal professionals must now grapple with the challenge of verifying the veracity of visual evidence and ensuring that it is free from tampering or manipulation. On the other hand, deepfakes could be exploited to frame innocent individuals, further complicating the pursuit of justice. As technology continues to evolve, judges, juries, and legal experts must adapt to this new digital landscape to preserve the integrity of our legal system.

The implications of deepfake technology for evidence authenticity are a pressing matter with the potential to affect nearly every type of case that comes through the legal system. The authentication of evidence is fundamental to its admissibility.5 However, the threshold standard for authenticity is not particularly high. The standard is satisfied by “evidence sufficient to support a finding that the item is what the proponent claims it is.”6 The proponent of the evidence “need only make a prima facie showing of authenticity ‘so that a reasonable juror could find in favor of authenticity or identification.’”7 The effect of this relatively low standard in conjunction with deepfake technology can be twofold.8 Problems with authenticity will arise both when the proponent is trying to prove that the evidence is real and when trying to prove that the images are not deepfakes. Under this low standard, deepfake images are likely to be authenticated as genuine and admitted into evidence with ease unless the court system adopts a higher standard of authenticity or implements a process that identifies deepfake images before they ever enter the courtroom.

Congress is proactively taking steps to combat deepfake technology.9 A bill recently introduced in the House of Representatives would require any deepfake photograph to contain provenance technology identifying that the image was created or altered using artificial intelligence; failure to comply would be punishable by a fine and imprisonment.10 Although this bill is certainly a step in the right direction, even if it were to become law, some people will inevitably fail to comply, making secondary defense measures necessary to prevent AI-doctored images from entering the court system.

Accordingly, to deter and combat the use of deepfake technology in the courtroom, members of every facet of the legal system must be educated about, and alert to, the risks that can arise. Attorneys should be taught to be mindful when a client becomes overly adamant about presenting a particular image or video, as this can be a red flag that the evidence may be altered. Moreover, attorneys should receive baseline training on how to spot AI-altered evidence so that they can prevent such material from ever entering the courtroom.

All in all, deepfake technology is on the rise, and it is inevitable that this technology will find its way into the court system. It is crucial that all members of the legal system be armed with knowledge of deepfake detection to preserve the sanctity of the court’s evidentiary process and ensure that justice is properly served.

References:

1 Danielle S. Van Lier, The People vs. Deepfakes: California AB 1903 Provides Criminal Charges for Deepfakes Activity to Guard Against Falsified Defaming Celebrity Online Content, 43 L.A. Law. 16 (May 2020).

2 Id.

3 S. Karnouskos, Artificial Intelligence in Digital Media: The Era of Deepfakes, 1 IEEE Transactions on Technology and Society, no. 3, 138–147 (Sept. 2020).

4 Blake A. Klinkner, What Attorneys Should Know About Deepfakes, 46 Wyo. Law. 38 (June 2023).

5 Fed. R. Evid. 401, 402.

6 Fed. R. Evid. 901(a).

7 United States v. Workinger, 90 F.3d 1409, 1415 (9th Cir. 1996).

8 Riana Pfefferkorn, “Deepfakes” in the Courtroom, 29 B.U. Pub. Int. L.J. 245 (2020).

9 DEEPFAKES Accountability Act, H.R. 5586, 118th Cong. (2023).

10 Id.
