Ken Regum

On AI and Judicial Rules on Evidence

I watched the Senate hearing on disgraced self-proclaimed Son of God Quiboloy and was struck by one of his comments when he was confronted with a purported audio recording of his voice. He said, "Kailangan i-authenticate kasi ngayon, malakas ang AI." (It needs to be authenticated since AI is powerful nowadays.)

How do the current rules of evidence in Philippine courts of justice handle AI and deepfakes? Is there a need for specialized rules given the rise of AI, or are the current rules sufficient?

Let's say that you have a picture of A caught in the act of stealing milk from a grocery store. In the Philippines, the party offering the picture in evidence must also offer someone called a "testimonial sponsor" who will authenticate the picture.

Rule 132, Section 20 of the Rules of Court states:

Proof of private document. — Before any private document offered as authentic is received in evidence, its due execution and authenticity must be proved either: 

 (a)  By anyone who saw the document executed or written...

As an aside, pictures are considered documents under Philippine jurisprudence. Likewise, Section 1, Rule 11 of the Rules on Electronic Evidence states:

Audio, video and similar evidence. – Audio, photographic and video evidence of events, acts or transactions shall be admissible provided it shall be shown, presented or displayed to the court and shall be identified, explained or authenticated by the person who made the recording or by some other person competent to testify on the accuracy thereof.

The testimonial sponsor of a picture is usually the photographer of said picture.

Can someone else authenticate the picture on the theory that the photo is an "accurate portrayal of reality as they saw it," or because they happened to see the photographer take the picture?

No. According to the Supreme Court in People vs. Manansala, G.R. No. 233104, 02 September 2020, a person competent to testify on the accuracy of a video or CCTV footage must be able to establish the source of the recording, how it was transferred to a storage medium, and how such medium was subsequently presented to the trial court. I believe the same applies to photographic evidence.
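In practice, that kind of testimony is easier to give when the integrity of the file has been documented from the start. A common habit in digital forensics, though nothing in the Rules requires it, is to compute a cryptographic hash of the file the moment it is copied off the camera or recorder, then hash it again before presentation, so the sponsor can say the copy shown in court is bit-for-bit identical to the original. The sketch below is only a generic illustration of that practice; the file names are hypothetical.

```python
# Minimal sketch of hash-based integrity checking for a digital exhibit.
# The file names are hypothetical placeholders, not taken from the Rules or the case.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 digest of a file, read in chunks to handle large videos."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hash the file when it is first copied from the source device and record the
# value in the chain-of-custody log; hash it again before it goes to court.
original_hash = sha256_of("exhibit_a_from_camera.jpg")
presented_hash = sha256_of("exhibit_a_for_court.jpg")
print("Identical copy:", original_hash == presented_hash)
```

Matching hashes do not prove the scene in the picture is real, only that the copy on the record is the same file that was collected, which is precisely what the Manansala-style testimony about source, transfer, and presentation is meant to establish.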

The other party, however, may raise an objection that the picture is in fact not genuine. Who has the burden of proving whether the picture is genuine or a fraud?

Since the prosecution is offering the picture as evidence of A's guilt, they need to prove that the picture is genuine. With the advent of AI and deepfakes, do they need to prove that the same has not been altered?

The answer seems to be no. The burden of evidence shifts to the defense once the photographer states under oath that he took the picture and that it was not generated by AI.

Should the defense prove that the picture is AI-generated?

Yes, in the sense that the defense must now respond. Since the burden of evidence is now on the defense, they need to present countervailing evidence to create reasonable doubt in the mind of the court. They do not need to prove that the picture is fake; they can undercut it indirectly, for example by producing an alibi such as pictures or receipts showing that A was, in fact, not at the grocery store at the time of the theft.

What if the defense wishes to prove that the picture is AI-generated? Can they do so?

Yes, of course. Unfortunately, not just anyone can say, "Mayhaps it is generated by AI." An expert witness is required to prove that the picture is in fact AI-generated.

Is it possible not to call an expert witness?

Possibly. Rule 132, Section 31 of the Rules of Court states:

Alteration in document, how to explain. — The party producing a document as genuine which has been altered and appears to have been altered after its execution, in a part material to the question in dispute, must account for the alteration. He may show that the alteration was made by another, without his concurrence, or was made with the consent of the parties affected by it, or was otherwise properly or innocently made, or that the alteration did not change the meaning or language of the instrument. If he fails to do that, the document shall not be admissible in evidence. (32a)

If the defense can show that there is some alteration in the picture (perhaps the hands are not quite right), the offering party must account for the alteration; if they fail to do so, the court can rule that the picture is inadmissible for being altered.
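To make "show that there is some alteration" a little more concrete: one technique an examiner might demonstrate is error level analysis (ELA), where a JPEG is re-saved at a known quality and compared against the exhibit, since regions that were pasted in or regenerated often recompress differently from the rest of the image. The sketch below uses the Pillow library and a hypothetical exhibit file name; it is a rough illustration of the idea, not a validated forensic tool, and interpreting the output is exactly the kind of thing the expert witness discussed above would do.

```python
# Rough error level analysis (ELA) sketch using Pillow ("pip install Pillow").
# "exhibit_a.jpg" and quality=90 are hypothetical choices for illustration only.
from PIL import Image, ImageChops, ImageEnhance

original = Image.open("exhibit_a.jpg").convert("RGB")
original.save("exhibit_a_resaved.jpg", "JPEG", quality=90)
resaved = Image.open("exhibit_a_resaved.jpg")

# Pixel-wise difference between the exhibit and its re-saved copy.
diff = ImageChops.difference(original, resaved)

# Scale the faint differences up so compression inconsistencies become visible.
max_channel_diff = max(hi for _, hi in diff.getextrema()) or 1
ela = ImageEnhance.Brightness(diff).enhance(255.0 / max_channel_diff)
ela.save("exhibit_a_ela.png")  # unusually bright regions may indicate local edits
```

Even a clean-looking ELA map proves little on its own, which is why the expert's testimony, and not the tool, carries the weight in court.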

Is there a need to change the rules?

I don't believe so at this moment. I believe our authentication procedure holds up against deepfakes just as it held up against photoshopped images, altered audio recordings, and even ephemeral electronic evidence such as text and chat messages.

If there is one change worth making, it is to place electronic evidence on the same footing as extrajudicial confessions: merely circumstantial, and insufficient on its own unless corroborated by other pieces of evidence.

Rule 133 of the Rules of Court states:

SEC. 3. Extrajudicial confession, not sufficient ground for conviction.— An extrajudicial confession made by an accused, shall not be sufficient ground for conviction, unless corroborated by evidence of corpus delicti.

SEC. 4.  Circumstantial evidence, when sufficient.— Circumstantial evidence is sufficient for conviction if: 

(a)  There is more than one circumstance;

(b)  The facts from which the inferences are derived are proven; and

(c)  The combination of all the circumstances is such as to produce a conviction beyond reasonable doubt.

To say that AI is a big step in technology is an understatement (as I wrote before, it has a huge effect on my field), but the judiciary appears to have technology-neutral rules in place to temper the effects of deepfakes and other forgeries made possible by AI. Let us see how it rolls out in the future.


#law #thoughts