Deepfakes in Utah Divorce and Child Custody Disputes: Not If But When

A few years ago, the concern was fake law: lawyers citing AI-generated cases that didn't exist. That already happened in Utah. See Garner v. Kadince.

The next concern is worse: fake facts.

Not exaggerated facts. Not misleading facts. Entirely fabricated audio and video that look and sound real. In divorce and child custody disputes, where a single recording—if it’s bad enough (or “good enough” depending upon how you view it)—can shift a case overnight, that’s not a theoretical risk, it’s an inevitability.

If you think a convincing video of a parent screaming at a child or a spouse caught in flagrante at a seedy motel won’t move a judge, you’re wrong. It will. The question is whether anyone in the room knows how to challenge it before the damage is done.

What This Looks Like in a Real World Case

In high-conflict divorce and child custody disputes, the incentive to "create" evidence is at an all-time high. These aren't the grainy fakes of the past. When a litigant feels their back is against the wall, the temptation to use generative AI can be all but overwhelming.

Strip this down to something practical. If a judge sees a video of a parent striking a child, the emotional impact is devastating. It is nearly impossible to “un-see” that evidence once it’s in the record. The court is paying attention.

The law must act as a cold, analytical filter. Now what?

If your response is “that’s fake,” you’re falling short. Courts don’t exclude evidence because it might be fake. They exclude it because you can show it cannot be trusted under the rules of evidence. That’s where the law matters.

Defense Against the Dark Arts: The Utah Rules of Evidence

Rule 901: Authentication Is Necessary—but Not Sufficient in Itself

Under Utah Rule of Evidence 901, the proponent of the evidence has to show the item is what he or she claims it is. Traditionally, that’s easy: “I recorded this on my phone” or “This accurately reflects what happened.”

In the age of AI and deepfakes, however, that's no longer enough, and courts haven't fully caught up yet. A witness saying "this is real" does not make it real. It just checks a box.

Rule 901 is the first line of defense. It requires the person offering the evidence to prove the item is actually what they claim it to be. If one spouse produces a recording that the other spouse claims never happened, the court looks at "distinctive characteristics" under Rule 901(b)(4). Challenging the "chain of custody" of a digital file now lets you demand proof of exactly how and when it was created.

The real pressure point under Rule 901 is how the file came to exist: Where was it created? On what device? When? With what software? If those answers are vague, inconsistent, or unsupported, authentication starts to crack.

These are not arguments you make at the end of the hearing or trial. You attack early, before the recording becomes the lens through which the judge or commissioner sees the entire case.

Rule 707: Machine-Generated Evidence

Utah has something many jurisdictions don’t: Utah Rule of Evidence 707. This rule deals with machine-generated output, exactly where deepfakes live. Rule 707 places a gatekeeper burden on the judge to ensure digital evidence is scientifically verifiable, not just “convincing” to the eye.

A deepfake is not just a recording. It’s the output of a system that processes data and generates a synthetic result. That should, and under Rule 707 does, trigger a higher level of scrutiny. What system created this? Is it reliable? Can its output be verified or tested?

Most lawyers aren't using Rule 707 yet, or aren't using it as they should. Properly framed, it shifts the burden from "prove it's fake" to "prove this machine-generated output is reliable enough to trust."

That’s a much harder burden for the proponent to carry.

Rule 1002: The Requirement of the Original

If there's one rule that matters most here, it's Utah Rule of Evidence 1002, the best evidence rule. In the digital world, it functions as a "Metadata Rule": the "original" is the file containing the digital DNA.

If a spouse produces a video but cannot produce the original device, or if the metadata has been scrubbed, that is a massive red flag for the court.

Rule 1002 contains the “requirement of the original.” In a digital context that means:

  • the original file
  • with intact metadata
  • traceable to the device that created it
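In practice, "traceable to the device" is often established by showing the produced copy is bit-for-bit identical to the file pulled from the device itself, typically by comparing cryptographic hashes. A minimal sketch in Python, using the standard library's hashlib module (the file paths are hypothetical):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks
    so large video files don't need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical comparison: the file produced in discovery versus
# the file extracted directly from the recording device.
# produced = sha256_of(Path("produced_in_discovery.mp4"))
# device_copy = sha256_of(Path("extracted_from_device.mp4"))
# if produced != device_copy:
#     print("Digests differ: the produced file is not a true copy.")
```

If the digests match, the produced file is an exact copy of what sits on the device; if they differ, even by one bit, something was altered between device and courtroom.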

Metadata is the audit trail:

  • creation date and time
  • device identifiers
  • GPS coordinates of where the video was filmed
  • software used to export or edit the file

Here's the reality: fabricated evidence often falls apart here. If the other side produces a clipped video with no original file, no device, and scrubbed metadata, you're not arguing about credibility anymore. You're arguing that the evidence does not meet the threshold for admissibility. That's a different fight, and a better one.
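Screening for those gaps is mechanical once the metadata has been extracted. A minimal sketch, assuming the metadata is already in a Python dictionary (the field names here are illustrative, not a real tool's schema; actual extraction would use a forensic tool such as ExifTool):

```python
# Fields a reviewer would typically expect in an unaltered camera file.
# These names are hypothetical placeholders for illustration.
EXPECTED_FIELDS = ("creation_time", "device_make", "device_model", "software")

def metadata_red_flags(meta: dict) -> list[str]:
    """Return human-readable red flags for missing or empty metadata."""
    flags = []
    for field in EXPECTED_FIELDS:
        if not meta.get(field):
            flags.append(f"missing or empty field: {field}")
    return flags

# Example: a file whose identifying metadata has been scrubbed.
scrubbed = {"creation_time": "", "software": "unknown-exporter"}
print(metadata_red_flags(scrubbed))
```

Each flag the function returns is a concrete question the proponent must answer before the file ever reaches the judge.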

How This Actually Gets Litigated

This is where most discussions stop. This is also where cases are won or lost.

If you suspect a deepfake, you:

  • move to exclude the evidence before the hearing,
  • demand production of the original file and recording device,
  • lock the other side into a clear position on how the evidence was created,
  • and, if necessary, bring in a digital forensics expert early, not as a last-minute scramble.

Handled correctly, the issue becomes procedural, not emotional. Handled poorly, the judge sees the video first and asks questions later. By then, the damage is done.

The Risk That Few Talk About

There’s a flip side to this.

Accusing genuine evidence of being fake—and being wrong—can hurt you too. Courts do not respond well to speculative attacks on evidence. If you raise the issue, you need a basis:

  • gaps in chain of custody,
  • missing metadata,
  • inconsistencies,
  • unexplained edits.

This isn’t a bluff. It’s a technical argument that must be grounded in facts.

Where This Is Headed

Utah judges are not technologists. Like everyone else, they can be fooled (and persuaded) by what looks real. That's not a criticism, just reality. What does this mean? If you don't know how to scrutinize digital evidence, you're not just behind, you're exposed.

The legal system will adapt. Rules like 901, 707, and 1002 give courts the tools to filter out fabricated evidence. But unless they’re used correctly, they’re not particularly useful tools.

If you suspect you are being framed by fabricated evidence, you must act immediately. This often requires hiring a digital forensics expert to identify “artifacts”—the microscopic glitches in AI-generated media that reveal its artificial nature. The goal is to exclude the evidence entirely before it can poison the well.

The court is not a tech lab, but now a competent lawyer must treat it like one.

Utah Family Law, LC | divorceutah.com | 801-466-9277