A defensible playbook for authenticating and challenging digital video in court
Contributed by Kris Carlson, COO, Former ICAC Commander, and Digital Forensics Investigator/Testifying Expert
Series context. Earlier installments have focused on how modern evidence travels from device to docket, with attention to tool validation, human expertise, and admissibility. This part turns to video, where metadata, compression, and custody decisions determine whether a clip persuades a jury or collapses under scrutiny. [1][23]
1) What makes video “authentic” now
A video file is more than pictures in motion. It is a container of data and metadata, recorded by a specific device and exported through a specific workflow. Under the Federal Rules of Evidence, authentication can be established by testimony or by the evidence’s characteristics, including the process by which it was produced. Rule 901 sets out the general requirement, and Rule 902 provides self-authentication pathways for electronic records and device-copied data when verified by a qualified certification. Together, these rules allow parties to prove authenticity through processes and hashes, not just eyewitness testimony. [3][2]
Three metadata families often tip the scales.
Timestamps. In formats such as MP4 and QuickTime, per-frame timing is stored separately from the encoded samples. When clocks drift or systems export a transcode rather than the native recording, timing can shift subtly, which matters for use-of-force windows and alibi claims. Understanding where timing lives helps counsel ask for the correct export.
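A minimal sketch of how an examiner might surface that timing data for comparison, assuming ffprobe (part of FFmpeg) is installed and on the PATH; the file name is a placeholder, not a reference to any real case:

```python
import json
import subprocess

def probe_timing(path: str) -> dict:
    """Dump container and per-stream timing metadata with ffprobe.

    Returns the parsed JSON so creation_time tags, stream time bases,
    and durations can be compared against independent references.
    """
    cmd = [
        "ffprobe", "-v", "quiet",
        "-print_format", "json",
        "-show_format", "-show_streams",
        path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return json.loads(out.stdout)

info = probe_timing("evidence_native_export.mp4")  # placeholder file name
print(info["format"].get("tags", {}).get("creation_time"))
for stream in info["streams"]:
    print(stream["codec_type"], stream.get("time_base"), stream.get("duration"))
```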
Geotagging and sensor trails. Many phones and action cameras embed location data as QuickTime keys, for example, ISO 6709-formatted coordinates, or store higher-frequency telemetry such as GPS, accelerometer, and gyroscope data in a parallel track, like GoPro’s GPMF. These fields, when preserved, can corroborate where and how fast a camera moved, or show that footage could not have been shot where a witness claims it was. [8][9]
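When present, the QuickTime location key stores a compact ISO 6709 string (for example, "+37.7858-122.4064+031.000/"). A minimal sketch of parsing that decimal-degrees form, using an illustrative value rather than data from any real matter:

```python
import re

# Compact ISO 6709 string as written by many phones, e.g. "+37.7858-122.4064+031.000/"
# (decimal-degrees form only; degrees-minutes variants need additional handling)
ISO6709 = re.compile(
    r"^(?P<lat>[+-]\d+(?:\.\d+)?)"
    r"(?P<lon>[+-]\d+(?:\.\d+)?)"
    r"(?P<alt>[+-]\d+(?:\.\d+)?)?"
    r"/?$"
)

def parse_iso6709(value: str):
    """Split a compact ISO 6709 coordinate string into latitude, longitude, altitude."""
    m = ISO6709.match(value.strip())
    if not m:
        raise ValueError(f"not an ISO 6709 coordinate string: {value!r}")
    lat, lon = float(m.group("lat")), float(m.group("lon"))
    alt = float(m.group("alt")) if m.group("alt") else None
    return lat, lon, alt

print(parse_iso6709("+37.7858-122.4064+031.000/"))  # (37.7858, -122.4064, 31.0)
```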
Compression patterns. Codecs such as H.264 store some frames as full images and others as predictions. The structure, including Group of Pictures cadence and residuals, leaves fingerprints. Re-encoding or frame removal tends to disturb that cadence, creating artifacts that examiners can detect. Courts do not need the math, but they do need a reliable method, documented error rates, and a qualified expert to explain the findings. [16][18]
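One common screening step is to chart the spacing between I-frames. A minimal sketch, again assuming ffprobe is available and using a placeholder file name; an irregular cadence is only a flag that merits deeper examination, not proof of editing:

```python
import json
import subprocess

def iframe_intervals(path: str) -> list[int]:
    """List the distances (in frames) between consecutive I-frames.

    A stable recorder typically produces a regular cadence; abrupt changes
    can indicate re-encoding or frame removal and warrant closer analysis.
    """
    cmd = [
        "ffprobe", "-v", "quiet",
        "-select_streams", "v:0",
        "-show_entries", "frame=pict_type",
        "-print_format", "json",
        path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    types = [f["pict_type"] for f in json.loads(out.stdout)["frames"]]
    i_positions = [i for i, t in enumerate(types) if t == "I"]
    return [b - a for a, b in zip(i_positions, i_positions[1:])]

print(iframe_intervals("surveillance_clip.mp4"))  # e.g. [30, 30, 30, 17, 30, ...]
```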
LCG perspective. Start by asking for native exports with hashes and an export report.
2) Detecting clipped or edited surveillance video
Edits can be crude, for example, a missing minute, or subtle, for example, a transcode that hides frame interpolation. A disciplined review follows a repeatable order:
- Establish provenance and exports. Demand the system’s native export, not a re-capture, and document the export tool and version. ISO 22311 defines minimum requirements for interoperable CCTV exports; if the system cannot produce a native file, record why and capture the complete environment for later replication. [7]
- Verify time base and clock accuracy. Compare the file’s creation times and frame timestamps with independent references, such as 911 CAD logs or radio time calls. Guidance for video canvassing stresses time validation during collection. [10]
- Hash originals, then work on verified copies. This preserves an audit trail consistent with ISO/IEC 27037, which emphasizes identification, collection, acquisition, and preservation (a minimal hashing sketch follows this list). [5]
- Correlate metadata and sensor tracks. Compare QuickTime location keys and, when available, GPMF telemetry. Mismatches between geotags and the claimed location are classic authenticity flags. [8][9]
- Document everything. Best practices and workflow standards require detailed notes on inputs, software, settings, and outputs so that another examiner can reproduce the results. [4][6]
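A minimal sketch of the hash-then-copy discipline described above, assuming SHA-256 and local placeholder paths:

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk: int = 1024 * 1024) -> str:
    """Compute a SHA-256 digest in chunks so large video files stay out of memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        while block := fh.read(chunk):
            digest.update(block)
    return digest.hexdigest()

original = Path("evidence/native_export.mp4")        # placeholder paths
working = Path("workspace/native_export_copy.mp4")

original_hash = sha256_of(original)                  # record in the examination notes
shutil.copy2(original, working)                      # analysis happens only on the verified copy
assert sha256_of(working) == original_hash, "working copy does not match original"
print(original_hash)
```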
LCG perspective. Ask opposing counsel how the clip left the recorder. If the answer is “we downloaded it from a portal,” drill down to determine whether the portal transcoded the file. A portal’s convenience features are often the reason the clip’s admissibility is in question.
3) Chain of custody, by platform
Body-worn cameras. Prosecutors and agencies now have guidance that pairs policy with evidence practice, including retention, audit trails, and restrictions on editing. Program materials and federal policies emphasize a system-level audit trail that records access, redactions, and exports. That audit trail frequently makes or breaks the authenticity foundation in close cases. [13][12]
Drones and sUAS. Drone incidents may generate multiple evidence sources, such as SD cards, internal storage, flight logs, controller devices, and cloud syncs. Guidance urges capturing all components of the UAS, documenting firmware versions, and preserving logs that bind video to GPS and barometer data. Law enforcement guidance also describes the broader enforcement context, which can inform a court’s expectations around collection. [14][15]
CCTV and VMS systems. Early, methodical steps matter: identify systems, validate and document date and time, export in native formats with the player, and record any limitations. When a system can export to an interoperability format, downstream analysis is improved. A standard examination workflow gives labs a shared blueprint for handling and reporting. [10][7][6]
Cross-cutting legal hooks. Under the best evidence rules, duplicates are admissible unless a genuine question is raised about the original’s authenticity. If your only copy is a screen recording when the original system could have produced a native export with hashes, you have invited a fight that was entirely avoidable. [19]
LCG perspective. Treat custodial platforms as evidence systems, not convenience drives. Configure time synchronization, restrict privileges, and enable automatic logging before an incident, so you do not need to retrofit the chain of custody after the fact.
4) Cross-examining a manipulated or misunderstood video
The goal is to move the court from “this looks real” to “this process is reliable,” or not.
Start with the rules. Rule 901 allows authenticity through distinctive characteristics and methods. Rule 902(13) and (14) let a certification stand in for a witness to authenticate process-generated records and device-copied data when the proponent gives proper notice. Use Sedona’s admissibility checklist to map your questioning to each gateway. [3][2][18]
Opening volley: chain and exports.
- Who performed the export, with what tool and version?
- What reports did the tool generate documenting the exported data and the actions taken?
- Was the export native, or did the export method alter, transform, or transcode the data?
- What hashes were computed at each step, and who verified them?
- Are there audit logs showing your exact access and actions?
Time base and drift.
- How was the recorder time validated?
- What independent references were used?
- Show the calculation that reconciles device time with actual time (a drift-reconciliation sketch follows this list).
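A minimal sketch of one such reconciliation, assuming a simple linear drift model anchored by two device-versus-reference observations (for example, a CAD entry and a radio time call); all timestamps are illustrative:

```python
from datetime import datetime, timedelta

# Two observations pairing the recorder's displayed time with a trusted reference.
# Values are illustrative only.
obs = [
    (datetime(2024, 3, 1, 10, 0, 0), datetime(2024, 3, 1, 10, 2, 11)),  # (device, reference)
    (datetime(2024, 3, 8, 10, 0, 0), datetime(2024, 3, 8, 10, 2, 25)),
]

# Linear drift model: reference = device + initial offset + drift rate * elapsed device time
(d0, r0), (d1, r1) = obs
offset0 = (r0 - d0).total_seconds()
rate = ((r1 - d1).total_seconds() - offset0) / (d1 - d0).total_seconds()

def device_to_reference(device_time: datetime) -> datetime:
    """Map a recorder timestamp to reference time under the linear drift model."""
    elapsed = (device_time - d0).total_seconds()
    return device_time + timedelta(seconds=offset0 + rate * elapsed)

# A frame stamped mid-window by the recorder, translated to reference time.
print(device_to_reference(datetime(2024, 3, 4, 15, 30, 0)))
```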
Geolocation and sensors.
- Does the file contain QuickTime location keys or a sensor track such as GPMF?
- If not, is that consistent with the device model and settings?
- If yes, do the traces match the scene’s geography and timeline? [8][9]
Synthetic media awareness. Even when the dispute is not a deepfake, federal advisories recognize manipulated media as a live risk. A short set of questions on provenance, device custody, and metadata is now fair game in any case featuring video sourced from social platforms or messaging apps. [22]
LCG perspective. Keep Lorraine v. Markel in mind. Courts expect advocates to carry each admissibility burden with specificity. A crisp process map, plus evidence of validation, often decides whether the judge ever reaches your substantive argument. [20]
5) A practical mapping you can cite
- Governance and collection. ISO/IEC 27037 for identification, collection, acquisition, and preservation, plus best practices for collection and for video analysis, and a standard examination workflow to structure lab work. [5][11][4][6]
- Exports. ISO 22311 for interoperable CCTV exports: request a native container, player, and export report. [7]
- Authentication. FRE 901 and 902(13), (14), plus the Sedona admissibility checklist, to justify certifications and notices rather than live witnesses. [3][2][18]
- Bodycams and drones. Program and policy materials for body-worn cameras, drone forensics best practices, and law enforcement guidance for incidents. [13][12][14][15]
Quick Checklist
- Demand native exports with hashes and player, then verify timing and metadata. [7][2]
- Test for editing through structure, for example, GOP cadence and double compression, and document method and limits. [4][16]
- Lock the chain of custody with audit trails, especially for bodycams, drones, and VMS portals, then use 902 certifications with proper notice. [12][14][2]
Final thought
Video persuades because people trust their eyes. Courts decide because they trust the process. If your team can request the right export, preserve metadata, explain compression-based tests, and demonstrate a clean chain of custody, your video will usually speak for itself. If the other side cannot do those things, you have a principled path to exclude, limit, or impeach their clip. That is the risk-reward balance in the AI era: the pixels matter, but the paperwork wins.
References (endnotes)
[1] Beyond the Screen series outline and prior installments, LCG Discovery & Governance.
[2] Federal Rules of Evidence 902(13) and 902(14), self-authentication of electronic process records and device-copied data. https://www.law.cornell.edu/rules/fre/rule_902
[3] Federal Rules of Evidence 901, authentication and identification. https://www.law.cornell.edu/rules/fre/rule_901
[4] SWGDE, Best Practices for Digital Forensic Video Analysis (2024). https://www.swgde.org/wp-content/uploads/2024/04/2024-03-22-SWGDE-Best-Practices-for-Digital-Forensic-Video-Analysis-18-V-001-1.1.pdf
[5] ISO/IEC 27037:2012, Guidelines for identification, collection, acquisition, and preservation of digital evidence. https://standards.iteh.ai/catalog/standards/sist/1a0f0032-a18e-4c41-baef-932b9c410aa5/iso-iec-27037-2012
[6] OSAC, Standard Guide for Forensic Digital Video Examination Workflow, Version 2.0. https://www.nist.gov/system/files/documents/2024/01/02/OSAC%202022-S-0031%2C%20Standard%20Guide%20for%20Forensic%20Digital%20Video%20Examination%20Workflow%20Version%202.0.pdf
[7] ISO 22311:2012, Video surveillance, Export interoperability. https://cdn.standards.iteh.ai/samples/53467/3138b6b65b4445e0933842ba0203ea5d/ISO-22311-2012.pdf
[8] Apple Developer Documentation, QuickTime location metadata keys, including ISO 6709 coordinates. https://developer.apple.com/documentation/quicktime-file-format/location_metadata
[9] GoPro, GPMF telemetry resources, and open source parsers. https://github.com/gopro/gpmf-parser
[10] SWGDE, Guidelines for Video Evidence Canvassing and Collection (2020). https://www.swgde.org/documents/published-complete-listing/20-v-002-swgde-guidelines-for-video-evidence-canvassing-and-collection/
[11] SWGDE, Best Practices for Digital Evidence Collection (2025). https://www.swgde.org/documents/published-complete-listing/18-f-002-best-practices-for-digital-evidence-collection/
[12] U.S. Department of the Interior, Body Worn Cameras and Vehicle Mounted Cameras policy, audit trail requirements. https://www.doi.gov/sites/doi.gov/files/elips/documents/446-dm-41-body-worn-cameras-and-vehicle-mounted-cameras.pdf
[13] Bureau of Justice Assistance, FY25 Body Worn Camera Policy and Implementation Program overview. https://bja.ojp.gov/funding/opportunities/o-bja-2025-172461
[14] SWGDE, Best Practices for Drone Forensics (2024). https://www.swgde.org/wp-content/uploads/2024/03/2024-03-07-SWGDE-Best-Practices-for-Drone-Forensics-21-F-002-1.2.pdf
[15] FAA, Law Enforcement Guidance for Suspected Unauthorized UAS Operations. https://www.faa.gov/sites/faa.gov/files/uas/resources/policy_library/FAA_UAS-PO_LEA_Guidance.pdf
[16] Wang et al., Double compression detection for H.264 videos with adaptive GOP structure (2019). https://link.springer.com/content/pdf/10.1007/s11042-019-08306-5.pdf
[17] Li et al., An approach to detect video frame deletion under anti forensics (2019). https://link.springer.com/content/pdf/10.1007/s11554-019-00865-y.pdf
[18] The Sedona Conference, Commentary on ESI Evidence and Admissibility, Second Edition (2020). https://www.thesedonaconference.org/sites/default/files/publications/ESI%20Evidence%20and%20Admissibility%20October%202020.pdf
[19] Federal Rules of Evidence 1003, admissibility of duplicates. https://www.law.cornell.edu/rules/fre/rule_1003
[20] Lorraine v. Markel American Insurance Co., 241 F.R.D. 534 (D. Md. 2007). https://www.computerpi.com/wp-content/uploads/2011/05/Lorraine-v-Markel.pdf
[21] OSAC, Standard Practice for Training in the Areas of Video Analysis, Image Analysis, and Photography (2023). https://www.nist.gov/system/files/documents/2023/07/03/OSAC%202023-N-0001-Standard%20Practice%20for%20Training%20in%20the%20Areas%20of%20Video%20Analysis%20Image%20Analysis%20and%20Photography.pdf
[22] NSA, FBI, and CISA, Contextualizing Deepfake Threats to Organizations (2023). https://www.cisa.gov/news-events/alerts/2023/09/12/nsa-fbi-and-cisa-release-cybersecurity-information-sheet-deepfake-threats
[23] LCG internal research note, Series framing and editorial standards for Beyond the Screen.
This article is for general information and does not constitute legal advice.