Courts consider limits on AI evidence

June 2, 2025
A rule newly proposed by the Judicial Conference's Advisory Committee on Evidence Rules could reshape how AI-generated evidence is treated in court. Dubbed Rule 707, it would allow machine-generated evidence to be admitted only if it meets the same reliability standards required of expert testimony under Rule 702 of the Federal Rules of Evidence.

However, it would not apply to outputs from simple scientific instruments or widely used commercial software. The rule aims to address concerns about the reliability and transparency of AI-driven analysis, especially when used without a supporting expert witness.

Critics argue that limiting the rule to evidence presented without an expert witness makes it overly narrow, since the underlying risks of bias and poor interpretability persist whether or not an expert is involved. They suggest that all machine-generated evidence in US courts should face the same robust scrutiny.

The Advisory Committee is also weighing the scope of terminology such as "machine learning" to prevent Rule 707 from sweeping in more than intended. Meanwhile, a separate proposed rule targeting deepfakes has been shelved on the grounds that courts already have tools to address such forgeries.
