
‘Deepfake abuse’ bill drawn: ACT seeks to criminalise explicit AI images

Netsafe reports daily complaints, and school principals say incidents are rising among students.

Summarised by Centrist

ACT MP Laura McClure’s Deepfake Digital Harm and Exploitation Bill has been drawn from Parliament’s ballot, setting up a first reading in due course. 

The bill would update existing law by expanding the definition of an “intimate visual recording” to include images or videos that are created, synthesised, or altered to depict a person’s likeness in a sexual context without their consent.

McClure says the change targets a specific, well-defined harm rather than the technology itself. The proposal does not create a new regulator or ban AI tools; it plugs a gap so victims can seek removal of content and prosecutors can pursue offenders under the same framework used for non-consensual recordings. She cites Netsafe reports of daily complaints and feedback from school principals that incidents are rising among students.

If passed, the tweak would align the law with how explicit material is now made and shared, closing off the defence that a fake is not a “recording.” Critics will likely probe scope, evidential thresholds, and how consent and identity are proved when faces or bodies are composited. Supporters will frame it as a narrow, tech-neutral fix to protect victims and give police clearer charging options.


Read more over at RNZ

Image: ApolitikNow

