Tell the Minister: Protect Industry Jobs — Mandate AI Content Labelling Now

Artificial Intelligence is transforming all areas of content creation, including advertising, but it also presents a real and immediate risk to jobs, to truth, and to creativity.

AI is now producing voices, images, scripts, and music that make it impossible for audiences to tell the difference between what is real and what is not.

This ambiguity means misinformation and deepfakes can spread faster than ever, eroding public trust.

And Australia’s creative professionals – our art directors, copywriters, directors, designers, voice actors, and musicians – are already
facing job losses as companies turn to AI to increase profits, with no obligation to disclose its use.

The Australian Association of Voice Actors (AAVA) and the wider creative community call on the Federal Government to introduce mandatory labelling and watermarking for all AI-generated content. This action would align Australia with global standards already being adopted by the European Union (from 2026) and China (from 2025).

This means:

  • Clear audible disclaimers for AI-generated voices and sound.
  • Clear visible labels for AI-generated video, imagery, and written content.
  • Metadata watermarking for traceability and copyright transparency.

Transparency is not a barrier to innovation; it’s the foundation for trust, ethics, and creative integrity.

We ask the advertising industry, agencies, broadcasters, and media organisations to stand with us in urging the Government to legislate mandatory AI content labelling in Australia.
