AI copyright crisis: Australian writers push back against AI training deals

AI copyright tensions escalate as authors challenge publishing contracts that allow their work to train artificial intelligence models.
Image: A view from above as someone handwrites into a notebook, thereby avoiding any issues of AI copyright.

A growing rift between Australian writers and publishers has burst into the open, with prominent authors speaking out against contracts that would allow their work to be used for AI training. The controversy has reignited the broader debate about AI copyright, creative consent and the economic future of literature in the digital age.

The spark came earlier this year when Melbourne publisher Black Inc requested that authors sign a new agreement permitting the use of their writing to train generative AI systems. The clause was met with swift and fierce backlash, with many writers refusing to sign and some likening the request to ‘signing their own death warrant’.

According to a report in The Guardian, the clause in question asked contributors to the quarterly journal Australian Foreign Affairs to ‘grant permission for [the publisher] to permit a third party to use the work in machine learning and other similar technologies’.

Authors reject AI copyright clause as exploitative

Author and academic Tim Dunlop described the situation as ‘very much like a land grab’ and accused publishers of prioritising future tech partnerships over protecting the rights of authors. ‘We’re being asked to give up rights that haven’t even been clearly defined yet,’ he told The Guardian. ‘The idea that publishers should just assume they can do this without negotiation or recompense is shocking.’

Novelist Jennifer Mills echoed these concerns, calling the AI copyright clause a betrayal of trust. ‘It’s an insult to writers’ intelligence to ask us to consent to this without a full understanding of how the technology works or how our work will be used,’ she said.

Writers across Australia have raised the alarm about the precedent this sets. If major publishers begin including such clauses as standard, vast swathes of contemporary Australian writing could be used to train AI systems without additional payment, credit or even informed consent.

These concerns arrive alongside the launch of the federal body Writing Australia under Creative Australia, which aims to provide better support for the literary sector. Yet as ArtsHub recently reported, writers remain unconvinced that new funding bodies alone can address the scale of challenges they face, including the threat of AI exploitation. As author Eleanor Jackson told ArtsHub, ‘We need long-term thinking, not just funding rounds. We need advocacy and protection.’ For many, ensuring robust AI copyright protections is a critical part of that long-term thinking.

What AI copyright means for creativity

The stakes are high. Generative AI tools like ChatGPT and Suno have been trained on enormous datasets scraped from the internet, including books, journalism and essays, often without the knowledge of the original creators. The resulting systems can now generate credible imitations of writing styles, genres and voices.

For many in the literary world, this raises urgent questions about the future of authorship. If a writer’s work can be absorbed into a machine and replicated at scale, what happens to the economic and cultural value of their voice?

The Australian Society of Authors (ASA) has warned that publishers must not rush to sign deals with tech companies without proper consultation and fair remuneration frameworks for authors. ASA CEO Olivia Lanchester said in a recent statement, ‘The ASA is opposed to contracts that seek to grant AI rights beyond what the Copyright Act allows. We are encouraging members to get in touch with us if they’re asked to sign anything unusual.’

A call for industry-wide AI copyright guidelines

While the Black Inc clause was specific to one publication, it has signalled a much larger shift already underway. Across the world, writers are discovering their work has been used to train AI models. Some have joined class-action lawsuits, while others have taken to the media to demand ethical standards.

In response to the backlash, Black Inc told The Guardian that it ‘values the concerns raised by writers’ and has since ‘paused the clause pending further industry consultation’. Yet, for many writers, the damage is already done.

What the publishing industry lacks, critics say, is a consistent and enforceable policy on AI copyright. Without legal clarity, the burden falls on individual writers to push back against powerful publishers or tech companies.

There are growing calls for Creative Australia and the Federal Government to introduce AI-specific protections for artists and authors, particularly in the wake of the newly established National Poet Laureate program and renewed investment in literary infrastructure. Writers argue that genuine support must include safeguarding the rights to their work in emerging digital ecosystems.

Where to next for AI copyright?

The AI copyright crisis may be in its early stages, but it is already shaping how writers, publishers and institutions think about the future of literature. At stake is more than just licensing revenue – it is writers’ ability to retain control over their voices in a rapidly shifting technological landscape.

As Dunlop put it, ‘We’re not anti-tech. But we are pro-consent. And that’s what’s missing here.’


David Burton is a writer from Meanjin, Brisbane. David also works as a playwright, director and author. He is the playwright of over 30 professionally produced plays. He holds a Doctorate in the Creative Industries.