Independent MP Zali Steggall has urged the federal government to introduce stronger rules on political advertising, including potential bans on the use of generative artificial intelligence and deepfakes during election campaigns.
Speaking during question time in the House of Representatives, Steggall asked Prime Minister Anthony Albanese whether the government would introduce “truth in political advertising” laws designed to prevent misleading claims in campaign messaging.
Her question specifically raised concerns about advocacy organisations such as Advance Australia and the growing use of artificial intelligence in political communication.
Steggall argued that stronger safeguards were needed to protect voters from misinformation.
She told parliament that Advance Australia had recently promoted controversial claims comparing the legacy of former German chancellor Angela Merkel with that of Nazi leader Adolf Hitler.
Steggall said the remarks were “abhorrent” and noted that former Australian prime minister Tony Abbott had attended an event linked to the organisation alongside current Liberal senators.
Turning to emerging technologies, Steggall asked whether the government would “guardrail” against the use of generative AI and deepfake technology in political advertising.
In response, Albanese said Merkel’s contributions to global politics deserved recognition and warned that hard-right political movements were gaining ground internationally.
“The world is seeing a rise of hard-right politics that sometimes needs to be called out,” the prime minister said.
However, he stopped short of committing to new truth-in-advertising laws.
Instead, Albanese suggested the issue could be examined by the parliamentary Joint Standing Committee on Electoral Matters.
“We do need to always ensure that people have freedom of expression, including freedom of political expression,” he told the House.
Communications minister Anika Wells later expanded on the government’s approach, pointing to a proposed digital duty of care framework that would place greater responsibility on large technology platforms to prevent harmful online content.
Steggall objected that the minister’s response did not directly address her question about political advertising laws.
However, Speaker Milton Dick ruled the answer relevant because her original question had referred to artificial intelligence.
The exchange concluded without any commitment from the government to introduce new laws requiring political advertisements to meet truthfulness standards.
Australia currently lacks comprehensive federal legislation requiring political advertising to be factually accurate, although several states have introduced limited truth-in-advertising provisions for state elections.
Calls for federal reforms have intensified in recent years amid concerns about misinformation, political disinformation campaigns and the potential misuse of emerging AI technologies during election periods.