The Australian Electoral Commission (AEC) has issued a truly disturbing warning, admitting it has “limited scope” to protect Aussie voters from fake AI-generated political content this election season.
You could soon be watching videos of Anthony Albanese praising the benefits of closing down unions, or Peter Dutton extolling the virtues of immigration, if the words of Australia’s Electoral Commissioner are anything to go by.
AI-generated content featuring celebrities is unfortunately nothing new, but Electoral Commissioner Tom Rogers wants to reinforce the message ahead of next year’s federal election.
Mr Rogers told a Senate committee that ‘deepfake’ videos of a political nature are not currently illegal in Australia. Provided the videos carry the required ‘official authorisation’, the AEC has no power to regulate their content.
“If those messages were authorised, duly authorised, they do not fall afoul of the electoral act currently,” he said.
The warning comes as both the United States and South Korea grapple with their own crop of deepfake political videos.
Earlier this year, voters in the US state of New Hampshire received a robocall using an AI-generated Joe Biden voice. The call urged voters to stay home and not vote on election day.
South Korea has introduced laws in an attempt to combat the scourge, but Mr Rogers says he is unsure how successful they will be.
“I don’t know whether they have some sort of internal capability, or they’re relying on other agencies within the South Korean government to be able to [detect deepfakes], but we will follow up at a point to see how that’s going.”
Have you ever seen an obviously fake Aussie political ad?