How AI-generated ‘news’ creates trauma

In what seems like just a few short years, artificial intelligence (AI) has infiltrated nearly all aspects of life. Now, it appears to be infiltrating our deaths, too – all in the name of generating a few clicks and dollars.

Take the story of Beth Mazur, for example. When she passed away last December, tributes flowed for the staunch advocate for the rights of myalgic encephalomyelitis (ME) sufferers. (Myalgic encephalomyelitis is also known as chronic fatigue syndrome.)

There is an acute irony in the fact that Ms Mazur dabbled with AI herself in the months before her death. Before passing away at the age of just 47, the former IT professional “experimented with generative AI tools like ChatGPT”.

At the same time as genuine tributes for Ms Mazur poured in, some related online ‘news’ began to filter through. This ‘news’ revealed that her life partner, Brian Vastag, had passed away on exactly the same day as she had.

By now you’ve probably worked out that Mr Vastag had not died on the same day as his partner. In fact, Mr Vastag was, and still is, very much alive.

Death by AI

How could the reporters get something like this so wrong? Pretty easily, as it turns out. The ‘reporters’ simply ask an AI tool to generate a news article about a particular person. The request may include some relevant facts, which the AI combines with whatever online-sourced information it can find about the apparently deceased subject.

AI tools may one day be very good at this. Indeed, some may already be very good. But there are definitely ones that are not. As an experiment I visited the ChatGPT site and made the following request: “Write an obituary for Andrew Gigacz, who died suddenly on Wednesday 25 July 2024.”

The first thing I noticed was that it repeated an error I’d inserted deliberately. It said that I had “passed away suddenly on Wednesday, July 25, 2024”. July 25, 2024 was, in fact, a Thursday. 

My AI ‘obituary’ went on to say that I had died at age 54, having been born in 1970. I wouldn’t mind winding the clock back five years, but I am, unfortunately, 59. To ChatGPT’s credit, it did get most other aspects of my life correct. It mentioned my great love for sporting statistics and described me as “known for his warm personality and generous spirit”. That has to be correct, doesn’t it?

The serious consequences

While I can be light-hearted about my own obituary, falsely reporting the death of a loved one is extremely cruel. In the case of Beth Mazur and Brian Vastag, several of their friends believed the false news that both had died. 

So, how did it happen? Online news site The Verge did a bit of detective work and found that one site appeared to have aggregated two separate stories. One was an accurate report of Ms Mazur’s death, the other an op-ed piece co-authored by Ms Mazur and Mr Vastag. 

The suspicion is that an AI tool generated the article. But why? The answer became clear when The Verge looked closely at the site on which the fake article appeared. The page was flooded with ads, so aggressively placed that it was almost impossible not to click on them.

Every click on one of those ads generates revenue. It’s probably only a fraction of a cent, but repeat the process for other people and on different sites, and the amount can soon become ‘worthwhile’.

Worthwhile for those unscrupulous enough to make money in such a misleading and cruel way, perhaps. But not for most humans with even a basic sense of decency.

Beware of AI

Sadly, there will always be an unscrupulous few who spoil it for the rest of us. If possible (and that’s becoming harder in itself), try to source your news from a trusted website. If you come across an obituary that seems poorly written, try to confirm the facts before sharing any sad news.

Otherwise the supposedly deceased person might have to reprise the famous comments attributed to Samuel Clemens, aka Mark Twain. In response to an enquiry about his health after rumours circulated that he had died, Clemens allegedly wrote, “The reports of my death are greatly exaggerated.”

Have you read any articles that appear to have been generated by AI? What was it about them that made you suspicious? Let us know via the comments section below.


Andrew Gigacz
https://www.patreon.com/AndrewGigacz
Andrew has developed knowledge of the retirement landscape, including retirement income and government entitlements, as well as issues affecting older Australians moving into or living in retirement. He's an accomplished writer with a passion for health and human stories.
