Dunlop Media

THINK. SPEAK. EDUCATE.™
  • Clients
  • Overview
    • Mission
    • Methodology
  • Services
    • LEVEL ONE: Narrative-Message Development
    • LEVEL ONE: Media Training-Print and Online
    • LEVEL ONE: Media Training-Broadcast and Video
    • LEVEL ONE: Organizational Alignment Workshops
    • LEVEL TWO: Media Crisis Training
    • LEVEL TWO: Business Presentation Training
    • LEVEL TWO: Executive Communications Leadership
    • LEVEL TWO: Office Hours
    • LEVEL TWO: Online Lecture Coaching
    • Remote Training (all levels)
  • Team
    • Steve Dunlop
    • Lisa M. Duchin
    • Bob Fasbender
    • Charles Feldman
    • Denisse Oller
  • Press Center
    • Dunlop Media Archives
    • Photo Gallery
    • Podcast
    • Commentary
    • Suggested reading
    • About the Press Center
  • Request info

From The Center


A digital human (not real) using ChatGPT. Image generated by Stable Diffusion using AI technology.

Does ChatGPT Think You're Dead?

Steve Dunlop June 8, 2023

Learning about AI “hallucinations”

Media commentary by Steve Dunlop

Listen to the podcast version here

We’re well into the digital age, and by now you’ve surely googled yourself to see what the search results say about you.  Maybe you do so regularly. 

But have you ever tried ChatGPT’ing yourself?  Maybe you should.

Not long after I hosted an online panel on artificial intelligence and the future of journalism, I embarked on a modest experiment.  I asked ChatGPT to write a three-paragraph essay about my career as a television reporter and anchor.  It returned a series of inaccuracies that AI experts now refer to as “hallucinations.” But the essay was so deftly written that it carried a seductive air of truth. 

“He began his career as a reporter for WABC-TV in the 1970s,” the essay claimed.  Wrong.  I never worked at WABC.  I was a news writer for the AP and a radio reporter on Long Island in the 1970s, and later a news editor at WOR Radio.

The essay claimed I moved to WNEW-TV’s Ten O’Clock News in 1975.  That was eight years off.  I started at the Ten O’Clock News in 1983 before moving on to WNBC, and later, CBS News.

ChatGPT went on to call my reporting on 1977’s “Son of Sam Murders” perhaps my “most notable achievement” and claimed I became a “respected authority on the case.” 

But I never covered Son of Sam.  

It stated I was “one of the first reporters on the scene.”   But anyone who remembers the case knows there was no single “scene” to be at – Son of Sam was a string of murders at a variety of locations, and the first was actually in 1976.  

I asked a few of my journo colleagues to replicate my AI experiment, and they got back similarly false results.  As the old newsroom saying goes, never let the facts get in the way of a good story. 

How about the big stories of the era that I did cover?  The Bernhard Goetz subway vigilante case, the insider trading probe of Ivan Boesky, the Robert Chambers “Preppy Murder” trial, the sentencing of the Mafia Commission, the construction and opening of the Javits Center, and Donald Trump’s brushes with bankruptcy, among others?  ChatGPT made no mention of any of them. 

The algorithm did try to flatter me, though.   ChatGPT claimed I was known for my “friendly and approachable personality,” called my approach to journalism “thorough and insightful,” and said my non-existent Son of Sam reporting was “praised for its sensitivity.”  If I had made some of these claims on my resumé, I could stand accused of fraud at worst, puffery at best. 

As a storyteller, however, ChatGPT does have one admirable instinct:  it saves the best for last.   In closing, it wrote, “his contributions to the field of journalism and his impact on the New York City media landscape continue to be remembered and celebrated to this day... “

...although, it added, “he passed away in 1999.” 

As Mark Twain said when a newspaper mistakenly published his obituary, “reports of my death are greatly exaggerated.”  At least Twain could cable the editor to get a correction.  Who can I contact?   Beats me.  Which is yet another problem with AI.  

Maybe the need to set so many records straight will be good for actual journalism in the long run.   But those of us for whom truth and facts are coin of the realm already had our work cut out for us by the deluge of fake news.   The advent of AI will only make misinformation more pervasive – and the need to debunk it more urgent.  

If you don’t believe that, try asking ChatGPT to write a three-paragraph essay about yourself.    And good luck getting a correction.



Creators and exclusive teachers of the PATH™ and SPAR® communications curricula. 

© 2025 DUNLOP MEDIA INC.  ALL RIGHTS RESERVED.  BY USING OR ACCESSING THIS SITE YOU ARE AGREEING TO DUNLOP MEDIA'S TERMS AND CONDITIONS. 


The Chrysler Building | 405 Lexington Avenue, 26th floor, New York, NY 10174, USA
