Talking About ChatGPT

February 2023

Hi, it’s John Elsasser here — personally writing this note.

Given the hype and backlash surrounding ChatGPT (the GPT stands for "generative pre-trained transformer"), the controversial AI-driven chatbot prototype that's captivated tech giants and the media since its November debut, I wanted to offer some assurances — especially as we present our annual writing and storytelling issue.

Here’s how it works: A user gives the chatbot a prompt, such as a request to write an article in a particular style or to compose an email, “and it will spit out coherent, seemingly human-written text in seconds,” as NPR reported, calling the technology “both awesome — and terrifying.”

There have been conversations about how people might use the technology for newsletters, marketing and other information-based services in the coming months and years.

Chatbots aren’t new. Their origins trace back to the 1960s when something called ELIZA operated “within the MAC time-sharing system at MIT which makes certain kinds of natural language conversation between man and computer possible,” according to a 1966 article in the Communications of the ACM, the monthly journal of the Association for Computing Machinery.

Within a few days of its launch in late November, more than a million people had tried ChatGPT, which is free to use now. But its creator, the for-profit research lab OpenAI, warned that while ChatGPT’s answers might sound plausible, the chatbot “may occasionally generate incorrect or misleading information.” 

As Axios pointed out, by blurring distinctions between human and machine authorship, the technology could further erode trust in disciplines or businesses that rely on the written word, including the PR and communications profession.

People might use the chatbot and similar tools in ways that undermine trust in information and discourage public discourse, NiemanLab predicted in December.

At a time when trust in the news has fallen, and mis- and disinformation run rampant, “ChatGPT’s parlor trick of human mimicry pours [gasoline] on an already flaming Dumpster fire,” NiemanLab wrote.

A school of AI thought

Meanwhile, according to multiple media reports, the ChatGPT technology provides students new ways to cheat on essays and tests, which, in early January, led to the New York City Department of Education (and others across the country) banning access to the chatbot from school-owned networks and devices. Of course, students can still use it on their own devices with cellular networks or non-school WiFi. (Instead of a ban, The New York Times suggested that schools teach with it.)

Daniel Herman, a high school English teacher in Berkeley, Calif., wrote in The Atlantic that the chatbot writes better than most students today. He worries that AI could cause new human writing to dwindle, and wonders whether the technology will rob us “of what can be communicated only through human emotion?” And Axios noted, “The more writing AI does for us, the fewer of us will practice the skill.”

In this issue, Ken Scudder offers his non-computer-aided thoughts on surviving in the age of AI writing. And our features include insights on communicating messages that matter, relaying the benefits of purpose-driven storytelling and cutting the bureaucratic jargon.

As for ChatGPT, AI-powered art, essays and responses are here to stay. Engadget writer Will Shanklin stated, “The next part, dealing with the potential societal fallout — including the automation of more and more jobs — will be where the real challenges begin.” 

