From left: Dr. Xinya Du, Dr. Gopal Gupta, Dale MacDonald and Dr. Jessica Ouyang of UT Dallas were the featured panelists at the “ChatGPT: Fact vs. Fiction” forum in the Edith O’Donnell Arts and Technology Building Lecture Hall.

University of Texas at Dallas technology experts dispelled misconceptions about, and highlighted the advantages of, a new artificial intelligence (AI) tool that has stirred growing chatter about its power to change communication, education and the workforce.

“Students could potentially use it to get their homework done,” said Dr. Gopal Gupta, professor of computer science and one of the panelists at the “ChatGPT: Fact vs. Fiction” forum held March 21 in the Edith O’Donnell Arts and Technology Building Lecture Hall. “It is a double-edged sword. We’ve got to teach students to be honest and use it as a tool to learn.”

The forum, sponsored by The Dallas Morning News and moderated by science reporter Adithi Ramakrishnan, also featured Dr. Xinya Du and Dr. Jessica Ouyang, both assistant professors of computer science in the Erik Jonsson School of Engineering and Computer Science; and Dale MacDonald, associate dean of research and creative technologies in the School of Arts, Humanities, and Technology.

ChatGPT is an AI text chatbot released for the web in late 2022. It uses technology that replicates how people write by quickly processing a large database of books and online material and analyzing how words are put together. Users can ask ChatGPT a question or ask it to write a song, poem, letter or essay, and within seconds, it will provide an answer or complete the task.

When asked if students could use ChatGPT to write their assignments for them, the panelists said that while the threat of cheating is real, there are legitimate academic applications for the tool.

MacDonald pointed out that since ChatGPT and similar technologies have quickly become ubiquitous, it’s essential that teachers use AI chatbots in the classroom so that students can learn about them.

“It is becoming clear that it’s important that students use it and that teachers get students to use it so they can have these ethical conversations,” he said. “Our students are going to have to have this literacy.”


Du said educators can adjust to ChatGPT by changing the way they assign work to students.

“We can come up with questions that are more challenging — charts, analysis,” he said. “We can also have students write critiques of the AI-generated content.”

Assigning critiques could be a rich vein for teachers to mine. Ouyang said the answers that ChatGPT produces can be riddled with errors, giving students an opportunity to enhance other skills.

“It might switch the names of two characters from the book you’re supposed to be writing an essay on,” she said. “And unless you are critically reading the essay that it has written for you, you may not realize or notice that.”

The panelists emphasized that ChatGPT is just a tool, one that works via a technology called pattern matching. It predicts the next word in a sentence based on the massive amount of content it has reviewed as part of its machine learning.

“Any logical behavior is there by chance,” Gupta said. “If you ask ChatGPT what is two plus two, it says ‘four,’ because that is what’s out there.”

But if more people in the data set ChatGPT was trained on had said five, that’s the answer ChatGPT would provide, he said.
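Gupta's point can be sketched with a toy frequency model: predict whatever word followed most often in the training text. This is a drastic simplification — ChatGPT uses a large neural network, not raw counts — but it shows why the most common answer in the data wins, and why "five" would win if the data said so:

```python
from collections import Counter, defaultdict

def build_bigram_model(corpus):
    """Count, for each word, which words follow it in the training text."""
    followers = defaultdict(Counter)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        followers[current][nxt] += 1
    return followers

def predict_next(model, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    candidates = model.get(word)
    if not candidates:
        return None
    return candidates.most_common(1)[0][0]

# Toy training data: "four" follows "is" more often than "five" does.
corpus = (
    "two plus two is four . "
    "two plus two is four . "
    "two plus two is five ."
)
model = build_bigram_model(corpus)
print(predict_next(model, "is"))  # prints "four" — the majority answer in the data
```

If the toy corpus were flipped so that "five" appeared more often after "is", the model would dutifully predict "five" — echoing Gupta's observation that any logical behavior is there by chance.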

When asked about stories of chatbots seemingly expressing emotions toward their human chat partners, the panelists reassured the audience.

“Don’t worry,” Ouyang said. “ChatGPT is not going to develop sentience and come after us.”

That kind of behavior stems from the way the chatbot is trained to match patterns, she said.


“It responds as it’s seen humans respond in the past,” Ouyang said. “If you say, ‘I love you,’ the chatbot will say it back.”

Gupta likened ChatGPT to a child repeating what the adults in their household have said without understanding the meaning or context.

“It may or may not be right, but it sounds right,” he said.

Ouyang also noted some energy-use and privacy-protection concerns with the technology.

“There is a lot of environmental concern with the carbon footprint of training these models,” Ouyang said. “The amount of electricity they use is staggering.”

And while she doubts OpenAI, the company that developed ChatGPT, is selling user data, Ouyang said that online privacy is still a major worry.

“It’s always a concern that anything you put on the internet could be collected,” she said.

Meanwhile, MacDonald reassured the audience that AI will not replace humans in the workplace.

“AI is not going to take your job,” he said. “A human who can use AI is going to take your job.”

MacDonald also cast doubt on the longevity of ChatGPT’s popularity, noting that people’s attention spans quickly drift from one new thing to the next.

“This is still at a very high hype state,” he said. “Until people come up with actually good reasons to use it, it may not bear the investment.”