
The rise of AI and the decline in news literacy make for a noteworthy confluence

Posted at 3:57 PM, Jan 26, 2023, and last updated at 6:25 PM, Jan 26, 2023

KENT, Ohio — The big bets on the future of artificial intelligence, including Microsoft’s multibillion-dollar investment in OpenAI, the company behind the viral chatbot tool ChatGPT, come at a time when the credibility of what we read online has never been harder to judge. As part of News Literacy Week, News 5 is examining the looming collision between artificial intelligence and news literacy.

Earlier this week, tech titan Microsoft confirmed what had long been rumored: a third phase of its long-term investment in OpenAI, intended to “accelerate AI breakthroughs to ensure these benefits are broadly shared with the world,” the company announced in a blog post. Microsoft made previous investments in 2019 and 2021.

For the uninitiated, OpenAI developed ChatGPT, an AI-based chatbot that can generate and present information in a way that closely resembles the way a human would. Although artificial intelligence research — and the machine learning techniques needed to make services like ChatGPT possible — have been in development for years, some of the platforms have only recently become public. After being released to the public in late 2022, ChatGPT crossed one million users within a week.

Artificial intelligence and chatbots are changing society and the workforce.

The development and proliferation of artificial intelligence could bring untold changes to numerous industries, including the news industry.

“It could offer some good things for information and news content and I think it could offer some potentially damaging things,” said Gretchen Hoak, an associate professor of journalism at Kent State University. “There are AI technologies that are getting better and better at writing but to be able to include the real details and the real people and all of the things that encompass a news story, I don’t think even ChatGPT is there.”

Training future journalists

Hoak spent 10 years working as a television news reporter in the Toledo and Youngstown markets. Although it was hardly by design, she carved a niche covering the police beat.

From a young age, Hoak said, she always reveled in knowing what was going on, but it took the advice of a guidance counselor to push her into journalism.

“I enjoyed telling those stories because they were important,” Hoak said. “I think that I loved the idea of it being something different every day. I don’t know that I romanticized it, but I believed what I was doing was important — and I still believe that.”

Hoak has now devoted her career to training future broadcast journalists. Many of her current students, whose entire lives have been spent with information available at their fingertips, have varying levels of news literacy.

“We have found that even within our students that want to be journalists, their level of news literacy when they first come to us is not all that great,” Hoak said. “They can’t differentiate good and bad content and if they can’t do that and they want to be journalists, what do you think the general public is doing?”


News literacy is the ability to determine the credibility of news and other information as well as being able to recognize the standards of fact-based journalism.

According to data collected by the non-profit News Literacy Project, the level of news literacy among young adults has fallen to concerning levels.

In data collected from 100,000 students by the News Literacy Project, 55% of students indicated that they were not even moderately confident in their ability to spot false information online.

Additionally, a 2017 study by Stanford University’s Graduate School of Education determined that the middle school, high school and college students who were tested did not have a good understanding of what constitutes “fake news” versus real news. More than 80% of middle school students could not spot the difference between sponsored articles and legitimate news stories.

“Given the environment and the technology and the ability that social media has created to just put anything out there, I think [news literacy is] really important,” Hoak said. “I think gone are the days when you could look at a piece of fake news and know it was fake because it was poorly written.”


For Kent State senior Chris Abreu and junior Katie Masko, technology was a common theme in their upbringing. Powerful smartphones had already saturated the market; the internet had become deeper and more dynamic; social media platforms connected people from opposite ends of the world.

Information was everywhere. In many respects, it was nowhere, too.

“I have to be the one to know that what I am getting is credible, instead of just the news station making sure everything is credible for me. That’s a very different environment from the one my parents were in,” Masko, who is studying public relations, said. “If you get that skill early on, it’s going to work for you for the rest of your life.”

Abreu, who originally enrolled at Kent State as a science major, said news literacy is becoming an increasingly important skill, especially for his generation.

“There is disinformation, which is a huge issue, and it is why media literacy is so important,” Abreu said. “I think there is the good, at least in my opinion, that outweighs the bad as long as people know how to navigate those spaces.”

Training AI

Stated simply, chatbots like ChatGPT are trained by ingesting vast volumes of text data gathered, or “scraped,” from the internet. Similar to how a Google search returns a long list of results filtered by keywords, the artificial intelligence pulls text from the far reaches of the internet and breaks it up into smaller bits and pieces.
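To make that “bits and pieces” idea concrete, the toy Python sketch below splits a scraped sentence into small tokens, roughly the kind of preprocessing step a language model’s training pipeline performs. It is only an illustration under simple assumptions; real systems like ChatGPT use learned subword tokenizers, and the function and regular expression here are invented for the example.

    import re

    def simple_tokenize(text):
        # Break raw text into small pieces ("tokens"), a toy stand-in for the
        # preprocessing real language models apply to scraped text. Actual
        # systems use learned subword tokenizers; this is only a sketch.
        return re.findall(r"\w+|[^\w\s]", text.lower())

    scraped = "Microsoft confirmed a third phase of its investment in OpenAI."
    print(simple_tokenize(scraped))
    # ['microsoft', 'confirmed', 'a', 'third', 'phase', 'of', 'its',
    #  'investment', 'in', 'openai', '.']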

David Silva, an assistant professor in communication studies and emerging media technology at Kent State University, said many current features have served as the “building blocks” for chat-based artificial intelligence.

“Like Google is returning documents to you when you’re conducting a Google search, ChatGPT is returning text to you based on the inputs you’re putting into it. It’s a really good pattern-matching system,” Silva said. “Think about the autocorrect on a word document, the red squiggly line — that is a building block of this language system. A lot of documents now have autocomplete for the next word; ChatGPT is an extension of those ideas — not something brand new in itself.”
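Silva’s autocomplete analogy can be sketched in a few lines of Python. The toy model below only counts which word most often follows another in a small sample and suggests that word; it is not how ChatGPT works internally, but it shows the basic “predict the next word from patterns in text” idea he describes, scaled down to its simplest form.

    from collections import Counter, defaultdict

    def train_bigram_model(text):
        # Count, for each word, which words tend to follow it.
        words = text.lower().split()
        following = defaultdict(Counter)
        for current, nxt in zip(words, words[1:]):
            following[current][nxt] += 1
        return following

    def suggest_next(model, word):
        # Return the most common follower of `word`, like a crude autocomplete.
        followers = model.get(word.lower())
        return followers.most_common(1)[0][0] if followers else None

    sample = "the news is changing and the news is everywhere and people read the news"
    model = train_bigram_model(sample)
    print(suggest_next(model, "the"))   # -> 'news'
    print(suggest_next(model, "news"))  # -> 'is'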


Although many have raised concerns that ChatGPT could make disinformation campaigns cheaper and more efficient, Silva isn’t so sure.

“They know pretty well what plays to a particular audience. I don’t know if they need to be that creative in playing the old hits,” Silva said. “We can go back to the same divisions in society and pick at those in disinformation campaigns. It’s going to sound like a trite answer maybe, but I see it as a tool. You can use a hammer for good things and you can use it for bad things. That’s not new for technology and tools. I think for ChatGPT and other AI systems there are good ways to use that and bad ways.”

The broader impact of artificial intelligence on the journalism industry remains unknown; experts believe it is still too early to tell. However, early forms of AI have already crept into some parts of the industry.

In 2014, the Associated Press began using automation to generate stories about quarterly earnings reports from public companies. The use of automation was then expanded to the AP’s coverage of minor league baseball games. These automated articles feature an editor’s note.
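Automation of that kind is largely structured data poured into fixed sentence templates. The Python sketch below is a hypothetical, simplified stand-in for such a system, not the AP’s actual software; the function name, fields and wording are invented for illustration.

    def earnings_story(company, quarter, revenue_m, eps, consensus_eps):
        # Fill a fixed sentence template from structured earnings data,
        # a simplified stand-in for template-based news automation.
        result = "beat" if eps >= consensus_eps else "missed"
        return (
            f"{company} reported {quarter} revenue of ${revenue_m} million and "
            f"earnings of ${eps:.2f} per share, which {result} analysts' "
            f"expectations of ${consensus_eps:.2f} per share. "
            "(This story was generated by automation from company filings.)"
        )

    print(earnings_story("Example Corp", "fourth-quarter", 412, 1.37, 1.25))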

As artificial intelligence grows in popularity and practicality, Hoak and Silva stressed that transparency will be essential.

“Would more and more content be created by things like that versus human journalists… and what do we lose as a result of that? From that standpoint, I don’t know that we are there yet,” Hoak said. “The more the public understands about who you are and what you are doing and how you did it I think the better off any news organization is.”

Silva said the push for transparency is twofold: News organizations should be transparent about how artificial intelligence is implemented in their reporting, and there should be transparency in the AI platforms themselves.

“I think you should know who the author is. If there are biases in a tool or system that is being used, at least we can know that tool was used so we can have debate and conversation about how those tools are used,” Silva said. “We know that journalists use technology tools to do all of their reporting. We are using modern technology across the board. Whether or not audience members have a different view of an AI system, we’re still doing active research in that.”

Safeguards in AI

In March 2022, the Pew Research Center released new survey data on the public’s views of artificial intelligence, how the technology would be used and what safeguards would be in place. The survey found 37% of respondents were more concerned than excited about AI, while 45% were equally concerned and excited.

Those survey results may serve as a harbinger of the mixed reactions to news organizations implementing artificial intelligence in their reporting, Hoak said.

“In the case of wanting people to believe in the news and be more credible, there may actually be a place for AI in certain aspects. However, if we are talking about wanting to trust journalists, AI is a problem,” Hoak said. “People who already trust traditional human journalists or trust the news are typically more concerned about AI generated content. People who don’t already trust the news — who already distrust it — are less concerned about AI generated content. As to where it goes, I think it depends on how you already feel about media. If you already don’t trust it, AI is a great thing. But if you do trust it, AI is a threat.”