“Newsroom Robots” podcaster and journalism educator Nikita Roy speaks at the Carleton University-hosted Journalism and AI roundtable in Toronto on May 30, 2024. The split image in the background shows two videos of Roy — one generated by AI and the other real. About 70 per cent of attendees guessed wrong when asked which was the real Nikita. [Photo © Angel Xing]

Two videos showing Newsroom Robots podcaster Nikita Roy appeared side by side on a large screen at the Carleton University-hosted Journalism and Artificial Intelligence roundtable in Toronto on Thursday.

A survey of about 50 audience members showed just how convincing AI-generated imagery can be: 70 per cent guessed wrong when asked which of the two videos was real and which was artificially generated.

Advances in AI have accelerated in recent years and forced journalists and news organizations to rethink the way they produce news. According to Roy, AI can be a huge booster in terms of productivity and insights for newsrooms. 

“There was a study that was done with Harvard Business School and the Boston Consulting Group where they took 700 local consultants,” said Roy. Half of the group had training and access to AI, while the other half did not.

“The group that had access to AI . . . produced 12 per cent more work, they were 25 per cent faster, and on average, they were producing 40-per-cent higher-quality work.”

The study also found that lower-performing consultants tended to benefit the most from AI assistance.

According to Roy, subscription rates have also been shown to rise when a media outlet uses AI.

“I was just talking to a journalist in Norway . . . about how AI has been helping their newsroom not just produce articles and do stuff quicker, but produce better journalism,” said Roy. “They were able to find insights . . . that they were not able to do before without the power of generative AI.”

Nikita Roy’s Newsroom Robots podcast has become a key forum for discussing the ways AI tools are being used by news organizations.

AI can complete a day’s worth of work in a few minutes, said Roy. It can create data tables in seconds that would take a journalist many hours to build by hand. Feeding data into an AI tool can produce a simple, effective table ready to be integrated into an article.

“It’s helping you speed up the way in which you’re collecting data,” added Roy. 

AI can also support accessibility by generating alt text, the written descriptions of visual content used by readers who are visually impaired. It can produce a longer, more descriptive paragraph for those who need one, sparing journalists the time and effort of writing it themselves.

“You can see how well the alt text is describing it,” said Roy, referring to alt text produced robotically. “If we (journalists) would do it, we would expect one line.” The text on the screen showed four lines detailing an image.

“The people who require alt texts and making it accessible should be a really prime concern of ours, and a focus.”

As AI image generation advances, converting text to visuals is becoming popular. Journalists can use the technology to turn text articles into videos, which media outlets can publish as reels, TikToks or YouTube Shorts.

“News articles to videos — this has been used by a lot of local newsrooms right now, where you can create videos for all kinds of different platforms. All you have to do is put in a link to an article,” said Roy. 

Despite the benefits of AI, using it correctly can be challenging, said Roy. Over the past few years, journalists have become familiar with its shortcomings. 

Catching false information is a task journalists face every day, and they need to be just as vigilant with AI. Trusting every fact or source it provides can be dangerous, especially since ChatGPT does not provide links to its sources.

“It’s becoming harder and harder for us to talk about and decipher what is true or not,” said Roy. “What needs to happen is more . . . guidelines about how AI generated images are going to be labeled.”

Publishing AI output without fact-checking it or screening it for plagiarism can cost media outlets their reputations.

“Don’t publish anything without checking for plagiarism,” warned Roy. “I don’t think anything from a generative AI model should be going straight to publication. And I don’t think that’s the purpose of it.”

For all its benefits, Roy said it is important to question whether AI is actually useful for a given everyday newsroom task. She gave the audience a short list of questions as a chance to reflect on when the technology is worth applying.

“Can it help me speed up the process?” asked Roy. “Do you have specific instructions on how the tasks can be performed? Is it valuable for you to have a draft to start from?”
