AI in journalism offers more flexibility, new opportunities for writers

For as long as technology has been growing more intelligent and more autonomous, there have been people afraid of being replaced by computers. And those fears aren't totally irrational: computers and machines are taking on our jobs, but that may not be a bad thing.

Artificial intelligence is all around us: in our digital assistants, in curated social media feeds and even in those Teslas many people have their eyes on. Machine learning helps with our everyday tasks and makes our lives easier. The intent is the same in newsrooms and other workplaces incorporating AI tools; they're meant to handle the easy, routine tasks, like standard weather reports or trend monitoring, so we have more time for the projects that are of higher importance and need our attention.

News consumers are already reading articles written by a computer. Since 2016, The Washington Post has been using Heliograf, its in-house AI technology, to write short updates for readers, starting with the Rio Olympics. Heliograf has since been used to cover hundreds of elections, with editors checking the write-ups for accuracy before publishing. According to tech magazine Digiday, the Post published 850 stories written by Heliograf in its first year.
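Heliograf's internals aren't public, but the general technique behind many automated news updates is template-based generation: structured data gets slotted into pre-written sentence patterns. A minimal sketch, with entirely hypothetical names and figures:

```python
# Toy sketch of template-based story generation, the general technique
# behind many automated news updates. All names and numbers are invented;
# this is not Heliograf's actual code.

def election_update(race, winner, winner_pct, runner_up, runner_up_pct):
    """Fill a fixed sentence template with structured results data."""
    return (
        f"{winner} has won the race for {race}, taking "
        f"{winner_pct:.1f}% of the vote to {runner_up}'s {runner_up_pct:.1f}%."
    )

print(election_update("Ward 3 City Council", "Jane Doe", 54.2, "John Roe", 45.8))
```

Given a feed of results data, a system like this can produce hundreds of short, accurate updates far faster than a human could, which is exactly the routine work editors want automated.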

Forbes has been using AI technology since 2018. Its content management system, named Bertie after founder B.C. Forbes, uses machine learning to provide writers with real-time trends to cover, headline suggestions and imagery that fits the story.

These two aren't the only organizations using AI in their newsrooms. Some are experimenting with auto-written stories, while others use AI tools to pull data for their journalists.

“You don’t need to be the Bloombergs or the Washington Posts of this world to use AI.”

“You have the biggest organizations and the most notable ones—I’m thinking about Bloomberg and The Wall Street Journal or BBC here in Europe—that are really using AI at so many different levels already,” said Mattia Peretti, manager of Journalism AI initiatives for Polis, the London School of Economics and Political Science’s think-tank.

“But what we were very encouraged about … is that so many more organizations, maybe they don’t even have much [sic] resources, but they realize that they could start at least experimenting with machine learning and on a variety of levels and for different uses. You don’t need to be the Bloombergs or the Washington Posts of this world to use AI.”

Journalism AI is a collaboration between Polis and the Google News Initiative that aims to educate newsrooms about the potential of AI-powered technologies. So far, its researchers have surveyed at least 71 news organizations from 32 countries about their understanding, use and concerns about AI.

The recently published survey, “New powers, new responsibilities: A global study of journalism and artificial intelligence,” states that, overall, most respondents think incorporating AI will be beneficial to news organizations as long as they retain their ethical and editorial standards.

“The only sustainable response to technological disruption is to try to lead it.”

But that confidence didn’t come without concern. Respondents also identified four areas as potentially harmful to ethics and editorial practice: algorithmic bias, misinformation/filter bubbles, transparency and the role of large technology companies.

Algorithmic bias isn’t much different from what we see with social media. Facebook, Twitter, Instagram and other social media platforms use algorithms to analyze your activity and feed you content that they think will be most interesting to you. While this is great for personalization, it can also lead to the creation of a “filter bubble,” where a user is only exposed to things that the algorithms believe they might find interesting, narrowing their worldview.
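The filter-bubble dynamic described above is a feedback loop: the algorithm recommends what you've engaged with, you engage with what it recommends, and the feed narrows. A toy sketch of that loop, with hypothetical topics and a deliberately naive recommender:

```python
# Toy sketch of how personalization can create a "filter bubble."
# This hypothetical recommender simply suggests whichever topic the user
# has clicked most, so one early click compounds into a one-topic feed.

from collections import Counter

def recommend(click_history, topics):
    """Recommend the topic with the most clicks so far."""
    counts = Counter(click_history)
    return max(topics, key=lambda t: counts[t])

topics = ["politics", "sports", "science"]
history = ["sports"]  # a single early click...

for _ in range(5):
    pick = recommend(history, topics)
    history.append(pick)  # ...and the user keeps seeing (and clicking) the same topic

print(history)  # the feed collapses onto one topic
```

Real recommendation systems are vastly more sophisticated, but the core concern survey respondents raised is this same reinforcement effect: engagement-driven ranking can narrow what a reader ever sees.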

As for big tech companies, Polis research found that there is concern about these companies and their influence and control of research and product development, especially because many of these companies are the source of innovation and tools like those that use AI.

These are some of the challenges news professionals face when it comes to incorporating AI into their newsrooms. They’ll need their companies’ support, as well as the knowledge to teach tomorrow’s reporters. But these are the kinds of challenges that newsrooms have met and overcome with every new technology.

“I never tire of saying: the only sustainable response to technological disruption is to try to lead it,” Enrique Dans, a professor teaching innovation and hacking at IE University in Madrid, said in an editorial he wrote for Forbes.

Customization and automation are here to stay, and while journalists gain more time to work on reporting, developers are being hired to maintain and improve AI tools.

If anything, AI will create jobs—not steal them. And for those who are still concerned about their place in the newsroom, Peretti encourages journalists to think proactively about the skillsets they are building.

“Let’s not focus on [automated tasks] and let’s strive to get the skills that will allow [us] to have a value in the newsroom of the future.”
