Sara Farr Guy
“AI is forced on us,” said one of the participants at this year’s 75th annual International Communication Association (ICA) Conference, held in Denver, Colo., from June 12 to June 16. Having had the honour of joining academics and research scholars from around the world, it was refreshing to listen to discussions on a wide range of topics, including journalists and AI, AI in journalism, and AI and communication.
A study of Danish journalists conducted and presented by Lynge Asbjørn Møller of the University of Southern Denmark found that those whose responsibilities centred on traditional “craft” tasks — such as headline

While the use of genAI grows across newsrooms, Cornelia Brantner and Joanne Kuai of Karlstad University in Sweden argued that it poses a critical dilemma for media and democracy. Their study of information retrieval assisted by ChatGPT and Copilot found that even when prompts are written in languages other than English, the results typically draw on content from English-language sources, primarily in the U.S. and the U.K. This poses a pressing issue for democracy. “If quality and regional information is discontinued, who will tell the world what is happening?” Brantner concluded.
Another issue with this growing technology is that of underrepresented themes. These range from diversity, sustainability, policy and regulation, and environmental cost to local and community news. Ralph Schroeder of the University of Oxford said these themes matter more than one might think. “There is a lot of ambiguity. The biggest issue is how distribution and tailoring of content are being reshaped by AI and algorithms,” he said, adding that this is the new form of “gatekeeping.” While there are gains from AI, these are context-dependent and partially overstated, Schroeder added.
Discussions revolved around AI’s disruption of authorship and originality, journalists consulting AI instead of other journalists, and how AI is becoming institutionalized at a time of emerging newsroom norms and routines. However, the negative impact of social media has coloured journalists’ perception of AI.
Sora Park of the University of Canberra, Australia, emphasized that AI cannot generate news and argued for regulatory changes that would force companies to disclose their sources. Recommendations to journalists include practising transparency by disclosing the use of genAI, be it in summarizing content, drafting copy or organizing information, and serving their communities as a reliable source of information while promoting informed public conversations about the use of AI and its limitations.
Michael L. Kent of the University of New South Wales, Australia, made the case that just because this technology is widely used, it doesn’t mean it is correct. “Doomsday scenarios overshadow practical AI consequences like misinformation,” he said, adding that while AI excels at recycling ideas, it does not generate true creativity and lacks consciousness, meaning true understanding remains out of reach.
Aside from the environmental cost of AI, which requires massive computational power and water consumption to run data centres, Kent argued that the data is often simply wrong, and that the technology is also taking away entry-level jobs. “Where are these people going to get entry-level experience to get mid-level jobs?” he asked.
As societies embrace and adapt to these new technologies, it is important to keep in mind the implications of these changes and the cost at which they are being made.
Sara Farr Guy is an editor with LexisNexis North America.
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, Law360 Canada, LexisNexis Canada or any of its or their respective affiliates. This article is for general information purposes and is neither intended to be nor should be taken as legal advice.
Interested in writing for us? To learn more about how you can add your voice to Law360 Canada, contact Analysis Editor Peter Carter at peter.carter@lexisnexis.ca or call 647-776-6740.