The massive scale and speed of content production through artificial intelligence (AI) threaten to displace human knowledge with machine-generated content that aims not to enhance awareness but to exploit platform algorithms for other purposes, an academic has warned.

Dr Marc Owen Jones, assistant professor of media analysis at Northwestern University in Qatar (NU-Q) – a QF partner university – believes we are in the early stages of what he describes as the influence of “blind epistemic power”, where AI threatens to flood the digital knowledge ecosystem with misleading information.

“This creates a kind of ‘noise’ in the information landscape,” he said. “It affects the intellectual system and gradually weakens the public’s ability to distinguish between trustworthy journalism and low-quality content designed to attract and manipulate audiences.”

However, Dr Jones also emphasises that AI offers significant opportunities, such as analysing vast amounts of data and overcoming language barriers. For example, journalists from India to Latin America are using language models to investigate corruption, track organised crime, and uncover algorithmic bias.
“Journalists must move beyond the role of passive users of technology and become active players,” he said. “This requires supporting independent journalism, enacting appropriate legislation related to AI, and adopting a culture of AI literacy in newsrooms – while reinforcing the role of the human element and upholding ethical responsibility.”

NU-Q graduate Hessa al-Thani shared her own experience of AI's role in spreading misinformation.

“I saw a deepfake video of a political figure that looked very real – I didn’t realise it was fake until later,” she said.

“In this era, AI-generated content is everywhere, and it’s incredibly easy to fall into its trap,” Hessa added. “That’s the primary goal: to mimic humans and blur the line between what’s real and what’s fake.”

She acknowledges the creative potential AI holds in the context of journalism, in areas such as gathering information, drafting questions and e-mails, and editing text.

“Our core strength as journalists lies in our ability to tell stories,” Hessa said. “When this ability is handed over to a machine, the stories become hollow, sometimes unethical, and biased – especially as this technology continues to be developed in the West.”

“I believe that the role of journalists will shift toward the human dimension – focusing on ethical aspects and field presence, which AI cannot replicate,” she continued. “But certainly, journalists must become more aware of how they use AI, and they must use it as a tool – without allowing it to overpower their voice, their perspective, or their responsibility to uncover the truth.”