Author: Sara Abdullah
A journalist’s primary role is to verify information before publication and to fact‑check it with trusted sources. Yet as the pace of information accelerates and the news and data circulating amid wars, crises, and elections around the world grow more complex, journalists’ responsibilities have expanded. Fact‑checkers have emerged as a vital safeguard for the truth, protecting communities from the scourge of false and misleading news.
During 2024, we witnessed tangible developments in fact‑checking: technological platforms and smart tools became more advanced and accurate at detecting falsehoods and correcting misrepresentations. In this context, a large number of fact‑checkers gathered at the annual forum organized by Arab Reporters for Investigative Journalism (ARIJ) to discuss the challenges facing the field, most notably:
The increasing spread of misinformation and declining public trust in official sources.
The need for fact‑checkers to commit to transparency in disclosing their sources and explaining the verification methods used, while enhancing cooperation with the public without compromising editorial independence.
The importance of building a sustainable relationship with the audience by clarifying how particular pieces of information are verified.
The active role of fact‑checkers during elections, where they support the democratic process by bringing facts to light and correcting misinformation.
Discussions on how to use technological tools in the future to enhance fact‑checking capabilities and counter the spread of false news.
Angie Holan, director of the International Fact‑Checking Network (IFCN), emphasized the difficulty of the fact‑checking process, saying: “I believe this work will become more challenging, which requires us to adopt creative approaches in dealing with it.” She added in her conversation with the International Journalists’ Network: “I think fact‑checking journalists in 2025 will face increasing challenges related to the volume of misinformation they must handle and how to deal with an audience that has low trust in official sources. To address this, we need to maintain standards of transparency in disclosing our sources, explain how we verify claims, remain transparent in choosing which claims to check, and show how we receive and interact with audience suggestions.”
Holan continued: “Fact‑checkers must work with the public, listening to their input while maintaining editorial independence. Audience understanding of how conclusions are reached and why a particular claim is selected for verification enhances trust and creates a more sustainable relationship with the public.”
For her part, Dr. Arwa Al‑Kaali, an assistant professor at the Institute of Journalism and Information Sciences in Tunisia and a trainer in data journalism and news verification for the International Journalists’ Network, explained: “The year 2024 was primarily the year of elections, and with 2025 approaching, some countries around the world will witness new electoral milestones. These periods typically see a wide spread of misleading information. However, the situation in our region is linked to crises, wars, political transitions, and other rapid events that create a fertile environment for the proliferation of false and misleading information. Alongside the parties benefiting from systematic disinformation campaigns, there is also the anxiety and uncertainty caused by crises, which contributes to the spread of disinformation.”
Al‑Kaali added: “The crises we are experiencing have fueled propaganda and increased media disinformation, leading to a growing decline in trust in journalism and placing greater responsibility on the media to restore that trust. We may need to return to professional and ethical standards more than ever. The polarization these crises have created is clearly linked to belief in conspiracy theories.”
She continued: “Another point we must also pay attention to is the rise of deepfake technologies and the development of AI‑driven visual content generation tools. It is expected that this content will become more widespread—either to improve the aesthetic aspects of user‑generated productions or as a means of entertainment—but it is also an effective tool for disinformation. It can expand to systematically target certain individuals and be used in fraud and extortion. On the other hand, we will urgently need to develop media literacy curricula through the use of AI tools. The increasing reliance on social media platforms as news sources, coupled with heavy dependence on AI content generation tools, may negatively affect users’ ability to form an accurate picture of what is happening around them, degrade the quality of circulated information, and weaken their critical faculties in the long term. Therefore, it is important to support media and information literacy campaigns to raise users’ awareness about the positive and negative impacts of generative AI tools that can produce inaccurate and biased content.”
The International Journalists’ Network also interviewed Ahmed Jamal, a fact‑checking trainer and head of the fact‑checking department at the Akhbar Meter observatory. He also oversees verification materials on the Lebanese‑Jordanian platform “Check”, serves on the digital assistant (chatbot) team of the Arab Fact‑Checkers Network (AFCN), and has extensive experience training journalists in fact‑checking techniques, especially in electoral contexts.
Jamal said: “One of the most prominent challenges that fact‑checkers will face in 2025 is the ongoing development of AI technologies used to fake photos and videos (deepfakes). This development increases the difficulty of distinguishing between genuine and fake content, especially given the lack of free tools capable of countering this type of disinformation. Additionally, many fact‑checkers lack the expertise needed to effectively use reverse image search and access primary sources that definitively refute claims. For example, during our coverage of the previous Jordanian elections on the ‘Check’ platform, we monitored numerous cases of misinformation related to photos and videos. Among the most notable cases were images showing election billboards on street poles said to belong to Jordanian candidates. Through research, we discovered that the figures shown were characters from the popular American series The Boys. Although the posts were originally meant as jokes, they turned into misleading information that spread widely.”
Jamal advised fact‑checkers to continue developing their skills, particularly in reverse search and visual data analysis, and he urged greater cooperation among fact‑checkers and relevant platforms to share expertise and resources. He said: “I hope that fact‑checkers will gain access to tools and resources that make their work easier, ensuring accurate, professional, and ethical output that helps build public trust.”
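To make the skills Jamal mentions more concrete, the sketch below shows one small, commonly used building block of visual verification: perceptual hashing, which can flag whether a viral image is a recycled or lightly altered copy of a known original. This is only an illustrative example built on the open‑source Python libraries Pillow and ImageHash; the file names and similarity threshold are hypothetical, and it complements rather than replaces the reverse image search workflow described above.

```python
# Illustrative sketch: comparing a suspect image against a candidate original
# with perceptual hashing. Requires the third-party Pillow and ImageHash packages.
from PIL import Image
import imagehash


def likely_same_image(suspect_path: str, reference_path: str, threshold: int = 8) -> bool:
    """Return True if the two images are perceptually similar.

    A small Hamming distance between perceptual hashes suggests the suspect
    image is a copy (possibly cropped, resized, or recompressed) of the
    reference; a large distance suggests different or heavily altered content.
    The threshold of 8 is an arbitrary illustrative value, not a standard.
    """
    suspect_hash = imagehash.phash(Image.open(suspect_path))
    reference_hash = imagehash.phash(Image.open(reference_path))
    return (suspect_hash - reference_hash) <= threshold


# Hypothetical usage: compare a viral billboard photo with a promotional still
# found through reverse image search.
# print(likely_same_image("viral_billboard.jpg", "original_promo_still.jpg"))
```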
In the same context, Nieman Lab published an article by Cristina Tardáguila, founder of Lupa, one of Brazil’s largest anti‑misinformation initiatives, and a former senior program director at the International Center for Journalists. “In 2015, when I founded Lupa, the term ‘fake news’ was uncommon, and fact‑checking was largely unknown in Brazil. At that time, our mission was urgent and clear: expose lies, identify those responsible, and demand accountability. I felt the work was revolutionary,” she wrote. Tardáguila added that “the global focus on misinformation is undeniable, and this is a positive step. Yet the field has become crowded with numerous actors who prioritize performative actions over measurable results. After nearly a decade in this battle, we are in dire need of reassessment.” She asked: “How many initiatives have we seen recently that promote preemptive debunking, fact‑checking, or explanations as definitive solutions to misinformation? How many media literacy workshops are aimed at immunizing people against lies? And how many extremely powerful AI tools? While these initiatives are valuable, they are no longer innovative. We need to ask ourselves: what is the next step?”
Discussing misinformation and how to combat it in 2025, Tardáguila wrote:
We need a cultural shift within the anti‑misinformation community. It’s time to pause, reassess, and invest in deeper research. From a journalistic perspective, this means delving into thousands of investigations, preemptive debunks, and explanations we’ve already produced to extract actionable insights.
We must listen to our audience using techniques from other fields. Surveys and studies are essential to understand the misinformation they encounter, the beliefs they hold, and the tools they need.
We need to adopt behavioral science and understand human reactions to lies and truths. By grasping these cognitive dynamics, we can design effective strategies.

