Lalya Gaye (EBU)
The threat that online disinformation campaigns pose to media and society is exacerbated by their increased level of sophistication and the speed at which they spread. Time has always been of the essence in the fact-checking process, especially when it comes to breaking news, but news professionals are now in urgent need of advanced tools that can help them catch up with this new form of disinformation. With its ability to process and analyse large amounts of data on a scale and at a speed that can’t be matched by humans, AI can enable newsrooms to level the playing field and help tackle this problem.
The vera.ai project – its name an abbreviation of Verification Assisted by Artificial Intelligence – is a multidisciplinary and collaborative project that tackles the problem of disinformation by supporting the work of fact checkers, journalists and media researchers with AI. Funded by the European Commission and the authorities in the UK and Switzerland, vera.ai focuses on user-centred development of professional AI-based verification tools.
The project is a follow-up to the previous EU projects WeVerify and Truly Media, and its implementation builds on the InVID video verification platform, which has already become an industry standard. The project consortium gathers 14 experienced partner organizations dedicated to the fight against disinformation, including the EBU itself and EBU Member organization Deutsche Welle.
Participatory design
Since the project started in 2022, the contributions of the EBU and DW have involved identifying end-user needs in the fact-checking, research and journalism community, translating these needs into actionable design requirements, and disseminating the project’s results in order to maximize its impact. A participatory, user-centred approach to design has been used: applying an ethos developed in Scandinavia to empower employees and communities in decision-making processes, this approach takes the perspectives of all stakeholders into account and gives them an equal say in the outcome.
The participatory design methodology ensures that tools are implemented in a way that fits into existing professional workflows, rather than forcing end users to learn and adopt new ones. AI solutions are not meant to replace fact checkers; they are meant to be tools in the fact checker’s arsenal against disinformation, and we need to make sure they integrate well into these professionals’ workflows.
Evaluation of tools
While the first phase of the project focused on user needs and design requirements, it is now entering a second phase that will see the implementation of new features and the testing of prototypes by end users.
This evaluation is part of an iterative process that will unfold over the next two years, as we continue to apply a participatory approach.
A webinar on 6 December updated EBU Members on the progress of the project and briefed them on how to take part in the evaluation phase. Training sessions and e-master classes with the EBU Academy are also planned over the next two years.
Please contact us if you’re interested in the vera.ai project and would like to take part in making sure that its AI-based verification tools serve our media community to the fullest!
vera.ai is co-financed by the European Union under the Horizon Europe Framework Programme. Additional funding is provided by Innovate UK and the Swiss State Secretariat for Education, Research and Innovation (SERI).
This article first appeared in issue 58 of EBU tech-i magazine.