Fellows of AlgorithmWatch’s reporting program will research surveillance and discrimination - AlgorithmWatch
This announcement introduces AlgorithmWatch's fifth cohort of investigative journalism fellows focusing on AI-driven surveillance, predictive policing, and AI-facilitated image abuse — areas directly relevant to AI governance, accountability, and fundamental rights.
Metadata
Importance: 28/100 (news article)
Summary
AlgorithmWatch announces its fifth cohort of Algorithmic Accountability Reporting Fellows, eight journalists and researchers who will investigate AI-based surveillance, predictive policing, chatbot risks to vulnerable groups, and AI-facilitated intimate image abuse over six months. The fellowship provides editorial support, mentorship, and publication opportunities across European media. It is the only European program of its kind open to both EU and non-EU applicants.
Key Points
- Fifth cohort of eight fellows will investigate AI surveillance, face recognition in public spaces, predictive policing, and AI-generated CSAM.
- Fellowship is open to both EU and non-EU journalists, from countries including the UK, Türkiye, Georgia, and Ukraine.
- Around 150 applications were received from journalists, engineers, lawyers, policy researchers, and academics.
- Fellows receive editorial and financial support, mentorship from algorithmic accountability experts, and publication opportunities.
- Focus areas include AI-driven intimate image abuse and power imbalances reinforced by AI technologies.
Cached Content Preview
HTTP 200 · Fetched Apr 7, 2026 · 8 KB
Algorithmic Accountability Reporting Fellowship
Fellows of AlgorithmWatch’s reporting program will research surveillance and discrimination
As we approach the end of the third year of the Algorithmic Accountability Reporting Fellowship, we are delighted to welcome a new cohort of eight journalists and researchers. This round will focus on two key clusters: digital surveillance and AI-driven intimate image abuse.
Blog
December 9, 2025
#fellowship
Yutong Liu & Digit / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/
The fellowship officially launched in December in Berlin, where the new cohort gathered to kick off their investigations and meet the AlgorithmWatch team. Building on last year's investigations into the supply chain of AI, the new reporting fellows will examine how AI-based technologies reinforce power imbalances and threaten fundamental rights.
Over the next six months, fellows will pursue stories on the deployment of AI-powered surveillance technologies in public spaces, such as face recognition systems; access to sensitive information via chatbots and its impact on vulnerable groups; the use of predictive policing systems in local neighborhoods; and cases of AI-facilitated intimate image abuse, including AI-generated child sexual abuse material and the use of non-consensual sexualisation tools.
The fellowship provides editorial and financial support, mentorship sessions with seasoned journalists and researchers in the algorithmic accountability field, and opportunities to publish the resulting investigations both on AlgorithmWatch's platforms and other relevant media outlets in Europe.
AlgorithmWatch's reporting fellowship is the only European program of its kind that supports both EU and non-EU-based applicants, including journalists based in the UK, Türkiye, Georgia, or Ukraine. Applicants come from a wide range of backgrounds: this year we received around 150 submissions from investigative journalists, engineers, lawyers, policy researchers, and academics.
Here are the selected candidates for the fifth cohort of AlgorithmWatch's reporting fellowship. Welcome!
Marta Abbà
Algorithmic Accountability Reporting Fellow (2025-2026)
Marta specializes in environmental crimes, migration, gender rights, and indigenous rights. She produces interdisciplinary and cross-media reports with an intersectional and decolonized perspective, and her work has been published in media outlets such as Wired, Voxeurop, Lavialibera, Lifegate, OBCT, Unbias the News, Altreconomia, QCode, and In Genere. She is a member of the journalism collectives Info.Nodes, DatiBenecomune and Clean Energy Wire (CLEW), and has received scholarships from the JournalismFund, the Earth Journalism Network, and th
... (truncated, 8 KB total)
Resource ID: kb-1912c36ea12ef211 | Stable ID: sid_CaZcCyJXc8