AI Watch
Quick Assessment
| Attribute | Assessment |
|---|---|
| Creator | Issa Rice |
| URL | aiwatch.issarice.com |
| Primary Focus | Tracking AI safety organizations, people, funding, and publications |
| Related Projects | Timelines Wiki, Org Watch |
| Funding Source | Likely Vipul Naik (pattern from other Rice projects) |
| Purpose | Knowledge infrastructure for AI safety research |
Key Links
| Source | Link |
|---|---|
| Official Website | aiwatch.issarice.com |
| Related: Timelines Wiki | timelines.issarice.com |
| Related: Org Watch | orgwatch.issarice.com |
Overview
AI Watch is a tracking database created by Issa Rice that monitors the AI safety field, including organizations, people, funding flows, and publications. The project is part of Rice’s broader ecosystem of knowledge infrastructure tools, which includes Timelines Wiki (documenting chronological histories) and Org Watch (tracking organizational information). Together, these tools provide systematic documentation of the AI safety and effective altruism communities.
According to user-provided context, AI Watch focuses on what key entities in the field are doing, how funding flows between organizations, who works where, and what research is being published. This makes it a reference tool for researchers, funders, and community members seeking to understand the structure and activity of the AI safety ecosystem.
The project reflects Rice’s characteristic approach to knowledge work: systematic data collection, transparent documentation, and creation of public goods that serve research communities. Like his other projects, AI Watch likely receives funding from Vipul Naik, who has supported Rice’s contract work with approximately $80,000 in funding for various tracking and documentation projects.
Role in AI Safety Knowledge Infrastructure
AI Watch serves as a complement to other tracking tools in the AI safety space. While Timelines Wiki focuses on chronological histories and sequences of events, AI Watch appears to emphasize current-state tracking of entities and relationships. Org Watch provides organizational information, while AI Watch may offer broader coverage including individuals, publications, and funding patterns.
For AI safety researchers, these tracking tools reduce the effort required to understand who is working on what, which organizations are active, how funding flows through the ecosystem, and what research outputs are being produced. This systematic documentation is particularly valuable in a rapidly evolving field where organizational landscapes and research priorities shift frequently.
The project fits within the broader category of “epistemic tools” for the AI safety community: infrastructure that helps researchers and practitioners understand the field itself, track developments, and identify relevant work and organizations. Other examples include the AI Alignment Forum for technical research discussion and various AI safety newsletters that curate developments.
Relationship to Issa Rice’s Other Projects
AI Watch is part of a coordinated set of tracking and documentation tools created by Issa Rice:
Timelines Wiki documents chronological histories of AI safety organizations like MIRI and the Center for Applied Rationality, as well as broader topics like the Timeline of AI Safety. Created in March 2017 with Vipul Naik funding, it provides granular historical documentation cited by sources including LongtermWiki.
Org Watch tracks organizational information and appears to focus on entities within the AI safety and effective altruism spaces. The Vipul Naik page references “orgwatch.issarice.com” as a source for information about individuals’ organizational affiliations and activities.
AI Watch adds to this ecosystem by providing tracking of the contemporary AI safety field—complementing the historical focus of Timelines Wiki with current-state information about active organizations, researchers, funding, and publications.
This division of labor allows each tool to specialize while together providing comprehensive coverage of the AI safety ecosystem’s history, structure, and current activity. Researchers can consult Timelines Wiki for “when did X happen,” Org Watch for “what organizations are involved,” and AI Watch for broader field-level tracking.
Comparison to Other AI Safety Tracking Efforts
Several other initiatives track aspects of the AI safety field, though with different scopes and approaches:
The Alignment Newsletter (created by Rohin Shah, later discontinued) provided curated summaries of AI alignment research papers and developments, focusing on synthesizing technical work rather than tracking organizational or funding information.
AI Alignment Forum and LessWrong serve as venues for research discussion and community coordination, but don’t systematically track organizations, funding, or comprehensive publication records.
The AI Safety Research Organizations list and related directories provide snapshots of organizations but typically lack the systematic tracking of changes over time, funding flows, or comprehensive coverage of individuals and publications.
According to EA Forum discussions, community members have expressed need for better tracking of AI safety funding and organizational activities. One 2023 post introduced “AI Lab Watch” as a project to evaluate frontier AI labs’ safety actions, suggesting ongoing demand for monitoring infrastructure. AI Watch may serve complementary purposes by providing broader field-level tracking beyond just frontier labs.
Rice’s approach emphasizes comprehensive data collection and systematic documentation rather than analysis or evaluation. This makes AI Watch useful as a reference source and data layer that others can build upon for analysis, similar to how Vipul Naik’s Donations List Website provides raw funding data that analysts can interpret.
Methodological Approach and Limitations
Based on the pattern from Rice’s other projects, AI Watch likely employs:
Systematic data collection from public sources including organizational websites, grant announcements, publication databases, social media, and news articles. The approach emphasizes comprehensiveness within the defined scope rather than selective curation.
Structured data formats enabling querying and analysis. Timelines Wiki uses MediaWiki; other Rice projects may use databases or structured file formats that support programmatic access.
Citation and verification to enable users to check sources and assess reliability. Timelines Wiki provides citations for entries; AI Watch likely follows similar practices.
Focus on AI safety and adjacent communities rather than comprehensive coverage of all AI research. This reflects the project’s origins in and service to the AI safety community specifically.
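To make the “structured data formats enabling querying and analysis” point concrete, here is a minimal sketch of the kind of query such a dataset would support. The schema, table names, and entries below are entirely hypothetical illustrations, not documentation of AI Watch’s actual data model, which is not described in available sources.

```python
import sqlite3

# Hypothetical schema for a tracking database of this kind: organizations
# and people-to-organization positions. All names here are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE organizations (name TEXT PRIMARY KEY);
CREATE TABLE positions (
    person TEXT,
    organization TEXT REFERENCES organizations(name),
    title TEXT
);
""")
conn.executemany("INSERT INTO organizations VALUES (?)",
                 [("Example Safety Org",), ("Another Lab",)])
conn.executemany("INSERT INTO positions VALUES (?, ?, ?)",
                 [("Alice Example", "Example Safety Org", "Researcher"),
                  ("Bob Example", "Example Safety Org", "Engineer"),
                  ("Carol Example", "Another Lab", "Researcher")])

# A "who works where" question, the kind of field-level query that
# structured tracking data enables, unlike prose descriptions.
rows = conn.execute("""
    SELECT organization, COUNT(*) AS headcount
    FROM positions
    GROUP BY organization
    ORDER BY headcount DESC
""").fetchall()
for org, headcount in rows:
    print(f"{org}: {headcount}")
```

The point is not the specific tooling but the property: once data is in a structured format, questions about headcounts, affiliations, or funding flows become one-line queries rather than manual research.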
However, several limitations likely apply:
Scope constraints due to the project being primarily created and maintained by a single individual (Issa Rice). Coverage depends on what Rice identifies as relevant and has capacity to document.
Funding dependence on Vipul Naik’s continued support, creating sustainability questions similar to those affecting Rice’s other contract work projects.
Verification challenges as a single maintainer must verify information across many organizations and individuals, potentially leading to gaps or delays in updates.
Potential for incompleteness in tracking private funding, unpublished research, or organizational activities not publicly announced.
The project provides valuable infrastructure but should be understood as a curated tracking effort rather than guaranteed complete coverage of the AI safety field.
Usage and Impact
The specific usage patterns and impact of AI Watch are not extensively documented in available sources. However, based on the pattern from related projects:
Research use: Likely consulted by AI safety researchers, funders, and analysts seeking to understand the field’s structure and activity patterns.
Citation as a source: May be referenced in EA Forum posts, research papers, and blog posts about the AI safety ecosystem, similar to how Timelines Wiki is cited in LongtermWiki.
Complementing other tools: Used alongside Timelines Wiki for historical context and Org Watch for organizational details, providing a comprehensive view when all tools are consulted together.
Community awareness: Helps community members stay informed about organizational developments, funding patterns, and who is working on what, potentially facilitating coordination and collaboration.
The project’s value derives from aggregating dispersed information into a centralized, structured format. Without such tools, researchers would need to manually track many sources to maintain awareness of field developments.
Relationship to Vipul Naik’s Funding Model
AI Watch likely benefits from the individual funding model established by Vipul Naik, who has funded approximately $80,000 of Issa Rice’s contract work across various projects. This funding arrangement enables creation of public goods that might not receive traditional grant funding due to difficulty demonstrating immediate impact or fitting established funding categories.
Naik’s contract work portal (contractwork.vipulnaik.com) documents payments for timelines, Wikipedia articles, and data infrastructure work. The funding model allows sustained development of comprehensive documentation that would be difficult to maintain through pure volunteer effort.
This patronage approach differs from typical EA funding mechanisms. Rather than large grants from institutions like Open Philanthropy or the Long-Term Future Fund, Naik provides smaller-scale individual funding for knowledge infrastructure projects. The arrangement has enabled projects like Timelines Wiki and the Donations List Website that serve as community resources.
However, this funding model creates sustainability questions. The projects depend on Naik’s personal financial situation and continued interest, without institutional backing or guaranteed continuity. If funding arrangements change, maintaining and updating AI Watch could become difficult.
Uncertainty About Current Status
Several aspects of AI Watch remain unclear from available sources:
Current maintenance status: The extent of active updates and new data entry is not documented.
Specific features and scope: What exactly AI Watch tracks, how its data is organized, and what queries or views are available are not detailed in accessible sources.
Usage metrics: How frequently the tool is accessed and by whom is unknown.
Contribution model: Whether AI Watch accepts contributions from others beyond Issa Rice, or remains a single-maintainer project.
Funding details: While likely funded by Vipul Naik based on patterns from other projects, specific funding amounts and arrangements for AI Watch are not documented.
Technical implementation: The platform, database structure, and data formats used by AI Watch are not described in available sources.
The project’s website was not accessible during research for this article, returning only a verification page rather than actual content. This limits what can be confirmed about current functionality and status.
Key Uncertainties
Beyond questions about current status, several broader uncertainties affect understanding of AI Watch’s role and impact:
Comprehensiveness of coverage: What fraction of AI safety organizations, people, funding, and publications are tracked, and what criteria determine inclusion.
Update frequency: How often data is refreshed to reflect new developments, organizational changes, funding announcements, and publications.
Verification methods: How information is fact-checked and what standards of evidence are required for inclusion in the database.
Relationship to other tracking efforts: Whether AI Watch coordinates with or duplicates other projects attempting to track the AI safety field.
Long-term sustainability: Plans for maintaining the project if Issa Rice’s involvement decreases or funding arrangements change.
Counterfactual impact: Whether AI Watch meaningfully improves community coordination, researcher awareness, or funder decision-making, or primarily serves as a reference that would be consulted rarely.
Accessibility and usability: How easy it is for community members to find relevant information in AI Watch, and whether the tool is designed for casual browsing or requires familiarity with its structure.
Sources
Note: This article is based primarily on user-provided context about AI Watch’s purpose and relationship to Issa Rice’s broader project ecosystem. Direct information about AI Watch is limited in available sources, as the website was not accessible during research. Information about the pattern of Rice’s work comes from documented projects like Timelines Wiki and references in the Vipul Naik page.