
Org Watch

Type: Research tool / tracking website
Primary Focus: Organization monitoring and data aggregation
Creator: Issa Rice
Target Audience: EA and AI safety researchers
Key Features: Organizational tracking, funding data, personnel information
Related Tools: AI Watch, Timelines Wiki (also by Issa Rice)
Access: Web-based
Official Website: orgwatch.issarice.com

Org Watch is a tracking website created by Issa Rice that monitors organizations, likely focusing on the effective altruism (EA) and AI safety communities.1 The site is part of Rice’s broader ecosystem of tracking and research tools, which includes AI Watch and Timelines Wiki, all aimed at providing structured data about the AI safety and effective altruism communities.

The platform appears designed to serve researchers and community members who need systematic information about organizational activities, funding flows, and personnel changes. By consolidating this information in one location, Org Watch aims to reduce the research burden for those tracking developments in these communities.

Little public information is available about Org Watch’s specific methodologies, data sources, or scope. The site’s utility to researchers likely depends on its coverage breadth, update frequency, and data accuracy—factors that cannot be assessed without direct access to the platform’s contents.

Issa Rice created Org Watch as part of a suite of information-tracking tools focused on the effective altruism and AI safety communities. Rice has established a pattern of creating research infrastructure tools that aggregate and organize publicly scattered information, making it more accessible for community members and researchers.

Rice’s other projects include AI Watch, which tracks AI-related developments, and Timelines Wiki, which documents various timelines relevant to AI safety and effective altruism. This ecosystem of tools suggests a systematic approach to knowledge management and community transparency.

According to available information, Org Watch tracks various organizations, likely providing data on:

  • Organizational profiles: Basic information about organizations in the EA and AI safety spaces
  • Funding information: Data about funding sources, amounts, and allocation patterns
  • Personnel tracking: Information about key people and their organizational affiliations
  • Organizational activities: Documentation of major projects, initiatives, or shifts in focus
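
To make the shape of this data concrete, here is a minimal sketch of the kind of per-organization record such a tracker might maintain. Org Watch's actual schema is not publicly documented, so every field name, type, and data value below is an invented illustration, not the site's real structure.

```python
from dataclasses import dataclass, field

@dataclass
class OrgRecord:
    """Hypothetical per-organization record; the real schema is unknown."""
    name: str
    focus_areas: list[str] = field(default_factory=list)   # e.g. ["AI safety"]
    funders: dict[str, int] = field(default_factory=dict)  # funder name -> total USD
    personnel: list[str] = field(default_factory=list)     # known affiliations

    def total_funding(self) -> int:
        """Sum tracked funding across all listed funders."""
        return sum(self.funders.values())

# Entirely invented example data:
rec = OrgRecord(
    name="Example Org",
    focus_areas=["AI safety"],
    funders={"Funder A": 500_000, "Funder B": 250_000},
    personnel=["Researcher X"],
)
print(rec.total_funding())  # 750000
```

Even a flat structure like this supports the comparisons described above (funding totals, shared personnel, focus-area overlap) once records exist for many organizations.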

The platform’s value proposition appears to be reducing information asymmetry within the EA and AI safety communities by making organizational data more readily available and comparable.

Org Watch likely serves several categories of users:

Researchers and analysts studying the EA and AI safety ecosystems can use Org Watch to track organizational trends, funding patterns, and personnel movements without conducting extensive independent research. This aggregation function reduces redundant effort across the community.

Job seekers and career planners in these fields might use the platform to identify organizations aligned with their interests, understand organizational funding stability, and track personnel changes that might signal job opportunities.

Donors and funders could use Org Watch to compare organizations, verify claims about funding and activities, and identify gaps or overlaps in the organizational landscape.

Organization leaders themselves might reference the platform to understand their competitive landscape, benchmark their activities against similar organizations, or identify potential collaboration partners.

Org Watch exists within a broader ecosystem of EA and AI safety community infrastructure:

The LessWrong and EA Forum communities maintain their own forms of organizational tracking through community posts, annual reviews, and discussion threads. These platforms provide more narrative and qualitative information compared to Org Watch’s likely more structured data approach.

Academic researchers and journalists occasionally produce reports on EA and AI safety organizations, but these tend to be point-in-time snapshots rather than continuously updated resources. Org Watch’s presumed advantage lies in its ongoing maintenance and systematic coverage.

Open Philanthropy maintains a public grants database that provides detailed information about its funding decisions, including grantee organizations.2 However, this covers only one funder’s perspective. Org Watch likely aggregates information across multiple funding sources to provide a more comprehensive view.
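
The aggregation step this implies is simple in principle: each funder publishes its own grant list, and combining those lists yields a per-grantee view that no single funder's database provides. A minimal sketch, with invented funder names and amounts (Org Watch's actual data pipeline is undocumented):

```python
from collections import defaultdict

# Each funder's published grants, normalized to a common record shape.
# All names and amounts here are fabricated for illustration.
grants = [
    {"funder": "Funder A", "grantee": "Org X", "amount": 100_000},
    {"funder": "Funder B", "grantee": "Org X", "amount": 50_000},
    {"funder": "Funder A", "grantee": "Org Y", "amount": 75_000},
]

# Roll up to per-grantee totals across all funders.
totals: dict[str, int] = defaultdict(int)
for g in grants:
    totals[g["grantee"]] += g["amount"]

print(dict(totals))  # {'Org X': 150000, 'Org Y': 75000}
```

In practice the hard part is not the roll-up but the normalization: reconciling organization names, currencies, and grant dates across sources that publish in different formats.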

Several challenges likely affect Org Watch’s utility:

Data availability and accuracy: Organizational information may be incomplete, outdated, or inaccurate, particularly for organizations that don’t publish detailed public information about their activities, funding, or personnel. Maintaining current data requires continuous effort and access to reliable sources.

Coverage scope: Without clear documentation of which organizations are tracked, users may incorrectly assume coverage is comprehensive when significant gaps may exist. The criteria for including or excluding organizations are unclear.

Methodological transparency: Based on available information, the sources and methods used to gather and verify data are not publicly documented. This makes it difficult for users to assess data reliability or reproduce findings.

Maintenance burden: Tracking websites require significant ongoing effort to remain current and useful. If updates lag, the platform’s value diminishes rapidly, particularly in fast-moving fields like AI safety.

Privacy and consent considerations: Tracking personnel information raises questions about consent and privacy, particularly for individuals who may not have explicitly agreed to be included in such databases.

Org Watch can be understood in relation to other organizational tracking efforts:

Wikipedia provides organizational information but with variable depth and often limited coverage of smaller organizations or recent developments. Its requirement for secondary sources means it may lag behind primary developments.

Crunchbase and similar business databases track companies but focus primarily on traditional metrics like funding rounds and valuations rather than the mission-specific metrics relevant to EA and AI safety organizations.

Community-maintained lists and spreadsheets exist within the EA and AI safety communities but tend to be fragmented, irregularly updated, and difficult to discover. Org Watch’s potential advantage lies in providing a centralized, systematically maintained resource.

Academic databases like the Organizations & Social Impact database at Stanford focus on research-oriented metrics and may not capture the specific information needs of EA and AI safety community members.

Significant information about Org Watch remains unavailable:

  • The current operational status of the website (active, maintained, archived)
  • The scope of organizational coverage (which types and which specific organizations)
  • Update frequency and data freshness
  • Data sources and verification methods
  • User access requirements (public, restricted, paywalled)
  • Integration with other Issa Rice projects
  • Community reception and usage patterns
  • Funding or support for the project

These gaps mean that assessments of Org Watch’s utility remain speculative without direct platform access and usage.

Key Questions (5)
  • What organizations does Org Watch currently track, and what are the criteria for inclusion?
  • How frequently is the data updated, and what sources are used for verification?
  • Is Org Watch actively maintained, or has it been archived or deprecated?
  • What specific data fields are tracked for each organization?
  • How does usage of Org Watch compare to alternative information sources within the EA and AI safety communities?
  1. User-provided context from https://orgwatch.issarice.com/

  2. Coefficient Giving (formerly Open Philanthropy) - Inside Philanthropy profile