Website
website · 75 facts across 75 entities · general
Definition
| Field | Value |
|---|---|
| Name | Website |
| Description | Primary website URL |
| Data Type | text |
| Unit | — |
| Category | general |
| Temporal | No |
| Computed | No |
| Applies To | organization, person, project, funder, safety-agenda |
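Each fact in this listing carries the same fields: the entity it applies to, the URL value, an optional source, an optional as-of date, and a fact ID. A minimal sketch of one record, using the 80,000 Hours fact from the table below (the class name and field layout are illustrative; the wiki's actual storage format is not specified here):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WebsiteFact:
    """One 'Website' fact, mirroring the table columns: As Of, Value, Source, Fact ID.

    Hypothetical schema for illustration only.
    """
    entity: str                     # entity the fact applies to
    value: str                      # the website URL (data type: text)
    fact_id: str                    # stable identifier for this fact
    source: Optional[str] = None    # e.g. "wikidata.org"; None where the table shows "—"
    as_of: Optional[str] = None     # usually None, since the property is non-temporal

# The first fact listed below:
fact = WebsiteFact(
    entity="80,000 Hours",
    value="https://80000hours.org/",
    fact_id="xsVlqa9RjQ",
    source="wikidata.org",
)
```

Because the property is marked Temporal: No, most facts have no `as_of`; the handful of dated entries (e.g. AI Watch, Mar 2026) still fit the same shape.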
All Facts (75)
| Entity | As Of | Value | Source | Fact ID |
|---|---|---|---|---|
| 80,000 Hours | — | https://80000hours.org/ | wikidata.org | xsVlqa9RjQ |
| ACX Grants | — | https://www.astralcodexten.com/p/acx-grants-results | — | f_hkrt42xnt0 |
| AI Watch | Mar 2026 | https://aiwatch.issarice.com | — | f_lGC7IT0oPp |
| Ajeya Cotra | — | https://metr.org | — | f_zABjoVjk7g |
| Andrew Ng | — | https://www.andrewng.org | — | f_PomBL98jGQ |
| Anthropic (Funder) | — | https://www.anthropic.com/ | wikidata.org | dcLEbK812Q |
| Anthropic | — | https://www.anthropic.com/ | wikidata.org | XWMSO83Myw |
| Apollo Research | — | https://www.apolloresearch.ai | — | f_ZJgtLkzc9w |
| Alignment Research Center | — | http://alignment.org/ | wikidata.org | xU3JVFJnrQ |
| Astralis Foundation | — | https://astralisfoundation.org | — | f_RHhkhesbww |
| Buck Shlegeris | — | https://redwoodresearch.org | — | f_fixd9Ra6uA |
| Centre for Effective Altruism | — | https://www.centreforeffectivealtruism.org | wikidata.org | vrQK8e7bPQ |
| Center for AI Safety | — | https://www.safe.ai/ | wikidata.org | t7r3olcDFw |
| Center for Applied Rationality | — | http://rationality.org/ | wikidata.org | 9ToMXM39MA |
| Centre for Long-Term Resilience | — | https://www.longtermresilience.org/ | wikidata.org | CHFmqwxbTA |
| Center for Human-Compatible AI | — | https://humancompatible.ai | — | f_q88t1C2dbw |
| Chan Zuckerberg Initiative | — | https://chanzuckerberg.com/ | wikidata.org | euDjDvv50g |
| Chris Olah | — | https://colah.github.io | — | f_pdsb5nkkx7 |
| Coalition for Epidemic Preparedness Innovations | — | https://cepi.net | wikidata.org | HRR7Noz5zg |
| Coefficient Giving | — | https://www.coefficientgiving.org | coefficientgiving.org | AZlMk0b5hg |
| Conjecture | — | https://www.conjecture.org | wikidata.org | x1hQfPpo8g |
| Dan Hendrycks | — | https://hendrycks.com | — | f_jm56ULnB8A |
| David Krueger | — | https://www.davidscottkrueger.com | — | f_eywAH1NAxg |
| Google DeepMind | — | https://deepmind.google/ | wikidata.org | 2Y4spe8DsA |
| Eliezer Yudkowsky | — | https://www.yudkowsky.net | — | f_tcr35bkwp1 |
| Epoch AI | — | https://epochai.org | — | f_GTFSlqnFlw |
| FAR AI | — | https://far.ai | — | f_PchO7e0QGA |
| Future of Humanity Institute | — | http://www.fhi.ox.ac.uk/ | wikidata.org | wF7Rkbxkbg |
| Future of Life Institute | — | https://futureoflife.org/ | wikidata.org | jBsUkEukYQ |
| ForecastBench | Mar 2026 | https://www.forecastbench.org | — | f_aVr6xH9cNd |
| Founders Fund | — | http://www.foundersfund.com | wikidata.org | NrKgge1b1Q |
| Forecasting Research Institute (FRI) | — | https://forecastingresearch.org | wikidata.org | 4eRTUHLLtA |
| FTX Future Fund | — | https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 | — | f_KTWCJv2vQw |
| FTX | — | https://en.wikipedia.org/wiki/FTX | — | f_DDhHhfcJXA |
| Future of Life Foundation (FLF) | — | https://www.flf.org | flf.org | czOWfDOLvg |
| Geoffrey Hinton | — | https://www.cs.toronto.edu/~hinton/ | — | f_dx6qemfhow |
| GiveWell | — | https://www.givewell.org | — | f_b3f24fq272 |
| Giving What We Can | — | https://www.givingwhatwecan.org/ | — | f_Dg8uYlL7aA |
| GovAI | — | https://governance.ai | — | f_JmckqJ2vnw |
| William and Flora Hewlett Foundation | — | https://www.hewlett.org/ | wikidata.org | HKEBuHCnIQ |
| Holden Karnofsky | — | https://www.cold-takes.com | — | f_lagva40ag7 |
| Jacob Steinhardt | — | https://jsteinhardt.stat.berkeley.edu | — | f_VzmoBlD3AA |
| Johns Hopkins Center for Health Security | — | http://www.centerforhealthsecurity.org/ | wikidata.org | etlhaGMWzg |
| Kalshi (Prediction Market) | — | https://kalshi.com/ | wikidata.org | jYc4Oe5GJA |
| Katja Grace | — | https://aiimpacts.org | — | f_hv6QB2ZWzg |
| Longview Philanthropy | — | https://www.longview.org/ | wikidata.org | sEPBGRfOAA |
| MacArthur Foundation | — | https://www.macfound.org/ | wikidata.org | ggXcLQLgkw |
| Meta AI (FAIR) | — | https://ai.meta.com/ | wikidata.org | jHzNZtapXA |
| Metaforecast | Mar 2026 | https://metaforecast.org | — | f_dYu9AL2gGh |
| METR | — | https://metr.org | — | f_YryJVbw2Bw |
| Machine Intelligence Research Institute | — | http://intelligence.org/ | wikidata.org | naKZe9gv3A |
| Nick Beckstead | — | https://www.nickbeckstead.com/ | — | f_9EOQ53N4Gg |
| Nick Bostrom | — | https://nickbostrom.com | — | f_hzwm2k1t58 |
| NVIDIA | — | https://www.nvidia.com | — | f_qNGX8xU6kA |
| OpenAI Foundation | — | https://openai.com/ | wikidata.org | kH0V0rizPQ |
| OpenAI | — | https://openai.com/ | wikidata.org | EdV4qjV9fA |
| Paul Christiano | — | https://paulfchristiano.com | — | f_ur092jojlz |
| QURI (Quantified Uncertainty Research Institute) | — | https://quantifieduncertainty.org | — | f_FNxcwwwO26 |
| Sam Altman | — | https://blog.samaltman.com | — | f_68fi4nulwx |
| Sam Bankman-Fried | — | https://en.wikipedia.org/wiki/Sam_Bankman-Fried | — | f_fAmmQtDQCg |
| Schmidt Futures | — | https://www.schmidtfutures.com/ | wikidata.org | yhZ1j7uwBg |
| Scott Alexander | — | https://www.astralcodexten.com | — | f_wIn1DbjxRQ |
| Shane Legg | — | https://deepmind.google | — | f_BbpmsSq6zg |
| Squiggle | Mar 2026 | https://www.squiggle-language.com | — | f_sQg8wLm3Rk |
| SquiggleAI | Mar 2026 | https://squigglehub.org | — | f_gBx2DO5jJk |
| Safe Superintelligence Inc. | — | https://ssi.inc/ | wikidata.org | leszJt1HxA |
| Stuart Russell | — | https://people.eecs.berkeley.edu/~russell/ | — | f_ws3y2ichnm |
| The Foundation Layer | — | https://foundation-layer.ai | — | f_aj2EOmQQ7Q |
| Toby Ord | — | https://www.tobyord.com | — | f_UiqpPbYqPQ |
| UK AI Safety Institute | — | https://www.aisi.gov.uk | — | f_uEP85IlxRA |
| US AI Safety Institute | — | https://www.nist.gov/aisi | — | f_8mEYv5duFQ |
| Victoria Krakovna | — | https://vkrakovna.wordpress.com | — | f_YV9KvdNUQg |
| Will MacAskill | — | https://www.willmacaskill.com/ | — | f_zFKfTWKs8w |
| xAI | — | https://x.ai | wikidata.org | jtTOlHKAYg |
| Yann LeCun | — | http://yann.lecun.com | — | f_4eqnc2klts |
Coverage
| Field | Value |
|---|---|
| Applies To | organization, person, project, funder, safety-agenda |
| Applicable Entities | 414 |
| Have Current Data | 75 of 414 (18%) |
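As a quick arithmetic check, the 18% coverage figure above is just the ratio of the two counts in the table, rounded to the nearest whole percent:

```python
applicable_entities = 414  # entities the Website property applies to
have_current_data = 75     # entities with a current Website fact

coverage = have_current_data / applicable_entities
print(f"{coverage:.0%}")  # → 18%
```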
Missing (340)
1Day Sooner · Against Malaria Foundation · AI Forecasting Benchmark Tournament · AI Futures Project · AI Impacts · AI Revenue Sources · AI Safety Camp · AI Safety Support · Alexandre Kaskasoli · Algolia · Alignment Research Engineer Accelerator · Allan Dafoe · Alliance to Feed the Earth in Disasters · Amanda Askell · Ambitious Impact · Andreas Stuhlmüller · Andreessen Horowitz (a16z) · Andrej Karpathy · Anima International · Animal Charity Evaluators · Anthony Aguirre · Anthropic Core Views · Apart Research · Arb Research · ARC Evaluations · Arcadia Impact · Arnold Ventures · Atlas Fellowship · Avital Balwit · Ben Goldhaber · Benjamin Weinstein-Raun · Berkeley Existential Risk Initiative · Beth Barnes · Bezos Earth Fund · Bipartisan Commission on Biodefense · Bloomberg Philanthropies · BlueDot Impact · Blueprint Biosecurity · Bridgewater AIA Labs · Brown University · Cambridge Boston Alignment Initiative · Cambridge Effective Altruism · Canadian AI Safety Institute · Carnegie Endowment for International Peace · Carnegie Mellon University · Caroline Ellison · CEEALAR · Center for a New American Security · Center for AI Risk Management & Alignment (CARMA) · Center for Global Development · Center on Long-Term Risk · Chamber of Progress · Charity Entrepreneurship · Charter Cities Institute · Children's Investment Fund Foundation · Circle Medical · Clerky · CoinTracker · Collective Intelligence Project · Columbia University · Compassion in World Farming · Competitive Enterprise Institute · Connor Leahy · Consumer Technology Association · ControlAI · Convergent Research · Cornell University · Council on Strategic Risks · CSER (Centre for the Study of Existential Risk) · Czech Association for Effective Altruism · Daniela Amodei · Dario Amodei · Dartmouth College · David Sacks · DeepSeek · Demis Hassabis · Donations List Website · Duke University · Dustin Moskovitz · EA Animal Welfare Fund · EA Global · Economic Security Project · Effective Altruism Australia · Effective Altruism Austria · Effective Altruism Denmark · Effective Altruism Finland · Effective Altruism Foundation · Effective Altruism Funds · Effective Altruism Geneva · Effective Altruism Israel · Effective Altruism Netherlands · Effective Altruism New Zealand · Effective Altruism Norway · Effective Altruism Poland · Effective Altruism Singapore · Effective Institutions Project · Effective Ventures Foundation USA · Effektiv Altruism Sverige (EA Sweden) · Eli Lifland · Elicit (AI Research Tool) · Elizabeth Garrett · Elizabeth Kelly · Ello · Elon Musk · Emilia Javorsky · Encode Justice · EU AI Office · Evan Hubinger · Evidence Action · Existential Risk Observatory · Federation of American Scientists · Fei-Fei Li · Firecrawl · Ford Foundation · Foresight Institute · Foundation for American Innovation · Founders Pledge · Frontier Model Forum · FutureSearch · Gary Marcus · Gates Foundation · Gates Philanthropy Partners · Gavin Newsom · Generation Pledge · Georgetown Center for Global Health Science and Security · Georgetown CSET · Georgetown University · Georgia Institute of Technology · GiveDirectly · Giving Pledge · Gladstone AI · Global Catastrophic Risk Institute · Global Health and Development Fund · Global Partnership on Artificial Intelligence (GPAI) · Global Priorities Institute · Good Judgment (Forecasting) · Good Ventures · Goodfire · Gratified · Greg Brockman · Greptile · Grokipedia · Guesstimate · Gwern Branwen · Harvard University · Helen Keller International · Helen Toner · High Impact Athletes · High Impact Engineers · Huw Price · Ian Hogarth · IBBIS (International Biosecurity and Biosafety Initiative for Science) · IDinsight · Ilya Sutskever · Imbue · Innovations for Poverty Action · Institute for AI Policy and Strategy · Institute for Progress · Issa Rice · Jaan Tallinn · Jack Clark · Jacob Hilton · Jaime Sevilla · Jan Leike · Japan AI Safety Institute · Jared Kaplan · Jensen Huang · Joe Biden · John Templeton Foundation · Johns Hopkins University · Josh Jacobson · Josué Estrada · Julia Wise · Kathleen Finlinson · Ketan Ramakrishnan · Keywords AI · Kira Center · Leading the Future super PAC · Legal Priorities Project · Leopold Aschenbrenner · LessWrong · Lightcone Infrastructure · Lighthaven (Event Venue) · Lightning Rod Labs · Lionheart Ventures · London Initiative for Safe AI · Long-Term Future Fund (LTFF) · Longterm Wiki · Luke Muehlhauser · Malaria Consortium · Manifest (Forecasting Conference) · Manifold (Prediction Market) · Manifund · Marc Andreessen · Margrethe Vestager · María de la Lama Laviada · Mark Brakel · Mark Nitzberg · Mark Zuckerberg · Martin Rees · Massachusetts Institute of Technology · MATS ML Alignment Theory Scholars program · Max Tegmark · Median Group · Metaculus · Microsoft · Mira Murati · Mistral AI · MIT AI Risk Repository · Modeling Cooperation · Mustafa Suleyman · Nate Soares · Neel Nanda · New Incentives · New York University · Niskanen Center · NIST and AI Safety · Non-Trivial · Nonlinear · Northeastern University · NTI | bio (Nuclear Threat Initiative - Biological Program) · Nuño Sempere · Oliver Sourbut · Oliver Zhang · Omidyar Network · One for the World · Oneshop · Open Philanthropy · Openmart · Org Watch · Our World in Data · Oxford China Policy Lab · Palisade Research · Panoplia Labs · Pause AI · Peter Thiel (Funder) · Philip Tetlock · PicnicHealth · Pivotal Research · Playground · Poll Everywhere · Polymarket · Princeton University · Probably Good · Prosaic Alignment · Purdue University · R Street Institute · RAND Corporation · Red Queen Bio · Redwood Research · Retell AI · Rethink Priorities · Reworkd AI · Richard Mallah · Richard Ngo · Rippling · RoastMyPost · Robin Hanson · Rockefeller Foundation · Rohin Shah · Rootly · Rutgers University · SaferAI · Sam McCandlish · Samotsvety · Samuel R. Bowman · Satya Nadella · Scott Wiener · Seán Ó hÉigeartaigh · Secure AI Project · SecureBio · SecureDNA · Seldon Lab · Sentience Institute · Sentinel (Catastrophic Risk Foresight) · Simon Institute for Longterm Governance · Simplify · Singapore AI Safety Institute · Situational Awareness LP · Skoll Foundation · Slava Matyukhin · Squiggle Hub · Stack AI · Stampy / AISafety.info · Stanford Existential Risks Initiative · Stanford University · StoryWorth · Sundar Pichai · Survival and Flourishing Fund · Swift Centre · TechNet · The Future Society · The Good Food Institute · The Humane League · The Life You Can Save · The Sequences by Eliezer Yudkowsky · The Unjournal · Timelines Wiki · Timnit Gebru · Timothy Telleen-Lawton · Topos Institute · Tracecat · Training for Good · Turing Labs · Turion · University of California, Berkeley · University of California, Davis · University of California, Los Angeles · University of California, San Francisco · University of Cambridge · University of Chicago · University of Edinburgh · University of Glasgow · University of Maryland · University of Michigan · University of Notre Dame · University of Ottawa · University of Oxford · University of Pennsylvania · University of Southern California · University of Toronto · University of Utah · University of Washington · Value Aligned Research Advisors · Vera Institute of Justice · Vidur Kapur · Vipul Naik · Vitalik Buterin (Funder) · Washington University in St. Louis · Wellcome Trust · Wikipedia Views · Wise Ancestors · X Community Notes · XPT (Existential Risk Persuasion Tournament) · Y Combinator · Yafah Edelman · Yale University · Yoshua Bengio · Zach Robinson · Zapier