Organization | Title | Start date | End date | AI safety relation | Subject | Employment type | Source | Notes |
---|---|---|---|---|---|---|---|---|
Effective Altruism Foundation | Researcher | | | | | | [1] | |
Effective Altruism Funds | Fund Chair (Forethought Foundation) | | | | | | [2] | |
Forethought Foundation for Global Priorities Research | Chief of Staff | 2021-07-01 | | | | full-time | [3], [4] | |
Foundational Research Institute | Executive Director | 2017-06-01 | 2018-06-01 | GCR organization | general | full-time | [5], [6] | |
Sentience Institute | Board member | 2019-08-01 | | | | board member | [7] | |
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
---|---|---|---|---|---|---|---|---|
Comment on After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation | 2019-02-26 | Max Daniel | Effective Altruism Forum | Open Philanthropy, OpenAI, Berkeley Existential Risk Initiative, Centre for Effective Altruism, Future of Humanity Institute, AI Impacts | | | | Daniel gives his experience applying to various effective altruist organizations, detailing the time he spent on applications and preparation. |