Item | Value |
---|---|
Donations List Website (data still preliminary) | |
Organization | Title | Start date | End date | AI safety relation | Subject | Employment type | Source | Notes |
---|---|---|---|---|---|---|---|---|
Alignment Research Center | President | | | | | full-time | [1] | |
Alignment Research Center | Research staff member | | | | | full-time | [1] | |
OpenAI | OpenAI Fellow | 2018-10-01 | 2019-04-01 | AGI organization | | | [2] | Reinforcement learning team |
Name | Creation date | Description |
---|---|---|
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
---|---|---|---|---|---|---|---|---|
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Affected agendas | Notes |
---|---|---|---|---|---|---|---|
Showing at most 20 people who are most similar in terms of which organizations they have worked at.
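The page does not specify how this similarity is computed beyond being based on shared organizations. Below is a minimal sketch of one plausible interpretation, counting how many organizations two people have in common; the function name, data layout, and tie-breaking rule are assumptions, not the site's actual implementation.

```python
# Hypothetical sketch: rank people by how many organizations they share
# with a target person. The site's real metric is not documented here.
from typing import Dict, List, Set, Tuple


def most_similar(
    target: str,
    positions: Dict[str, Set[str]],  # person -> organizations worked at
    limit: int = 20,
) -> List[Tuple[str, int]]:
    """Return up to `limit` people sharing the most organizations with `target`."""
    target_orgs = positions.get(target, set())
    scores = []
    for person, orgs in positions.items():
        if person == target:
            continue
        shared = len(target_orgs & orgs)
        if shared > 0:
            scores.append((person, shared))
    # Sort by shared-organization count (descending), then by name for stable ties.
    scores.sort(key=lambda pair: (-pair[1], pair[0]))
    return scores[:limit]


if __name__ == "__main__":
    # Made-up example data for illustration only.
    data = {
        "Person A": {"Alignment Research Center", "OpenAI"},
        "Person B": {"OpenAI"},
        "Person C": {"Alignment Research Center"},
        "Person D": {"Some Other Org"},
    }
    print(most_similar("Person A", data))
```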