Donations List Website (data still preliminary)
| Organization | Title | Start date | End date | AI safety relation | Subject | Employment type | Source | Notes |
|---|---|---|---|---|---|---|---|---|
| Leverhulme Centre for the Future of Intelligence | Project Leader | | | | | | [1], [2], [3] | |
| Centre for the Study of Existential Risk | Executive Director | 2014-10-01 | 2019-04-01 | position | | | [4], [5], [6], [7], [8], [9], [10], [3] | |
| Berkeley Existential Risk Initiative | Board advisor | 2017-03-17 | 2024-09-24 | position | | advisor | [11], [12] | |
| Global Catastrophic Risk Institute | Senior Advisor | 2018-12-12 | | GCR organization | | advisor | [13], [14] | |
| Centre for the Study of Existential Risk | Co-Director | 2019-04-01 | | position | | | [4] | |
| Future of Humanity Institute | Research Associate | 2024-04-16 | | | | | [15], [16] | |
| Name | Creation date | Description |
|---|---|---|
| Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
|---|---|---|---|---|---|---|---|---|
| Title | Publication date | Author | Publisher | Affected organizations | Affected people | Affected agendas | Notes |
|---|---|---|---|---|---|---|---|
Showing at most the 20 people most similar in terms of which organizations they have worked at.