Item | Value |
---|---|
Organization | Title | Start date | End date | AI safety relation | Subject | Employment type | Source | Notes |
---|---|---|---|---|---|---|---|---|
University of Oxford | Postdoctoral Research Scientist | 2015-01-01 | | technical research | | | [1], [2], [3] | One of 10 AGI safety researchers awarded over $2 million in total by the Future of Life Institute in 2018; also one of 37 AGI safety researchers funded in 2015 by donations from Elon Musk and the Open Philanthropy Project |
Future of Humanity Institute | Research Fellow | | | | | | [4], [5], [6], [7] | |
Ought | Board member and collaborator | 2018-10-17 | 2019-02-02 | position | | board member | [8], [9] | |
Ought | Advisor | 2021-05-14 | 2023-09-01 | position | | advisor | [10], [11] | |
Ought | Director & CEO | 2023-10-18 | | position | | board member | [12] | |
Name | Creation date | Description |
---|---|---|
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
---|---|---|---|---|---|---|---|---|
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Affected agendas | Notes |
---|---|---|---|---|---|---|---|
Showing at most 20 people who are most similar in terms of which organizations they have worked at. (No similar people are listed here.)