Item | Value |
---|---|
Organization | Title | Start date | End date | AI safety relation | Subject | Employment type | Source | Notes |
---|---|---|---|---|---|---|---|---|
OpenAI | Research Scientist | 2019-06-01 | | AGI organization | technical research | | [1] | |
Theiss Research | | | | | | | [2] | One of the 37 AGI safety researchers of 2015, funded by donations from Elon Musk and the Open Philanthropy Project |
Name | Creation date | Description |
---|---|---|
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
---|---|---|---|---|---|---|---|---|
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Affected agendas | Notes |
---|---|---|---|---|---|---|---|
Showing at most 20 people who are most similar in terms of the organizations they have worked at; a sketch of how such a ranking could be computed follows the table.
Person | Number of organizations in common | List of organizations in common |
---|---|---|
Paul Christiano | 2 | OpenAI, Theiss Research |
Yang Liu | 1 | OpenAI |
Mikhail Pavlov | 1 | OpenAI |
Natalie Summers | 1 | OpenAI |
Nancy Otero | 1 | OpenAI |
Nadja Rhodes | 1 | OpenAI |
Munashe Shumba | 1 | OpenAI |
Mor Katz | 1 | OpenAI |
Mo Bavarian | 1 | OpenAI |
Mira Murati | 1 | OpenAI |
Michał Staniszewski | 1 | OpenAI |
Nikhil Mishra | 1 | OpenAI |
Michael Petrov | 1 | OpenAI |
Meredith Blankenship | 1 | OpenAI |
Maxim Sokolov | 1 | OpenAI |
Matthias Plappert | 1 | OpenAI |
Matthew Gentzel | 1 | OpenAI |
Matt Mochary | 1 | OpenAI |
Matt Krisiloff | 1 | OpenAI |
Nicolas Papernot | 1 | OpenAI |
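The table above is a ranked set-overlap query: count the organizations each person shares with the subject, sort descending, and cut off at 20. Below is a minimal Python sketch of that computation, assuming a person-to-organizations mapping; `most_similar`, the sample data, and the alphabetical tie-break are illustrative assumptions, not the site's actual query.

```python
def most_similar(subject_orgs, people_orgs, limit=20):
    """Rank people by how many organizations they share with the subject."""
    rows = []
    for person, orgs in people_orgs.items():
        common = sorted(subject_orgs & orgs)  # organizations in common
        if common:
            rows.append((person, len(common), common))
    # Most organizations in common first; ties broken alphabetically here
    # (an assumption; the site's own tie-break order is not documented).
    rows.sort(key=lambda row: (-row[1], row[0]))
    return rows[:limit]

# Hypothetical excerpt of the underlying data, matching the tables above.
subject_orgs = {"OpenAI", "Theiss Research"}
people_orgs = {
    "Paul Christiano": {"OpenAI", "Theiss Research"},
    "Yang Liu": {"OpenAI"},
    "Mira Murati": {"OpenAI"},
}

for person, count, orgs in most_similar(subject_orgs, people_orgs):
    print(f"{person} | {count} | {', '.join(orgs)}")
```

On this sample data the sketch prints Paul Christiano first with 2 organizations in common, mirroring the first row of the table.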