Item | Value |
---|---|

Organization | Title | Start date | End date | AI safety relation | Subject | Employment type | Source | Notes |
---|---|---|---|---|---|---|---|---|
Fund for Alignment Research | Collaborator | 2022-07-01 | 2022-07-01 | board member | | | [1] | |
Fund for Alignment Research | Technical Staff | 2022-07-04 | | board member | | | [2] | |
Ought | Software Engineering Intern | 2022-01-01 | | position | | intern | [3] | |

Name | Creation date | Description |
---|---|---|

Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
---|---|---|---|---|---|---|---|---|

Title | Publication date | Author | Publisher | Affected organizations | Affected people | Affected agendas | Notes |
---|---|---|---|---|---|---|---|

Showing at most 20 people who are most similar in terms of which organizations they have worked at; a minimal sketch of this overlap metric follows the table.

Person | Number of organizations in common | List of organizations in common |
---|---|---|
Ethan Perez | 1 | Fund for Alignment Research |
Tom McGrath | 1 | Ought |
Aparna Ashok | 1 | Ought |
Luke Stebbing | 1 | Ought |
Justin Reppert | 1 | Ought |
Eli Lifland | 1 | Ought |
Amanda Ngo | 1 | Ought |
Jungwon Byun | 1 | Ought |
Chris Cundy | 1 | Ought |
Milan Griffes | 1 | Ought |
Zachary Miller | 1 | Ought |
Ben West | 1 | Ought |
Andrew Schreiber | 1 | Ought |
Ben Rachbach | 1 | Ought |
Noah Goodman | 1 | Ought |
Ben Weinstein-Raun | 1 | Ought |
Neal Jean | 1 | Ought |
Jun Shern Chan | 1 | Fund for Alignment Research |
Girish Sastry | 1 | Ought |
Ben Goldhaber | 1 | Ought |
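
The similarity listing above is a simple overlap count: each candidate is ranked by how many organizations they share with the subject, capped at 20 rows. Below is a minimal sketch of that metric, assuming each person maps to a set of organization names; the `similar_people` helper and the dict-of-sets layout are illustrative assumptions, and only the two sample entries are drawn from the table above.

```python
# Hypothetical sketch of the overlap metric described above: rank other
# people by how many organizations they share with the subject.
# The function name and data layout are assumptions for illustration.

def similar_people(subject_orgs, others, limit=20):
    """Rank people by the number of organizations shared with the subject."""
    rows = []
    for name, orgs in others.items():
        common = sorted(subject_orgs & orgs)  # organizations in common
        if common:
            rows.append((name, len(common), common))
    # Largest overlap first; ties broken alphabetically for a stable order.
    rows.sort(key=lambda r: (-r[1], r[0]))
    return rows[:limit]

subject = {"Fund for Alignment Research", "Ought"}
others = {
    "Ethan Perez": {"Fund for Alignment Research"},
    "Tom McGrath": {"Ought"},
}
for name, count, orgs in similar_people(subject, others):
    print(f"{name} | {count} | {', '.join(orgs)}")
```

Ties at the same overlap count need a deterministic order; the sketch breaks them alphabetically, which happens to reproduce the first two rows of the table.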