Item | Value |
---|---|
Facebook username | abramdemski |
GitHub username | abramdemski |
Intelligent Agent Foundations Forum username | Abram_Demski |
Agendas | Embedded agency |
Organization | Title | Start date | End date | AI safety relation | Subject | Employment type | Source | Notes |
---|---|---|---|---|---|---|---|---|
Machine Intelligence Research Institute | Research Associate | 2015-04-01 | 2017-07-01 | position | | | [1], [2] | |
Machine Intelligence Research Institute | Research Fellow | 2017-07-21 | | position | | | [3], [4] | |
Name | Creation date | Description |
---|---|---|
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
---|---|---|---|---|---|---|---|---|
Alignment Research Field Guide | 2019-03-08 | Abram Demski | LessWrong | Machine Intelligence Research Institute | | | AI safety | A guide for getting involved in AI alignment research, written with MIRIx groups (local groups that meet up to discuss MIRI's work or make progress on AI alignment, with financial support from MIRI) in mind. |
Title | Publication date | Author | Publisher | Affected organizations | Affected people | Affected agendas | Notes |
---|---|---|---|---|---|---|---|
Showing at most 20 people who are most similar in terms of which organizations they have worked at: