Can we Build a Superintelligence Without Being Superintelligent?
Field | Value | Language |
dc.contributor.author | Sternhell, Robert | |
dc.date.accessioned | 2018-09-28 | |
dc.date.available | 2018-09-28 | |
dc.date.issued | 2017-01-01 | |
dc.identifier.uri | http://hdl.handle.net/2123/18836 | |
dc.description.abstract | If we create an entity of greater intelligence than ourselves, a superintelligence, it may undergo an intelligence explosion, creating ever more intelligent entities. If the intelligence explosion argument holds, then the most important step towards a powerful superintelligence is the development of the first superintelligence. This paper presents objections to the possibility of humans developing this first superintelligence. I argue that we lack the knowledge required to do so, owing to our epistemic position of not being superintelligent ourselves. I engage primarily with arguments from David Chalmers and Nick Bostrom about what superintelligences are and about the nature of the intelligence explosion. I add my own detail to these areas and explore how intelligence might be increased. I argue that my objections stem from flawed expectations about superintelligence, and that we ought to revise them. I then present my own alternative expectations for superintelligence. | en_AU |
dc.language.iso | en_AU | en_AU |
dc.publisher | Department of Philosophy | en_AU |
dc.rights | The author retains copyright of this thesis | en_AU |
dc.subject | artificial intelligence | en_AU |
dc.subject | AI | en_AU |
dc.subject | The Singularity | en_AU |
dc.subject | intelligence | en_AU |
dc.subject | intelligence explosion | en_AU |
dc.subject | superintelligence | en_AU |
dc.title | Can we Build a Superintelligence Without Being Superintelligent? | en_AU |
dc.type | Thesis, Honours | en_AU |
dc.contributor.department | Department of Philosophy | en_AU |