Superintelligence - Nick Bostrom

Updated: 7 Sep 2020
The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but we have cleverer brains. If machine brains one day come to surpass human brains in general intelligence, then this new superintelligence could become very powerful. As the fate of the gorillas now depends more on us humans than on the gorillas themselves, so the fate of our species would then come to depend on the actions of the machine superintelligence.

But we have one advantage: we get to make the first move. Will it be possible to construct a seed AI, or otherwise to engineer initial conditions, so as to make an intelligence explosion survivable? How could one achieve a controlled detonation?

To get closer to an answer to this question, we must make our way through a fascinating landscape of topics and considerations. Read the book and learn about oracles, genies, and singletons; about boxing methods, tripwires, and mind crime; about humanity's cosmic endowment and differential technological development; indirect normativity, instrumental convergence, whole brain emulation, and technology couplings; Malthusian economics and dystopian evolution; artificial intelligence, biological cognitive enhancement, and collective intelligence.
Philosopher, Interviewer, Business Coach
5 years ago
I’ve only recently begun to pay attention to the progress being made in artificial intelligence. The field is advancing faster than most people realise, and we seem to be headed for a precipice of sorts. Reading Bostrom’s book, you come away feeling that there may be no way to build machines that possess true “general intelligence” – that is, the ability to learn new concepts and apply them in unfamiliar contexts – without destroying ourselves in the process. You also get the sense that we will inevitably build such machines, unless we destroy ourselves some other way.
Entrepreneur, Philosopher
over 1 year ago
"I wouldn’t just read [this book] breathless and wide-eyed and believe everything." - Naval Ravikant
Entrepreneur
1 year ago
"Best thing I’ve seen on this topic." - Sam Altman
Entrepreneur, Scientist
over 5 years ago
Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes.