1076:" in a description of how he thought this statement might be perceived by others. The apology, posted on his website, stated that "the invocation of a racial slur was repulsive" and that he "completely repudiate this disgusting email". In his apology, he wrote: "I think it is deeply unfair that unequal access to education, nutrients and basic healthcare leads to inequality in social outcomes, including sometimes disparities in skills and cognitive capacity." According to
, the chair of the Oxford philosophy department faculty board, sent an email to current students and graduates saying that she was "appalled and upset" by Bostrom's 1996 email. According to Bostrom, the investigation by Oxford University concluded on 10 August 2023 that "we do not consider you to be a racist or that you hold racist views, and we consider that the apology you posted in January 2023 was sincere."
(SIA), shows how they lead to different conclusions in a number of cases, and identifies how each is affected by paradoxes or counterintuitive implications in certain thought experiments. He suggests that a way forward may involve extending SSA into the Strong Self-Sampling Assumption (SSSA), which replaces "observers" in the SSA definition with "observer-moments".
, he explores the concept of an ideal life, should humanity transition successfully into a post-superintelligence world. Bostrom notes that the question is "not how interesting a future is to look at, but how good it is to live in." He outlines some technologies that he considers physically possible in theory and available at technological maturity, such as
, which he defines as one in which an "adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential". Bostrom is mostly concerned about anthropogenic risks, which are risks arising from human activities, particularly from new technologies such as advanced artificial intelligence,
, arbitrary sensory inputs (taste, sound...), or the precise control of motivation, mood, well-being and personality. According to him, not only would machines be better than humans at work, but they would also undermine the purpose of many leisure activities, providing extreme welfare while challenging the quest for meaning.
. This principle states that we ought to retard the development of dangerous technologies, particularly ones that raise the level of existential risk, and accelerate the development of beneficial technologies, particularly those that protect against the existential risks posed by nature or by other technologies.
, resulting (potentially rapidly) in a superintelligence. Such a superintelligence could have vastly superior capabilities, notably in strategizing, social manipulation, hacking, or economic productivity. With such capabilities, a superintelligence could outwit humans and take over the world, establishing a
Suppose we give an A.I. the goal to make humans smile. When the A.I. is weak, it performs useful or amusing actions that cause its user to smile. When the A.I. becomes superintelligent, it realizes that there is a more effective way to achieve this goal: take control of the world and stick electrodes
we do not consider you to be a racist or that you hold racist views, and we consider that the apology you posted in
January 2023 was sincere. … we believe that your apology, your acknowledgement of the distress your actions caused, and your appreciation for the care and time that everyone has given
Bostrom notes that "the concept of a singleton is an abstract one: a singleton could be democracy, a tyranny, a single dominant AI, a strong set of global norms that include effective provisions for their own enforcement, or even an alien overlord—its defining characteristic being simply that it is
in 2006. Given humans' irrational status quo bias, how can one distinguish between valid criticisms of proposed changes in a human trait and criticisms merely motivated by resistance to change? The reversal test attempts to do this by asking whether it would be a good thing if the trait were altered
Bostrom warns that an existential catastrophe can also occur from AI being misused by humans for destructive purposes, or from humans failing to take into account the potential moral status of digital minds. Despite these risks, he says that machine superintelligence seems involved at some point in
tries to achieve for its own intrinsic value. Instrumental goals are just intermediary steps towards final goals. Bostrom contends there are instrumental goals that will be shared by most sufficiently intelligent agents because they are generally useful to achieve any objective (e.g. preserving the
In a paper called "The
Vulnerable World Hypothesis", Bostrom suggests that there may be some technologies that destroy human civilization by default when discovered. Bostrom proposes a framework for classifying and dealing with these vulnerabilities. He also gives counterfactual thought experiments
The apology did little to placate
Bostrom's critics, not least because he conspicuously failed to withdraw his central contention regarding race and intelligence, and seemed to make a partial defence of eugenics. Although, after an investigation, Oxford University did accept that Bostrom was not a
dynamics. He suggests potential techniques to help control AI, including containment, stunting AI capabilities or knowledge, narrowing the operating context (e.g. to question-answering), or "tripwires" (diagnostic mechanisms that can lead to a shutdown). But
Bostrom contends that "we should not be
than humans, using fewer resources. Such highly sentient machines, which he calls "super-beneficiaries", would be extremely efficient at achieving happiness. He recommends finding "paths that will enable digital minds and biological minds to coexist, in a mutually beneficial way where all of these
, an observation selection effect that prevents observers from observing certain kinds of catastrophes in their recent geological and evolutionary past. They suggest that events that lie in the anthropic shadow are likely to be underestimated unless statistical corrections are made.
. It was praised for offering clear and compelling arguments on a neglected yet important topic. It was sometimes criticized for spreading pessimism about the potential of AI, or for focusing on long-term and speculative risks. Some skeptics such as
Bostrom says that the risk can be reduced if society sufficiently exits what he calls a "semi-anarchic default condition", which roughly means limited capabilities for preventive policing and global governance, and having individuals with diverse
can provide a solution and that "In any case, the time-scale for human natural genetic evolution seems much too grand for such developments to have any significant effect before other developments will have made the issue moot".
Bostrom supports the substrate independence principle, the idea that consciousness can emerge on various types of physical substrates, not only in "carbon-based biological neural networks" like the human brain. He considers that
to this process has been genuine and sincere. We were also encouraged that you have already embarked on a journey of deep and meaningful reflection, which includes exploring the learning and self-education from this process.
is a common flaw in many areas of inquiry (including cosmology, philosophy, evolutionary theory, game theory, and quantum physics). He argues that an anthropic theory is needed to deal with these. He introduces the
, Sweden, he disliked school at a young age and spent his last year of high school learning from home. He was interested in a wide variety of academic areas, including anthropology, art, literature, and science.
, which he defines as "any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest". He views this as a major source of both opportunities and existential risks.
confident in our ability to keep a superintelligent genie locked up in its bottle forever. Sooner or later, it will out". He thus suggests that in order to be safe for humanity, superintelligence must be
Bostrom has suggested that technology policy aimed at reducing existential risk should seek to influence the order in which various technological capabilities are attained, proposing the principle of
, "The University and Faculty of Philosophy is currently investigating the matter but condemns in the strongest terms possible the views this particular academic expressed in his communications."
Bostrom is favorably disposed toward "human enhancement", or "self-improvement and human perfectibility through the ethical application of science", and is a critic of bio-conservative views.
In
January 2023, Bostrom issued an apology for a 1996 email he had sent as a postgraduate student, in which he stated that he thought "Blacks are more stupid than whites", and in which he also used the word "
is possible and explores different types of superintelligences, their cognition, and the associated risks. He also presents technical and strategic considerations on how to make it safe.
Bostrom's theory of the unilateralist's curse has been cited as a reason for the scientific community to avoid controversial dangerous research such as reanimating pathogens.
considers that there is no existential risk, asserting that superintelligent AI will have no desire for self-preservation and that experts can be trusted to make it safe.
. On the other hand, he writes that virtually any level of intelligence can in theory be combined with virtually any final goal (even absurd final goals, e.g.
(which is "a world order in which there is at the global level a single decision-making agency") and optimizing the world according to its final goals.
(doing what is morally right), and moral permissibility (following humanity's coherent extrapolated volition except when it's morally impermissible).
wrote that
Bostrom's book on superintelligence "is not intended as a treatise of deep originality; Bostrom's contribution is to impose the rigors of
of how such vulnerabilities could have historically occurred, e.g. if nuclear weapons had been easier to develop or had ignited the atmosphere (as
in 1996. During his time at
Stockholm University, he researched the relationship between language and reality by studying the analytic philosopher
, "The apology did little to placate Bostrom’s critics, not least because he conspicuously failed to withdraw his central contention regarding
and learned helplessness can prevent people from taking action to defeat aging even when the means to do so are at their disposal. YouTuber
characterize the relationship between existential risk and the broader class of global catastrophic risks, and link existential risk to
Despite having popularized the notion of existential risk from AI, Bostrom has also analyzed potential upsides. In his 2024 book, Deep Utopia: Life and Meaning in a Solved World,
. The fable personifies death as a dragon that demands a tribute of thousands of people every day. The story explores how
is a matter of degree" and that digital minds can in theory be engineered to have a much higher rate and intensity of
The fraction of posthuman civilizations that are interested in running ancestor-simulations is very close to zero;
Bostrom has provided policy advice and consulted for many governments and organizations. He gave evidence to the
agent's own existence or current goals, acquiring resources, improving its cognition...); this is the concept of
Bostrom's simulation argument posits that at least one of the following statements is very likely to be true:
The fraction of all people with our kind of experiences that are living in a simulation is very close to one.
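In the underlying 2003 paper ("Are You Living in a Computer Simulation?"), the trilemma is derived from a simple expected fraction; the sketch below follows the paper's notation and is only a summary of that derivation.

```latex
% f_P     : fraction of human-level civilizations that reach a posthuman stage
% \bar{N} : average number of ancestor-simulations run by a posthuman civilization
% \bar{H} : average number of individuals that have lived in a civilization
%           before it reaches a posthuman stage
% Expected fraction of observers with human-type experiences who are simulated:
f_{\mathrm{sim}} \;=\; \frac{f_P\,\bar{N}\,\bar{H}}{f_P\,\bar{N}\,\bar{H} \,+\, \bar{H}}
                 \;=\; \frac{f_P\,\bar{N}}{f_P\,\bar{N} + 1}
% Unless f_P \bar{N} is very small (i.e. proposition 1 or 2 holds),
% f_sim is close to one, which is proposition 3.
```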
. In the book, he criticizes previous formulations of the anthropic principle, including those of
Bostrom argues that giving simplistic final goals to a superintelligence could be catastrophic:
The fraction of human-level civilizations that reach a posthuman stage is very close to zero;
In 2011, Bostrom founded the Oxford Martin
Program on the Impacts of Future Technology.
Bostrom's research concerns the future of humanity and long-term outcomes. He discusses
. He also did some turns on London's stand-up comedy circuit. In 2000, he was awarded a
with morality or human values so that it is "fundamentally on our side". Potential AI
s 2009 list of top global thinkers "for accepting no limits on human potential."
contended that superintelligence is too far away for the risk to be significant.
. He emphasizes the importance of international collaboration, notably to reduce
, Select Committee on Digital Skills. He is an advisory board member for the
on a messy corpus of ideas that emerged at the margins of academic thought."
some form of agency that can solve all major global coordination problems"
He argues that an AI with the ability to improve itself might initiate an
and is now
Principal Researcher at the Macrostrategy Research Initiative.
Bostrom explores multiple possible paths to superintelligence, including
, although he is no longer involved with either of these organisations.
Anthropic Bias: Observation Selection Effects in Science and Philosophy
racist, the whole episode left a stain on the institute's reputation
into the facial muscles of humans to cause constant, beaming grins.
3803:"Apocalypse Soon: Meet The Scientists Preparing For the End Times"
3569:"The reversal test: eliminating status quo bias in applied ethics"
1533:"The Reversal Test: Eliminating Status Quo Bias in Applied Ethics"
5411:
4641:
3776:
2570:
221:
5211:
4821:
1763:"Cognitive Enhancement: Methods, Ethics, Regulatory Challenges"
1357:"The Mysteries of Self-Locating Belief and Anthropic Reasoning"
1073:
369:
216:
3428:"Could our Universe be a simulation? How would we even tell?"
2195:
Bostrom met his wife Susan in 2002. As of 2015, she lived in
and received positive feedback from personalities such as
Professorial Distinction Award from University of Oxford
3916:"Investigation Launched into Oxford Don's Racist Email"
3719:
2779:"The Flip Side of Optimism About Life on Other Planets"
2575:(PhD). London School of Economics and Political Science
1850:
2954:"You Should Be Terrified of Superintelligent Machines"
2446:
4701:
Existential risk from artificial general intelligence
4449:
Existential risk from artificial general intelligence
3878:
3314:"The intelligent monster that you should let eat you"
3135:"Yann LeCun sparks a debate on AGI vs human-level AI"
348:
330:
4269:
All-Party Parliamentary Group for Future Generations
2914:
2912:
2910:
2908:
2906:
2630:"We're Underestimating the Risk of Human Extinction"
717:"all the plausible paths to a really great future".
of human civilization. He is also an adviser to the
. He was the founding director of the now dissolved
and human intelligence enhancement, but focuses on
which, until its shutdown in 2024, researched the
Association (which has since changed its name to
680:Bostrom explores several pathways to reduce the
934:Institute for Ethics and Emerging Technologies
. A shorter version was published in 2012 in
, and an external advisor for the Cambridge
986:effects in human populations but he thinks
823:Bostrom has published numerous articles on
709:(human values improved via extrapolation),
1090:, and seemed to make a partial defence of
967:created an animated version of the story.
815:different forms can flourish and thrive".
624:, explaining that electronic devices have
2199:and Bostrom in Oxford. They had one son.
1036:listed Bostrom in their 2014 list of the
852:Bostrom believes that the mishandling of
In January 2023, Oxford University told
1062:Centre for the Study of Existential Risk
Bostrom's work also considers potential
547:Centre for the Study of Existential Risk
1054:Machine Intelligence Research Institute
1007:differential technological development
1001:Differential technological development
Characteristics of a superintelligence
degree in philosophy and physics from
Bostrom draws a distinction between
In 1998, Bostrom co-founded (with
633:final goals and instrumental goals
Born as Niklas Boström in 1973 in
Bostrom believes that advances in
Philosopher and writer (born 1973)
. He held a teaching position at
). In 2004, he co-founded (with
In later work, he proposed with
622:artificial general intelligence
945:The Fable of the Dragon-Tyrant
707:coherent extrapolated volition
In the 2008 essay collection,
degree in philosophy from the
In 2005, Bostrom founded the
from 2000 to 2002, and was a
539:Future of Humanity Institute
402:Future of Humanity Institute
365:[ˈnɪ̌kːlasˈbûːstrœm]
210:Future of Humanity Institute
979:in the opposite direction.
Ethics of human enhancement
In 2014, Bostrom published Superintelligence: Paths, Dangers, Strategies.
Vulnerable world hypothesis
Postdoctoral Fellow at the
in 1994. He then earned an
863:Self-Indication Assumption
648:), a concept he calls the
635:. A final goal is what an
564:observer selection effects
491:London School of Economics
475:computational neuroscience
283:Self-indication assumption
109:London School of Economics
2346:Skeptic (16 April 2024).
2148:10.1017/S1477175613000316
2129:"Why we need friendly AI"
2064:— (February 2013).
2049:10.1007/s11023-012-9282-2
2003:10.1007/s11023-012-9281-3
1792:10.1007/s11948-009-9142-5
1705:10.1017/S1477175600002943
1411:10.1017/S0953820800004076
1383:— (November 2003).
1217:Journal of Future Studies
1140:Global Catastrophic Risks
1018:
950:Journal of Medical Ethics
555:Global Catastrophic Risks
411:Bostrom is the author of
296:
292:
246:
230:
215:
197:
185:
175:
165:
161:
154:s Top World Thinkers list
130:
122:
71:
42:
30:
23:
5283:Eradication of suffering
5262:Transhumanism in fiction
5145:Self-sampling assumption
5120:Global catastrophic risk
4792:Future of Life Institute
4711:Instrumental convergence
4556:The Most Good You Can Do
4469:Intensive animal farming
4454:Global catastrophic risk
4328:Future of Life Institute
3781:Future of Life Institute
3139:Analytics India Magazine
3122:. MIT Technology Review.
3069:"The Doomsday Invention"
2845:Analytics India Magazine
2709:; Bostrom, Nick (2005).
2291:"The Doomsday Invention"
2231:
1971:Analysis and Metaphysics
1637:— (January 2008).
1269:— (October 2001).
1228:— (January 2000).
1159:, edited by Bostrom and
1115:
1058:Future of Life Institute
859:Self-Sampling Assumption
682:existential risk from AI
642:instrumental convergence
628:over biological brains.
528:molecular nanotechnology
493:. His thesis was titled
459:University of Gothenburg
442:Early life and education
368:; born 10 March 1973 in
287:Self-sampling assumption
265:Existential risk studies
79:University of Gothenburg
5601:People from Helsingborg
4647:artificial intelligence
4358:The Good Food Institute
4108:Demandingness objection
3832:"Apology for old email"
3556:(Video). 24 April 2018.
3265:10.1111/1467-9213.00309
3253:Philosophical Quarterly
2826:10.1111/1758-5899.12718
2540:(subscription required)
2461:Oxford University Press
2182:10.1111/1758-5899.12718
2120:10.1111/1758-5899.12123
2082:10.1111/1758-5899.12002
1736:Bostrom, Nick (2008). "
1614:10.1196/annals.1382.015
1340:10.1111/1467-9213.00309
1328:Philosophical Quarterly
1287:10.1111/1467-8284.00310
1246:10.1023/A:1005551304409
604:. The book argues that
432:artificial intelligence
394:superintelligence risks
170:Contemporary philosophy
143:Top 100 Global Thinkers
5361:US Transhumanist Party
4716:Intelligence explosion
4584:What We Owe the Future
4413:Wild Animal Initiative
4148:Moral circle expansion
3338:Bostrom, Nick (2002).
3244:Bostrom, Nick (2003).
3024:Bostrom, Nick (2002).
2919:Bostrom, Nick (2016).
2569:Bostrom, Nick (2000).
1962:Bostrom, Nick (2011).
1931:Bostrom, Nick (2011).
1754:10.2202/1941-6008.1025
1516:"What is a Singleton?"
1319:— (April 2003).
1302:— (March 2002).
1211:Bostrom, Nick (1998).
1068:1996 email controversy
827:, as well as the book
673:
657:intelligence explosion
558:, editors Bostrom and
376:known for his work on
360:
280:Infinitarian paralysis
5150:Simulation hypothesis
4671:AI capability control
4542:The Life You Can Save
4504:Wild animal suffering
3974:Snepvangers, Pieter.
3118:Oren Etzioni (2016).
1836:10.1093/analys/anp062
1426:— (June 2005).
1088:race and intelligence
1023:Bostrom was named in
890:Simulation hypothesis
854:indexical information
812:subjective experience
791:cognitive enhancement
668:
618:whole brain emulation
479:King's College London
390:whole brain emulation
261:Simulation hypothesis
99:King's College London
5621:Swedish philosophers
5477:Gennady Stolyarov II
5318:Techno-progressivism
4762:Center for AI Safety
4168:Venture philanthropy
3624:on 15 September 2018
1982:— (May 2012).
1038:World's Top Thinkers
771:Raffi Khatchadourian
650:orthogonality thesis
507:University of Oxford
467:Stockholm University
406:University of Oxford
206:University of Oxford
89:Stockholm University
5626:Swedish roboticists
5334:Foresight Institute
4977:Our Final Invention
4403:Sentience Institute
3687:on 21 October 2014.
3655:on 25 February 2018
2730:2005Natur.438..754T
1872:2010RiskA..30.1495C
1658:2008Natur.451..520B
1596:2006NYASA1093..201S
1483:2005Natur.438..754T
995:Technology strategy
988:genetic engineering
884:Simulation argument
825:anthropic reasoning
819:Anthropic reasoning
775:analytic philosophy
701:frameworks include
676:Mitigating the risk
509:from 2002 to 2005.
382:anthropic principle
273:Ancestor simulation
192:Analytic philosophy
5382:José Luis Cordeiro
5308:Singularitarianism
4484:Malaria prevention
4444:Economic stability
4429:Biotechnology risk
4383:Malaria Consortium
4348:Giving What We Can
4318:Fistula Foundation
4103:Charity assessment
4084:Effective altruism
3284:The New York Times
3191:The New York Times
2891:. 8 September 2014
2889:The New York Times
2784:The New York Times
2609:on 18 October 2015
2463:. 8 September 2014
2220:Effective altruism
2027:Minds and Machines
1991:Minds and Machines
1817:"Pascal's Mugging"
1738:Letter from Utopia
1731:(May/June): 72–77.
974:, he proposed the
876:the phenomenon of
686:race to the bottom
581:Robert Oppenheimer
277:Information hazard
180:Western philosophy
5611:Swedish ethicists
5571:Consequentialists
5523:
5522:
5497:Eliezer Yudkowsky
5487:Natasha Vita-More
5467:Martine Rothblatt
5209:
5208:
5186:Human Enhancement
5026:
5025:
4943:Eliezer Yudkowsky
4918:Stuart J. Russell
4736:Superintelligence
4609:
4608:
4521:Doing Good Better
4393:Open Philanthropy
4373:Mercy for Animals
4368:The Humane League
4244:Eliezer Yudkowsky
4209:William MacAskill
4184:Sam Bankman-Fried
4098:Aid effectiveness
3407:. Nickbostrom.com
3354:978-0-415-93858-7
3230:978-1-64687-164-3
2930:978-0-19-873983-8
2921:Superintelligence
2777:(3 August 2015).
2379:on 30 August 2018
2373:"nickbostrom.com"
2319:"Infinite Ethics"
2210:Doomsday argument
1964:"Infinite Ethics"
1866:(10): 1495–1506.
1729:Technology Review
1370:(Spring): 59–74.
1185:978-0-19-967811-2
1156:Human Enhancement
1149:978-0-19-857050-9
1044:Public engagement
1034:Prospect Magazine
970:With philosopher
943:the short story "
939:In 2005, Bostrom
870:Milan M. Ćirković
801:Digital sentience
795:reversal of aging
646:making paperclips
606:superintelligence
597:, which became a
587:Superintelligence
560:Milan M. Ćirković
532:synthetic biology
436:superintelligence
434:(AI) may lead to
386:human enhancement
310:
309:
5643:
5513:
5512:
5492:Mark Alan Walker
5236:
5229:
5222:
5213:
5212:
5130:Pascal's mugging
5053:
5046:
5039:
5030:
5029:
5018:
5017:
4965:Human Compatible
4938:Roman Yampolskiy
4686:Consequentialism
4643:Existential risk
4636:
4629:
4622:
4613:
4612:
4563:Practical Ethics
4214:Dustin Moskovitz
4204:Holden Karnofsky
4143:Marginal utility
4077:
4070:
4063:
4054:
4053:
4049:
4035:
4034:
4032:Official website
4017:
4016:
4006:
3997:
3991:
3990:
3988:
3986:
3971:
3965:
3964:
3958:
3956:
3933:
3924:
3923:
3911:
3905:
3904:
3902:
3900:
3885:
3876:
3875:
3873:
3871:
3866:on 27 April 2015
3856:
3850:
3849:
3847:
3845:
3836:
3827:
3818:
3817:
3815:
3813:
3798:
3792:
3791:
3789:
3787:
3773:
3767:
3766:
3764:
3762:
3748:
3742:
3741:
3739:
3737:
3723:
3717:
3716:
3714:
3712:
3695:
3689:
3688:
3671:
3665:
3664:
3662:
3660:
3651:. Archived from
3640:
3634:
3633:
3631:
3629:
3614:
3608:
3607:
3573:
3564:
3558:
3557:
3548:
3542:
3541:
3525:
3519:
3518:
3516:
3514:
3499:
3493:
3492:
3478:
3469:
3468:
3466:
3464:
3449:
3443:
3442:
3440:
3438:
3423:
3417:
3416:
3414:
3412:
3406:
3398:
3392:
3391:
3389:
3387:
3372:
3366:
3365:
3363:
3361:
3346:
3335:
3329:
3328:
3326:
3324:
3309:
3303:
3302:
3300:
3298:
3275:
3269:
3268:
3259:(211): 243–255.
3250:
3241:
3235:
3234:
3216:
3210:
3209:
3207:
3205:
3182:
3176:
3175:
3173:
3171:
3156:
3150:
3149:
3147:
3145:
3130:
3124:
3123:
3115:
3109:
3108:
3106:
3104:
3090:
3084:
3083:
3081:
3079:
3064:
3055:
3054:
3040:
3034:
3033:
3021:
3010:
3009:
3007:
3005:
2983:
2977:
2976:
2974:
2972:
2949:
2943:
2942:
2916:
2901:
2900:
2898:
2896:
2881:
2875:
2874:
2872:
2870:
2855:
2849:
2848:
2836:
2830:
2829:
2811:
2802:
2796:
2795:
2793:
2791:
2771:
2765:
2764:
2762:
2756:. Archived from
2715:
2703:
2697:
2696:
2694:
2692:
2687:on 17 April 2024
2677:
2671:
2670:
2668:
2666:
2651:
2645:
2644:
2642:
2640:
2625:
2619:
2618:
2616:
2614:
2605:. Archived from
2597:Andersen, Ross.
2594:
2585:
2584:
2582:
2580:
2566:
2560:
2559:
2557:
2548:
2542:
2541:
2538:
2536:
2534:
2518:
2510:
2501:
2500:
2498:
2496:
2482:
2473:
2472:
2470:
2468:
2453:
2444:
2443:
2441:
2439:
2425:
2419:
2418:
2416:
2414:
2400:
2389:
2388:
2386:
2384:
2369:
2363:
2362:
2360:
2358:
2343:
2337:
2336:
2334:
2332:
2323:
2315:
2309:
2308:
2286:
2256:
2252:
2246:
2242:
2225:Pascal's mugging
2186:
2184:
2159:
2133:
2123:
2113:
2095:
2085:
2060:
2042:
2024:
2014:
1988:
1978:
1968:
1958:
1956:
1954:
1948:
1942:. Archived from
1937:
1927:
1909:
1903:— (2011).
1899:
1857:
1847:
1821:
1815:— (2009).
1811:
1785:
1767:
1757:
1732:
1726:
1720:— (2008).
1716:
1699:(17–18): 23–28.
1687:
1669:
1643:
1633:
1607:
1581:
1571:
1537:
1527:
1514:— (2006).
1510:
1476:
1474:astro-ph/0512204
1455:
1422:
1404:
1379:
1361:
1355:— (2003).
1351:
1334:(211): 243–255.
1325:
1315:
1298:
1265:
1224:
1205:Journal articles
1161:Julian Savulescu
1031:
878:anthropic shadow
721:Public reception
524:existential risk
518:Existential risk
457:degree from the
378:existential risk
367:
351:
346:
345:
342:
341:
338:
335:
332:
329:
326:
323:
306:
303:
153:
60:
56:
54:
35:
21:
20:
5651:
5650:
5646:
5645:
5644:
5642:
5641:
5640:
5581:Epistemologists
5526:
5525:
5524:
5519:
5501:
5472:Anders Sandberg
5442:Ole Martin Moen
5387:K. Eric Drexler
5365:
5322:
5266:
5245:
5240:
5210:
5205:
5159:
5103:
5094:Anders Sandberg
5079:K. Eric Drexler
5062:
5057:
5027:
5022:
5008:
4947:
4903:Steve Omohundro
4883:Geoffrey Hinton
4873:Stephen Hawking
4858:Paul Christiano
4838:Scott Alexander
4826:
4797:Google DeepMind
4745:
4731:Suffering risks
4649:
4640:
4610:
4605:
4589:
4508:
4474:Land use reform
4417:
4333:Founders Pledge
4308:Evidence Action
4248:
4172:
4128:Earning to give
4086:
4081:
4030:
4029:
4026:
4021:
4020:
4009:nickbostrom.com
4004:
4000:Bostrom, Nick.
3998:
3994:
3984:
3982:
3972:
3968:
3954:
3952:
3934:
3927:
3920:The Oxford Blue
3912:
3908:
3898:
3896:
3894:The Daily Beast
3886:
3879:
3869:
3867:
3858:
3857:
3853:
3843:
3841:
3839:nickbostrom.com
3834:
3830:Bostrom, Nick.
3828:
3821:
3811:
3809:
3799:
3795:
3785:
3783:
3775:
3774:
3770:
3760:
3758:
3750:
3749:
3745:
3735:
3733:
3725:
3724:
3720:
3710:
3708:
3696:
3692:
3673:
3672:
3668:
3658:
3656:
3641:
3637:
3627:
3625:
3616:
3615:
3611:
3571:
3565:
3561:
3550:
3549:
3545:
3526:
3522:
3512:
3510:
3501:
3500:
3496:
3479:
3472:
3462:
3460:
3450:
3446:
3436:
3434:
3424:
3420:
3410:
3408:
3404:
3400:
3399:
3395:
3385:
3383:
3373:
3369:
3359:
3357:
3355:
3344:
3336:
3332:
3322:
3320:
3310:
3306:
3296:
3294:
3276:
3272:
3248:
3242:
3238:
3231:
3217:
3213:
3203:
3201:
3183:
3179:
3169:
3167:
3157:
3153:
3143:
3141:
3131:
3127:
3116:
3112:
3102:
3100:
3092:
3091:
3087:
3077:
3075:
3065:
3058:
3041:
3037:
3022:
3013:
3003:
3001:
2985:
2984:
2980:
2970:
2968:
2950:
2946:
2931:
2917:
2904:
2894:
2892:
2883:
2882:
2878:
2868:
2866:
2856:
2852:
2837:
2833:
2809:
2803:
2799:
2789:
2787:
2775:Overbye, Dennis
2772:
2768:
2763:on 3 July 2011.
2760:
2738:10.1038/438754a
2713:
2704:
2700:
2690:
2688:
2679:
2678:
2674:
2664:
2662:
2652:
2648:
2638:
2636:
2626:
2622:
2612:
2610:
2595:
2588:
2578:
2576:
2567:
2563:
2555:
2551:Bostrom, Nick.
2549:
2545:
2539:
2532:
2530:
2522:Financial Times
2511:
2504:
2494:
2492:
2484:
2483:
2476:
2466:
2464:
2455:
2454:
2447:
2437:
2435:
2433:nickbostrom.com
2427:
2426:
2422:
2412:
2410:
2401:
2392:
2382:
2380:
2371:
2370:
2366:
2356:
2354:
2344:
2340:
2330:
2328:
2326:nickbostrom.com
2321:
2317:
2316:
2312:
2287:
2270:
2265:
2260:
2259:
2253:
2249:
2243:
2239:
2234:
2229:
2205:
2193:
2131:
2111:10.1.1.428.8837
2093:
2022:
1986:
1966:
1952:
1950:
1949:on 4 March 2016
1946:
1935:
1907:
1855:
1819:
1783:10.1.1.143.4686
1765:
1724:
1667:10.1038/451520b
1641:
1605:10.1.1.328.3853
1579:
1535:
1491:10.1038/438754a
1402:10.1.1.429.2849
1359:
1323:
1207:
1118:
1113:
1100:The Daily Beast
1070:
1046:
1029:
1021:
1003:
997:
961:status quo bias
911:
892:
886:
874:Anders Sandberg
821:
803:
783:
737:Stephen Hawking
723:
711:moral rightness
678:
626:many advantages
614:
589:
576:
520:
515:
503:British Academy
499:Yale University
444:
349:
320:
316:
300:
285:
281:
279:
275:
271:
267:
263:
259:
255:
249:
239:
233:
208:
204:
202:Yale University
157:
151:
118:
67:
61:
58:
52:
50:
49:
48:
38:
37:Bostrom in 2020
26:
17:
12:
11:
5:
5649:
5639:
5638:
5633:
5631:Transhumanists
5628:
5623:
5618:
5613:
5608:
5603:
5598:
5593:
5588:
5583:
5578:
5573:
5568:
5563:
5558:
5553:
5548:
5543:
5538:
5521:
5520:
5518:
5517:
5506:
5503:
5502:
5500:
5499:
5494:
5489:
5484:
5479:
5474:
5469:
5464:
5459:
5454:
5449:
5444:
5439:
5434:
5429:
5424:
5419:
5414:
5409:
5404:
5402:Aubrey de Grey
5399:
5394:
5389:
5384:
5379:
5373:
5371:
5367:
5366:
5364:
5363:
5358:
5351:
5346:
5341:
5336:
5330:
5328:
5324:
5323:
5321:
5320:
5315:
5313:Technogaianism
5310:
5305:
5303:Postpoliticism
5300:
5295:
5290:
5285:
5280:
5278:Antinaturalism
5274:
5272:
5268:
5267:
5265:
5264:
5259:
5253:
5251:
5247:
5246:
5239:
5238:
5231:
5224:
5216:
5207:
5206:
5204:
5203:
5196:
5189:
5182:
5175:
5172:Anthropic Bias
5167:
5165:
5161:
5160:
5158:
5157:
5152:
5147:
5142:
5137:
5132:
5127:
5122:
5117:
5111:
5109:
5105:
5104:
5102:
5101:
5099:Rebecca Roache
5096:
5091:
5086:
5081:
5076:
5070:
5068:
5064:
5063:
5056:
5055:
5048:
5041:
5033:
5024:
5023:
5013:
5010:
5009:
5007:
5006:
5001:
4994:
4987:
4980:
4973:
4968:
4961:
4955:
4953:
4949:
4948:
4946:
4945:
4940:
4935:
4930:
4925:
4920:
4915:
4910:
4905:
4900:
4895:
4890:
4885:
4880:
4875:
4870:
4865:
4860:
4855:
4850:
4845:
4840:
4834:
4832:
4828:
4827:
4825:
4824:
4819:
4814:
4809:
4804:
4799:
4794:
4789:
4784:
4779:
4774:
4769:
4764:
4759:
4753:
4751:
4747:
4746:
4744:
4743:
4738:
4733:
4728:
4726:Machine ethics
4723:
4718:
4713:
4708:
4703:
4698:
4693:
4688:
4683:
4678:
4673:
4668:
4663:
4657:
4655:
4651:
4650:
4639:
4638:
4631:
4624:
4616:
4607:
4606:
4604:
4603:
4597:
4595:
4591:
4590:
4588:
4587:
4580:
4573:
4566:
4559:
4552:
4545:
4538:
4531:
4524:
4516:
4514:
4510:
4509:
4507:
4506:
4501:
4496:
4491:
4489:Mass deworming
4486:
4481:
4479:Life extension
4476:
4471:
4466:
4464:Global poverty
4461:
4456:
4451:
4446:
4441:
4436:
4434:Climate change
4431:
4425:
4423:
4419:
4418:
4416:
4415:
4410:
4408:Unlimit Health
4405:
4400:
4395:
4390:
4385:
4380:
4375:
4370:
4365:
4360:
4355:
4353:Good Food Fund
4350:
4345:
4340:
4335:
4330:
4325:
4320:
4315:
4310:
4305:
4300:
4295:
4290:
4285:
4280:
4275:
4270:
4267:
4262:
4256:
4254:
4250:
4249:
4247:
4246:
4241:
4236:
4231:
4226:
4221:
4216:
4211:
4206:
4201:
4199:Hilary Greaves
4196:
4191:
4186:
4180:
4178:
4174:
4173:
4171:
4170:
4165:
4163:Utilitarianism
4160:
4155:
4150:
4145:
4140:
4135:
4130:
4125:
4120:
4118:Disease burden
4115:
4110:
4105:
4100:
4094:
4092:
4088:
4087:
4080:
4079:
4072:
4065:
4057:
4051:
4050:
4036:
4025:
4024:External links
4022:
4019:
4018:
3992:
3966:
3925:
3906:
3877:
3851:
3819:
3793:
3768:
3743:
3718:
3690:
3680:Foreign Policy
3666:
3635:
3609:
3588:10.1086/505233
3582:(4): 656–679.
3559:
3543:
3534:Philosophy Now
3520:
3507:Philosophy Now
3494:
3470:
3452:Nesbit, Jeff.
3444:
3418:
3393:
3367:
3353:
3330:
3304:
3270:
3236:
3229:
3211:
3177:
3151:
3125:
3110:
3085:
3073:The New Yorker
3056:
3035:
3011:
2978:
2944:
2929:
2902:
2876:
2850:
2831:
2820:(4): 455–476.
2797:
2766:
2698:
2672:
2646:
2620:
2586:
2561:
2543:
2502:
2474:
2445:
2420:
2390:
2364:
2338:
2310:
2296:The New Yorker
2267:
2266:
2264:
2261:
2258:
2257:
2247:
2236:
2235:
2233:
2230:
2228:
2227:
2222:
2217:
2215:Dream argument
2212:
2206:
2204:
2201:
2192:
2189:
2188:
2187:
2175:(4): 455–476.
2160:
2124:
2086:
2061:
2040:10.1.1.396.799
2033:(4): 299–324.
2015:
1979:
1959:
1928:
1900:
1848:
1830:(3): 443–445.
1812:
1776:(3): 311–341.
1758:
1733:
1717:
1688:
1634:
1590:(1): 201–207.
1572:
1552:10.1086/505233
1546:(4): 656–680.
1528:
1511:
1456:
1438:(3): 202–214.
1423:
1395:(3): 308–314.
1380:
1352:
1316:
1299:
1281:(4): 309–310.
1266:
1225:
1206:
1203:
1202:
1201:
1199:978-1646871643
1187:
1171:
1151:
1135:
1117:
1114:
1112:
1111:Selected works
1109:
1078:Andrew Anthony
1069:
1066:
1050:House of Lords
1045:
1042:
1026:Foreign Policy
1020:
1017:
996:
993:
956:Philosophy Now
910:
907:
906:
905:
902:
899:
888:Main article:
885:
882:
861:(SSA) and the
835:Brandon Carter
820:
817:
802:
799:
782:
779:
757:Daniel Dennett
731:New York Times
722:
719:
677:
674:
613:
610:
600:New York Times
588:
585:
575:
572:
519:
516:
514:
511:
453:He received a
443:
440:
361:Niklas Boström
308:
307:
298:
294:
293:
290:
289:
253:Anthropic bias
250:
247:
244:
243:
234:
232:Main interests
231:
228:
227:
219:
213:
212:
199:
195:
194:
189:
183:
182:
177:
173:
172:
167:
163:
162:
159:
158:
156:
155:
144:
138:
134:
132:
128:
127:
124:
120:
119:
117:
116:
106:
96:
86:
75:
73:
69:
68:
62:
47:Niklas Boström
46:
44:
40:
39:
36:
28:
27:
24:
15:
9:
6:
4:
3:
2:
5648:
5637:
5634:
5632:
5629:
5627:
5624:
5622:
5619:
5617:
5614:
5612:
5609:
5607:
5604:
5602:
5599:
5597:
5594:
5592:
5591:Futurologists
5589:
5587:
5584:
5582:
5579:
5577:
5574:
5572:
5569:
5567:
5564:
5562:
5559:
5557:
5554:
5552:
5549:
5547:
5546:Living people
5544:
5542:
5539:
5537:
5534:
5533:
5531:
5516:
5508:
5507:
5504:
5498:
5495:
5493:
5490:
5488:
5485:
5483:
5480:
5478:
5475:
5473:
5470:
5468:
5465:
5463:
5460:
5458:
5455:
5453:
5450:
5448:
5445:
5443:
5440:
5438:
5435:
5433:
5432:Julian Huxley
5430:
5428:
5425:
5423:
5420:
5418:
5415:
5413:
5410:
5408:
5407:Zoltan Istvan
5405:
5403:
5400:
5398:
5395:
5393:
5390:
5388:
5385:
5383:
5380:
5378:
5375:
5374:
5372:
5368:
5362:
5359:
5357:
5356:
5352:
5350:
5347:
5345:
5342:
5340:
5337:
5335:
5332:
5331:
5329:
5327:Organizations
5325:
5319:
5316:
5314:
5311:
5309:
5306:
5304:
5301:
5299:
5298:Postgenderism
5296:
5294:
5291:
5289:
5286:
5284:
5281:
5279:
5276:
5275:
5273:
5269:
5263:
5260:
5258:
5255:
5254:
5252:
5248:
5244:
5243:Transhumanism
5237:
5232:
5230:
5225:
5223:
5218:
5217:
5214:
5202:
5201:
5197:
5195:
5194:
5193:The Precipice
5190:
5188:
5187:
5183:
5181:
5180:
5176:
5174:
5173:
5169:
5168:
5166:
5162:
5156:
5153:
5151:
5148:
5146:
5143:
5141:
5138:
5136:
5135:Reversal test
5133:
5131:
5128:
5126:
5123:
5121:
5118:
5116:
5113:
5112:
5110:
5106:
5100:
5097:
5095:
5092:
5090:
5087:
5085:
5082:
5080:
5077:
5075:
5072:
5071:
5069:
5065:
5061:
5054:
5049:
5047:
5042:
5040:
5035:
5034:
5031:
5021:
5011:
5005:
5002:
5000:
4999:
4995:
4993:
4992:
4988:
4986:
4985:
4984:The Precipice
4981:
4979:
4978:
4974:
4972:
4969:
4967:
4966:
4962:
4960:
4957:
4956:
4954:
4950:
4944:
4941:
4939:
4936:
4934:
4933:Frank Wilczek
4931:
4929:
4926:
4924:
4921:
4919:
4916:
4914:
4911:
4909:
4906:
4904:
4901:
4899:
4896:
4894:
4891:
4889:
4886:
4884:
4881:
4879:
4878:Dan Hendrycks
4876:
4874:
4871:
4869:
4866:
4864:
4861:
4859:
4856:
4854:
4851:
4849:
4848:Yoshua Bengio
4846:
4844:
4841:
4839:
4836:
4835:
4833:
4829:
4823:
4820:
4818:
4815:
4813:
4810:
4808:
4805:
4803:
4800:
4798:
4795:
4793:
4790:
4788:
4785:
4783:
4780:
4778:
4775:
4773:
4770:
4768:
4765:
4763:
4760:
4758:
4755:
4754:
4752:
4750:Organizations
4748:
4742:
4739:
4737:
4734:
4732:
4729:
4727:
4724:
4722:
4719:
4717:
4714:
4712:
4709:
4707:
4704:
4702:
4699:
4697:
4694:
4692:
4689:
4687:
4684:
4682:
4679:
4677:
4674:
4672:
4669:
4667:
4664:
4662:
4659:
4658:
4656:
4652:
4648:
4644:
4637:
4632:
4630:
4625:
4623:
4618:
4617:
4614:
4602:
4599:
4598:
4596:
4592:
4586:
4585:
4581:
4579:
4578:
4574:
4572:
4571:
4570:The Precipice
4567:
4565:
4564:
4560:
4558:
4557:
4553:
4551:
4550:
4546:
4544:
4543:
4539:
4537:
4536:
4532:
4530:
4529:
4525:
4523:
4522:
4518:
4517:
4515:
4511:
4505:
4502:
4500:
4497:
4495:
4492:
4490:
4487:
4485:
4482:
4480:
4477:
4475:
4472:
4470:
4467:
4465:
4462:
4460:
4459:Global health
4457:
4455:
4452:
4450:
4447:
4445:
4442:
4440:
4439:Cultured meat
4437:
4435:
4432:
4430:
4427:
4426:
4424:
4420:
4414:
4411:
4409:
4406:
4404:
4401:
4399:
4396:
4394:
4391:
4389:
4386:
4384:
4381:
4379:
4376:
4374:
4371:
4369:
4366:
4364:
4363:Good Ventures
4361:
4359:
4356:
4354:
4351:
4349:
4346:
4344:
4341:
4339:
4336:
4334:
4331:
4329:
4326:
4324:
4321:
4319:
4316:
4314:
4311:
4309:
4306:
4304:
4301:
4299:
4296:
4294:
4291:
4289:
4286:
4284:
4281:
4279:
4278:Animal Ethics
4276:
4274:
4271:
4268:
4266:
4263:
4261:
4258:
4257:
4255:
4253:Organizations
4251:
4245:
4242:
4240:
4237:
4235:
4232:
4230:
4227:
4225:
4222:
4220:
4217:
4215:
4212:
4210:
4207:
4205:
4202:
4200:
4197:
4195:
4192:
4190:
4187:
4185:
4182:
4181:
4179:
4175:
4169:
4166:
4164:
4161:
4159:
4156:
4154:
4151:
4149:
4146:
4144:
4141:
4139:
4136:
4134:
4131:
4129:
4126:
4124:
4121:
4119:
4116:
4114:
4111:
4109:
4106:
4104:
4101:
4099:
4096:
4095:
4093:
4089:
4085:
4078:
4073:
4071:
4066:
4064:
4059:
4058:
4055:
4048:
4044:
4040:
4037:
4033:
4028:
4027:
4015:
4010:
4003:
3996:
3981:
3977:
3970:
3963:
3951:
3947:
3943:
3939:
3932:
3930:
3921:
3917:
3910:
3895:
3891:
3884:
3882:
3865:
3861:
3855:
3840:
3833:
3826:
3824:
3808:
3804:
3797:
3782:
3778:
3772:
3757:
3753:
3747:
3732:
3731:UK Parliament
3728:
3722:
3707:
3706:
3701:
3694:
3686:
3682:
3681:
3676:
3670:
3654:
3650:
3646:
3639:
3623:
3619:
3613:
3605:
3601:
3597:
3593:
3589:
3585:
3581:
3577:
3570:
3563:
3555:
3554:
3547:
3539:
3535:
3531:
3524:
3508:
3504:
3498:
3490:
3489:
3484:
3477:
3475:
3459:
3455:
3448:
3433:
3429:
3422:
3403:
3397:
3382:
3378:
3371:
3356:
3350:
3343:
3342:
3334:
3319:
3315:
3308:
3293:
3289:
3285:
3281:
3274:
3266:
3262:
3258:
3254:
3247:
3240:
3232:
3226:
3222:
3215:
3200:
3196:
3192:
3188:
3181:
3166:
3162:
3155:
3140:
3136:
3129:
3121:
3114:
3099:
3095:
3089:
3074:
3070:
3063:
3061:
3052:
3051:
3046:
3039:
3031:
3027:
3020:
3018:
3016:
3000:
2996:
2992:
2991:The Economist
2988:
2987:"Clever cogs"
2982:
2967:
2963:
2959:
2955:
2948:
2940:
2936:
2932:
2926:
2922:
2915:
2913:
2911:
2909:
2907:
2890:
2886:
2880:
2865:
2861:
2854:
2846:
2842:
2835:
2827:
2823:
2819:
2815:
2814:Global Policy
2808:
2801:
2786:
2785:
2780:
2776:
2770:
2759:
2755:
2751:
2747:
2743:
2739:
2735:
2731:
2727:
2724:(7069): 754.
2723:
2719:
2712:
2708:
2702:
2686:
2682:
2676:
2661:
2657:
2650:
2635:
2631:
2624:
2608:
2604:
2600:
2593:
2591:
2574:
2573:
2565:
2554:
2547:
2528:
2524:
2523:
2517:
2509:
2507:
2491:
2487:
2481:
2479:
2462:
2458:
2452:
2450:
2434:
2430:
2424:
2409:
2406:
2399:
2397:
2395:
2378:
2374:
2368:
2353:
2349:
2342:
2327:
2320:
2314:
2306:
2302:
2298:
2297:
2292:
2285:
2283:
2281:
2279:
2277:
2275:
2273:
2268:
2251:
2241:
2237:
2226:
2223:
2221:
2218:
2216:
2213:
2211:
2208:
2207:
2200:
2198:
2191:Personal life
2183:
2178:
2174:
2170:
2169:Global Policy
2166:
2161:
2157:
2153:
2149:
2145:
2142:(36): 41–47.
2141:
2137:
2130:
2125:
2121:
2117:
2112:
2107:
2103:
2099:
2098:Global Policy
2092:
2087:
2083:
2079:
2075:
2071:
2070:Global Policy
2067:
2062:
2058:
2054:
2050:
2046:
2041:
2036:
2032:
2028:
2021:
2016:
2012:
2008:
2004:
2000:
1996:
1992:
1985:
1980:
1976:
1972:
1965:
1960:
1945:
1941:
1934:
1929:
1925:
1921:
1917:
1913:
1906:
1901:
1897:
1893:
1889:
1885:
1881:
1877:
1873:
1869:
1865:
1861:
1860:Risk Analysis
1854:
1849:
1845:
1841:
1837:
1833:
1829:
1825:
1818:
1813:
1809:
1805:
1801:
1797:
1793:
1789:
1784:
1779:
1775:
1771:
1764:
1759:
1755:
1751:
1747:
1743:
1739:
1734:
1730:
1723:
1718:
1714:
1710:
1706:
1702:
1698:
1694:
1689:
1685:
1681:
1677:
1673:
1668:
1663:
1659:
1655:
1652:(7178): 520.
1651:
1647:
1640:
1635:
1631:
1627:
1623:
1619:
1615:
1611:
1606:
1601:
1597:
1593:
1589:
1585:
1578:
1573:
1569:
1565:
1561:
1557:
1553:
1549:
1545:
1541:
1534:
1529:
1525:
1521:
1517:
1512:
1508:
1504:
1500:
1496:
1492:
1488:
1484:
1480:
1475:
1470:
1467:(7069): 754.
1466:
1462:
1457:
1453:
1449:
1445:
1441:
1437:
1433:
1429:
1424:
1420:
1416:
1412:
1408:
1403:
1398:
1394:
1390:
1386:
1381:
1377:
1373:
1369:
1365:
1358:
1353:
1349:
1345:
1341:
1337:
1333:
1329:
1322:
1317:
1313:
1309:
1305:
1300:
1296:
1292:
1288:
1284:
1280:
1276:
1272:
1267:
1263:
1259:
1255:
1251:
1247:
1243:
1240:(1): 93–108.
1239:
1235:
1231:
1226:
1222:
1218:
1214:
1209:
1208:
1200:
1196:
1192:
1188:
1186:
1182:
1178:
1177:
1172:
1170:
1169:0-19-929972-2
1166:
1162:
1158:
1157:
1152:
1150:
1146:
1142:
1141:
1136:
1134:
1133:0-415-93858-9
1130:
1126:
1125:
1120:
1119:
1108:
1106:
1102:
1101:
1095:
1093:
1089:
1085:
1084:
1079:
1075:
1065:
1063:
1059:
1055:
1051:
1041:
1039:
1035:
1028:
1027:
1016:
1013:
1010:
1008:
1002:
992:
989:
985:
980:
977:
976:reversal test
973:
968:
966:
962:
958:
957:
952:
951:
946:
942:
937:
935:
931:
927:
923:
922:Transhumanist
919:
914:
903:
900:
897:
896:
895:
891:
881:
879:
875:
871:
866:
864:
860:
855:
850:
848:
844:
840:
836:
832:
831:
826:
816:
813:
809:
798:
796:
792:
788:
778:
776:
772:
768:
766:
762:
758:
754:
750:
746:
742:
738:
734:
732:
727:
718:
714:
712:
708:
704:
700:
696:
691:
687:
683:
672:
667:
664:
662:
658:
653:
651:
647:
643:
638:
634:
629:
627:
623:
619:
609:
607:
603:
601:
596:
595:
584:
583:had feared).
582:
571:
569:
568:Fermi paradox
565:
561:
557:
556:
550:
548:
544:
540:
535:
533:
529:
525:
510:
508:
504:
500:
496:
492:
488:
484:
480:
476:
472:
468:
464:
460:
456:
451:
449:
439:
437:
433:
428:
426:
422:
421:
416:
415:
409:
407:
403:
399:
398:reversal test
395:
391:
387:
383:
379:
375:
371:
366:
362:
358:
354:
353:
344:
314:
305:
299:
295:
291:
288:
284:
278:
274:
270:
266:
262:
258:
257:Reversal test
254:
251:
248:Notable ideas
245:
242:
238:
235:
229:
225:
224:
220:
218:
214:
211:
207:
203:
200:
196:
193:
190:
188:
184:
181:
178:
174:
171:
168:
164:
160:
150:
149:
145:
142:
139:
136:
135:
133:
129:
125:
121:
114:
110:
107:
104:
100:
97:
94:
90:
87:
84:
80:
77:
76:
74:
70:
65:
59:(age 51)
57:10 March 1973
45:
41:
34:
29:
22:
19:
5536:Nick Bostrom
5482:Vernor Vinge
5462:David Pearce
5447:Hans Moravec
5437:Ray Kurzweil
5427:James Hughes
5422:Robin Hanson
5397:Ben Goertzel
5377:Nick Bostrom
5376:
5353:
5288:Extropianism
5198:
5191:
5184:
5177:
5170:
5125:Great Filter
5084:Robin Hanson
5074:Nick Bostrom
5073:
4996:
4989:
4982:
4975:
4963:
4923:Jaan Tallinn
4863:Eric Drexler
4853:Nick Bostrom
4852:
4666:AI alignment
4582:
4575:
4568:
4561:
4554:
4547:
4540:
4533:
4526:
4519:
4338:GiveDirectly
4260:80,000 Hours
4234:Peter Singer
4229:Derek Parfit
4219:Yew-Kwang Ng
4194:Nick Bostrom
4193:
4039:Nick Bostrom
4012:
4008:
3995:
3983:. Retrieved
3979:
3969:
3960:
3953:. Retrieved
3942:The Observer
3941:
3919:
3909:
3897:. Retrieved
3893:
3868:. Retrieved
3864:the original
3854:
3842:. Retrieved
3838:
3810:. Retrieved
3807:New Republic
3806:
3796:
3784:. Retrieved
3780:
3771:
3759:. Retrieved
3755:
3746:
3734:. Retrieved
3730:
3721:
3709:. Retrieved
3703:
3693:
3685:the original
3678:
3669:
3657:. Retrieved
3653:the original
3648:
3638:
3626:. Retrieved
3622:the original
3612:
3579:
3575:
3562:
3552:
3546:
3537:
3533:
3523:
3513:12 September
3511:. Retrieved
3506:
3497:
3488:The Guardian
3486:
3461:. Retrieved
3457:
3447:
3437:12 September
3435:. Retrieved
3432:Ars Technica
3431:
3421:
3409:. Retrieved
3396:
3386:12 September
3384:. Retrieved
3381:The Atlantic
3380:
3370:
3358:. Retrieved
3340:
3333:
3321:. Retrieved
3317:
3307:
3295:. Retrieved
3283:
3273:
3256:
3252:
3239:
3220:
3214:
3202:. Retrieved
3190:
3180:
3168:. Retrieved
3164:
3154:
3142:. Retrieved
3138:
3128:
3113:
3101:. Retrieved
3097:
3088:
3076:. Retrieved
3072:
3048:
3038:
3029:
3002:. Retrieved
2990:
2981:
2971:13 September
2969:. Retrieved
2957:
2947:
2920:
2893:. Retrieved
2888:
2879:
2867:. Retrieved
2863:
2853:
2844:
2834:
2817:
2813:
2800:
2788:. Retrieved
2782:
2769:
2758:the original
2721:
2717:
2707:Tegmark, Max
2701:
2689:. Retrieved
2685:the original
2675:
2665:12 September
2663:. Retrieved
2659:
2649:
2637:. Retrieved
2634:The Atlantic
2633:
2623:
2611:. Retrieved
2607:the original
2602:
2577:. Retrieved
2571:
2564:
2546:
2531:. Retrieved
2520:
2493:. Retrieved
2489:
2465:. Retrieved
2436:. Retrieved
2432:
2423:
2411:. Retrieved
2407:
2381:. Retrieved
2377:the original
2367:
2355:. Retrieved
2351:
2341:
2329:. Retrieved
2325:
2313:
2294:
2250:
2245:motivations.
2240:
2194:
2172:
2168:
2139:
2135:
2104:(1): 85–92.
2101:
2097:
2076:(3): 15–31.
2073:
2069:
2030:
2026:
1997:(2): 71–84.
1994:
1990:
1974:
1970:
1951:. Retrieved
1944:the original
1939:
1915:
1911:
1863:
1859:
1827:
1823:
1773:
1769:
1745:
1741:
1728:
1696:
1692:
1649:
1645:
1587:
1583:
1543:
1539:
1523:
1519:
1464:
1460:
1435:
1431:
1392:
1388:
1367:
1363:
1331:
1327:
1311:
1307:
1278:
1274:
1237:
1233:
1220:
1216:
1190:
1174:
1154:
1138:
1122:
1105:Ursula Coope
1098:
1096:
1083:The Guardian
1081:
1071:
1047:
1037:
1024:
1022:
1014:
1011:
1004:
981:
969:
954:
948:
938:
930:James Hughes
920:) the World
918:David Pearce
915:
912:
893:
877:
867:
851:
847:Frank Tipler
828:
822:
804:
786:
784:
769:
761:Oren Etzioni
753:Derek Parfit
749:Peter Singer
730:
725:
724:
715:
690:AI arms race
679:
669:
665:
654:
630:
615:
599:
592:
590:
577:
553:
551:
536:
521:
494:
452:
445:
429:
424:
418:
412:
410:
313:Nick Bostrom
312:
311:
226: (2000)
222:
198:Institutions
146:
25:Nick Bostrom
18:
5576:Cryonicists
5541:1973 births
5392:David Gobel
5293:Immortalism
4928:Max Tegmark
4913:Martin Rees
4721:Longtermism
4681:AI takeover
4422:Focus areas
4313:Faunalytics
4177:Key figures
4138:Longtermism
3659:26 February
2895:19 February
2613:5 September
2331:21 February
1953:13 February
1526:(2): 48–54.
843:John Barrow
839:John Leslie
781:Deep utopia
733:Best Seller
699:normativity
602:Best Seller
483:W. V. Quine
448:Helsingborg
423:(2014) and
374:philosopher
302:nickbostrom
64:Helsingborg
5530:Categories
5257:Transhuman
4893:Shane Legg
4868:Sam Harris
4843:Sam Altman
4782:EleutherAI
4513:Literature
4189:Liv Boeree
3899:12 January