Trolley problem

The trolley problem is a series of thought experiments in ethics, psychology, and artificial intelligence involving stylized ethical dilemmas of whether to sacrifice one person to save a larger number. The series usually begins with a scenario in which a runaway tram or trolley is on course to collide with and kill a number of people (traditionally five) down the track, but a driver or bystander can intervene and divert the vehicle to kill just one person on a different track. Then other variations of the runaway vehicle, and analogous life-and-death dilemmas (medical, judicial, etc.), are posed, each containing the option to either do nothing, in which case several people will be killed, or intervene and sacrifice one initially "safe" person to save the others.

One of the dilemmas included in the trolley problem: is it preferable to pull the lever to divert the runaway trolley onto the side track?

Opinions on the ethics of each scenario turn out to be sensitive to details of the story that may seem immaterial to the abstract dilemma. The question of formulating a general principle that can account for the differing judgments arising in different variants of the story was raised in 1967 as part of an analysis of debates on abortion and the doctrine of double effect by the English philosopher Philippa Foot, in her paper "The Problem of Abortion and the Doctrine of the Double Effect". Later dubbed "the trolley problem" by Judith Jarvis Thomson in a 1976 article that catalyzed a large literature, the subject refers to the meta-problem of why different judgements are arrived at in particular instances.

Philosophers Judith Thomson, Frances Kamm, and Peter Unger have also analysed the dilemma extensively. Thomson's 1976 article initiated the literature on the trolley problem as a subject in its own right. Characteristic of this literature are colorful and increasingly absurd alternative scenarios in which the sacrificed person is instead pushed onto the tracks as a way to stop the trolley, has his organs harvested to save transplant patients, or is killed in more indirect ways that complicate the chain of causation and responsibility.

Beginning in 2001, the trolley problem and its variants have been used in empirical research on moral psychology. It has been a topic of popular books. Trolley-style scenarios also arise in discussing the ethics of autonomous vehicle design, which may require programming to choose whom or what to strike when a collision appears to be unavoidable.
Original dilemma

Earlier forms of individual trolley scenarios antedated Foot's publication. Frank Chapman Sharp included a version in a moral questionnaire given to undergraduates at the University of Wisconsin in 1905. In this variation, the railway's switchman controlled the switch, and the lone individual to be sacrificed (or not) was the switchman's child. German philosopher of law Karl Engisch discussed a similar dilemma in his habilitation thesis in 1930, as did German legal scholar Hans Welzel in a work from 1951. In his commentary on the Talmud, published long before his death in 1953, Avrohom Yeshaya Karelitz considered the question of whether it is ethical to deflect a projectile from a larger crowd toward a smaller one. Similarly, in The Strike, a television play broadcast in the United States on June 7, 1954, a commander in the Korean War must choose between ordering an air strike on an encroaching enemy force at the cost of his own 20-man patrol unit, or calling off the strike and risking the lives of the main army made up of 500 men.
Foot's version of the thought experiment, now known as "Trolley Driver", ran as follows:

Suppose that a judge or magistrate is faced with rioters demanding that a culprit be found for a certain crime and threatening otherwise to take their own bloody revenge on a particular section of the community. The real culprit being unknown, the judge sees himself as able to prevent the bloodshed only by framing some innocent person and having him executed. Beside this example is placed another in which a pilot whose airplane is about to crash is deciding whether to steer from a more to a less inhabited area. To make the parallel as close as possible, it may rather be supposed that he is the driver of a runaway tram, which he can only steer from one narrow track on to another; five men are working on one track and one man on the other; anyone on the track he enters is bound to be killed. In the case of the riots, the mob have five hostages, so that in both examples, the exchange is supposed to be one man's life for the lives of five.
A utilitarian view asserts that it is obligatory to steer to the track with one man on it. According to classical utilitarianism, such a decision would be not only permissible, but, morally speaking, the better option (the other option being no action at all). This fact makes diverting the trolley obligatory. An alternative viewpoint is that since moral wrongs are already in place in the situation, moving to another track constitutes a participation in the moral wrong, making one partially responsible for the death when otherwise no one would be responsible. An opponent of action may also point to the incommensurability of human lives. Under some interpretations of moral obligation, simply being present in this situation and being able to influence its outcome constitutes an obligation to participate. If this is the case, then doing nothing would be considered an immoral act.
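The utilitarian position described above reduces the dilemma to a comparison of outcomes. As a minimal illustrative sketch (the casualty counts are the scenario's own, but the code is not from any cited source, and the deontological objections above are precisely what such a tally leaves out), the calculus can be written explicitly:

```python
# Minimal sketch of the classical-utilitarian reading of the Switch case:
# each available action is scored only by how many people die, and the
# action with the fewest deaths is deemed the better (even obligatory) one.

deaths = {
    "do nothing": 5,          # the trolley kills the five on the main track
    "divert the trolley": 1,  # the trolley kills the one on the side track
}

def utilitarian_choice(deaths_by_action: dict[str, int]) -> str:
    """Return the action with the fewest deaths."""
    return min(deaths_by_action, key=deaths_by_action.get)

print(utilitarian_choice(deaths))  # -> divert the trolley
```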
Related problems

Trolley problems highlight the difference between deontological and consequentialist ethical systems. The central question that these dilemmas bring to light is whether or not it is right to actively inhibit the utility of an individual if doing so produces a greater utility for other individuals.

Five cases of the trolley problem: the original Switch, the Fat Man, the Fat Villain, the Loop, and the Man in the Yard

The basic Switch form of the trolley problem also supports comparison to other, related dilemmas:
The Fat Man

As before, a trolley is hurtling down a track towards five people. You are on a bridge under which it will pass, and you can stop it by putting something very heavy in front of it. As it happens, there is a fat man next to you – your only way to stop the trolley is to push him over the bridge and onto the track, killing him to save five. Should you proceed?

Resistance to this course of action seems strong; when asked, a majority of people will approve of pulling the switch to save a net of four lives, but will disapprove of pushing the fat man to save a net of four lives. This has led to attempts to find a relevant moral distinction between the two cases.
One possible distinction could be that in the first case, one does not intend harm towards anyone – harming the one is just a side effect of switching the trolley away from the five. However, in the second case, harming the one is an integral part of the plan to save the five. This solution is essentially an application of the doctrine of double effect, which says that one may take action that has bad side effects, but deliberately intending harm (even for good causes) is wrong. So, the action is permissible even if the harm to the innocent person is foreseen, so long as it is not intended. This is an argument which Shelly Kagan considers (and ultimately rejects) in his first book, The Limits of Morality.
Empirical research

In 2001, Joshua Greene and colleagues published the results of the first significant empirical investigation of people's responses to trolley problems. Using functional magnetic resonance imaging, they demonstrated that "personal" dilemmas (like pushing a man off a footbridge) preferentially engage brain regions associated with emotion, whereas "impersonal" dilemmas (like diverting the trolley by flipping a switch) preferentially engage regions associated with controlled reasoning. On these grounds, they advocate for the dual-process account of moral decision-making. Since then, numerous other studies have employed trolley problems to study moral judgment, investigating topics like the role and influence of stress, emotional state, impression management, levels of anonymity, different types of brain damage, physiological arousal, different neurotransmitters, and genetic factors on responses to trolley dilemmas.
In 2017, a group led by Michael Stevens performed the first realistic trolley-problem experiment, where subjects were placed alone in what they thought was a train-switching station, and shown footage that they thought was real (but was actually prerecorded) of a train going down a track, with five workers on the main track and one on the secondary track; the participants had the option to pull the lever to divert the train toward the secondary track. Five of the seven participants did not pull the lever.
Survey data

The trolley problem has been the subject of many surveys in which about 90% of respondents have chosen to kill the one and save the five. If the situation is modified so that the one sacrificed for the five is a relative or romantic partner, respondents are much less likely to be willing to sacrifice the one life. A 2009 survey by David Bourget and David Chalmers shows that 68% of professional philosophers would switch (sacrifice the one individual to save five lives) in the case of the trolley problem, 8% would not switch, and the remaining 24% had another view or could not answer.
Criticism

Trolley problems have been used as a measure of utilitarianism, but their usefulness for such purposes has been widely criticized.

In a 2014 paper published in the Social and Personality Psychology Compass, researchers criticized the use of the trolley problem, arguing, among other things, that the scenario it presents is too extreme and unconnected to real-life moral situations to be useful or educational.
In a 2018 article published in Psychological Review, researchers pointed out that, as measures of utilitarian decisions, sacrificial dilemmas such as the trolley problem measure only one facet of proto-utilitarian tendencies, namely permissive attitudes toward instrumental harm, while ignoring impartial concern for the greater good. As such, the authors argued that the trolley problem provides only a partial measure of utilitarianism.

In her 2017 paper, Nassim JafariNaimi lays out the reductive nature of the trolley problem in framing ethical problems, which serves to uphold an impoverished version of utilitarianism. She argues that the popular argument that the trolley problem can serve as a template for algorithmic morality is based on fundamentally flawed premises that serve the most powerful, with potentially dire consequences for the future of cities.
In 2017, in his book On Human Nature, Roger Scruton criticises the usage of ethical dilemmas such as the trolley problem by philosophers such as Derek Parfit and Peter Singer as ways of illustrating their ethical views. Scruton writes, "These 'dilemmas' have the useful character of eliminating from the situation just about every morally relevant relationship and reducing the problem to one of arithmetic alone." Scruton believes that just because one would choose to change the track so that the train hits the one person instead of the five does not mean that they are necessarily a consequentialist. As a way of showing the flaws in consequentialist responses to ethical problems, Scruton points out paradoxical elements of belief in utilitarianism and similar beliefs. He believes that Nozick's experience machine thought experiment definitively disproves hedonism.
Masahiro Morioka considers the dropping of atomic bombs as an example of the trolley problem and, in his 2017 article "The Trolley Problem and the Dropping of Atomic Bombs", points out that there are five "problems of the trolley problem", namely, 1) rarity, 2) inevitability, 3) safety zone, 4) possibility of becoming a victim, and 5) the lack of perspective of the dead victims who were deprived of freedom of choice.
Implications for autonomous vehicles

Variants of the original Trolley Driver dilemma arise in the design of software to control autonomous cars. Situations are anticipated where a potentially fatal collision appears to be unavoidable, but in which choices made by the car's software, such as into whom or what to crash, can affect the particulars of the deadly outcome. For example, should the software value the safety of the car's occupants more, or less, than that of potential victims outside the car?
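How such a valuation might enter a control decision can be made concrete with a hypothetical sketch. Nothing below is taken from a real vehicle's code; the maneuvers, the harm estimates, and the occupant_weight parameter are all invented for illustration:

```python
# Hypothetical sketch: a single occupant_weight parameter biasing the choice
# among unavoidable-collision maneuvers. All numbers are invented inputs.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_harm: float   # expected casualties inside the car
    bystander_harm: float  # expected casualties outside the car

def choose_maneuver(options: list[Maneuver], occupant_weight: float) -> Maneuver:
    """Pick the maneuver minimizing weighted expected harm.

    occupant_weight > 1 favors the occupants; < 1 favors bystanders.
    """
    return min(options,
               key=lambda m: occupant_weight * m.occupant_harm + m.bystander_harm)

options = [
    Maneuver("stay in lane", occupant_harm=0.0, bystander_harm=0.9),
    Maneuver("swerve into barrier", occupant_harm=0.6, bystander_harm=0.0),
]

print(choose_maneuver(options, occupant_weight=1.0).name)  # swerve into barrier
print(choose_maneuver(options, occupant_weight=2.0).name)  # stay in lane
```

The point of the sketch is only that any such scalar weighting is itself an ethical commitment, which is exactly what the commission rules quoted below say cannot be standardized in an ethically unquestionable way.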
A platform called Moral Machine was created by MIT Media Lab to allow the public to express their opinions on what decisions autonomous vehicles should make in scenarios that use the trolley problem paradigm. Analysis of the data collected through Moral Machine showed broad differences in relative preferences among different countries. Other approaches make use of virtual reality to assess human behavior in experimental settings. However, some argue that the investigation of trolley-type cases is not necessary to address the ethical problem of driverless cars, because the trolley cases have a serious practical limitation: any answer to them would need to be a top-down plan in order to fit the current approaches of addressing emergencies in artificial intelligence.
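The cross-country comparison amounts to aggregating individual dilemma responses by country. A rough sketch of that computation follows; the record fields and sample data are invented here and are not the Moral Machine project's actual schema:

```python
# Rough sketch of per-country aggregation of dilemma responses.
# Field names and records are invented, not Moral Machine's real schema.

from collections import defaultdict

responses = [
    {"country": "DE", "spared_larger_group": True},
    {"country": "DE", "spared_larger_group": False},
    {"country": "JP", "spared_larger_group": True},
    {"country": "JP", "spared_larger_group": True},
]

def preference_by_country(records: list[dict]) -> dict[str, float]:
    """Fraction of responses per country that spared the larger group."""
    tallies = defaultdict(lambda: [0, 0])  # country -> [spared, total]
    for r in records:
        tallies[r["country"]][0] += int(r["spared_larger_group"])
        tallies[r["country"]][1] += 1
    return {c: spared / total for c, (spared, total) in tallies.items()}

print(preference_by_country(responses))  # {'DE': 0.5, 'JP': 1.0}
```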
Also, a question remains of whether the law should dictate the ethical standards that all autonomous vehicles must use, or whether individual autonomous car owners or drivers should determine their car's ethical values, such as favoring safety of the owner or the owner's family over the safety of others. Although most people would not be willing to use an automated car that might sacrifice themselves in a life-or-death dilemma, some believe the somewhat counterintuitive claim that using mandatory ethics values would nevertheless be in their best interest. According to Gogoll and MĂĽller, "the reason is, simply put, that [a personal ethics setting] would most likely result in a prisoner's dilemma."
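Gogoll and MĂĽller's worry has the structure of a classic two-player game. The payoff matrix below is an illustration of that structure with invented numbers, not data from their paper: a "selfish" setting is each owner's best response to anything the other does, yet both end up worse off than under a mandatory "utilitarian" setting.

```python
# Illustrative payoff matrix for two car owners choosing an ethics setting.
# Higher numbers mean better expected safety for that owner; the values are
# invented and only reproduce the prisoner's-dilemma structure described above.

PAYOFFS = {
    # (my setting, other's setting): (my payoff, other's payoff)
    ("utilitarian", "utilitarian"): (3, 3),
    ("utilitarian", "selfish"):     (0, 4),
    ("selfish",     "utilitarian"): (4, 0),
    ("selfish",     "selfish"):     (1, 1),
}

def best_response(other_setting: str) -> str:
    """My payoff-maximizing setting, given the other owner's setting."""
    return max(("utilitarian", "selfish"),
               key=lambda mine: PAYOFFS[(mine, other_setting)][0])

assert best_response("utilitarian") == "selfish"  # selfish dominates...
assert best_response("selfish") == "selfish"
# ...yet (selfish, selfish) pays (1, 1), worse for both than the (3, 3)
# that a mandatory utilitarian setting would secure.
```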
In 2016, the German government appointed a commission to study the ethical implications of autonomous driving. The commission adopted 20 rules to be implemented in the laws that will govern the ethical choices that autonomous vehicles will make. Relevant to the trolley dilemma is this rule:

8. Genuine dilemmatic decisions, such as a decision between one human life and another, depend on the actual specific situation, incorporating "unpredictable" behaviour by parties affected. They can thus not be clearly standardized, nor can they be programmed such that they are ethically unquestionable. Technological systems must be designed to avoid accidents. However, they cannot be standardized to a complex or intuitive assessment of the impacts of an accident in such a way that they can replace or anticipate the decision of a responsible driver with the moral capacity to make correct judgements. It is true that a human driver would be acting unlawfully if he killed a person in an emergency to save the lives of one or more other persons, but he would not necessarily be acting culpably. Such legal judgements, made in retrospect and taking special circumstances into account, cannot readily be transformed into abstract/general ex ante appraisals and thus also not into corresponding programming activities. …
See also

Dual process theory (moral psychology)
Lesser of two evils principle
Lifeboat ethics
Omission bias
R. v. Dudley and Stephens
The Case of the Speluncean Explorers § Similar real cases
Tunnel problem
Violinist (thought experiment)
Virtue ethics
