Dilemma
A dilemma (Greek δί-λημμα, "double proposition") is a problem offering two solutions or possibilities, neither of which is acceptable. The two options are often described as the horns of a dilemma, neither of which is comfortable. Among the best-known dilemmas are Plato's "Euthyphro dilemma" and the "prisoner's dilemma." A problem offering three solutions or possibilities is called a trilemma.
The dilemma is sometimes used as a rhetorical device, in the form "you must accept either A or B," where A and B are propositions each leading to some further conclusion. Applied in this way, it may be a fallacy or a false dichotomy.
Logic
In formal logic, the definition of a dilemma differs markedly from everyday usage. Two options are still present, but choosing between them is immaterial because they both imply the same conclusion. Symbolically expressed thus:
(A ∨ B), (A → C), (B → C) ⊢ C
This can be translated informally as "one (or both) of A or B is known to be true, but they both imply C, so regardless of the truth values of A and B we can conclude C."
Horned dilemmas can present more than two choices, and the number of horns gives such dilemmas their alternative names: a two-pronged (two-horned) dilemma is the dilemma proper, a three-pronged (three-horned) dilemma is a trilemma, and so on.
Constructive dilemmas:
- 1. (If X, then Y) and (If W, then Z).
- 2. X or W.
- 3. Therefore, Y or Z.
Destructive dilemmas:
- 1. (If X, then Y) and (If W, then Z).
- 2. Not Y or not Z.
- 3. Therefore, not X or not W.
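Because both forms are truth-functional, their validity can be checked mechanically. The following sketch (in Python, added here as an illustration and not part of the original text) enumerates every assignment of truth values to X, Y, W, and Z and confirms that the premises of each form always entail its conclusion:

```python
from itertools import product

def implies(p, q):
    # Material implication: "if p, then q" is false only when p is true and q is false.
    return (not p) or q

# Constructive dilemma: ((X -> Y) and (W -> Z)) and (X or W) entails (Y or Z).
constructive_valid = all(
    implies((implies(x, y) and implies(w, z)) and (x or w), y or z)
    for x, y, w, z in product([True, False], repeat=4)
)

# Destructive dilemma: ((X -> Y) and (W -> Z)) and (not Y or not Z) entails (not X or not W).
destructive_valid = all(
    implies((implies(x, y) and implies(w, z)) and ((not y) or (not z)),
            (not x) or (not w))
    for x, y, w, z in product([True, False], repeat=4)
)

print(constructive_valid, destructive_valid)  # True True
```

The same enumeration over A, B, and C verifies the simple dilemma given above: whenever A or B holds and both imply C, C follows.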
Euthyphro dilemma
The Euthyphro dilemma is found in Plato's dialogue Euthyphro, in which Socrates asks Euthyphro: "Is the pious (τὸ ὅσιον) loved by the gods because it is pious, or is it pious because it is loved by the gods" (10a).
In monotheistic terms, this is usually transformed into: "Is what is moral commanded by God because it is moral, or is it moral because it is commanded by God?" The dilemma has continued to present a problem for theists since Plato presented it, and is still the object of theological and philosophical debate.
Prisoner's dilemma
In game theory, the prisoner's dilemma (sometimes abbreviated PD) is a type of non-zero-sum game in which two players may each "cooperate" with or "defect" (that is, betray) the other player. In this game, as in all game theory, the only concern of each individual player ("prisoner") is maximizing his/her own payoff, without any concern for the other player's payoff. The unique equilibrium for this game is a Pareto-suboptimal solution—that is, rational choice leads the two players to both play defect even though each player's individual reward would be greater if they both played cooperate. In equilibrium, each prisoner chooses to defect even though both would be better off by cooperating, hence the dilemma.
In the classic form of this game, cooperating is strictly dominated by defecting, so that the only possible equilibrium for the game is for all players to defect. In simpler terms, no matter what the other player does, one player will always gain a greater payoff by playing defect. Since in any situation, playing defect is more beneficial than cooperating, all rational players will play defect, all things being equal.
In the iterated prisoner's dilemma, the game is played repeatedly. Thus, each player has an opportunity to "punish" the other player for previous non-cooperative play. Cooperation may then arise as an equilibrium outcome. The incentive to defect is overcome by the threat of punishment, leading to the possibility of a cooperative outcome. So, if the game is infinitely repeated, cooperation may be a subgame perfect Nash equilibrium, although both players defecting always remains an equilibrium and there are many other equilibrium outcomes.
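The effect of repetition can be made concrete with a small simulation. The sketch below is illustrative only; the strategy names and payoff numbers (expressed as prison sentences, where lower is better) are assumptions matching the classic story told in the next section. It pits a "tit for tat" player, who cooperates first and then mirrors the opponent's previous move, against an unconditional defector:

```python
# Illustrative iterated prisoner's dilemma. Payoffs are prison sentences in
# years (lower is better); keys are (my move, other player's move).
PAYOFF = {
    ("cooperate", "cooperate"): 0.5,
    ("cooperate", "defect"): 10,
    ("defect", "cooperate"): 0,
    ("defect", "defect"): 5,
}

def tit_for_tat(my_history, other_history):
    # Cooperate on the first round, then copy the opponent's previous move.
    return other_history[-1] if other_history else "cooperate"

def always_defect(my_history, other_history):
    return "defect"

def play(strategy_a, strategy_b, rounds=10):
    history_a, history_b = [], []
    total_a = total_b = 0.0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        total_a += PAYOFF[(move_a, move_b)]
        total_b += PAYOFF[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return total_a, total_b

print(play(tit_for_tat, tit_for_tat))    # (5.0, 5.0): sustained cooperation
print(play(always_defect, tit_for_tat))  # (45.0, 55.0): one free ride, then mutual punishment
```

Against another tit-for-tat player, cooperation persists and both serve only the minor sentences; a defector gains once and is then punished in every later round, which is the threat that can sustain cooperation when the game is repeated indefinitely.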
The classical prisoner's dilemma
The Prisoner's Dilemma was originally framed by Merrill Flood and Melvin Dresher working at RAND in 1950. Albert W. Tucker formalized the game with prison sentence payoffs and gave it the "Prisoner's Dilemma" name (Poundstone, 1992).
The classical prisoner's dilemma (PD) is as follows:
- Two suspects, A and B, are arrested by the police. The police have insufficient evidence for a conviction, and, having separated both prisoners, visit each of them to offer the same deal: If one testifies for the prosecution against the other and the other remains silent, the betrayer goes free and the silent accomplice receives the full 10-year sentence. If both stay silent, both prisoners are sentenced to only six months in jail for a minor charge. If each betrays the other, each receives a five-year sentence. Each prisoner must make the choice of whether to betray the other or to remain silent. However, neither prisoner knows for sure what choice the other prisoner will make. So this dilemma poses the question: How should the prisoners act?
The dilemma can be summarized thus:
| | Prisoner B Stays Silent | Prisoner B Betrays |
|---|---|---|
| Prisoner A Stays Silent | Each serves six months | Prisoner A serves ten years; Prisoner B goes free |
| Prisoner A Betrays | Prisoner A goes free; Prisoner B serves ten years | Each serves five years |
The dilemma arises when one assumes that both prisoners only care about minimizing their own jail terms. Each prisoner has two and only two options: Either to cooperate with his accomplice and stay quiet, or to defect from their implied pact and betray his accomplice in return for a lighter sentence. The outcome of each choice depends on the choice of the accomplice, but each prisoner must choose without knowing what his accomplice has chosen.
In deciding what to do in strategic situations, it is normally important to predict what others will do. This is not the case here. If one prisoner knows the other prisoner would stay silent, the first's best move is to betray, as he then walks free instead of receiving the minor sentence. If one knew the other prisoner would betray, the best move is still to betray, as one would receive a lesser sentence than by silence. Betraying is a dominant strategy. The other prisoner reasons similarly, and therefore also chooses to betray. Yet, by both defecting they get a lower payoff than they would get by staying silent. So rational, self-interested play results in each prisoner being worse off than if they had stayed silent. In more technical language, this demonstrates very elegantly that in a non-zero sum game a Nash Equilibrium need not be a Pareto optimum.
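The dominance argument can be checked directly against the payoff table. The following sketch is an illustration added here (not part of the original article); it computes each prisoner's best reply to every possible move by the other and confirms that mutual betrayal is the only Nash equilibrium, even though mutual silence leaves both better off:

```python
# Sentences in years, keyed by (A's move, B's move); each prisoner wants the
# smaller number for himself.
MOVES = ("silent", "betray")
SENTENCES = {
    ("silent", "silent"): (0.5, 0.5),
    ("silent", "betray"): (10, 0),
    ("betray", "silent"): (0, 10),
    ("betray", "betray"): (5, 5),
}

def best_reply_a(b_move):
    return min(MOVES, key=lambda a: SENTENCES[(a, b_move)][0])

def best_reply_b(a_move):
    return min(MOVES, key=lambda b: SENTENCES[(a_move, b)][1])

# "Betray" is A's best reply no matter what B does (B reasons symmetrically),
# so betraying is a dominant strategy.
print([best_reply_a(b) for b in MOVES])  # ['betray', 'betray']

# A Nash equilibrium is a profile in which each move is a best reply to the other.
equilibria = [(a, b) for a in MOVES for b in MOVES
              if a == best_reply_a(b) and b == best_reply_b(a)]
print(equilibria)  # [('betray', 'betray')] -- not the mutually better (0.5, 0.5) outcome
```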
Note that the paradox of the situation lies in the fact that the prisoners are not defecting in the hope that the other will not. Even when both know the other to be rational and selfish, they will both play defect. Defect is what they will play no matter what, even though they know full well that the other player is playing defect as well and that both would be better off with a different result.
The "Stay Silent" and "Betray" strategies are also known as "don't confess" and "confess," or the more standard "cooperate" and "defect."
One experiment based on the simple dilemma found that approximately 40 percent of participants cooperated (that is, stayed silent).[1]
Hedgehog's dilemma
The phrase hedgehog's dilemma refers to the notion that the closer two beings come to each other, the more likely they are to hurt one another; however if they remain apart, they will each feel the pain of loneliness. This comes from the idea that hedgehogs, with sharp spines on their backs, will hurt each other if they get too close. This is analogous to a relationship between two human beings. If two people come to care about and trust each other, something bad that happens to one of them will hurt the other as well, and dishonesty between the two could cause even greater problems.
The concept originates from Arthur Schopenhauer's Parerga und Paralipomena, Volume II, Chapter XXXI, Section 396. In his English translation, E.F.J. Payne translates the German "Stachelschweine" as "porcupines." Schopenhauer's parable describes a number of hedgehogs who need to huddle together for warmth and who struggle to find the distance where they are warm without hurting one another. The hedgehogs have to sacrifice warmth for comfort. The conclusion that Schopenhauer draws is that if someone has enough internal warmth, he or she can avoid society and the giving and receiving of irritation that results from social interaction.
It is also important to note that hedgehogs do not actually hurt each other when they get close; human beings tend to keep themselves more "on guard" in relationships and are more likely to sting one another in the way that a relaxed hedgehog would if spooked. When living in groups, hedgehogs often sleep close to each other.
Platonia dilemma
In the platonia dilemma introduced in Douglas Hofstadter's book Metamagical Themas, an eccentric trillionaire gathers 20 people together, and tells them that if one and only one of them sends him a telegram (reverse charges) by noon the next day, that person will receive a billion dollars. If he receives more than one telegram, or none at all, no one will get any money, and cooperation between players is forbidden. In this situation, the superrational thing to do is to send a telegram with probability 1/20.
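If every one of the 20 participants independently sends a telegram with probability 1/20, the chance that exactly one telegram arrives is 20 × (1/20) × (19/20)^19, roughly 0.38. A short calculation (an illustration, not part of Hofstadter's text) confirms this and shows that, if all players must use the same probability, 1/20 is the value that makes a single telegram most likely:

```python
def p_exactly_one(n, p):
    # Probability that exactly one of n independent players, each sending a
    # telegram with probability p, actually sends one.
    return n * p * (1 - p) ** (n - 1)

n = 20
print(p_exactly_one(n, 1 / n))  # about 0.377

# Sweep p over a fine grid: the maximum sits at p = 1/20.
best_p = max((k / 1000 for k in range(1, 1000)), key=lambda p: p_exactly_one(n, p))
print(best_p)  # 0.05
```

Since superrational players all reason identically and therefore choose the same probability, each settles on 1/20, the symmetric choice that maximizes the chance of a single winner.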
A similar game, referred to as a "Luring Lottery," was actually played by the editors of Scientific American in the 1980s. To enter the contest once, readers had to send in a postcard with the number "1" written on it. They were also explicitly permitted to submit as many entries as they wished by sending in a single postcard bearing the number of entries they wished to submit. The prize was one million dollars divided by the total number of entries received, to be awarded to the submitter of a randomly chosen entry. Thus, a reader who submitted a large number of entries increased his or her chances of winning but reduced the maximum possible value of the prize.
According to the magazine, the rational thing was for each contestant to roll a simulated die with the number of sides equal to the number of expected responders (about 5 percent of the readership), and then send "1" if the player rolls "1." If all contestants had followed this strategy, it is likely that the magazine would have received a single postcard, with a "1," and would have had to pay a million dollars to the sender of that postcard. Reputedly the publisher and owners were very concerned about betting the company on a game.
Although the magazine had previously discussed the concept of superrationality from which the above-mentioned algorithm can be deduced, many of the contestants submitted entries consisting of an astronomically large number (including several who entered a googolplex). Some took this game further by filling their postcards with mathematical expressions designed to evaluate to the largest possible number in the limited space allowed. The magazine was unable to tell who won, and the monetary value of the prize would have been a minuscule fraction of a cent.
Security dilemma
In international relations, the security dilemma refers to a situation in which two or more states are drawn into conflict, possibly even war, over security concerns, even though none of the states actually desires conflict. Any attempt a state makes to increase its own security provokes responses from other states that can, in the end, decrease its security.
A frequently cited example of the security dilemma is the beginning of World War I. Supporters of this viewpoint argue that the major European powers felt forced to go to war by feelings of insecurity over the alliances of their neighbors, despite not actually desiring the war. Furthermore, the time necessary to mobilize large amounts of troops for defense led some Great Powers (such as Russia) to adopt a particularly accelerated mobilization timetable, which in turn put pressure on other states to mobilize early as well. However, other scholars dispute this interpretation of the origins of the war, contending that some of the states involved really did want the conflict.
The security dilemma is a popular concept with cognitive and international relations theorists, who regard war as essentially arising from failures of communication. Functionalist theorists affirm that the key to avoiding war is the avoidance of miscommunication through proper signaling.
The notion of the security dilemma is attributed to John H. Herz, who introduced it in the second issue of the second volume of World Politics. The notion is often used in realist theories of international relations, which suggest that war is a regular and often inherent condition of life.
Stagflation
Stagflation, a portmanteau of the words stagnation and inflation, is a term in general use within modern macroeconomics to describe a period of out-of-control price inflation combined with slow-to-no output growth, rising unemployment, and eventually recession. The term stagflation is generally attributed to United Kingdom Chancellor of the Exchequer Iain Macleod, in a speech to Parliament in 1965.[2] "Stag" is drawn from the first syllable of "stagnation," a reference to a sluggish economy, while "flation" is drawn from the second and third syllables of "inflation," a reference to an upward spiral in consumer prices. Economists associate the two factors because, as output falls, unit costs increase when fixed costs are spread over a smaller output.
Stagflation is a problem because the two principal tools for directing the economy, fiscal policy and monetary policy, offer only trade-offs between growth and inflation. A central bank can either slow growth to reduce inflationary pressures, or it can allow general increases in price to occur in order to stimulate growth. Stagflation creates a dilemma in that efforts to correct stagnation only worsen inflation, and vice versa. The dilemma in monetary policy is instructive. The central bank can make one of two choices, each with negative outcomes. First, the bank can choose to stimulate the economy and create jobs by increasing the money supply (by purchasing government debt), but this risks boosting the pace of inflation. The other choice is to pursue a tight monetary policy (reducing government debt purchases in order to raise interest rates) to reduce inflation, at the risk of higher unemployment and slower output growth.
The problem for fiscal policy is far less clear. Both revenues and expenditures tend to rise with inflation, all else equal, while they fall as growth slows. Unless there is a differential impact on either revenues or spending due to stagflation, the impact of stagflation on the budget balance is not altogether clear. As a policy matter, there is one school of thought that the best policy mix is one in which government stimulates growth through increased spending or reduced taxes while the central bank fights inflation through higher interest rates. In reality, coordinating fiscal and monetary policy is not an easy task.
Responses to a dilemma
In Zen and the Art of Motorcycle Maintenance, Robert Pirsig outlines possible responses to a dilemma. The classical responses are to either choose one of the two horns and refute the other or alternatively to refute both horns by showing that there are additional choices. Pirsig then mentions three illogical or rhetorical responses. One can "throw sand in the bull's eyes" by, for example, questioning the competence of the questioner. One can "sing the bull to sleep" by, for example, stating that the answer to the question is beyond one's own humble powers and asking the questioner for help. Finally one can "refuse to enter the arena" by, for example, stating that the question is unanswerable.
Trilemma
A trilemma is a difficult choice from three alternatives, each of which is (or appears) unacceptable or unfavorable.
There are two logically equivalent ways in which to express a trilemma: It can be expressed as a choice among three unfavorable options, one of which must be chosen, or as a choice among three favorable options, only two of which are possible at the same time.
The term derives from the much older term dilemma, a choice between two difficult or unfavorable options.
Trilemmas in religion
Epicurus's trilemma
One of the earliest uses of the trilemma formulation is that of the Greek philosopher Epicurus, rejecting the idea of an omnipotent and omnibenevolent God (as summarized by David Hume):[3]
- 1. If God is willing but unable to prevent evil, he is not omnipotent.
- 2. If God is able but not willing to prevent evil, he is not good.
- 3. If God is willing and able to prevent evil, then why is there evil?
Although traditionally ascribed to Epicurus, it has been suggested that it may actually be the work of an early skeptic writer, possibly Carneades.[4]
Lewis's trilemma
One of the best known trilemmas is one popularised by C. S. Lewis. It proceeds from the assumption that Jesus claimed, either implicitly or explicitly, to be God. Therefore one of the following must be true:[5]
- Lunatic: Jesus was not God, but he mistakenly believed that he was.
- Liar: Jesus was not God, and he knew it, but he said so anyway.
- Lord: Jesus is God.
Trilemmas in economics
In economics, the trilemma (or "impossible trinity") is a term used in discussing the problems associated with creating a stable international financial system. It refers to the trade-offs among the following three goals: A fixed exchange rate, national independence in monetary policy, and capital mobility. According to the Mundell-Fleming model, a small, open economy cannot achieve all three of these policy goals at the same time: in pursuing any two of these goals, a nation must forgo the third.[6]
Steven Pinker noted another social trilemma in his book, The Blank Slate: a society cannot be simultaneously fair, free, and equal. If it is fair, individuals who work harder will accumulate more wealth; if it is free, parents will leave the bulk of their inheritance to their children; but then it will not be equal, as people will begin life with different fortunes.
Arthur C. Clarke cited a management trilemma among a product being done quickly, cheaply, and of high quality. In the software industry, this means that one can pick any two of: Fastest time to market, highest software quality (fewest defects), and lowest cost (headcount). This is the basis of the popular project-management aphorism, "Quick, Cheap, Good: Pick two."
The Münchhausen trilemma
In the theory of knowledge, the Münchhausen trilemma is a philosophical term coined to stress the impossibility of proving any truth with certainty, even in the fields of logic and mathematics. The name goes back to a logical proof by the German philosopher Hans Albert, which runs as follows: each of the only three possible attempts to obtain a certain justification must fail:
- All justifications in pursuit of certain knowledge must also justify the means of their justification, and in doing so they must justify anew the means of that justification. There can therefore be no end; one is faced with the hopeless situation of an "infinite regression."
- One can stop at self-evidence, common sense, fundamental principles, speaking "ex cathedra," or any other evidence, but in doing so the intention of establishing a certain justification is abandoned.
- The third horn of the trilemma is the application of a circular and therefore invalid argument.
The Trilemma of the Earth
The "Trilemma of the Earth" (or "3E trilemma") is a term used by scientists working on energy and environmental protection. The 3E trilemma stands for the Economy-Energy-Environment interaction.
For economic development (E: Economy) to be activated, energy expenditure (E: Energy) must increase; however, this raises the environmental issue (E: Environment) of greater emissions of pollutant gases.[7]
See also
- Trilemma
Notes
- ↑ Amos Tversky, Preference, Belief, and Similarity: Selected Writings (MIT Press, 2004, ISBN 026270093X).
- ↑ Douglas Harper, Stagflation, Online Etymology Dictionary. Retrieved May 05, 2007.
- ↑ David Hume, Dialogues Concerning Natural Religion, 1779.
- ↑ Mark Joseph Larrimore, The Problem of Evil: A Reader (Blackwell, 2001).
- ↑ C.S. Lewis, Mere Christianity (London: Collins, 1952).
- ↑ Maurice Obstfeld, Jay C. Shambaugh, and Alan M. Taylor, The Trilemma in History: Tradeoffs Among Exchange Rates, Monetary Policies, and Capital Mobility, in The Review of Economics and Statistics, 87(3): 423-438. Retrieved April 13, 2007.
- ↑ Yoshihiro Hamakawa, "New Energy Option for 21st Century: Recent Progress in Solar Photovoltaic Energy Conversion," in Japan Society of Applied Physics International, Vol. 5, 30-35.
References
- Adams, Robert Merrihew. Finite and Infinite Goods: A Framework for Ethics. New York: Oxford University Press, 2002. ISBN 0195153715
- Aertsen, Jan. Medieval philosophy and the transcendentals: the case of Thomas Aquinas. New York: Brill, 2004. ISBN 9004105859
- Campbell, Richmond, and Lanning Sowden. Paradoxes of Rationality and Cooperation Prisoner's Dilemma and Newcomb's Problem. Vancouver: University of British Columbia Press, 1985. ISBN 0774802154
- Cohen, Morris Raphael, Ernest Nagel, and John Corcoran. An Introduction to Logic. Indianapolis: Hackett Pub. Co, 1993. ISBN 0872201457
- Helm, Paul (ed.). Divine Commands and Morality Oxford, Oxford University Press, 1981. ISBN 0198750498
- Kretzmann, Norman. "Abraham, Isaac, and Euthyphro: God and the basis of morality." In Eleonore Stump & Michael J. Murray (eds.). Philosophy of Religion: The Big Questions. Oxford: Blackwell, 1999. ISBN 0631206043
- Olin, Doris. Paradox. Montreal: McGill-Queen's University Press, 2003. ISBN 0773526781
- Plato. Euthyphro. ISBN 0140440372
- Poundstone, William. Prisoner's Dilemma. New York: Doubleday, 1992. ISBN 0385415672
- Ryle, Gilbert. Dilemmas. Cambridge: University Press, 1954.
- Tversky, Amos. Preference, Belief, and Similarity: Selected Writings. MIT Press, 2004. ISBN 026270093X
- Waller, Bruce N. Critical Thinking Consider the Verdict. Englewood Cliffs, N.J.: Prentice Hall, 1994. ISBN 0131776355
- Walton, Douglas N. Fundamentals of Critical Argumentation. Cambridge: Cambridge University Press, 2006. ISBN 0521823196
External links
All links retrieved January 29, 2024.
- Steven Kuhn. Prisoner's Dilemma, Stanford Encyclopedia of Philosophy.
- List of entries on Dilemma, Stanford Encyclopedia of Philosophy.
- John M. Frame. Euthyphro, Hume, and the Biblical God.
- Euthyphro by Plato from Project Gutenberg.
General philosophy sources
- Stanford Encyclopedia of Philosophy.
- The Internet Encyclopedia of Philosophy.
- Paideia Project Online.
- Project Gutenberg.
Credits
New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license that can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation. To cite this article click here for a list of acceptable citing formats. The history of earlier contributions by wikipedians is accessible to researchers here:
- Dilemma history
- Euthyphro_dilemma history
- Hedgehog's_dilemma history
- Platonia_dilemma history
- Prisoner's_dilemma history
- Security_dilemma history
- Stagflation history
- Trilemma history
The history of this article since it was imported to New World Encyclopedia:
Note: Some restrictions may apply to use of individual images which are separately licensed.