In the previous article, we discussed the Sleeping Beauty problem, rejected anthropic reasoning, and explained why the "halfer" position is the correct one: it only "loses" if you accept Causal Decision Theory, and that's okay, since CDT agents lose all the time.
Well, upon some further thinking, it seems that EDT agents can also lose, but this has nothing to do with anything anthropic. Here are two problems, equivalent to each other, that "beat" Evidential Decision Theory:
- Vincent Conitzer (2017) (simplified version): Two fair coins are flipped. Our good friend and lab rat Sleeping Beauty is woken up on Monday and Tuesday (if HH or HT), on Monday and Wednesday (if TH), or on Tuesday and Wednesday (if TT). When woken up, she is told what day it is and, unless it is Wednesday (when she already knows the first coin landed Tails), offered the following bet on the first coin: "1pt for correctly guessing Heads, 3pt for correctly guessing Tails". Which way should she bet?
In terms of precommitment, committing to bet heads wins both the Monday and Tuesday bets exactly when the first coin lands Heads, an expected return of $\frac{1}{2} \cdot 2 = 1$pt, while committing to bet tails wins a single 3pt bet in each of the TH and TT worlds, an expected return of $\frac{1}{4} \cdot 3 + \frac{1}{4} \cdot 3 = 1.5$pt. So she should commit to betting tails.
But if she wakes up and is told it is Monday (or, symmetrically, Tuesday), the three worlds HH, HT and TH are equally likely. Betting heads now means, by the same reasoning, betting heads at her other awakening too, so she wins 2pt with probability 2/3, an expected return of 1.33pt; betting tails wins a single 3pt bet with probability 1/3, an expected return of 1pt. So she bets heads. (The sketch below checks both sets of numbers.)
(The problem can be formulated in terms of sending two different agents into rooms, so there is nothing anthropic about it: no memory loss, no splitting people in two.)
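To double-check the arithmetic, here is a small Monte Carlo sketch in Python. The function `simulate` and its encoding of the rules are my own, under the reading above: the guess concerns the first coin, and no bet is offered at the Wednesday awakening.

```python
import random

def simulate(n=100_000):
    """Compare precommitting to a guess with the per-awakening (EDT) calculation."""
    totals = {"heads": 0.0, "tails": 0.0}            # precommitment: return per experiment
    monday = {"heads": [0.0, 0], "tails": [0.0, 0]}  # [summed return, count] given "it's Monday"
    for _ in range(n):
        c1, c2 = random.choice("HT"), random.choice("HT")
        # Awakening days for each outcome of the two coins.
        if c1 == "H":
            days = ["Mon", "Tue"]
        elif c2 == "H":
            days = ["Mon", "Wed"]
        else:
            days = ["Tue", "Wed"]
        bets = [d for d in days if d != "Wed"]       # no bet at the Wednesday awakening
        for guess, coin, prize in (("heads", "H", 1), ("tails", "T", 3)):
            ret = prize * len(bets) if c1 == coin else 0   # win every bet you place, or none
            totals[guess] += ret
            if "Mon" in days:                        # condition on being told "it's Monday"
                monday[guess][0] += ret              # EDT: my guess now fixes the other awakening's guess
                monday[guess][1] += 1
    for g in ("heads", "tails"):
        print(f"precommit {g}: {totals[g] / n:.2f}pt   "
              f"told Monday, bet {g}: {monday[g][0] / monday[g][1]:.2f}pt")

simulate()
# Approximate output: precommit heads 1.00, tails 1.50; told Monday: heads 1.33, tails 1.00
```

The four printed numbers match the 1 / 1.5 / 1.33 / 1 figures above; the only difference between the two columns is whether we condition on being told "it's Monday".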
- Psy-Kosh's non-anthropic problem: You have 10 identical agents with shared finances. Flip a coin -- if Heads, send 9 agents to green rooms and 1 agent to a red room; if Tails, send 1 agent to a green room and 9 agents to red rooms. Then offer the agents in green rooms $(G-3R)$pt for the common pot, where $G$ and $R$ are the numbers of agents in green and red rooms -- and the offer is executed only if all the agents asked agree to accept it. Should they take the offer?
If the coin comes up heads, $G-3R = 9 - 3 = 6$; if it comes up tails, $G-3R = 1 - 27 = -26$.
In terms of precommitment, we know that the probability of heads is 50%, so the expected gain from accepting is $0.5 \cdot 6 + 0.5 \cdot (-26) = -10$pt, and the agents shouldn't take the offer.
But when an agent actually wakes up in a green room, it figures that this means a 90% chance of Heads, and the expected gain from accepting becomes $0.9 \cdot 6 + 0.1 \cdot (-26) = 2.8$pt (the sketch below spells out both calculations).
I.e. you end up with maybe 1, maybe 9 green agents who think -- perfectly rationally -- "what are the odds of there being just one green agent and it happening to be me?", assign 10% odds to that possibility, and therefore to Tails, even though half of the cases actually are Tails; it's just that 90% of the times you find yourself in a green room, the coin has come up Heads.
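For completeness, here is the same check for this problem as a minimal Python sketch; the arithmetic is exact, so no simulation is needed (the variable names are mine).

```python
from fractions import Fraction as F

# Group payoff if the offer is accepted: G - 3R points into the shared finances.
payoff = {"heads": 9 - 3 * 1,   # 9 green, 1 red  ->  +6
          "tails": 1 - 3 * 9}   # 1 green, 9 red  -> -26

# Precommitment view: the coin is fair, so just average the two payoffs.
precommit_ev = F(1, 2) * payoff["heads"] + F(1, 2) * payoff["tails"]

# View of an agent who finds itself in a green room:
# P(green | heads) = 9/10 and P(green | tails) = 1/10, so by Bayes (the equal priors cancel)
p_heads_given_green = F(9, 10) / (F(9, 10) + F(1, 10))
green_ev = (p_heads_given_green * payoff["heads"]
            + (1 - p_heads_given_green) * payoff["tails"])

print(f"precommitment:           {float(precommit_ev):+.2f}pt")   # -10.00
print(f"updated in a green room: {float(green_ev):+.2f}pt")       #  +2.80
```

The sign flip between the two printed lines is the whole problem: the update to 90% Heads is correct for each green-room agent individually, yet unanimously acting on it loses 10pt per coinflip in expectation.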
It seems that superrationality is not good enough.
(Related to Simpson's paradox?)