Newcomb's Paradox Thread

  1. Nigger Nintendo Starving African Child
    Originally posted by Meikai It explicitly matters that the predictions are reliable. Whatever you choose is most likely to have been predicted. Nothing inside the boxes will change. Nothing inside the boxes needs to change. You just need to accept that Omega is reliable, and choose accordingly. Picking B won't magically make money appear in B if there was none, but if you're picking B it's very likely to already contain a million dollars.

    No, that's wrong. It contains whatever it contains. You're not going to change the outcome by taking both boxes. This can't be any clearer. If Omega believed you were going to pick B, then B is full; if not, it isn't. But that doesn't matter now.

    Originally posted by Meikai You don't have to like it, but them's the breaks. Omega works in mysterious ways. Omega's predictions are uncanny. The thought experiment is basically a midwit trap that boils down to "would you take a million dollars or a thousand dollars" in practice, and you will perpetually screech about causality to justify choosing the wrong answer. Doesn't matter if it makes formal sense; in practice the formally correct solution is a suboptimal strategy when approached by Omega.

    It's got nothing to do with liking it or not, you've just rooted a false sense of superiority in refusing the premises of the scenario.

    "Everyone who has played the formally correct solution has had a suboptimal outcome" is cooked into the thought experiment, which is why you were trying to invent holes like "everyone who has played the formally correct solution could actually mean nobody, bro".

    It's a hole in your setup, not my fault. You've already precommitted to the idea that you're making the optimal decision but you just cannot acknowledge the fact that whatever is in the boxes is already whatever is in them.
  2. Meikai Heck This Schlong
    Originally posted by Lanny > calls people pseuds
    > spends pages and pages of this thread trying to demonstrate how smart you are by convincing simpletons of the wrong solution with sophistry


    Hey, hey. I'm fully open to the possibility that I'm demonstrating how retarded I am, but I'm undeniably also demonstrating what is correct. By choosing the winning answer. If the answer you choose (that answer being the one with greater utility only in cases where Omega proves itself to be unreliable) loses to the "wrong" answer 100 times out of 101, maybe the "wrong" answer isn't wrong.
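    A quick back-of-the-envelope check of that "100 times out of 101" figure, as a sketch: the dollar amounts are the standard $1,000,000 / $1,000 from the setup, and the only assumption is that Omega's hit rate applies whichever answer you pick.

    # Expected payoff for each strategy, assuming Omega predicts correctly
    # with probability p = 100/101 (Meikai's figure) and the standard amounts.
    p = 100 / 101

    one_box_ev = p * 1_000_000                                 # B was filled iff Omega predicted one-boxing
    two_box_ev = p * 1_000 + (1 - p) * (1_000_000 + 1_000)     # B is full only when Omega slipped up

    print(f"one-boxing expects ${one_box_ev:,.0f}")   # ~ $990,099
    print(f"two-boxing expects ${two_box_ev:,.0f}")   # ~ $10,901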
  3. Nigger Nintendo Starving African Child
    From Wikipedia, Newcomb's Paradox without HTS fucking it up with his transsexual retard brain

    There is a reliable predictor, another player, and two boxes designated A and B. The player is given a choice between taking only box B, or taking both boxes A and B. The player knows the following:[4]

    Box A is clear and always contains a visible $1,000.
    Box B is opaque, and its content has already been set by the predictor:
    If the predictor has predicted the player will take both boxes A and B, then box B contains nothing.
    If the predictor has predicted that the player will take only box B, then box B contains $1,000,000.
    The player does not know what the predictor predicted or what box B contains while making the choice.
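    A minimal sketch of that payoff structure in code; the dollar amounts come straight from the formulation above, and the function just enumerates the four possible outcomes.

    # Payoff given Omega's prediction and the player's actual choice.
    # "one" = take only box B, "two" = take both boxes A and B.
    def payoff(prediction: str, choice: str) -> int:
        box_a = 1_000
        box_b = 1_000_000 if prediction == "one" else 0   # fixed before the player chooses
        return box_b if choice == "one" else box_a + box_b

    for prediction in ("one", "two"):
        for choice in ("one", "two"):
            print(f"predicted {prediction}-box, chose {choice}-box: ${payoff(prediction, choice):,}")
    # predicted one-box, chose one-box: $1,000,000
    # predicted one-box, chose two-box: $1,001,000
    # predicted two-box, chose one-box: $0
    # predicted two-box, chose two-box: $1,000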
  4. Meikai Heck This Schlong
    Originally posted by Nigger Nintendo No that's wrong. It contains whatever it contains. You're not going to change the outcome by taking both boxes. This can't be any clearer. If Omega believed you were going to pick B, it will be full, if not then not. But that doesn't matter now.


    What Omega believed you were going to pick is most likely to be what you eventually pick. You can't change what's in the box. You can change what you pick, and Omega will likely have predicted that you would do so, and so the boxes will likely have been filled accordingly. This is fine. It doesn't break anything. Reality still works the same way it did beforehand.


    Originally posted by Nigger Nintendo It's got nothing to do with liking it or not, you just have rooted a false sense of superiority inside refusing the premises of the scenario.


    You're the one trying to refuse the premise that your suboptimal strategy has been demonstrably and reliably proven to be suboptimal within the thought experiment itself. 🤷


    Originally posted by Nigger Nintendo It's a hole in your setup, not my fault. You've already precommitted to the idea that you're making the optimal decision but you just cannot acknowledge the fact that whatever is in the boxes is already whatever is in them.

    You say "whatever's in the boxes" like it's random chance, but the reality is "what is in the boxes most likely corresponds to a correct prediction by Omega".

    EDIT:
    Originally posted by Nigger Nintendo From Wikipedia, Newcomb's Paradox without HTS fucking it up with his transsexual retard brain

    I took the formulation of the question from Eliezer Yudkowsky's little essay on it. This setup might predate that essay, I dunno, but I deemed it sufficient and more importantly fun. Because this is an SG thread.

    I've even quoted his essay ITT.
  5. Meikai Heck This Schlong
    And for the record, the Wikipedia formulation changes nothing here. The predictor is inherently reliable. You don't know what it has predicted, but you do know that whatever it has predicted will reliably be true. ie you know that if you pick A+B, the predictor will have reliably predicted that you would pick A+B, and you will reliably receive only $1000.
  6. Lanny Bird of Courage
    Originally posted by Meikai You don't know what it has predicted, but you do know that whatever it has predicted will reliably be true.

    ie you know that if you pick A+B, the predictor will have reliably predicted that you would pick A+B, and you will reliably receive only $1000.

    RETROCAUSALITY
    E
    T
    R
    O
    C
    A
    U
    S
    A
    L
    I
    T
    Y
  7. Nigger Nintendo Starving African Child
    Originally posted by Meikai What Omega believed you were going to pick is most likely to be what you eventually pick. You can't change what's in the box. You can change what you pick, and Omega will likely have predicted that you would do so, and so the boxes will likely have been filled accordingly. This is fine. It doesn't break anything. Reality still works the same way it did beforehand.

    But what Omega predicted is in the past. You cannot get past this point: if you pick B-only and it has a million dollars in it, then it always had the million dollars in it, and it still would have even if you had taken both A+B. Conversely, if you take A+B and B is empty, it was always empty, and choosing B-only in that scenario would have given you squat. Your choice at the time changes nothing except whether you also collect the $1,000.

    This can't be explained any more clearly; you're just refusing to acknowledge it.

    Originally posted by Meikai You're the one trying to refuse the premise that your suboptimal strategy has been demonstrably and reliably proven to be suboptimal within the thought experiment itself. 🤷

    You haven't proven shit but how thick you are


    You say "whatever's in the boxes" like it's random chance, but the reality is "what is in the boxes most likely corresponds to a correct prediction by Omega".

    Never implied it was a matter of random chance. I'm saying what's in the boxes doesn't change based on what I pick now; it depends only on what Omega, back when it loaded them, thought I would pick now.

    Unless you are positing retrocausality, it doesn't matter what you pick at the moment, just what Omega thought you would. Your decision now isn't going to change anything about how Omega loaded the boxes. Only what Omega thought your decision would be did that.

    EDIT:


    Originally posted by Meikai I took the formulation of the question from Eliezer Yudkowsky's little essay on it. This setup might predate that essay, I dunno, but I deemed it sufficient and more importantly fun. Because this is an SG thread.

    I've even quoted his essay ITT.

    Yeah Eliezer is a retard, I thought we covered this before when discussing Roko's Basilisk.
  8. Meikai Heck This Schlong
    "The correct answer is the answer in which you receive the greatest utility only in cases where a reliable predictor is actually unreliable."

  9. Meikai Heck This Schlong
    Originally posted by Lanny RETROCAUSALITY
    E
    T
    R
    O
    C
    A
    U
    S
    A
    L
    I
    T
    Y

    It is not causative. There's no causality. Your choice is just likely to correlate.
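    One way to see how the correlation can hold with no backwards causation, as a rough sketch: suppose each player has a disposition, Omega reads that disposition (imperfectly) before filling the boxes, and the player later acts on the same disposition. Nothing about the choice reaches back to touch box B, yet one-boxers still walk away richer on average. The 0.99 accuracy figure is just an illustrative assumption.

    import random

    # Toy model: the disposition is the common cause of both the prediction
    # (made before the boxes are filled) and the eventual choice.
    def trial(accuracy: float = 0.99) -> tuple[str, int]:
        disposition = random.choice(["one", "two"])
        flipped = "two" if disposition == "one" else "one"
        prediction = disposition if random.random() < accuracy else flipped
        box_b = 1_000_000 if prediction == "one" else 0   # sealed before the choice
        choice = disposition                              # the player acts on their disposition
        payout = box_b if choice == "one" else box_b + 1_000
        return choice, payout

    totals = {"one": [0, 0], "two": [0, 0]}               # [sum of payouts, count]
    for _ in range(100_000):
        choice, payout = trial()
        totals[choice][0] += payout
        totals[choice][1] += 1

    for c in ("one", "two"):
        print(f"{c}-boxers average ${totals[c][0] / totals[c][1]:,.0f}")
    # one-boxers average roughly $990,000; two-boxers roughly $11,000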
  10. Nigger Nintendo Starving African Child
    Originally posted by Meikai "The correct answer is the answer in which you receive the greatest utility only in cases where a reliable predictor is actually unreliable."


    Strawman, like a bitch.
  11. Nigger Nintendo Starving African Child
    Originally posted by Meikai It is not causative. There's no causality. Your choice is just likely to correlate.

    What Omega actually predicted is irrelevant at the juncture of making your decision; B is already either empty or full.
  12. Meikai Heck This Schlong
    Originally posted by Nigger Nintendo What Omega actually predicted is irrelevant at the juncture of making your decision; B is already either empty or full.

    And whether or not B is actually full will correlate reliably with whether you eventually choose to one box or two box. Correct.
  13. Nigger Nintendo Starving African Child
    Originally posted by Meikai And whether or not B is actually full will correlate reliably with whether you eventually choose to one box or two box. Correct.

    No it correlated in the past but there's no guarantee it will in the future.

    I already gave you plausible scenarios earlier in which Omega could attain a 100% success rate without having any particular insight into the decider.
  14. Meikai Heck This Schlong
    Originally posted by Nigger Nintendo No it correlated in the past but there's no guarantee it will in the future.

    This assumes that the reliable predictor will predict unreliably. There's no guarantee, but there's a high likelihood it will correlate in the future, because the predictor is reliable.
  15. Nigger Nintendo Starving African Child
    Originally posted by Meikai This assumes that the reliable predictor will predict unreliably. There's no guarantee, but there's a high likelihood it will correlate in the future, because the predictor is reliable.

    No part of this particular setup necessarily entails Omega being a 100% reliable predictor.
  16. Meikai Heck This Schlong
    It doesn't need to be 100% reliable (aka infallible), it just needs to be reliable. There's justification to believe it will be reliable when I choose, based on Omega's track record/the reliable predictor's inherent, definitional reliability. What justification do you have for supposing it will be unreliable when you choose?

    It's like saying it's reasonable to make decisions based on the possibility the sun might not rise tomorrow (admittedly, the data on Omega's track record isn't quite that good, but it is certainly an example of the more ambiguous "reliability" implied by the - obviously far superior - setup you chose).
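    For what it's worth, the predictor doesn't need to be anywhere near infallible for one-boxing to win on expectation. A sketch of the break-even point, assuming the standard amounts and a single accuracy p that applies whichever way you lean:

    # One-boxing beats two-boxing on expectation when
    #   p * 1_000_000  >  p * 1_000 + (1 - p) * 1_001_000
    # which solves to p > 1_001_000 / 2_000_000 = 0.5005.
    def one_box_ev(p: float) -> float:
        return p * 1_000_000

    def two_box_ev(p: float) -> float:
        return p * 1_000 + (1 - p) * 1_001_000

    for p in (0.50, 0.5005, 0.51, 0.90, 100 / 101):
        print(f"p = {p:.4f}: one-box EV = {one_box_ev(p):>12,.0f}, two-box EV = {two_box_ev(p):>12,.0f}")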
  17. Meikai Heck This Schlong
    "noooo it might be wrong this time the box is already full or empty my decision can't change anything i'm going to reliably come out with more utility because i'll always win something"



    "i'm going to reliably come out with more utility because the predictor's predictions are reliable by definition, and my strategy maximizes for utility in cases for which that is true."

  18. Nigger Nintendo Starving African Child
    Originally posted by Meikai It doesn't need to be 100% reliable (aka infallible), it just needs to be reliable.

    I know, that makes my point, not yours.

    Originally posted by Meikai There's justification to believe it will be reliable when I choose, based on Omega's track record/the reliable predictor's inherent, definitional reliability. What justification do you have for supposing it will be unreliable when you choose?

    No part of what I said depends on Omega being unreliable, which is why I am calling you retarded: you are clearly failing to read.

    Like I said, you do not get brownie points for Omega being wrong or right.

    In my case you would pick A+B and expect to open B to find $0 in it, and Omega would be right. But you still shouldn't regret your decision... because you'd know that there was never $1m in it to begin with, before you ever decided, and if you had decided otherwise you would have gotten dick.

    All I said is that the boxes don't change, so it doesn't matter what you actually choose now, only what Omega thought you would actually choose, and that's in the past.

    YOU are the one asserting that somehow your actual decision in the present affects what Omega predicted it would be in the past.

    But you have not established this in any way, and it is contrary to what is presented in the setup. Unless you can establish this, you are asserting a non sequitur.

    Originally posted by Meikai It's like saying it's reasonable to make decisions based on the possibility the sun might not rise tomorrow (admittedly, the data on Omega's track record isn't quite that good, but it is certainly an example of the more ambiguous "reliability" implied by the - obviously far superior - setup you chose).

    My point has nothing to do with Omega being wrong, just that your actual decision in the present has no influence on how Omega predicted it in the past.
  19. Meikai Heck This Schlong
    Originally posted by Nigger Nintendo Like I said you do not get brownie points for Omega being wrong or right.

    Like I said, no. Depending on your choice, you just get a fuckton of utility on the basis of whether Omega was right or wrong. Not brownie points - the thing we actually care about. Ignoring the fact that Omega is more likely than not to be right is asinine.
  20. Lanny Bird of Courage
    Originally posted by Meikai It is not causative. There's no causality. Your choice is just likely to correlate.

    So your choice has no causal impact on what box B contains? And you get box B with either choice, right?

    Then the only thing you're choosing is whether to get $1,000 more than you would have otherwise. Sure sounds like A+B is correct.
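    The two lines of reasoning in this thread, side by side as a sketch: Lanny's dominance argument holds the (already fixed) contents of B constant and compares the choices cell by cell, while Meikai's expectation argument conditions on the prediction tracking the choice with some reliability p (the 100/101 figure from earlier is reused here as an assumption). They recommend different boxes, which is the whole paradox.

    # Dominance reasoning: whatever is already in B, two-boxing pays exactly $1,000 more.
    for box_b in (0, 1_000_000):
        print(f"B holds ${box_b:,}: one-box gets ${box_b:,}, two-box gets ${box_b + 1_000:,}")

    # Expectation reasoning: condition on the prediction matching the choice with reliability p.
    p = 100 / 101
    print(f"one-box EV: ${p * 1_000_000:,.0f}")
    print(f"two-box EV: ${p * 1_000 + (1 - p) * 1_001_000:,.0f}")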