
The Hard Problem of Consciousness

  1. Obbe Alan What? [annoy my right-angled speediness]
    Originally posted by Common De-mominator Obbe's understanding of consciousness is entirely based on Westworld.

    I actually haven't seen that yet. But I am planning on it. Why are you such a negative person towards me? We used to play Minecraft together and you would ask me about God. Now you do weird drugs and are an asshole. I try to contribute to your thread and you just want to argue. I think I will just stop participating.
  2. Common De-mominator African Astronaut
    Originally posted by Obbe I actually haven't seen that yet. But I am planning on it. Why are you such a negative person towards me? We used to play Minecraft together and you would ask me about God. Now you do weird drugs and are an asshole. I try to contribute to your thread and you just want to argue. I think I will just stop participating.

    I just know it is useless to discuss philosophy with you because it's not going to go anywhere.

    And the reason for that is that you in no sense feel like learning anything or having your mind changed. You can deny it, but I will tell you it is bullshit: what you say does not match what you do.

    You read one book, then defend the position found from that. That's fine. What's not fine for me (why I'm not going to entertain serious discussion with you till you demonstrate otherwise) is that you so fundamentally fail to grasp even what you're spouting that you literally cannot progress any conversation once anyone presses you on it beyond your emotional attachment to whatever you made your mind up about.

    The ought/is discussion on the meat thread was a great example because it was brutally obvious that no matter how anyone posed any questions, you had no idea how to respond, so you just parroted "no ought only is" on repeat.

    You can actually Google the way that specific argument has gone on other forums and websites and do a broad comparison to the way you handled it. Go ahead, do it yourself.

    Learn intellectual honesty. Learn basic logic. Learn to examine your own position. Then we will talk. Until then I can just waste my time talking to a brick or Gadzooks instead.
  3. Common De-mominator African Astronaut
    I often ask a similar question to the epiphenomenalists: why consciousness? Why suffer or enjoy at all? Why not just feed forward and react to the environment?

    And the answer I come up with again and again is: this must evolutionarily be a great way to succeed and survive. It gives us extraordinary versatility to view ourselves as selves, to see in an integrated visual field, and so on.
  4. GGG victim of incest [my veinlike two-fold aepyornidae]
  5. vindicktive vinny
    Originally posted by Obbe I actually haven't seen that yet. But I am planning on it. Why are you such a negative person towards me? We used to play Minecraft together and you would ask me about God. Now you do weird drugs and are an asshole. I try to contribute to your thread and you just want to argue. I think I will just stop participating.

    because you refused his sexual advances,
  6. There's a hard problem with my cock

    Come and remedy that travesty
  7. Common De-mominator African Astronaut
    Originally posted by vindicktive vinny because you refused his sexual advances,

    Refusal is not an option.
  8. Common De-mominator African Astronaut
    Originally posted by SHARK My personal dissatisfaction comes from the fact that my mouth can obviously talk about things like the ineffable nature of the colour blue, and how hard it is for me to describe it to a blind man.

    It is difficult for me to imagine why my mouth would be talking about something like the "shape" of a ball if my integrated visual experience was not part of the process. And if it is, I don't see any good reason to separate it from its syntactical function.

    I like to think about it by analogy to a calculator. Certainly I can generate a report without consciousness, like I can generate the number 40 on a screen without ever doing any actual calculation of the number. In theory it could just contain a massive list of "if/then" statements that match an input question to fetch a precalculated output. For example, "if input 2+2 then print 4", but for all possible combinations of calculations I might reasonably try.

    What convinces me that calculation is actually happening is that we can understand reductively what's taking place and break down, in principle, WHY the calculator generates the output in the general case. The explanation is completely syntactical at its most basic level, but the calculative idea is an abstraction of that.
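    A minimal sketch of that contrast in Python (purely illustrative; the names and the inputs covered are made up): a lookup-table "calculator" that only fetches precomputed answers, next to one that actually computes and therefore generalizes.

```python
# Lookup-table "calculator": one precomputed entry per anticipated input.
# No arithmetic ever happens; answers were baked in ahead of time.
LOOKUP = {
    "2+2": 4,
    "3+5": 8,
    "10+30": 40,
}

def lookup_calc(expr: str) -> int:
    """Fetch a canned answer; fails on any input not anticipated in advance."""
    return LOOKUP[expr]

def real_calc(expr: str) -> int:
    """Actually compute the sum, so it handles the general case."""
    a, b = expr.split("+")
    return int(a) + int(b)

# Both agree wherever the table happens to have coverage...
assert lookup_calc("10+30") == real_calc("10+30") == 40
# ...but only the real calculator works on inputs nobody anticipated.
assert real_calc("123+456") == 579
```

    The reductive explanation of `real_calc` is completely syntactical (split the string, add the integers), yet its general-case behavior is exactly what licenses calling it calculation rather than lookup.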

    In the case of consciousness, there is decent evidence that our conscious perception does play some causal role in our behaviour, even if we don't know HOW (and I am not saying this means evidence of conscious libertarian free will or anything, but that conscious perception feeds forward into behaviour).

    For example, ever catch a ball? Go out with a friend and have them freely toss the ball to you from far away, do it a couple of times and try to observe the contents of your mind as it happens. Now, as an unstructured informational procedure this shit is difficult as fuck to automate. But as it turns out, the "optimization" that the human brain developed to perform this function is to move the object into the middle of the visual field, then use proprioception to move your hand relative to your head and catch it. I think in that case, you're very consciously aware of what's about to happen as the ball moves in on you, and you adjust to catch it.

    You can still argue that an integrated information structure that is analogous to a visual field can exist and be used to process data without consciousness (in theory), but I think empirically that is not the case for the brain.

    As an example, you can look into the "phi illusion". One version of this illusion uses only two lights, separated by some distance. At the beginning, one is lit and the other is off. The first goes off, then the second goes on.

    However, those subjected to this version of the illusion will report seeing the light move from the first to the second, even though there is no intermediate light; it is just an on/off.

    Now of course no intermediate light exists. The illusion of movement exists purely in consciousness, and yet the subjects report it as something they saw.

    Now it is possible that the report is still just generated by completely unconscious processes, and consciousness of the experience is just a coincidental epiphenomenon. But I find that hard to believe because… Then why is the machine behaving like it is?

    So let's imagine we prick Lanny and Zombie Lanny with a pin in our universe and the proposed zombie universe. Both say "Ouch!" and I say "you baby, that didn't hurt!" Zlanny snaps back "Fuck you, it did!". Remember, these universes are physically identical so Zlanny surely reports for the same physical reasons as you, and surely he must be speaking with the same conviction as you… You're convinced you're having a qualitative experience but Zlanny would be convinced of the same. So… if it's just some syntactical state that produces the seeming of conscious pain, then how do you know YOU'RE not a Zombie now?

    And if that's the case, what does the additional element actually do for you that it doesn't do for the Zombie? Not "what function could it serve?" I mean literally, WHAT are we talking about at that point? What is left over in your case?

    The problem simply vanishes if you remove the proposed additional element. In reverse, I think the problem is "generated" by entertaining the additional element. So just don't add any new ingredients.

    I do have some sympathy towards property dualism though, and I think information as a concept sets us up to derive consciousness as something that reducibly arises from known physics. But I still think the properties of information structures are firmly physical in nature.



    It's subsumed by the physical in the sense that if we can push it around and get reports of it, we can investigate it as a physical phenomenon.

    I think what you are talking about is the software/hardware distinction, and it applies to the mind/brain distinction very well. The hardware involved is some variable syntactical machinery and the software is the input information that can configure it a particular way.


    The information stored on a CD vs on a vinyl for example is subsumed by the physical because the point is to generate the same syntactical result. The end goal is how to vibrate your auditory sensors in a particular way, and we can find different ways to accomplish that.

    The song isn't actually on the disc nor in the player, both are simply precursors that must be combined to generate that particular information structure to be interpreted by you.

    The way I view it is, it is very similar to considering the more abstract ideas of a computer.

    For example, I can syntactically explain how your PC does everything it does while running a Java program without ever referencing the Java Virtual Machine, and in theory I could produce all the functionality of the JVM from pure random chance too. And conversely, if I had no idea wtf was going on from the other perspective and I went in to reverse engineer the PC from the hardware and physics, it would seem indecipherable and I'd have no idea wtf was going on above the syntactical level.
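    A Python sketch of the same two-perspectives point (standing in for the JVM example; Python's own virtual machine plays the JVM's role here): the same function viewed as a high-level abstraction, and as the opcode-by-opcode syntactical account the VM actually executes.

```python
import dis
import io

def add(a, b):
    # High-level, "calculative" description: add two numbers.
    return a + b

# The same behaviour described at the syntactical level below the abstraction:
# a listing of the raw bytecode instructions the Python VM steps through.
buf = io.StringIO()
dis.dis(add, file=buf)
bytecode_listing = buf.getvalue()

print(bytecode_listing)  # the low-level account, indecipherable without the abstraction
print(add(2, 2))         # the high-level account: it just adds
```

    Reading only the bytecode listing, nothing marks it as "addition of two numbers"; that idea only appears once you step up an abstraction layer, which is the reverse-engineering problem in miniature.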

    Dennett's "Two Black Boxes" thought experiment is a great way to think about related concepts.

    http://cogprints.org/247/1/twoblack.htm



    I think there is decent evidence that conscious events are active physical events, and I find it plausible that they are defined by their physical causal properties, which would be what structures the content of our consciousness. If that is indeed the case, then I think it plays a causal role by being "what your body responds to", essentially.

    My current view lines up with most simulationists like Marvin Minsky: that consciousness is essentially the process that crunches the raw data and makes it more workable, the "user illusion", the desktop to your brain so it is actually usable, as opposed to using punch cards on a beige box with no monitor. There is a structure in the brain known as the "claustrum", which seems to be responsible for information integration. I think that, alongside the phi illusion, tells us something about how our brains must process data: consciousness is "assembled" unconsciously as a means to process the external world. So I think it's reasonable to assume that it feeds forward for your body to actually respond to it rather than just being an internal lightshow that you sort of "are".




    Think about conscious states in similar terms to software: I can generate a given text file using any computing hardware and word processing software, and open it on pretty much any hardware and software. And you can generate the text output without the file.

    But of course my text file is in fact a real thing, and it is fundamentally physical in nature. It is even possible in principle to determine the ontic fact of whether or not it exists. There is just an absurd number of abstraction layers between it and the physics involved, so doing that is ridiculously difficult; but in principle, all information about my text file is reducible to physics.
  9. Common De-mominator African Astronaut
    One interesting thing to consider is whether consciousness is simulatable.

    If information integration is important then naively, I don't think any current AI techniques will be able to simulate consciousness. But then again, could you simulate an integrated information structure in such a way that it might have the same properties as a real one, even if it's simulated on normal, feed forward hardware?

    And what effect, if any, will neuromorphic computing have?
  10. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    Originally posted by Common De-mominator One interesting thing to consider is whether consciousness is simulatable.

    If information integration is important then naively, I don't think any current AI techniques will be able to simulate consciousness. But then again, could you simulate an integrated information structure in such a way that it might have the same properties as a real one, even if it's simulated on normal, feed forward hardware?

    And what effect, if any, will neuromorphic computing have?

    It depends on how deeply layered the neural networks are.

    For all intents and purposes, if its software is tit for tat equivalent to the wetware, then why wouldn't it be considered conscious?
  11. Common De-mominator African Astronaut
    Originally posted by gadzooks It depends on how deeply layered the neural networks are.

    For all intents and purposes, if its software is tit for tat equivalent to the wetware, then why wouldn't it be considered conscious?

    If IIT is correct (which I doubt), then consciousness is unsimulatable: it would be impossible to simulate in any feed-forward system. This is because the causal structure of the integrated information structure is fundamental to consciousness in IIT.
  12. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    Originally posted by Common De-mominator If IIT is correct (which I doubt), then consciousness is unsimulatable: it would be impossible to simulate in any feed-forward system. This is because the causal structure of the integrated information structure is fundamental to consciousness in IIT.

    I'm still reading up on IIT.

    All I know is that the neural nets I have constructed were, for the most part, only a single layer thick. From that I just extrapolated that, assuming IIT (or any number of computational theories of consciousness, for that matter) is even valid, it seems pretty reasonable to assume that whatever subjective experience is, it emerges from these networks as long as they are sufficiently complex.

    But yeah, still reading up on IIT. There are a few other theories I still have to catch up on too. Shits changing so fast.
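    For concreteness, a net of the "single layer thick" kind mentioned above can be sketched in a few lines of pure Python (the learning rate, epoch count, and seed are arbitrary illustrative choices): a perceptron trained on the AND function.

```python
import random

# A minimal single-layer perceptron trained on AND.
random.seed(0)  # arbitrary seed, for reproducibility of this sketch
weights = [random.uniform(-1, 1) for _ in range(2)]
bias = random.uniform(-1, 1)
LEARNING_RATE = 0.1  # arbitrary illustrative choice

def predict(x):
    """Weighted sum plus bias, thresholded at zero."""
    activation = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if activation > 0 else 0

# The AND function as (input, target) pairs.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

# Perceptron learning rule: nudge weights toward each target.
for _ in range(100):
    for x, target in data:
        error = target - predict(x)
        for i in range(2):
            weights[i] += LEARNING_RATE * error * x[i]
        bias += LEARNING_RATE * error

# AND is linearly separable, so a single layer suffices here.
assert all(predict(x) == t for x, t in data)
```

    Notably, a single layer like this can never learn XOR, which is one concrete reason depth, and not just raw size, changes what such networks can represent; that is the crux of the "how deeply layered" question above.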
  13. Common De-mominator African Astronaut
    It would be possible to simulate IIT consciousness on neuromorphic chips, apparently. But it's not simulatable in software, according to Tononi; you need the direct causal properties.
  14. Common De-mominator African Astronaut
    Christof Koch talked about how IIT's "shapes" are in some qualitative field, but I don't know if he is actually invoking some fundamental structure there, or if the phenomenology is only fundamental to the formulation of the theory. You would need the direct causal efficacy of integrated information structures in the brain to operate in that field.
  15. Common De-mominator African Astronaut
    GWT (Global Workspace Theory) explained by Baars



    GWT is the current leading theory of consciousness in the scientific space.
  16. Lanny Bird of Courage
    Originally posted by SHARK My personal dissatisfaction comes from the fact that my mouth can obviously talk about things like the ineffable nature of the colour blue, and how hard it is for me to describe it to a blind man.

    It is difficult for me to imagine why my mouth would be talking about something like the "shape" of a ball if my integrated visual experience was not part of the process. And if it is, I don't see any good reason to separate it from its syntactical function.

    I think this is a good point and well put. The fact that we can report on experiential phenomena doesn't seem like a terrible challenge to the epiphenomenalist, but the fact that we have reportability on second-order things like qualia would seem to suggest that experience has a causal impact on behavior. Or at least that would seem to be the simplest explanation. I've been mulling this over since you posted it and don't have a fleshed out response. I'm not convinced it's a conclusive refutation of epiphenomenalism but it certainly is a challenge.

    Just wanted to say I thought that was a good point and the reason I haven't replied is because I'm not totally sure what I think about it yet.
  17. Common De-mominator African Astronaut
    You got it, I just wanted to discuss the topic more.
  18. HTS highlight reel
    Lol my username on MSN when MSN instant messenger was still a thing used to be X: Return of the Bicameral Mind, because bicameralism made me lose my shit when I was like 17. It is a really interesting theory. Check it out, De-Mominator.
  19. Common De-mominator African Astronaut
    I know of it but it is just fiction.
  20. HTS highlight reel
    Originally posted by Common De-mominator I know of it but it is just fiction.

    Like Gadzooks said, that is probably too firm a stance to take. It's a collection of hypotheses. An untested (or even untestable) hypothesis is not necessarily fiction. :/