
The Hard Problem of Consciousness

  1. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    Originally posted by Lanny Given I have good reason to believe consciousness exists, but no evidence that it has any physical effect on the world, nor any physical evidence that it even exists, it seems quite reasonable to put it into some non-physical category.

    So essentially you would describe yourself as a mind-body dualist?

    Logically, it does seem to necessarily follow that consciousness can't possibly be represented in physical terms (as you said in your post).

    I guess that makes me a dualist as well, but it's hard not to start seeking religious explanations for the origins and existence of this non-physical space.
  2. Lanny Bird of Courage
    Yeah, when you say "dualist" people start thinking Descartes and souls and "because God lol".

    But that's part of why there's a distinction between substance and property dualism.

    Specifically I don't think there's some mind substance floating around somewhere that's somehow bound to my physical body. I think minds are a non-physical product of physical systems, the state of my body determines the state of my mind but mind and body are fundamentally different kinds of things.
    The following users say it would be alright if the author of this post didn't die in a fire!
  3. Obbe Alan What? [annoy my right-angled speediness]
    Originally posted by gadzooks Where exactly is this experiencing subject located?

    Where is consciousness, or the "mind-space"? Most people feel that it is in their head, behind the eyes. But everyone here knows there is no such space in anyone's head at all. We are continually inventing these spaces in our own and other people's heads knowing they don't actually exist, and that the location is arbitrary.

    When I am conscious, I am definitely using parts inside my head. But so am I when riding a bicycle. The bicycle riding does not occur inside my head. Bicycle riding has a definite geographical location. Consciousness does not. Consciousness has no location whatsoever beyond where we imagine it to be.
  4. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    I would definitely consider myself a property dualist then.

    But still, why does this particular property need to even exist? Do sea slugs have some kind of super limited dual property as well?

    Do lizards, mice, cats, dogs have a subjective mind space?

    What about non-human primates?

    And then, what about AI, either in its current state, or hypothetically in the future when it is exponentially more advanced?

    Like when we start using LSTM-based neural networks, and constructing complex aggregated machine learning models into multifunctional integrated "brains", will this same dual property emerge for them as well?
  5. Originally posted by Lanny Yeah, when you say "dualist" people start thinking Descartes and souls and "because God lol".

    But that's part of why there's a distinction between substance and property dualism.

    Specifically I don't think there's some mind substance floating around somewhere that's somehow bound to my physical body. I think minds are a non-physical product of physical systems, the state of my body determines the state of my mind but mind and body are fundamentally different kinds of things.

    This gave me a boner and that’s the hard problem of consciousness
    The following users say it would be alright if the author of this post didn't die in a fire!
  6. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    What about a little non-physical monism...

    I'm bringing Berkeley back.

  7. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    There is no physical world, folks.

    It's all an illusion.
  8. Krow African Astronaut
    Originally posted by gadzooks There is no physical world, folks.

    It's all an illusion.

    then why do I have an ear infection from hell?
  9. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    Originally posted by Krow then why do I have an ear infection from hell?

    The ear infection is merely an idea in the mind of God.

    Note: I am actually not a metaphysical idealist / mind-body non-physical monist.
  10. Krow African Astronaut
    Muppet Says NOooooooooooooooooo!
  11. SHARK Houston
    Originally posted by mmQ I've often considered an experiment of having a child and keeping it in a pitch black room for however many years with no sound or any interaction whatsoever other than feeding and watering it, and then introducing it into 'the real world' and seeing what it does.

    They would be severely mentally retarded. This "experiment" has been tried before under similar conditions, but less extreme. More extreme would just lead to more retarded children. Genetically, we are geared to be learning machines based on cultural transmission. No culture and you're basically left with a sub-ape-level creature, because even apes need nurturing.
  12. SHARK Houston
    Originally posted by gadzooks It's hard to take a firm position either way on the matter, because we have no instruments that can "detect/measure" subjective experience.

    Listen dipshit, you have been babbling this the entire time and it is just making you look like a retard.

    We already accept report to be a sign of consciousness in living beings. The very fact that you're talking about consciousness at all by making air waves through your mouth and a pattern of ATP discharges that actuate your keyboard means we have a genuine measuring instrument available. The entire point of this discussion at large is to figure out what aspect of that instrument leads to consciousness. We already have a for-sure "maximal" neural correlate of consciousness, it is the brain et al.

    The way to test for consciousness is to take an approach similar to Integrated Information Theory, which tries to find what type of physical structures could support the phenomenological properties of consciousness, then proposes to test minimal neural correlates of consciousness (MNCCs) against the phenomenology.

    One, in this framework you would compute a value called Phi to measure integration, and thus "how conscious" a system is due to its level of integration. This is basically a test of how many interconnections each informational unit in a system is subject to. The conscious state is essentially just this state in any given instant.
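The "interconnections" idea above can be sketched as a toy calculation. To be clear, this is NOT the real IIT Phi formula (which compares cause-effect repertoires of the system against its minimum-information partition); `integration_proxy` is an invented stand-in that only captures the spirit of "how much does the system resist being cut into independent parts":

```python
from itertools import combinations

def integration_proxy(n_nodes, edges):
    """Toy proxy for IIT-style integration (not the real Phi):
    the number of connections crossing the weakest bipartition.
    A system that splits cleanly into independent halves scores 0
    ("not integrated"); a densely interlinked one scores high."""
    nodes = range(n_nodes)
    best = None
    # try every bipartition, keep the one with the fewest crossing edges
    for k in range(1, n_nodes // 2 + 1):
        for part in combinations(nodes, k):
            a = set(part)
            crossing = sum(1 for u, v in edges if (u in a) != (v in a))
            best = crossing if best is None else min(best, crossing)
    return best

# two disconnected pairs: fully decomposable, so integration is 0
print(integration_proxy(4, [(0, 1), (2, 3)]))                  # → 0
# a ring: every bipartition cuts at least 2 links
print(integration_proxy(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # → 2
```

The real Phi is far heavier (it is defined over state transition probabilities, not just wiring), but the qualitative point survives: a feed-forward or decomposable system scores low no matter how much raw processing it does.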

    Two, you can test it by using the accepted report hardware, the brain. This is actually a current area of neuroscience research. You can simply stimulate a particular set of neural structures and generate a conscious experience, then record report from the subject.

    In fact we understand consciousness so well as an idea that researchers, reasoning a priori from current neuroscience research and known principles, were able to predict a brand new optical illusion, never before practically or empirically observed, which was then tested and replicated.

    That's how much predictive purchase we have over the concept already. I thought you were a psych major or some shit? Do you actually read any consciousness research literature or just jerk off to the conversation like anime nerds who have fantasy battles?
    The following users say it would be alright if the author of this post didn't die in a fire!
  13. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    Originally posted by SHARK Listen dipshit, you have been babbling this the entire time and it is just making you look like a retard.

    We already accept report to be a sign of consciousness in living beings. The very fact that you're talking about consciousness at all by making air waves through your mouth and a pattern of ATP discharges that actuate your keyboard means we have a genuine measuring instrument available. The entire point of this discussion at large is to figure out what aspect of that instrument leads to consciousness. We already have a for-sure "maximal" neural correlate of consciousness, it is the brain et al.

    The way to test for consciousness is to take an approach similar to Integrated Information Theory, which tries to find what type of physical structures could support the phenomenological properties of consciousness, then proposes to test minimal neural correlates of consciousness (MNCCs) against the phenomenology.

    One, in this framework you would compute a value called Phi to measure integration, and thus "how conscious" a system is due to its level of integration. This is basically a test of how many interconnections each informational unit in a system is subject to. The conscious state is essentially just this state in any given instant.

    Two, you can test it by using the accepted report hardware, the brain. This is actually a current area of neuroscience research. You can simply stimulate a particular set of neural structures and generate a conscious experience, then record report from the subject.

    In fact we understand consciousness so well as an idea that researchers, reasoning a priori from current neuroscience research and known principles, were able to predict a brand new optical illusion, never before practically or empirically observed, which was then tested and replicated.

    So then let's say we actually isolate these physical substrates, and create an entire mapping of neural structures / neural activity to each and every possible conscious experience. I'm still not seeing how that proves physical monism.

    Does IIT even try to address issues like animal consciousness or machine consciousness?

    If these neural correlates are merely coding the information that produces phenomenological experience, then it stands to reason that consciousness is not a phenomenon restricted to biological entities.

    Originally posted by SHARK That's how much predictive purchase we have over the concept already. I thought you were a psych major or some shit? Do you actually read any consciousness research literature or just jerk off to the conversation like anime nerds who have fantasy battles?

    Psychology is an incredibly vast field.

    It just so happens I took a fourth-year seminar on philosophy of mind and consciousness. But that was nearly a decade ago. Knowledge fades over time. Also, IIT is pretty new so I haven't had much of a chance to explore it.
  14. SHARK Houston
    Originally posted by Obbe In being conscious of consciousness we feel it is the defining attribute of all our waking states, our moods and affections, memories, thought and attention. We feel that consciousness is the basis of concepts, learning, reasoning. We feel that consciousness must be located within our heads. All of these statements are actually false.

    Consciousness is a much smaller part of our mental life than we are conscious of, as we cannot be conscious of what we are not conscious of - sort of like asking a flashlight in a dark room to find something that doesn't have any light shining upon it. Everywhere it looks there appears to be light, when in reality most of the room is in darkness.

    We feel that consciousness is continuous. But if you think of a minute as being 60000 milliseconds, are you conscious for every one of those milliseconds? We are conscious less often than we believe we are, because we cannot be conscious of when we are not conscious.

    Consciousness is often unnecessary. Consciousness is not necessary for concepts.
    Consciousness is not necessary for learning. Consciousness is not necessary for thinking, nor for reasoning. Consciousness is not a copy of experience. Consciousness has no location. I may elaborate on these statements more later on, but if you are actually interested in learning more about this theory of consciousness, read The Origin of Consciousness by Julian Jaynes.

    Idk what loony toons definition of consciousness you are (or whoever you stole this from is) using but none of those are necessary assumptions for consciousness, and a couple of baseless claims are made. For example if some part of visual consciousness is indeed an integrated information structure, the "why" of it being conscious (even though you don't "need" it to produce report etc) is simply a matter of "how" it comes to be effective in how our body further processes our visual perception. What makes this not so simple is that how that actually works is ludicrously complex.

    I definitely don't think IIT is a complete or final theory at all, but it addresses an obviously important element of consciousness, which is information integration, something which is very important to account for when giving an account of the appearance of a "subjective experience".
  15. Obbe Alan What? [annoy my right-angled speediness]
    Originally posted by SHARK Idk what loony toons definition of consciousness you are (or whoever you stole this from is) using but none of those are necessary assumptions for consciousness, and a couple of baseless claims are made. For example if some part of visual consciousness is indeed an integrated information structure, the "why" of it being conscious (even though you don't "need" it to produce report etc) is simply a matter of "how" it comes to be effective in how our body further processes our visual perception. What makes this not so simple is that how that actually works is ludicrously complex.

    I definitely don't think IIT is a complete or final theory at all, but it addresses an obviously important element of consciousness, which is information integration, something which is very important to account for when giving an account of the appearance of a "subjective experience".

    Subjective conscious mind is an analog of what is called the real world. It is built up with terms that are all metaphors or analogs of behavior in the physical world. Its reality is of the same order as mathematics. It allows us to shortcut behavioral processes and arrive at more adequate decisions. Like mathematics, it is an operator rather than a thing or repository. And it is intimately bound up with volition and decision.

    Consider the language we use to describe conscious processes. The most prominent group of words used to describe mental events are visual. We ‘see’ solutions to problems, the best of which may be ‘brilliant’, and the person ‘brighter’ and ’clearheaded’ as opposed to 'dull', 'fuzzy-minded', or 'obscure' solutions. These words are all metaphors and the mind-space to which they apply is a metaphor of actual space. In it we can 'approach' a problem, perhaps from some 'viewpoint', and 'grapple' with its difficulties, or seize together or 'com-prehend' parts of a problem, and so on, using metaphors of behavior to invent things to do in this metaphored mind-space.

    The adjectives to describe physical behavior in real space are analogically taken over to describe mental behavior in mindspace when we speak of our minds as being 'quick,' 'slow', 'agitated' (as when we cogitate or co-agitate), 'nimble-witted', 'strong-' or 'weak-minded.' The mind-space in which these metaphorical activities go on has its own group of adjectives; we can be 'broad-minded', 'deep', 'open', or 'narrow-minded'; we can be 'occupied'; we can 'get something off our minds', 'put something out of mind', or we can 'get it', let something 'penetrate', or 'bear', 'have', 'keep', or 'hold' it in mind.

    As with a real space, something can be at the 'back' of our mind, in its 'inner recesses', or 'beyond' our mind, or 'out' of our mind. In argument we try to 'get things through' to someone, to 'reach' their 'understanding' or find a 'common ground', or 'point out', etc., all actions in real space taken over analogically into the space of the mind.
  16. SHARK Houston
    Originally posted by Lanny I suppose it depends on if you consider laws involving physical things to be themselves part of the physical facts about the world. That sounds pedantic but it's kind of important, if physical facts can give rise to non-physical phenomena, and the laws governing this interaction are not part of the physical facts, then it's perfectly conceivable that the physical facts of the world can remain unaltered while the non-physical facts which supervene on them are different.

    I think the main problem (and Descartes faced this too) is that if a physical fact can affect it, then it is subsumed by the physical.

    Even if it is only a one way street (which I don't agree with) and an epiphenomenon, the fact that it is one way in one particular configuration and another way in another configuration means there is at least some reason that gives it structure one way rather than the other.

    Now you can argue that this merely establishes a correlation, and you can conceive of another universe where the physical facts are the same but whatever non-physical facts are different. But I would argue that there is some kind of necessary, principally reducible relationship (and consciousness has some direct causal power) precisely because there seems to be non-arbitrary structure to it: the experience of eating an apple differs from the experience of eating a pear in ways that I could confirm by differences in what I know about them, and you can report it. I think that the more conservative explanation is simply that consciousness is just some set of conditions in an information space. There can be information systems structured and integrated in such a way that a change in some element of that system will affect the overall state of the system, and "consciousness" is simply what it means to access and work with any element of the integrated system. It is simply a property of the model in that case. And if it's integrated in the right ways, they can pretty much start to "think" as a result of the previous integrated state of the structure.

    I think there needs to be something extra because empirical investigation seems to at least be theoretically capable of explaining all of my physical behaviors without reference to consciousness. Nervous signals, information processing, muscle actuation, all physical phenomena that don't need to make reference to consciousness to explain. Despite a legacy of substance dualism and a society that assumes, almost by necessity, that our conscious experience has some kind of executive role in our behavior, we have no evidence that this is the case, and some reasonable evidence to the contrary. The physical world doesn't actually seem to give us any evidence that consciousness exists at all: there are biological machines walking around and we can explain their behavior, but it's only by analogy to our own experience and form that we attribute experience to other physical systems; nowhere in nature have we empirically discovered consciousness.

    And yet I have immediate and intensely compelling evidence that I am in fact conscious, far better evidence than the physical sciences have ever given me for anything. And I can't even imagine what the empirical sciences could do to produce evidence of consciousness; what would it look like?

    Given I have good reason to believe consciousness exists, but no evidence that it has any physical effect on the world, nor any physical evidence that it even exists, it seems quite reasonable to put it into some non-physical category.

    I think the problem is that you're thinking of two different levels of explanation as one.

    I agree that you don't need consciousness to, in principle, explain the reports of a conscious person in a syntactical capacity.

    And in theory in some branch of the quantum wave function, we could end up in a universe where monkeys banging on a typewriter output answers constantly as a matter of pure chance that completely pass the Turing test, but of course there is no consciousness going on.

    But I think you would need consciousness to explain the intermediate steps in a complete way that has total predictive power over the behaviour of systems that are structured a certain way. For example if the content of consciousness is indeed an integrated information structure, then why is it crunched a particular way? I think the generalised answer will be precisely the nature of consciousness itself. And this, again, I think will be some very abstract, internally referential, integrated information scheme that will have a particular set of properties that ARE what consciousness "is", what it means to be conscious. But again, I don't think you can just meaningfully digest it by analogy, and you will need to understand its full syntactical workings to even begin to understand it.

    I suspect that in principle we could find some abstract syntactical, coincidentally structured garbage as the explanation to our consciousness, and a lot of "meaning" will only be locatable by trying to understand culture and other environmental influences as they relate to us. Even these will ultimately become very abstract because they're so complexly linked to us over so many generations that there is no real unified first person comprehending "meaning" to you, as a person. It's not that you just don't get it, it's literally not going to be the right file format for meaningfulness in your consciousness, it's like trying to use VLC to open the VLC exe.

    A good example is colour. People often point to the explanatory gap on explaining something like "what blue is" to a blind man while establishing the domain of subjective experience.

    But even colour perception is quite demonstrably just some abstract representational phenomenon taking place within your consciousness VM. There is no reason to believe that colour is actually some magically ineffable qualia that exists of itself, but rather that it is so complexly related to our brains as an idea that again, it just makes no sense in any way except the syntactical sense, because our mind isn't meant to actually run the steps, it runs ON those steps.
  17. gadzooks Dark Matter [keratinize my mild-tasting blossoming]
    SHARK, I want to apologize for calling you a pseudo-intellectual blowhard.

    I'm reading up on this IIT research and it's actually pretty fucking fascinating.

    Back when I took that seminar in philosophy of mind / consciousness, this area of inquiry was in its infancy, so I don't think I had any exposure to it.

    This kind of research is so right up my alley I could devote an entire academic career to it (in fact, I considered that at one point).

    Anyway, thank you for introducing me to Integrated Information Theory.

  18. Lanny Bird of Courage
    Originally posted by gadzooks But still, why does this particular property need to even exist?

    I don't know. I'm not sure what explanation you could give. I think it falls into the same category as "why does mass even need to exist?". Maybe we can explain it in terms of simpler particles or fields, but then we have to explain those things. At some point we get to the most fundamental laws and it's just like "because that's how our universe happens to be".

    Do sea slugs have some kind of super limited dual property as well?

    Do lizards, mice, cats, dogs have a subjective mind space?

    What about non-human primates?

    And then, what about AI, either in its current state, or hypothetically in the future when it is exponentially more advanced?

    Like when we start using LSTM-based neural networks, and constructing complex aggregated machine learning models into multifunctional integrated "brains", will this same dual property emerge for them as well?

    It seems pretty reasonable to assign consciousness to lower animals, there doesn't seem to be anything unique about our neurology that makes us conscious and other animals not. I mean I could conceive of cats and rats and such just being inert zombie bio-machines but it doesn't seem likely in light of my own experience and the machinery necessary to support it.

    I have no reason to assume animal neurology is the only kind of physical system that can support consciousness, so I have no particular objection to the notion that computers could, at some point, have animal or human levels of consciousness. I'm pretty cynical about the whole neural networks fad and I'm not convinced they're well positioned for producing machine consciousness. Even Jets and Sharks style IAC nets in the 80s were conceptually closer to cognition than all the deep wankery you see making headlines today.

    Originally posted by SHARK I think the main problem (and Descartes faced this too) is that if a physical fact can affect it, then it is subsumed by the physical.

    Even if it is only a one way street (which I don't agree with) and an epiphenomenon, the fact that it is one way in one particular configuration and another way in another configuration means there is at least some reason that gives it structure one way rather than the other.

    I know a lot of people are uncomfortable with epiphenomenalism but honestly it's never really bothered me, and I think once you get over it being really different than how we usually think about things it's really a pretty satisfying explanation, or at least framework for explanation, of consciousness.

    I don't see why consciousness supervening on physical facts makes it "subsumed by the physical". Like I have a certain MP3 file in my library. There's a copy on my laptop's SSD and on a backup spinning platter drive. We'd quite naturally say these are the same file, they consist of the same byte sequence. Yet that byte sequence supervenes on a distribution of magnetic charge over a chunk of spinning metal in one case and on the presence of electrical charge in the other. I wouldn't call the byte sequence "subsumed by the magnetic" in one case and "subsumed by the electrical" in the other. The byte sequence isn't magnetic or electrical, it's abstract, even if it "emerges" (be it by our design) from different physical phenomena.
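The file analogy can be made concrete with a toy sketch. The "platter" and "cell" encodings below are invented stand-ins (real storage physics is nothing this simple); the point is only that two physically different substrates can realize one and the same abstract byte sequence:

```python
# Toy model: one abstract byte sequence, two physical realizations.
# Magnetic platter -> pattern of N/S polarities; SSD -> cell voltages.
data = bytes([0b10110010, 0b01101110])

# "write" the same bits onto each substrate
magnetic = ["N" if bit == "1" else "S"
            for byte in data for bit in f"{byte:08b}"]
electric = [3.3 if bit == "1" else 0.0
            for byte in data for bit in f"{byte:08b}"]

def read_magnetic(platter):
    # recover bytes from the polarity pattern
    bits = "".join("1" if pole == "N" else "0" for pole in platter)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

def read_electric(cells):
    # recover bytes from the voltage pattern
    bits = "".join("1" if volts > 1.0 else "0" for volts in cells)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

# different physics, identical abstract content
print(read_magnetic(magnetic) == read_electric(electric) == data)  # → True
```

Neither reading function cares what the other substrate looks like; the identity holds at the level of the byte sequence, which is the sense in which the file "supervenes" on either medium without being magnetic or electrical itself.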

    But I would argue that there is some kind of necessary, principally reducible relationship (and consciousness has some direct causal power) precisely because there seems to be non-arbitrary structure to it: the experience of eating an apple differs from the experience of eating a pear in ways that I could confirm by differences in what I know about them

    That would seem to be good evidence that the physical facts give rise to experience, but I don't see why it necessitates consciousness having causative power. It seems to demonstrate that the physical has causative power (the physical composition of a fruit affects the experience of consuming it) but not that our conscious experience has any effect on our behavior.



    I agree that you don't need consciousness to, in principle, explain the reports of a conscious person in a syntactical capacity.

    And in theory in some branch of the quantum wave function, we could end up in a universe where monkeys banging on a typewriter output answers constantly as a matter of pure chance that completely pass the Turing test, but of course there is no consciousness going on.

    But I think you would need consciousness to explain the intermediate steps in a complete way that has total predictive power over the behaviour of systems that are structured a certain way. For example if the content of consciousness is indeed an integrated information structure, then why is it crunched a particular way? I think the generalised answer will be precisely the nature of consciousness itself. And this, again, I think will be some very abstract, internally referential, integrated information scheme that will have a particular set of properties that ARE what consciousness "is", what it means to be conscious. But again, I don't think you can just meaningfully digest it by analogy, and you will need to understand its full syntactical workings to even begin to understand it.

    Sure, I'm not proposing that brains are just random chemical vats that happen to give rise to coherent behavior that looks like a conscious agent is controlling it. It seems quite clear that the brain does a great deal of information processing in interpreting and producing speech. The reason I say one date and not some other when someone asks me my birthday is because of complex information processing in the brain, and if you want to take "consciousness" to mean "information processing" then sure, there are no p-zombies; you can't reasonably explain that behavior without information processing. But I don't think that explains why I have a subjective experience of someone asking me a question. Databases can answer that question, do the requisite information processing, all the time without having an experience of doing so. Nor am I saying that when someone asks me a question it's just a coincidence that I have an experience that seems to correspond to that, obviously my experience corresponds to physical reality in some way. I just think that the experience of being asked a question and responding, and the behavior and physical changes involved in it, are different things. Even if they have the same cause and always happen together in this particular world, I have no problem imagining systems without consciousness answering the same question (this happens all the time, as with the database) or having the experience without the corresponding physical system undergoing the same changes.
  19. Krow African Astronaut
    Originally posted by SHARK Listen dipshit, you have been babbling this the entire time and it is just making you look like a retard.

    We already accept report to be a sign of consciousness in living beings. The very fact that you're talking about consciousness at all by making air waves through your mouth and a pattern of ATP discharges that actuate your keyboard means we have a genuine measuring instrument available. The entire point of this discussion at large is to figure out what aspect of that instrument leads to consciousness. We already have a for-sure "maximal" neural correlate of consciousness, it is the brain et al.

    The way to test for consciousness is to take an approach similar to Integrated Information Theory, which tries to find what type of physical structures could support the phenomenological properties of consciousness, then proposes to test minimal neural correlates of consciousness (MNCCs) against the phenomenology.

    One, in this framework you would compute a value called Phi to measure integration, and thus "how conscious" a system is due to its level of integration. This is basically a test of how many interconnections each informational unit in a system is subject to. The conscious state is essentially just this state in any given instant.

    Two, you can test it by using the accepted report hardware, the brain. This is actually a current area of neuroscience research. You can simply stimulate a particular set of neural structures and generate a conscious experience, then record report from the subject.

    In fact we understand consciousness so well as an idea that researchers, reasoning a priori from current neuroscience research and known principles, were able to predict a brand new optical illusion, never before practically or empirically observed, which was then tested and replicated.

    That's how much predictive purchase we have over the concept already. I thought you were a psych major or some shit? Do you actually read any consciousness research literature or just jerk off to the conversation like anime nerds who have fantasy battles?

    someone will just reiterate "G0ds Dream"
  20. SHARK Houston
    Originally posted by Obbe Subjective conscious mind is an analog of what is called the real world. It is built up with terms that are all metaphors or analogs of behavior in the physical world. Its reality is of the same order as mathematics. It allows us to shortcut behavioral processes and arrive at more adequate decisions. Like mathematics, it is an operator rather than a thing or repository. And it is intimately bound up with volition and decision.

    Consider the language we use to describe conscious processes. The most prominent group of words used to describe mental events are visual. We ‘see’ solutions to problems, the best of which may be ‘brilliant’, and the person ‘brighter’ and ’clearheaded’ as opposed to 'dull', 'fuzzy-minded', or 'obscure' solutions. These words are all metaphors and the mind-space to which they apply is a metaphor of actual space. In it we can 'approach' a problem, perhaps from some 'viewpoint', and 'grapple' with its difficulties, or seize together or 'com-prehend' parts of a problem, and so on, using metaphors of behavior to invent things to do in this metaphored mind-space.

    The adjectives to describe physical behavior in real space are analogically taken over to describe mental behavior in mindspace when we speak of our minds as being 'quick,' 'slow', 'agitated' (as when we cogitate or co-agitate), 'nimble-witted', 'strong-' or 'weak-minded.' The mind-space in which these metaphorical activities go on has its own group of adjectives; we can be 'broad-minded', 'deep', 'open', or 'narrow-minded'; we can be 'occupied'; we can 'get something off our minds', 'put something out of mind', or we can 'get it', let something 'penetrate', or 'bear', 'have', 'keep', or 'hold' it in mind.

    As with a real space, something can be at the 'back' of our mind, in its 'inner recesses', or 'beyond' our mind, or 'out' of our mind. In argument we try to 'get things through' to someone, to 'reach' their 'understanding' or find a 'common ground', or 'point out', etc., all actions in real space taken over analogically into the space of the mind.

    You are literally just babbling past me, which is what you always do. The problem is still simply: why are thoughts "had" by a system, rather than just feeding forward informationally? Whether this is a metaphor or analogy or a dildo up my ass is still not even beginning to describe the problem.