The Hard Problem of Consciousness
2019-03-11 at 7:49 PM UTC
Originally posted by gadzooks Because the older we get, the more things start to go wrong.
It's like an old beater car.
If your 1950's jalopy breaks down, it could be because of a number of things malfunctioning in concert.
In theory, at least, human death by age and natural causes is preventable. We just have to isolate all the different mechanisms responsible for senescence in humans.
Bad analogy. Car parts don't have any ability to regenerate at the molecular level or any other. Humans can be like car surgeons and remove, replace, or restore, like a plastic surgeon, but human cells regenerate and replace or repair themselves.
2019-03-11 at 8:05 PM UTC
Originally posted by Krow Bad analogy. Car parts don't have any ability to regenerate at the molecular level or any other. Humans can be like car surgeons and remove, replace, or restore, like a plastic surgeon, but human cells regenerate and replace or repair themselves.
Up to a point, though.
Cells age much in the same way entire organisms age.
And our ability to regenerate cells depends entirely on the type of cell, the health of the entire organism, and the health of the surrounding organ and other tissues.
2019-03-11 at 8:11 PM UTC
Originally posted by SHARK The Hard Problem of consciousness is figuring out how to derive the information of "what it is like" to be something from facts about "what it is".
As an example, there is an apparent difference between being your objective brain state and being in your subjective mental state. The hard problem is figuring out how you come to have internal subjective states.
It is this ability to experience at all that defines the ability for anything to matter, because subjectivity is mattering itself.
So how can matter give rise to subjectivity? This is the hard problem.
Regarding the last paragraph: matter as in the physical matter that makes up the brain and other human cells? All matter breaks down, at times beyond the nuclear level, changing the composition of its collective molecular combinations. Can subjectivity (thought?) be copied to "the cloud" or other storage devices? If the human brain starts to die and fragment, it will lose subjectivity or perception before the matter that makes up the physical brain does; that matter still exists beyond death, not functioning on its own, but the subjectivity is lost. We can't prove, after removing a brain post-mortem and somehow processing it, whether memory has been stored. And uploading consciousness to a non-human or human-made device is a copy. But human thoughts are conjoined with human physical needs and physical responses. So a third device clearly must exist: a soul. Some energy, but not related to what human matter requires in order to exist. Some argue instinctual behavior is the soul.
2019-03-11 at 8:13 PM UTC
Fuck Android typewriter
2019-03-11 at 8:20 PM UTC
Originally posted by Krow Can subjectivity (thought?) be copied to "the cloud" or other storage devices?
Awesome post, but I especially love this line.
Long-term memories are clearly stored via some physical neural configuration.
Why can't we (theoretically, at least) copy that exact same configuration, thus preserving the memories?
The problem then becomes: how do we reconstruct those memories in an understandable medium?
Can we create 2D images that display on a screen for living human observers to experience?
What about 3D images?
What about AR (Augmented Reality) that allows the user to experience all natural human senses associated with that memory? Sight, sound, touch, smell, taste, etc.
The only thing missing is context (i.e. if the memory is one of looking at a currently endangered species that no longer exists when the person is wearing the AR equipment, they will be pretty confused about what they're seeing).
But, that still runs into the homunculus problem.
The person wearing the AR is not equivalent to the person who originally experienced that memory.
2019-03-11 at 8:33 PM UTC
Originally posted by Obbe That's already a pretty elaborate explanation. If you are looking for even more elaboration on this topic read "The Origin of Consciousness" by Jaynes.
If you have a specific question I may have a specific answer.
Well, it doesn't really help solve the hard problem in any way. I think maybe there is some abstract analogy to the truth in it, given the nature of information structures, but ultimately you still have to explain how it boils down to particles bumping together. Proposing some new conceptual space doesn't really help with the problem.
2019-03-11 at 9:26 PM UTC
Originally posted by gadzooks Obbe, I absolutely loved your post on metaphor and language as a way of explaining the origins of inner experience. Like, I'm reading some stuff on that very subject right now, and academically speaking, it's RIGHT up my alley.
But, in terms of the actual "Hard Problem" of consciousness, it doesn't quite reach that level of explanation.
And that's precisely why it's called the "hard problem."
Your post kinda comes close to the topic, but it still doesn't explain PRECISELY when, and how, during the course of the evolution of these metaphorical and linguistic experiential phenomena, we went from physical automata to experiencing "I"'s.
But I do want to reiterate that your post touched on some really good points that at least fall under the rubric of explaining consciousness, which is an endeavor that perplexes even the most prominent philosophers, neuroscientists, psychologists, cognitive scientists, and other scholars.
The actual "Hard Problem", though, could quite possibly be relabeled the "Impossible Problem."
There's still the issue of explaining why we have inner experience when the world could just as easily exist exactly as is without any such experience.
I agree that this is possibly an impossible problem, but let me clarify something here: the idea is that the "inner space" you speak of is actually an illusion generated by metaphorical language. There isn't really an "inner space" or an "inner experience". Consciousness is a creation of complex language combined with a complex society. In this theory, consciousness is an operation of language, not a secret world that somehow exists outside of objective reality. The idea that subjective experience somehow exists beyond objective reality, with no basis in metaphorical language, seems impossible to prove, so why should anyone even entertain it?
2019-03-11 at 9:30 PM UTC
But what does metaphorical language have to do with anything? How could it account for the sweetness of a tomato?
2019-03-11 at 10:05 PM UTC
Originally posted by gadzooks It's the qualitative/subjective experience (qualia) of a physical phenomenon.
Do mice experience sweetness?
What about ants?
I think that is something separate from consciousness as it is defined in Julian Jaynes' theory. I imagine ants and mice probably have some level of awareness of their world, but I doubt it is anything like consciousness as we experience it. As far as I am aware, mice and ants lack the vocabulary required to create an inner mind-space.
2019-03-11 at 10:13 PM UTC
Originally posted by Obbe I think that is something separate from consciousness as it is defined in Julian Jaynes' theory. I imagine ants and mice probably have some level of awareness of their world, but I doubt it is anything like consciousness as we experience it. As far as I am aware, mice and ants lack the vocabulary required to create an inner mind-space.
But the problem (or at least, one part of the problem) is the whole issue of when in our evolutionary history did we evolve this inner mind-space?
When "cavemen" were grunting at each other while pointing to indicate "over there", were they experiencing inner mind-space? -
2019-03-11 at 10:40 PM UTC
Originally posted by gadzooks But the problem (or at least, one part of the problem) is the whole issue of when in our evolutionary history did we evolve this inner mind-space?
When "cavemen" were grunting at each other while pointing to indicate "over there", were they experiencing inner mind-space?
As recently as 3,000-4,000 years ago, according to the theory, which uses written texts from before, during, and after this period as evidence for the change. Since it is a cultural phenomenon (not biological), the exact timing depends on the complexity of the culture and of the language in particular, but if we focus on just Western civilization, roughly 3,000-4,000 years ago is what the theory proposes.
2019-03-11 at 10:42 PM UTC
Originally posted by Obbe As recently as 3,000-4,000 years ago, according to the theory, which uses written texts from before, during, and after this period as evidence for the change. Since it is a cultural phenomenon (not biological), the exact timing depends on the complexity of the culture and of the language in particular, but if we focus on just Western civilization, roughly 3,000-4,000 years ago is what the theory proposes.
I see what you're saying, but what kinda throws a wrench into that explanation is the issue of explaining a mechanism for that transition.
How did we gradually develop inner subjective experience?
What does it mean to be partially conscious?
2019-03-11 at 10:48 PM UTC
Originally posted by gadzooks I see what you're saying, but what kinda throws a wrench into that explanation is the issue of explaining a mechanism for that transition.
How did we gradually develop inner subjective experience?
What does it mean to be partially conscious?
I imagine the development was somewhat similar to the gradual development of a metaphorical language to create the illusion of "inner subjective experience".
We are partially conscious right now. You are aware of the subconscious? There are various things you are not conscious of. As you read this, there are various things you could potentially be conscious of, but you're not because this is distracting you.
2019-03-11 at 10:54 PM UTC
Originally posted by Obbe I imagine the development was somewhat similar to the gradual development of a metaphorical language to create the illusion of "inner subjective experience".
We are partially conscious right now. You are aware of the subconscious? There are various things you are not conscious of. As you read this, there are various things you could potentially be conscious of, but you're not because this is distracting you.
I'm liking your line of thought here... The whole notion of the subconscious being an indication that we are actually experiencing partial consciousness is something I had never even considered before.
But, why couldn't our brains perform all these same operations that they perform every single day without ANY consciousness?
What is it about this metaphor/language theory that necessitates subjective experience?
Once computers are able to perform these same operations (and they will, most likely even within our lifetime), will they then be just as conscious as human beings are?
2019-03-11 at 11:14 PM UTC
Originally posted by gadzooks I'm liking your line of thought here… The whole notion of the subconscious being an indication that we are actually experiencing partial consciousness is something I had never even considered before.
But, why couldn't our brains perform all these same operations that they perform every single day without ANY consciousness?
What is it about this metaphor/language theory that necessitates subjective experience?
Once computers are able to perform these same operations (and they will, most likely even within our lifetime), will they then be just as conscious as human beings are?
The thing I have been referring to as "consciousness" or "inner mind-space" is an illusion generated by a complex language combined with a complex social world. Humans couldn't perform the way we do without consciousness because consciousness is an operation of complex language and we have evolved to communicate with each other using complex languages in complex cultures. To remove consciousness from the picture would be to remove language and culture from the picture and without language and culture nothing about our world would be the same.
Consciousness is not simple awareness, at least as it is defined in the theory. You can teach an ape sign language and ask an ape questions, and the ape is aware you are asking and will answer you. However, apes don't ask questions back. They will ask for things like food, but an ape will never ask "why do we eat food?". Nothing about their brains suggests they would be incapable of asking these types of questions. They just don't.
When you look at human evolution, you see a similar phenomenon in ancient art. Not only are there no questions as to how things happened; the elder is quick to curtail the possibility of asking such questions by giving a myth or narrative. The world is the way it is because it is.
The ability to ask these questions isn't genetic. It's cultural. We've had the ability to ask these questions for 200,000 years, yet we only really started asking them recently, over the last 3,000 to 4,000 years.
2019-03-11 at 11:18 PM UTC
Originally posted by Obbe The thing I have been referring to as "consciousness" or "inner mind-space" is an illusion generated by a complex language combined with a complex social world. Humans couldn't perform the way we do without consciousness because consciousness is an operation of complex language and we have evolved to communicate with each other using complex languages in complex cultures. To remove consciousness from the picture would be to remove language and culture from the picture and without language and culture nothing about our world would be the same.
Consciousness is not simple awareness, at least as it is defined in the theory. You can teach an ape sign language and ask an ape questions, and the ape is aware you are asking and will answer you. However, apes don't ask questions back. They will ask for things like food, but an ape will never ask "why do we eat food?". Nothing about their brains suggests they would be incapable of asking these types of questions. They just don't.
When you look at human evolution, you see a similar phenomenon in ancient art. Not only are there no questions as to how things happened; the elder is quick to curtail the possibility of asking such questions by giving a myth or narrative. The world is the way it is because it is.
The ability to ask these questions isn't genetic. It's cultural. We've had the ability to ask these questions for 200,000 years, yet we only really started asking them recently.
Okay, maybe currently existing biological species are incomparable to humans in that regard.
But, returning to the computer AI comparison...
How do we know that the cutting edge AI we end up seeing 10+ years from now won't start posing questions like "what am I?" or "why do I simply do what humans tell me to do?", etc...?
Once they start doing that, are they conscious like we are?
2019-03-11 at 11:25 PM UTC
Originally posted by gadzooks Okay, maybe currently existing biological species are incomparable to humans in that regard.
But, returning to the computer AI comparison…
How do we know that the cutting edge AI we end up seeing 10+ years from now won't start posing questions like "what am I?" or "why do I simply do what humans tell me to do?", etc…?
Once they start doing that, are they conscious like we are?
I would imagine so. Given the potential for superhuman processing power, I imagine that once AI begins to create its own language and its own "social world", more complex than any human culture is capable of, these AIs would have much more consciousness than we can possibly imagine.
2019-03-12 at 7:02 PM UTC
Originally posted by gadzooks I see what you're saying, but what kinda throws a wrench into that explanation is the issue of explaining a mechanism for that transition.
How did we gradually develop inner subjective experience?
What does it mean to be partially conscious?
The Bible has been very clear about this: we developed consciousness the moment Eve plucked the apple and ate it.