What if what we think is consciousness is just a bunch of if/else statements?
-
2017-12-17 at 4:50 PM UTC
IDK man. I often see computer science guys comment on AI and, in a cheeky way, say something along the lines of: Hon hon hon! (IDK why they're French, just roll with it OK) That's not AI! That's le if/else statements!
What's the difference? And maybe what we perceive as our own consciousness is the same?
I just went full Obbe, didn't I? -
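For concreteness, a minimal sketch of the difference people usually point at: a bot whose rules are fixed forever next to one whose rule gets rewritten by feedback. All names and numbers here are invented for illustration.

# Fixed rules: behaviour never changes, no matter what happens to the bot.
def fixed_bot(temperature):
    if temperature > 30:
        return "too hot"
    elif temperature < 10:
        return "too cold"
    return "fine"

# "Learning" version: still a conditional, but the number inside it
# gets rewritten at runtime based on feedback.
class LearningBot:
    def __init__(self):
        self.hot_threshold = 30.0

    def act(self, temperature):
        return "too hot" if temperature > self.hot_threshold else "fine"

    def feedback(self, temperature, was_actually_hot):
        # nudge the threshold toward whatever the feedback says
        if was_actually_hot and temperature <= self.hot_threshold:
            self.hot_threshold -= 1.0
        elif not was_actually_hot and temperature > self.hot_threshold:
            self.hot_threshold += 1.0

Whether piling up enough of the second kind ever adds up to anything like experience is the actual question here.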
2017-12-17 at 4:57 PM UTC
Consciousness is not what most people think it is.
-
2017-12-17 at 5:30 PM UTC
^ so it is what few people think it is????
-
2017-12-17 at 9:03 PM UTC
Is someone still human if they are in a state of unconsciousness?
-
2017-12-17 at 9:09 PM UTC
I've often wondered myself whether we enjoy a certain complexity only because of an intricately woven tapestry of interconnected phenomenal binaries, or whether it is more like a chorus of relatively complex individual pixels. But I think there's probably data to support many different conceptions of how consciousness is produced, and many of these models could be at odds with one another.
Post last edited by Zanick at 2017-12-17T21:11:52.389898+00:00 -
2017-12-17 at 10:02 PM UTC
they can dress it up all they want; functionally, AI is just a huge series of runtime-rewritable conditionals
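In that spirit, a toy sketch (data and names invented) of a conditional that is generated at runtime instead of written by hand: a one-rule "decision stump" that picks its own threshold from labelled examples.

# A tiny "decision stump" learner: it reads labelled examples and produces
# a single if/else rule. The conditional exists, but its threshold comes
# from the data rather than from a programmer. Toy numbers for illustration.

def fit_stump(examples):
    """examples: list of (value, label) pairs, label is True/False."""
    best_threshold, best_correct = None, -1
    for threshold in sorted(v for v, _ in examples):
        correct = sum((value > threshold) == label for value, label in examples)
        if correct > best_correct:
            best_threshold, best_correct = threshold, correct
    return best_threshold

def predict(threshold, value):
    # the whole "model" is one runtime-generated conditional
    if value > threshold:
        return True
    return False

data = [(1, False), (2, False), (7, True), (9, True)]
t = fit_stump(data)   # threshold picked from the data, not hand-written
print(predict(t, 8))  # True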
-
2017-12-17 at 10:15 PM UTC
Originally posted by Zanick I've often wondered myself whether we enjoy a certain complexity only because of an intricately woven tapestry of interconnected phenomenal binaries, or whether it is more like a chorus of relatively complex individual pixels, but I think there's probably data to support many different conceptions of how consciousness is produced and many of these models could be at odds with one another.
Maybe even if we are just bio-machines, consciousness might be an emergent property of that state of being. -
2017-12-17 at 10:23 PM UTC
Even self-learning AI goes by that principle, right? It has to "know" the end goal and then goes by trial and error to move on. I saw it in a video with Super Mario. The chess one does the same.
Pretty sure this applies to most ways of learning. Consciousness itself does not operate on that principle, as it can ask about and question the end goal itself. It can also deny it. Mhmm.. maybe it can. Gets pretty deterministic here and that shit is 2meta4me. -
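The bare trial-and-error loop looks something like this: the "end goal" is baked in as a score, and the agent just keeps whatever random tweak doesn't score worse. This is a toy sketch of the idea, not what the Mario or chess systems actually run.

import random

GOAL = 42  # the target the agent "knows" it is supposed to reach

def score(guess):
    return -abs(GOAL - guess)  # closer to the goal = higher score

guess = 0
for _ in range(1000):
    tweak = guess + random.choice([-1, 1])
    if score(tweak) >= score(guess):  # keep changes that don't make things worse
        guess = tweak

print(guess)  # almost certainly 42 after enough tries

Note that the loop never questions GOAL; it only chases it, which is exactly the gap the post above points at: consciousness can ask whether the goal is worth having at all.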
2017-12-17 at 10:36 PM UTC
Originally posted by Sophie Good point. Now how about people?
I'd think so, but the brain doesn't seem to handle instructions and memory in the same way as a computer so we don't really have a way to prove or disprove it yet
Some conditions (instincts) are hardcoded, but generally learning works the same way - natural curiosity exposes a person to as many external stimuli as possible, and through the responses these conditions are built (i.e. naive curiosity leads a child to touch the stove, and from there they build a set of conditions to detect heat so it doesn't happen again). -
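The stove example as a toy sketch, with one reflex fixed in advance and avoidance conditions written in from experience (the threshold and names are invented for illustration):

PAIN_THRESHOLD = 60  # fixed "instinct": anything hotter than this hurts

learned_avoid = set()  # conditions built up from experience

def touch(thing, temperature):
    if thing in learned_avoid:
        return "refuses to touch it"      # learned condition fires first
    if temperature > PAIN_THRESHOLD:      # hardcoded reflex
        learned_avoid.add(thing)          # experience writes a new condition
        return "ouch, pulls hand away"
    return "touches it, nothing happens"

print(touch("stove", 200))  # ouch, pulls hand away
print(touch("stove", 200))  # refuses to touch it
print(touch("table", 20))   # touches it, nothing happens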
2017-12-17 at 10:40 PM UTC
Originally posted by aldra I'd think so, but the brain doesn't seem to handle instructions and memory in the same way as a computer so we don't really have a way to prove or disprove it yet
Some conditions (instincts) are hardcoded, but generally learning works the same way - natural curiosity exposes a person to as much external stimuli as possible, and through the response these conditions are built (ie. naive curiosity leads a child to touch the stove, and from there they build a set of conditions to detect heat so it doesn't happen again).
Detecting heat and feeling pain from experiencing too much heat are two different programs, though. A baby will still scream in agony if you burn them. I'd say the pain response is probably hardcoded as well, lol. -
2017-12-17 at 10:58 PM UTC
Originally posted by Sophie Detecting heat and feeling pain from experiencing too much heat are two different programs though. A baby will still scream in agony if you burn them. I'd say pain response is hard coded as well probably, lol.
well yeah, that was more just an example of how behaviours might be constructed as a response to external stimuli -
2017-12-18 at 12:20 AM UTC
bup