
Human-like A.I. within 9 years

  1. #61
    antinatalism Tuskegee Airman
    Originally posted by Hikikomori-Yume

    What an exciting world we live in ^_^

    can't wait for the day I'm riding in my autonomous car, watching weedtuber videos in AR while my robot @home tends to the grow-op.

    if by "human-like" you mean something that resembles your (lack of) brain activity, then ENIAC was already good enough, you don't need to wait
  2. #62
    Originally posted by antinatalism if by "human-like" you mean something that resembles your (lack of) brain activity, then ENIAC was already good enough, you don't need to wait

  3. #63
    Obbe Alan What? [annoy my right-angled speediness]
    172. First let us postulate that the computer scientists succeed in developing intelligent machines that can do all things better than human beings can do them. In that case presumably all work will be done by vast, highly organized systems of machines and no human effort will be necessary. Either of two cases might occur. The machines might be permitted to make all of their own decisions without human oversight, or else human control over the machines might be retained.

    173. If the machines are permitted to make all their own decisions, we can't make any conjectures as to the results, because it is impossible to guess how such machines might behave. We only point out that the fate of the human race would be at the mercy of the machines. It might be argued that the human race would never be foolish enough to hand over all the power to the machines. But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines' decisions. As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

    174. On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite -- just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone's physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes "treatment" to cure his "problem." Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or to make them "sublimate" their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they most certainly will not be free. They will have been reduced to the status of domestic animals.
  4. #64
    Originally posted by Open Your Mind 172. First let us postulate that the computer scientists succeed in developing intelligent machines that can do all things better than human beings can do them. In that case presumably all work will be done by vast, highly organized systems of machines and no human effort will be necessary. Either of two cases might occur. The machines might be permitted to make all of their own decisions without human oversight, or else human control over the machines might be retained.

    173. If the machines are permitted to make all their own decisions, we can't make any conjectures as to the results, because it is impossible to guess how such machines might behave. We only point out that the fate of the human race would be at the mercy of the machines. It might be argued that the human race would never be foolish enough to hand over all the power to the machines. But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines' decisions. As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

    174. On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite -- just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone's physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes "treatment" to cure his "problem." Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or to make them "sublimate" their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they most certainly will not be free. They will have been reduced to the status of domestic animals.

    Didn't read
  5. #65
    Obbe Alan What? [annoy my right-angled speediness]
    Originally posted by Captain Falcon Didn't read

    No need to get sour with me, CF. I will continue discussing determinism and free will with you when I either fix or replace my PC.
  6. #66
    I'm just expressing that I didn't read your post.
  7. #67
    Obbe Alan What? [annoy my right-angled speediness]
    So how come you don't always post about stuff you don't do?

    "Didn't climb a mountain just now."

    "Didn't shit my pants today."

    You know why. Expressing stuff you didn't do is stupid. You were just being sour.
    The following users say it would be alright if the author of this post didn't die in a fire!
  8. #68
    Originally posted by Open Your Mind So how come you don't always post about stuff you don't do?

    "Didn't climb a mountain just now."

    "Didn't shit my pants today."

    You know why. Expressing stuff you didn't do is stupid. You were just being sour.

    Because there's a difference between something I am not doing and something I'm actively refusing to do.

    For example, when Dave Chappelle did not take a $50 million contract, it would be retarded to say "SO WHAT?!? YOU ALSO DIDN'T SHIT YOUR PANTS?!!?!?"
  9. #69
    Obbe Alan What? [annoy my right-angled speediness]
    So in your mind refusing to read my posts is like refusing a 50 million dollar contract?
  10. #70
    Originally posted by Open Your Mind So in your mind refusing to read my posts is like refusing a 50 million dollar contract?

    No, you retard. The analogy is about refusing to do something vs merely not doing something.
  11. #71
    Originally posted by greenplastic Ugh, ok, I'll watch it later though, I'm not going to last 26 minutes right now without having to take a shit.

    I just wanted you to know that, had you read my post yesterday about rubbing one out, constipated or not... I waited to poop and successfully rubbed one out. Being constipated (which you clearly have a form of) is dangerous to your blood vessels. You can vertically tear an aorta from being constipated all the time.

    And I say this because a healthy bowel will inform you of the need to poop within 1 minute or less from the time of the urge. If you ignore it, it will create constipation.

    You're like me... you know it will take about 30 minutes to poop. That means you're putting pressure on your blood vessels, and you need more fiber in your diet, like me.

    Go drink Citrucel.
  12. #72
    greenplastic, I mean it... go drink Citrucel.
  13. #73
    mashlehash victim of incest [my perspicuously dependant flavourlessness]
    I can imagine an AI breaking someone's dick off.
  14. #74
    Originally posted by mashlehash I can imagine an AI breaking someone's dick off.

    That seems like an obvious outcome at some point, if it's programmed into the "nature" of such a creature. Would the programmer be held responsible for that in a liability case?
  15. #75
    Obbe Alan What? [annoy my right-angled speediness]
    Originally posted by Captain Falcon No, you retard. The analogy is about refusing to do something vs merely not doing something.

    You should have said "refused to read" instead of "Didn't read", you mong.
  16. #76
    HampTheToker African Astronaut
    Originally posted by Hikikomori-Yume They could just map the human senses then code that information into the robots or they could pull it from the cloud

    Explain how our sense of smell works.

    I'll wait.
    The following users say it would be alright if the author of this post didn't die in a fire!
  17. #77
    "Siren Server" is the collective data of all search and inputs request information by quantifying the needed data and then it allocates only what it has searched and formulate an answer by that search.

    it's figuring out and solving needed questions by looking up the best solutions based on all conversations or searches and answers since the beginning of internet. I bet it could even find out information related to 9/11 in more accuracy in just a few seconds to that of what NIST spit out over the years and millions they made.
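
    Roughly, here's a minimal sketch of that idea, assuming the "Siren Server" is nothing more than a store of past query/answer pairs that hands back the answer most often recorded for a given query; the class and names are made up purely for illustration, not any real system.

    from collections import Counter, defaultdict

    class SirenServerSketch:
        """Toy model: answer a new query by reusing the answer most often
        logged for the same query in past searches/conversations."""

        def __init__(self):
            # maps a normalized query to every answer previously logged for it
            self.history = defaultdict(list)

        def record(self, query, answer):
            """Log one past search/conversation outcome."""
            self.history[self._normalize(query)].append(answer)

        def answer(self, query):
            """Return the most common past answer for the query, or None."""
            past = self.history.get(self._normalize(query))
            if not past:
                return None
            best, _count = Counter(past).most_common(1)[0]
            return best

        @staticmethod
        def _normalize(query):
            # crude matching: lowercase and collapse whitespace
            return " ".join(query.lower().split())

    server = SirenServerSketch()
    server.record("best fiber supplement?", "Citrucel")
    server.record("best fiber supplement?", "Metamucil")
    server.record("best fiber supplement?", "Citrucel")
    print(server.answer("Best  fiber supplement?"))  # -> Citrucel

    Obviously the real thing would need fuzzy matching and ranking instead of exact query lookups, but that's the shape of it: the "intelligence" is just the accumulated record of everyone else's searches and answers.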
  18. #78
    Originally posted by Open Your Mind You should have said "refused to read" instead of "Didn't read", you mong.

    The refusal to read is implied by the fact that I'm making note of not reading. You should review some of the foundational principles of philosophical discussion, particularly the principle of charity:

    https://en.wikipedia.org/wiki/Principle_of_charity
  19. #79
    Originally posted by Captain Falcon The refusal to read is implied by the fact that I'm making note of not reading. You should review some of the foundational principles of philosophical discussion, particularly the principle of charity:

    https://en.wikipedia.org/wiki/Principle_of_charity

    Didn't read
  20. #80
    Lanny Bird of Courage
    No one gives a shit if you incidentally didn't read something, actively refused to read something, are mentally incapable of reading something, or are transcendentally beyond reading something. It's just a snotty reply that neither generates discussion nor makes for an amusing retort. I'm not even the one you're not reading, and it's obnoxious to see the same reply to every other post, got dang.
    The following users say it would be alright if the author of this post didn't die in a fire!