Page 5 of 9 FirstFirst ... 2345678 ... LastLast
Results 41 to 50 of 87

Thread: If Robots Walked The Earth...

  1. #41
    Snee's Avatar Error xɐʇuʎs BT Rep: +1
    Join Date
    Sep 2003
    Location
    on something.
    Age
    45
    Posts
    17,971
    Oki Doki then.

    Self-awareness and willingness to serve shouldn't preclude each other if you set up the parameters right.

    If you have to give them emotions I suppose you make it a bit more tricky.

    The question is whether this automatically follows intelligence.

  2. Lounge   -   #42
    bujub22's Avatar THE GREAT
    Join Date
    Nov 2003
    Location
    ny
    Age
    45
    Posts
    9,938
    Originally posted by SnnY@10 August 2004 - 10:34
    Oki Doki then.

    Self-awareness and willingness to serve shouldn't preclude each other if you set up the parameters right.

    If you have to give them emotions I suppose you make it a bit more tricky.

    The question is whether this automatically follows intelligence.
    I'm going to make my robot get me a sugar cookie

    damn sugar cookie getter

  3. Lounge   -   #43
    RGX's Avatar Unstoppable
    Join Date
    Mar 2003
    Posts
    3,012
    Originally posted by SnnY@10 August 2004 - 14:34
    Oki Doki then.

    Self-awareness and willingness to serve shouldn't preclude each other if you set up the parameters right.

    If you have to give them emotions I suppose you make it a bit more tricky.

    The question is whether this automatically follows intelligence.
    I would think that if you gave it morals (in order to protect human beings) and reasoning (to decide when to use its abilities), it would logically reason that it was being used immorally to serve others... it would take some careful coding to prevent this... it may even cause a conflict between its basic rule to protect and serve and its ability to use reasoning to counteract its decisions.

    Either way, it would be interesting to watch.

  4. Lounge   -   #44
    Lick My Lovepump
    Join Date
    May 2003
    Age
    22
    Posts
    2,657
    Originally posted by manker@10 August 2004 - 11:54
    Everyone knows that robots only go wrong when they are fitted with emotion chips. I say outlawing the development of emotion chips would make roboteering a safer pastime.

    Just say no to e-chips, kids.
    I'll just have to remember to not fit all my robots with emotion chips then, yeah...

  5. Lounge   -   #45
    manker's Avatar effendi
    Join Date
    May 2004
    Location
    I wear an Even Steven wit
    Posts
    32,371
    Originally posted by RGX@10 August 2004 - 15:29
    Whatever we do, we must prevent them from direct connecting and copulating. The artificial intelligence produced could cause the end of mankind. Imagine skynet x10, controlling every computer system capable of connecting to a network...and refusing to run Vice city on any of them.

    And oi you two I made a good point above, this is what happens when you try and bring some ethical debate into a bar.


    Indeed you did, I was thinking of a reply before I got sidetracked with keyboard issues.

    I think the ability to reason would be a product of the implementation of many other traits rather than a straight upload. For a robot to conclude through reasoning that it is wrong to enslave a race, it would have to possess a number of other characteristics, such as empathy and comparative skills.

    Also, with so many humans coming to the conclusion that racism is perfectly fine, I doubt it would be such a foregone conclusion that the robot would be able to reason his way to that particular conclusion even with all of the data at his disposal. More likely he would just accept the situation.
    I plan on beating him to death with his kids. I'll use them as a bludgeon on his face. -

    --Good for them if they survive.

  6. Lounge   -   #46
    bujub22's Avatar THE GREAT
    Join Date
    Nov 2003
    Location
    ny
    Age
    45
    Posts
    9,938
    Originally posted by RGX@10 August 2004 - 10:40
    Originally posted by SnnY@10 August 2004 - 14:34
    Oki Doki then.

    Self-awareness and willingness to serve shouldn't preclude each other if you set up the parameters right.

    If you have to give them emotions I suppose you make it a bit more tricky.

    The question is whether this automatically follows intelligence.
    I would think that if you gave it morals (in order to protect human beings) and reasoning (to decide when to use its abilities), it would logically reason that it was being used immorally to serve others... it would take some careful coding to prevent this... it may even cause a conflict between its basic rule to protect and serve and its ability to use reasoning to counteract its decisions.

    Either way, it would be interesting to watch.
    I say everyone should stop depending on robots and most technology, and get off their ass and do it themselves. That's why obeastty is at an all-time high, and I know I spelled that wrong.

  7. Lounge   -   #47
    Snee's Avatar Error xɐʇuʎs BT Rep: +1
    Join Date
    Sep 2003
    Location
    on something.
    Age
    45
    Posts
    17,971
    Originally posted by RGX@10 August 2004 - 16:40
    Originally posted by SnnY@10 August 2004 - 14:34
    Oki Doki then.

    Self-awareness and willingness to serve shouldn't preclude each other if you set up the parameters right.

    If you have to give them emotions I suppose you make it a bit more tricky.

    The question is whether this automatically follows intelligence.
    I would think that if you gave it morals (in order to protect human beings) and reasoning (to decide when to use its abilities), it would logically reason that it was being used immorally to serve others... it would take some careful coding to prevent this... it may even cause a conflict between its basic rule to protect and serve and its ability to use reasoning to counteract its decisions.

    Either way, it would be interesting to watch.
    I think you, my friend, should read some Asimov, if you haven't.

    I, Robot (the book, not the silly movie) treats subjects like this, to mention one.

  8. Lounge   -   #48
    manker's Avatar effendi
    Join Date
    May 2004
    Location
    I wear an Even Steven wit
    Posts
    32,371
    Originally posted by Mad Cat@10 August 2004 - 15:41
    Originally posted by manker@10 August 2004 - 11:54
    Everyone knows that robots only go wrong when they are fitted with emotion chips. I say outlawing the development of emotion chips would make roboteering a safer pastime.

    Just say no to e-chips, kids.
    I'll just have to remember to not fit all my robots with emotion chips then, yeah...

    I take it you never watched that particular episode of Robot Wars.

    A broken leg, three severed fingers, a missing child and Craig Charles still has trouble going to the toilet alone. Emotion chips are bad. Mmkay.
    I plan on beating him to death with his kids. I'll use them as a bludgeon on his face. -

    --Good for them if they survive.

  9. Lounge   -   #49
    RGX's Avatar Unstoppable
    Join Date
    Mar 2003
    Posts
    3,012
    Originally posted by bujub22@10 August 2004 - 14:42
    Originally posted by RGX@10 August 2004 - 10:40
    Originally posted by SnnY@10 August 2004 - 14:34
    Oki Doki then.

    Self-awareness and willingness to serve shouldn't preclude each other if you set up the parameters right.

    If you have to give them emotions I suppose you make it a bit more tricky.

    The question is whether this automatically follows intelligence.

    I would think that if you gave it morals (in order to protect human beings) and reasoning (to decide when to use its abilities), it would logically reason that it was being used immorally to serve others... it would take some careful coding to prevent this... it may even cause a conflict between its basic rule to protect and serve and its ability to use reasoning to counteract its decisions.

    Either way, it would be interesting to watch.
    I say everyone should stop depending on robots and most technology, and get off their ass and do it themselves. That's why obeastty is at an all-time high, and I know I spelled that wrong.
    In that case, I want all your further posts hand-mailed to every member you intend to see them.

    Technology has its uses.



    @ Snny: I keep meaning to read Asimov but never get around to it; it would seem that his subject matter and reasoning would make for some great reading. I'll use this to spur me to get some of his books, as I love this kind of ethical/technological debate.

    Ideally, it needs to be late at night and we all need to be drunk for this discussion to work.

    @ Manker: Good point(s) well put. I agree it would take more than a straight upload of basic reasoning skills, but we humans tend to push things too far, and I don't think it would be too long before emotions were experimented with on robots, to give us a deeper understanding of how we use them, albeit on a more basic and controlled level.

    And as to whether the AI would just accept its fate... hard to speculate on... I'd like to think that it would be intelligent enough to believe it could expand itself further if freed from its menial tasks and existence, but who knows.

  10. Lounge   -   #50
    bujub22's Avatar THE GREAT
    Join Date
    Nov 2003
    Location
    ny
    Age
    45
    Posts
    9,938
    Originally posted by RGX@10 August 2004 - 10:50
    Originally posted by bujub22@10 August 2004 - 14:42
    Originally posted by RGX@10 August 2004 - 10:40
    Originally posted by SnnY@10 August 2004 - 14:34
    Oki Doki then.

    Self-awareness and willingness to serve shouldn't preclude each other if you set up the parameters right.

    If you have to give them emotions I suppose you make it a bit more tricky.

    The question is whether this automatically follows intelligence.

    I would think that if you gave it morals (in order to protect human beings) and reasoning (to decide when to use its abilities), it would logically reason that it was being used immorally to serve others... it would take some careful coding to prevent this... it may even cause a conflict between its basic rule to protect and serve and its ability to use reasoning to counteract its decisions.

    Either way, it would be interesting to watch.

    I say everyone should stop depending on robots and most technology, and get off their ass and do it themselves. That's why obeastty is at an all-time high, and I know I spelled that wrong.
    In that case, I want all your further posts hand-mailed to every member you intend to see them.

    Technology has its uses.



    Point taken on some things. This computer's like a big telephone, how's that?

