
Love and Robots

edited January 2008 in Technology
David Levy has a new book titled Love and Sex with Robots in which he predicts that robot technology will, in the near future, allow for creations so human-like that we can fall in love with them, that the robots can at least mimic loving us back, and that they will be viable sex partners.

Of course, there is nothing new about the prediction. It's one of the oldest tropes in science fiction. But it raises interesting old questions that can be used to while away slack time at work. The discussion is probably good for that one thing only, but seeing as work is particularly slack right now, it might be a welcome diversion.

Can we become so attached to a machine that we can be said to love it? Would we feel gratified if a machine mimicked loving us back, or would we realize that it was just acting in accordance with programming? Would a robot necessarily be mimicking, or could it possibly begin feeling? For that matter, how can we be sure that other people aren't just mimicking emotions? If Mr. Levy walked in on his spouse using a robot as a sex partner, would he be right to feel jealous or angry?

Comments

  • edited January 2008
    Have they learned nothing from I Dated a Robot?
    Post edited by chaosof99 on
  • edited January 2008
    Please use HTML.
    Post edited by Ametto on
  • Every 'emotion' we have is a result of either chemical or electrical activity within our bodies. If we are able to create an emulation of these behaviors in an artificial creature, are the resulting emotions any less valid? I'm not sure I could answer that.
  • edited January 2008
    I think it's in all people to give some sort of human qualities to non-living things, to feel an incredible attachment to objects. I mean, do you love your first car? Think of all the things that happened with, around, in, and because of that car. The car definitely does not love you back, but that doesn't mean you aren't attached.

    We could ask the same questions about loving animals. Does your dog love you back? Or is he just loyal because you give him food?
    Post edited by Rym on
  • We could ask the same questions about loving animals. Does your dog love you back? Or is he just loyal because you give him food?
    I've seen photographs of some dogs "giving it back" to their owners. Those dogs look mighty "excited" in those photographs, if you get my drift.
  • I can use PGP to encrypt more than email? Friggin sweet!
  • I can use PGP to encrypt more than email? Friggin sweet!
    The thread is not for the show.
  • edited January 2008
    We could ask the same questions about loving animals. Does your dog love you back? Or is he just loyal because you give him food?
    I've seen photographs of some dogs "giving it back" to their owners. Those dogs look mighty "excited" in those photographs, if you get my drift.
    But how many times have you seen a dog giving it back to a couch? Or another dog, in more ways than one?

    I think you'd have to be prrrrreeeeety fucked in the head to fall in love with any inanimate object, or a fully functional robot.

    What was it? Was it Stand Alone Complex where a man had a robot wife and they were "in love"? They ran away from it all to be together forever, blah blah blah. Really, she was programmed to merely act like it. She didn't feel any compassion for him, really. She's a robot. She had learned the idea of love from watching films and translated that into her life. It's a decent example, I guess, that robots couldn't really love us back - that's not to say you CAN'T love them... It's just weird.
    Post edited by MitchyD on
  • The interesting question here is where the line is drawn between a machine being "programmed to feel love" in a simplistic manner and one that actually feels love as an (indirect) result of its programming (an AI with actual emotion, in other words). It's funny you should mention Ghost In The Shell, because the lines between software, organic consciousness, and machine consciousness are a major theme of those series.

    All signs I've seen point to human consciousness (and therefore emotion and intelligence) being the result of electrochemical machinery. So there's no reason, in principle, that an AI based on similarly complex machinery couldn't feel emotion. But if that AI is developed by humans, it could just as easily be said that it has been "programmed" to feel that emotion. Conversely, human emotion can be modified by chemical imbalances in our bodies; even our "natural" feelings are mediated by chemical changes in the brain. This question will be answered, if it can be answered at all, only as our understanding of neuroscience and cognitive systems deepens; personally, I suspect the line is more than a little arbitrary and subjective.
  • edited January 2008
    . . . it could just as easily be said that it has been "programmed" to feel that emotion.
    You can also easily say that other people are mimicking emotional responses that they've learned fit certain situations. I know what I'm feeling because I have direct experience of my feelings, but how do I know that Scott's feelings are the same as mine? How do I know whether Scott has any feelings at all? How easy would it be for him to simply report to us that he has certain feelings when, in reality, he's as heartless as a photocopier?

    What I'm getting at here is: Why should we just assume other humans have the same emotions as we do? Just because they're human? Have you never seen variations in human emotional response? That is, have you never seen someone moved to tears by an opera while another person sitting nearby is completely unmoved? If that's the case, why can't we go one step further? Since our only indication that other people have something approximating our own emotional responses is our observation of their actions, the actions of robots or animals are indications of emotional responses that are just as valid.
    Post edited by HungryJoe on
  • As it stands now, we rely on people suffering brain damage through strokes or accidents in order to increase our knowledge of what part of the brain controls what aspect of our behavior. Unfortunately, as these occurrences are relatively rare, and totally unpredictable, it's not usually possible to have a baseline comparison outside of anecdotal evidence.

    The only way I could see this field advancing significantly is if someone were to begin 'Nazi-style' controlled human experiments. I would hate to see that happen for obvious reasons.
  • edited January 2008
    Why do you need to have such experiments? We can posit that each of us has the requisite apparatus to feel emotion, but it's obvious that people feel things differently, and that leads to the question of how we can be sure others have anything at all in common with the way we feel, even if our biologies are the same.

    It becomes a philosophical question. How can you be assured that someone is feeling an emotion that's the same or similar to the ones you feel? I'm saying here that it's only through their actions, e.g. their body language, vocalizations, and so forth. Now if this is so, why shouldn't we be satisfied to ascribe those same emotions to a robot that can mimic the same actions?

    So: If you had a child robot that spilled milk on the floor and then looked up at you with big dewy puppy-dog eyes and quivering lips, and said in a small, trembling voice, "I sorry Papa", what would be so wrong about ascribing some emotions to it? How would the robot child be any different from a meat child that performed exactly the same actions?
    Post edited by HungryJoe on
  • Here's something I know you guys would like
    Youparklikeanasshole.com
  • I realize after listening to GeekNights that I basically made the exact reference that Rym did. DERRRRR!
  • ...shouldn't we be satisfied to ascribe those same emotions to a robot that can mimic the same actions?
    Silicone breast implants... However good, bad, or real they look, to many people they will always be fake boobs.
  • ... Dude, I'll cuddle with anything if it's warm. If I can -control- this heat ("Can you turn up the heat? Thank you!") then I'd love that.
  • What I'm getting at here is: Why should we just assume other humans have the same emotions as we do? Just because they're human? Have you never seen variations in human emotional response?
    The human brain is a collection of neurons, each of which exists in one of a finite number of states. A neuron is either at rest or excited. There are also more complicated circuits where one neuron firing will cause another to fire, where multiple neurons are required to set the next one off, or even where one neuron inhibits another so that it takes even more stimuli to set off the next one. It might be immensely complex, but the human nervous system is a finite state machine at the molecular level. (See the toy sketch after the last comment below for an illustration of this finite-state view.)

    Of course the possibility exists that different people have brains that are 'wired' differently. The point I was trying to make was that emotions only seem magical because we are unable to study them to the point of perfect information. Maybe one day we will discover a technology that will enable us to better understand the biological cause of each emotion, even to the point where we can manipulate it.
  • Wasn't that the point Rym made about someday pushing a button and becoming happy for no good reason? Don't we already have that technology in the form of pharmaceutical drugs?
  • Wasn't that the point Rym made about someday pushing a button and becoming happy for no good reason? Don't we already have that technology in the form of pharmaceutical drugs?
    Not without side effects. What Rym's talking about would be more like soma, only it wouldn't be used by the government to control the population, or would it? Read Brave New World if you haven't already.
  • edited January 2008
    Maybe one day, we will discover a technology that will enable us to better understand the biological cause of each emotion, even to a point where we can manipulate it.
    Do you think emotions can be manipulated so completely and easily? What you're talking about would not be the persuasion of advertising and politics, where people often talk of manipulating emotions. Your suggestion seems to me to mean that you envision a technology that would allow a user to cause someone to actually feel real emotion. Think what could be done with that sort of tech. A politician could use the tech to not only persuade, but actually cause others to fervently support any bizarre scheme he endorses. The Vatican could use that tech to cause everyone to suddenly convert to Catholicism.

    Are emotions that cheap? What about free will?
    Post edited by HungryJoe on
  • Are emotions that cheap?
    Until we find some evidence of neurological actions that do not have a biological origin, we have no reason to believe that they are more than the emergent behaviour of a complex system.
    What about free will?
    I've seen studies suggesting that the neural activity that initiates an action occurs before the person is consciously aware of deciding to act, not after. We may well not have the degree of free will we believe ourselves to have. After all, everything we experience is filtered and presented by our own brain: we cannot trust our own perception 100%.

    The best we can say is that we feel as though we control our actions. I believe that the more effective people in the world are better at controlling their thoughts and actions than others, and that people who strive to achieve greater control have, effectively, more free will.
  • edited January 2008
    Until we find some evidence of neurological actions that do not have a biological origin, we have no reason to believe that they are more than the emergent behaviour of a complex system.
    This discussion makes me think that free will isn't all that it's cracked up to be.

    Slightly off topic: Glowing cats, glowing pigs, mice genetically modified to have no fear of cats - is it just me, or does anyone else think that the genetic researcher dudes are at the point where they're just trying stuff to see if it works?

    Genetic Researcher #1: Gee, I'm bored. What abomination can we come up with today? Let's see . . . I wonder if we can make a snake that wears galoshes and has bat wings?

    Genetic Researcher #2: I don't know, let's find out . . .
    Post edited by HungryJoe on
  • we cannot trust our own perception 100%.
    Like Rym and Scott's argument about whether or not a certain color was red or orange.
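
A footnote to the finite-state-machine point raised in the thread above: the minimal sketch below (in Python; the neuron_fires function and the tiny circuit are purely illustrative assumptions, not anything described by the posters) shows a McCulloch-Pitts-style threshold unit. Each unit is either firing or at rest, it fires only when enough excitatory inputs arrive to cross its threshold, and a single inhibitory input vetoes it.

    # Toy McCulloch-Pitts-style neuron: each unit is in one of two states
    # (firing or resting). An illustration of the "finite state machine"
    # view discussed above, not a model of real neurons.

    def neuron_fires(excitatory, inhibitory, threshold):
        """Return True if the unit fires, given binary (0/1) inputs."""
        if any(inhibitory):          # any inhibitory spike suppresses firing
            return False
        return sum(excitatory) >= threshold

    # A tiny circuit: C fires only if both A and B fire (an AND-like gate).
    a = neuron_fires(excitatory=[1], inhibitory=[], threshold=1)                # True
    b = neuron_fires(excitatory=[1], inhibitory=[], threshold=1)                # True
    c = neuron_fires(excitatory=[int(a), int(b)], inhibitory=[], threshold=2)   # True
    d = neuron_fires(excitatory=[1, 1], inhibitory=[1], threshold=2)            # False
    print(a, b, c, d)

Chaining many such simple units is what makes the overall system complex while each unit stays trivially simple, which is the point that comment was making.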