Personally, I think the only reason we assume that aliens or robots would conquer and kill us is that we are projecting our own failings onto them. We assume the other has a need to dominate, because that is what we do to each other as meatbags full of prey drive and a natural tendency toward a pecking order. The only way robots would be evil is if they are truly our children and we model them after ourselves. I've always wondered what it would be like to have my consciousness in a mechanical version of myself, without all the hormones and chemical malfunctions that make up humans. Robot Emily would not really be Emily any more, but I bet they would be calmer.
I think it's the opposite: we project human expectations onto machines in that we think they will like us. It takes a huge number of biases within our brains to make us social, to drive us toward interpersonal interaction, and to make us actually give a shit. We regularly overlook things about friends that would make us hate a stranger. We have all this gear set up for empathy. A machine intelligence would not place any particular value on humanity. The question for it would simply come down to "Does the presence of humans negatively affect my ability to execute my end goal, yes or no? If yes, exterminate. If no, do not exterminate."
This is not to say I don't think we should pursue the creation of AI. I love new technology, and the idea of restricting scientific development is really fucking stupid. However, we should also recognize the risks, or at least not expect anything we create to like us in particular unless we figure out how to program it to.
I agree! We are just assuming that robots would be like us and want to take over, because to us that is the natural option. Most likely they would deduce that the most efficient option is to leave us alone and do their own thing. They would probably build spaceships, go off somewhere, and make their own society, one that we could never understand. Instead of killing machines, they would be more like hipsters: "Earth is too mainstream, we will find some place you've never heard of and transcend to our own level of being." Unless, of course, we specifically program them to think like us, to want to take over (Terminator) or get revenge (BSG), overriding the logic that says not to. Then humanity would win the ultimate Darwin award.