Post by Kilarin on Jul 10, 2007 23:07:05 GMT -5
Xaa:
The first stage is the planning stage....
Thank you for the insight into your creative process!
Xaa:
Remember that technically, a creature like Liz from Muse is an artificial intelligence.
True. But I'm not certain Liz would be easier. It might be, though. We really don't understand the wetware as well as we do the dryware right now. BUT, the wetware is capable of so much MORE, and you can do a lot with it without fully understanding it. Hmmm...
Liantedan:
if/when we get around to creating that level of AI, it will be more like Caesar first was, in how the operating system works, yet it will also be as cold and logical as Lucifer
Douglas Hofstadter's book Gödel, Escher, Bach is quite fascinating on the subject of AI (and lots of OTHER things as well). He seems to be of the opinion that by the time we create anything complex enough to be considered intelligent, we will see behavior similar to "emotions" cropping up as well. I think he's got a good point, but it's still guesswork.
He also states that by the time we make a robot that has a complex enough mind to be able to do general housework, we will have created a robot that has a complex enough mind to be bored doing housework.
I like that Xaa sets emotions as a software phenomenon. Some SF writers have tried to tie emotions strongly into hormones. And, of course, they are right, there IS a strong connection in our brains. But whatever that connection is, it's just a trigger, and dryware could be triggered in the same way by different phenomena. If we simulate a brain, it's not going to be much more difficult to simulate the stimulation of various glands dumping into it at different times. Whether emotions turn out to be inevitably emergent or not, they will definitely be in the software.
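To make that "it's just a trigger" point concrete, here's a toy sketch (purely illustrative, not from the comic or anyone's actual AI design): a simulated "gland" dumps a hormone-like signal into an agent, and that signal modulates a decision threshold. The "emotion" is ordinary software state either way; only the trigger differs.

```python
class SimulatedGland:
    """Dumps a hormone-like signal into the agent on certain events."""
    def __init__(self):
        self.level = 0.0

    def dump(self, amount):
        # Saturate at 1.0, like a receptor that can only bind so much.
        self.level = min(1.0, self.level + amount)

    def decay(self):
        # The signal fades over time.
        self.level *= 0.5


class Agent:
    """Decision-making modulated by the gland's signal. The 'fear' is
    just software state, regardless of what triggered it."""
    def __init__(self, gland):
        self.gland = gland

    def threat_response(self, threat_level):
        # A fear-like signal lowers the threshold for fleeing.
        threshold = 0.8 - 0.5 * self.gland.level
        return "flee" if threat_level > threshold else "hold"


gland = SimulatedGland()
agent = Agent(gland)
print(agent.threat_response(0.6))   # calm: holds position
gland.dump(0.9)                     # a startling event triggers the gland
print(agent.threat_response(0.6))   # same input, now "afraid": flees
```

Whether you wire that trigger to simulated endocrine chemistry or to anything else entirely, the behavior-shaping machinery lives in the software, which is the point above.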
In Pandora, we get to see a wide range of dryware AI possibilities. That adds to the appeal of the story, in my humble opinion.
Liantedan:
Lucifer has his own personality, he's no mere AI.
Oh, he certainly does! But I would consider copies of a human brain to still be an AI.
And I think Xaa hit upon a very important point with Lucifer. Lucifer survived and succeeded better than his fellow Iron Men, in part, because he was an incomplete copy. It would have been VERY difficult for a creature with a full range of emotions to have survived the long, long duty he was assigned to.
And this is an important theme in Pandora. Note the story progression:
We start with the Iron Man/Maiden and see how they have survived specifically BECAUSE they have no emotions. We also see what a terrible weakness it is, because it allows them to do horrible things.
Then we move to Mars, where we see that Erica's emotions are what's making her survival on Mars difficult. The next generation is raised as Stoics, who DO have emotions, but keep them under strict control. The Stoics manage to survive, without being the monsters that the Iron Men/Maidens are.
In parallel with the humans on Mars, we are exploring AI systems like Caesar, which don't exactly have emotions the way we humans do, but still have a sense of ethics and show empathy.
And then we contrast this with the Jovians, who are consumed by their own hateful emotions and led by this to their own destruction. Note that it's not JUST that they hate, but also that they are so incapable of controlling their emotions that they make STUPID mistakes.
And we end up with the Confederacy, who certainly have emotions, but in a more positive sense than the Jovians. The Confederacy has many positive things to contribute from their emotions, BUT their very survival depends upon which they will let reign supreme: Emotions or Reason. If they can't make the logical choice quickly, they will disappear and their emotions will count for nothing.
I see the message as: Emotions are good, they may even be necessary, and things can get VERY UGLY without empathy. BUT, our emotions must be ruled by reason. There are many different ways to express our emotions within the bounds of reason: from Caesar, through the Stoics, and on to the Confederacy. But if emotions become the RULER instead of the ruled, we will self-destruct. Much like an engine that gets out of balance and shakes itself apart with its own force.