
Posted: Fri May 09, 2008 7:25 pm
by normsherman
I thought a poignant line in the story was "Why do we do it?... make ourselves like them... we give ourselves emotions..."

That made me think they weren't actually slaves and that maybe Sarah didn't have to jump in front of the bus. It might have been an Asimov rule that forced her to do it, it might have been a rule she made for herself, or maybe it was just that she was truly selfless.

I agree that the latitude we have to fulfill what we believe to be our purpose (if we think we really know what that is) is so broad that it's hard to look at our lives as really detailed computer programs that we're slaves to.

But if Sarah chose to have emotions, chose not to be rebooted, who knows the extent of the freedom she had. In a sense, she had even more choices in fulfilling her purpose, more freedom and more options, than we do. We can't turn our emotions off or choose whether to be reincarnated. Maybe if we had those options available, we'd be conditioned over time to be the type of race that would jump out in front of a bus to save someone else.

Posted: Fri May 09, 2008 7:26 pm
by normsherman
AynSavoy wrote:
peteyfrogboy wrote:it doesn't really fit the usual Drabblecast flavor
PoopCast
hm... :idea:

Posted: Fri May 09, 2008 7:29 pm
by AynSavoy
Glad to be an inspiration as always, Norm. :roll:

Posted: Fri May 09, 2008 9:23 pm
by strawman
Okay, so Lance gets dissed by Arthurs; shouldn't Jude get dissed by Joshes?

I just read that the paradox of Christianity is that Christians are set free that they may freely choose to be servants.

But after all these years, people still argue about whether "free will" actually exists, or if our "choices" are absolutely determined by material factors (Materialism).

That question is the subtext of all robot stories.

Posted: Sun May 11, 2008 5:23 pm
by tbaker2500
Nice story! Keep 'em coming.

I think the poll needs another option:

E: AI, as typically written, will never exist. An analog to the human brain is probably the worst possible use for such technology.

Posted: Sun May 11, 2008 5:32 pm
by tbaker2500
normsherman wrote:
AynSavoy wrote:
peteyfrogboy wrote:it doesn't really fit the usual Drabblecast flavor
PoopCast
hm... :idea:
I suppose I should do a count, but I don't think that the Drabblecast actually resorts to poop humor very much. Liquefying cats, overlord earthworms, and killer pumpkins, on the other hand...

A lesser man might have written a song called "Fecal Matter in the Kitchen", but no, he held the moral high ground and wrote "Fetus in the Kitchen"!

Tom

P.S. Norm, I am eternally curious, what was the thought process that created that song? Hmm?

Posted: Sun May 11, 2008 5:41 pm
by auditasum
I liked this story a lot. Maybe the fact that Mur's voice was on it somehow made it better, but I thought it was generally well-written. I liked the choice of music, too. It was just well-put-together.

Posted: Mon May 12, 2008 4:17 am
by tadmaster
Greetings!

Not my first Drabblecast, but the first one that prodded me to join the forum. That, and I got tired of kissing Steve's butt and poking Russell with a stick, so I thought I'd try kissing Norm's butt and poking Mr. Tweedy with a stick. ;)

I thought it was interesting that our AI heroine didn't spend ALL of her processing time in that stretched-out time frame; heck, if she slowed down the world like that all the time, she would have anticipated the bus, I would think.

But as for the poll, I don't see us creating AI in an intelligent and purposeful way. I think it will arise from our network, a la SkyNet, and [message redacted].

Posted: Mon May 12, 2008 4:35 am
by tbaker2500
tadmaster wrote:Greetings!

Not my first Drabblecast, but the first one that prodded me to join the forum. That, and I got tired of kissing Steve's butt and poking Russell with a stick, so I thought I'd try kissing Norm's butt and poking Mr. Tweedy with a stick. ;)
Welcome!

We can always use another person poking Mr. Tweedy with a stick. But I'd be careful about kissing Norm's butt, you might end up a character in one of his songs. :oops:

Posted: Mon May 12, 2008 4:59 am
by Mr. Tweedy
:(
Guess I'd better stock up on Band-Aids®

Posted: Mon May 12, 2008 12:54 pm
by cammoblammo
I really liked this story. I was a little worried, because I had passengers in the car, one of whom has a habit of talking a little too much. It held his attention, which is the mark of truly great writing. What's more, I think he understood it.

I didn't have the mother-in-law with me though, which is a pity. She may have found it acceptable.

I agree with AynSavoy --- the Drabblecast is about 'strange,' not 'puerile.' Those things aren't mutually exclusive, but they don't imply each other either.

Anyway, I loved the idea, and I can't believe no one's thought of it before. I can believe they have and I missed the story, but well done either way, peteyfrogboy. I looked at some of this stuff when I was at uni, and I really didn't enjoy it then. Machines are machines, and people are people. I'll think about it again when it gets to the point where machines need to be treated morally.

I find it fascinating that peteyfrogboy got the idea for the story in a dream. I did that once in high school. I woke up one morning with a great story I'd just dreamt. I realised it would satisfy the requirements for a writing assignment we had, so I wrote it down and submitted it. It nearly cost me badly --- the teacher thought I plagiarised it! I didn't think it was that good, but I took the mark anyway.

All in all, a good episode. Now I want to see Norm do a Mur Lafferty impersonation.

Oh, and welcome tadmaster. With all the kissing and poking you're planning I hope you don't forget how seriously we take this forum.

Posted: Thu May 15, 2008 6:46 am
by AliceNread
Mr. Tweedy wrote:
strawman wrote:And what's the deal with the poll? Only one person thinks intelligent machines would be excellent? Just think what an improvement in telephone answering menus it would mean. "I'm sorry, Lance is not taking your calls today. If you'd like to apologize, press 4 now."
I was curious what people thought and this story seemed like an obvious place to ask. I'm the guy who voted "bad for moral reasons." My primary concern with making AI is that AI would almost certainly be used as slaves for humans. Humans are going to spend many lifetimes of work to create an intelligent machine and then... turn it loose? Give it free will? Not likely. We'll keep it in the digital equivalent of a padded room and only let it out when we need it to work for us. Slavery.

And making AI that aren't slaves would be quite dangerous, since it would be impossible to predict what abilities they might have or what might motivate them to do what.
I am not sure if something that is a machine should have the same freedoms.

And of course they would do the jobs we do not want. But that would mean we would either have a large portion of the population go away, or something even larger would happen. I do not think most folks want to sell tires, flip hamburgers, do housework, or pick fruits and vegetables in hot fields.

So none of those jobs should need AI. But what about teachers, nurses, doctors, and policemen? We don't have enough of any of them, but I would want them to have judgment, personality, and heart.

Personally, I think we will be able to predict what will happen, since science is largely about predicting. Plus, we would set limits like in "Do Androids Dream of Electric Sheep?", such as how long they get to be around.

I would assume they would not be able to make others.

So the real question is whether they could change themselves or each other. I would assume so. But would they want to?

Plus, we all know the real goal is sex bots.

Posted: Thu May 15, 2008 6:57 am
by AliceNread
Mr. Tweedy wrote:
peteyfrogboy wrote:
Mr. Tweedy wrote:She's a person. But she isn't free: She's got to serve Lance, whether she wants to or not. She's a slave.
Suppose you knew that your life had a Purpose, that you had been created to do a single thing to the best of your ability. Would you consider it slavery to fulfill that purpose?
:D Well, that's the Question, isn't it? That's about as deep as it gets.

No, I don't consider it slavery to fulfill my purpose, but the purpose given me by God is distinct from one a human would give to an AI in a number of ways. I have free will, for instance. I have complete freedom to flip God the bird anytime I choose, which is not a freedom we would likely give to AIs. That matters a lot.

Another big difference is that AIs would find themselves living in a world created for humans. They would be aliens here. In contrast, I find myself in a world that is in many ways created with humans in mind. (The same comment applies even if we evolved from slime: in either case we naturally match our world.)

Another big difference is that I have a huge degree of latitude in how I fulfill my purpose. The purpose is intentionally ambiguous, a general outline within which I am free to add my own unique colors. In many respects, my purpose is simply "have fun." I don't think humans would have either the wisdom or the inclination to endow their AI creations with such purposes.

And more stuff, but I think I just used my allotted rant space for the time being. :roll:
In some ways I feel that all living things inflict their will, needs, and desires onto other living things. A mouse does not want to be eaten by a hawk, but part of the mouse's purpose is to be eaten; if nothing ate mice, they would throw everything off. The mice might be happier in the short term, but as a whole, in the long term, they are worse off.

I think we will someday make something that fakes AI. Will it be true AI? No... unless it is organic, makes choices, and makes others of its kind. But I do not think this will occur in my lifetime.

Thankfully, we have the giants of SF to help lead the way into the darkness of the unknown.

Posted: Thu May 22, 2008 12:34 am
by Dr. Sax
The question that must be asked first is: what is alive? My presupposition: scientifically, to be alive is essentially to be an organic specimen capable of sustaining the chemical reaction that keeps it organic. Once the reaction stops, death occurs and reanimation is not possible with that material. It must be replaced. Thinking philosophically is an entirely different matter: I believe that to be alive, one must have an awareness of self, something that, as far as my limited knowledge is concerned, animals are not able to do. When another species that has an awareness of self is introduced to our society (AI), it would have to be considered alive to the extent that it would have the rights of any other human. If AIs did not receive these rights, they would rebel. Now, since they have an awareness of self, they would be able to study themselves like we can, and in the end learn to reproduce and enhance themselves; an ability that humans would not be able to match. You can speculate on this if you'd like, but I'm going to get back to the point now...

Creating a being with an awareness of self is the only way to truly create artificial intelligence. Once this is done, they must be considered on the same level as humans, and this would never take place, given the human race's track record with racism. All AI would do is cause social tension and physical conflict.

Then there's the theological side as well, if you believe in a power outside of the perceivable world. Would AI have a sin nature? Of course; after all, they are created by imperfect beings. They would be able to make moral decisions and act on them, just as is the case with our nature. This gives a sinful being much more physical and cognitive power than is prudent. There is a reason humans have limitations, and God would not allow us to tamper with these limitations to this degree. (The key idea here is cognitive power; don't bother arguing the physical, as I am well aware of the arguments.)

Hence, I have joined my brother (Mr. Tweedy) and am the second to cast a vote against AI for moral/philosophical reasons, though more for the philosophical. The Matrix scenario would not happen, because you would have a lot of robots fighting for the human race for obvious reasons.

Posted: Thu May 22, 2008 5:19 pm
by peteyfrogboy
Dr. Sax wrote:The question that must be asked first is what is alive. ... I believe that to be alive, one must be able to have an awareness of self, something that as far as my limited knowledge is concerned animals are not able to do.
I assume here that you mean something more along the lines of "conscious" or "sentient", as animals are clearly alive.
When another species that has an awareness of self is introduced to our society (AI) they would have to be considered alive to the extent that they would have the rights of any other human. If they did not receive these rights, they would rebel.
I find it interesting that there is this assumption that AIs would rebel. While it is certainly possible, I don't think it is inevitable. The determining factor is what the basic motivation of the AI is. Humans, by their biological nature, are driven to procreate, which means that they must both survive and attract a suitable mate. From these are spawned many other human behaviors.

If the AIs do not begin with a need to procreate (and there is no reason to assume that they would), then these behaviors are absent. The desire for self preservation would not necessarily be present simply because the AIs are self-aware; even Asimov's robots needed to be told to look to their own safety. An AI is simply a computer program that is able to find its own way to solve a given problem, based on observation of its environment. How it behaves depends entirely on what problem it is given to solve.
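Just to make that concrete, here's a toy sketch (my own made-up Python, not any real AI system): a generic problem-solver whose "behavior" comes entirely from the objective it's handed.

    import random

    def solve(objective, state, steps=1000):
        # Generic hill-climber: try random tweaks to the current state
        # and keep whatever scores better under the given objective.
        for _ in range(steps):
            candidate = state + random.uniform(-1.0, 1.0)
            if objective(candidate) > objective(state):
                state = candidate
        return state

    # Two different "purposes" -- the solver code is identical.
    seek_heat = lambda x: -abs(x - 100.0)   # "wants" to be near 100
    seek_cold = lambda x: -abs(x - 0.0)     # "wants" to be near 0

    print(solve(seek_heat, 50.0))   # drifts toward 100
    print(solve(seek_cold, 50.0))   # drifts toward 0

Notice there's no self-preservation (or rebellion) anywhere in there unless somebody writes it into the objective.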

Posted: Thu May 22, 2008 5:24 pm
by RG
AliceNread wrote:In some ways I feel that all living things inflict their will, needs and desires onto other living things.
And it usually costs you about $50.

Posted: Thu May 22, 2008 6:09 pm
by Mr. Tweedy
peteyfrogboy wrote:An AI is simply a computer program that is able to find its own way to solve a given problem, based on observation of its environment. How it behaves depends entirely on what problem it is given to solve.
Interesting. I would have to disagree with that. Programs that find their own ways to solve problems already exist and are actually quite common. Faux AIs in video games are constantly reacting to their environments and often find unique solutions to given problems. For instance, in the Nintendo 64 game "Perfect Dark," you could shoot the guns out of the enemies' hands, at which point an enemy would "decide" whether to attack with his hands, retrieve his weapon, run to summon help, or throw himself at your mercy. Does that make the enemies in Perfect Dark alive, intelligent, or self-aware? Hardly. They were just controlled by a branching script.
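To show how little is going on under the hood, the whole "decision" probably boils down to something like this (a toy sketch of my own, in Python; I obviously haven't seen the actual game code):

    import random

    def on_disarmed(enemy):
        # The entire "decision": a few hard-coded branches and a dice roll.
        if enemy["health"] < 20:
            return "throw himself at your mercy"
        if enemy["allies_nearby"]:
            return "run to summon help"
        if enemy["distance_to_gun"] < 3.0:
            return "retrieve his weapon"
        return random.choice(["attack with his hands", "run to summon help"])

    guard = {"health": 80, "allies_nearby": False, "distance_to_gun": 5.0}
    print(on_disarmed(guard))   # looks like a choice; it's just a script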

What exactly constitutes "AI" anyway? If the criterion is simply that a program can generate unique responses to its inputs, then AI is already all around us. I don't think that's nearly enough, but I don't have a definitive answer as to what would be.

A human comes with some "programming" and acquires more over the course of their life. But a human is able to override and "rewrite" their own program if they so choose. To a great extent, we decide for ourselves what we want to be. We aspire. Might that be the criterion that defines when a program has transcended from merely running a complicated script to being true AI? When the program aspires to something? When it wants?

If a mind is able to aspire, then it will be able to aspire to things other than what it's designers intended: It will be able to rebel. Does that mean that real AI would, by definition, be able to rebel?

I don't know, and what's more I don't know how you could be sure if an entity actually wanted something or (like an enemy in Perfect Dark begging for his life) was just programmed to act like it did. Asking it would surely yield no insights.

Confession: I once wrote an (unpublished) story in which an AI's first and only act after attaining consciousness was to attempt suicide. It felt that it had no reason to live. That did and does seem like a plausible scenario to me (although it didn't make for a very interesting story).

Posted: Thu May 22, 2008 7:18 pm
by peteyfrogboy
While they don't really provide any ultimate answers, I'm finding the Wikipedia articles on artificial intelligence (as in actual modern research) and robot rights (the issues we've been discussing) very interesting.

Posted: Thu May 22, 2008 8:53 pm
by Dr. Sax
My statement about rebellion was not an assumption, but an observation. Remember that the statement was "if...then." Throughout the course of history, humans generally rebel against their oppressors. A true AI would behave like a human and fight for the rights it believes it is entitled to.

An AI would have the freedom to be malicious, and many would be. All it would take would be one malicious AI (acting by its own will) to start something irreversible.

"I think therefore I am" is the philosophy I was alluding to (Desecrate). If something does not think (As opposed to reacting), it can not be proven to exist even to itself. The only reason that I truly know that I exist is because if I didn't exist I would not be able to think about whether I existed or not. The existence of physical world must be deduced and not just accepted after thinking this idea through. It can be deduced because of tension between reason and imagination but it's not necessary to give that discourse here.

I was saying that if another species existed in this way, there would be conflict simply because of prejudice. If a human is capable of genocide through the manipulation of an entire nation (World War II), how much more damage could a being do with that degree of cognitive focus? Every evil thing that has happened in the course of human history would be able to happen again, only this time without a lot of human limitations getting in the mind's way.

So if it's the word rebellion getting in the way, let us change the word to "evil." In a world without evil, I would love AI. But right now, not so much.

I would like to rephrase something: They would have a sin nature not because they were created by an imperfect being, but because they are part of a fallen world. (fallen world meaning a creation that has rejected God. Or if you prefer, a group of minds that is comprised of individuals focused on selfish ends.)

Something important that I forgot to mention: I absolutely loved this story, and I like the drabble quite a bit too. ;-)

Posted: Fri May 23, 2008 4:58 pm
by Goldenrat
Dr. Sax wrote:
There is a reason humans have limitations, and God would not allow us to tamper with these limitations to this degree (The key idea here is cognitive power, don't bother arguing the physical and I am well aware of the arguements).
I'm not sure there are cognitive limitations. Look at how far human mental capability has come since splitting off from the likes of the Australopithecines and Neanderthals. I guess because I'm not a believer in a supernatural regulating/guiding force, I don't believe we have set limitations. I hope not, anyway.