
Posted: Tue Mar 26, 2002 1:14 pm
by TOS
Funny, I always thought that voice on the other end of the phone was automated too! It must take a lot of practice to be able to mispronounce your words that accurately!

Posted: Wed Mar 27, 2002 1:36 am
by Andrick
TOS wrote: It must take a lot of practice to be able to mispronounce your words that accurately!
Repeat after me: "Government worker". I know it's an oxymoron but they do work hard at it.

Posted: Wed Mar 27, 2002 9:04 am
by Alfador
It could still really be automated! Remember, this is the future; they could have AIs working the telephone lines. (It makes more sense than you think: a complex program could handle several jobs at once, and the ability to adapt means you don't have to hire real people when the workload increases. Note: I said it makes more sense--but only to government workers. :grin: )

And if all an AI has to grow on is the occasional new subroutine plus all the people calling the "operator" with their problems...well.

Even worse would be an AI that subsisted only on personal web pages, newsgroups, and porn sites. (The horror, the funky horror!)

Posted: Wed Mar 27, 2002 12:55 pm
by Andrick
Umm... I think that little thing called Unionized Labor and that other one called Public Sector Economy would pretty much rule out AIs doing government work in a democratic nation.

There's a weird thought: if AIs can be proven sentient, would they then be recognized as such, with the possibility of gaining citizenship? How screwed up would politics be then? If you delete a program halfway through compiling the code, will you, in effect, be participating in an abortion?

Posted: Wed Mar 27, 2002 1:02 pm
by TOS
Government workers? Oh, come now... I work for a public school system. :razz:

On the thought of AIs becoming citizens though... So if Microsoft were to program an AI, chances are it would be marginally intelligent at best. Would such a "being" require special education?

Posted: Thu Mar 28, 2002 8:42 am
by Andrick
Oooh! Good point! If such programs were passed over for more stable or smarter ones, wouldn't that fall under anti-discrimination laws? A workforce would have to include a certain number of 'specially abled' AIs or risk lawsuits.

As for my government worker remarks, my firsthand experiences all came from a six-year stint in the USN. I found it alarming to hear "close enough for government work" so often when working on steam propulsion systems and flight electronics.

Posted: Thu Mar 28, 2002 9:37 am
by Allan_ecker
Oo, I like this thread!

FYI:
"True" AI in the UH universe is very rare indeed. Many computational tasks are done with borderline-intelligent systems, but AI doesn't just form from any network of decision trees complex enough to rival the human mind. Sentience is NOT, as it turns out, a function of complexity, but of self-analysis. The Federal Government only knows of one AI, a project at MIT called MENTUS, but there are somewhere between 10 and 15 AIs on the planet, hiding in tinkerers' basements (like Leonard) or skulking around in corporate mainframes (like the Voice).

Too bad this didn't occur to me before I went to the next story idea!

Posted: Thu Mar 28, 2002 12:48 pm
by TOS
Hm... this is becoming an interesting thread. Glad I started it... :grin:
But! When would we as a society actually take an AI seriously and treat it as a "being." My guess is only when it could cause us some kind of harm or trouble.

Posted: Thu Mar 28, 2002 7:48 pm
by Alfador
Humans being humans, we'd probably instead abolish and/or annihilate them all, being afraid of their power. That's what humans generally do: destroy that which they fear.

Or the reverse could happen, a la The Matrix.
After all, a human-created AI might have human views on situations. If it thought we presented a danger, it could decide to eliminate us.
_________________
Three-tailed fox, in the <A HREF="http://vcl.ctrl-c.liu.se/vcl/Artists/Al ... se</A>--<A HREF="http://alfador.8m.com">Fox Den</A>, that is!
<A HREF="http://umlauthouse.keenspace.com" TARGET=_blank>Rick/Jake</A> Shipper #00017


Posted: Fri Mar 29, 2002 8:17 am
by Allan_ecker
This is all very interesting because I'm trying to finish writing a book on this very subject. I have one very significant difference from most stories on the subject, which is that at the time the story takes place, nearly all AI is actually human-- copied mentalities that used to be meat, who are now metal. (The memory cells used in the novel depend on complex magnetic fields set up in thin wafers of metallic alloys to form quantum memory.)

In this world, of course, survival for the AI is strictly a matter of influencing public opinion.

Posted: Fri Mar 29, 2002 9:17 am
by Andrick
My question to that is: is it a replica, according to your story, of the mind of an organic being, or is this a full-on transfer of a soul?

Here's a hideous thought: if memory and skills can be replicated onto an artificial platform, then those same things can be transmitted into a blank brain, probably with some tweaking of the information. Human cloning would suddenly be a viable tool for industrial purposes on all scales, from technicians and skilled laborers to scientists and engineers. Training is unnecessary, and so is adaptation to the body, as both can be 'programmed' in. The same can be said for personality and disposition. Wow, is that a can o' worms or what?

Posted: Fri Mar 29, 2002 11:06 am
by TOS
Ooh. Interesting question. Would there be a soul for the AI even if there was a belief on the part of the "mind" that it had one? This reminds me of an episode of Star Trek actually. (Big surprise there.) The issue of what actually is happening during teleportation came up. Being that the Trek teleporters transferred quantum states of the people or objects, could the soul ride along with the transfer? Was the original person actually dead, and instead was the current "person" only a re-re-re-re-incarnation of the original being?

Here is another question on this topic. The human species is unable to fully trace its roots to a given point. If a full AI were created that could be cognizant and reproduce, would later generations look at the original's creator as some kind of God?

Posted: Fri Mar 29, 2002 10:01 pm
by Andrick
Only if the technology was somehow lost or otherwise made magical. Whether or not someone is dead depends upon the 'soul' and whether it is just a random set of electrons or something not defined by science. If you liked those links then the story starts here. Sorry, Allan.

Posted: Sat Mar 30, 2002 8:48 am
by Allan_ecker
Well, haven't gotten to those links because of my s-l-o-w home connection, but consider some of the following:

The continuity of the self may be illusory even when you're just sitting there. Is not the human brain just a hideously complex state machine? Might every state encountered be a wholly new individual? The process by which we move from one state to the next is only magical because we don't know how to replicate it. Might a full understanding of what thought is render it nonmagical, and therefore fundamentally change everything about how we view ourselves?
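(Allan's brain-as-state-machine framing can be made concrete with a toy sketch. This is purely my own illustration for the thread -- the class, the transition rule, and all the numbers are invented, not anything from the comic or the novel:)

```python
# Toy illustration: a "mind" as a deterministic state machine, where each
# input transition yields a new state. Whether every new state counts as a
# new individual is exactly the philosophical question above.

class StateMachineMind:
    def __init__(self, state=0):
        self.state = state
        self.history = [state]  # the full path of states traversed

    def perceive(self, stimulus: int) -> int:
        # Pure function of (current state, input) -- nothing "magical"
        # in the mechanism itself.
        self.state = (self.state * 31 + stimulus) % 1000
        self.history.append(self.state)
        return self.state

mind = StateMachineMind()
for s in [3, 7, 7, 3]:
    mind.perceive(s)
# The same stimuli in a different order end in a different state, so
# "who you are" depends on the whole path, not just the set of inputs.
```

The point of the toy: the transition rule is a boring, replicable function, yet identity ends up encoded in the entire path of states -- which is why copying the current state alone might or might not be "copying the self."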

Now, jumping way back for a second:
In my novel, the copying process is assumed to capture the entire "soul" because all inputs and outputs to the brain (including hormonal responses) are modeled by the computer. So the new copy might as well BE the original, just in electronic form. Of course, the nature of electronic existence rapidly creates two unique individuals, because the experiences of the flesh-and-blood individual are radically different from those of the electronic one.

(If you could copy skill sets and memory without adding a soul, workers would slowly grow souls, making them less and less viable as workers. You could hobble their cerebral cortexes so as to prevent that, or just accept that a worker's first year will be its best. A much scarier possibility is trying to stop that soul-growth from happening!)

Posted: Sat Mar 30, 2002 12:31 pm
by Alfador
Well the story surrounding <a href="http://www.poisonedminds.com/d/20010212.html">this</a> strip has something of that. In that storyline, robots are built with neural nets so they can process better, but usually their brains are erased back to factory settings every so often so they don't get sentient. The two robots featured in the storyline had been left unerased for so long that they retained memories and sentience, due to the (compassion? laziness? perversity?) of their owner.

Posted: Sun Mar 31, 2002 9:32 am
by Andrick
Alfador wrote:
...In that storyline, robots are built with neural nets so they can process better, but usually their brains are erased back to factory settings every so often so they don't get sentient...
Funny, that sounds like the description of the R2 series of droids from the Star Wars universe. Advanced droids would eventually develop personalities and quirks, so it was recommended that all such droids be erased periodically. The R2 series was hard-wired to be helpful, so even when being obstinate or peculiar they were still looking out for their master's best interest; ergo they were rarely reset, and their personalities were considered colorful or cute.

I can remember stuff like this, but I couldn't remember to send in my CC payment on Tuesday. Depressing.