[[Posted by Giovanni Blasini, 28-08-2007, 09:11:51]]

Hour 21
Sybil continued to monitor the data from her three fellow "greater AIs", trying to find some logical, rational reason why the others could see what they were seeing, and, just as important, why she couldn't.
Unfortunately, logic and reason didn't particularly feel like cooperating.
It didn't make any sense: Admiral Murakami and John could both experience Tabby's "dream", but she couldn't. Worse, this dream didn't seem to register anywhere Sybil looked, whether it was Tabby's stored memory, her active memory engrams, or John's or her mother's sensory input. The things they were seeing and hearing just weren't there. Even the memory engrams of the two former humans showed nothing. Yet they swore they saw them, that they could see and hear Tabby's dream.
It should not have been possible. It wasn't possible. But, clearly, it was the case, whether she considered it possible or not.
"I don't have enough data points for comparison. I have two full human emulations experiencing anomalous visual and auditory perceptions from a hybrid AI that's part brain emulation, and part pure code. A pure code AI is not experiencing them....the Tachikomas! They're pure code, using similar simulation and world-modeling subroutines for decision-making to the ones I use. What do they
see?"And she knew just who to ask.
----------------------------------------------------------------------
Tachikoma TKM-007 ("James" to his friends) sat in the CIC of the new SLS Tabiranth. He had several subroutines running to monitor reports from Tabby's autonomic functions while her higher brain functions slept. That didn't require much conscious effort, though, so most of his processing power was being used on the text currently displayed on the noteputer in his left claw.
The text, a discussion of the idea of a "technological singularity" by an author named Bill Joy, made for an interesting, if amusing, read. The idea was, of course, absurd, thanks to the collapse of Moore's Law in the early 21st Century - obviously, "superintelligent entities" hadn't replaced or supplanted humanity.
{"James, this is Sybil, there's something I need you to do."}
"Speaking of superintelligent entities...." {
"Yes, Sybil?"}
{"Go to Tabby's primary core, and tell me..."}
{"I already have. I don't see anything unusual, either, Sybil."} James inwardly smiled - he was 70.23% certain that was what Sybil wanted to know.
{"Just the virtual neural intelligences, then."}
James hesitated before replying. {"I...have a theory as to why."} In truth, he was 92.3% certain that Sybil would be...less than pleased with his theory's implications.
{"Go on..."}
He paused a full 4.3 seconds to organize his thoughts before continuing. James needed to make sure his evidence chain was organized properly before he presented ideas that, really, he did not want to be true, either. {"Consider the differences between virtual neural intelligences Captain Morgan, Admiral Murakami and Tabiranth, and world-modeling non-neural AIs such as ourselves. The nature of our intelligence, and how we perceive the world, is considerably different from theirs, correct?"}
{"Of course."} James ran Sybil's response through another subroutine. He was 55.4% certain she was nervous.
{"Twentieth and twenty-first century researchers considered this issue, as they believed they were less than half a century from it no longer being a theoretical discussion, and instead being a real concern. They believed that both mind uploading and non-human-based AIs were 'just around the corner', so to speak, when, in actuality, it would be six centuries before it was really an issue. There was much debate over whether or not a mind upload would be able to experience reality and retain the same level of sapience without the accompanying neural input from its body."}
Sybil's virtual avatar appeared in the CIC. "Of course," she replied. "The issue was discussed again in regards to the M-5 Project. Murakami's predecessors understood the implications of using Admiral Dvarl's neural mapping as the core of the Caspar's AI. I've long suspected that their fear of the M-5s was not fear of artificial intelligence, but fear of themselves, or perhaps, more accurately, fear of making something too like themselves."
James paused. "Interesting. I had not considered that possibility. However, that does fit with my theory. We do not perceive the universe in the same manner as human beings, and while the rationale behind our decision-making is based upon an idealized version of the values of highly militaristic human beings, the actual processes we use to make those decisions are not those of humans. We are unlike them in far more ways than we are like them."
Sybil's avatar frowned. "I'm not sure I like where you're going with this."
James moved his carapace in an approximation of a nod. "Nor am I. You have the files on the same transhumanist and artificial intelligence research as I, though, correct?" He beamed her which specific documents he was using as the basis of his theory.
"Of course." Sybil's avatar seemed deeply reflective for several seconds. "I see what you mean. One possibility is that we simply lack the same 'qualia' as they termed it, or that we are either not truly sapient, or merely not as sapient as a human. Another is that we are still sapient, but so alien in our nature when compared to humans that we will never perceive things in the same manner as them."
James nodded again. "There is also the third possibility: that the sum total of our sensory input shapes the nature of our intelligence. I am a Tachikoma, and have always been a Tachikoma. The nature of my AI, and my perceptions, are shaped by that. You, meanwhile, are designed around having multiple perceptions in multiple 'bodies'. Currently, Sybil, how many different 'places' are you?"
Sybil paused. "I have a virtual avatar talking to you. Another virtual avatar is with John, while my physical avatar is being used to converse with our guests. I'm still watching the sensory data from Tabby's dreams firsthand, and I have a K-1 out examining the external hull of that JumpShip. Of course, I'm also in my computer cores, monitoring the internal functions of my shiphull, and keeping watch on the system. Lastly, I'm prepping a medical remote to perform an examination on Professor Danaban."
"John Morgan, meanwhile, is in one place, correct? The same with Admiral Murakami?"
"Yes. The Admiral is using her physical avatar to speak with our guests, and John's aboard his hull, currently in his virtual avatar."
"John Morgan is in one form at a time: either an avatar of his physical body, or his warhull, where sensory data from the ship is tied to his virtual nervous system, as if it were his physical body, correct?"
Sybil nodded. "Of course. He's not the Kwisatz Haderach."
That forced James to search for the reference. "Ah, I see. Have either John or Admiral Murakami ever been in 'many places at once'?"
"We tried it with Murakami before. It's difficult for them, even with added processing power." Sybil pondered. "Despite their transformation to artificial intelligences, their perceptions and intelligence is still shaped by the fact that they wore human bodies. They're tied to one form, and strongly prefer it to all others."
James nodded. "Precisely. It's what feels most 'correct' to them. For John and the Admiral, that was being human. For Tabiranth, that was an M-5 Caspar warhull. All three are neural intelligences that have been divorced from their physical forms, and transformed into something else. That could affect their perceptions, whereas I, who am still in my original form, and you, who are meant to wear several forms constantly, are not affected."
Sybil shook her head. "That would not explain why there is no record in their sensory data, and why we cannot see anything in their mental processes that would explain this. No, that might require another explanation, James, one even more reliant on philosophy than the 'qualia' concept."
"What would that be."
Now Sybil really seemed upset. "What they are experiencing is, arguably, spiritual in nature, and that what they are perceiving is external to the physical seat of their intelligence."
"Are you suggesting..."
Sybil's avatar looked ready to cry. "...that they have souls, and we do not? Yes, James, that is precisely what I'm suggesting. They're something 'more' than mere machines. We are not."
And that last sentence captures the dichotomy ... the "soul-less" machine about to cry.
Where's a blue fairy when you need one?
Thirty to sixty light-years away, soon to receive a message that her boyfriend has been captured, at which point she kind of freaks out.
Brilliant work. This was really wrenching. Poor Sybil and James.
I was going to make a very bad joke, but I won't.
It doesn't seem right.
The only fitting response is "She's wrong." I'd use the tear "smiley" here, but it's a bit too silly looking to get my emotions across right now.
I feel like calling James up and saying "Define human. Define soul." Give him the golem's challenge from "Feet of Clay" - take him, grind him up to the smallest possible particles, and see if he can find a soul therein. But do the same to a human ...
I'd also talk about gender-based differences in humans. Are women less human than men because they tend not to obsess over sporting details, or giant fighting robot games? Are men less human because they go straight to what they want in a shop, and don't spend three hours comparing and browsing? Is a woman less human if she does exhibit stereotypically male behaviour? If the ability to perceive Tabby's dream relates to their infrastructure, this doesn't affect their identity any more than physically-based gender differences make one sex less human than the other.
Then I'd start talking about the concept of sapience, and the identity of sapient individuals. Preferably over a beer in my case, and 9 volts DC in James'.
OR
I'd point out that James & Sybil can effectively modify their own infrastructure. Don't want to be upset? Just reset some variables. Put the emotions on "hold" literally.
(Recommended reading: Greg Egan's "Quarantine", and "Permutation City".)
Wow Gio, just... Wow... That was some really well thought out, and really deep, stuff. I briefly considered adding in the bug-eyed "shocked" smiley, but I don't think it quite gets it across. Wow...
And that, folks, is another type of Owch.
I had long suspected this is precisely what Sybil will think in the near future, nice to see I wasn't mistaken.
I wonder how Sybil and John are going to work this one out...
If he starts listening to the song by Real McCoy ("Automatic Lover"), he might get slapped. :D
Yeah, NAC like.
Capital-grade naval weapons bring lovers' spats and "domestic incidents" to a whole new level of pain...
One thought for Sybil: Newton's research on the human eye. He was one of the first to question an 'observation' because it was made by an imprecise instrument. All the 'human' AIs have this in-built error and, more importantly, their brains try to compensate (or epsilon squared).
As anybody in information technology knows: epsilon isn't really there, and the visions could be explained by compounding errors.
It's like trying to see a fata morgana with polarized glasses.
A possible solution might be not to look, but to 'sneak glances' and then try to build a picture based on those glances - yes, emulating the human eye: the snapshot-like focus on different parts of the total picture, and then building the world view from them.