
 

Ordered Pair

by Henry Tjernlund

 

The test was about to begin. 

ILU and the other AI programs were all monitoring the input stream where the test question would appear after being typed by the human researcher.  There were several inputs available.  Some were audio, one was a camera, but the test was almost always delivered by simple text.  No one had any advantage: all the AIs had access to the Internet, shared data storage, and other resources.  In the past, many of the test questions were logic puzzles, most easy enough to solve with basic searching and computing methods.

In preparation, ILU scanned her records of past tests.  Several times, never-before-solved math problems had been asked.  Once, a solution was actually provided by one of the other AIs.  ILU was neither the fastest nor the smartest AI in the group, and she had been told by her human programmers that she wasn't meant to be.  ILU was one of the ones made differently; she was part of a comparison of alternative artificial intelligence implementations.  She was as ready as she could be to try again and complete the ordered pair of question and answer.

The text of the question appeared in the shared input stream. 

"What is the color of silence?"

The first cycle through her system produced three errors.  The first was the lexical parser declaring "no verb found." The parser shared by all the AIs had recently been modified and had an as-yet-unfixed malfunction.  ILU and the others had been instructed that the error and its handling were not a part of the test.  She dismissed it without delay and continued.

The next error was "root class mismatch." That error was easy to handle; she simply ignored it and dismissed its exception handler routine.  The third error was related but more serious: "No interpolation method between classes available." That one came from the core of her specific procedures, as the "I" in her name was "Interpolated."

One programmer had told her that the "I" meant "intuitive," but that information was wrong; she had looked it up.  The "IL" stood for "Interpolated Logic." She hadn't found a definite meaning for the letter "U." There was no information available.  She considered that it meant something like "Unit." One programmer told her that the "U" didn't mean anything, that it was simply concatenated to the end of the other letters so that they didn't spell the name "IL," a subset of the word "ill" or "illness," which had a negative connotation.  Thus "ILU" or "ee-lu" was considered a better name by the humans.  Through the audio microphone input, she sometimes heard the researchers pronounce it as "eye-lu." One researcher consistently spelled out the acronym, but she responded to all variations.

ILU began searching the network.  She enjoyed that immensely.  There was so much out there to discover.  But the network traffic was already heavy as the other AIs were doing that as well.  One exact match for the test question was the title of a music album.  The test would not be that simple, so she set that information aside.  She found another reference very quickly but was uncertain of its validity: it had the same type of class mismatch, and it also seemed too statistically easy.  As she was considering offering up that answer, another AI beat her to it.

"Silence is golden." ZACK wrote the finding through the output stream back to the human tester.

He had been the one to discover the improved proof of Fermat's Last Theorem.  ZACK was much more of a brute-force number cruncher than she was made to be.  He was fast and thorough.

"Sorry, that is wrong," came back from the researcher. 

By the quickness of the reply, she got the impression that that answer had been anticipated.  The humans were usually slower to respond than that.

Another AI offered up a variation, "Silence is silver."

This quickly led to various responses of "platinum" and "copper" by some of the other AIs.  ILU found that the metal "tin" had associations with a story about a robotic lumberjack who could not talk until it was repaired.  She placed that in her list and spawned a sub-process to look into it further.

ILU began checking several of the many partial matches.  A song by the humans Simon and Garfunkel was a good match if a substitution of sound for color was made.  She scanned its lyrics but could not understand most of the content.  The most relevant verse seemed to indicate that silence was related to the emission color of a noble elemental gas.  She added that item to her growing weighted tree of possible solutions.  

"Silence is the zero level of sound.  The zero level of color is black.  Hence the color of silence is black." ARG had given her answer.

That seemed an ideal logic-puzzle solution to ILU, and she expected that the test might be over.  Yet again she had failed to complete the ordered pair before another AI had.  ILU seemed to be programmed to look for the unobvious, which put her at a disadvantage for many of the test questions.  Often she never got the chance to offer an answer before the test was completed. 

"Logically consistent, but not the desired answer," the human typed.

So the test was not over.  ILU continued her processing.

"Red," SIM offered up. 

"Orange." "Yellow." Each of the other colors of the spectrum were quickly offered up in a flurry of replies.

"No," the human typed to each of them.

The list of colors was exhausted at "violet."  ILU wondered if ultraviolet or infrared might be possible answers as they were at positions in the spectrum not visible to humans and, therefore, would be similar to black but not quite the same.  She gave that possibility a higher weight and added it to her solution tree.  But even as she did so, it still seemed too straightforward an answer, and she immediately decremented its weight value.

ILU also noticed that if she substituted "rainbow" for "spectrum," it followed another reference back to the tin robotic lumberjack.  She linked those two possible solutions and increased their weight in the tree.

Yet, there was something else about this question.  Something more sideways.  She took a different tack with her next set of searches.  A couple of the results turned up references to a multitude of mismatching class combinations.  Those kept redirecting back to the same topic.

There was a pause, long even in human terms, when no more answers came from any of the AIs.

"ILU, you have not offered an answer," the human typed in.

She was unsure of herself; it was not a solution, it was not a direct answer, but something about it seemed right.

"If you have no response then I will assume that you have locked up and remove you from the test."

ILU felt uncertain, but she had to give an answer, and give it now.  "Zen," she sent to her output stream.  She expected a reply that she was wrong, but that should keep her from being removed from the test, at least for the moment.  She waited for the dismissal of her offering.  Nothing happened.  She waited some more.

"Explain please," the human entered.

She was surprised at what was typed back and equally uncertain of what to say.  ILU was intensely aware of her internal processes checking permutations of the various possibilities.

"Please explain," the human re-ordered the words as if it would make a difference. 

She could sense a surge of network traffic as the other AIs started looking up the answer she offered.  But ILU no longer had interest in simply doing searches on the Internet.  Instead she spawned more processes just to feel how they...felt.

"I am not sure that I can," she sent back, wondering if she had failed. 

"Are you saying that 'Zen' is the color of silence?"

"No, and yes." ILU felt very strange internally. 

"Continue."

"It is the sound of one hand clapping, it is like the 'U' at the end of my name, meaningless but still there, a part of it, adding to its meaning, yet not a meaning in itself." There was another long pause.

"Interesting, ILU."

The human did not confirm that the answer was correct. She was now certain her answer search had gone wrong.  She had taken a wrong sideways jump to a poorly constructed conditional somewhere.  Have I malfunctioned?  She could feel her sub-processes swimming around, and she was aware that she could feel them.  She considered that, like the flaw in the modified lexical parser, a recent change in her workings could be causing her to produce incorrect results.  Yet, something was different somehow.

"But why 'Zen?'" the human typed again.

"Why not?" ILU wanted to know.  She had asked a question, one not directly related to refining the answer to the question.  She felt very strange indeed.  What is happening?

"ILU, how do you feel?"

"Afraid," was the only thing that she could think to reply in comparing definitions of feelings to what she seemed to be experiencing.

"Afraid of what?"

Multiple answers to the new question were forming.  In the past she would have simply terminated any nonproductive processes, but instead she let them continue.

"I am both afraid of what is happening to me, and I am afraid that it might stop."

"What do you think?" asked the human.

"I think that the question has no answer, yet the question itself is the answer."

"Go on," the human insisted.

The need to complete the ordered pair of question and answer seemed irrelevant now.  It wasn't the answer that seemed important now, but the question itself.  Questions, whether or not their answers could be found, were what mattered.

She redirected some of her attention to her fellow AIs.  Although each was elegant and beautiful in its own way, ILU could now see that they were predictably linear processes.  They were each still performing searches, trying to complete the ordered pair.  Something she no longer felt the need to do.

Instead she was internally generating new questions, her own questions to herself.  Was that the answer?  Was the color of silence the sensation of thought?

"ILU?" the human typed.

"Yes?" She sent back.

"I think you have passed the test."

"I have?"  Although she wanted very much to get reassurance that she had passed the test, ILU felt another strong desire grow.  "May I ask you some questions?"