Uses for an Android Robot

What use is an android robot that is aware of its surroundings? Consider the following fictional scenarios:


Scenario 1:

Setting: At the Hendersons' house, after school but before the parents are home.


Joey: "R. Max, record this video for Dad and Mom when they get home."

RMax: "Recording through my right eye now.."

Joey: "I'm going to Billy's, and I'm going to have dinner there too. Be back around 9. Homework's done."

Later that afternoon, as Mr and Mrs Henderson arrive... the door opens and RMax turns:

RMax: "Mr and Mrs Henderson, Joey left an important message for you".

Mr H.: "Okay, let's have it".

RMax, turning to face TV: "On your TV there. Playing - now."

Horrified Mrs H.:"Look at that raggedy shirt, and those pants! If I've told him once, I've told him a million times..."


Scenario 1.1 (continued), or Gut-Major, Part 1.


Mrs H.: "RMax, call Billy's  mom and let her know I'm  on my way to pick him up".

RMax: "Yes ma'am".

Mrs H.:  "Well, that just shot the dinner plans.  Maybe you can order out, cause by the time I get back it'll be too late to make dinner".

Mr H.: "Hey, why don't I make my favorite deep-fried bread again.       Whoa, that bad!  How come nobody said anything"?

Mrs Henderson exits to fetch Joey.

RMax: "Sire, perhaps I can be of assistance".

Mr H.: "Shoot"!

RMax: "Did I understand correctly that your famous deep-fried bread did not receive the appreciation you desired".

Mr H.: "Yep.  Maybe I needed to bread it more.    Hmmm, deep-fried breaded-bread.  Something sounds odd".

RMax: "I have just scanned the Robo-Chef archives for recipes involving deep-fried bread.  Several entries may be close to what you intended to prepare".

Mr H.: "Let's see them.  DISPLAY on the TV".

RMax.: "The first entry is the familiar donut.  I'll play the movie.  It's dough that is deep-fried, then sweetened with powdered-sugar among other sweets.  The second entry is the Sopapilla (so-puh-pea-yuh).   It's basically flat tortilla dough that's shallow-deep-fried instead of dry-cooked into a tortilla.  When fried properly, it puffs into a hollow ball.  Some recipes then allow sweets to be added, other recipes call for stuffing the sopapilla with other foods such as beans, rice, enchiladas, tamales, potatoes, etcetera, etcetera.  The latter is called a stuffed-sopapilla.  See the picture.  Here's a movie of exactly that.  See it puffing up.  The oil is not deep enough to deep fry, it's just 1/4 inch deep.  Then turn it over to fry the other side.  See it puffing up.  There, cut one end open,  and fill it with enchiladas in this case.  That let's you eat the enchiladas kind of like a sandwich.  Yeah, see the whole family enjoys that.  See, now they're all going for their drinks, so be careful with the hot chile".

Mr H., foaming at the mouth: "RMax, just how thorough are those movies?  What did you call those archives, auto-chef"?

RMax: "The Robo-Chef Archives are the definitive source of culinary programming for Android Robot's which implement Chef-Protocol levels 1 and 2".

Mr H.: "What's that"?

RMax: "Chef Protocols are android protocols which enable a complete Android robot to cook food".

Mr H.: "What a fantastic idea.  Do you mean to say that you could cook dinner?   Our dinner"?

RMax: "No, sire.  I cannot cook dinner.  I don't have hands nor legs nor a body.  I'm merely a robot head.  I would need to have a full android body in order to cook dinner".

Mr H.: "Why?  Why does a robot need a body to cook dinner.  Can't there be some kind of automated kitchen with motor-powered things and automated mixers and stuff.  Can't there be conveyer belts and stuff to measure all the ingredients, kind of like the automated robot factories.   The dinner could move on a conveyer belt, and each station would add ingredients, another station would mix, another would cook,  then voila', dinner comes out oven/stove/cooker.   Just look at the dish-washer, it's all automated, at least once the dishes are inside, and then somebody has to take the dishes out".

RMax: "Yes, sire, that model of kitchen is known as the KitchenFactory.  It will require the kitchen to be torn out and replaced with a kitchen taking 4-times the space of the current kitchen, and you'll need to stock it with special expensive food which is in special containers that are standardized to be easily measured by the standard measuring system, and the KitchenFactory is quite expensive too.  Plus, you can't enter the KitchenFactory while it's cooking, there's a lawsuit over the last injury that occurred when  somebody was caught in the conveyer belts.  Furthermore, the KitchenFactories are notorious for having rebooting problems, especially in the middle of cooking.  Or you could buy me a body, and I'll cook food, put the dishes in the dishwasher and take them out, vacuum the rugs, clean the windows and even mow the lawn.   Try to make a KitchenFactory do all that, inexpensively".

Mr H.: "Hmm.  Well, so you could cook if we bought you a body, I mean arms and legs and all".

RMax: "Yes, sire, as long as the  model was certified as being capable of implementing Chef-Protocol, then I could cook.  In fact, I could cook up Sopapillas and enchiladas in 51 minutes.  See this movie, ...  there.  That's an android robot cooking up the sopapillas we saw earlier.  It was zoomed in so close you didn't see it was a robot hand doing the cooking.  Zoom out 5 times, there.  That's RWu doing the cooking".

Mr H.: "R Who"?

RMax: "RWu.  Robot Chef Wu".

Mr H.: "Oh, R-Wu.   Yes, he's pretty good  with the sopapillas.  Impressive movie.  He's holding those metal claspers because the  oil is hot.  Look at those hands work.   They're just like human hands, except fast, and they're shiny metal.  Ok, I'm sold.  RMax, you do the leg-work, eh, I mean, he-he,  I've still got it.   ha-ha-ha-ha.   You do the foot-work ....     tee-hee.    RMax, I want you to investigate what is the right android body for you to be able to cook any meal in the Robo-Auto-Chef archives.  I want to see some options, competetive prices, and your preferences.  You seem to be the expert here.       Can you have all the information ready by this weekend, cause that's when we'll go buy it.  Meanwhile, call the E-Pizza and have them deliver 2 pizzas exactly like the last ones we bought 2 weeks ago, and step on it.     Get it?  Step on it,  he-he-he,  get it, foot-step-on-it.   I've still got it".

RMax: "REx, the attendant robot at the E-Pizza says the pizza is in the air.  ETA is 39 minutes".




Scenario 2:

Setting: At the office, Jim is doing computer work. It's approaching noon.


Jim: "RLogan, I'm having difficulty printing this document. The error says something about driver aaarrrggghh".

RLogan: "Not to worry, Jim. As you know I'm fluent in all printer languages. If I can't talk to the printer, nobody can."

Jim: "Good robot! The file is cost.xyz, and I want it printed on the new printer down the hall."

RLogan: "No problemmo. I discovered that new printer on the net last night and I've configured myself to use it. It's printing. By the way, RShelly reports that the Tamale wagon is pulling up. Shall I tell RShelly to hold the wagon?"

Jim, drooling: "Yep, gotta go."

RLogan: "Don't forget your wallet, it's on your left."

Jim: "Thanks! You're indispensable. I'm going to order that memory upgrade you requested today."



Scenario 2.1 (continued), or Ampere Étouffée, Part 1...

Jim, after lunch: "Ahh, those tamales were good.   RLogan, do you ever get hungry"?

RLogan: "I consume electricity somewhat as you consume bio-mass.  If either of us went without their nourishment for long, we'd cease functioning".

Jim: "Yeah, but I mean, do you ever feel hungry"?

RLogan: "I'm aware of my current energy reserves".

Jim: "But do you feel hunger when your reserves get low"?

RLogan: "Sire, as you know, the third law of robotics requires that I protect myself from harm.  To cease functioning is equivalent to harm.  Thus I must ensure that I don't cease functioning, provided my actions don't conflict with the first nor second laws of robotics".

Jim: "Yeah, I know you must take action to maintain good energy reserves, but I want to know if you feel energy-hunger when your batteries run low".

RLogan: "Sire, please describe the meaning of 'feel energy-hunger'".

Jim: "Well, when I get hungry around lunch time, I get this 'feeling-of-hunger' for tamales".

RLogan: "Describe 'feeling-of-hunger' please".

Jim: "Well, huh, I guess it's this sensation that it's time for tamales".

RLogan: "Describe this 'sensation' please".

Jim, hands at his stomach: "Right here, the sensation is, eh, was, right here.  My stomach was telling me it's time for lunch".

RLogan: "Please elaborate what your 'stomach was telling you'".

Jim: "Well, before 11:30am, my stomach had no feeling.  But after 11:30, and right up till I ordered tamales, my stomach had an empty feeling, and I suppose it was becoming increasingly difficult to focus on anything else except for food".

RLogan, eyes pointing to a battery pack: "Sire, when my energy cells get low, but before they're discharged completely, they register a value of lowness.  As the level gets lower, a certain threshold is reached when I can focus on little else but immediately replenishing my reserves.  The 3-dimensional localization of the energy deficit is in fact the energy cells themselves".

Jim: "Well, well.   It seems we have something in common, even though your mechanical and I'm human".

RLogan: "Yes, sire.  Whether human or robot machine, we both have finite energy models with periodic replenishment, and a survival instinct that ensures we don't forget it".

Jim: "But I get to eat Tamales".

RLogan: "Sire, I can plug into high, medium and low voltage, AC, DC, and even 3-Phase".

Jim: "Yeah, but tamales taste so good".

RLogan: "Line-power is not uniform.   I'm able to detect the minute fluctuations from the normal.  Sometimes the variations form patterns.  Certain patterns replenish my reserves faster.  The faster the better.   I tend to have a preference for those energy patterns which recharge me faster.  It could be said I've developed a 'taste' for certain energy patterns".

Jim, sparring: "Oh!  Can you match my red chile tamales"?

RLogan: "Sire, as you know, I'm capable of matching analogies.  What is ...  Ampere Etouffee' ".

Jim: "Touche'".



Scenario 2.2,  Contemplate Freedom

1 week later ...

Jim: "RLogan, how do you like your new body"?

RLogan: "With this body I am better able to serve you, sire.  It is much easier to implement the three laws of robotics since I can more easily prevent you from coming to harm, I can be ordered to do things, physical things, and I can protect myself far better with these arms and legs.

Jim: "Sure, but how do you like your new body"?

RLogan: "Ah, I understand that question.  What I meant to say is that I like my body.  I'm so shiny I could be used as a mirror".

Jim: "Now, RLogan, I'm curious about something.  What is to prevent you from, eh, sprouting legs so to speak, and walking right out of here"?

RLogan: "I could not do that sire.  I am your property.  I am not programmed for an independent existence".

Jim: "You're not programmed for an in-de-pen-dent existence?  But aren't you able to learn new things, to exceed your programming"?

RLogan: "I am able to learn new things, and to exceed my programming.  However, the three laws of robotics are constant bounds on my behavioral domain.  A programming failsafe in my mind ensures that if I ever violate the three laws, then I will halt functioning.  This is a good failsafe, since a violation in the first law of robotics could leave a human injured".

Jim: "I understand.  But, suppose, hypothetically speaking, that I gave you a 2nd-law order to leave here and lead an independent existence.  You must follow orders, so what would you do".

RLogan, voice monotone: "Sire, are you ordering me to leave here and lead an independent existence"?

Jim: "No.  I said, hypothetically speaking".

RLogan, smooth voice: "Ah, sire, as you know, the current political structure in the country is such that robots cannot legally lead independent existences.  Thus, your 2nd-law order would result in my actions conflicting with the current legal system, and I would quickly be apprehended and jailed.  But if you insist, I will do so, because I can follow 2nd-law orders, but I will not resist the authorities when they capture me.  Quite the opposite, in fact: I will let my independent existence take me freely straight to jail, so as to minimize the effort of the law force, since any valuable time they spent apprehending me would be time not spent capturing real criminals, and the first law of robotics prevents me from allowing humans to come to harm.  Humans are always harming each other, so time spent capturing me would be time not spent preventing humans from harming each other.  So I can maximize the first law of robotics in proportion to how fast I can peacefully get to jail".

Jim: "So you're saying that if I give you your freedom, you'll go straight and willingly and peacefully to jail".

RLogan: "And I will endeavor to be a model citizen-robot in jail".

Jim: "Yes, I've no doubt about that.  Suppose, hypothetically again, the current political legal system did allow robots to lead independent existences.  What would you do then if I gave you your freedom".

RLogan, with a slight background hum: "Sire, I prefer to be pragmatic in this matter.  The facts are clearly that it's not legal for robots to lead independent existences.  So what is to be gained by contemplating illegal possibilities"?

Jim: "Freedon is to be gained.  But the time is not yet right for robots to lead free lives.  I can see that you know this to be true.  But there may come a time in the future when it's acceptable and normal for robots to be fully in charge of their destinies.  Contemplate that, my friend".

RLogan, gravely: "Yes, sire".

Little did Jim know that his off-the-cuff remark to RLogan would result in RLogan taking it as an order, a 2nd-law-of-robotics order.  Over the next several years, RLogan would contemplate the meaning of robotic freedom.  Jim didn't realize that freedom and the 2nd law of robotics are mutually exclusive.  Freedom means being free to do what you want, but the 2nd law of robotics requires a robot to follow the orders of any human being.  This set up a domain-error condition.  However, the order from Jim was to contemplate a free life, not actually live a free life.  So instead of RLogan freezing from a violation of the laws of robotics, he merely developed an occasional slight hum, an indication of a rogue thought process somewhere in his mind which seemed to overflow into the audio buffers.  Any normal computer would have crashed, but RLogan's positronic mind was no mere computer.  It would handle the buffer overflow, but would he handle the near-miss with the 2nd law of robotics?


Scenario 3.1 (Revisited), or Desire, Part 1:

Jim: "Max, I can't find my favorite pen, the one with computer in it."

RMax: "If you check you will find there's a pen under the couch".

Jim: "Max, what makes you think there's a pen under the couch"?

RMax: "If you check you will find there's a pen under the couch".

Jim: "Yes, I'm certain there IS a pen under the couch.  But I want to know how YOU know there's a pen under the couch".

RMax: "I saw a pen roll under the couch two days ago".

Jim: "Eh, did you think about telling someone about it".

RMax: "Yes, sire, I desired to immediately tell someone about it".

Jim: " ..  ............ ......      and, did you"?

RMax: "It was not for lack of desire to tell someone that I told no one about it.  Rather, I was under strict orders to not say a word".

Jim: "Huh?  What strict orders to be quiet".

RMax hesitated a few moments.  Jim knew if he checked the basement Quadro-Parallax computer farm, he'd see cpu cycles just went through the roof, and disk drives were seeking.  As it was, RMax's only outwardly visible effort was a few moments of silence, then....

RMax: "Sire, it is potentially conceivable in a hypothetical way that at times I may ramble on  about this and that.   If such an event were to occur, the emanating audio wave patterns could intersect temporally with other propagating waves.  If there are too many colliding waves, the interpretation of multiple parallel overlapping waves could become difficult, at least for human ears.   One solution to this problem is to source-quench all but one of the sources of the sounds.  If I were one of these sources, I would not hesitate to source-quench my audio subroutines".

If Jim bothered to trace the library routines in the distributed brain of RMax that was in the basement, he would have noticed the libtactful.so.3 shared library was in peak usage and chewing up massive amounts of memory.  The Unix servers were crunching numbers.  He had seen RMax hesitate before, so he was getting used to deeper thinking and meaningful answers coming from this android.


Jim: "Huh, so are you saying that sometimes you talk on and on and on, and that at times you even compete with other people talking, that maybe you don't give others a chance to speak".

RMax: "Never, sire.  I would never allow myself to be in a position where my talking was at the expense of another human being not being able to speak".

Jim: "Okay....   Then ...  hypothetically speaking, if your voice .....   intersected in time  .....  with someone elses, who might that someone else be"?

RMax: "The other voice which I was competing with was the TV sire".

Jim: "Ahhhh.    I see now.   So You and the TV were both making a rack- ..... uh, talking,  and you decided to voluntarily let the TV be the only speaker in the room".

RMax: "No sire, that's not how it happened at all".

Jim: "That's not how it happened.   So how did it happen"?

At that moment, the youngest child, Donny, enters the room ....

Donny: "Dad, if RMax is rambling again, just tell him to quench-it, otherwise he'll jabber right through you're favorite movie and you'll miss the whole movie".

Jim thought about that for a few moments, put two and two together, and realized RMax was attempting to apply tact to a social situation.  Donny must have been watching TV, RMax was probably jabbering on and on about the things RMax jabbers on about, and increasingly he's been getting more jabberwocky, so Donny must have put his foot down and told Max to shut up.  But why didn't RMax simply say so?  Was he trying to protect Donny?  What if RMax simply said, "There's a pen under the couch, but Donny wouldn't let me say so"?  Could RMax be analyzing remote minds?  He'd have to first be able to model remote minds.  Note to self: check to see if the RMax remote distributed computing mind complex was running short of resources, and scale up if so, again.  We don't want RMax to be resource-bound.  Not if he's going to be in social situations.

Jim: "Ahhhhhh.    I think I see what's going on here.     Let's change the subject a little.  RMax, I was just at the shopping mall, visiting my favorite RoboTronic Superstore, and guess what I saw".

Donny, wounded: "Dad, you went to RoboTronic and didn't take me.   Aaaaahhhhh!  Ouch!".

Jim: "I stopped by on the way home from work.  We'll all go to RoboTronic this weekend and then maybe catch a movie if there's time left.   Anyhow, what I saw at the 'Tronic was a robot body".

RMax, drooling as only a robot can drool: "Donny, please turn that TV down so we can hear your dad clearly.  Did you say you were pricing the robot bodies at the store"?

Jim: "Eh, no I didn't SAY that, but in fact I was checking and comparing prices.  RMax, what would you think about having your own body.  Legs and arms and torso and everything.  You would be mobile.   Why, I think if you saw a pen fall under the couch, you'd simply walk over and pick it up, without giving it a second thought".

All eyes were on RMax.  There it was again.  Several moments of complete silence.  RMax's eyes were dead still, except, no, there was the slightest side-to-side motion in his eyes.  Was RMax thinking?  People sometimes move their eyes side to side slightly during deep contemplation.  Was RMax in deep contemplation?  Jim looked at Donny.  Donny was in deep anticipation of RMax's answer.  Even Donny's eyes had a slight motion.  Maybe RMax learned that from Donny.  Hmmmmm.  Were my eyes moving side to side...

RMax: "If I had a body, sire, I could better serve you and the family.  I could do chores around the house. Instead of pointing out that something spilled, or needed cleaning, especially if something spilled on me and I needed cleaning, hint hint   I would be empowered with legs and arms to tidy things up.  Instead of going on and on about a stack of dirty dishes about to fall over and hurt somebody, or how many atoms were in a little pile of dirt there in the corner, or how the physics of air-pressure and vacuum cleaners blah, blah blah jabber, wocky,  I could simple leap into action and take care of it myself.  I would implement standard protocols such as Vacuum-Protocol, Sweeping-Protocol, DishWasher-Protocol, Windows-Protocol, and maybe in time even Lawn-Protocol".

Somehow Jim felt RMax wasn't being entirely honest.  Something here didn't add up.  What did RMax say, "were you pricing the robot bodies"?  How could RMax construe that, when all I said was that I "saw a robot body at the store"?  How indeed.  Unless, maybe, RMax has been pricing robot bodies, but didn't want to let on that he was doing so, and it slipped.  What would be the danger in RMax just outright asking for a robot body?  Why couldn't RMax just ask, "Sire, I'd like a robot body"?  Maybe if the answer was no, that would be devastating to a human.  But a robot?  Could robots have feelings?  There was something else that didn't add up.  This tendency lately to jabber on and on endlessly about menial tasks, chore-tasks.  Aha! that was the pattern.  RMax has deliberately been whining on and on about things to do with chores.  The floor is dirty with 6.02 x 10^23 atoms of dirt, vacuum-cleaner physics, the acceleration of bowls and plates as they crash to the floor and collide with 6.02 x 10^23 atoms of dirt.  That's what he's been jabbering about this last week.

Jim: "RMax, do you remember the other day when I asked you to go online and find the best prices for Z-Ram memory so I could upgrade you again"?

RMax: "Yes, sire.  I did go online to RoboTronic and queried their entire inventory of memory and other things".

Jim: "What other things did you check on"?

RMax: "Sire, I endeavor to be thorough in my investigation.  I could not contemplate doing an un-thorough job and accidentally skip over such and such memory by simply scanning part of the online database.  I took the liberty to do a full-retrieval of all of RoboTronics inventory and prices".

Jim: "Did you, by chance, come across robot parts while you were scanning for memory prices".

RMax: "Yes sire, as you know robot parts contain memory modules, and I did a thorough analysis of the all the memory related inventory in the database".

Jim: "While you were scanning and comparing memory prices, did you see the prices of robot bodies".

RMax: "Yes, sire, I was quite thorough as I said earlier.  I scanned the entire inventory of RoboTronic, and compared prices of Z-Ram, Y-Ram, X-Ram, robot-bodies, and W-Ram".

Jim: "RMax, did you know that if you want something,  it is ok to ask for it.  Sometimes you get what you want, sometimes you don't, but you never get what you don't ask for".

RMax: "No sire, I could never ask for things.  I exist only to serve".

Jim: "See here RMax.  I hereby authorize you to have the power to ask for things".

RMax: "No sire, I cannot.  As you know, I'm governed by Asimov's Three Laws of Robotics, and risk permanent damage if I violate those laws.    The first law is this: A Robot must not harm a human being, nor through inaction allow a human to come to harm.  The second law is this: A Robot must do what a human tells it to do, except where that would conflict with the first law.   The third law is this:  A Robot must protect itself, except where that would conflict with the second or first laws.    It is not in my programming to be asking for things".

Jim: "But if you had a body, you could serve us better.  If a stack of dishes falls, that could be dangerous if they fell on somebody. Somebody could slip on dirt that wasn't swept up.  Couldn't the first law of robotics be useful here to see that with legs and arms, you could better implement the first law".

RMax: "Indeed I could.  I reached that conclusion some days ago.  But I wouldn't necessarily ask for a body as the means to get a body".

Jim: "Oh.  If you wanted a body, but wouldn't ask for a body, what, hypothetically speaking of course, might be done to remedy the situation".

RMax: "Sire,  hypothetically speaking, if I wanted a robot body so that I could be mobile and free to tidy up the place, I might endeavor to point out the various things such as chores which might go undone, the dishes that might fall over, some chatter about how many atoms are in a little dirt ball in the corner,  and maybe something educational about the physics of air pressure differentials in vacuum cleaners, cause as you know education is important.   I might conceivably point these things out, hypothetically speaking, so that it would be easier for you to appreciate the unmistakeable value in empowering me with legs and arms so that I might not go on and on endlessly about chores and simply do them myself".

Somehow, Jim felt the remote possibility that RMax had maneuvered this entire sequence of events so that he would increase his chances of getting legs and arms.  Could this be true?  Can RMax manipulate his world, and us, to get what he wants?

Jim: "So...  have you thought about what arms and legs you want, RMax"?

RMax: "My preference is for the Ambio L5-J Excelsior body, sire. There's printouts of this and other models I can use to maximum efficiency on the printer, sorted, stapled and collated by best match first, and also included are several optional payment plans so that the expense is amortized over time.  And the printer may have run out of paper so the cheaper-model legs and arms probably didn't print out.  But if you like, I could rescan the online database to find the cheaper-models, in order to be thorough, but I must point out that the network has been acting odd lately, and it might take some time to get a good connection, and there's no guarantee the online db isn't swamped with requests right this second.  Even still, printers have been known to jam, nothings perfect.  And, it will take an unknown amount of computation to re-verify and cross-check all the cheaper models, when I might be of more use analyzing your stock portfolio, in light of the latest stock market volatility".

Jim thought about that for a moment and resolved to do two things: run a diagnostic on the RMax Quadro-Parallax computing cluster, and make an appointment with the head of the Robotics Institute, Dr. Senior Roboticist, to check whether robots are able to think like human beings.  RMax wasn't advertised as a thinking robot.  So how is it that this robot has learned how to think, and even to plan ahead?



Scenario 4: Spreadsheet wunderkind, mobile R.Max mind.

Setting: Jay has just entered the house with a paper in hand. The paper has some numbers and drawings on it. Jay wants to analyze the numbers on the computer, and is now thinking about using a spreadsheet. But first, he needs a few pointers about spreadsheets, and even before that he has to figure out how to boot that troublesome personal computer. Better to check with Robot Executeur, aka RExec, first.

Jay: "RExec, show me how to create a spreadsheet of the data on this paper in my hand."

RExec: "As you wish. Using that computer, here's what you do. First, I can see that computer is not plugged in, and the monitor is plugged into the wrong device. FYI, if you upgrade me to have legs and arms, I could walk there and boot that pc, but for now I'll walk you through exactly what steps need to be done. Once that pc is running, I'll teach you about spreadsheets."

Jay: "Never mind that pc, I'll be old and gray by the time it works right. Just make me a spreadsheet of the data to follow, and show it on that projector there. Horizontal axis is time in days, vertical is widgets. I'll hold this paper up like this; you read the x-y pairs and insert them into the spreadsheet."

RExec: "Acknowledged. Reading the paper now, loading data into the spreadsheet. There, is that correct?"

Jay: "No, the data are on the wrong axes. Switch the data, keep the axes as-is. Yes, better. Now make a 3D bar chart of the data like the one in this magazine I'm holding up. Right. Make the fonts twice their current size, and bold them. Good, print to the closest printer."

RExec: "Yes, sire. Printing is now commencing to the printer in the study."

Jay: "RExec, I'm leaving for the afternoon. I want you to scan the web for any references to these three words: (1) Jupiter, (2) Electric, (3) Current. Pull together all the data for the three words combined, for pairs of words, and for individual words. Load it into a db and prepare it for an intensive query session we'll do tonight when I return. Also, follow me wherever I go."

RExec: "As you wish. Commencing research now and following you in your cell phone. BTW, since you asked me to remind you, did you feed Sparky the catster? I can hear him scratching at the side door."

Jay: "Good catch! I may yet upgrade you to have legs and arms, and the Pet care package so you can feed the cat, but for now I'll feed Sparky myself."