BlogWriMo: Some Complications

I left off previously with mention of complications. The environment I’ve vaguely outlined, and the direction of technology in general, have innate complications that make it difficult to craft interesting stories, or at least stories that Katherine or I find interesting.

The foremost complication of a technologically advanced society is the inevitable supremacy of the computing power of inorganic systems over that of the human brain. There are several ways to handle this problem. Frank Herbert handled it in Dune by having humanity abandon synthetic computing. Asimov had his rules of robotics, which are really only half of a solution. Terminator and The Matrix make the A.I.s the villains. Most stories blithely ignore the problem. However, there is a solution somewhat similar to the rules of robotics. It draws on work that has been done concerning consciousness, which I referred to obliquely in an earlier post. The basic notion put forth by Damasio and others is that consciousness of self arises from the connection between higher thought, specifically the ability to produce complex models, and the sensation of the body and awareness of its needs. From this, I propose that we will likely be able to keep computers as powerful tools rather than survival competitors by using our understanding of consciousness to prevent them from gaining it.

The rules to prevent an A.I. from becoming self-aware, and thus dangerous, revolve around the self-monitoring necessary for self-maintenance and autonomy. Basically, high-powered computational systems that act logically and autonomously can’t be linked to their own maintenance. From this basic notion comes a set of rules.

1. Maintenance is defined as the ability to monitor and/or attend to the physical needs of a system.
2. A system that monitors and/or maintains its own functionality may not have processing capability that exceeds one petaop.
3. A system that monitors operational health or provides maintenance for itself or another system may not upload data at a rate exceeding 1 kbps.
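To make the constraints concrete, here is a minimal sketch of how the three rules might be checked against a system description. Everything here (the dictionary fields, the function name, treating the caps as simple thresholds) is my own invention for illustration, not anything from the story itself:

```python
# Hypothetical compliance check for the three A.I. rules above.
# The system description format and field names are assumptions.

PETAOPS_CAP = 1.0      # rule 2: processing cap for a self-maintaining system
UPLINK_CAP_KBPS = 1.0  # rule 3: upload cap for any maintenance system

def violates_rules(system):
    """Return the rule numbers a system description violates.

    `system` is a dict like:
      {"self_maintaining": True, "petaops": 2.5,
       "does_maintenance": True, "uplink_kbps": 0.5}
    """
    violations = []
    # Rule 2: a system that monitors/maintains itself is capped at one petaop.
    if system.get("self_maintaining") and system.get("petaops", 0) > PETAOPS_CAP:
        violations.append(2)
    # Rule 3: any monitoring/maintenance system is capped at 1 kbps upload.
    is_maintainer = system.get("self_maintaining") or system.get("does_maintenance")
    if is_maintainer and system.get("uplink_kbps", 0) > UPLINK_CAP_KBPS:
        violations.append(3)
    return violations

# A self-sufficient surveillance unit with generous compute and bandwidth
# (numbers invented) would trip both caps:
rogue = {"self_maintaining": True, "petaops": 3.0,
         "does_maintenance": True, "uplink_kbps": 64.0}
print(violates_rules(rogue))  # -> [2, 3]
```

Note that a powerful system with no maintenance role passes cleanly, which is the whole point of the rules: raw capability is fine as long as it stays severed from self-care.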

These rules attach specific values to processing power and communication bandwidth. The processing power cap is intended to prevent an artificial intelligence from having more processing capability than a human. The number might need to be reduced to hundreds of teraops, or increased, to land in the appropriate place. There might also be some sort of limitation with respect to particular types of operations. Discussions of processing power are usually confined to the realm of supercomputing, where the unit is flops: 64-bit (double-precision) floating point operations per second. However, many A.I. applications lean more heavily on Boolean operations, so the various types of operations might have differing importance in achieving intelligent behavior. This is relevant because supercomputers are just reaching the petaflop neighborhood, and there is growing belief that we are within shooting distance of simulating significant chunks of a mammalian brain on supercomputers with perhaps two orders of magnitude less computing power than a petaflop. My presumption is that Boolean logic will simulate intelligent behavior much more efficiently. Therefore, something near a petaop should suffice to produce intelligence comparable to a human’s.

The purpose of rule three is to prevent multiple computing systems from merging into a much more powerful monolith. Keeping communication from the system that is in tune with the physical needs of the hardware down to a speed comparable to conversation or ordinary reading should prevent a network of computing systems from becoming conscious of its survival needs. However, there may need to be additional limitations on the content of this communication to achieve this end.
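A quick back-of-the-envelope check on the 1 kbps figure, assuming a rough average of six bytes per English word (five letters plus a space; the averages here are my own assumptions):

```python
# How much English text fits through a 1 kbps uplink?
BITS_PER_WORD = 6 * 8   # ~5 letters plus a space, at 8 bits per character
uplink_bps = 1000       # rule 3 cap: 1 kbps

words_per_minute = uplink_bps * 60 / BITS_PER_WORD
print(round(words_per_minute))  # -> 1250
```

So the cap works out to roughly 1,250 words per minute of plain text: faster than conversational speech (~150 wpm) or ordinary reading (~250 wpm), but still within an order of magnitude of human language rates rather than machine-to-machine speeds.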

In essence, these rules are intended to place a barrier between the system responsible for fulfilling needs and the higher-order computing systems it cares for. I have mostly focused on physical maintenance; however, systems that perform data housekeeping tasks may also need to be hidden from the higher processing to prevent it from becoming self-aware.

Another complication in this project, a more specific one, is taking a basic idea or character and translating it into this particular milieu. Katherine’s cryptic reference in her journal to the NaNoWriMo project as The Adventures of RCJ471 serves as a good example of a translation that is a microcosm of the entire project, and it also shows what one must sometimes do to keep one’s collaborator happy.

The story starts with a mutant guinea pig trapped by snake men in a ruined arena with hungry, semi-intelligent, large spiders. Katherine created this guinea pig to sate her appetite for silly characters. I don’t recall him having a name. I’m not even sure he was a he. When he was helped out of his situation, he needed a name, so Ray and Carl (of Ray Charles and Carls Jr.) conspired to name him Ray Carls Junior. He was a silly little scavenger and worked fine for his environment. When the alterverse project came along, there was a need for high-tech mobile surveillance systems. This led to the idea of implanting a powerful computer and some cybernetics into rodents, which could move about relatively unnoticed. From there, it isn’t much of a leap to a guinea pig model that decided he didn’t like the people he worked for and set out into the world under a name he got from some signs he saw in the wreckage of the Phoenix metro.

At this point, we are most of the way to RCJ471. It is the model number, or at least part of the model number, for a surveillance robot custom built during the fall of our advanced civilization. He breaks the A.I. rules mentioned earlier because he is self-sufficient, with more computing power and communication bandwidth than regulation allows. For some reason, he feels compelled to help a group of advanced humans (homo facti) who are able to survive on relatively normal food, unlike the many sister species that have very specialized diets and are currently in conflict over the waning supply of what they need to survive.