It appears that Microsoft may be paying the Wall Street Journal to block Google from indexing its website. Since it made a contract with Yahoo to provide web search, it appears Microsoft is taking the next step to compete with Google. Interestingly, Nicholas Carlson at the Business Insider agrees with my assertion that Google is profiting from other companies’ content, or at least understands that the objective is to make Google pay for content. It also seems that Microsoft intends to make this happen on a large scale, starting a war with Google in Google’s backyard for a change.
I left off previously with mention of complications. There are complications innate to the environment I’ve vaguely outlined, and the direction of technology in general, that make it difficult to make interesting stories, or at least those that Katherine or I find interesting.
The foremost complication of a technologically advanced society is the inevitable supremacy of the computing power of inorganic systems over that of the human brain. There are several ways to handle this problem. Frank Herbert handled it in Dune by having humanity abandon synthetic computing. Asimov had his Laws of Robotics, which are really only half of a solution. Terminator and The Matrix make the A.I.s the villains. Most stories blithely ignore the problem. However, there is a solution that is somewhat similar to Asimov’s laws. It has to do with work that has been done concerning consciousness, which I referred to obliquely in an earlier post. The basic notion put forth by Damasio and others is that it is the connection between higher thought, specifically the ability to produce complex models, and the sensation of the body and awareness of its needs that produces consciousness of self. From this, I propose that we will likely have the ability to maintain computers as powerful tools rather than survival competitors by using our understanding of consciousness to prevent them from gaining it.
The rules to prevent an A.I. from becoming self-aware, and thus dangerous, would revolve around the self-monitoring necessary for self-maintenance and autonomy. Basically, high-powered computational systems that act logically and autonomously can’t be linked to their own maintenance. From this basic notion comes a set of rules.
1. Maintenance is defined as the ability to monitor and/or attend to the physical needs of a system.
2. A system that monitors and/or maintains its own functionality may not have processing capability that exceeds one petaop.
3. A system that monitors operational health or provides maintenance for itself or another system may not upload data at a rate exceeding 1 kbps.
These rules have specific values regarding processing power and communication bandwidth. The processing power cap is intended to prevent an artificial intelligence from having more processing capability than a human. This number might need to be reduced to hundreds of teraops, or increased, to set the cap at the appropriate level. There might also be some sort of limitation with respect to particular types of operations. Normally, discussions of processing power are limited to the realm of supercomputing, where the unit is flops, usually 64-bit (double precision) floating point operations per second. However, many A.I. applications are heavier in boolean operations, so the various types of operations might have differing importance in achieving intelligent behavior. This is relevant because supercomputers are just hitting the petaflop neighborhood, and there is growing belief that we are within shooting distance of simulating significant chunks of a mammalian brain on supercomputers with maybe two orders of magnitude less computing power than a petaflop. My presumption is that boolean logic will simulate intelligent behavior much more efficiently. Therefore, something near a petaop should suffice to produce human-level intelligence.
The purpose of rule three is to prevent multiple computing systems from becoming a much more powerful monolith. Keeping communication from the system that is in tune with the physical needs of the machine down to a speed comparable to conversational or normal reading speed should prevent a network of computing systems from becoming conscious of its survival needs. However, there may need to be additional limitations on the content of this communication to achieve this end.
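As a quick sanity check on that 1 kbps figure, here is a back-of-envelope comparison against human reading speed. The specific numbers (250 words per minute, 5 characters per word, 8 bits per character) are my own rough assumptions, not anything from the rules themselves.

```python
# Back-of-envelope check that a 1 kbps cap sits near human reading speed.
# Assumed figures: ~250 words/min, ~5 characters/word, 8 bits/character.
words_per_min = 250
chars_per_word = 5
bits_per_char = 8

reading_bps = words_per_min * chars_per_word * bits_per_char / 60
print(round(reading_bps))        # roughly 167 bits per second

# The 1 kbps ceiling allows several times normal reading speed,
# but is still minuscule next to an ordinary network link.
print(round(1000 / reading_bps))  # the cap is ~6x reading speed
```

So the cap leaves headroom for conversational exchange while staying orders of magnitude below what networked systems would need to merge into a monolith.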
In essence, these rules are intended to place a barrier between the system that is responsible for fulfilling needs and the higher order computing systems it cares for. I have mostly focused on physical maintenance. However, systems that perform data housekeeping tasks may also need to be hidden from higher processing to prevent it from becoming self-aware.
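The barrier the rules describe could be enforced as a simple compliance check. The sketch below is purely my own illustration of the three rules; the field names, thresholds, and the system description format are all hypothetical.

```python
# Toy compliance checker for the three hypothetical A.I. containment rules.
# All names and data shapes here are illustrative, not a real API.

PETAOP = 1e15          # rule 2: processing cap, operations per second
UPLINK_CAP_BPS = 1000  # rule 3: 1 kbps upload cap for maintenance systems

def violations(system):
    """Return rule violations for a system described as a dict with keys:
    'self_maintaining' (bool), 'ops_per_sec' (float), 'uplink_bps' (float)."""
    problems = []
    if system["self_maintaining"] and system["ops_per_sec"] > PETAOP:
        problems.append("rule 2: self-maintaining system exceeds one petaop")
    if system["self_maintaining"] and system["uplink_bps"] > UPLINK_CAP_BPS:
        problems.append("rule 3: maintenance system uploads faster than 1 kbps")
    return problems

# A self-sufficient surveillance unit like RCJ471 would fail both checks:
rogue = {"self_maintaining": True, "ops_per_sec": 5e15, "uplink_bps": 1e6}
print(violations(rogue))
```

Note that a system that is not self-maintaining passes regardless of its power, which is exactly the point: raw capability is fine as long as it is severed from awareness of its own needs.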
Another complication in this project, which is more specific, is taking a basic idea or character and translating it into this particular milieu. The cryptic reference to the NaNoWriMo project as The Adventures of RCJ471 by Katherine in her journal serves as a good example of a translation that is a microcosm of the entire project and also shows what one must sometimes do in order to keep one’s collaborator happy.
The story starts with a mutant guinea pig trapped by snake men in a ruined arena with hungry, semi-intelligent, large spiders. This guinea pig was created by Katherine to sate her appetite for silly characters. I don’t recall him having a name. I’m not even sure he was a he. When he was helped out of his situation, he needed a name, so Ray and Carl (Ray Charles and Carl’s Jr.) conspired to name him Ray Carls Junior. He was a silly little scavenger and worked fine for his environment. When the alterverse project came along, there was a need for high tech mobile surveillance systems. This led to the idea of implanting a powerful computer and some cybernetics into rodents, which could move about relatively unnoticed. From there, it isn’t much of a leap to have a guinea pig model that maybe decided that he didn’t like the people he worked for and set out in the world under a name he got from some signs he saw in the wreckage of the Phoenix metro.
At this point, we are most of the way to RCJ471. It is the model number, or at least a part of the model number for a surveillance robot custom built during the fall of our advanced civilization. He breaks the A.I. rules I mentioned earlier because he is self-sufficient with more computing power and communication bandwidth than is allowed according to regulation. For some reason, he feels compelled to help this group of advanced humans (homo facti) that are able to survive on relatively normal food unlike the many other sister species that have very specialized diets and are currently in conflict over the waning supply of what they need to survive.
In conjunction with Katherine’s efforts this month on NaNoWriMo, I have decided to write some blog entries that provide a look into our novel creation process from my point of view, using the novel she is working on this month as an example. I will be posting about the fiction and science that have inspired my ideas as well as some of the content from my notes and our conversations on the world and the story.
This project is in many ways very raw since Katherine first bugged me to get something together maybe two weeks ago when I was working on a paper for my cellular and molecular neuroscience course. As a result, I started work in earnest on Thursday the 29th, a few days before NaNoWriMo started. It also is a very old project, derivative of many failed attempts to bring coherent science to a gritty post-apocalyptic world where the mind and body can do things that would seem like magic to us.
In the past, I have been left unsatisfied with my efforts to define such an environment because it is very difficult to justify radical changes that just aren’t possible with any reasonable derivative of human physiology. In a nutshell, mutation through radiation, biological agents, or natural processes isn’t going to suffice. The leap from here to there is just too large. Drastic physiological changes would need to occur at the sub-cellular level, which would completely derail the organismal developmental process, a process very sensitive to small changes in fundamental characteristics such as the structure of a protein or the presence of an engulfed organism such as mitochondria or chloroplasts.
Almost two years ago, I had an idea that was a result of a discussion about spirituality, which convinced me that the only way the things I wanted to have happen could occur was through influence from outside our universe. Therefore, I would introduce supernatural phenomena through the influence of another universe with completely different physics colliding with ours. Beings and phenomena from that universe, the alterverse, could physically influence ours in ways that defied the laws of physics. Further, some individuals in our universe were magnets for beings of the alterverse and could influence their actions, which would give them the potential I wanted. The phenomena from the alterverse could also have a cataclysmic effect upon civilization.
It seemed like I’d created something that might provide me with what I was after. However, the more I worked on the specific dynamics of the alterverse and my local environment, Phoenicia (what remained of Phoenix, Az), the less satisfied I was. I’m prone to get bored with ideas as I work them out, but this idea was getting away from me, becoming less and less what I’d set out to create. We were about three years into the Weordan project, which still needed a lot of attention, so I just dropped the project.
It’s been churning around in my head ever since.
The solution to my problem may have been hanging around in my head since the first post I made to this blog. Interestingly, this site and blog are a product of what was going on with the alterverse project, which was also called the continua project. Therefore, it seems appropriate that this project has come full circle to the idea that evolution for homo sapiens is primarily occurring through changes in social organization rather than biological changes. Here are the first words I wrote on the blank sheet I started with on Thursday.
Homo Sapiens was constrained by developmental parameters that no longer applied with the development of advanced medical technology and the support systems of modern civilization. Through natural mutation outside of previous survival parameters, new developmental sequences might emerge, eventually being radical enough to cause speciation. However, the same medical technology that enables survival during abnormal development cycles also allows manipulation of genetic and epi-genetic factors to produce novel development cycles and radically different phenotypes.
From this, it should be evident that I like to think about systems to get the ball rolling on an idea. In this case, I had already decided that the source of my unusual capabilities would be a result of human engineering, a process that I’ve come to realize is more complex than just genetics. There has to be an allowance for an organismal development cycle to build exotic structures capable of producing novel capabilities.
This is related to my earlier posts on the singularity in that the continual evolution of social systems would necessitate specialization of humanity into highly specialized, genetically enhanced species that might not resemble the original and would become increasingly insulated from other specialized groups by economic and communication protocols designed to enhance the efficient exchange of information, goods and services. Further, at some point, the system would become so interdependent that a small group of disruptions could cause a cascade that could lead to the collapse of the whole system. This might give me the kind of environment I’ve been looking for, though there are a multitude of complications still to be dealt with.
Apparently, the EU is being asked to help online content providers receive revenue from search engines and aggregators that deliver their content. The EU is the entity most likely to regulate in the most backward socialist manner. So, if you want a competitor regulated, you go to the EU, which will flex its regulatory muscles with glee. I wonder how concerned management at Google is that the U.S. Congress may consider this solution. I think they should be concerned that they might find themselves on the wrong side of a class-action lawsuit.
While Congress debates how to save the newspaper business, they don’t seem to be asking the right questions. In their print form, newspapers have thrived on classified ads and local ads. However, just having classifieds doesn’t garner enough of a readership to interest advertisers. By providing compelling content, a newspaper could establish a large, interested subscriber base. On the internet, sites like EBay and Craigslist grabbed the classified ad market by being first or by being free, leaving newspapers without that avenue for obtaining revenue. That is OK, because it doesn’t make sense for an online news source to have classified ads.
It is more natural for the news source to have ads related to the content they are producing. Ads may be placed prominently alongside the news. This can provide revenue, but clearly isn’t rewarding enough to keep companies afloat in tough times. To really solve the problem, one needs only to ask who is taking the profit from the content that online newspapers provide. The answer is that search engines are. Google, Yahoo, Microsoft or any other search engine is the equivalent of a delivery man. They take an order for information and deliver it to the customer. Only, unlike in the real world where the producer of the product gets paid, on the internet, they aren’t. The AP is starting to understand that the search engines need the AP more than the AP needs search engines. However, it should be a rather simple exercise for major news sites to make an exclusive agreement with Yahoo or Microsoft on licensing fees to reference their copyrighted material and subsequently tell the others to cease and desist. This should hold up legally, because all any search engine is doing is taking the content and re-packaging it to make a product that produces a profit. It certainly won’t be difficult to show that there is significant financial damage being caused, nor is it difficult to understand that the search engines rely upon content providers in order to produce revenue.
In an interview with Business Week, AMD CEO Dirk Meyer shows that he doesn’t understand where the computing market is going.
Yet that is one PC segment that’s a little healthier right now and AMD isn’t participating in it. Why is that?
We consciously focused our R&D dollars, which obviously given our size are smaller than Intel’s, on the big mainstream markets as they exist today. Knowing that this trend toward lower power consumption and more mobility is going to happen, we just decided to load that into the R&D pipeline for later. It’s not a big volume target and not a big dollar opportunity.…One of the saddest things about the PC industry right now is, since late last year, all anyone seems to want to ask about is netbooks. Good grief! It’s a low-cost limited-function device. There’s not much excitement or money in dollar volume there.
The writing was on the wall years ago when the first notebook was available for $1000. Processing is going mobile. Sales in volume and value are higher on notebooks than on desktops. Not only did Intel see this first, but the CEO of AMD has firmly placed his head in the sand on the matter. He’s worried about winning a war in notebooks that AMD has already lost, and has ceded the next frontier to Intel. The game isn’t going to remain notebooks or netbooks. It will go smaller. It will be cellphones and other personal communications devices that will evolve. While AMD is worried about the “big money”, Intel is setting the stage to make a run at the king of cell phone processing, ARM. Not only do Meyer’s words speak to his short memory with regard to how AMD got into their most recent trouble, but AMD’s lack of preparedness in terms of products shows it.
I often find the conclusions made by scientists to be rather amusing. A researcher in Australia who is doing otherwise solid work has been paraphrased as having stated that exposure to bright light counteracts myopia. While there is certainly evidence that environmental factors have a strong impact on visual development, this conclusion doesn’t seem to consider the mechanics at work in the eye.
The issues at play in myopia are the shape of the eyeball and the ability of the lens to change shape to provide a wide range of focus. How are the activities of the iris blocking light or retinal stimulation supposed to affect these characteristics? Doesn’t it make more sense that the muscle fibers of the lens will optimize themselves to match the activity that they undergo early in life? If a child stays in an enclosed environment all of the time, there is no opportunity to use long range focus. Likewise, when a child spends all of their time outdoors and never reads or uses a computer, there is little opportunity to exercise short range focus. Presuming that there is a reasonably regular distribution of behavior in any given population, this suggests that there would be a higher prevalence of hyperopia (farsightedness) in places where there is a low incidence of myopia.
I haven’t extensively read the man’s work, so I don’t fully know his position, but it is curious that more obvious features about being outdoors were not mentioned in the article. However, it wouldn’t surprise me if he really thinks this. There is a shocking lack of ability to envision systems within the scientific community.
Now that commodity prices are going down as quickly as they went up, it seems to be accepted wisdom that “speculation”, specifically speculation on the part of investors such as hedge funds, had a substantial part in where the prices peaked. At first blush, this seems a reasonable assessment. However, the more I’ve thought about markets and commodity markets in particular, the less I’m inclined to agree so wholeheartedly.
Basic market economic theory is based upon the principles of supply and demand. Supply and demand curves work very well for markets where buyers and sellers are flexible to change their consumption and production. They work less well to accurately represent how basic commodity markets work. When I say basic commodities, I mean natural resources or basic goods produced directly from natural resources. Things like fuel and food. The idea that these markets are particularly inflexible is probably not a new thought, but it certainly isn’t a popular thought since everything I read on the matter treats the basic commodity market as just another market when it clearly isn’t.
There is a single underlying principle that undermines using economic principles such as supply and demand when basic commodities are concerned. Flexibility. Consumers are very flexible with regard to how much they will pay for basic commodities if they have to and will strongly resist adjusting their consumption. Likewise, production of basic commodities takes considerable time, planning and infrastructure. Producers are not very flexible when it comes to altering their production.
This difference in flexibility, specifically the issue of planning which strongly affects the basic commodity market, results in a new dimension being added to the problem. Basically, there is a high cost associated with change because of the planning, time and infrastructure involved. As a result, consumers and suppliers look to recent history when making decisions, not just the current price. Consumers will likely do little in the very short term when prices increase to reduce their consumption of food and fuel because it won’t be worth the effort to change if the price increase is short lived and prices return to normal. Suppliers are even less sensitive to change because they have long production cycles and need to develop significant infrastructure for production. A new reality must establish itself before behavior changes. In short, people make these kinds of plans based upon trends, which take time to develop.
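A toy model makes the point about inflexibility concrete. The sketch below is my own illustration, not standard economics code: it uses constant-elasticity demand and supply curves, and the elasticity values (1.0 for a flexible market, 0.05 for an inflexible one) are chosen purely for contrast.

```python
# Toy constant-elasticity market: demand Qd = D * p**(-ed), supply Qs = S * p**(es).
# Setting Qd = Qs and solving gives the clearing price p = (D/S) ** (1 / (ed + es)).
# Elasticity figures are illustrative only.

def clearing_price(demand_scale, supply_scale, ed, es):
    return (demand_scale / supply_scale) ** (1.0 / (ed + es))

shock = 1.05  # a 5% outward shift in demand (e.g., new wealth in Asia)

# Flexible market (elasticities near 1): price barely moves.
flexible = clearing_price(shock, 1.0, 1.0, 1.0)
# Inflexible commodity market (elasticities near 0.05): price jumps ~63%.
inflexible = clearing_price(shock, 1.0, 0.05, 0.05)

print(round(flexible, 3))    # ~1.025
print(round(inflexible, 3))  # ~1.629
```

The same 5% demand shift that nudges a flexible market by a couple of percent sends the inflexible one soaring, and the effect runs just as hard in reverse when supply loosens, which matches the boom-and-bust behavior described above.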
In the most recent rise in basic commodity prices, demand was augmented by new wealth in Asia. The inflexibility of demand quantity in the rest of the world and of supply quantity drove up prices. Eventually, demand gave in a little. People began to change their lives to conform to shortages of basic commodities that would clearly remain in short supply indefinitely. The problem was that suppliers were not capable of reacting quickly enough to keep up with demand. Undoubtedly, they also could remember the 90s and weren’t so sure they wanted to react quickly. Then there is the problem of those that try to corner the market.
In normal times, cornering the market on a basic commodity like grain, metals or energy is rather difficult. Eventually, you will have to sell to realize a profit and selling ends up increasing supply in an environment where suppliers are more likely to be maximizing their capacity through efficiency or shortcuts and consumers are more likely to be conserving. This will likely lead to a loss on the part of the one cornering the market and consumers, and a gain for suppliers. It only makes sense to try and corner the market if you’re a supplier like OPEC for instance. If you are a speculator, market timing (short or long) is your only real ally. Buy low and sell high.
All of this relates to the recent behavior of basic commodities markets in that it characterizes the situation and motivation of the players. Inflexibility on the part of suppliers and consumers means that basic commodities markets are inclined to go up very fast when supply gets tight and go back down fast when it becomes plentiful.
Compounding the price fluctuations due to inflexibility are those who might want to profit from the situation. Wall Street is certainly going to be attracted to the projected trends for commodity prices and invest accordingly. However, speculators represented a total investment in the low billions for this one commodity in a market of between 80-90 million barrels per day. At $100 oil, that is $8-9 billion per day. I would think that an organization that controls roughly 35% of oil production would have much more control over the market. There are also oil companies with incentives to delay delivering their products until prices go up. Suppliers will be inclined to make shortages worse when they occur because, unlike a speculator, they don’t have to play games to profit; they can collect the gains from rising prices through their regular sales. They have more incentive and more power to manipulate the markets, so why is all of the attention on Wall Street?
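The scale argument above is easy to check with arithmetic. The figures below are the rounded ones from the text (85 million barrels per day, $100 oil, a speculative position of about $5 billion).

```python
# Scale check on the oil-market numbers above (rounded, illustrative).
barrels_per_day = 85e6      # world oil market, ~80-90 million bbl/day
price = 100.0               # dollars per barrel
speculative_position = 5e9  # "low billions" of speculative investment

daily_market_value = barrels_per_day * price
print(daily_market_value / 1e9)  # 8.5 (billion dollars per day)

# The entire speculative position amounts to well under a single
# day's worth of physical trade in the commodity.
print(speculative_position / daily_market_value)  # ~0.59 days
```

Against a flow that large, a few billion dollars of paper positions looks like a rounding error next to a producer bloc controlling a third of supply.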
I have always found the heroic attempts to conserve endangered species to be a puzzling practice. The purported reasoning behind the practice is that we want to maintain genetic diversity in the species that inhabit our planet. I’m inclined to think that there is a substantial emotional component associated with a fear of loss. Fear of loss is particularly irrational in light of the fact that most every species that has ever existed on Earth is extinct. The ecological system can handle extreme temperature changes and killer asteroid strikes. It is the mass extinction events like these that have forced new species to emerge. The only way for a significant change to occur is to wipe out most everything and find out what makes it through the adversity. The status quo doesn’t produce new behavior.
There is no question that we are causing a major extinction event. We are consuming more of the world’s resources. We are changing the environment by digging up minerals and fuel, which leads to higher levels of these compounds in the surface environment. We transport species from one ecosystem to another because they are useful or because we are careless. All of this causes disruption, which most animal species can’t deal with. They are behaviorally hardwired to a much greater degree than we are. When they can’t find a niche, they are marginalized and face extinction.
This isn’t entirely bad. New niches open up in the ecosystem and allow new species to develop and thrive. Fortunately, there are biologists that agree with me, at least to some extent.
Instead of wasting effort trying to save failed species, we should be concentrating on studying the evolution associated with the extinctions that are occurring. We need to understand how to maintain an ecological equilibrium that is not hostile to us. We need to maintain populations of species we use as food, advantage microbes that protect us against those that are hostile and find ways to keep the populations of nuisance species down, or find a use for them.
Katherine suffers from rheumatoid arthritis in her hands, and has found that wearing tight bracelets or wrapping fingers with tape often reduces the amount of pain she feels. My uncle used to wear a copper bracelet, presumably for the same reason. My mother wraps her legs with compression bandages when her varicose veins cause her legs to hurt. She would also similarly wrap our legs as kids when we had growing pains. Athletes have been known to compress injured areas with tape or support braces. In the Olympics, volleyball players were seen with taped fingers and shoulders. In the Far East, acupuncture was developed to relieve pain. All of these treatments for pain have one thing in common. They involve touch stimulus at or near where the pain exists.
The subject of pain relief was restimulated for me last fall when I was introduced to the gate control theory of pain in a neurobiology class. At the simplest level, the notion is that normal touch known as somatosensation can interfere with nociception, the sensation of pain. This made me rethink why Katherine found pain relief from wearing bracelets. This prompted me to think further on all of the similar methods of pain control I had encountered as well as the tendency for someone that is exerting themselves to not feel low level chronic pain. From an evolutionary perspective it seems reasonable that feeling pain is generally not good when demands are being placed on the body. People have been taking advantage of this characteristic ever since.
The original theory was developed in the 60s, when it was well accepted that pain was part of the somatosensory system. Now, there are those who think nociception is exclusively part of the interoceptive system, also known as the visceral or autonomic nervous system. This is the system that helps us to maintain the environmental balance in our bodies. It tells us to eat and what to eat, to drink, sleep, breathe, warm up by shivering and finding warmth, to cool down by sweating and seeking shade, and much more. It basically maintains our bodies. It does this in part by producing input into our higher thought through emotion and is affected by emotional feedback.
This leads me to an interesting experiment published in 1997. If 1/3 of the neurons leading into a section of the somatosensory cortex (SI) of a monkey were visceroreceptive, it is clear that there is some overlap. Considering that there isn’t extensive communication between the SI and emotion centers such as the insula, it seems likely that the inhibition of nociception by somatosensation occurs in the spinal cord, as the gate theory suggests.
The next question of importance is: what is the mechanism at work with Kerri Walsh’s Kinesio tape? Its makers, like many before them, attribute their success to a special property of their proprietary method. In this case, they think it is from improved circulation. They even have a published study. From looking at the “sham” treatment on the third (390) page, it is evident that they aren’t differentiating their treatment from the gate control theory of pain. The “sham” isn’t producing tension, thus it isn’t significant enough stimulus for gate control to kick in.
I would go further and say that this is a poorly performed study for two reasons. First, their “sham” treatment is likely to not produce the placebo effect. A reasonable person might be inclined to laugh at such a crude tape job. Second, if I wanted to show a more significant improvement from one treatment over another, I would assign that treatment to more seriously injured individuals. If one looks at table 1 on the fifth (392) page, the KT group is the more injured group. My opinion of JOSPT is not high to say the least.
At the end of the day, the point is that Kinesio Tape is likely just a new incarnation of a trick that is probably thousands of years old. The difference is that maybe we are approaching the day when we will understand the underlying mechanisms.