Unsophisticated Olympic Coverage

I have been remiss in my work on Weordan and in posting on this blog, in large part because I have been watching the Olympics too much, and at the times when I’m usually most productive. I have seen NBC’s technical analysis fail in two different situations, which indicates a lack of technical aptitude. To a large degree, it reinforces the notion that while experience is respected, many people get a free ride based on experience rather than real ability.

The most recent failure I noticed occurred during Phelps’s 100m butterfly win. The people at NBC simply stated that Phelps must have won by a fingernail. The fact is that the sensor in the pool is touch activated and can’t be sensitive enough to be triggered by a fingernail’s touch; otherwise, water sloshing against it would set off false positives. It has to be significantly insensitive. Cavic might have touched first, but he didn’t touch hard enough to trigger the sensor. Because the sensors are there, the definition of finishing is touching the wall hard enough to trigger them.
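
As a rough sketch of the kind of thresholding such a touch pad has to perform (the pressure values, units, and threshold here are hypothetical, not the actual timing hardware):

```python
# Hypothetical threshold-based finish detection: readings below the threshold
# (water sloshing, a grazing fingernail) are ignored; only a firm touch
# registers a finish time.

THRESHOLD = 2.5  # arbitrary pressure units, purely illustrative

def finish_time(samples, threshold=THRESHOLD):
    """Return the time of the first sample whose pressure meets the threshold,
    or None if the wall was never touched hard enough."""
    for t, pressure in samples:
        if pressure >= threshold:
            return t
    return None

# A light first contact (1.1) doesn't register; the firm touch at 50.59 does.
readings = [(50.58, 1.1), (50.59, 3.4)]
print(finish_time(readings))  # -> 50.59
```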

The far uglier failure was with the gymnastics competition, especially the men’s. The difficulty portion of the scoring isn’t well calibrated between the events, which means that scores in certain events are generally higher than in others. Since the competitors are distributed between different stations, their scores aren’t comparable unless they are in the same group, or until the end. As I sat there listening to the technical commentary where this was being explained and a guess was being made, I wondered why they didn’t normalize the scores to get an idea of who was leading. This is often done when making financial judgements, where figures are seasonally adjusted because some seasons are more active than others. The same could be applied to gymnastics, which would have made it easier to tell what was going on and actually enjoy the competition instead of just wondering who was really ahead.
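
One simple way to do that normalization (a sketch with made-up scores, not real results) is to express each gymnast’s score as a z-score relative to that event’s field, so partial totals from differently calibrated apparatuses become roughly comparable:

```python
# Per-event normalization: each score is expressed in standard deviations
# above or below the mean for that apparatus, then summed per gymnast.
from statistics import mean, pstdev

scores_by_event = {
    # made-up numbers, purely for illustration
    "pommel_horse": {"A": 15.2, "B": 14.8, "C": 15.9},
    "rings":        {"A": 16.4, "B": 16.1, "C": 16.6},
}

def normalized_standings(scores_by_event):
    totals = {}
    for event, scores in scores_by_event.items():
        mu, sigma = mean(scores.values()), pstdev(scores.values())
        for gymnast, score in scores.items():
            totals[gymnast] = totals.get(gymnast, 0.0) + (score - mu) / sigma
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(normalized_standings(scores_by_event))  # a rough running order, not raw totals
```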

I would be extremely embarrassed if I were them.

One Strike Against Consumers

Our current economic situation is a result of two forces: commodity demand from the developing world, primarily China and India, and an unrestrained appetite for profit today. Analyses like this one in the NY Times look at economic forces in terms of the symptom rather than the cause.

The rising prices of commodities like food, fuel and building materials have been largely driven by demand in Asia, where large populations are seeing their disposable income increase. They may also be driven in part by market speculation: there have been abnormalities in grain prices, and there might be an oil bubble.

The housing bubble was driven by the same forces. The price of building materials increased due to increased building in Asia, as well as here and in many other countries. Speculators got into the market buying and selling real estate, as well as selling risky loans predicated on the notion that prices would go up forever, which would eliminate the risk of loss if borrowers were to default. The primary driver was a large appetite for profit today and very little consideration for the future.

While there is some element of coincidence to our current situation, it really boils down to one real problem: the current investment and management climate exaggerates the impact of major economic developments. It happened with the internet in the late 90’s, then moved on to housing and, more recently, commodities.

Cloud Computing Isn’t New and Isn’t the Answer

Over the last couple of weeks, there has been quite a bit written about cloud computing, most recently about the effort by Intel, HP and Yahoo. The article I linked to makes a very important point. The notion of a dumb terminal that has services delivered to it by a server is as old as computing itself. Whenever there is an application that can only be performed in a timely manner by expensive hardware, centralized computing becomes the computing paradigm for those that want to run that application. When the PC proved up to the task of word processing and spreadsheets in the early 80’s, server sales went into a nose-dive. There were still databases that needed to be managed centrally, so the server never went away, but its role was diminished. Currently, smart phones just don’t have the processing power to do a whole lot, but that will change rather rapidly, as I suggested a couple of weeks ago. We are getting to the point where many users don’t know what to do with the capabilities of a low-end PC or notebook. Why do we think inflicting the poor reliability of network access on computing would be better than the dominant decentralized paradigm?

Realistic Portrayal Through Gender Tendencies

When Katherine and I work on a book, we try to make sure that characters are distinctive and realistic. One of the most important ways we do this is to differentiate the way they look at the world, other people, and the problems that arise. My approach is to have a model of these sorts of behaviors that is linked in a credible manner to how people behave in the real world.

What I’ve learned from experience is well characterized by a specific encounter I had last fall in one of my classes. It was near the end of the semester, and a fellow grad student was worried about an assignment to write up a mock grant proposal. Her concern was that she hadn’t been given specific enough guidance to assure her that she would get a high grade on the assignment. I was less concerned, for two reasons. First, I’m going to do what I’m going to do regardless of the instructions. Second, it had become clear to me that this instructor was more concerned with us showing original thought in our proposals than with our conforming to some standard. I tried to convey to her that her concerns weren’t warranted in this situation, but failed to satisfy her. To some degree, it seemed that she was looking for an opportunity to vent her frustration rather than for a solution.

The differences between us were evident in more than just our approaches to the grant proposal assignment. Our behaviors were considerably different during the conversation. I have a tendency not to look at the face of the person I’m talking to because it is distracting. It takes considerable effort for me to translate my thoughts into words, and keeping track of the expression on someone’s face while doing so is too much multi-tasking for me. This may cause me to miss some part of what is going on with other people. However, I can still get an idea of what they are feeling, and whether they are being genuine, from the tone of their voice and my general assessment of them as a person. Meanwhile, she was intently studying my facial expression to the point of it being a bit uncomfortable.

My proposition is that the aforementioned fellow student and I represent fairly polarized points on a continuum where there is a trade-off between two modes of thinking. She operates in a mode of mostly associating details directly with experience to reach a conclusion. In contrast, I look at the relationships between the details themselves to recognize a system from which to draw a conclusion. Her approach is more common among women while mine is more common among men. However, there is significant overlap between the genders, with some hybridization of the two strategies going on in many people.

This model is supported by gender tendencies with respect to navigation. In humans, males tend to prefer a visual map when finding a destination, while females tend to prefer explicit step-by-step directions. Similarly, female and feminized male rats tend to rely upon landmarks when navigating mazes, while males and masculinized females tend to rely upon room geometry. Instructions and landmarks are essentially directly related to the goal: in this case, getting to the next landmark or step in the directions. This is similar to the previously proposed female tendency to identify direct relationships. The male tendencies also fit the aforementioned premise: having a map in one’s head or depending upon the shape of a room means working from the associations between multiple characteristics of the area being navigated.

This particular model best relates to writing in that it suggests what information is most important to a given character. It allows characters to be somewhat distinctive and internally consistent. When confronted by someone who is nervous, the more feminine character might be more inclined to notice specific behavioral abnormalities, such as tense muscles, that suggest something is amiss. Meanwhile, the more masculine character might be more focused on why the individual might be motivated to be deceitful and be watchful for behavior that confirms this. Depending upon the circumstances, one approach may be more effective than the other. In any case, the two will not be paying attention to the same information and can’t be written the same way. This is difficult to do when one is writing from the perspective of a character who thinks differently than oneself.

Increasing Demands on Cell Phones Will Advantage x86

We are fast approaching the point where cell phone sized devices will have sufficient processing power to perform functions traditionally executed by a personal computer. There is some question as to whether this is necessary or desirable. With access to the internet, the processing can be performed by machines on the network. This goes back to the conflict between centralized and decentralized processing. Centralized processing is desirable when the cost of processing power is high. In this particular case, the expectation would be that the cost of processing power is lower in a server form factor than in a cell phone. However, this solution carries costs relative to the decentralized one. First, reliability is lower, because the reliability of the network affects all processing tasks. Second, network traffic is increased, because communication must take place for all processing tasks in addition to other communication. Third, privacy is compromised, to the extent that personal information is stored on a server that is likely not under the user’s control and is transferred frequently between the mobile device and the server. For these three reasons, it is very desirable to have decentralized processing. Therefore, it will be desirable to have hand-held computing.

Given that hand-held computing will happen, with the only question being when, there is still the matter of what will become the dominant platform. Intel and AMD are reducing the power consumption of their x86 based microprocessors and platforms, while ARM based smart phones are becoming more powerful every year. On the software side, it will likely be Windows and Linux versus Nokia’s open source Symbian. Apple seems flexible enough to operate on either x86 or ARM, as they currently do with the Mac and the iPhone, though their software development is much more extensive on x86. Presuming that processing will migrate into mobile devices and will not become centralized, compatibility will remain important. This will lead to a natural advantage for x86 as functionality on cell phones comes to resemble regular computing.

There is also a question of performance. The challenge for x86 is to reduce power consumption down to a scale appropriate to a cell phone sized device. For ARM, the challenge is to scale up the capabilities of the architecture for general purpose computing. Intel has made it half-way by scaling processor power consumption down to 0.5W with Atom while retaining the ability to run Windows XP. That is low enough power consumption to be a viable smart phone chip. The holdup is the rest of the platform, which has been rather disappointing: Intel has failed to sufficiently reduce the size and power consumption of the chipset, Poulsbo. This could take a couple more revisions to iron out, a fact which may have motivated Apple to look for an alternative, possibly prompting the acquisition of P.A. Semi. It seems likely that the plan is to develop a high performance cell phone platform on either the Power or ARM architecture. It isn’t evident which, since P.A. Semi is rumored to be discontinuing support of their PWRficient processors based on the Power architecture. This implies that they are going in a different direction, possibly scaling ARM up instead of scaling Power down to meet cell phone sized computing needs. Needless to say, it will be interesting to see what develops, as there is considerable microprocessor design talent at P.A. Semi.

History Will Continue to Repeat Itself

We are on the cusp of a new era in computing. As microprocessors become more powerful and have more cores per die, there is less need for additional general purpose computational power. On the desktop, the computational load is primarily graphics, image processing, or encoding/decoding of music and video. These tasks are computation heavy and branch/logic light, much like traditional supercomputing. As a result, the major microprocessor producers have been moving toward more floating point computational power in their processors. IBM produced the Cell processor, a PowerPC core with eight simpler vector processing cores, which is the workhorse for the first petaflop computer; obviously, it is the fastest in the world. The top two graphics processing companies, nVidia and AMD, are also becoming more concerned with developing programming tools that allow the computational power of their graphics processors to be used for purposes other than graphics. Finally, Intel will be extending x86 for vector processing when they produce an x86 based graphics accelerator codenamed Larrabee.
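
To give a concrete feel for what “computation heavy and branch/logic light” means (a toy sketch, not any of the actual workloads above): the same floating point arithmetic is applied uniformly across a large array of data with essentially no data-dependent branching, which is exactly the pattern vector units and GPUs are built for.

```python
# Toy data-parallel workload: the same arithmetic applied to every element of
# a large array, with no per-element branching.
import numpy as np

samples = np.random.rand(1_000_000).astype(np.float32)

# e.g. a simple gain plus soft-clipping curve applied to an audio-like signal
processed = np.tanh(2.0 * samples - 1.0)

print(processed[:5])
```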

Computational power has always been important in research. Simulating nuclear devices is computation intensive, so the DOE has always had a top notch system in New Mexico. However, a new field is opening up that requires much more: biology. Specifically, the task of understanding protein folding and interaction. Stanford’s Folding@Home program asks people to lend the processing power their computers aren’t using to protein folding calculations. From the beginning, the PS3, which is powered by IBM’s Cell processor, has been a strong contributor to the program. Recently, they have also developed a client in CUDA, nVidia’s proprietary language, which promises to bring the substantially higher processing power of GPUs to bear on the protein folding problem.

The only problem with Folding@Home is that the processing power of any individual machine is so small, and the machines are so loosely coupled, that it is not really possible to simulate a significantly long folding sequence. At least, that is the claim made by D. E. Shaw. There is also an article in the New York Times, which is less technical.

More or less, Shaw’s argument is that a dedicated supercomputer is needed, and that he can produce a specialized ASIC that will do the job 1000 times faster than the processors used in current supercomputers in about 5 years. Unfortunately for him, while there will be an approximately 10x shrink in that time, supercomputers will be in excess of 100 times more powerful, possibly 1000. This is because all but one of the top supercomputers are powered by Intel Xeon, AMD Opteron or IBM Power processors. The emergence of the new Cell based system IBM built for the DOE, and Intel’s deals with Cray and Dreamworks, suggest that mainstream supercomputing will no longer be driven by just general purpose CPUs, which aren’t very efficient at raw computing. Larrabee will be a big part of this, as will Cell and CUDA. D. E. Shaw’s Anton is going to be yet another specialty chip that will be marginalized by higher volume processors.
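
To put rough numbers on that compounding (the annual growth factors here are assumptions for illustration, not measurements), sustained year-over-year growth in top-end machines multiplies out quickly over five years:

```python
# Back-of-the-envelope compound growth: how much faster the top machines get
# over five years under a few assumed annual growth factors.
for annual_growth in (1.5, 1.9, 2.5):
    print(f"{annual_growth}x per year -> {annual_growth ** 5:.0f}x over 5 years")

# 1.5x per year -> 8x over 5 years
# 1.9x per year -> 25x over 5 years
# 2.5x per year -> 98x over 5 years
```

Getting past 100x in five years implies sustained growth of roughly 2.5x per year or more, which is the kind of compounding you get when process shrinks and the shift to throughput-oriented accelerators stack on top of each other.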

MRI: the Particle Physics Laboratory for the Brain

Much like particle physics has particle accelerators such as the Large Hadron Collider at CERN and the Tevatron at Fermilab, neuroscience has MRI. All are high energy systems designed to induce and detect sub-atomic interactions. They all also require significantly complex mathematical tricks to discern useful information.

Increasingly powerful magnets have enhanced the sensitivity of MRI and have enabled new methods of analysis for learning about the structure of the brain. Voxel-based morphometry (VBM) is an analysis method which uses MRI to measure the volumes of gray and white matter structures in a living brain. I first encountered it while searching through the literature on the differences between typical male and female brain structure. It has also been used to recognize abnormalities in individuals with autism spectrum disorders and should prove useful in determining the relationship between mental performance and the structure of the brain.
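
The core measurement idea is simple, even though real VBM adds registration, segmentation, smoothing and statistics on top of it. A drastically simplified sketch, with a random array standing in for an actual segmented scan:

```python
# Drastically simplified tissue-volume estimate: count labeled voxels in a
# segmented MRI volume and multiply by the physical size of a voxel.
import numpy as np

GRAY, WHITE = 1, 2
voxel_volume_mm3 = 1.0                               # assuming 1 mm isotropic voxels
labels = np.random.randint(0, 3, (160, 200, 160))    # stand-in for a real segmentation

gray_ml = (labels == GRAY).sum() * voxel_volume_mm3 / 1000.0
white_ml = (labels == WHITE).sum() * voxel_volume_mm3 / 1000.0
print(f"gray matter: {gray_ml:.0f} ml, white matter: {white_ml:.0f} ml")
```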

On Monday, it was announced that a newer method, diffusion spectrum imaging (DSI), had been used to track white matter connections between different regions of gray matter in the cortex of the brain in five right-handed male subjects. This finding supports work published two years ago in Brain as well as the theory of Antonio Damasio. A little-studied region of the brain, the precuneus, is implicated by Damasio as a key player in consciousness, and it was the most widely connected part of the cortex in this latest DSI experiment. The notion is that an area key to consciousness would have to be connected to a wide swathe of cortex.
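
Finding “the most widely connected part of the cortex” boils down to a simple graph measure once the tractography has produced a region-to-region connectivity matrix. A toy sketch (the region names and numbers are hypothetical, not the study’s data):

```python
# Toy connectivity analysis: given a symmetric matrix of tract counts between
# cortical regions, find the most strongly connected region (the hub).
import numpy as np

regions = ["precuneus", "post_cingulate", "sup_parietal", "mpfc", "v1"]  # hypothetical labels
rng = np.random.default_rng(0)
raw = rng.integers(0, 50, (5, 5))
connectivity = np.triu(raw, 1) + np.triu(raw, 1).T   # symmetric, zero diagonal

strength = connectivity.sum(axis=0)                   # total tract count per region
hub = regions[int(np.argmax(strength))]
print(dict(zip(regions, strength.tolist())), "-> hub:", hub)
```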

While these first looks at what DSI can reveal are exciting, they are only a glimpse. The study was done on a group of five right-handed males who were instructed to stay alert with their eyes closed. The imaging was not functional, in that results were averaged over a period of time during which there might have been considerable fluctuations. I look forward to the fruits of DSI analysis as MRI technology improves and more extensive studies are conducted. Not only does it have the potential to pinpoint important structures involved in consciousness, it will also be useful in understanding higher level reasoning by allowing us to correlate mental performance with the strength of the relationships between regions of the cortex.

Using Fear of Identity Theft to Fix the Credit Industry

According to an article in the New York Times, there is an ongoing effort by the Information Card Foundation to establish a central manager for online accounts. On the surface, it seems like this might help close security holes in online accounts by making phishing mostly obsolete and eliminating the use of browser password tools. However, there is more to this initiative than protecting people from fraud.

Equifax, one of the three major credit bureaus, is a member of this foundation. They might be interested in the data related to financial transactions. Further, Google, Microsoft and Oracle are all heavily involved in writing search software, something that would be necessary to extract useful data from the service’s logs.

The motivation for a move like this is that there is little confidence in the credit bureaus. Lending institutions have pulled back their willingness to lend because they don’t have an accurate picture of who it is they are lending to. While a lot of the credit problems were caused by irrational expectations with regard to the future values of homes, there was also rampant credit rating manipulation in the sub-prime market, where credit repair services were and are being used to game the system. One strategy is to contest legitimate black marks on your credit report; if the credit bureau doesn’t investigate within 30 days, it has to remove them.

Lenders avoided looking too carefully at those they were lending to because they wanted to do more business and mistakenly thought there was little risk. Now, they still want to do more business so they can make a profit, but they also still feel the pain from their earlier mistakes. A credit bureau that differentiated itself by offering more information to skittish lenders might seize more business in a market that is probably not very good right now.

I am not fond of the notion of a centralized system keeping track of my activity online, so I’m not fond of this “product”. However, it is obvious that the credit bureaus have to improve the methods they use for compiling credit ratings. They currently put too much weight on having had significant debt in the past, which makes someone who has only had modest loans look an awful lot like someone with a checkered past: unreliable. They need a better look into all of the times someone pays a bill and a better idea of how someone handles their finances. This will require a better system. Maybe it will be this one, maybe another.

An Evolutionary View of Economics

I recently became aware of an article on the Singularity in the IEEE periodical Spectrum. It resembles to some extent my previous post on the topic, but misses some fine details. The key attribute of the explosions in growth is hierarchy, which is powered by specialists collaborating. The Industrial Revolution was a case of collaboration in a factory. What Hanson mistakes for an “agriculture” revolution was actually the beginning of civilization, which is collaboration in a town or city. Previous to that was collaboration between different regions of higher thought to create a complex world model. The step before that was collaboration between the internal and external senses to produce learning. Before that was the collaboration of cells to produce a multi-cellular organism. And the first step was the collaboration of molecules to produce a single-celled organism. There are nasty little exceptions like mitochondria and chloroplasts, which are technically subsumed in the cell of a larger organism. However, those can reasonably be lumped into the cells-collaborating category.

This makes six generations instead of the five mathematically estimated by Hanson. It might be argued that there is very little value in the human ability to model the environment beyond the fact that it leads to civilization. That would leave five steps, and considering that changes need to occur before a new level of hierarchy is actually added, 100k-200k years is probably not too long a preparatory period given that the previous step was hundreds of millions of years earlier. As a result, I can accept that there have been five hierarchical events.

The growth graph from DeLong suggests that we are on the cusp of another growth explosion. I think it is already happening. It is being driven by modern communications, which enable the worldwide scientific community to collaborate. As an example, even though designing electronics is getting harder, the rate of advance is speeding up. Electronic agents perform more complex tasks for us every year: they do searches for us, manage our computer hardware, gather items for shipping in warehouses, and many other things. Clearly, hordes of electronic specialists will be a big part of the next level of hierarchy, as will the communication system we use to interact with them. Beyond that, it will be rather difficult for an individual human to comprehend the next hierarchy, even if it is staring them in the face.