Thinking about Moore’s Law
I think most technology people believe they understand Moore’s Law. It is not really a law; it was an observation of technology and product cycles. Over time, the repetitive consistency of the observation turned the perception into a law. I have been thinking about Moore’s Law not in terms of the semiconductor product cycle, but rather in terms of how the effect of Moore’s Law (ML) has been extrapolated and extended across so many segments of the technology ecosystem.
My first thought about Moore’s Law is how much I dislike how it is used. It has become a classic technology catchphrase like “killer app.” I think it has become a crutch for people to use in place of empirical evidence and original thought. How many times have you read or heard a technology executive recite Moore’s Law as an answer to a question, or read a market prediction based on ML? When I hear technology executives recite Moore’s Law as an answer to a question, I shut down. Unless the executive is a semiconductor person, I do not want to hear ML as the reason for X or the answer for Y.
My second thought is that I think we are approaching the exhaustion point of Moore’s Law. I will call this Moore’s Law Exhaustion (MLE). To be clear, I am not referring to the semiconductor product cycle; I am referring to the assumption that technology doubles every 2-5 years and that, by the time this doubling enters commercial production, it is the catalyst for technology upgrade cycles. Many of the market-leading technology companies of the last twenty years were built on the assumption that Moore’s Law applied to their markets. For the last twenty years this has been a safe assumption, but there will be a point when we have the black swan moment and many people will realize that we tend to (i) measure the wrong metrics, (ii) incorrectly analyze the model that includes the wrong metrics and (iii) incorrectly interpret the results of the data gathering and analysis. These three points can be grouped under one phrase: the problem of induction.
Many people will think that it is crazy to bet against the great force of technology called Moore’s Law. This reaction would be logical. I wrote in an earlier post about the “Need for Thought Anchors: The process of making decisions is most comfortable when we can associate the decision with that which is familiar. We tend to justify a decision by saying, “this is just like when we made the decision to do X last year.” We begin with what is familiar and try to associate the future with the past. John G. Stoessinger described this action as image transfer because “…policy makers often transfer an image automatically from one place to another or from one time to another without careful empirical comparison,” [see Nations in Darkness, John G. Stoessinger, 1971, page 279]. Dr. David D. Burns described this behavior as “all-or-nothing thinking” or “labeling” behavior [see The Feeling Good Handbook, David D. Burns, 1989, pages 8-11]. It is much easier for humans to build their decision-making process around thought anchors than to pursue unfamiliar thought positions.” Unfortunately for those who find comfort in thought anchors, the past is the past, and the future holds no guaranteed dependence on the past. This can also be called the problem of survivorship bias: a large body of data that we know well (i.e. a group of successful technology companies over the past twenty years) fools us because we tend to gravitate toward the data set we know well (i.e. thought anchors) and fail to seek out new data, or ignore the new data set completely.
I will stop for a minute and use two quotes from Nassim Taleb’s book Fooled by Randomness. This is a book I reread every year or two because it keeps me honest. My copy is full of yellow sticky notes highlighting the parts of the book that I find important. Here are the two quotes:
- “We take past history as a single homogeneous sample and believe that we have considerably increased our knowledge of the future from the observation of the sample of the past…in other words, what if things have changed?” [see Fooled by Randomness, Nassim Nicholas Taleb, 2001, page 97].
- “If people were rational then their rationality would cause them to figure out predictable patterns from the past and adapt, so that past patterns would be completely useless for predicting the future.” [see Fooled by Randomness, Nassim Nicholas Taleb, 2001, page 98].
What is Moore’s Law Exhaustion? My supposition around the exhaustion of Moore’s Law has to do with what I think is a break with the past that is becoming more visible every day, yet is still relatively unknown and certainly unaccepted. It is unknown because it is unfamiliar. It is different. It cannot be framed in the same patterns as the past. In short, I think we are measuring the wrong metrics, incorrectly analyzing the wrong data set and drawing false conclusions. I started by saying that ML was a product cycle observation. I am a big believer in product cycles and have written about them on this blog, but I find it hard to believe that the broader technology markets will not be affected by the exhaustion of Moore’s Law. I assume that ML will continue in the semiconductor domain, but I think ML’s perceived applicability to the technology domains outside of semis is closer to the end than most realize.
For a simple framing exercise I will use networking companies to describe my supposition around MLE, but I could just as easily choose PC manufacturers, mobile devices or even PC software. For much of my adult life, across six networking companies, the business plan was based on (1) faster connection speeds, (2) integration of services and (3) technology elements (new chips, lasers or protocols) that would allow the company to build a product achieving #1 and #2, thus capitalizing on an upgrade cycle by our customers. Note that it is this very upgrade cycle that was always described as a product of Moore’s Law, because networking companies build their new products around the next fastest chip. This is why you see so many networking companies with in-house silicon teams, and why they now often present processor roadmaps that go out five or even ten years. Why would a company present such a plan if it were not counting on the sustainability of Moore’s Law?
Over the past twenty years, the next-fastest-chip strategy enabled networking companies to consistently capture the upgrade cycle: SNA to multiprotocol driven by client/server; 1.2k to 2.4k to 56k to xDSL and DOCSIS 2.0 and now FTTx in the consumer WAN; kilobit circuits to frame relay and then ATM in the commercial WAN; 10M to 100M to 1G to 10G and now 100G in the Ethernet LAN; 2.5G to 10G to 40G and now 100G in the optical domain. That is a summary of the last twenty years of networking. It is a cycle that runs 5-7 years in length and ebbs and flows with the economy. In between the cycles, companies are created that provide transition technologies like data de-duplication, compression, application acceleration and switched digital video, but these companies are really transition stages between cycles. Why do I think we are approaching MLE for networking companies? The reasons are many, but let me start with a few observations.
- Raise your hand if the price of your broadband connection has increased. I have been traveling around asking this question, and the vast majority of respondents answer no. In the consumer market, bandwidth prices appear static; if you adjust for the amount of bandwidth provided, they are deflationary. In the enterprise, pricing trends are clearly deflationary. Deflationary pricing trends do not lead to ARPU growth for service providers.
- The next trend is bandwidth metering and consumption charges. These two facts of life should be a “Danger, Will Robinson” moment for all the ML people. Again this leads back to ARPU: service providers are finding it difficult to build profitable Ethernet services on costly network infrastructure.
- From a macro perspective, we can look at consumption and debt trends and conclude that a substantial period of debt reduction lies ahead now that the catch-up spend is complete.
- When I speak with service providers, enterprises and data center companies, I develop a sense that the problem is not a network upgrade problem. There is a long tail of network infrastructure, built and sold on the premise of Moore’s Law, that needs to be worked off. I am supposing that the next step in the network is not another layer of hardware on top of hardware simply because it is faster and more integrated. The problem to be solved in the network lies elsewhere, and I have stated this before here and here.
- To make the network more efficient and more profitable, there will be a transition from physical connections and upgrades to logical connections and management. This is the next phase of innovation and network gain (e.g. statistical multiplexing). The new phase of the network will tie the compute element tightly to the network element.
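The per-megabit deflation in the pricing observation above can be made concrete with a quick calculation. The prices and speeds below are hypothetical, chosen only to illustrate the effect of a flat monthly bill against steadily growing provisioned bandwidth:

```python
# Hypothetical plan data for illustration only -- not actual market figures.
# A flat monthly bill over a period when provisioned bandwidth grows
# is deflationary on a per-megabit basis.

plans = [
    # (year, monthly_price_usd, downstream_mbps) -- assumed values
    (2000, 50.0, 1.5),
    (2005, 50.0, 6.0),
    (2010, 50.0, 25.0),
]

for year, price, mbps in plans:
    per_mbps = price / mbps
    print(f"{year}: ${price:.2f}/mo for {mbps} Mbps -> ${per_mbps:.2f} per Mbps")
```

The headline price never moves, yet the effective price per megabit falls by more than an order of magnitude, which is exactly why flat subscriber bills translate into flat or falling ARPU.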
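As a rough sketch of the statistical multiplexing point in the last bullet: the gain comes from provisioning for the peak of the aggregate rather than the sum of individual peaks. The source count, burst rate and duty cycle below are assumptions for illustration, not measurements:

```python
import random

# Minimal sketch of statistical multiplexing gain, with assumed parameters.
random.seed(42)

SOURCES = 50        # bursty flows sharing a link
SAMPLES = 10_000    # time slots simulated
PEAK_MBPS = 10.0    # each source bursts to 10 Mbps...
DUTY_CYCLE = 0.1    # ...but is active only 10% of the time

# Aggregate demand in each time slot: sum the rates of the active sources.
aggregate = [
    sum(PEAK_MBPS for _ in range(SOURCES) if random.random() < DUTY_CYCLE)
    for _ in range(SAMPLES)
]

sum_of_peaks = SOURCES * PEAK_MBPS   # capacity if every flow gets its peak
peak_of_sum = max(aggregate)         # capacity if flows share one logical pipe

print(f"Sum of peaks:      {sum_of_peaks:.0f} Mbps")
print(f"Peak of aggregate: {peak_of_sum:.0f} Mbps")
print(f"Mux gain:          {sum_of_peaks / peak_of_sum:.1f}x")
```

The shared pipe needs far less capacity than the sum of the individual peaks; logical connections and management capture that gain without another round of faster hardware.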
On second thought, I could be wrong, but I offer some insight from Taleb. On page 87 of Fooled by Randomness, Taleb tells a humorous story about a Wall Street strategy session. He was short the market, but when asked whether the market was going up or down, he said with confidence that he thought the market had a 70% chance of going up the next week. When others asked if he had changed his position, he said no. What Taleb explains is that he thought the market would go up the following week, but it might go up a whole 100 bps, while when it went down, it would go down by 1000 bps.
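The arithmetic behind Taleb’s position is worth making explicit: a 70% chance of a small gain still carries a negative expectation when the 30% case is ten times larger. Using the probabilities and magnitudes from the anecdote above:

```python
# Taleb's asymmetric bet restated as a simple expected-value calculation.
p_up, move_up = 0.70, +100       # 70% chance the market rises ~100 bps
p_down, move_down = 0.30, -1000  # 30% chance it falls ~1000 bps

expected_move = p_up * move_up + p_down * move_down
print(f"Expected move: {expected_move:.0f} bps")  # Expected move: -230 bps
```

Being short while expecting the market to rise is perfectly consistent: the most likely outcome and the expectation-weighted outcome point in opposite directions.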
* It is all about the network, stupid, because it is all about compute. *
** Comments are always welcome in the comments section or in private. Just hover over the Gravatar image and click for email. **