Monday, May 18, 2009

Google's demise - a possible theory

History has shown that nothing is eternal and that every real and abstract construct goes through a cycle of birth and death. I understand that this statement lacks a formal argument chain, which makes it susceptible to ridicule by a lot of people. But I urge everyone to humor me: while this belief is what got me thinking about the topic, it is not the central theme of this post.

The above-mentioned belief made me start to look at the forces that may cause Google's downfall. In addition to that belief, I have been noticing that Google services are having outages that are getting progressively more conspicuous, even though the availability statistics might be improving. I just faced a 503 Server Error (9:00 AM EST, May 18, 2009). This is on top of an outage reported just yesterday.

I recently heard a talk on the fall of civilizations, and one of the themes highlighted was that the cause of a civilization's failure is often the very reason the civilization was successful in the first place. Arguably, Google's success can in large part be attributed to the presence of an extraordinarily large number of extraordinarily smart people. They have built a large, proprietary, and hugely sophisticated system that is possibly unparalleled in its complexity in the modern business world. In this respect it resembles the financial markets, which have been made complex by an intricate web of really complicated financial instruments. Even the smartest people, with enormous talent, could not stop the systemic failure in the recent financial crisis. If anything, the smarts arguably contributed to increasing the risk of a systemic failure.

I believe that Google is susceptible to the risk posed by the complexity of its ever-evolving system. This system will continue to demand more and more high-horsepower intellect, which will become increasingly difficult to attract over time due to various competitive pressures and Google's waning charms. I hear that Google's search algorithms already use over 600 signals to rank search results. While most of us view this as formidable IP that is hard for anyone to replicate, let alone better, I believe there is a risk of this beast becoming increasingly elusive even to the core team at Google, which will start to affect its quality.

If I were Google, I'd employ some of those smarts to look at this specific risk: the risk of failure due to increasing systemic complexity. In my experience, Google is unique in bringing such sophistication to the consumer space while dealing with phenomenally quick changes in the problem space. One can argue that sending a spaceship into orbit is a more complex enterprise. I will not argue the complexity of that system. The competitive and problem dynamics, though, are relatively static, allowing engineers to design for a relatively unchanging problem space. The complexity dimension of the problem is not amplified by competitive and problem-space dynamism.

Google doesn't have that luxury. Taleb views efficiency as a form of leverage. I'd extend that to include complexity as a form of leverage and hence a contributor to risk.