Maintain or Refresh – the IT Dilemma Meets Cloud Computing
January 18, 2010
Emerging technologies have always forced business decision-makers to choose: embrace a new technology as a first-mover, or maintain existing technology. Each path carries risk: does maintaining existing technology result in higher maintenance and operational expenses, or does acquiring new technology put an unwarranted capital and process-change burden on the organization?
Roughly 15 years ago the Northern Telecom (Nortel) DMS 100/250/300/500 line of digital telephone switches represented some of the finest technology available for digital communications. The cost was high, but the technology promised telecom carriers everything they would need to operate their networks well into a next generation that had no real time horizon attached, at least not in the marketing PowerPoint slides. Buy a DMS 500, and you would be running it for a couple of decades.
Then seemingly overnight the Internet matured, bringing communications applications such as Voice over Internet Protocol (VoIP), Skype, Vonage, and other Internet-enabled utilities. Suddenly the DMS, 5ESS, 4ESS, NEAC, and DSC switches were all obsolete, replaced by simple Internet-friendly communication applications or Internet Protocol-based “soft switches” that managed telephony over IP with a form factor about the size of a mini-refrigerator and 100 times the switching capacity.
And, as with all soon-to-be-obsolete technologies, the cost of maintaining the legacy system, finding spare parts for it, and even finding operators trained on it quickly pushed owners past a point of acceptable risk. The old telephone switches are now most often found in landfills, gone forever.
Traditional telecommunication transmission protocols such as SDH and SONET began falling to Ethernet, and within about five years (2003~2008) the “legacy” telephone technologies quickly faded into historical Wikipedia entries.
The Cloud Computing Analogy
We are entering a period of “plentitude” in cloud computing. The “Law of Plentitude” is loosely defined as a threshold of acceptance (of a process, technology, system, etc.) beyond which not adopting puts an entity at greater risk than adopting at the point of emergence. In technology we normally place the “Law of Plentitude” threshold at around 15~20% diffusion into a selected environment, community, industry, or organization.
For example, when the fax machine was first introduced there was a single machine. By itself it was not useful, as there was nobody on the distant end to fax images to. With two fax machines it became more useful, forming a community of two. The combinatorial growth becomes obvious at 4 users: with N users there are N(N-1)/2 potential relationships, so 4 users yield an addressable community of 6. And the community keeps growing from there.
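The pairwise-connection arithmetic above can be sketched in a few lines. Here is a minimal Python illustration (the function name is ours, purely for illustration) of how the addressable community grows with the number of fax machines:

```python
def potential_relationships(n: int) -> int:
    """Distinct pairwise connections among n fax machines: n*(n-1)/2."""
    return n * (n - 1) // 2

# One machine can reach nobody; each new member adds n-1 new connections.
for n in (1, 2, 4, 10, 100):
    print(f"{n:>3} machines -> {potential_relationships(n)} potential relationships")
# 4 machines yield 6 potential relationships, as noted above;
# 100 machines already yield 4,950.
```

Each new participant adds N-1 new connections, which is why the value of joining the community accelerates as the community grows.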
At “plentitude,” you are at risk if you have not acquired a fax machine, because your community has adopted fax machines to the point that you must be able to communicate by fax or find yourself in jeopardy of losing your place in the community.
It can now be argued that cloud computing is quickly starting to reach a level of “plentitude.” Communities of interest are emerging within clouds, allowing near zero-latency in user-to-user transaction time. Think of a financial trading community. Zero-latency means zero transaction delays. At some point if you are not in the zero-latency community, your operation is at risk of either losing business, or being expelled by other members of the community who do not want to deal with your latency.
Think of companies outsourcing their IT infrastructure to a commercial cloud service provider, or even building their own internal enterprise cloud infrastructure. If all other things are equal, and the cloud-enabled company is able to avoid the cost of building its own data center, reduce operational expenses, and quickly expand or shrink its processing capacity on demand, then it may have more resources left over for research and development or product production.
Think of the companies still running DMS 500s in 2009, versus their competitors running far more powerful, and cheaper, soft switches. We could produce a roll call of regional telephone companies that closed their doors over the past few years simply because they could not compete with next-generation technology.
The Cloud Computing “Plentitude” Target
The trick, of course, is to time your refresh, through a well-managed business case and review, as close to the plentitude “risk threshold” as possible. This will help ensure you do not fall prey to a bad technology, that you are able to see the industry trend toward adopting a new one, and that your competition does not leave you suffering through a last-minute technology refresh.
Cloud computing and data center outsourcing may not be the ultimate technology refresh, and they still have a number of issues to resolve (security, compliance, data center stability, etc.). However, the trend is clear: companies are outsourcing to commercial cloud service providers, and enterprise virtualization is on the mind of every IT manager and CFO in the business community.
If your company or organization has not yet started the review process, the technology refresh process, and the planning process to determine if and when cloud adoption is right for your company, we would strongly encourage that process to begin. Now.
If nothing else, you owe it to yourself and your organization to ensure you are not caught on the bad side of plentitude.