The Bell Tolls for Data Centers

In the good old days (the late 90s and most of the 2000s), data center operators loved selling individual cabinets to customers.  You could keep your prices high for the cabinet, sell power by the “breakered amp,” and try to maximize cross connects through the data center meet-me room.  All designed to squeeze the most revenue and profit out of each individual cabinet, with the least amount of infrastructure burden.

Fast forward to 2010.  Data center consolidation has become an overwhelming theme, emphasized by US CIO Vivek Kundra’s mandate forcing the US government, as the world’s largest IT user, to eliminate most of its more than 1,600 federally owned and operated data centers (consolidating to about a dozen), and to further promote efficiency by adopting cloud computing.

The Gold Standard of Data Center Operators Hits a Speed Bump

Equinix (EQIX) has a lot of reasons and explanations for their expected failure to meet 3rd quarter revenue targets.  Higher than expected customer churn, reduced pricing to acquire new business, additional accounting for the Switch and Data acquisition, etc., etc., etc…

The bottom line is that the data center business is changing.  Single-cabinet customers are looking at hosted services as an economical and operational alternative to maintaining their own infrastructure.  Face it, if you are paying for a single cabinet to house your 4 or 5 servers in a data center today, you will probably have a much better overall experience if you can migrate that minimal web-facing or customer-facing equipment into a globally distributed cloud.

Likewise, cloud service providers are supporting the same level of Internet peering as most content delivery networks (CDNs) and Internet Service Providers (ISPs), allowing the cloud user to relieve themselves of the additional burden of operating expensive switching equipment.  The user can still decide which peering, ISP, or network provider they want on the external side of the cloud; however, the physical interconnections are no longer necessary within that expensive cabinet.

Traditional data centers, Equinix included, are beginning to feel the move to shared cloud services through higher churn rates and lower sales rates for those individual cabinets or small cages.

The large enterprise colocation users and CDNs continue to grow larger, adding to their ability to renegotiate contracts with the data centers.  Space, cross connects, power, and service level agreements favor the large footprint and power users, and the result is that the data center is further becoming a highly skilled, sophisticated commodity.

The Next Generation Data Center

There are several major factors influencing data center planners today.  These include the impact of cloud computing, the emergence of containerized data centers, the need for far greater energy efficiency (often measured by PUE, or Power Usage Effectiveness), and the industry drive toward greater data center consolidation.
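
PUE is simply the ratio of total facility power to the power actually delivered to the IT equipment, so lower is better and 1.0 is the theoretical ideal. A quick sketch with made-up numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: a facility drawing 1,200 kW overall to support an 800 kW IT load
print(pue(1200.0, 800.0))  # 1.5 -- lower is better, 1.0 is the theoretical ideal
```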

Hunter Newby, CEO of Allied Fiber, strongly believes, “Just as in the last decade we saw the assembly of disparate networks into newly formed common, physical layer interconnection facilities in major markets, we are now seeing a real coordinated global effort to create new and assemble the existing disparate infrastructure elements of dark fiber, wireless towers and data centers. This is the next logical step and the first in the right direction for the next decade and beyond.”

We are also seeing data center containers popping up along long-haul fiber routes, adjacent to traditional breaking points such as in-line amplifiers (ILAs), fiber optic terminals (locations where carriers physically interconnect their networks either for end-user provisioning, access to metro fiber networks, or redundancy), and wireless towers.

So does this mean the data center of the future is not necessarily confined to large 500 megawatt data center farms, and is potentially something that becomes an inherent part of the transmission network?  The computer is the network, the network is the computer, and all other variations in between?

For archival and backup purposes, or caching purposes, can data exist in a widely distributed environment?

Of course, latency within the storage and processing infrastructure will still be bound by physics for the near term.  But for end-user applications such as desktop virtualization, there really isn’t any particular reason that we MUST have that level of proximity.  There are probably ways we can “spoof” the systems into thinking they are located together, and there are a host of other reasons why we do not have to limit ourselves to a handful of “Uber Centers…”

A Vision for Future Data Centers

What if broadband and compute/storage capacity become truly insulated from the user?  What if Carr’s ideas behind the Big Switch are really the future of computing as we know it, our interface to the “compute brain” is limited to dumb devices, and we no longer have to concern ourselves with anything other than writing software against a well-publicized set of standards?

What if the next generation of Equinix is a partner to Verizon or AT&T, and Equinix builds a national compute and storage utility distributed along the fiber routes that is married to the communications infrastructure transmission network?

What if our monthly bill for entertainment, networking, platform, software, and communications is simply the record of how much utility we used during the month, or our subscription fee for the month? 

What if wireless access is transparent, and globally available to all mobile and stationary terminals without reconfiguration and a lot of pain?

No more “remote hands” bills, midnight trips to the data center to replace a blown server or disk, dealing with unfriendly or unknowledgeable “support” staff, or questions of who trashed the network due to a runaway virus or malware commando…

Kind of an interesting idea.

Probably going to happen one of these days.

Now if we can extend that utility to all airlines so I can have 100% wired access, 100% of the time.

The Reality of Cloud Implementation Part 1 – Hosted Applications

As a business consultant providing direction and advice to both government and commercial clients, I find several topics continue to drive discussion, not only on short-term IT strategy, but also on the longer-term innovative contributions cloud computing can offer the organization.

However, to get the conversation moving forward, managers contemplating major architectural change to their IT organizations need to find a good reference or pilot project to justify the expense and contribute to change.  Not the preferred approach, but a reality.

One easy IT project is the move from workstation-based applications, primarily office automation suites, to server-based applications.  The choice is between applications hosted within a private (enterprise) network, or outsourcing the application to a commercial provider such as Microsoft Live Office or Google Apps.

Hosted applications make a lot of sense for most users.  It is a great idea to offload the burden of desktop application administration from IT managers when possible, with an expectation of the following:

  1. Greater control over intellectual property (files are stored on a central file server, not on individual hard drives and computers)
  2. Greater control over version and application code updates
  3. Greater control over security, anti-virus, and anti-spam definitions
  4. Application standardization (including organizational templates and branding)
  5. Better management of user licenses (and eliminating use of unauthorized or copied software)

If we look at the profiles of most organizational users, the vast majority are office workers who normally do not need to travel, access files or applications from home, or stay on call 24 hours a day.  Thus we can assume that, while at the office, computers are connected to a high-performance LAN, with high bandwidth and throughput within the organization.

If that assumption is correct, and the organization implements either an enterprise-hosted or commercially hosted suite (Google or Microsoft, for example), then those individual workstations can also stop keeping files on local drives (everything can be available and backed up on a file server), and use web-based applications for most activities.

The user’s relationship with the network, applications, and intellectual property is channeled through a workstation or web interface.  This also enables users, through use of VPNs and other access security, to use any compatible interface available when connecting to applications and files.  This includes home computers and mobile devices – as long as the data is retained on the host file server, and a record is created of all users accessing the data for both security and network/computer resource capacity management.

NOTE:  As a frequent traveler I also spend a considerable amount of time in airplanes, airports, and areas without easy access to the Internet or my file servers.  I do keep an image of MS Office on my laptop, and do have a very large hard drive, and do have a library of SD cards and flash drives for use when un-tethered from my web apps.  I don’t see this changing in the near future – however I am probably in a very small minority of professional road warriors who still justify use of local images.  Most do not.

An Unscientific Review of Web-Based Office Automation Applications

First, I am writing this blog entry using Microsoft’s Live Writer, a web/cloud-based application available for blog writers.  It is one application available within the Microsoft “Live-Everything” suite of web-based utilities, which includes office automation and social networking applications.

The Live Writer application connects with my blog provider (WordPress), downloads my blog profile, and uses that as a what-you-see-is-what-you-get editing interface.  I feel as if I am typing directly into my blog, without needing to understand HTML commands or other manual features.

Adding video, tables, tags, hyperlinks, and SEO tools is effortless.

Going further into my Microsoft Live Office account, I can upload, download, create, edit, and store documents in all MS Office formats, with the main free apps including Word, Excel, PowerPoint, and OneNote.  Mail, calendars, web sites, blogs – a variety of different utilities for personal and potentially professional use.

It is easy to share documents, create collaboration groups, and integrate MS Messenger-driven conferencing and sharing among other connected colleagues.  All of this is available as a free environment for any user, without the need to buy MS Office products for your personal computer.  Other commercial products offer a lot more utility; however, as a basic test environment, the performance of MS Live Office is more than adequate for probably 95% of office workers worldwide.

Face it, most of us rarely use anything beyond the most basic features of any office automation product, and purchasing licenses for individual office automation suites for each organizational user really only benefits the vendor.

Google Docs and the Google Apps engine provide features similar to the Microsoft suite, plus some unique features not currently available (or easily noticed) on the Live Office sites.  At a high level, Google provides network users:

  • Documents (word processing, spreadsheets, presentations)
  • Forms
  • Drawing/graphics
  • Templates
  • Blogs
  • Analytics
  • Lots of other stuff

In my absolutely unscientific testing of both Google and Microsoft web-based applications, I did not find a single feature which I normally use in preparing presentations, documents, and spreadsheets that could not be reproduced with the online edition.

If that is true for most users, then we can probably look toward a future where cloud-based and hosted office automation applications begin to replace software loaded on individual workstations.

The Danger of Easy Outsourcing

In a world of Service-Oriented Architectures (SOA) and close inter-relationships of data, care is needed to ensure our pilots do not create “islands of unconnectable data.”  Today, nearly all data is connectable, whether tables and forms within an email message, SMS messages, spreadsheets, databases, or any other potential SaaS application.

A word we need to keep in our IT vocabulary is “PORTABILITY.”  Anything we type into an application is a candidate for logging, inquiry, statistics, reporting, or other use of the data.  This is a concern when using SaaS applications not only for office automation, but for any other hosted application.

Any and all data we create must be available to any other application that can consume it, whether for organizational use or for industry community-of-interest applications.  We will look into the SaaS portability question in Part 2 of this series.
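
As a minimal, hypothetical sketch of what portability means in practice (the records, fields, and file names below are invented for illustration), data pulled out of a SaaS application can be written to open formats such as JSON and CSV that any other application can consume:

```python
import csv
import json

# Hypothetical records exported from a hosted (SaaS) application
records = [
    {"id": 1, "title": "Q3 budget", "owner": "finance", "modified": "2010-10-01"},
    {"id": 2, "title": "Travel policy", "owner": "hr", "modified": "2010-09-15"},
]

# Writing the same data in two open formats keeps it portable -- no single
# application "owns" the information, and any consumer can re-import it.
with open("export.json", "w") as f:
    json.dump(records, f, indent=2)

with open("export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "title", "owner", "modified"])
    writer.writeheader()
    writer.writerows(records)
```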

A Developing Country That Can Teach Hawaii An IT Strategy Lesson

Vietnam is in the process of upgrading the entire country’s IT systems. With support from organizations such as the World Bank, Vietnam is rebuilding not only physical infrastructure, but is also building new IT systems from the ground up – including a large-scale virtualization strategy.

Hawaii may not be so progressive. The first line of an Associated Press story on Hawaii’s lack of a functional IT strategy goes like this:

“In many ways Hawaii’s government runs its computers like the Internet age hardly happened.” (AP)

The story goes on to expose Hawaii’s lack of IT policy, the fact that agencies are using old systems and a mixture of Apple and PC machines for individual users, have a 1960s version of disaster recovery (offsite physical diskette storage), and other comparisons with industry that add more discouraging evidence to Hawaii’s IT shortfalls.

Sensationalizing the Obvious

I’ve always found it very easy to criticize. Perhaps the role of a journalist is to sensationalize the shortfalls of others, as people do tend to like watching others suffer – as long as the pain stays in somebody else’s life or reputation.

OK, so Hawaii does have some shortfalls in their IT systems. As a user, I have to say my experience using Hawaii’s eGovernment applications hasn’t been too bad. A plus in the Hawaii IT strategy column. I have never had an email rejected from a Hawaii state email server. Another plus. I could probably rack up a lot of pluses, but it is not sensational.

Now let’s look at the difficult side of journalism. Writing something positive and still trying to make it interesting to the readers.

Vietnam is an interesting case study. A larger population, and a lot more government than Hawaii. More problems to deal with – but the government is trying to drive the national IT strategy down to the city level, decentralizing actual applications and access as much as possible to promote the independence of provinces and cities – without disrupting the national IT plan to standardize IT management throughout government.

Nobody would ever suggest the US government try to standardize data strategies down to the state level, much less the city level; however, there is still an interesting lesson that can be applied from the Vietnam model.

Data format standards on a national scale can facilitate information sharing and data mining. We won’t go into the personal security issues of that statement in this article, however data format standardization is a good thing for government. The commercial world and manufacturing have had data format/classification standards for many years, including projects such as RosettaNet, XBRL, and UNSPSC.

Thus a driver’s license record in Danang would look identical to the same item in Hanoi – representing two very different provinces. Data can easily be shared as needed for identification, reporting, law enforcement, and other data transfers.
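
As an illustration only (the field names and sample values below are invented, not an actual Vietnamese standard), a shared national record layout might look something like this:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DriversLicense:
    # Hypothetical national-standard fields; every province emits the same layout
    license_no: str
    full_name: str
    date_of_birth: str   # ISO 8601 (YYYY-MM-DD)
    province_code: str   # e.g. "DN" for Danang, "HN" for Hanoi
    issue_date: str
    expiry_date: str

# A record issued in Danang and one issued in Hanoi share an identical structure,
# so either province's data can be exchanged or mined without translation logic.
record = DriversLicense("DN-123456", "Nguyen Van A", "1980-05-20", "DN",
                        "2010-01-15", "2020-01-15")
print(json.dumps(asdict(record), indent=2))
```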

Standardization is good.

Enter Virtualization and the Cloud

A government bureaucracy in a state like Hawaii has extended its inefficiencies into the world of IT, as the quotes included in the AP article suggest:

  • Hawaii’s department-by-department way of handling information would not work in the business world, where companies invested heavily in upgrades as the Internet and computers grew in importance.
  • It’s like we had all these little companies and they all grew at the same time, and then when the big company came along and merged everything, it never made the changes.

Well, even in deeply entrenched bureaucracies there has to be a scheduled refresh of technology at some point. Even those precious little Macs and PCs will eventually die, become so old they cannot even load a browser, or the state will grind to a halt because a day will come when no computer in the government will be able to open a Microsoft Word 2010 document.

Maybe, just maybe – much like the government of Vietnam has come to realize – that refresh strategy could include cloud computing. The city of Los Angeles has adopted cloud computing, and that city probably has a larger government and bureaucracy than the entire state of Hawaii.

The AP article mentions that Governor Lingle has tried to establish an Office of the CIO within Hawaii. Good idea. One that will ultimately save the state a lot of money. Let’s push our representatives to make that happen!

A Proposal

Now select a couple of good data center locations. A couple on Oahu, maybe one each on Maui and the Big Island. Start building cloud computing centers on each island, connect them via dedicated high-speed links, synchronize data and applications, then inform the state that all new editions of office automation software will be a hosted edition of Office 2010 or another high-performance hosted package.

Bang – saved money on license fees, labor for installers (those guys who are paid to update your anti-virus software and load service packs on your computer), and high performance desktop and laptop computers.

Start refreshing with dumb terminals and netbooks.

Establish a real state-wide disaster recovery model (a minimal sketch follows the list):

  • Cloud-based virtualized storage
  • Central cloud-based email system
  • Distributed DR model using network-based backups in geographically separated locations
  • Dumb terminals and netbooks back up to the centralized database and storage – not to local equipment (unless the worker is a traveler). Access to the data is still available from a distant-end location through the use of VPNs.
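
A minimal sketch of the network-based backup idea above; the hostnames and paths are hypothetical, and it assumes rsync over SSH is available between the island data centers:

```python
import subprocess
from pathlib import Path

SOURCE = Path("/srv/state-files")              # the central cloud file store, not local drives
REMOTE_TARGETS = [
    "backup@oahu-dc.example.gov:/backups/",    # hypothetical Oahu data center
    "backup@maui-dc.example.gov:/backups/",    # hypothetical Maui data center
]

def replicate() -> None:
    """Push the central file store to geographically separated sites over the network."""
    for target in REMOTE_TARGETS:
        # rsync transfers only changed data, keeping inter-island traffic modest
        subprocess.run(["rsync", "-az", "--delete", f"{SOURCE}/", target], check=True)

if __name__ == "__main__":
    replicate()
```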

Retrain the IT staff on developing applications in the cloud – not on under-the-desk servers.

Could it really be that simple? Actually – yes. In addition, if the state of Hawaii can build a storefront of applications (including Office 2010-like products), make those applications available to users on a state-wide basis, and reduce provisioning time for applications from months to minutes, why wouldn’t we consider this as an alternative to the situation Sen. Donna Mercado Kim (D, Kalihi Valley-Halawa) described: “Every department has IT (information technology) people, and they each have their own way of doing things.”

Nonsense

Very 1970s… So not 2020s…

Vietnam is rebuilding its national infrastructure, and the US government, under the direction of CIO Vivek Kundra, is rebuilding the national IT strategy. Hawaii can rebuild ours as well. And we have great examples and precedents to learn from.
