Data Center Consolidation and Adopting Cloud Computing in 2013

Throughout 2012 large organizations and governments around the world continued to struggle with the idea of consolidating inefficient data centers, server closets, and individual “rogue” servers scattered around their enterprise or government agencies.  The issues ranged from the cost of operating data centers, to disaster management of information technology resources, and of course to human factors centered on control, power, or retention of jobs in a rapidly evolving IT industry.

Cloud computing and virtualization continue to have an impact on all consolidation discussions, not only from the standpoint of providing a much better model for managing physical assets, but also in the potential cloud offers to solve disaster recovery shortfalls, improve standardization, and encourage or enable development of service-oriented architectures.

Our involvement in projects at the local, state, and national government levels, both in the United States and in other countries, indicates a consistent need to address the following concerns:

  • Existing infrastructure, both IT equipment and facilities, is reaching the end of its operational life
  • Collaboration requirements between internal and external users are expanding quickly, driving an architectural need for interoperability
  • Decision support systems require access to both raw data and “big data”/archival data

We would like to see an effort within the IT community to move in the following directions:

  1. Real effort at decommissioning and eliminating inefficient data centers
  2. All data and applications should fit into an enterprise architecture framework – regardless of the size of the organization or data
  3. Aggressive development of standards supporting interoperability, portability, and reuse of objects and data

Despite the very public failures experienced by cloud service providers over the past year, the reality is that cloud computing, as an IT architecture and model, is gaining traction, and is not likely to go away any time soon.  As with any emerging service or technology, cloud services will continue to develop and mature, reducing the impact and frequency of failures.

Why would an organization continue to buy individual high powered workstations, individual software licenses, and device-bound storage when the same application can be delivered to a simple display, or a wide variety of displays, with standardized web-enabled cloud (SaaS) applications that store mission critical data images on a secure storage system at a secure site?  Why not facilitate the transition from CAPEX to OPEX, license to subscription, infrastructure to product and service development?

In reality, unless an organization is in the hardware or software development business, there is very little technical justification for building and managing a data center.  This includes secure facilities supporting military or other sensitive sites.

The cost of building and maintaining a data center, compared with either outsourcing into a commercial colocation site or virtualizing data, applications, and network access requirements, has gained the attention of CFOs and CEOs, requiring IT managers to more explicitly justify the cost of building internal infrastructure vs. outsourcing.  This is quickly becoming a very difficult task.

Money spent on data center infrastructure is money lost to the organization.  The cost of labor is high; the cost of energy, space, and maintenance is high.  This is money that could be better applied to product and service development, customer service capacity, or other revenue and customer-facing activities.

The Bandwidth Factor

The one major limitation the IT community will need to overcome as data center consolidation continues and cloud services become the norm is bandwidth.  Applications such as streaming video, unified communications, and other data-intensive services will need more bandwidth.  The telecom companies are making progress, having deployed 100Gbps backbone capacity in many markets.  However, this capacity will need to continue growing quickly to meet the needs of organizations accessing data and applications stored or hosted within a virtual or cloud computing environment.

Consider a national government’s IT requirements.  The government, like most, is based within a metro area.  The agencies and departments consolidate their individual data centers and server closets into a central facility or a reduced number of facilities.  Government interoperability frameworks begin to make small steps toward allowing cross-agency data sharing, and individual users need access to a variety of applications and data sources to fulfill their decision support requirements.

Consider, for example, a GIS (Geospatial/Geographic Information System) with multiple demographic or other overlays.  Individual users will need to display data that may be drawn from several data sources, through GIS applications, presenting a large amount of complex data on individual display screens.  Without broadband access both between the user and the application, and between the application and its data sources, the result will be a very poor user experience.

Another example is using the capabilities of video conferencing, desktop sharing, and interactive persistent-state application sharing.  Without adequate bandwidth this is simply not possible.

Revisiting the “4th Utility” for 2013

The final vision on the 2013 “wishlist” is that we, as an IT industry, continue to acknowledge the need for developing the 4th Utility.  This is the idea that broadband communications, processing capacity (including SaaS applications), and storage are the right of all citizens.  Much like the first three utilities of roads, water, and electricity, the 4th Utility must be a basic part of all discussions related to national, state, or local infrastructure.  As we move into the next millennium, Internet-enabled communications, or something very much like them, will be an essential part of all our lives.

The 4th Utility requires that high capacity fiber optic infrastructure and broadband wireless be delivered to any location within the country which supports a community or individual connected to a community.  We’ll have to pay a fee to access the utility (the same as other utilities), but it is our right and obligation to deliver the utility.

2013 will be a lot of fun for us in the IT industry.  Cloud computing is going to impact everybody – one way or the other.  Individual data centers will continue to close.  Service-oriented architectures, enterprise architecture, process modeling, and design efficiency will drive a lot of innovation.  We’ll lose some players, gain some players, and we’ll be in a better position at the end of 2013 than we are today.

5 Data Center Technology Predictions for 2012

2011 was a great year for technology innovation.  The science of data center design and operations continued to improve, the move away from mixed-use buildings used as data centers continued, the watts/sqft metric took a back seat to overall kilowatts available to a facility or customer, and the idea of compute capacity and broadband as a utility began to take its place as a basic right of citizens.

However, there are 5 areas where we will see additional significant advances in 2012.

1.  Data Center Consolidation.  The US Government admits it is using only 27% of its overall available compute power.  With 2094 data centers supporting the federal government (from the CIO’s 25 Point Plan to Reform Federal IT Management), the government is required to close at least 800 of those data centers by 2015.

The lesson is not lost on state and local governments, private industry, or even internet content providers.  The economics of operating a data center or server closet, whether in the costs of real estate, power, and hardware, or in service and licensing agreements, are compelling enough to make even the most fervent server-hugger reconsider their religion.

2.  Cloud Computing.  Who doesn’t believe cloud computing will eventually replace the need for server closets, cabinets, or even small cages in data centers?  The move to cloud computing is as certain as the move to email was in the 1980s.

Some IT managers and data owners hate the idea of cloud computing, enterprise service buses, and consolidated data.  It is not so much an issue of losing control, but in many cases because it brings transparency to their operation.  If you are the owner of data in a developing country, and suddenly everything you do can be audited by a central authority – well, it might make you uncomfortable…

A lesson learned while attending a  fast pitch contest during late 2009 in Irvine, CA…  An enterprising entrepreneur gave his “pitch” to a panel of investment bankers and venture capital representatives.  He stated he was looking for a $5 million investment in his startup company. 

A panelist asked what the money was for, and the entrepreneur stated “.. and $2 million to build out a data center…”  The panelist responded that 90% of new companies fail within 2 years.  Why would he want to be stuck with the liability of a data center and hardware if the company failed? The gentleman further stated, “don’t waste my money on a data center – do the smart thing, use the Amazon cloud.”

3.  Virtual Desktops and Hosted Office Automation.  How many times have we lost data and files due to a failed hard drive, stolen laptop, or virus disrupting our computer?  What is the cost or burden of keeping licenses updated, versions updated, and security patches current in an organization with potentially hundreds of users?  What is the lead time when a user needs a new application loaded on a computer?

From applications as simple as Google Docs, to Microsoft 365 and other desktop replacement application suites, users will become free from the burden of carrying a heavy laptop computer everywhere they travel.  Imagine being able to connect your 4G/LTE phone’s HDMI port to a hotel widescreen television monitor, and be able to access all the applications normally used at a desktop.  You can give a presentation off your phone, update company documents, or perform nearly any other IT function, with the only limitation being a requirement to access broadband Internet connections (see #5 below).

Your phone can already connect to Google Docs and Microsoft Live Office, and the flexibility of access will only improve as iPads and other mobile devices mature.

The other obvious benefit is that files will be maintained on servers, where they are much more likely to be backed up and included in a disaster recovery plan.

4.  The Science of Data Centers.  It has only been a few years since small hosting companies were satisfied to go into a data center carved out of a mixed-use building, happy to have access to electricity, cooling, and a menu of available Internet network providers.  Most rooms were designed to accommodate 2~3kW per cabinet, and users installed servers, switches, NAS boxes, and routers without regard to alignment or power usage.

That has changed.  No business or organization can survive without a 24x7x365 presence on the Internet, and most small enterprises – and large enterprises – are either consolidating their IT into professionally managed data centers, or have already washed their hands of servers and other IT infrastructure.

The Uptime Institute, BICSI, TIA, and government agencies have begun publishing guidelines on data center construction providing best practices, quality standards, design standards, and even standards for evaluation.  Power efficiency metrics such as PUE and DCiE provide additional guidance on power management, data center management, and design.
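
As a rough illustration of how these two efficiency metrics relate (a minimal sketch; the facility and IT load figures are hypothetical, not drawn from any published standard):

    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        """Power Usage Effectiveness: total facility power divided by IT equipment power."""
        return total_facility_kw / it_load_kw

    def dcie(total_facility_kw: float, it_load_kw: float) -> float:
        """Data Center infrastructure Efficiency: the inverse of PUE, expressed as a percentage."""
        return (it_load_kw / total_facility_kw) * 100

    # Hypothetical example: a facility drawing 1,500 kW in total to deliver 1,000 kW to IT equipment
    print(pue(1500, 1000))   # 1.5
    print(dcie(1500, 1000))  # ~66.7%

A lower PUE (closer to 1.0) means less power is lost to cooling and distribution overhead for every watt delivered to the IT equipment itself.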

The days of small business technicians running into a data center at 2 a.m. to install new servers, repair broken servers, and pile their empty boxes or garbage in their cabinet or cage on the way out are gone.  The new data center religion is discipline, standards, discipline, and security. 

Electricity is as valuable as platinum, just as cooling and heat are managed more closely than inmates at San Quentin.  While every other standards organization is now offering certification in cabling, data center design, and data center management, we can soon expect universities to offer an MS or Ph.D in data center sciences.

5.  The 4th Utility Gains Traction.  Orwell’s “1984” painted a picture of pervasive government surveillance, and incessant public mind control (Wikipedia).  Many people believe the Internet is the source of all evil, including identity theft, pornography, crime, over-socialization of cultures and thoughts, and a huge intellectual time sink that sucks us into the need to be wired or connected 24 hours a day.

Yes, that is pretty much true, and if we do not consider the 1000 good things about the Internet vs. each 1 negative aspect, it might be a pretty scary place to consider all future generations being exposed and indoctrinated.  The alternative is to live in an intellectual Brazilian or Papuan rain forest, one step out of the evolutionary stone age.

The Internet is not going away, unless some global repressive government, fundamentalist religion, or dictator manages to dismantle civilization as we know it.

The 4th utility identifies broadband access to the ‘net as a basic right of all citizens, with the same status as roads, water, and electricity.  All governments with a desire to have their nation survive and thrive in the next millennium will find a way to cooperate with network infrastructure providers to build out their national information infrastructure (haven’t heard that term since Al Gore, eh?).

Without a robust 4th utility, our children and their children will produce a global generation of intellectual migrant workers, intellectual refugees from a failed national information sciences vision and policy.

2012 should be a great year.  All the above predictions are positive, and if proved true, will leave the United States and other countries with stronger capacities to improve their national quality of life, and bring us all another step closer.

Happy New Year!

The Argument Against Cloud Computing

As a cloud computing evangelist there is nothing quite as frustrating, and challenging, as the outright rejection of anything related to data center consolidation, data center outsourcing, or use of shared, multi-tenant cloud-based resources.  How is it possible that anybody in the late stages of 2010 can deny a future of VDIs and virtual data centers?

Actually, it is fairly easy to understand.  IT managers are not graded on their ability to adopt the latest “flavor of the day” technology, or adherence to theoretical concepts that look really good in Powerpoint, but in reality are largely untested and still in the development phase.

Just as a company stands a 60% chance of failure if it suffers a disaster without a recovery or continuity plan, moving the corporate cookies too quickly into a “concept” may be considered equally irresponsible by a board of directors, as the cost of failure and loss of data remains extremely high.

The Burden Carried by Thought Leaders and Early Adopters

Very few ideas or visions are successful if kept secret.  Major shifts in technology or business process (including organizational structure) require more than exposure to a few white papers, articles, or segments on the “Tech Hour” of a cable news station.

As simple and routine as email is today, during the 1980s it was not fully understood; it was mistrusted, and even mocked by users of “stable” communication systems such as fax, TELEX, and land line telephones.  In 2010 presidents of the world’s most powerful nations are cheerfully texting, emailing, and micro-blogging their way through the highest levels of global diplomacy.

It takes time, experience, and tacit knowledge, along with the recognition that your business, government, or social community is moving forward at a rate that will leave you on the outside if the new technology or service is not adopted and implemented.

The question is, “how long will it take us to get to the point where we need to accept outsourcing our information technology services and infrastructure, or face a higher risk of not being part of our professional or personal community?”

E-Mail first popped up in the late 1970s, and never really made it mainstream until around the year 2000.  Until then, when executives did use email, it was generally transcribed from written memos and typed in by a secretary.  Likewise, until now we have gradually started learning about cloud computing through use of social media, hosted public mail systems, and some limited SaaS applications.

Perhaps even at the point where we evangelist types, as a community, are able to start clearly articulating the reality that cloud computing has already planted its seeds in nearly every Internet-enabled computer, smart phone, and smart device in our lives, the vision of cloud computing will still be far too abstract for most to understand.

And this will subsequently reinforce the corporate and organizational mind’s natural desire to back off until others have developed the knowledge base and best practices needed to bring their community to the point where implementing an IT outsourcing strategy will be to their benefit, and not a step in their undoing.

In fact, we need to train the IT community to be critical, to learn more about cloud computing, and question their role in the future of cloud computing.  How else can we expect the knowledge level to rise to the point IT managers will have confidence in this new service technology?

And You Thought it was About Competitive Advantage?

Yes, the cloud computing bandwagon is overflowing with snappy topics such as:

  • Infrastructure agility
  • Economies of scale
  • Enabling technology
  • Reduced provisioning cycles
  • Relief from capital expense
  • Better disaster recovery
  • Capacity on demand
  • IT as a Service
  • Virtual everything
  • Publics, privates, and hybrids
  • Multi-resource variability
  • Pay as you go

Oh my, we will need a special lexicon just to wade through the new marketing language.  The main goals of cloud computing, in our humble opinion, are:

  • Data center consolidation
  • Disaster recovery
  • IT as a Service

Cloud computing itself will not make us better managers and companies.  Cloud computing will serve as a very powerful tool to let us more efficiently, more quickly, and more effectively meet our organizational goals.  Until we have the confidence cloud computing will serve that purpose, it is probably a fairly significant risk to jump on the great marketing data dazzling us on PowerPoint slides and power presentations.

We will Adopt Cloud Computing, or Something Like It

Now to recover my cloud computing evangelist enthusiasm.  I do deeply believe in the word – the word of cloud computing as a utility, as a component of broadband communications, as all of the bullets listed above.  It will take time, and I warmly accept the burden of responsibility to further codify the realities of cloud computing, the requirements we need to fulfill as an industry to break out of the “first mover phase,” and the need to establish a roadmap for companies to shift their IT operations to the cloud.

Just as with email, it is just one of those things you know is going to happen.  We knew it in the early days of GRID computing, and we know it now.  Let’s focus our discussion on cloud computing to more of a “how” and “when” conversation, rather than a “wow” and “ain’t it cool” conversation.

Now, as I dust off a circa-1980 set of slides discussing the value of messaging, and how it would support one-to-one, one-to-many, and many-to-many forms of interactive and non-interactive communications, it is time for us to provide a similar Introduction to Cloud.

Get the pulpit ready

Data Centers Hitting a Wall of Cloud Computing

Equinix lowers guidance due to higher than expected churn in its data centers and price erosion on higher end customers.  Microsoft continues to promote hosted solutions and cloud computing.  Companies including Lee Technologies, CirraScale, Dell, HP, and SGI are producing containerized data centers to improve the efficiency, cost, and manageability of high density server deployments.

The data center is facing a challenge.  The idea of a raised floor, cabinet-based data center is rapidly giving way to virtualization and highly expandable, easy to maintain, container farms.

The impact of cloud computing will be felt across every part of life, not least the data center, which faces a degree of automation not yet seen.

Microsoft CEO Steve Ballmer believes “the transition to the cloud [is] fundamentally changing the nature of data center deployment.” (Data Center Dynamics)

As companies such as Allied Fiber continue to develop visions of high density utility fiber ringing North America, with the added potential of dropping containerized cloud computing infrastructure along fiber routes and power distribution centers, AND the final interconnection of 4G/LTE/XYZ towers and metro cable along the main routes, the potential of creating a true 4th public utility of broadband with processing/storage capacity becomes clear.

Clouds Come of Age

Data center operators such as Equinix have traditionally provided a great product and service for companies wishing to either outsource their web-facing products into a facility with a variety of Internet Service Providers or Internet Exchange Points providing high performance network access, or eliminate the need for internal data center deployments through outsourcing IT infrastructure into a well-managed, secure, and reliable site.

However the industry is changing.  Companies, in particular startup companies, are finding there is no technical or business reason to manage their own servers or infrastructure, and that nearly all applications are becoming available on cloud-based SaaS (Software as a Service) hosted applications.

Whether you are developing your own virtual data center within a PaaS environment, or simply using Google Apps, Microsoft Hosted Office Applications, or other SaaS, the need to own and operate servers is beginning to make little sense.  Cloud service providers offer higher performance, flexible on-demand capacity, security, user management, and all the other features we have come to appreciate in the rapidly maturing cloud environment.

With containers providing a flexible physical apparatus to easily expand and distribute cloud infrastructure, as a combined broadband/compute utility, even cloud service providers are finding this a strong alternative to placing their systems within a traditional data center.

With the model of “flowing” cloud infrastructure along the fiber route to meet proximity, disaster recovery, or archival requirements, the container model will become a major threat to the data center industry.

What is the Data Center to Do?

Ballmer:

“A data center should be like a container – that you can put under a roof or a cover to stop it getting wet. Put in a slab of concrete, plumb in a little garden hose to keep it cool, yes a garden hose – it is environmentally friendly, connect to the network and power it up. Think of all the time that takes out of the installation.”

Data center operators need to rethink their concept of the computer room.  Building a 150 Megawatt, 2 million square foot facility may not be the best way to approach computing in the future.

Green, low powered, efficient, highly virtualized utility compute capacity makes sense, and will continue to make more sense as cloud computing and dedicated containers continue to evolve.  Containers supporting virtualization and cloud computing can certainly be secured, hardened, moved, replaced, and refreshed with much less effort than the “uber-data center.”

It makes sense, will continue to make even more sense, and if I were to make a prediction, will dominate the data delivery industry within 5~10 years.  If I were the CEO of a large data center company, I would be doing a lot of homework, with a very high sense of urgency, to get a complete understanding of cloud computing and industry dynamics.

I would focus less on selling individual cabinets and electricity, and direct my attention to better understanding cloud computing and the 4th Utility of broadband/compute capacity.  I wouldn’t turn out the lights in my carrier hotel or data center quite yet, but this industry will be different in 5 years than it is today.

Given the recent stock volatility in the data center industry, it appears investors are also becoming concerned.

Communities in the Cloud

In the 1990s community of interest networks (COINs) emerged to take advantage of rapidly developing Internet protocol technologies. A small startup named BizNet on London’s Chiswell Street developed an idea to build a secure, closed network to support only companies operating within the securities and financial industries.

BizNet had some reasonable traction in London, with more than 100 individual companies connecting within the secure COIN. Somewhat revolutionary at the time, and it did serve the needs of their target market. Management was also simple, using software from a small company called IPSwitch and their soon to be globally popular “What’s Up” network management and monitoring utility.

However simplicity was the strength of BizNet. While other companies favored strong marketing campaigns and a lot of flash to attract companies to the Internet age, BizNet’s thought leaders (Jez Lloyd and Nick Holland) relied on a strong commitment to service delivery and excellence, and their success became viral within the financial community based on the confidence they built among COIN members.

As networks go, so went BizNet, which was purchased by Level 3 Communications in 1999; the COIN was subsequently dismantled in favor of integrating the individual customers into the Level 3 community.

Cloud Communities

Cloud computing supports the idea of a COIN, as companies can not only build their “virtual data center” within a Platform as a Service/PaaS model, but also develop secure virtual interconnections among companies within a business community – not only within the same cloud service provider (CSP), but also among cloud service providers.

In the “BizNet” version of a COIN, dedicated connections (circuits) were needed to connect routers and switches to a central exchange point run by BizNet. BizNet monitored all connections, reinforcing internal operations centers run by individual companies, and added an additional layer of confidence that helped a “viral” growth of their community.

Gerard Briscoe and Alexandros Marinos delivered a paper in 2009 entitled “Digital Ecosystems in the Clouds: Towards Community Cloud Computing.” In addition to discussing the idea of using cloud computing to support an outsourced model of the COIN, the paper also drills deeper into additional areas such as the environmental sustainability of a cloud community.

As each member of the cloud community COIN begins to outsource their virtual data center into the cloud, they are able to begin shutting down inefficient servers while migrating processing requirements into a managed virtual architecture. Even the requirement for managing high performance switching equipment supporting fiber channel and SAN systems is eliminated, with the overall result allowing a significant percentage of costs associated with equipment purchase, software licenses, and support agreements to be rechanneled to customer or business-facing activities.

Perhaps the most compelling potential feature of community clouds is the idea that we can bring the processing delay between business or trading partners within the COIN to near zero, as the interaction between members is on the same system, and will not lose any velocity due to delays induced by going through switching, routing, or short/long distance transmission through the Internet or dedicated circuits.

Standards and a Community Applications Library

Most trading communities and supply chains have a common standard for data representation, process, and interconnection between systems. This may be a system such as RosettaNet for the manufacturing industry, or other similar industry specifications. Within the COIN there should also be a central function that provides the APIs, specifications, and other configurations such as security and web services/interconnection interface specs.

As a function of developing a virtual data center within the PaaS model, standard components supporting the COIN such as firewalls, APIs, and other common applications should be easily accessible for any member, ensuring from the point of implementation that joining the community is a painless experience, and a very rapid method of becoming a full member of the community.

A Marriage of Community GRIDs and Cloud Computing?

Many people are very familiar with projects such as SETI@home and the World Community Grid. Your desktop computer, servers, or even storage equipment can contribute idle compute and storage capacity to batch jobs supporting everything from searching for extraterrestrial life to AIDS research. You simply register your computer with the target project, download a bit of client software, and the client communicates with a project site to coordinate batch processing of work units/packets.
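
As a rough sketch of how this style of volunteer computing works (the project URL and work-unit format below are hypothetical, not the actual SETI@home or World Community Grid protocol):

    import json
    import time
    import urllib.request

    PROJECT_URL = "https://example.org/project"  # hypothetical coordination site

    def fetch_work_unit():
        """Ask the project site for the next batch work unit to process."""
        with urllib.request.urlopen(f"{PROJECT_URL}/next-work-unit") as resp:
            return json.load(resp)

    def process(work_unit):
        """Run the project's computation over the work unit during idle time."""
        return sum(sample ** 2 for sample in work_unit["samples"])  # placeholder computation

    def report(work_unit_id, result):
        """Return the completed result so the project can assemble the overall batch job."""
        payload = json.dumps({"id": work_unit_id, "result": result}).encode()
        urllib.request.urlopen(f"{PROJECT_URL}/results", data=payload)

    while True:
        unit = fetch_work_unit()
        report(unit["id"], process(unit))
        time.sleep(60)  # yield the machine so only idle capacity is donated

The same fetch/process/report loop, pointed at a community cloud instead of a research project, is what would let COIN members donate or sell their spare capacity.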

Now we know our COIN is trying to relieve members from the burden of operating their own data centers – at least those portions of the data center focusing on support of a supply chain or trading community of interest. And some companies are more suited to outsourcing their data center requirements than others. So if we have a mix of companies still operating large data centers with potential sources of unused capacity, and other members in the community cloud with little or no onsite data center capacity, maybe there is a way the community can support itself further by developing the concept of processing capacity as a currency.

As all individual data centers and office LAN/MAN/WANs will have physical connections to the cloud service provider (IaaS provider) through an Internet service provider or dedicated metro Ethernet connection, the virtual data centers being produced within the PaaS portion of the CSPs will be inherently connectable to any user or any facility within the COIN. Of course, that assumes security management will protect the non-COIN-connected portions of the community.

In effect, those members of the community with excess capacity within their own networks could then easily contribute their spare capacity to the community for use as a non-time-critical compute resource, or for supporting “batch” processing. Some CSPs may even consider buying that capacity to provide members, either inside or outside the COIN, an additional resource available to their virtual customers as low cost, low performance batch capacity, much like SETI@home or the Protein Folding Project uses spare capacity on an as-available basis. It is much like selling your locally produced energy back into a power GRID.

We Have a New, Blank Cloud White Board to Play With

The BizNet COIN was good. Eleven years after BizNet was dissolved, the concept remains valid, and we now have additional infrastructure that will support COINs through community clouds, with enabling features that extend far beyond the initial vision of BizNet. CSPs such as ScaleUp have built IaaS and PaaS empowerment for COINs within their data center.

Cloud computing is an infant. Well, maybe in Internet years it is rapidly heading to adolescence, but it is still pretty young. Like an adolescent, we know it is powerful, getting more powerful by the day, but few people have the vision to wrap their head around what broadband, cloud computing, diffusion of network-enabled knowledge into the basic education system, and the continuation of Moore’s, Metcalfe’s, and other laws of industry and physics will ultimately bring.

COINs and community clouds may not have been in the initial discussions of cloud computing, but they are here now. Watching a Slingbox feed in a Jakarta hotel room connected to a television in Burbank was probably not a vision shared by the early adopters of the Internet – and cloud computing will make similar unthought-of leaps in utility and capabilities over the next few years.

However, in the near term, do not be surprised if you see the entire membership of the New York Stock Exchange and NASDAQ operating from a shared cloud COIN. It will work.

Business and Social Frog Soup – are we ready for the next decade?

Over the past couple years I have written several stories with “frog soup” as a main theme. The idea of being in cold water, and not recognizing the degree-by-degree increase of heat in the water until at some point we are cooked, is the danger of being a cold-blooded animal. Business may follow a similar course.

In business we can follow the route of “this is the way we’ve always done it, and it works, so there is no reason to change our processes or strategies.” Innovations like virtualization or cloud computing hit the headlines, and many say “it is a cool idea, but we want the security and hands-on confidence of running our own servers and applications.”

In the United States many telecom companies continue to build business cases based on “milking” telephone settlement minutes, bilateral relationships, and controlling telecom “pipes.” Internet service providers (ISPs) continue holding on to traditional peering relationships, holding out for “paid peering,” doing everything possible to attain market advantage based on traffic ratios.

Nothing new, same ideas, different decade.

It is international frog soup.

In Vietnam the government is currently planning to build an entirely new information infrastructure, from the ground up, based on the most cutting edge telecom and data/content infrastructure. Children in Hanoi go to school at 7 a.m., take a quick lunch break, hit the books till around 5 p.m., take another break, and finish their day at study sessions till around 9 p.m.

Concentration – mathematics, physics, and language.

The children are being exposed to Internet-based technologies, combining their tacit experience and knowledge of global interconnected people with a high degree of academic sophistication.

In the United States children go to school for, at most, 6 hours a day, graduating with (on average) little capability in math or language – although we do have deep knowledge of metal detectors and how to smoke cigarettes in the restrooms without being caught. In Los Angeles, some locations cannot even hit a 50% graduation rate among high school students.

And oddly enough, we appear to be comfortable with that statistic.

Perhaps our approach to business is following a similar pattern. We become used to approaching our industry, jobs, and relationships on a level of survival, rather than innovation. We may not in some cases even have the intellectual tools to apply existing technology to the potential of functioning in a global economy. Then we are surprised when an immigrant takes our job or business.

Some universities, such as Stanford, aggressively recruit students from foreign countries, as they cannot attract enough qualified students from the United States to meet their desired academic threshold. And once they graduate from Stanford, they find their way into Silicon Valley startups, with an entrepreneurial spirit that is beyond the scope of many American graduates.

Those startups have the intellectual and entrepreneurial tools to compete in a global economy, using innovative thinking, unbound by traditional processes and relationships, and are driving what used to be America’s center of the global innovation world. Except that it is now only based in Silicon Valley, and represents the center of a global innovation community. Possibly due to the availability of increasingly cheaper American labor?

Frog Soup

Us Americans – we are getting lazy. Innovation to us may mean how we manipulate paper, and has nothing to do with manufacturing and business innovation. We are starting to miss the value of new products, new concepts, and execution of business plans which end up in production of goods for export and domestic use. We believe concentration on services industries will drive our economy into the future, based on products and other commercial goods imported into our country.

Except for the painful fact and reality that we do not have a young generation with the intellectual tools to compete with kids in Hanoi who are on a near religious quest to learn.

The temperature is rising, and we as a country and as an economic factor in the global community are being diluted every day.

Time to put away the video games and get back to work. No more “time outs,” only time to roll up our sleeves and learn, innovate, learn, innovate, and innovate some more. Forget comfort, we are nearly soup.

A Cloudy Future for Networks and Data Centers in 2010

The message from the VC community is clear – “don’t waste our seed money on network and server equipment.” The message from the US Government CIO was clear – the US Government will consolidate data centers and start moving towards cloud computing. The message from the software and hardware vendors is clear – there is an enormous investment in cloud computing technologies and services.

If nothing else, the economic woes of the past two years have taught us we need to be a lot smarter on how we allocate limited CAPEX and OPEX budgets. Whether we choose to implement our IT architecture in a public cloud, enterprise cloud, or not at all – we still must consider the alternatives. Those alternatives must include careful consideration of cloud computing.

Cloud 101 teaches us that virtualization efficiently uses compute and storage resources in the enterprise. Cloud 201 teaches us that content networks facing the Internet can make use of on-demand compute and storage capacity in close proximity to networks. Cloud 301 tells us that a distributed cloud gives great flexibility to both enterprise and Internet-facing content. The lesson plan for Cloud 401 is still being drafted.

Data Center 2010

Data center operators traditionally sell space based on cabinets, partial cabinets, cages, private suites, and in the case of carrier hotels, space in the main distribution frame. In the old days revenue was based on space and cross connects; today it is based on power consumed by equipment.

If the intent of data center consolidation is to relieve the enterprise or content provider of unnecessary CAPEX and OPEX burden, then the data center sales teams should be gearing up for a feeding frenzy of opportunity. Every public cloud service provider from Amazon down to the smallest cloud startup will be looking for quality data center space, preferably close to network interconnection points.

In fact, in the long run, if the vision of cloud computing and virtualization is true, then the existing model of data center should be seen as a three-dimensional set of objects within a resource grid, not entirely dissimilar to the idea set forth by Nicholas Carr in his book “The Big Switch.”

Facilities will return to their roots of concrete, power, and air-conditioning, adding cloud resources (or attracting cloud service providers to provide those resources), and the cabinets, cages, and private suites will start being dismantled to allow better use of electrical and cooling resources within the data center.

Rethinking the Data Center

Looking at 3tera’s AppLogic utility brings a strange vision to mind. If I can build a router, switch, server, and firewall into my profile via a drag and drop utility, then why would I want to consider buying my own hardware?

If storage becomes part of the layer 2 switch, then why would I consider installing my own SAN, NAS, or fiber channel infrastructure? Why not find a cloud service provider with adequate resources to run my business within their infrastructure, particularly if their network proximity and capacity is adequate to meet any traffic requirement my business demands?

In this case, if the technology behind AppLogic and other similar Platform as a Service (PaaS) offerings is true to the marketing hype, then we can start throwing value back to the application. The network, connectivity, and the compute/storage resource becomes an assumed commodity – much like the freeway system, water, or the electrical grid.
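
To make the “drag and drop” idea a bit more concrete, here is a minimal sketch of what declaring such an infrastructure profile might look like; the component names and fields are hypothetical illustrations, not the actual AppLogic API:

    # Hypothetical profile describing a small virtual data center; a PaaS such as
    # AppLogic assembles the equivalent from drag-and-drop components.
    profile = {
        "firewall": {"allow": ["80/tcp", "443/tcp"], "deny": ["*"]},
        "load_balancer": {"frontends": 1, "backends": ["web1", "web2"]},
        "servers": [
            {"name": "web1", "cpu": 2, "ram_gb": 4, "image": "web-app"},
            {"name": "web2", "cpu": 2, "ram_gb": 4, "image": "web-app"},
            {"name": "db1", "cpu": 4, "ram_gb": 16, "image": "database", "storage_gb": 500},
        ],
    }

    def provision(profile):
        """Walk the profile and request each virtual component from the provider."""
        for server in profile["servers"]:
            print(f"provisioning {server['name']}: {server['cpu']} vCPU, {server['ram_gb']} GB RAM")
        print("configuring firewall, allowing:", profile["firewall"]["allow"])
        print("balancing load across:", profile["load_balancer"]["backends"])

    provision(profile)

The point is that the entire “data center” lives in a description the provider can instantiate, move, or scale, rather than in hardware the customer owns.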

Flowing the Profile to the User

Us old guys used to watch a SciFi series called “Max Headroom.” Max Headroom was a fictional character who lived within the “Ether,” able to move around through computers and electrical grids – and pop up wherever in the network he desired. Max could also absorb any of the information within computer systems or other electronic intelligence sources, and deliver his findings to news reporters who played the role of investigative journalists.

We are entering an electronic generation not too different from the world of Max Headroom. If we use social networking, or public utility applications such as Hotmail, Gmail, or Yahoo Mail, our profile flows to the network point closest to our last request for application access. There may be a permanent image of our data stored in a mother ship, but the most active part of our profile is parsed to a correlation database near our access point.

Thus, if I am a Gmail user and live in Los Angeles, my correlated profile is available at a Google data cache someplace with proximity to Los Angeles. If I travel to Hong Kong, then Gmail thinks “Hmmm…, he is in HK, we should parse his Gmail image to our HK cache, and hope he gets the best possible performance out of the Gmail product from that point.”
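
A toy sketch of the idea of parsing a profile to the nearest cache; the regions and the naive distance logic are hypothetical illustrations, not how Gmail actually places data:

    # Hypothetical cache regions keyed by rough (latitude, longitude)
    CACHE_REGIONS = {
        "los-angeles": (34.05, -118.24),
        "ashburn": (39.04, -77.49),
        "hong-kong": (22.32, 114.17),
        "amsterdam": (52.37, 4.90),
    }

    def nearest_cache(user_lat, user_lon):
        """Pick the cache region closest to the user's last access location (crude flat-map distance)."""
        def distance_sq(region):
            lat, lon = CACHE_REGIONS[region]
            return (lat - user_lat) ** 2 + (lon - user_lon) ** 2
        return min(CACHE_REGIONS, key=distance_sq)

    def place_profile(user_id, user_lat, user_lon):
        """Parse the active part of the user's profile to the nearest cache; the permanent image stays at the mother ship."""
        region = nearest_cache(user_lat, user_lon)
        print(f"caching active profile for {user_id} at {region}")

    place_profile("traveler@example.com", 22.30, 114.20)  # a user who has just landed in Hong Kong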

I, as the user, do not care which data center my Gmail profile is cached at, I only care that my end user experience is good and I can get my work done without unnecessary pain.

The data center becomes virtual. The application flows to the location needed to do the job and make me happy. XYZ.Com, who does my mail day-to-day, must understand their product will become less relevant and ineffective if their performance on a global scale does not meet international standards. Those standards are being set by companies who are using cloud computing on a global, distributed model, to do the job.

2010 is the Year Data Centers Evolve to Support the Cloud

The 100sqft data center cage is rapidly becoming as senseless as buying a used DMS250. The cost in hardware, software, peopleware, and the operational expense of running a small data center presence simply does not make sense. Nearly everything that can be done in a 100sqft cage can be done in a cloud, forcing the services provider to concentrate on delivering end user value, and leaving the compute, storage, and network access to utility providers.

And when the 100sqft cage is absorbed into a more efficient resource, the cost – both electrical/mechanical and financial (including environmental costs) – will drop by nearly 50%, given the potential for better data center management using strict hot/cold aisle separation, hot or cold aisle containment, and containers – all those things data center operators are scrambling to understand and implement.

Argue the point, but by the end of 2010, the ugly data center caterpillar will come out of its cocoon as a better, stronger, and very cloudy utility for the information technology and interconnected world to exploit.

Cloud Computing Expo Kicks Off in Santa Clara – The Cloud Opportunity Window is Now Officially Open

Having gone through a couple of decades worth of technology conferences, a familiar cycle occurs. For the first couple years, technology-related conferences are attended by engineers and operations people. Only after the technology has passed a couple of feasibility gates and begun to hit the business cycle do sales and marketing people take over. Cloud is now officially past the engineering phase, well into the sales phase – and the business community is scrambling to understand the implications of a virtualized world.

At the Cloud Computing Conference and Expo in Santa Clara, California, the opening keynote session venue was completely filled, with the organizer (SYS-CON Events) obliged to quickly expand the audience into two overflow rooms, in addition to mounting displays in hallways adjacent to the main ballroom. According to the conference organizer, more than twice as many people signed up and are attending the conference as planned. And cloud “buzz” is electric within the halls.

Cloud computing is here, the industry innovation machine is spooling, and the “nay-sayers” are starting to quiet down as the reality of cloud computing is articulated, codified, and presented in a format that has finally gone past the high level “concepts” of recent cloud expos and conferences.

This must be true, because the hallways are now filling with people wearing suits, ties, and polo shirts with snappy logos. Engineers still roam the halls, identifiable by their blue jeans, T-shirts, and backpacks filled with gadgets and computers. The ratio is about 50:50, indicating cloud service providers are now attending conferences for the purpose of business development, rather than to simply share ideas and further develop cloud technology as an R&D community.

The Opening Keynote – Cloud Myth-Busting

Richard Marcello, President, Technology, Consulting, and Integrations Services at Unisys kicked off the conference with a keynote speech entitled “The Time is Right for Enterprise Cloud Computing.” The presentation followed a familiar model in the public (non engineering and technician audience) conditioning of a new technology – “the Nine Myths of Cloud Computing.” A very good presentation (really), which drilled into common misconceptions of cloud computing. This type of approach is useful when giving an instructional presentation, with statements such as:

  • Myth #9 – Cloud computing is brand new – a revolution
  • Myth #8 – All clouds are the same
  • Myth #7 – Cloud computing is about technology
  • Myth #5 – Cloud computing is not reliable
  • And so on…

Do a search and replace of “cloud computing” with “Internet” and you could pose the same myths, with the discriminating factor being one of how you present the response in breaking each myth. Yes, it is marketing and borderline cliché, but it does go far in visualizing cloud computing to the new attendees from the business side of our industry.

Marcello did present one eloquent response to the myth “The Internal data center is more secure than the cloud.” He showed a slide which had three separate applications creating data. The data is stored in a storage cloud, as well as being manipulated in a service cloud. Data going into the cloud service (processing), and into the storage cloud is brought into a single stream, which cannot be intercepted by a “sniffer” or other device, and the actual data instances are only recognizable by the application using the data. To all others attempting to intercept the data, it appears as “water running through a pipe.”

Actually, not a bad analogy.

Marcello went on to describe his taxonomy of the “real time access engine,” which controls the data streams into each application or storage device, and security within an enterprise, industry, or organizational community of interest. However, the most important message delivered during his speech was the idea that cloud computing will “generate new business models and ideas that none of us have yet envisioned.”

But, That’s Not What I Designed…

This message is strong. All engineers have gone through the experience of creating a product, and then observing the product being used by people for activities never envisioned by the creator. Imagine the continuing astonishment of the originators of the Internet. A simple tool for distributed applications and network survivability, and it is now the basis for nearly all communications, entertainment, business, and social interaction between humans throughout the world.

What will cloud computing bring us in the future? What will smart kids who are going through an education system with complete immersion in the global Internet cloud as a normal part of life be able to see in a potential global model of data and applications virtualization? Much as the early days of the internet represented a mere tip of the future network “iceberg,” what we see in cloud computing today is just the tip of what virtualization of compute and storage resources will ultimately become.

What will happen when SSDs (solid state disks) become part of the layer 2 switching backplane (Slapping an SSD card into a switching slot, making Fiber channel over Ethernet obsolete overnight)? An entire content delivery network and system currently using 100 cabinets of servers and disk reduced to a single card in a switch…

Integration with IPv6. Standardization in cloud services allowing formation of cloud spot markets and interoperability.

We have a lot of questions to throw both at the engineers, as well as the business visionaries attending the conference. Welcome sales and marketing folks, welcome to the new age of cloud computing.

John Savageau, Long Beach (From the Cloud Computing Conference and Expo, Santa Clara, California)

From Zero to 900 MPH – Cloud Computing Grabs Attention and Headlines

Just 18 months ago the concept of cloud computing was still an abstract to most in the IT and data center community. In fact, those who had even heard of cloud computing were a tiny minority of IT professionals.

This morning, 17 July 2009, Google listed 177 news stories with the topic or subject of cloud computing posted within the past 24 hours. Whether you believe the cloud computing story is real or not, hype or reality, a larva of technical reality, you cannot escape the excitement cloud computing is bringing to the technical community.

Even the Wall Street Journal is now devoting a fair amount of space to the topic, with recent stories highlighting projects including Microsoft initiatives, Larry Ellison (Oracle), Google, and HP. It seems that every company that has any vision or feels they need some quick PR is launching a story or press release on cloud computing. The past 24 hours present a roll call of cloud talk (from a query on Google News) including:

  • Dell
  • HP
  • BMC
  • Microsoft
  • Google
  • Cisco
  • Sun
  • IBM
  • Altera
  • Rackspace
  • Terremark
  • And the list goes on… and on… and on…

Add Gartner’s release of this year’s Web Hosting and Cloud Magic Quadrant report, and the competition for press release ratings and standings further intensifies. It appears that if you are not currently releasing a story on how your company is moving ahead with cloud initiatives, the market will begin wondering “why?”

What is Driving Cloud?

Every company in the world has been affected by the economy, the need for developing green IT infrastructure, and the ever increasing need for compute and applications power. We are living in a global, social, economic, and highly interactive world. Telecommunications, applications, and the need to share enormous amounts of data are driving the need for both power and efficiency in our personal and professional information and communications technology (ICT) tools.

Cloud computing shares the burden of compute capacity requirements among many users, whether within an enterprise, a closed community of interest, or content facing the general public. The peak processing loads of most individuals and companies are far higher than average processing requirements, and the reality is we are finding it difficult to continue buying hardware and software when we only average a few percentage points of resource usage.
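
As a back-of-the-envelope illustration of why sharing capacity helps (a sketch with made-up workload numbers, not measured data):

    import random

    random.seed(1)

    # Hypothetical hourly loads for 20 organizations: mostly idle, with occasional spikes.
    def hourly_load():
        return [random.choice([5, 5, 5, 10, 10, 90]) for _ in range(24)]  # percent of one server

    orgs = [hourly_load() for _ in range(20)]

    # Dedicated hardware: each organization must provision for its own peak.
    dedicated_capacity = sum(max(load) for load in orgs)

    # Shared (cloud) capacity: provision for the peak of the combined load instead.
    combined = [sum(org[hour] for org in orgs) for hour in range(24)]
    shared_capacity = max(combined)

    average_use = sum(combined) / len(combined)
    print(f"capacity needed with dedicated gear: {dedicated_capacity}")
    print(f"capacity needed when pooled: {shared_capacity}")
    print(f"average utilization of dedicated gear: {average_use / dedicated_capacity:.0%}")

Because the spikes of different organizations rarely line up, the pooled peak is far smaller than the sum of the individual peaks, which is exactly the economics cloud providers exploit.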

Cloud computing, even in its 18-month-old infancy, promises this will change with the development of virtualization models and on-demand use of shared resources. In the long term, as we continue to solve security issues, latency, capacity, and billing models, individuals and organizations will benefit from the consolidation of compute capacity.

The Media Hype and Effect

While there may be a lot of bandwagon appeal occurring in the cloud vendor community and media, it does serve the purpose of quickly establishing cloud computing as a concept that is getting into the eyes, ears, and minds of most IT and financial professionals. Without a concentrated media focus on a concept like cloud, the lead time for making this an accepted technology would greatly extend into the future.

The media is forcing us to at least consider the concepts of cloud computing technology, and start to ask the questions needed to make decisions on whether or not this will be an acceptable technology, and if we need to include cloud discussions and strategies in our current and future business plans.

And with more media exposure, more stories, and more thought leadership available on the topic, we will certainly have more intellectual tools to use in making our own informed decisions.

John Savageau, Long Beach

2009 – A Year of IPv6 and Internet Virtualization

This article originally appeared in the Jan 2009 Any2 Exchange Newsletter

For the past 30 years or so we have gone through an accelerated learning process in globalization.  While the under-25 crowd has lived in a world of extreme technology diffusion, many of us still recall the days when having near real time news was the exception, provided only at great cost in both human and financial resources.

Who can forget the international awakening when real time information came out of UNIX Talk sessions during the Russian coup attempt in 1991 – when the world realized the veil of national secrecy and suppression of events had been ripped open forever?  The Internet has played a role in international transparency which has changed the concept of globalization and brought people together on a human scale that could not have been comprehended just a couple decades ago.  It is safe to assume we have successfully passed the Internet concept test.

The basic tools we’ve used over the past 40 years have proven that global one-to-one, one-to-many, and many-to-many communications via packets work. Now we can move on to the next phase of global communications development, with a new suite of tools that will allow even more powerful ways of bringing the world to your laptop.

This month we have two feature articles in the Any2 newsletter: one by Martin Levy on the topic of IPv6, and the other by Justin Giardina on storage replication and virtualization.  In the next edition we will draw our attention to cloud computing and software as a service.

The Any2 Exchange fully supports deployment of IPv6 in the IXP, as well as supporting it through the Any2Easy route servers.  The actual amount of IPv6 traffic is still relatively low, however the number of routes available through Any2 is showing steady growth.  We allocate both IPv4 and IPv6 addresses to all new Any2 Exchange members, and will assign new IPv6 addresses on request to any other existing member who does not currently have an allocation.

It is important for us to aggressively promote IPv6.  ARIN recently made the following statement:

“With only 15% of IPv4 address space remaining, ARIN is now compelled to advise the Internet community that migration to IPv6 is necessary for any applications that require ongoing availability of contiguous IP number resources.” (www.arin.net)

With an accompanying resolution:

“BE IT RESOLVED, that this Board of Trustees hereby advises the Internet community that migration to IPv6 numbering resources is necessary for any applications which require ongoing availability from ARIN of contiguous IP numbering resources.”

Those are pretty serious warnings to the Internet community that it is time to re-tool for the next generation.  What is that generation?  There are lots of visionaries out there, far more creative than I, offering a landscape of virtualization and convergence of nearly every aspect of life.

In the scope of our activities we are seeing customer and tenant movement in the direction of virtualization, through both storage and software as a service via cloud compute provisioning.  While getting a lot of attention in the tech media as a concept, the actual deployment of these services is going ahead in near stealth mode.  We eagerly look forward to the marriage of Brocade and Foundry, and the offspring that may further bring storage and cloud compute capacity into the switching and routing fabrics.

The ability to route and switch with direct addressing, rather than NAT or private addressing, is going to be a requirement in the virtual compute world to help eliminate both physical and software points of failure, as well as eliminate any latency byproduct of address translation.

So, we have our work cut out for us.  CRG West and the Any2 Exchange see our role in this new world as the developers of infrastructure needed to support the applications and services being developed by networks, cloud companies, SaaS companies, CDNs, and the carriers.

At the end of the day we still need to provide solid electrical and cooling systems support, access to fiber and interconnections (including the Any2 Exchange), and a neutral place for the community to meet.

With the economy in turmoil, limited funding available for both capital and operational expenses, and the need to rapidly move ahead, we will strive to do our part in providing the infrastructure and community center to reduce both CAPEX and OPEX, as well as develop the facility infrastructure needed to fulfill the visions of those who lead.