Data Center Consolidation and Adopting Cloud Computing in 2013

Throughout 2012, large organizations and governments around the world continued to struggle with the idea of consolidating inefficient data centers, server closets, and individual “rogue” servers scattered around their enterprises or government agencies.  The issues ranged from the cost of operating data centers and disaster management of information technology resources to, of course, human factors centered on control, power, or retention of jobs in a rapidly evolving IT industry.

Cloud computing and virtualization continue to have an impact on all consolidation discussions, not only from the standpoint of providing a much better model for managing physical assets, but also in the potential cloud offers to solve disaster recovery shortfalls, improve standardization, and encourage or enable development of service-oriented architectures.

Our involvement in projects at the local, state, and national government levels, both in the United States and in other countries, indicates a consistent need to address the following concerns:

  • Existing infrastructure, including both IT equipment and facilities, is reaching the end of its operational life
  • Collaboration requirements between internal and external users are expanding quickly, driving an architectural need for interoperability
  • Decision support systems require access to both raw data and “big data/archival data”

We would like to see an effort within the IT community to move in the following directions:

  1. Real effort at decommissioning and eliminating inefficient data centers
  2. A commitment to fitting all data and applications into an enterprise architecture framework – regardless of the size of the organization or its data
  3. Aggressive development of standards supporting interoperability, portability, and reuse of objects and data

Despite the very public failures experienced by cloud service providers over the past year, the reality is that cloud computing as an IT architecture and model is gaining traction, and it is not likely to go away any time soon.  As with any emerging service or technology, cloud services will continue to develop and mature, reducing the impact and frequency of failures.

Future Data Centers

Why would an organization continue to buy individual high-powered workstations, individual software licenses, and device-bound storage when the same application can be delivered to a simple display, or a wide variety of displays, with standardized web-enabled cloud (SaaS) applications that store mission-critical data and images on a secure storage system at a secure site?  Why not facilitate the transition from CAPEX to OPEX, license to subscription, infrastructure to product and service development?

In reality, unless an organization is in the hardware or software development business, there is very little technical justification for building and managing a data center.  This includes secure facilities supporting military or other sensitive sites.

The cost of building and maintaining a data center, compared with outsourcing into a commercial colocation site or virtualizing data, applications, and network access requirements, has gained the attention of CFOs and CEOs, requiring IT managers to justify far more explicitly the cost of building internal infrastructure versus outsourcing.  This is quickly becoming a very difficult task.

Money spent on data center infrastructure is lost to the organization.  The cost of labor is high, and so are the costs of energy, space, and maintenance – money that could be better applied to product and service development, customer service capacity, or other revenue- and customer-facing activities.
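As a back-of-the-envelope illustration of that CAPEX-versus-OPEX argument, consider the sketch below.  Every figure in it is a hypothetical assumption chosen for readability, not a benchmark.

```python
# Hypothetical five-year cost comparison: in-house data center vs. colocation/cloud.
# Every figure here is an illustrative assumption, not a benchmark.

YEARS = 5

# In-house: heavy up-front CAPEX plus recurring OPEX
build_out = 2_000_000            # facility build-out, power, cooling (assumed)
hardware_refresh = 400_000       # servers/storage refreshed once in the period (assumed)
inhouse_opex_per_year = 350_000  # staff, energy, maintenance (assumed)

inhouse_total = build_out + hardware_refresh + inhouse_opex_per_year * YEARS

# Outsourced: subscription-style OPEX only
cloud_opex_per_year = 500_000    # colocation/cloud subscription fees (assumed)
cloud_total = cloud_opex_per_year * YEARS

print(f"In-house   (5 yr): ${inhouse_total:,}")
print(f"Outsourced (5 yr): ${cloud_total:,}")
print(f"Difference       : ${inhouse_total - cloud_total:,}")
```

With these invented numbers the in-house option costs $1,650,000 more over five years; the point is not the totals, but that the comparison is now a simple spreadsheet exercise any CFO can run.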

The Bandwidth Factor

The one major limitation the IT community will need to overcome, as data center consolidation continues and cloud services become the norm, is bandwidth.  Applications such as streaming video, unified communications, and other data-intensive applications will need more bandwidth.  The telecom companies are making progress, having deployed 100 Gbps backbone capacity in many markets.  However, this capacity will need to keep growing quickly to meet the needs of organizations accessing data and applications stored or hosted within a virtual or cloud computing environment.
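To see why backbone capacity fills up quickly, a rough demand estimate helps.  The per-user rates and concurrency figure in the sketch below are invented for illustration; only the arithmetic is the point.

```python
# Rough aggregate bandwidth estimate for one consolidated site.
# All per-user rates and the concurrency ratio are illustrative assumptions.

users = 5_000

# Assumed average sustained demand per active user, in Mbps
video_conferencing = 2.0   # HD video call
virtual_desktop    = 0.5   # VDI / SaaS application traffic
data_intensive     = 1.5   # GIS layers, large file transfers, analytics

peak_concurrency = 0.3     # assume 30% of users active at peak

per_user_mbps = video_conferencing + virtual_desktop + data_intensive
aggregate_gbps = users * peak_concurrency * per_user_mbps / 1_000

print(f"Estimated peak demand: {aggregate_gbps:.1f} Gbps")
# ~6 Gbps for a single site; a national consolidation program multiplies
# this across many sites, which is why 100 Gbps backbones fill up quickly.
```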

Consider a national government’s IT requirements.  The government, like most, is based within a metro area.  The agencies and departments consolidate their individual data centers and server closets into a central facility or a reduced number of facilities.  Government interoperability frameworks begin to take small steps toward cross-agency data sharing, and individual users need access to a variety of applications and data sources to fulfill their decision support requirements.

Take, for example, a GIS (Geospatial/Geographic Information System) with multiple demographic or other overlays.  Individual users will need to display data drawn from several data sources, through GIS applications, rendering a large amount of complex data on individual display screens.  Without broadband access between the user and the application, as well as between the application and its data sources, the result will be a very poor user experience.
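A minimal sketch of what that access pattern looks like from the client side follows.  The endpoint URLs and layer names are hypothetical placeholders standing in for real map services; the takeaway is that one screen refresh pulls from several sources at once.

```python
# Sketch: a GIS client composing one view from several data sources.
# The URLs below are hypothetical placeholders; any WMS-style service
# that returns map imagery per layer would behave similarly.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

LAYER_URLS = [
    "https://gis.example.gov/wms?layers=base_map",
    "https://stats.example.gov/wms?layers=demographics",
    "https://land.example.gov/wms?layers=parcels",
]

def fetch_layer(url: str) -> int:
    """Download one overlay and return its size in bytes."""
    with urlopen(url, timeout=30) as resp:
        return len(resp.read())

start = time.time()
with ThreadPoolExecutor(max_workers=len(LAYER_URLS)) as pool:
    sizes = list(pool.map(fetch_layer, LAYER_URLS))
elapsed = time.time() - start

total_mb = sum(sizes) / 1_000_000
print(f"Fetched {len(sizes)} layers, {total_mb:.1f} MB in {elapsed:.1f}s")
print(f"~{total_mb * 8 / max(elapsed, 0.001):.1f} Mbps sustained for a smooth refresh")
```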

Another example is using the capabilities of video conferencing, desktop sharing, and interactive persistent-state application sharing.  Without adequate bandwidth, these are simply not possible.

Revisiting the “4th Utility” for 2013

The final vision on the 2013 “wishlist” is that we, as an IT industry, continue to acknowledge the need to develop the 4th Utility.  This is the idea that broadband communications, processing capacity (including SaaS applications), and storage are the right of all citizens.  Much like the first three utilities – roads, water, and electricity – the 4th Utility must be a basic part of all discussions of national, state, or local infrastructure.  As we move deeper into this millennium, Internet-enabled communications, or something very much like them, will be an essential part of all our lives.

The 4th Utility requires that high-capacity fiber optic infrastructure and broadband wireless be delivered to any location within the country that supports a community, or an individual connected to a community.  We’ll have to pay a fee to access the utility (the same as with other utilities), but it is our right to access it and our obligation to deliver it.

2013 will be a lot of fun for us in the IT industry.  Cloud computing is going to impact everybody, one way or the other.  Individual data centers will continue to close.  Service-oriented architectures, enterprise architecture, process modeling, and design efficiency will drive a lot of innovation.  We’ll lose some players, gain some players, and be in a better position at the end of 2013 than we are today.

Government Clouds Take on the ESBaaS

Recent discussions with government ICT leadership related to cloud computing strategies have all brought the concept of Enterprise Service Bus as a Service into the conversation.

Now ESBs are not entirely new, but in the context of governments they make a lot of sense.  In the context of cloud computing strategies in governments they make a heck of a lot of sense.

Wikipedia defines an ESB as:

In computing, an enterprise service bus (ESB) is a software architecture construct which provides fundamental services for complex architectures via an event-driven and standards-based messaging engine (the bus). Developers typically implement an ESB using technologies found in a category of middleware infrastructure products, usually based on recognized standards.

Now if you actually understood that, then you are no doubt a software developer.  For the rest of us, it means that with the ESB pattern, participants engaging in a service interaction communicate through a services or applications “bus.”  The bus could be a database, virtual desktop environment, billing/payments system, email, or other service common to one or more agencies.  The ESB is designed to handle relationships between users through a common set of services and a standardized data format.

New services can be plugged into the bus and integrated with existing services without any changes to the core bus service.  Cloud users and application developers simply add or modify the integration logic, as the sketch below illustrates.
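Here is a minimal sketch of the pattern, with invented service names and message formats (not any real product’s API).  Note that a new participant registers with the bus without any change to the bus itself or to the other services:

```python
# Minimal sketch of the ESB pattern: services interact only through the bus.
# Service names and the message format are illustrative assumptions.
from collections import defaultdict
from typing import Callable, Dict, List

class ServiceBus:
    """Event-driven bus: routes standardized messages to subscribers by topic."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # Deliver to every subscriber; publishers never call services directly.
        for handler in self._subscribers[topic]:
            handler(message)

bus = ServiceBus()

# Two agencies plug in without knowing about each other:
bus.subscribe("citizen.updated", lambda m: print("Tax agency sees:", m))
bus.subscribe("citizen.updated", lambda m: print("Health agency sees:", m))

bus.publish("citizen.updated", {"citizen_id": "12345", "city": "Springfield"})

# Adding a new participant requires no change to the bus or existing services:
bus.subscribe("citizen.updated", lambda m: print("Land registry sees:", m))
bus.publish("citizen.updated", {"citizen_id": "12345", "city": "Shelbyville"})
```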

Participants in a cross-organizational service interaction connect to the Cloud ESB rather than directly to one another, whether government-to-government, citizen-to-government, or business-to-government.  Rules-based administration support will make it easier to manage ESB deployments through simplified templates, allowing a better experience for solution administrators.

The Benefits to Government Clouds

In addition to fully supporting a logical service-oriented architecture (SOA), the ESBaaS will enhance or provide:

  • Open and published solutions for managing web services connectivity, interactions, services hosting, and the services mediation environment
  • From a development and maintenance perspective, the Government Cloud ESB allows agencies and users to securely and reliably share information between applications in a logical, cost-effective manner
  • Government Cloud ESBs will simplify adding new services, or changing existing services, with minimal impact on the bus or other interfacing applications within the IT environment
  • Improvements in system performance and availability by offloading message processing and isolating complex mediation tasks in a dedicated ESB integration server

Again, possibly a mouthful, but if you can grasp the idea of a common bus providing services to many different applications or agencies, allowing sharing of data and interfaces without complex relationships between each participating agency, then the value becomes much clearer.

Why the Government Cloud?

While there are many parallels to large companies, governments are unique in the number of separate ministries, agencies, departments, and organizations within the framework of government.  Governments normally need to share a tremendous amount of data between agencies, and in the past this was extremely difficult due to organizational differences, lack of IT support, or individuals who simply did not want to share data with other agencies.

The result, of course, was that many agencies built their own standalone data systems, without central coordination, resulting in a lot of duplicate data items (such as an individual’s personal profile, business information, and land management records).  Most often, there were small differences in the data elements each agency developed and maintained, resulting in corrupt or conflicting data.

The ESB offers a method of connecting applications and users to common data elements, allowing the sharing of both application formats and, in many cases, underlying data sets.  This allows not only efficiency in software and applications development, but also a much higher level of standardization and common data sharing.
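As a minimal sketch of that idea, with invented field names, a single canonical record definition can replace the slightly different copies each agency used to maintain:

```python
# Sketch: one canonical data element shared across agencies, instead of
# each agency keeping its own slightly different (and eventually conflicting)
# copy. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class CitizenRecord:
    """Single authoritative definition, referenced by every agency."""
    citizen_id: str
    full_name: str
    date_of_birth: str      # ISO 8601
    registered_address: str

record = CitizenRecord("12345", "Jane Doe", "1980-04-02", "1 Main St")

# Agencies derive the views they need from the shared record rather than
# maintaining duplicate profiles with small, corrupting differences:
tax_view = {"id": record.citizen_id, "name": record.full_name}
land_view = {"id": record.citizen_id, "address": record.registered_address}
print(tax_view, land_view)
```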

While this may be uncomfortable for some agencies – most likely those which do not want to share their data with the central government, or to use applications standardized with the rest of government – it also supports a very high level of government transparency: a controversial but essential goal for all developing (and developed) governments.

As governments continue to focus on data center consolidation and the economic, environmental, and enabling qualities of virtualization and on-demand compute resources, integration of the ESBaaS makes a lot of sense.

There are some very nice articles related to ESBs on the net, which may help you better understand the concept or give you some additional ideas.

Let us know your opinion or ideas on ESBaaS.
