Can IT Standards Facilitate Innovation?

IT professionals continue to debate the benefits of standardization versus the benefits of innovation, and the potential for standards to inhibit engineers' and software developers' ability to develop creative solutions to business opportunities and challenges.  At the Open Group Conference in San Diego last week (3–5 February) the topic of standards and innovation popped up not only in presentations, but also in sidebar conversations around the conference venue.

In his presentation "SOA4BT (Service-Oriented Architecture for Business Technology) – From Business Services to Realization," Nikhil Kumar noted that with rigid standards there is "always a risk of service units creating barriers to business units."  The idea is that service and IT organizations must align their intended use of standards with the needs of the business units.  Kumar further described a traditional cycle in which:

  • Enterprise drivers establish ->
  • Business-derived technical drivers, which encounter ->
  • Legacy and traditional constraints, which result in ->
  • "Business required" technologies and technology-enabled SOAs

Going through this cycle does not require a process with heavy overhead; it simply ensures that the use of a standard, or of a standard business architecture framework, drives the business services group (IT) into the business unit circle.  While IT is the source of many innovative ideas and deployments of emerging technologies, the business units are the ultimate beneficiaries of innovation, which allows them to address and respond to rapidly emerging opportunities and market requirements.

Standards come in many shapes and sizes.  A standard may be national or international, such as ISO 20000 (service delivery), NIST 800-53 (security), or BICSI 002-2011 (data center design and operations).  Standards may also be internal to an organization or industry, such as standardizing databases, applications, data formats, and virtual appliances within a cloud computing environment.

In his presentation "The Implications of EA in New Audit Guidelines (COBIT 5)," Robert Weisman noted there are now more than 36,500 TOGAF (The Open Group Architecture Framework) certified practitioners worldwide, with more than 60 certified training organizations providing TOGAF certification.  According to ITSMinfo.com, in 2012 alone more than 263,000 ITIL Foundation certifications were granted (for service delivery), and ISACA notes more than 4,000 COBIT 5 certifications were granted (for IT planning, implementation, and governance) in the same period.

With a growing number of organizations either requiring or providing training in enterprise architecture, service delivery, and governance disciplines, it is becoming clear that organizations need a more structured method of designing effective service-orientation into their IT systems, both for operational efficiency and to support more effective decision support systems and performance reporting.  These standards and frameworks attempt to give both business and IT greater structure when designing technology toolsets and solutions for business requirements.

So the use of standards becomes very effective for providing structure and guidelines for IT toolset and solutions development.  To address the issue of innovation, several ideas are important to consider, including:

  • Developing an organizational culture of shared vision, values, and goals
  • Developing a standardized toolkit of virtual appliances, interfaces, platforms, and applications (a catalog sketch follows this list)
  • Accepting the need for continual review of existing tools, improving those tools to match business requirements, and allowing further development when existing utilities and tools are not adequate to the task
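
To make the toolkit idea concrete, here is a minimal Python sketch of an internal catalog of approved virtual appliances.  Every name, image reference, and team in it is invented purely for illustration, not taken from any framework mentioned above.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ApplianceSpec:
    """One entry in an internal catalog of approved virtual appliances (hypothetical)."""
    name: str
    version: str
    image: str          # reference to the approved machine image
    owner_team: str     # who maintains and patches the appliance


# The IT group curates the catalog; business units deploy only from it,
# which removes per-project rebuilding of the same base components.
STANDARD_CATALOG = {
    spec.name: spec
    for spec in (
        ApplianceSpec("postgres-db", "9.4-std1", "img-db-0091", "data-services"),
        ApplianceSpec("web-frontend", "2.3-std4", "img-web-0142", "platform"),
    )
}


def request_appliance(name: str) -> ApplianceSpec:
    """Resolve a deployment request against the standard catalog."""
    try:
        return STANDARD_CATALOG[name]
    except KeyError:
        # Gaps in the catalog feed the continual-review loop described above.
        raise LookupError(f"{name!r} is not a standard appliance; open a review request")
```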

Once an aligned vision of business goals is achieved, a standard toolset published, and IT and business units better integrated as teams, additional benefits become apparent:

  • Duplication of effort is reduced with the availability of standardized IT tools
  • Incompatible or non-interoperable organizational data is either reduced or eliminated
  • More development effort is applied to developing new solutions, rather than developing basic or standardized components
  • Investors will have much more confidence not only in management’s ability to make the best use of existing resources and budgets, but also in the organization’s ability to exploit new business opportunities
  • Focusing on a standard set of utilities and applications, such as database software, will not only improve interoperability, but also enhance the organization’s ability to influence vendor service-level agreements and support agreements, as well as reduce cost with volume purchasing

Rather than view standards as an inhibitor or barrier to innovation, business units and other organizational stakeholders should view standards as a method not only of facilitating SOAs and interoperability, but also of relieving developers from the burden of constantly recreating common sets and libraries of underlying IT utilities.  If developers are free to focus on pure solutions development and on responding to emerging opportunities, relying on both technical and process standardization to guide their efforts, the result will greatly enhance an organization’s ability to be agile while still ensuring a higher level of security, interoperability, systems portability, and innovation.

PTC 2015 Wraps Up with Strong Messages on SDNs and Automation

Software Defined Networking (SDN) and Network Function Virtualization (NFV) themes dominated workshops and side conversations throughout the PTC 2015 venue in Honolulu, Hawai’i this week.

Carrier SDNs, or more specifically provisioning automation platforms for service provider interconnections, have crept into nearly all marketing materials and elevator pitches in discussions with submarine cable operators, networks, Internet Exchange Points, and carrier hotels.

While some of the material may have included a bit of “SDN washing,” for the most part each operator and service provider engaging in the discussion understands and is scrambling to address the need for better communications access, and is very serious in acknowledging a pending industry “paradigm shift” in service delivery models.

Presentations by companies such as Ciena and Riverbed showed a mature service delivery structure based on SDNs, while PacNet and Level 3 Communications (formerly TW Telecom) presented functional on-demand, self-service models of both service provisioning and a value-added marketplace.

Steve Alexander from Ciena explained some of the challenges the industry must address, such as the development of cross-industry SDN-enabled service delivery and provisioning standards.  In addition, as service providers move into service delivery automation, they must still be able to provide a discriminating or unique selling point by considering:

  • How to differentiate their service offering
  • How to differentiate their operations environment
  • How to ensure industry-acceptable delivery and provisioning time cycles
  • How to deal with legacy deployments

Alexander also emphasized that, as an industry, we need to get away from physical wiring where possible.  With 100Gbps ports, and the ability to create a software abstraction of individual circuits within the 100Gbps resource pool (as an example), a lot of virtual or logical provisioning can be accomplished without the need for dozens or hundreds of physical cross-connections.
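
As a rough illustration of that abstraction, the Python sketch below models a 100Gbps port as a capacity pool from which logical circuits are carved and released entirely in software.  It is a simplified, hypothetical model of the concept, not any vendor's SDN controller API.

```python
import itertools


class PortCapacityPool:
    """Carves logical circuits out of a physical port's capacity (illustrative)."""

    def __init__(self, capacity_gbps: int = 100) -> None:
        self.capacity_gbps = capacity_gbps
        self.circuits: dict[int, int] = {}   # circuit id -> allocated Gbps
        self._ids = itertools.count(1)

    @property
    def available_gbps(self) -> int:
        return self.capacity_gbps - sum(self.circuits.values())

    def provision(self, gbps: int) -> int:
        """Allocate a logical circuit, as an SDN controller might on request."""
        if gbps > self.available_gbps:
            raise ValueError(f"only {self.available_gbps} Gbps free")
        circuit_id = next(self._ids)
        self.circuits[circuit_id] = gbps
        return circuit_id

    def release(self, circuit_id: int) -> None:
        """Tear down a circuit, returning its capacity to the pool."""
        self.circuits.pop(circuit_id)


port = PortCapacityPool(100)
a = port.provision(10)        # a 10Gbps logical circuit, no physical re-wiring
b = port.provision(40)
port.release(a)               # capacity is reclaimed instantly in software
print(port.available_gbps)    # -> 60
```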

The result of this effort should be a more dynamic environment, both within a single service provider and in a broader community marketplace such as a carrier hotel or large telecom interconnection facility (e.g., The Westin Building, 60 Hudson, One Wilshire).  Some examples of actual and required deployments included (a simple ordering sketch follows the list):

  • A bandwidth on-demand marketplace
  • Data center interconnections, including for data center operators which have multiple interconnected meet-me points spread across a geographic area
  • Interconnection to other services within the marketplace, such as cloud service providers (e.g., Amazon Direct Connect, Azure, Softlayer, etc.), content delivery networks, SaaS, and disaster recovery capacity and services
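
For a feel of what ordering in such a marketplace might look like, here is a minimal, hypothetical sketch of an interconnect request.  The schema, identifiers, and payload fields are assumptions invented for illustration, not any real operator's API.

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class InterconnectOrder:
    """An on-demand interconnect request in a carrier-hotel marketplace (hypothetical schema)."""
    a_end: str            # buyer's port, e.g. a meet-me-room cross-connect
    z_end: str            # target service, e.g. a cloud on-ramp
    bandwidth_gbps: int
    duration_hours: int   # elastic term, down to the hour


order = InterconnectOrder(
    a_end="westin-bldg:mmr2:port-0117",      # illustrative identifiers only
    z_end="cloud-onramp:aws-direct-connect",
    bandwidth_gbps=10,
    duration_hours=48,
)

# In a real marketplace this payload would be POSTed to the operator's
# provisioning API; here we simply render the request body.
print(json.dumps(asdict(order), indent=2))
```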

Robust discussions on standards also spawned debate.  With SDNs, much like any other emerging use of technologies or business models, there are both competing and complementary standards.  Even terms such as Network Function Virtualization (NFV), while useful, do not have much depth within standard taxonomies or definitions.

During the PTC 2015 session entitled “Advanced Capabilities in the Control Plane Leveraging SDN and NFV Toward Intelligent Networks,” a long list of current standards and products supporting the “concept” of SDNs was presented, including:

  • Open Contrail
  • Open Daylight
  • Open Stack
  • Open Flow
  • OPNFV
  • ONOS
  • OvS
  • Project Floodlight
  • Open Networking
  • and on and on….

For consumers and small network operators this is a very good development, and it will certainly usher in a new era of on-demand self-service capacity provisioning, elastic provisioning (short-term service contracts, even down to the minute or hour), carrier hotel-based bandwidth and service marketplaces, and variable usage metering and costs, allowing much better use of OPEX budgets.
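
A toy example of the economics: billing a short-term circuit by the hour rather than by monthly commitment.  The rates below are invented purely for illustration, not actual carrier pricing.

```python
# Illustrative only: rates are invented, not actual carrier pricing.
HOURLY_RATE_PER_GBPS = 0.75   # hypothetical marketplace spot rate, USD
MONTHLY_FIXED_10G = 2500.00   # hypothetical fixed-term wholesale price, USD


def metered_cost(gbps: int, hours: float) -> float:
    """Usage-based cost: pay only for the capacity and time actually used."""
    return gbps * hours * HOURLY_RATE_PER_GBPS


# A 10Gbps circuit needed only for a 72-hour data migration:
burst = metered_cost(10, 72)
print(f"metered 72h: ${burst:,.2f} vs fixed month: ${MONTHLY_FIXED_10G:,.2f}")
# -> metered 72h: $540.00 vs fixed month: $2,500.00
```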

For service providers (according to discussions with several North Asian telecom carriers), it is not quite as attractive, as they generally prefer long-term, fixed contracts or wholesale capacity sales.

The connection and integration of cloud services with telecom or network services is quite clear.  At some point provisioning of both telecom and compute/storage/application services will be through a single interface, on-demand, elastic (use only what you need and for only as long as you need it), usage-based (metered), and favor the end user.

While most operators get the message and are in the process of developing and deploying their first-iteration solutions, others simply still have a bit of homework to do.  In the words of one CEO from a very large international data center company, “we really need to have a strategy to deal with this multi-cloud, hybrid cloud, or whatever you call it thing.”

Oh my…

Focusing on Cloud Portability and Interoperability

Cloud computing has helped us understand both the opportunity, and the need, to decouple physical IT infrastructure from the requirements of business.  In theory, cloud computing not only greatly enhances an organization’s ability to decommission inefficient data center resources, but, even more importantly, it eases the process of moving to integration and service-orientation within supporting IT systems.

Current cloud computing standards, such as those published by the US National Institute of Standards and Technology (NIST), provide very good definitions and a solid reference architecture for understanding the vision of cloud computing at a high level.

However, these definitions, while good for addressing the vision of cloud computing, are not at the level of detail needed to really understand the potential impact of cloud computing within an existing organization, nor the potential of enabling data and systems resources to meet the need for interoperability of data in a 2020 or 2025 IT world.

The key to interoperability, and subsequent portability, is a clear set of standards.  The Internet emerged as a collaboration of academic, government, and private industry development which bypassed much of the normal technology vendor desire to create a proprietary product or service.  The cloud computing world, while having deep roots in mainframe computing, time-sharing, grid computing, and other web hosting services, was really thrust upon the IT community with little fanfare in the mid-2000s.

While NIST, the Open Grid Forum, OASIS, DMTF, and other organizations have developed some level of standardization for virtualization and portability, the reality is that applications, platforms, and infrastructure are still largely tightly coupled, restricting the ease with which most developers could accelerate higher levels of integration and interconnection of data and applications.
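
One common way to loosen that coupling is a thin, provider-neutral layer inside the application, so that moving between clouds means swapping an adapter rather than rewriting the application.  A minimal Python sketch follows; the provider adapters are stand-ins, and the commented SDK calls are hypothetical, not real vendor signatures.

```python
from typing import Protocol


class ObjectStore(Protocol):
    """Provider-neutral contract the application codes against."""
    def upload(self, bucket: str, key: str, data: bytes) -> None: ...


class ProviderAAdapter:
    """Wraps one vendor's SDK behind the neutral contract (stand-in code)."""
    def upload(self, bucket: str, key: str, data: bytes) -> None:
        # e.g. vendor_a_sdk.put_object(bucket, key, data)  -- hypothetical call
        print(f"provider A: stored {key} in {bucket}")


class ProviderBAdapter:
    """A second vendor, same contract; the application is unchanged."""
    def upload(self, bucket: str, key: str, data: bytes) -> None:
        # e.g. vendor_b_sdk.blob(bucket, key).write(data)  -- hypothetical call
        print(f"provider B: stored {key} in {bucket}")


def publish_invoice(store: ObjectStore, invoice_id: str, body: bytes) -> None:
    # Application logic depends only on the portable interface.
    store.upload("invoices", f"{invoice_id}.pdf", body)


publish_invoice(ProviderAAdapter(), "2015-0042", b"%PDF-...")
publish_invoice(ProviderBAdapter(), "2015-0042", b"%PDF-...")
```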

NIST’s Cloud Computing Standards Roadmap (SP 500-291 v2) states:

“…the migration to cloud computing should enable various multiple cloud platforms seamless access between and among various cloud services, to optimize the cloud consumer expectations and experience.

Cloud interoperability allows seamless exchange and use of data and services among various cloud infrastructure offerings and to the data and services exchanged to enable them to operate effectively together.”

Very easy to say.  The reality, however, particularly with PaaS and SaaS libraries and services, is that few fully interchangeable components exist, and any information sharing is a compromise in flexibility.

The Open Group, in its document “Cloud Computing Portability and Interoperability,” simplifies the problem into a single statement:

“The cheaper and easier it is to integrate applications and systems, the closer you are getting to real interoperability.”

The alternative is of course an IT world that is restrained by proprietary interfaces, extending the pitfalls and dangers of vendor lock-in.

What Can We Do?

The first thing is that the cloud consumer world must take a stand and demand that vendors produce services and applications based on interoperability and data portability standards.  No IT organization in the current IT maturity continuum should be procuring systems that do not support an open, industry-standard, service-oriented infrastructure, platform, and applications reference model (Open Group).

In addition to the need for interoperable data and services, the concept of portability is essential to developing, operating, and maintaining effective disaster management and continuity of operations procedures.  No IT infrastructure, platform, or application should be considered which does not allow and embrace portability.  This includes NIST’s guidance stating:

“Cloud portability allows two or more kinds of cloud infrastructures to seamlessly use data and services from one cloud system and be used for other cloud systems.”

The bottom line for all CIOs, CTOs, and IT managers: accept the need for service-orientation within all existing or planned IT services and systems.  Embrace Service-Oriented Architectures and Enterprise Architecture, and avoid at all costs the potential for vendor lock-in when considering any level of infrastructure or service.

Standards are the key to portability and interoperability, and IT organizations have the power to keep forcing vendor adoption of, and compliance with, those standards.  Do not accept anything that does not fully support the need for data interoperability.
