Putting Enterprise Architecture Principles to Work

This week brought another great consulting gig, working with old friends and respected colleagues.  The consultation centered on brainstorming a new service for their company, and on how best to get it into operation.

The new service vision was pretty good.  The service would fill a hole, or shortfall, in the industry, better enabling their customers to compete in markets both in the US and abroad.  However, the process for planning and delivering this service simply did not exist.

The team’s sense of urgency to deliver the service was high, based on a perception that if they did not move quickly, they would suffer an opportunity loss while competitors filled the service need themselves.

While it may have been easy to “jump on the bandwagon” and share the team’s enthusiasm, they lacked several critical components of delivering a new service, including:

  • No specific product or service definition
  • No market analysis or survey, even at a high level
  • No cost analysis or revenue projection
  • No risk analysis
  • No high level implementation plan or schedule

“We have great ideas from vendors, and are going to try to put together a pilot test as quickly as possible.  We are trying to gather a few of our customers to participate right now,” stated one of the team.

At that point, reluctantly, I had to put on the brakes.  While making no attempt to dampen the team’s enthusiasm, I pushed them to consider additional requirements needed for a successful service launch, such as:

  • The need to build a business case
  • The need for integration of the service into existing back office systems, such as inventory, book-to-bank, OSS, management and monitoring, finance and billing, executive dashboards (KPIs, service performance, etc.)
  • Staffing and training requirements
  • Options of in-sourcing, outsourcing, or partnering to deliver the service
  • Developing RFPs (even simple RFPs) to help evaluate vendor options
  • and a few other major items

“That just sounds like too much work.  If we need to go through all that, we’ll never deliver the service.  Better to just work with a couple vendors and get it on the street.”

I should note the service would touch many, many people in the target industry, which is very tech-centric.  Success or failure of the service could have a major impact on the success or failure of many in the industry.

As a card-carrying member of the enterprise architecture cult, and a proponent of other IT-related frameworks such as ITIL, COBIT, Open FAIR, and other business modeling, I know there are bound to be conflicts between following a very structured approach to building business services and the need for agile creativity and innovation.

In this case, I asked the team to indulge me for a few minutes while I mapped out a simple, structured approach to developing and delivering the envisioned service.  By using a simplified version of the TOGAF Architecture Development Method (ADM), and adding a few lines related to standards and service development methodology, such as the vision –> AS-IS –> gap analysis –> solutions development model, it did not take long for the team to reconsider their aggressive approach.
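The vision –> AS-IS –> gap analysis –> solutions flow can be sketched in a few lines of code.  A minimal sketch; the capability names below are hypothetical illustrations, not items from the actual engagement:

```python
# Toy sketch of the vision -> AS-IS -> gap analysis -> solutions flow.
# Capability names are invented for illustration.

target_vision = {"service definition", "market analysis", "cost model",
                 "risk analysis", "implementation plan", "pilot test"}

as_is = {"pilot test"}  # the team had only a vendor-driven pilot in mind

# Gap analysis: everything the target vision needs that we do not yet have
gaps = target_vision - as_is

# Each gap becomes a work item in the solutions-development backlog
backlog = sorted(gaps)
for item in backlog:
    print(f"Develop solution for: {item}")
```

The point of the exercise is not the code, but that the gap analysis falls out mechanically once the vision and the AS-IS baseline are written down.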

When preparing a chart of timelines using this “TOGAF Light” EA framework, the timelines were oddly similar to those of the aggressive approach.  The main difference: at the end of the EA approach, the result was not just a service, but a very logical, disciplined, measurable, governable, and flexible one.

Sounds a bit utopian, but in reality we were able to get to the service delivery with a better product, without sacrificing any innovation, agility, or market urgency.

This is the future of IT.  As we continue to move away from the service-delivery frenzy of the Internet Age, and begin focusing on the business nature of IT, including the role it plays in critical global infrastructures, the disciplines of product and service development and delivery will continue to gain importance.

Developing a New “Service-Centric IT Value Chain”

As IT professionals we have been overwhelmed with different standards for each component of architecture, service delivery, governance, security, and operations.  Not only does IT need to ensure technical training and certification, but professionals are also expected to pursue certifications in ITIL, TOGAF, COBIT, PMP, and a variety of other frameworks – at a high cost in both time and money.

Wouldn’t it be nice to have an IT framework or reference architecture which brings all the important components of each standard or recommendation into a single model which focuses on the most important aspect of each existing model?

The Open Group is well-known for publishing TOGAF (The Open Group Architecture Framework), in addition to a variety of other standards and frameworks related to Service-Oriented Architectures (SOA), security, risk, and cloud computing.  In the past few years, recognizing the impact of broadband, cloud computing, SOAs, and the need for a holistic enterprise architecture approach to business and IT, the group has published many common-sense but powerful recommendations, such as:

  • TOGAF 9.1
  • Open FAIR (Risk Analysis and Assessment)
  • SOCCI (Service-Oriented Cloud Computing Infrastructure)
  • Cloud Computing
  • Open Enterprise Security Architecture
  • Document Interchange Reference Model (for interoperability)
  • and others.

The Open Group’s latest project intended to streamline and focus IT systems development is called the “IT4IT” Reference Architecture.  While still in the development, or “snapshot,” phase, IT4IT is surprisingly easy to read and understand, and, most importantly, logical.

“The IT Value Chain and IT4IT Reference Architecture represent the IT service lifecycle in a new and powerful way. They provide the missing link between industry standard best practice guides and the technology framework and tools that power the service management ecosystem. The IT Value Chain and IT4IT Reference Architecture are a new foundation on which to base your IT operating model. Together, they deliver a welcome blueprint for the CIO to accelerate IT’s transition to becoming a service broker to the business.” (Open Group’s IT4IT Reference Architecture, v 1.3)

The IT4IT Reference Architecture acknowledges changes in both technology and business resulting from the incredible impact the Internet and automation have had on both enterprise and government use of information and data.  However, the document also makes a compelling case that IT systems, theory, and operations have kept up with neither existing IT support technologies nor the business visions and objectives IT is meant to serve.

IT4IT’s development team is a large, global collaborative effort including vendors, enterprise, telecommunications, academia, and consulting companies.  This helps drive a vendor or technology neutral framework, focusing more on running IT as a business, rather than conforming to a single vendor’s product or service.  Eventually, like all developing standards, IT4IT may force vendors and systems developers to provide a solid model and framework for developing business solutions, which will support greater interoperability and data sharing between both internal and external organizations.

The vision and objectives for IT4IT include two major components: the IT Value Chain and the IT4IT Reference Architecture.  Within the IT4IT Core are sections providing guidance, including:

  • IT4IT Abstractions and Class Structures
  • The Strategy to Portfolio Value Stream
  • The Requirement to Deploy Value Stream
  • The Request to Fulfill Value Stream
  • The Detect to Correct Value Stream

Each of the above sections has borrowed from, or further developed, ideas and activities from ITIL, COBIT, and TOGAF, but takes a giant leap by incorporating cloud computing, SOAs, and enterprise architecture into the product.

As the IT4IT Reference Architecture is completed, and supporting roadmaps developed, the IT4IT concept will no doubt find a large legion of supporters, as many, if not most, businesses and IT professionals find the certification and knowledge path for ITIL, COBIT, TOGAF, and other supporting frameworks either too expensive, or too time consuming (both in training and implementation).

Take a look at IT4IT at the Open Group’s website, and let us know what you think.  Too light?  Not needed?  A great idea or concept?  Let us know.

NexGen Cloud Conference in San Diego – Missing the Point

The NexGen Cloud Computing Conference kicked off on Thursday in San Diego with a fair amount of hype and a lot of sales people.  Granted, the intent of the conference is for cloud computing vendors to find and develop either sales channels or business development opportunities within the market.

For an engineer, the conference will probably result in a fair amount of frustration, but it at least provides a level of awareness of how an organization’s sales, marketing, and business teams are approaching their vision of a cloud computing product or service delivery.

However, one presentation stood out.  Terry Hedden, from Marketopia, made some very good points.  His presentation was entitled “How to Build a Successful Cloud Practice.”  While the actual presentation is not so important, he made several points, which I’ll refer to as “Heddenisms,” that struck me as important enough, or amusing enough, to record.

Some of the following “Heddenisms” were paraphrased, either due to my misunderstanding of his point, or because I thought the point was so profound it needed a bit of additional highlight.

Heddenisms for the Cloud Age:

  • Entire software companies are transitioning to SaaS development.  Lose the idea of licensed software – think of subscription software.
  • Integrators and consultants have a really good future – prepare yourself.
  • The younger generation does not attend tech conferences.  Only old people who think they can sell things, get new jobs, or are trying to put some knowledge to the junk they are selling (the last couple of points are mine).
  • Companies selling hosted SaaS products and services are going to kill those still hanging on at the premises.
  • If you do not introduce cloud services to your customers, your competitor will introduce cloud to your customers.
  • If you are not aspiring to be a leader in cloud, you are not relevant.
  • There is little reason to go into the IaaS business yourself.  Let the big guys build infrastructure – you can make higher margins selling their stuff.  In general, IaaS companies are really bad sales organizations (also mine…).
  • Budgets for security at companies like Microsoft are much higher than for smaller companies.  Thus, it is likely Microsoft’s ability to design, deploy, monitor, and manage secure infrastructure is much higher than the average organization.
  • Selling cloud is easy – you are able to relieve your customers of most up front costs (like buying hardware, constructing data centers, etc.).
  • If you simply direct your customer to Microsoft or Google’s website for a solution, then you are adding no value to your customer.
  • If you hear the word “APP” come up in a conversation, just turn around and run away.
  • If you assist a company in a large SaaS implementation (successfully), they will likely be your customer for life.
  • Don’t do free work or consulting – never (this really hurt me to hear – guilty as charged…).
  • Customers have one concern, and one concern only – Peace of Mind.  Make their pains go away, and you will be successful.  Don’t give them more problems.
  • Customers don’t care what is behind the curtain (such as what kind of computers or routers you are using).  They only care about you taking the pain of stuff that doesn’t make them money away from their lives.
  • Don’t try to sell to IT guys and engineers.  Never.  Never. Never.
  • The best time to work with a company is when they are planning for their technology refresh cycles.

Hedden was great.  While he may have a bit of contempt for engineers (I have thick skin; I can live with the wounds), he provided a very logical and realistic view of how to approach selling and deploying cloud computing.

Now about missing the point.  Perhaps the biggest shortfall of the conference, in my opinion, is that most presentations and even vendor efforts addressed only single silos of issues.  Nobody provided an integrated viewpoint of how cloud computing is actually just one tool an organization can use within a larger, planned architecture.

No doubt I have become bigoted myself after several years of plodding through TOGAF, ITIL, COBIT, Risk Assessments, and many other formal IT-supporting frameworks.  Maybe a career in the military forced me into systems thinking and structured problem solving.  Maybe I lack a higher level of innovative thinking or creativity – but I crave a structured, holistic approach to IT.

Sadly, I got no joy at the NexGen Cloud Computing Conference.  But I would have driven from LA to San Diego just for Hedden’s presentation and training session – that alone made the cost of the conference and time a valuable investment.

Nurturing the Marriage of Cloud Computing and SOAs

In 2009 we began consulting jobs with governments in developing countries, with the primary objective of consolidating data centers across government ministries and agencies into centralized, high-capacity, high-quality data centers.  At the time, nearly all individual ministry or agency data infrastructure was built into small computer rooms or server closets with some added “brute force” air conditioning, no backup generators, no data backup, superficial security, and lots of other ailments.

The vision and strategy was that if we consolidated inefficient, end-of-life, and high-risk IT infrastructure into a standardized and professionally managed facility, national information infrastructure would not only be more secure, but through standardization, volume purchasing agreements, some server virtualization, and development of broadband infrastructure, most of the IT needs of government would be easily fulfilled.

Then of course cloud computing began to mature, and the underlying technologies of Infrastructure as a Service (IaaS) became feasible.  Now, not only were the governments able to decommission inefficient and high-risk IS environments, they would also be able to build virtual data centers with on-demand compute, storage, and network resources.  Basic data center replacement.

Even the remaining committed “server hugger” IT managers and fiercely independent governmental organizations could hardly argue against the benefits of having access to disaster recovery storage capacity through the centralized data center.

As the years passed, and we entered 2014, not only did cloud computing mature as a business model, but senior management began to increase their awareness of various aspects of cloud computing, including the financial benefits, standardization of IT resources, the characteristics of cloud computing, and potential for Platform and Software as a Service (PaaS/SaaS) to improve both business agility and internal decision support systems.

At the same time, information and organizational architecture, governance, and service delivery frameworks such as TOGAF, COBIT, ITIL, and Risk Analysis training reinforced the value of both data and information within an organization, and the need for IT systems to support higher-level architectures supporting decision support systems and market interactions (including Government to Government, Business, and Citizens for the public sector).

2015 will bring cloud computing and architecture together at levels just becoming comprehensible to much of the business and IT world.  The Open Group has made a good first stab at building a standard for this marriage with their Service-Oriented Cloud Computing Infrastructure (SOCCI).  According to the SOCCI standard:

“Infrastructure is a foundational element for enterprise architecture. Infrastructure has been traditionally provisioned in a physical manner. With the evolution of virtualization technologies and application of service-orientation to infrastructure, it can now be offered as a service.

Service-orientation principles originated in the business and application architecture arena. After repeated, successful application of these principles to application architecture, IT has evolved to extending these principles to the infrastructure.”

At first glance the SOCCI standard appears to be a document which creates a mapping between enterprise architecture (TOGAF) and cloud computing.  At second glance, the SOCCI standard really steps toward tightening the loose coupling of standard service-oriented architectures through use of cloud computing tools included with all service models (IaaS/PaaS/SaaS).

The result is an architectural vision which is easily capable of absorbing existing IT requirements, as well as incorporating emerging big data analytics models, interoperability, and enterprise architecture.

Since the early days of 2009, discussion topics with government and enterprise customers have shown a marked transition: from simply justifying the decommissioning of high-risk data centers, to managing data sharing and interoperability, to the potential for over-standardization and other service delivery barriers which might inhibit innovation – or the ability of business units to respond quickly to rapidly changing market opportunities.

2015 will be an exciting year for information and communications technologies.  For those of us in the consulting and training business, the new year is already shaping up to be the busiest we have seen.

It is Time to Get Serious about Architecting ICT

Just finished another ICT-related technical assistance visit with a developing country government. Even in mid-2014, I spend a large amount of time teaching basic principles of enterprise architecture, and the need for adding form and structure to ICT strategies.

Service-oriented architectures (SOA) have been around for quite a long time, with some references going back to the 1980s. ITIL, COBIT, TOGAF, and other ICT standards or recommendations have been around for quite a long time as well, with training and certifications part of nearly every professional development program.

So why is the idea of architecting ICT infrastructure still an abstraction to so many in government and even private industry? It cannot be the lack of training opportunities or publicly available reference materials. It cannot be the lack of technology, or the lack of consultants readily willing to assist in deploying EA, SOA, or interoperability within any organization or industry cluster.

During the past two years we have run several Interoperability Readiness Assessments within governments. The assessment initially takes the form of a survey, and is distributed to a sample of 100 or more participants, with positions ranging from administrative task-based workers, to Cxx or senior leaders within ministries and government agencies.

Questions range from basic ICT knowledge to data sharing, security, and decision support systems.

While the idea of information silos is well-documented and understood, it is still quite surprising to see “siloed” attitudes prevalent in modern organizations.  Take the following question:

Question on Information Sharing

This question did not refer to sharing data outside of the government, but rather within the government.  It indicates a deep lack of trust when interacting with other government agencies, which will of course prevent any chance of developing a SOA or facilitating information sharing among agencies.  The end result is a lower level of both integrity and value in national decision support capability.

The Impact of Technology and Standardization

Most governments are considering or implementing data center consolidation initiatives.  There are several good reasons for this, including:

  • Cost of real estate, power, staffing, maintenance, and support systems
  • Transition from CAPEX-based ICT infrastructure to OPEX-based
  • Potential for virtualization of server and storage resources
  • Standardized cloud computing resources
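The CAPEX-to-OPEX transition in the list above can be made concrete with a back-of-the-envelope comparison; every figure below is invented for illustration:

```python
# Toy CAPEX-vs-OPEX comparison over a refresh cycle.
# All amounts are hypothetical, for illustration only.

years = 5
capex_servers = 400_000          # up-front hardware purchase
capex_annual_support = 60_000    # power, staffing, maintenance per year
opex_cloud_annual = 130_000      # equivalent IaaS subscription per year

# CAPEX model: buy hardware up front, then pay support each year
capex_total = capex_servers + capex_annual_support * years

# OPEX model: no up-front purchase, a flat subscription each year
opex_total = opex_cloud_annual * years

print(f"CAPEX model over {years} years: {capex_total:,}")
print(f"OPEX model over {years} years:  {opex_total:,}")
```

The comparison naturally depends on the refresh cycle length and utilization; the structural point is simply that the OPEX model removes the up-front purchase from the equation.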

While all those justifications for data center consolidation are valid, their value potentially pales in comparison with the potential of more intelligent use of data across organizations, and even externally with outside agencies.  On this point, one senior government official stated:

“Government staff are not necessarily the most technically proficient.  This results in reliance on vendors for support, thought leadership, and in some cases contractual commitments.  Formal project management training and certification are typically not part of the capacity building of government employees.

Scientific approaches to project management, especially ones that lend themselves to institutionalization and adoption across different agencies will ensure a more time-bound and intelligent implementation of projects. Subsequently, overall knowledge and technical capabilities are low in government departments and agencies, and when employees do gain technical proficiency they will leave to join private industry.”

There is also an issue with a variety of international organizations going into developing countries or developing economies and offering no- or low-cost single-use ICT infrastructure, such as for health-related agencies, which is not compatible with any other government-owned or -operated applications or data sets.

And of course the more this occurs, the more difficult it is for government organizations to enable interoperability or data sharing, and thus the ideas of an architecture or data sharing become either impossible or extremely difficult to implement.

The Road to EA, SOAs, and Decision Support

There are several actions to take on the road to meeting our ICT objectives.

  1. Include EA, service delivery (ITIL), governance (COBIT), and SOA training in all university and professional ICT education programs.  It is not all about writing code or configuring switches; we need to ensure a holistic understanding of ICT value in all ICT education, producing a higher level of qualified graduates entering the work force.
  2. Ensure government and private organizations develop or adopt standards or regulations which drive enterprise architecture, information exchange models, and SOAs as a basic requirement of ICT planning and operations.
  3. Ensure executive awareness and support, preferably through a formal position such as the Chief Information Officer (CIO).  Principles developed and published via the CIO must be adopted and governed by all organizations.

Nobody expects large organizations, in particular government organizations, to change their cultures of information independence overnight.  This is a long-term evolution as the world continues to better understand the value and extent of value within existing data sets, and begins creating new categories of data.  Big data, data analytics, and exploitation of both structured and unstructured data will empower those who are prepared, and leave those who are not prepared far behind.

For a government, not having the ability to access, identify, share, analyze, and act on data created across agencies will inhibit effective decision support, with potential impact on disaster response, security, economic growth, and overall national quality of life.

If there is a call to action in this message, it is for governments to take a close look at how their national ICT policies, strategies, human capacity, and operations are meeting national objectives.  Prioritizing use of EA and supporting frameworks or standards will provide better guidance across government, and all steps taken within the framework will add value to the overall ICT capability.

Pacific-Tier Communications LLC provides consulting to governments and commercial organizations on topics related to data center consolidation, enterprise architecture, risk management, and cloud computing.

Why IT Guys Need to Learn TOGAF

Just finished another frustrating day of consulting with an organization that is convinced technology is going to solve their problems.  Have an opportunity?  Throw money and computers at the opportunity.  Have a technology answer to your process problems?  Really?

The business world is changing.  With cloud computing potentially eliminating the need for some current IT roles, such as physical server huggers…, information technology professionals, or more appropriately information and communications technology (ICT) professionals, need to rethink their roles within organizations.

Is it acceptable to simply be a technology specialist, or do ICT professionals also need to be an inherent part of the business process?  Yes, a rhetorical question, and any negative answer is wrong.  ICT professionals are rapidly being relieved of the burden of data centers, servers (physical servers), and a need to focus on ensuring local copies of MS Office are correctly installed, configured, and have the latest service packs or security patches installed.

You can fight the idea, argue the concept, but in reality cloud computing is here to stay, and will only become more important in both the business and financial planning of future organizations.

Now those copies of MS Office are hosted on MS 365 or Google Docs, and your business users are telling you either quickly meet their needs or they will simply bypass the IT organization and use an external or hosted Software as a Service (SaaS) application – in spite of your existing mature organization and policies.

So what is this TOGAF stuff?  Why do we care?

Well…

As it should be, ICT is firmly being set in the organization as a tool to meet business objectives.  We no longer have to consider the limitations or “needs” of IT when developing business strategies and opportunities.  SaaS and Platform as a Service (PaaS) tools are becoming mature, plentiful, and powerful.

Argue the point, fight the concept, but if an organization isn’t at least considering a requirement for data and systems interoperability, the use of large data sets, and implementation of a service-oriented architecture (SOA), it will not be competitive or effective in the next generation of business.

TOGAF, which is “The Open Group Architecture Framework,” brings structure to the development of ICT as a tool for meeting business requirements.   TOGAF is a tool which will force each stakeholder, including senior management and business unit management, to work with ICT professionals to apply technology in a structured framework that follows these basic steps:

  • Develop a business vision
  • Determine your “AS-IS” environment
  • Determine your target environment
  • Perform a gap analysis
  • Develop solutions to meet the business requirements and vision, and fill the “gaps” between “AS-IS” and “Target”
  • Implement
  • Measure
  • Improve
  • Re-iterate
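The implement –> measure –> improve –> re-iterate tail of those steps is essentially a loop.  A minimal sketch, with made-up requirement names and a toy KPI (neither taken from TOGAF itself), might look like:

```python
# Minimal sketch of the implement -> measure -> improve -> re-iterate loop.
# Requirement names, the KPI, and the iteration cap are all invented.

def measure(solution: dict) -> float:
    """Pretend KPI: fraction of business requirements the solution meets."""
    met = sum(1 for ok in solution["requirements"].values() if ok)
    return met / len(solution["requirements"])

solution = {"requirements": {"billing integration": True,
                             "OSS integration": False,
                             "executive dashboard": True}}

iteration = 0
while measure(solution) < 1.0 and iteration < 10:
    # "Improve": address the first unmet requirement, then re-measure
    for req, ok in solution["requirements"].items():
        if not ok:
            solution["requirements"][req] = True
            break
    iteration += 1

print(iteration, measure(solution))  # loop ends once all gaps are closed
```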
Of course TOGAF is a complex architecture framework, with a lot more stuff involved than the above bullets.  However, the point is ICT must now participate in the business planning process – and really become part of the business, rather than a vendor to the business.

As a life-long ICT professional, it is easy for me to fall into indulging in tech things.  I enjoy networking, enjoy new gadgets, and enjoy anything related to new technology.  But it was not until about 10 years ago, when I started taking a formal, structured approach to understanding enterprise architecture and fully appreciating the value of service-oriented architectures, that I felt as if my efforts were really contributing to the success of an organization.

TOGAF was one course of study that really benefitted my understanding of the value and role IT plays in companies and government organizations.  TOGAF provides both a process and a structure for business planning.

You may have a few committed DevOps evangelists who disagree with the structure of TOGAF, but in reality, once the “guardrails” are in place, even DevOps can fit into the process.  TOGAF and other frameworks are not intended to stifle innovation – just to encourage that innovation to meet the goals of the organization, not the goals of the innovators.

While just one of several candidate enterprise architecture frameworks (including the US Federal Enterprise Architecture Framework/FEAF and the Department of Defense Architecture Framework/DoDAF), TOGAF is now universally accepted, and accompanying certifications are well understood within government and enterprise.

What’s an IT Guy to Do?

Now we can send the “iterative” process back to the ICT guy’s viewpoint.  Much like the telecom engineers who operated DMS 250s, 300s, and 500s, the existing IT and ICT professional corps will need to accept the reality that they must either embrace cloud computing, or hope they are close to retirement.  Who needs a DMS 250 engineer in a world of soft switches?  Who needs a server manager in a world of Infrastructure as a Service?  Unless of course you work as an infrastructure technician at a cloud service provider…

Ditto for those who specialize in maintaining copies of MS Office and a local MS Exchange server.  Sadly, your time is limited, and quickly running out.  Either become a cloud computing expert, in some field within cloud computing’s broad umbrella of components, or plan to be part of the business process.  To be effective as a member of the organization’s business team, you will need skills beyond IT – you will need to understand how ICT is used to meet business needs, and the impact of a rapidly evolving toolkit offered by all strata of the cloud stack.

Even better, become a leader in the business process.  If you can navigate your way through a TOGAF course and certification, you will acquire a much deeper appreciation for how ICT tools and resources could, and likely should, be planned and employed within an organization to contribute to the success of any individual project, or the re-engineering of ICTs within the entire organization.


John Savageau is TOGAF 9.1 Certified

ICT Modernization Planning

The current technology refresh cycle presents many opportunities, and challenges, to both organizations and governments.  The potential for service-oriented architectures, interoperability, collaboration, and continuity of operations is an attractive outcome of the technologies and business models available today.  The challenges are more related to business processes and human factors, both of which require organizational transformation to take best advantage of the collaborative environments enabled through use of cloud computing and access to broadband communications.

For governments and organizations, gaining the most benefit from an interoperable environment may be facilitated through use of business tools such as cloud computing.  Cloud computing and its underlying technologies may create an operational environment supporting many strategic objectives being considered within government and private sector organizations.

Reaching target architectures and capabilities is not a single action; it requires a clear understanding of the current “as-is” baseline capabilities, the target requirements, the gaps or capabilities needed to reach the target, and a clear transition plan to bring the organization from the “as-is” baseline to the target goal.

To most effectively reach that goal requires an understanding of the various contributing components within the transformational ecosystem.  In addition, planners must keep in mind the goal is not implementation of technologies, but rather consideration of technologies as needed to facilitate business and operations process visions and goals.

Interoperability and Enterprise Architecture

Information technology, particularly communications-enabled technology, has enhanced business processes, education, and the quality of life for millions around the world. However, ICT has traditionally created silos of information which are rarely integrated or interoperable with other data systems or sources.

As the science of enterprise architecture development and modeling, service-oriented architectures, and interoperability frameworks continue to force the issue of data integration and reuse, ICT developers are looking to reinforce open standards allowing publication of external interfaces and application programming interfaces.

Cloud computing, a rapidly maturing framework for virtualization, standardized data, application, and interface structure technologies, offers a wealth of tools to support development of both integrated and interoperable ICT resources within organizations, as well as among their trading, shared, or collaborative workflow community.

The Institute for Enterprise Architecture Development defines enterprise architecture (EA) as a “complete expression of the enterprise; a master plan which acts as a collaboration force between aspects of business planning such as goals, visions, strategies and governance principles; aspects of business operations such as business terms, organization structures, processes and data; aspects of automation such as information systems and databases; and the enabling technological infrastructure of the business such as computers, operating systems and networks.”

ICT, including utilities such as cloud computing, should focus on supporting the holistic objectives of organizations implementing an EA. Siloed, non-interoperable data will generally have less value than shared, reusable data, which greatly increases systems reliability and data integrity.

Business Continuity and Disaster Recovery (BCDR)

Recent surveys of governments around the world indicate that most have limited or no disaster management or continuity of operations planning. The risk of losing critical national data resources to natural or man-made disasters is high, and the ability of most governments to maintain government and citizen services during a disaster is limited, based on the amount of time required to restart government services (recovery time objective/RTO) as well as the point to which data can be restored (recovery point objective/RPO).

In existing ICT environments, particularly those with organizational and data resource silos, RTOs and RPOs can extend to near indefinite if neither a data backup plan nor systems and service restoration capacity is in place. This is particularly acute if the processing environment includes legacy mainframe applications which have no mirrored recovery capacity available upon failure or loss of service due to disaster.
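To make the RTO/RPO trade-off concrete, here is a minimal sketch; the figures, function names, and restore rates are illustrative assumptions for this example, not measurements from any real environment:

```python
# Hypothetical illustration of RPO and RTO arithmetic. All numbers below
# (backup interval, data volume, restore rate) are assumed for the example.

def worst_case_rpo_hours(backup_interval_hours: float) -> float:
    """Worst-case data loss window: data written just after the last backup
    is lost if a disaster strikes immediately before the next one."""
    return backup_interval_hours

def estimated_rto_hours(restore_gb: float, restore_rate_gb_per_hour: float,
                        service_restart_hours: float) -> float:
    """Rough RTO: time to restore data plus time to restart services."""
    return restore_gb / restore_rate_gb_per_hour + service_restart_hours

# A nightly backup with a slow, silo-bound restore path:
print(worst_case_rpo_hours(24))           # up to a full day of data lost
print(estimated_rto_hours(2000, 100, 4))  # restore 2 TB at 100 GB/hr, then restart
```

Mirrored capacity in a second facility effectively drives both numbers toward zero, which is the case the following paragraphs make for cloud-based BCDR.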

Cloud computing can provide a standards-based environment that fully supports near-zero RTO/RPO requirements. Within the current limitation that cloud computing is based on Intel-compatible architectures, nearly any existing application or data source can be migrated into a virtual resource pool. Once within a cloud computing Infrastructure as a Service (IaaS) environment, setting up distributed processing or backup capacity is relatively uncomplicated, assuming the environment has adequate broadband access to the end user and between processing facilities.

Cloud computing-enabled BCDR also opens opportunities for developing either PPPs, or considering the potential of outsourcing into public or commercially operated cloud computing compute, storage, and communications infrastructure.  Again, the main limitation being the requirement for portability between systems.

Transformation Readiness

ICT modernization will drive change within all organizations.  Transformational readiness is not a matter of technology, but a combination of factors including rapidly changing business models, the need for many-to-many real-time communications, flattening of organizational structures, and the continued entry of technology and communications savvy employees into the workforce.

The potential of outsourcing utility compute, storage, application, and communications will eliminate the need for much physical infrastructure, such as redundant or obsolete data centers and server closets.  Roles will change based on the expected shift from physical data centers and ICT support hardware to virtual models based on subscriptions and catalogs of reusable application and process artifacts.

Cloud computing is one business model for accomplishing ICT modernization. It relies on technologies such as server and storage resource virtualization, and adds operational characteristics such as on-demand resource provisioning, reducing the time needed to procure ICT resources in response to emerging operational or other business opportunities.

IT management and service operations move from a workstation environment to a user interface driven by SaaS.  The skills needed to drive ICT within the organization will need to change, becoming closer to the business, while reducing the need to manage complex individual workstations.

IT organizations will need to change, as organizations may elect to outsource most or all of their underlying physical data center resources to a cloud service provider, either in a public or private environment.  This could eliminate the need for some positions, while driving new staffing requirements in skills related to cloud resource provisioning, management, and development.

Business unit managers may be able to take advantage of other aspects of cloud computing, including access to on-demand compute, storage, and applications development resources.  This may increase their ability to quickly respond to rapidly changing market conditions and other emerging opportunities.   Business unit managers, product developers, and sales teams will need to become familiar with their new ICT support tools.  All positions from project managers to sales support will need to quickly acquire skills necessary to take advantage of these new tools.

The Role of Cloud Computing

Cloud computing is a business representation of a large number of underlying technologies. Including virtualization, development environments, and hosted applications, cloud computing provides a framework for developing standardized service models, deployment models, and service delivery characteristics.

The US National Institute of Standards and Technology (NIST) provides a definition of cloud computing accepted throughout the ICT industry.

“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

While organizations face challenges related to implementing enterprise architectures and interoperability, cloud computing continues to rapidly develop as an environment with a rich set of compute, communication, development, standardization, and collaboration tools needed to meet organizational objectives.

Data security, including privacy, is different within a cloud computing environment, as the potential for data sharing is expanded among both internal and potentially external agencies.  Security concerns are expanded when questions of infrastructure multi-tenancy, network access to hosted applications (Software as a Service / SaaS), and governance of authentication and authorization raise questions on end user trust of the cloud provider.

A move to cloud computing is often associated with data center consolidation initiatives within both governments and large organizations.  Cloud delivery models, including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) support the development of virtual data centers.

While it is clear long term target architectures for most organizations will be an environment with a single data system, in the short term it may be more important to decommission high risk server closets and unmanaged servers into a centralized, well-managed data center environment offering on-demand access to compute, storage, and network resources – as well as BCDR options.

Even at the most basic level of considering IaaS and PaaS as a replacement environment to physical infrastructure, the benefits to the organization may become quickly apparent.  If the organization establishes a “cloud first” policy to force consolidation of inefficient or high risk ICT resources, and that environment further aligns the organization through the use of standardized IT components, the ultimate goal of reaching interoperability or some level of data integration will become much easier, and in fact a natural evolution.

Nearly all major ICT-related hardware and software companies are re-engineering their product development to either drive cloud computing or be cloud-aware. Microsoft has released its Office 365 suite of online and hosted environments, as has Google with both PaaS and SaaS tools such as Google App Engine and Google Docs.

The benefits to organizations considering a move to hosted environments, such as Office 365, include access to a rich set of applications and resources available on demand using a subscription model rather than a licensing model, along with a high level of standardization for developers and applications.

Users comfortable with standard office automation and productivity tools will find the same features in a SaaS environment, while being relieved of individual software license costs, application maintenance, and the potential loss of resources due to equipment failure or theft. Hosted applications also allow a persistent-state, collaborative, real-time environment for multiple users requiring access to documents or projects. Document management and single-source data available for reuse by applications, other users, reporting, and performance management become routine, reducing the potential and threat of data corruption.

The shortfall, particularly for governments, is that using a large commercial cloud infrastructure and service provider such as Microsoft may require physically storing data in locations outside of the home country, as well as forcing data into a multi-tenant environment which may not meet the organization’s security requirements.

Cloud computing offers an additional major feature at the SaaS level that will benefit nearly all organizations transitioning to a mobile workforce. SaaS is by definition platform independent. Users access SaaS applications and underlying data via any device offering a network connection and a browser able to reach an Internet-connected address. The actual intelligence in an application resides at the server or virtual server; the user device is simply a terminal displaying a portal, access point, or the results of a query or application executed through a command at the user screen.

Cloud computing continues to develop as a framework and toolset for meeting business objectives. It is well-suited to respond to rapidly changing business and organizational needs through three characteristics: on-demand access to infrastructure resources; rapid elasticity, or the ability to provision and de-provision resources as needed to meet processing and storage demand; and measured service, the organization’s ability to account for cloud computing resource use internally and externally. Together these mark a major change in how an organization budgets ICT.
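As a sketch of the measured-service characteristic, the following hypothetical chargeback calculation shows how metered usage might feed internal accounting; the rates, resource names, and usage figures are invented for illustration and do not reflect any provider’s actual pricing:

```python
# Illustrative internal chargeback based on metered cloud resource use.
# Rates and resource categories are assumptions for the example.

USAGE_RATES = {
    "vcpu_hours": 0.05,        # hypothetical cost per vCPU-hour
    "storage_gb_hours": 0.0001,  # hypothetical cost per GB-hour of storage
    "egress_gb": 0.09,         # hypothetical cost per GB transferred out
}

def monthly_chargeback(usage: dict) -> float:
    """Sum metered usage times rate for each resource a business unit consumed."""
    return round(sum(USAGE_RATES[k] * v for k, v in usage.items()), 2)

# One business unit's metered month: 2 VMs running continuously,
# 500 GB stored, 50 GB of outbound transfer.
marketing = {"vcpu_hours": 1440, "storage_gb_hours": 360_000, "egress_gb": 50}
print(monthly_chargeback(marketing))
```

The point is not the specific rates but that consumption becomes a line item the organization can budget and audit, rather than a sunk capital cost.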

As cloud computing matures, each organization entering a technology refresh cycle must ask the question “are we in the technology business, or should we concentrate our efforts and budget on activities that directly support our business objectives?” If the answer is the latter, then the organization should evaluate outsourcing its ICT infrastructure to an internal or commercial cloud service provider.

It should be noted that today most cloud computing IaaS service platforms will not support migration of mainframe applications, such as those written for RISC processors. Those applications require redevelopment to operate within an Intel-compatible processing environment.

Broadband Factor

Cloud computing components are currently implemented over an Internet Protocol network. Users accessing SaaS applications need network access to connect with applications and data. Depending on the amount of graphics information transmitted from the host to an individual user access terminal, poor bandwidth or lack of broadband could result in an unsatisfactory experience.

In addition, BCDR requires the transfer of potentially large amounts of data between primary and backup locations. Depending on the data parsing plan, whether mirroring data, partial backups, full backups, or live load balancing, data transfer between sites could be restricted if sufficient bandwidth is not available between sites.
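A rough back-of-the-envelope calculation illustrates why inter-site bandwidth matters for BCDR; the link speed, data volume, and efficiency factor below are assumptions chosen for illustration:

```python
# Hypothetical sketch: how long a BCDR replication job takes over a given
# link. All figures are illustrative assumptions, not benchmarks.

def transfer_hours(data_gb: float, link_mbps: float,
                   efficiency: float = 0.7) -> float:
    """Time to move data_gb over a link_mbps link, discounted by an
    efficiency factor for protocol overhead and contention."""
    megabits = data_gb * 8 * 1000          # GB -> megabits (decimal units)
    seconds = megabits / (link_mbps * efficiency)
    return seconds / 3600

# A 500 GB nightly backup delta over a 100 Mbps WAN link:
print(round(transfer_hours(500, 100), 1))  # roughly 16 hours
```

Under these assumed numbers the transfer alone overruns a typical overnight backup window, which is exactly the kind of shortfall the next paragraph says planners must identify before committing to an off-LAN cloud implementation.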

Cloud computing depends on broadband as a means of connecting users to resources and transferring data between sites. Any organization considering implementing cloud computing outside of its local area network will need to fully understand what shortfalls or limitations may prevent the cloud implementation from meeting objectives.

The Service-Oriented Cloud Computing Infrastructure (SOCCI)

Governments and other organizations are entering a technology refresh cycle based on existing ICT hardware and software infrastructure hitting the end of life.  In addition, as the world aggressively continues to break down national and technical borders, the need for organizations to reconsider the creation, use, and management of data supporting both mission critical business processes, as well as decision support systems will drive change.

Given the clear direction industry is taking to embrace cloud computing services, as well as the awareness that existing siloed data structures within many organizations would better serve the organization in a service-oriented framework, it makes sense to consider an integrated approach.

A SOCCI considers both, adding reference models and frameworks, alongside enterprise architecture models such as TOGAF, to ultimately provide a broad, mature framework supporting business managers and IT managers in their technology and business refresh planning process.

SOCCIs promote the use of architectural building blocks, publication of external interfaces for each application or data source developed, single-source data, reuse of data and standardized application building blocks, as well as development and use of enterprise service buses to promote further integration and interoperability of data.

A SOCCI will look at elements of cloud computing, such as virtualized and on-demand compute/storage resources, and access to broadband communications – including security, encryption, switching, routing, and access as a utility.  The utility is always available to the organization for use and exploitation.  Higher level cloud components including PaaS and SaaS add value, in addition to higher level entry points to develop the ICT tools needed to meet the overall enterprise architecture and service-orientation needed to meet organizational needs.

According to the Open Group, a SOCCI framework provides the foundation for connecting a service-oriented infrastructure with the utility of cloud computing. As enterprise architecture and interoperability frameworks continue to gain value and importance to organizations, this framework will provide additional leverage to make best use of available ICT tools.

The Bottom Line on ICT Modernization

The Internet has reached nearly every point in the world, providing a global community functioning within an always-available, real-time communications infrastructure. University and primary school graduates are entering the workforce with social media, SaaS, collaboration, and location-transparent peer communities diffused in their tacit knowledge and experience.

This environment has greatly flattened the leverage developed countries and large monopoly companies have enjoyed during the past several technology and market cycles.

An organization running on non-interoperable, non-standardized data, with no BCDR protection, will certainly risk losing its competitive edge in a world being created by technology- and data-aware challengers.

Given the urgency organizations face to address data security, continuity of operations, agility to respond to market conditions, and operational costs associated with traditional ICT infrastructure, many are looking to emerging technology frameworks such as cloud computing to provide a model for planning solutions to those challenges.

Cloud computing and enterprise architecture frameworks provide guidance and a set of tools to assist organizations in providing structure, and infrastructure needed to accomplish ICT modernization objectives.

The Reality of Cloud Implementation Part 1 – Hosted Applications

As a business consultant providing direction and advice to both government clients and commercial clients, several topics continue to drive discussion not only on short term IT strategy, but also longer term innovative contributions cloud computing can offer the organization.

However, to get the conversation moving forward, managers contemplating major architectural change to their IT organizations need a good reference or pilot project to justify the expense and contribute to change. Not the preferred approach, but a reality.

One easy IT project is the move from workstation-based applications, primarily office automation suites, to server-based applications.  The choice is between applications hosted within a private (enterprise) network, or to outsource the application to a commercial provider such as Microsoft Live Office or Google Apps.

Hosted applications make a lot of sense – for most users. It is a great idea to offload the burden of desktop application administration from IT managers when possible, with an expectation of the following:

  1. Greater control over intellectual property (files are stored on a central file server, not on individual hard drives and computers)
  2. Greater control over version and application code updates
  3. Greater control over security, anti-virus, and anti-spam definitions
  4. Application standardization (including organizational templates and branding)
  5. Better management of user licenses (and eliminating use of unauthorized or copied software)

If we look at profiles of most organizational users, the vast majority are office workers who normally do not need to travel, access files or applications from home, or stay on call 24 hours a day. Thus we can assume that, while at the office, computers are connected to a high-performance LAN, with high bandwidth and throughput within the organization.

If that assumption is correct, and the organization implements either an enterprise-hosted or commercially-hosted solution (Google or Microsoft as examples), then individual workstations can also stop keeping files on local drives (files can all be stored and backed up on a file server), and use web-based applications for most activities.

The user’s relationship with the network, applications, and intellectual property is channeled through a workstation or web interface.  This also enables users, through use of VPNs and other access security, to use any compatible interface available when connecting to applications and files.  This includes home computers and mobile devices – as long as the data is retained on the host file server, and a record is created of all users accessing the data for both security and network/computer resource capacity management.
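A minimal sketch of the access record mentioned above might look like the following; the field names and values are assumptions for illustration, not any product’s actual schema:

```python
# Hypothetical access-audit record for a hosted file server: who touched
# which document, from what device, and when. Field names are assumed.

import datetime

def access_record(user: str, document: str, device: str) -> dict:
    """Build one audit-trail entry for security review and capacity planning."""
    return {
        "user": user,
        "document": document,
        "device": device,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

audit_log = [
    access_record("asmith", "/finance/q3-forecast.xlsx", "home-laptop-vpn"),
]
print(audit_log[0]["user"], audit_log[0]["document"])
```

Aggregating such records gives both the security team (who accessed what, from where) and the capacity planners (peak concurrent access) the visibility the hosted model requires.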

NOTE: As a frequent traveler I also spend a considerable amount of time in airplanes, airports, and areas without easy access to the Internet or my file servers. I do keep an image of MS Office on my laptop, and do have a very large hard drive, and do have a library of SD chips and flash drives for use when un-tethered from my web apps. I don’t see this changing in the near future – however I am probably in a very small minority of professional road warriors who still justify use of local images. Most do not.

An Unscientific Review of Web-Based Office Automation Applications

First, I am writing this blog entry using Microsoft’s Live Writer, a web/cloud-based application available for blog writers. It is one application available within the Microsoft “Live-Everything” suite of web-based utilities, which include office automation and social networking applications.

The Live Writer application connects with my blog provider (WordPress), downloads my blog profile, and uses that as a what-you-see-is-what-you-get editing interface. I feel as if I am typing directly into my blog, without the need to understand HTML commands or other manual features.

Adding video, tables, tags, hyperlinks, and SEO tools is effortless.

Going further into my Microsoft Live Office account, I can upload, download, create, edit, and store documents in all MS Office formats, with the main free apps including Word, Excel, PowerPoint, and OneNote. Mail, calendars, web sites, blogs – a variety of different utilities for personal and potentially professional use.

It is easy to share documents, create collaboration groups, and integrate MS Messenger-driven conferencing and sharing among other connected colleagues. All of this is available as a free environment for any user, without the need to buy MS Office products for your personal computer. Other commercial products offer a lot more utility; however, as a basic test environment, the performance of MS Live Office is more than adequate for probably 95% of office workers worldwide.

Face it, most of us rarely use anything beyond the most basic features of any office automation product, and purchasing licenses for individual office automation suites for each organizational user really only benefits the vendor.

Google Docs and Google App Engine provide features similar to the Microsoft suite, plus some unique features not currently available (or easily noticed) on the Live Office sites. At a high level, Google provides network users:

  • Documents (word processing, spreadsheets, presentations)
  • Forms
  • Drawing/graphics
  • Templates
  • Blogs
  • Analytics
  • Lots of other stuff

In my absolutely unscientific testing of both Google and Microsoft web-based applications, I did not find a single feature which I normally use in preparing presentations, documents, and spreadsheets that could not be reproduced with the online edition.

If that is true for most users, then we can probably look toward a future where cloud-based and hosted office automation applications begin to replace software loaded on individual workstations.

The Danger of Easy Outsourcing

In a world of Service-Oriented Architectures (SOA) and close inter-relationships among data, care is needed to ensure we do not create pilot “islands of unconnectable data.” Today, nearly all data is connectable, whether tables and forms within an email message, SMS messages, spreadsheets, databases, or any other potential SaaS application.

A word we need to keep in our IT vocabulary is “PORTABILITY.” Anything we type into an application is a candidate for logging, inquiry, statistics, reporting, or other use of the data. This is a concern when using SaaS applications not only for office automation, but for any other hosted application.

Any and all data we create must be available to any other application which can consume or integrate it, within organizational or industry community-of-interest applications. We will look into the SaaS portability question in part 2 of this series.

Trouble at the Telecom and Communicator’s Bar

Have you heard the news? Unemployment is skyrocketing, companies are closing, there’s no investment money for startups, and the sky is falling, the sky is falling? Don’t I know, as the layoff frenzy hit my own home, that it is a scary economic place to take a swim… Sharks, really hungry sharks, circling with an eye to take every last cent you have been able to hide.

And the outlook remains bleak. The New York Times reports that Europe is suffering youth unemployment even more than the US: 42.9% unemployment in Spain, 28% in Ireland, an EU average of 20.7%. Makes California look like the “promised land.”

And, California may actually be the “promised land.” California still attracts the best of global engineering to the Silicon Valley, and the most creative minds in communications and entertainment to Los Angeles. Whether you are a European, Chinese, Indian, or even Canadian, Silicon Valley and LA offer an environment that is unsurpassed around the world. Our universities embrace people from other cultures and countries, and our ability to support entrepreneurs draws not only students, but the best engineers and thought leaders from around the world.

Back at the Communicator’s Bar

There are still tables with discussions reviewing the indignities of being laid off by struggling companies. There are still discussions with the whine of people talking about the “damn foreigners” who are here stealing our jobs. Still “barflies” slopped over the bar, worrying about their Audi payments and how their ARM mortgages have put them under water.

Then there are other bars with tables full of Americans, and a scattershot of foreigners, talking about fun stuff. Fun stuff like cloud computing, virtualization, globalization, distributed computing, “the network is the computer,” “the computer is the network,” and how the carriers will return to their roots of providing high-quality “big, fat, dumb” telecom pipes. The talk is of how we can finally take all the intellectual property we’ve spent billions producing in PowerPoint slides and put it into reality.

Green is here

Virtualization is here

Data Center outsourcing is here

2010 is a blank whiteboard set up to codify the thought leadership and technology spawned in the waning years of the 200x decade and put it into business plans and CAPEX budgets.

2010 is the year we aggressively deliver Internet-enabled technology to every man, woman, and child in the world who has a desire to live a life beyond killing their own food for dinner. Here is a funny thought – if a radical 8-year-old in one currently scary country is able to Yahoo chat or Facebook their way into discussions and relationships with kids in California and Beijing, doesn’t it make just a little sense that the desire to blow each other up would be diluted, even just a little?

If the guy living next to me is producing a telecom switch that is head and shoulders above what is currently on the market, do I really care if his brain was conceived in Hanoi?

2010 is also the beginning of a true period of globalization. That doesn’t mean our hillbilly friends in Duluth, Minnesota have to quit drinking 3.2 beer and hanging out at setup bars watching Vikings reruns; it means that the hillbilly’s kid can participate in a lecture series online from Stanford or MIT. The kid might eventually invent a pickup truck that runs on pine cones, and a 3.2 beer that is actually palatable.

Embrace 2010

If only for the simple fact you have no other choice, consider all the great ideas being pumped out by companies like 3tera, the Google borg, Microsoft, VMware, and all the other companies with tremendous innovative ideas. Never before in our history have so many new intellectual and business tools been put on the shelf at the same time. Never before have we had such good reason to consider implementing those ideas (yes, I am a tree hugger and do believe in global warming).

So, even if you are currently living in a car under a bridge near your former upscale Orange County community – shave, wash your car, take a shower at the beach, and let’s get our depression, anger, and tacit knowledge back into the business saddle. The young guys still need our experience to get their feet on the ground, and we need them to ensure we will have social security in the future.

Welcome 2010 – you have taken a long time to arrive

John Savageau, Honolulu
