The Changing Role of IT Professionals

Information Technology is a great field. With technology advancing at breakneck speed, there is never a period when IT becomes boring or hits an intellectual wall. New devices, new software, more network bandwidth, and new opportunities keep appearing to make all this technology do great things for our professional and private lives.

Or, it becomes a frightening professional and intellectual cyclone that threatens to make our jobs obsolete, or dilutes them as business units access IT resources via a web page and credit card, bypassing the IT department entirely.

One of the biggest challenges IT managers have traditionally encountered is the need to provide both process and utility to end users and to supported departments or divisions within the organization. It is easy to get tied down in a virtual mountain of spreadsheets, trouble tickets, and unhappy users while innovation races past.

The Role of IT in Future Organizations

In reality, the technology component of IT is the easy part. If, for example, I decide that it is cost-effective to transition the entire organization to a Software as a Service (SaaS) application such as MS 365, it is a pretty easy business case to bring to management.

But more questions arise, such as: does MS 365 give business users within the organization sufficient utility and creative tools to help solve business challenges and opportunities, or is it simply a new and cool application that only the IT staff find interesting?

Bridging the gap between old IT and the new world does not have to be too daunting. The first step is simply understanding and accepting the fact that internal data centers are going away in favor of virtualized, cloud-enabled infrastructure. In the long term, Software as a Service and Platform as a Service-enabled information, communication, and service utilities will begin to eliminate even the most compelling justifications for physical or virtual servers.

End user devices become mobile, with the only real requirements being a high definition display, an input device, and a high speed network connection (note this does not rely on “Internet” connections). Applications and other information and decision support resources are accessed someplace in the “cloud,” relieving the user from the burden of device-resident applications and storage.

The IT department is no longer responsible for physical infrastructure

If we consider disciplines such as TOGAF (The Open Group Architecture Framework), ITIL (the Information Technology Infrastructure Library, for service delivery and management), or COBIT (Control Objectives for Information and Related Technologies, for governance and holistic organizational enablement), a common theme emerges for IT groups.

IT organizations must become full members of an organization’s business team

If we consider the potential of systems integration, interoperability, and exploitation of large data (or “big data”) within organizations, and externally among trading partners, governments, and others, the need for IT managers and professionals to graduate from the device world to the true information management world becomes a great career and future opportunity.

But this requires IT professionals to reconsider the skills and training needed to fully become a business team member and a contributor to an organization’s strategic vision for the future.  Those skills include enterprise architecture, governance modeling, data analytics, and a view of standards and data interoperability.  The value of a network routing certification, or of roles such as data center facility manager or software installer, will edge toward zero within a few short years.

Harsh, but true.  Think of the engineers who specialized in digital telephone switches in the 1990s and early 2000s.  They are all gone.  Either retrained, repurposed, or unemployed.  The same future is hovering on the IT manager’s horizon.

So the call to action is simple.  If you are a mid-career IT professional, or a new IT professional just entering the job market, prepare yourself for a new age of IT.  Distance yourself from a device-driven career path, and instead prepare to contribute to your organization’s ability to fully exploit information, from both a business perspective and an architectural perspective, in a rapidly evolving and changing information services world.

Can IT Standards Facilitate Innovation?

IT professionals continue to debate the benefits of standardization versus the benefits of innovation, and the potential for standards to inhibit engineers’ and software developers’ ability to develop creative solutions to business opportunities and challenges.  At the Open Group Conference in San Diego last week (3~5 February) the topic of standards and innovation popped up not only in presentations, but also in sidebar conversations surrounding the conference venue.

In his presentation SOA4BT (Service-Oriented Architecture for Business Technology) – From Business Services to Realization,   Nikhil Kumar noted that with rigid standards there is “always a risk of service units creating barriers to business units.”  The idea is that service and IT organizations must align their intended use of standards with the needs of the business units.   Kumar further described a traditional cycle where:

  • Enterprise drivers establish ->
  • Business derived technical drivers, which encounter ->
  • Legacy and traditional constraints, which result in ->
  • “Business Required” technologies and technology (enabled) SOAs

Going through this cycle does not require a heavyweight process; it simply requires ensuring that the use of a standard, or of a standard business architecture framework, drives the business services group (IT) into the business unit circle.  While IT is the source of many innovative ideas and deployments of emerging technologies, the business units are the ultimate beneficiaries of innovation, which allows the unit to address and respond to rapidly emerging opportunities or market requirements.

Standards come in a lot of shapes and sizes.  One standard may be a national or international standard, such as ISO 20000 (service delivery), NIST 800-53 (security), or BICSI 002-2011 (data center design and operations).  Standards may also be internal within an organization or industry, such as standardizing databases, applications, data formats, and virtual appliances within a cloud computing environment.

In his presentation “The Implications of EA in New Audit Guidelines (COBIT5),” Robert Weisman noted there are now more than 36,500 TOGAF (The Open Group Architecture Framework) certified practitioners worldwide, with more than 60 certified training organizations providing TOGAF certifications.  According to ITSMinfo.com, in 2012 alone there were more than 263,000 ITIL Foundation certifications granted (for service delivery), and ISACA notes there were more than 4000 COBIT 5 certifications granted (for IT planning, implementation, and governance) in the same period.

With a growing number of organizations either requiring, or providing training in enterprise architecture, service delivery, or governance disciplines, it is becoming clear that organizations need to have a more structured method of designing more effective service-orientation within their IT systems, both for operational efficiency, and also for facilitating more effective decision support systems and performance reporting.  The standards and frameworks attempt to provide greater structure to both business and IT when designing technology toolsets and solutions for business requirements.

So the use of standards is very effective for providing structure and guidelines for IT toolset and solution development.  Now, to address the issue of innovation, several ideas are important to consider, including:

  • Developing an organizational culture of shared vision, values, and goals
  • Developing a standardized toolkit of virtual appliances, interfaces, platforms, and applications
  • Accepting a need for continual review of existing tools, improvement of tools to match business requirements, and further development when existing utilities and tools are not adequate to the task

Once an aligned vision of business goals is available and achieved, a standard toolset published, and IT and business units are better integrated as teams, additional benefits may become apparent.

  • Duplication of effort is reduced with the availability of standardized IT tools
  • Incompatible or non-interoperable organizational data is either reduced or eliminated
  • More development effort is applied to developing new solutions, rather than developing basic or standardized components
  • Investors will have much more confidence in management’s ability to not only make the best use of existing resources and budgets, but also the organization’s ability to exploit new business opportunities
  • Focusing on a standard set of utilities and applications, such as database software, will not only improve interoperability, but also enhance the organization’s ability to influence vendor service-level agreements and support agreements, as well as reduce cost with volume purchasing

Rather than view standards as an inhibitor, or barrier to innovation, business units and other organizational stakeholders should view standards as a method of not only facilitating SOAs and interoperability, but also as a way of relieving developers from the burden of constantly recreating common sets and libraries of underlying IT utilities.  If developers are free to focus their efforts on pure solutions development and responding to emerging opportunities, and rely on both technical and process standardization to guide their efforts, the result will greatly enhance an organization’s ability to be agile, while still ensuring a higher level of security, interoperability, systems portability, and innovation.

OSS Development for the Modern Data Center

Modern data centers are very complex environments.  Data center operators must have visibility into a wide range of integrated databases, applications, and performance indicators to effectively understand and manage their operations and activities.

While each data center is different, all Data Centers share some common systems and common characteristics, including:

  • Facility inventories
  • Provisioning and customer fulfillment processes
  • Maintenance activities (including computerized maintenance management systems, or CMMS)
  • Monitoring
  • Customer management (including CRM, order management, etc.)
  • Trouble management
  • Customer portals
  • Security Systems (physical access entry/control and logical systems management)
  • Billing and Accounting Systems
  • Service usage records (power, bandwidth, remote hands, etc.)
  • Decision support system and performance management integration
  • Standards for data and applications
  • Staffing and activities-based management
  • Scheduling /calendar
  • etc…

Unfortunately, in many cases, the above systems are managed manually, lack standards, and have no automation or integration interconnecting individual back office components.  This includes many communication companies and telecommunications carriers which previously either adhered, or claimed to adhere, to Bellcore data and operations standards.

In some cases, the lack of integration is due to mergers and acquisitions of companies which have unique or non-standard back office systems.  The result is difficulty in cross-provisioning, billing, integrated customer management, and accounting – the day-to-day operations of a data center.

Modern data centers must have a high level of automation.  In particular, if a data center operator owns multiple facilities, it becomes very difficult to have a common look and feel or high level of integration allowing the company to offer a standardized product to their markets and customers.

Operational support systems (OSS) traditionally have four main components:

  • Support for process automation
  • Collection and storage for a wide variety of operational data
  • The use of standardized data structures and applications
  • And supporting technologies
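The third component, standardized data structures, can be sketched as a single shared record format that every back-office subsystem (billing, monitoring, decision support) reads and writes. The following Python sketch is illustrative only; the field names are assumptions, not taken from any specific OSS standard.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class ServiceUsageRecord:
    """A shared record format for service usage data (power, bandwidth, remote hands)."""
    customer_id: str
    service: str        # e.g. "power", "bandwidth", "remote_hands"
    quantity: float
    unit: str           # e.g. "kWh", "Mbps", "hours"
    recorded_at: str    # ISO 8601 timestamp

    def to_json(self) -> str:
        # Every subsystem consumes this same serialized format, so billing,
        # monitoring, and DSS never need per-system data translation.
        return json.dumps(asdict(self), sort_keys=True)

record = ServiceUsageRecord(
    customer_id="CUST-001",
    service="power",
    quantity=1250.0,
    unit="kWh",
    recorded_at=datetime(2014, 2, 1, tzinfo=timezone.utc).isoformat(),
)
print(record.to_json())
```

Because the schema is defined once and shared, adding a new consumer (say, a performance dashboard) requires no new data mapping work.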

In most commercial or public colocation centers and data centers, customer and tenant organizations represent many different industries, products, and services.  Some large colocation centers may have several hundred individual customers.  Other data centers may have larger customers such as cloud service providers, content delivery networks, and other hosting companies.  While single large customers may be few, their internal hosted or virtual customers may number in the hundreds, or even thousands.

To effectively support their customers Data Centers must have comprehensive OSS capabilities.  Given the large number of processes, data sources, and user requirements, the OSS should be designed and developed using a standard architecture and framework which will ensure OSS integration and interoperability.

OSS Components

We have conducted numerous Interoperability Readiness surveys with both governments and private sector (commercial) data center operators during the past five years.  In more than 80% of surveys, processes such as inventory management had been built within simple spreadsheets.  Provisioning of inventory items was normally a manual process conducted via e-mail or, in some cases, paper forms.

Manual provisioning resulted in cases of double-booked or double-sold inventory items, as well as inefficient ordering when adding customer-facing inventory or building out additional data center space.
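Double booking is exactly the class of error that moving inventory from spreadsheets into a transactional database eliminates. A minimal sketch, assuming a SQLite-style store (the table and item names are hypothetical): a conditional UPDATE makes the reservation atomic, so a second attempt to sell the same item simply fails.

```python
import sqlite3

# In-memory inventory database; in practice this would be the shared OSS store.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE inventory (
        item_id TEXT PRIMARY KEY,
        description TEXT,
        reserved_by TEXT  -- NULL while the item is still available
    )
""")
db.execute("INSERT INTO inventory VALUES ('RACK-42', '42U rack, suite B', NULL)")
db.commit()

def reserve(item_id: str, customer: str) -> bool:
    """Atomically reserve an item; returns False if it is already taken."""
    cur = db.execute(
        "UPDATE inventory SET reserved_by = ? "
        "WHERE item_id = ? AND reserved_by IS NULL",
        (customer, item_id),
    )
    db.commit()
    return cur.rowcount == 1  # 0 rows updated means someone got there first

print(reserve("RACK-42", "CUST-001"))  # True: first reservation succeeds
print(reserve("RACK-42", "CUST-002"))  # False: double booking is refused
```

No e-mail thread or paper form can provide this guarantee; the database enforces it regardless of how many provisioning staff or self-service portals are active at once.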

These problems often compounded further, into missed customer billing cycles, accounting shortfalls, and management or monitoring system errors.

The new data center, including virtual data centers within cloud service providers, must develop better OSS tools and systems to accommodate the rapidly changing need for elasticity and agility in ICT systems.  This includes having a single window for all required items within the OSS.

Preparing an OSS architecture, based on a service-oriented architecture (SOA), should include use of ICT-friendly frameworks and guidance such as TOGAF and/or ITIL to ensure all visions and designs fully acknowledge and embrace the needs of each organization’s business owners and customers, and follow a comprehensive and structured development process to ensure those objectives are delivered.

Use of standard databases, APIs, and service buses, along with a high level of governance ensuring a “standards and interoperability first” policy for all data center IT, will allow all systems to communicate, share, and reuse data, ultimately providing automated, single-source data resources for all data center, management, accounting, and customer activities.

Any manual transfer of data between offices, applications, or systems should be eliminated in favor of integrating inventory, data collections and records, processes, and performance management indicators into a fully integrated and interoperable environment.  A basic rule of thumb: if a human being has touched data, the data has likely been corrupted, or its integrity may at least be brought into question.

Looking ahead to the next generation of data center services, stepping a bit higher up the customer service maturity continuum requires much higher levels of internal process and customer process automation.

Similar to NIST’s definition of cloud computing, which states the essential characteristics of cloud computing include “self-service provisioning,” “rapid elasticity,” and “measured services,” in addition to resource pooling and broadband access, it can be assumed that data center users of the future will need to order and fulfill services such as network interconnections, power, virtual space (or physical space), and other services through self-service, or on-demand, ordering.
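Self-service ordering means the OSS must validate and accept an order against a machine-readable service catalog, with no human in the loop. A minimal sketch of that idea follows; the catalog entries, service names, and units are hypothetical examples, not drawn from any real data center product list.

```python
import uuid
from datetime import datetime, timezone

# Hypothetical self-service catalog; service names and units are illustrative.
CATALOG = {
    "cross_connect": {"unit": "port"},
    "power_circuit": {"unit": "kW"},
    "cage_space":    {"unit": "sqm"},
}

def place_order(customer_id: str, service: str, quantity: float) -> dict:
    """Accept a self-service order: validate against the catalog and return
    a fulfillable order record without manual intervention."""
    if service not in CATALOG:
        raise ValueError(f"unknown service: {service}")
    return {
        "order_id": str(uuid.uuid4()),      # unique, machine-generated ID
        "customer_id": customer_id,
        "service": service,
        "quantity": quantity,
        "unit": CATALOG[service]["unit"],
        "status": "accepted",
        "ordered_at": datetime.now(timezone.utc).isoformat(),
    }

order = place_order("CUST-001", "cross_connect", 2)
print(order["status"])  # accepted
```

The order record produced here would then flow directly into provisioning, billing, and monitoring through the same standardized data structures, which is what makes “measured services” and “rapid elasticity” achievable in practice.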

The OSS must strive to meet the following objectives:

  • Standardization
  • Interoperability
  • Reusable components and APIs
  • Data sharing

Accomplishing this will require nearly all of the above-mentioned OSS characteristics: inventories in databases (not spreadsheets), process automation, and standards in data structures, APIs, and application interoperability.

And as the ultimate key success factor, management decision support systems (DSS) will finally have the potential for true dashboards for performance management, data analytics, and additional real-time tools for making effective organizational decisions.
