Nurturing the Marriage of Cloud Computing and SOAs

In 2009 we began consulting jobs with governments in developing countries, with the primary objective of consolidating data centers across government ministries and agencies into centralized, high-capacity, high-quality data centers. At the time, nearly all individual ministry or agency data infrastructure was built into small computer rooms or server closets with some added “brute force” air conditioning, no backup generators, no data backup, superficial security, and a host of other ailments.

The vision and strategy was that if we consolidated inefficient, end-of-life, and high-risk IT infrastructure into a standardized and professionally managed facility, national information infrastructure would not only be more secure, but through standardization, volume purchasing agreements, server virtualization, and development of broadband infrastructure, most of the IT needs of government could be easily fulfilled.

Then, of course, cloud computing began to mature, and the underlying technologies of Infrastructure as a Service (IaaS) became feasible. Now, not only were governments able to decommission inefficient and high-risk IS environments, they could also build virtual data centers with on-demand compute, storage, and network resources. Basic data center replacement.
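To make “on-demand” concrete, here is a minimal sketch of what replacing a server closet with IaaS provisioning might look like; the VirtualServer type, the sizes, and the provision() call are hypothetical illustrations, not any particular provider’s API.

```python
# Hypothetical illustration only: the names, sizes, and provision() call
# below do not correspond to any specific IaaS provider's API.
from dataclasses import dataclass

@dataclass
class VirtualServer:
    name: str
    vcpus: int
    ram_gb: int
    storage_gb: int

def provision(request: VirtualServer) -> str:
    """Stand-in for an IaaS API call that allocates capacity on demand."""
    # In a real deployment this would be an authenticated request to the
    # provider; capacity appears in minutes rather than a procurement cycle.
    return (f"provisioned {request.name}: {request.vcpus} vCPU / "
            f"{request.ram_gb} GB RAM / {request.storage_gb} GB storage")

# A ministry no longer orders hardware for its closet; it declares the
# capacity it needs and pays only for what it uses.
print(provision(VirtualServer("tax-portal-web", vcpus=4, ram_gb=16, storage_gb=200)))
```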

Even the remaining committed “server hugger” IT managers and fiercely independent governmental organizations could hardly argue against the benefits of having access to disaster recovery storage capacity through the centralized data center.

As the years passed and we entered 2014, not only did cloud computing mature as a business model, but senior management became more aware of its various aspects, including the financial benefits, the standardization of IT resources, the essential characteristics of cloud computing, and the potential for Platform and Software as a Service (PaaS/SaaS) to improve both business agility and internal decision support systems.

At the same time, information and organizational architecture, governance, and service delivery frameworks such as TOGAF, COBIT, and ITIL, along with risk analysis training, reinforced the value of both data and information within an organization, and the need for IT systems to support higher-level architectures for decision support systems and market interactions (including government-to-government, government-to-business, and government-to-citizen in the public sector).

2015 will bring cloud computing and architecture together at levels just becoming comprehensible to much of the business and IT world. The Open Group has taken a good first stab at building a standard for this marriage with its Service-Oriented Cloud Computing Infrastructure (SOCCI). According to the SOCCI standard:

“Infrastructure is a foundational element for enterprise architecture. Infrastructure has been traditionally provisioned in a physical manner. With the evolution of virtualization technologies and application of service-orientation to infrastructure, it can now be offered as a service.

Service-orientation principles originated in the business and application architecture arena. After repeated, successful application of these principles to application architecture, IT has evolved to extending these principles to the infrastructure.”

At first glance the SOCCI standard appears to be a document that creates a mapping between enterprise architecture (TOGAF) and cloud computing. At second glance, the SOCCI standard takes a real step toward tightening the loose coupling of standard service-oriented architectures through the use of cloud computing tools available across all service models (IaaS/PaaS/SaaS).

The result is an architectural vision which is easily capable of absorbing existing IT requirements, as well as incorporating emerging big data analytics models, interoperability, and enterprise architecture.

Since the early days of 2009, discussion topics with government and enterprise customers have shown a marked transition: from simply justifying the decommissioning of high-risk data centers, to managing data sharing and interoperability, to the risk that over-standardization and other service delivery barriers might inhibit innovation, or the ability of business units to respond quickly to rapidly changing market opportunities.

2015 will be an exciting year for information and communications technologies.  For those of us in the consulting and training business, the new year is already shaping up to be the busiest we have seen.

Data Center Consolidation and Adopting Cloud Computing in 2013

Throughout 2012, large organizations and governments around the world continued to struggle with the idea of consolidating inefficient data centers, server closets, and individual “rogue” servers scattered around their enterprises or government agencies. The issues included the cost of operating data centers, disaster management of information technology resources, and of course human factors centered on control, power, or retention of jobs in a rapidly evolving IT industry.

Cloud computing and virtualization continue to have an impact on all consolidation discussions, not only from the standpoint of providing a much better model for managing physical assets, but also in the potential cloud offers to solve disaster recovery shortfalls, improve standardization, and encourage or enable development of service-oriented architectures.

Our involvement in projects at the local, state, and national government levels, in both the United States and other countries, indicates a consistent need to address the following concerns:

  • Existing IT infrastructure, including both IT equipment and facilities, is reaching the end of its operational life
  • Collaboration requirements between internal and external users are expanding quickly, driving an architectural need for interoperability
  • Decision support systems require access to both raw data and “big data”/archival data

We would like to see an effort within the IT community to move in the following directions:

  1. Real effort at decommissioning and eliminating inefficient data centers
  2. All data and applications should fit into an enterprise architecture framework – regardless of the size of the organization or data
  3. Aggressive development of standards supporting interoperability, portability, and reuse of objects and data

Despite the very public failures experienced by cloud service providers over the past year, the reality is that cloud computing as an IT architecture and model is gaining traction, and it is not likely to go away any time soon. As with any emerging service or technology, cloud services will continue to develop and mature, reducing the impact and frequency of failures.

Why would an organization continue to buy individual high-powered workstations, individual software licenses, and device-bound storage when the same application can be delivered to a simple display, or a wide variety of displays, with standardized web-enabled cloud (SaaS) applications that store mission-critical data and images on a secure storage system at a secure site? Why not facilitate the transition from CAPEX to OPEX, from license to subscription, from infrastructure to product and service development?

In reality, unless an organization is in the hardware or software development business, there is very little technical justification for building and managing a data center.  This includes secure facilities supporting military or other sensitive sites.

The cost of building and maintaining a data center, compared with either outsourcing to a commercial colocation site or virtualizing data, applications, and network access requirements, has gained the attention of CFOs and CEOs, requiring IT managers to justify more explicitly the cost of building internal infrastructure versus outsourcing. This is quickly becoming a very difficult task.
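As a back-of-envelope sketch of the comparison a CFO now demands (every figure below is an invented placeholder, not a quoted market rate), the question reduces to amortized capital cost plus operating cost versus a recurring outsourcing fee:

```python
# Illustrative arithmetic only; all figures are placeholder assumptions.
BUILD_CAPEX = 2_000_000        # construct and fit out a small data center
BUILD_ANNUAL_OPEX = 400_000    # power, cooling, maintenance, staff
AMORTIZATION_YEARS = 10

OUTSOURCE_ANNUAL_FEE = 250_000  # colocation or cloud spend for the same load

build_annual = BUILD_CAPEX / AMORTIZATION_YEARS + BUILD_ANNUAL_OPEX
print(f"Build and operate: ${build_annual:,.0f}/year")         # $600,000/year
print(f"Outsource:         ${OUTSOURCE_ANNUAL_FEE:,.0f}/year")  # $250,000/year
# Under these assumptions the in-house facility must justify a
# $350,000/year premium on grounds other than cost.
```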

Money spent on data center infrastructure is lost to the organization. The cost of labor is high; so are the costs of energy, space, and maintenance. That is money that could be better applied to product and service development, customer service capacity, or other revenue- and customer-facing activities.

The Bandwidth Factor

The one major limitation the IT community will need to overcome as data center consolidation continues and cloud services become the norm is bandwidth. Applications such as streaming video, unified communications, and other data-intensive applications will need more bandwidth. The telecom companies are making progress, having deployed 100Gbps backbone capacity in many markets. However, this capacity will need to keep growing quickly to meet the needs of organizations accessing data and applications stored or hosted within a virtual or cloud computing environment.
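A rough sizing exercise illustrates the point; the per-user bitrates and peak concurrency below are assumptions for illustration, not measurements:

```python
# Back-of-envelope demand estimate; all inputs are illustrative assumptions.
USERS = 10_000          # consolidated government workforce
VDI_MBPS = 2            # virtual desktop session per user
VIDEO_MBPS = 4          # HD video conference stream per user
PEAK_CONCURRENCY = 0.3  # fraction of users active at the same time

peak_gbps = USERS * PEAK_CONCURRENCY * (VDI_MBPS + VIDEO_MBPS) / 1000
print(f"Peak demand: {peak_gbps:.0f} Gbps")  # 18 Gbps
# One consolidated campus can consume a meaningful share of a 100Gbps
# backbone once desktops and applications move into a central facility.
```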

Consider a national government’s IT requirements. The government, like most, is based within a metro area. The agencies and departments consolidate their individual data centers and server closets into a central or reduced number of facilities. Government interoperability frameworks begin to take small steps toward cross-agency data sharing, and individual users need access to a variety of applications and data sources to fulfill their decision support requirements.

Consider, for example, a GIS (Geospatial/Geographic Information System) with multiple demographic or other overlays. Individual users will need to display data that may be drawn from several data sources, through GIS applications, rendering a large amount of complex data on individual display screens. Without broadband access between the user and the application, as well as between the application and its data sources, the result will be a very poor user experience.

Another example is using the capabilities of video conferencing, desktop sharing, and interactive persistent-state application sharing.  Without adequate bandwidth this is simply not possible.

Revisiting the “4th Utility” for 2013

The final vision on the 2013 “wishlist” is that we, as an IT industry, continue to acknowledge the need for developing the 4th Utility. This is the idea that broadband communications, processing capacity (including SaaS applications), and storage are the right of all citizens. Much like the first three utilities – roads, water, and electricity – the 4th Utility must be a basic part of all discussions of national, state, or local infrastructure. As we move further into the new millennium, Internet-enabled communications, or something like them, will be an essential part of all our lives.

The 4th Utility requires that high-capacity fiber optic infrastructure and broadband wireless be delivered to any location within the country that supports a community, or an individual connected to a community. We’ll have to pay a fee to access the utility (the same as other utilities), but it is our right to access it, and our obligation to deliver it.

2013 will be a lot of fun for us in the IT industry. Cloud computing is going to impact everybody – one way or the other. Individual data centers will continue to close. Service-oriented architectures, enterprise architecture, process modeling, and design efficiency will drive a lot of innovation. We’ll lose some players and gain some players, and we’ll be in a better position at the end of 2013 than we are today.

5 Data Center Technology Predictions for 2012

2011 was a great year for technology innovation. The science of data center design and operations continued to improve, the move away from mixed-use buildings used as data centers continued, the watts/sqft metric took a back seat to overall kilowatts available to a facility or customer, and the idea of compute capacity and broadband as a utility began to take its place as a basic right of citizens.

However, there are 5 areas where we will see additional significant advances in 2012.

1.  Data Center Consolidation.  The US Government admits it is using only 27% of its overall available compute power. With 2,094 data centers supporting the federal government (from the CIO’s 25 Point Plan to Reform Federal IT Management), the government is required to close at least 800 of those data centers by 2015.

The lesson is not lost on state and local governments, private industry, or even Internet content providers. The economics of operating a data center or server closet – whether in the costs of real estate, power, and hardware, or in service and licensing agreements – are compelling enough to make even the most fervent server-hugger reconsider their religion.

2.  Cloud Computing.  Who doesn’t believe cloud computing will eventually replace the need for server closets, cabinets, or even small cages in data centers? The move to cloud computing is as certain as the move to email was in the 1980s.

Some IT managers and data owners hate the idea of cloud computing, enterprise service busses, and consolidated data.  Not so much an issue of losing control, but in many cases because it brings transparency to their operation.  If you are the owner of data in a developing country, and suddenly everything you do can be audited by a central authority – well it might make you uncomfortable…

A lesson learned while attending a fast pitch contest during late 2009 in Irvine, CA: an enterprising entrepreneur gave his “pitch” to a panel of investment bankers and venture capital representatives. He stated he was looking for a $5 million investment in his startup company.

A panelist asked what the money was for, and the entrepreneur stated, “… and $2 million to build out a data center…” The panelist responded that 90% of new companies fail within 2 years – why would he want to be stuck with the liability of a data center and hardware if the company failed? The panelist further stated, “Don’t waste my money on a data center – do the smart thing, use the Amazon cloud.”

3.  Virtual Desktops and Hosted Office Automation.  How many times have we lost data and files due to a failed hard drive, stolen laptop, or virus disrupting our computer?  What is the cost or burden of keeping licenses updated, versions updated, and security patches current in an organization with potentially hundreds of users?  What is the lead time when a user needs a new application loaded on a computer?

From applications as simple as Google Docs to Microsoft 365 and other desktop replacement application suites, users will become free of the burden of carrying a heavy laptop computer everywhere they travel. Imagine being able to connect your 4G/LTE phone’s HDMI port to a hotel widescreen television monitor and access all the applications normally used at a desktop. You can give a presentation from your phone, update company documents, or perform nearly any other IT function, the only limitation being the requirement of a broadband Internet connection (see #5 below).

Your phone can already connect to Google Docs and Microsoft Live Office, and the flexibility of access will only improve as iPads and other mobile devices mature.

The other obvious benefit is that files will be maintained on servers, where they are much more likely to be backed up and included in a disaster recovery plan.

4.  The Science of Data Centers.  It has only been a few years since small hosting companies were satisfied to move into a data center carved out of a mixed-use building, happy to have access to electricity, cooling, and a menu of available Internet network providers. Most rooms were designed to accommodate 2~3kW per cabinet, and users installed servers, switches, NAS boxes, and routers without regard to alignment or power usage.

That has changed. No business or organization can survive without a 24x7x365 presence on the Internet, and most small enterprises – and large enterprises – are either consolidating their IT into professionally managed data centers, or have already washed their hands of servers and other IT infrastructure.

The Uptime Institute, BICSI, TIA, and government agencies have begun publishing guidelines on data center construction, providing best practices, quality standards, design standards, and even standards for evaluation. Power efficiency metrics such as PUE and DCiE provide additional guidance on power management, data center management, and design.
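For readers new to the metrics: PUE is the ratio of total facility power to the power delivered to IT equipment (an ideal facility scores 1.0), and DCiE is its reciprocal expressed as a percentage. A minimal sketch with illustrative power draws:

```python
# PUE  = total facility power / IT equipment power (ideal = 1.0)
# DCiE = IT equipment power / total facility power, as a percentage

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    return 100.0 * it_equipment_kw / total_facility_kw

# Illustrative numbers: a facility drawing 1,500 kW to deliver 1,000 kW
# to IT loads spends a third of its power on cooling, conversion,
# and lighting.
print(f"PUE:  {pue(1500, 1000):.2f}")    # 1.50
print(f"DCiE: {dcie(1500, 1000):.1f}%")  # 66.7%
```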

The days of small business technicians running into a data center at 2 a.m. to install new servers, repair broken servers, and pile their empty boxes or garbage in their cabinet or cage on the way out are gone. The new data center religion is discipline, standards, and security.

Electricity is as valuable as platinum, and cooling and heat are managed more closely than inmates at San Quentin. While every standards organization is now offering certification in cabling, data center design, and data center management, we can soon expect universities to offer an MS or Ph.D. in data center sciences.

5.  The 4th Utility Gains Traction.  Orwell’s “1984” painted a picture of pervasive government surveillance, and incessant public mind control (Wikipedia).  Many people believe the Internet is the source of all evil, including identity theft, pornography, crime, over-socialization of cultures and thoughts, and a huge intellectual time sink that sucks us into the need to be wired or connected 24 hours a day.

Yes, that is pretty much true, and if we do not weigh the thousand good things about the Internet against each single negative aspect, it might be a pretty scary place to which to expose and indoctrinate all future generations. The alternative is to live in an intellectual Brazilian or Papuan rain forest, one step out of the evolutionary stone age.

The Internet is not going away, unless some global repressive government, fundamentalist religion, or dictator manages to dismantle civilization as we know it.

The 4th utility identifies broadband access to the ‘net as a basic right of all citizens, with the same status as roads, water, and electricity.  All governments with a desire to have their nation survive and thrive in the next millennium will find a way to cooperate with network infrastructure providers to build out their national information infrastructure (haven’t heard that term since Al Gore, eh?).

Without a robust 4th utility, our children and their children will produce a global generation of intellectual migrant workers, intellectual refugees from a failed national information sciences vision and policy.

2012 should be a great year. All the above predictions are positive and, if proved true, will leave the United States and other countries with stronger capacities to improve their national quality of life, and bring us all another step closer together.

Happy New Year!

The Argument Against Cloud Computing

As a cloud computing evangelist there is nothing quite as frustrating, and challenging, as the outright rejection of anything related to data center consolidation, data center outsourcing, or use of shared, multi-tenant cloud-based resources. How can anybody in the late stages of 2010 possibly deny a future of VDIs and virtual data centers?

Actually, it is fairly easy to understand. IT managers are not graded on their ability to adopt the latest “flavor of the day” technology, or on adherence to theoretical concepts that look really good in PowerPoint but in reality are largely untested and still in the development phase.

Just as a company stands a 60% chance of failure if it suffers a disaster without a recovery or continuity plan, moving the corporate cookies too quickly into a “concept” may be considered equally irresponsible by a board of directors, as the cost of failure and loss of data remains extremely high.

The Burden Carried by Thought Leaders and Early Adopters

Very few ideas or visions are successful if kept secret.  Major shifts in technology or business process (including organizational structure) require more than exposure to a few white papers, articles, or segments on the “Tech Hour” of a cable news station.

Even as simple and routine as email is today, during the 1980s it was not fully understood, was mistrusted, and was even mocked by users of “stable” communication systems such as fax, TELEX, and landline telephones. Yet in 2010, presidents of the world’s most powerful nations are cheerfully texting, emailing, and micro-blogging their way through the highest levels of global diplomacy.

It takes time, experience, tacit knowledge, and the sense that your business, government, or social community is moving forward at a rate that will leave you on the outside if the new technology or service is not adopted and implemented.

The question is, “How long will it take us to reach the point where we must accept outsourcing our information technology services and infrastructure, or face a higher risk of not being part of our professional or personal community?”

Email first popped up in the late 1970s, and never really went mainstream until around the year 2000. Until then, when executives did use email, it was generally transcribed from written memos and typed in by a secretary. So far, we have gradually started learning about cloud computing through the use of social media, hosted public mail systems, and some limited SaaS applications.

Perhaps even at the point when we evangelist types, as a community, are able to clearly articulate the reality that cloud computing has already planted its seeds in nearly every Internet-enabled computer, smartphone, and smart device, the vision of cloud computing will still be far too abstract for most to understand.

And this will subsequently reinforce the corporate and organizational mind’s natural desire to hold back until others have developed the knowledge base and best practices needed to bring their community to the point where implementing an IT outsourcing strategy will be to their benefit, and not a step in their undoing.

In fact, we need to train the IT community to be critical, to learn more about cloud computing, and to question their role in its future. How else can we expect the knowledge level to rise to the point where IT managers will have confidence in this new service technology?

And You Thought it was About Competitive Advantage?

Yes, the cloud computing bandwagon is overflowing with snappy topics such as:

  • Infrastructure agility
  • Economies of scale
  • Enabling technology
  • Reduced provisioning cycles
  • Relief from capital expense
  • Better disaster recovery
  • Capacity on demand
  • IT as a Service
  • Virtual everything
  • Publics, privates, and hybrids
  • Multi-resource variability
  • Pay as you go

Oh my, we will need a special lexicon just to wade through the new marketing language. The main goals of cloud computing, in our humble opinion, are:

  • Data center consolidation
  • Disaster recovery
  • IT as a Service

Cloud computing itself will not make us better managers and companies. Cloud computing will serve as a very powerful tool to let us meet our organizational goals more efficiently, more quickly, and more effectively. Until we have the confidence that cloud computing will serve that purpose, it is probably a fairly significant risk to jump at the great marketing data dazzling us in PowerPoint slides and power presentations.

We Will Adopt Cloud Computing, or Something Like It

Now to recover my cloud computing evangelist enthusiasm. I do deeply believe in the word – the word of cloud computing as a utility, as a component of broadband communications, as all of the bullets listed above. It will take time, and I warmly accept the burden of responsibility to further codify the realities of cloud computing, the requirements we need to fulfill as an industry to break out of the “first mover phase,” and the need to establish a roadmap for companies to shift their IT operations to the cloud.

Just as with email, it is one of those things you know is going to happen. We knew it in the early days of GRID computing, and we know it now. Let’s focus our discussion on cloud computing as more of a “how” and “when” conversation, rather than a “wow” and “ain’t it cool” conversation.

Now, as I dust off a circa-1980 set of slides discussing the value of messaging, and how it would support one-to-one, one-to-many, and many-to-many forms of interactive and non-interactive communications, it is time for us to provide a similar Introduction to Cloud.

Get the pulpit ready

The Bell Tolls for Data Centers

In the good old days (the late 90s and most of the 2000s) data center operators loved selling individual cabinets to customers. You could keep your prices high for the cabinet, sell power by the “breakered amp,” and try to maximize cross connects through a data center meet-me room. All were designed to squeeze the most revenue and profit out of each individual cabinet, with the least amount of infrastructure burden.

Fast-forward to 2010. Data center consolidation has become an overwhelming theme, emphasized by US CIO Vivek Kundra’s mandate to force the US government, as the world’s largest IT user, to eliminate most of its more than 1,600 owned and operated data centers (consolidating into about a dozen), and to further promote efficiency by adopting cloud computing.

The Gold Standard of Data Center Operators Hits a Speed Bump

Equinix (EQIX) has a lot of reasons and explanations for their expected failure to meet 3rd quarter revenue targets.  Higher than expected customer churn, reducing pricing to acquire new business, additional accounting for the Switch and Data acquisition, etc., etc., etc…

The bottom line is that the data center business is changing. Single-cabinet customers are looking at hosted services as an economical and operational alternative to maintaining their own infrastructure. Face it: if you are paying for a single cabinet to house your 4 or 5 servers in a data center today, you will probably have a much better overall experience if you can migrate that minimal web-facing or customer-facing equipment into a globally distributed cloud.

Likewise, cloud service providers are supporting the same level of Internet peering as most content delivery networks (CDNs) and Internet Service Providers (ISPs), allowing cloud users to relieve themselves of the additional burden of operating expensive switching equipment. The user can still decide which peering, ISP, or network provider they want on the external side of the cloud, but the physical interconnections are no longer necessary within that expensive cabinet.

Traditional data centers are beginning to experience the move to shared cloud services, as is Equinix, through higher churn rates and lower sales rates for individual cabinets and small cages.

The large enterprise colocation users and CDNs continue to grow larger, adding to their ability to renegotiate contracts with the data centers. Space, cross connects, power, and service level agreements favor the large footprint and power users, and the result is that data centers are becoming a highly skilled, sophisticated commodity.

The Next Generation Data Center

There are several major factors influencing data center planners today. These include the impact of cloud computing, the emergence of containerized data centers, the need for far greater energy efficiency (often using PUE, Power Usage Effectiveness, as the metric), and the industry drive towards greater data center consolidation.

Hunter Newby, CEO of Allied Fiber, strongly believes: “Just as in the last decade we saw the assembly of disparate networks into newly formed common, physical layer interconnection facilities in major markets, we are now seeing a real coordinated global effort to create new, and assemble the existing, disparate infrastructure elements of dark fiber, wireless towers and data centers. This is the next logical step and the first in the right direction for the next decade and beyond.”

We are also seeing data center containers popping up along the long fiber routes, adjacent to traditional breaking points such as in-line amplifiers (ILAs), fiber optic terminals (locations where carriers physically interconnect their networks either for end-user provisioning, access to metro fiber networks, or redundancy), and wireless towers. 

So does this mean the data center of the future is not necessarily confined to large 500 megawatt data center farms, and is potentially something that becomes an inherent part of the transmission network?  The computer is the network, the network is the computer, and all other variations in between?

For archival and backup purposes, or caching purposes, can data exist in a widely distributed environment?

Of course, latency within the storage and processing infrastructure will still be bound by physics for the near term. Yet for end-user applications such as desktop virtualization, there really isn’t any particular reason that we MUST have that level of proximity. There are probably ways we can “spoof” the systems into thinking they are located together, and there are a host of other reasons why we do not have to limit ourselves to a handful of “Uber Centers…”

A Vision for Future Data Centers

What if broadband and compute/storage capacity become truly insulated from the user? What if Carr’s ideas behind The Big Switch are really the future of computing as we know it, our interface to the “compute brain” is limited to dumb devices, and we no longer have to concern ourselves with anything other than writing software against a well-publicized set of standards?

What if the next generation of Equinix is a partner to Verizon or AT&T, and Equinix builds a national compute and storage utility distributed along the fiber routes that is married to the communications infrastructure transmission network?

What if our monthly bill for entertainment, networking, platform, software, and communications is simply the record of how much utility we used during the month, or our subscription fee for the month? 

What if wireless access is transparent, and globally available to all mobile and stationary terminals without reconfiguration and a lot of pain?

No more “remote hands” bills, midnight trips to the data center to replace a blown server or disk, dealing with unfriendly or unknowledgeable “support” staff, or questions of who trashed the network due to a runaway virus or malware commando…

Kind of an interesting idea.

Probably going to happen one of these days.

Now if we can extend that utility to all airlines so I can have 100% wired access, 100% of the time.

Data Center Consolidation and Cloud Computing in Indonesia

2010 brings great opportunities and challenges to IT organizations in Indonesia. Technology refresh, aggressive development of telecom and Internet infrastructure, and aggressive deployment of “eEverything” are shaking the ICT industry. Even the most steadfast division-level IT managers are beginning to recognize the futility of trying to maintain their own closet “data center” in a world of virtualization, cloud computing, and the drive to improve both data center economics and data security.

Of course there are very good models on the street for data center consolidation, particularly on government levels. In the United States, the National Association of State Chief Information Officers (NASCIO) lists data center consolidation as the second highest priority, immediately after getting better control over managing budget and operational cost.

In March the Australian government announced an AUD $1 billion data center consolidation plan, with standardization, solution sharing, and developing opportunities to benefit from “new technology, processes or policy.”

Minister for Finance and Deregulation Lindsay Tanner noted Australia currently has many inefficient data centers, very suitable candidates for consolidation and refresh. The problem of scattered or unstructured data management is “spread across Australia, (with data) located in not just large enterprise data centres, but also in cupboards, converted offices, computer and server rooms, and in commercial and insourced data centers,” said Tanner.

“These are primarily older data centres that are reaching the limits of their electricity supply and floor space. With government demand for data center ICT equipment rising by more than 30 per cent each year, it was clear that we needed to reassess how the government handled its data center activities.”

The UK government also recently published ICT guidance related to data center consolidation, with a plan to cut government-operated data centers from 130 to around 10~12 facilities. The guidance includes the statement: “Over the next three-to-five years, approximately 10-12 highly resilient strategic data centers for the public sector will be established to a high common standard. This will then enable the consolidation of existing public data centers into highly secure and resilient facilities, managed by expert suppliers.”

Indonesia Addresses Data Center Consolidation

Indonesia’s government is in a unique position to take advantage of both introducing new data center and virtualization technology, as well as deploying a consolidated, distributed data center infrastructure that would bring the additional benefit of strong disaster recovery capabilities.

Much like the problems identified by Minister Tanner in Australia, today many Indonesian government organizations – and commercial companies – operate ICT infrastructure without structure or standards. “We cannot add additional services in our data center,” mentioned one IT manager interviewed recently in a data center audit. “If our users need additional applications, we direct them to buy their own server and plug it in under their desk. We don’t have the electricity in our data center to drive new applications and hardware, so our IT organization will now focus only on LAN/WAN connectivity.”

While all IT managers understand that disaster recovery planning and business continuity are essential, few have brought DR from PowerPoint to reality, leaving much organizational data on individual servers, laptops, and desktop computers, all at risk of theft or the loss or failure of single-disk systems.

That is all changing. Commercial data centers are being built around the country by companies such as PT Indosat, PT Telekom, and other private companies. With the Palapa national fiber ring nearing completion, all main islands within the Indonesian archipelago are connected with diverse fiber optic backbone capacity, and additional international submarine cables are either planned or in progress to Australia, Hong Kong, Singapore, and other communication hubs.

For organizations currently supporting closet data centers, or local servers facing the public Internet for eCommerce or eGovernment applications, data centers such as the Cyber Tower in Jakarta offer both commercial data center space and supporting interconnections for carriers – including the Indonesia Internet Exchange (IIX) – in a model similar to One Wilshire, The Westin Building, or 151 Front in Toronto. They provide ample space for outsourcing data center infrastructure (particularly for companies with Internet-facing applications), as well as power, cooling, and management for internal infrastructure outsourcing.

The challenge, as with most other countries, is to convince ICT managers that it is in their company’s or organization’s interest to give up the server. Rather than focus their energy on issues such as “control,” “independence (or autonomous operations),” and avoiding the pain of “workforce retraining and reorganization,” ICT managers should consider the benefits of outsourcing their physical infrastructure into a data center, and further consider the additional benefits of virtualization and public/enterprise cloud computing.

Companies such as VMWare, AGIT, and Oracle are offering cloud computing consulting and development in Indonesia, and the topic is rapidly gaining momentum in publications and discussions within both the professional IT community, as well as with CFOs and government planning agencies.

It makes sense. As in the cloud computing initiatives being driven by the US and other governments, consolidating not only data centers but also IT compute resources and storage makes a lot of sense, particularly if the government has difficulty standardizing or writing web services to share data. Add a distributed cloud processing model, where two or more data centers with cloud infrastructure are interconnected, and we can start to drive recovery time and recovery point objectives down close to zero.
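A simple way to see why interconnected data centers pull the recovery point objective (RPO) toward zero: the worst-case data loss equals the interval between copies of the data reaching the remote site. The strategies and intervals below are illustrative assumptions:

```python
# Illustrative only: worst-case RPO equals the replication interval.
strategies = {
    "nightly tape shipped offsite": 24 * 60,  # minutes of data at risk
    "hourly backup over WAN": 60,
    "15-minute snapshot replication": 15,
    "synchronous replication (interconnected DCs)": 0,
}
for strategy, rpo_minutes in strategies.items():
    print(f"{strategy:45s} worst-case RPO: {rpo_minutes:5d} min")
```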

Nor is this just for government users: a company located in Jakarta can develop a disaster recovery plan simply by backing up critical data in a remote location such as IDC Batam (part of the IDC Indonesia group). As an example, the IDC Indonesia group operates 4 data centers located in geographically separate parts of the country, all interconnected.

While this does not support zero recovery time objectives, it does allow companies to lease a cabinet or suite in a commercial data center and, at a minimum, install disk systems adequate to meet their critical data restoration needs. It also opens up decent data center colocation space for emerging cloud service and infrastructure providers, all without the burden of legacy systems to refresh.

In a land of volcanoes, typhoons, earthquakes, and man-made disasters Indonesia has a special need for good disaster recovery planning. Through an effort to consolidate organization data centers, the introduction of cloud services in commercial and government markets, and high capacity interconnections between carriers and data centers, the basic elements needed to move forward in Indonesia are now in place.

Southern California Tech Jobs Make a Comeback

2009 was a horrible year for job seekers, and even those holding on to existing jobs. No bonuses, no promotions, layoffs, and nobody hiring. And SoCal successfully beat most of the United States in unemployment claims, by several percentage points, attaching painful and empirical fact to the grim situation.

But that does appear to be changing. Slowly changing, but it is looking better for job seekers in the region. A recent scrape of job openings for Los Angeles and Orange Counties yielded some pretty strong job titles:

  • Director of Engineering – Marina Del Rey
  • Chief Integration Engineer – El Segundo
  • Director, Information Technology – San Clemente
  • VP Global Services – Los Angeles
  • Customer Services Director – El Segundo
  • Lead Systems Engineer – Los Angeles
  • Senior Industrial Director – Irvine
  • Smart GRID Architect – Rosemead
  • HL7 Integrator – Los Angeles
  • Disaster Recovery Manager – Irvine
  • Manager, Operations Systems – Van Nuys, CA
  • Systems Architecture Engineer – Huntington Beach, CA

And the list goes on… About 350 good positions listed in my 25 January search.

One additional exciting trend in the job stack is the high number of positions in manufacturing industries. While the services market is great, manufacturing spawns input into the supply chain, which adds a lot of downstream value for those companies increasing or expanding their business operations.

Dust Off that Resume

The time is near, and technology-savvy job seekers will reap rewards if they are prepared for the next boom in business expansion. Cloud computing, unified messaging, IT operations, data center consolidation, process automation, green technologies – corporate jargon to some, but areas with increasing demand for qualified candidates.

Cloud computing and data center consolidation are quite interesting, admittedly because they are new and exciting trends in the IT community.

The Dot.Com era taught us painful lessons on the value of investment money. The venture community sat back after 2002 and made a decision to actually perform a bit of due-diligence prior to throwing money at PowerPoint companies and paper ideas. At least those which were not using private equity with large investments in real estate.

The Dot.OMG era is now just about at an end, and some of the lessons learned are focused on the execution of business plans and intelligent use of capital and operational expenses – while building business.

IT has gone from being a “darling” of the Internet age to a very powerful means of adding tremendous business value through the globalization of markets and real-time transaction processing supporting the global economy and marketplace. The only problem: to support that IT engine, technical managers tried to solve their processing challenges by throwing more disk, processors, and bandwidth at their requirements.

The next age will be one where companies refocus their energy on developing their business, and begin to expect processing and IT to be more of a utility than an exceptional part of their business. Welcome to data center outsourcing, virtualization, cloud computing, and Software as a Service/SaaS. Recover the costs of expensive and inefficient data centers.

So those engineers and sales staff still hanging out in the Communicator’s Bar: get ready to be sized for your next retro-logo polo shirt. The time is now for those who can set aside the fantasy of re-entering the telecom community to deal in bilateral telephone minutes, and get ready to support thought leadership strategies that bring customers into commercial data center outsourcing models – or go sell them on consolidating their in-house operations into enterprise clouds.

Look at the tech job listings again. Companies are begging for IT and tech visionary managers to solve their growth and development pain. Begging.

2010 is going to be a great year in SoCal, so let’s get out there and make sure it does not pass us by, and does not require our companies to go elsewhere to attract talent. We’ve got the talent right here, and we need to put it back to work.
