The Argument Against Cloud Computing

As a cloud computing evangelist, there is nothing quite as frustrating, and challenging, as the outright rejection of anything related to data center consolidation, data center outsourcing, or use of shared, multi-tenant cloud-based resources.  How can anybody in the late stages of 2010 deny a future of VDIs and virtual data centers?

Actually, it is fairly easy to understand.  IT managers are not graded on their ability to adopt the latest “flavor of the day” technology, or on adherence to theoretical concepts that look really good in PowerPoint but in reality are largely untested and still in the development phase.

Just as a company stands a 60% chance of failure if it suffers a disaster without a recovery or continuity plan, moving the corporate cookies too quickly into a “concept” may be considered equally irresponsible by a board of directors, as the cost of failure and loss of data remains extremely high.

The Burden Carried by Thought Leaders and Early Adopters

Very few ideas or visions are successful if kept secret.  Major shifts in technology or business process (including organizational structure) require more than exposure to a few white papers, articles, or segments on the “Tech Hour” of a cable news station.

Even as simple and routine as email is today, during the 1980s it was not fully understood, was mistrusted, and was even mocked by users of “stable” communication systems such as fax, TELEX, and land-line telephones.  Yet in 2010, presidents of the world’s most powerful nations are cheerfully texting, emailing, and micro-blogging their way through the highest levels of global diplomacy.

It takes time, experience, tacit knowledge, and the realization that your business, government, or social community is moving forward at a rate that will leave you on the outside if the new technology or service is not adopted and implemented.

The question is, “how long will it take us to reach the point where we must accept outsourcing our information technology services and infrastructure, or face a higher risk of not being part of our professional or personal community?”

E-mail first popped up in the late 1970s, and never really made it mainstream until around the year 2000.  Until then, when executives did use email, it was generally transcribed from written memos and typed in by a secretary.  So far, we have gradually started learning about cloud computing through use of social media, hosted public mail systems, and some limited SaaS applications.

Perhaps even at the point we evangelist types, as a community, are able to clearly articulate the reality that cloud computing has already planted its seeds in nearly every Internet-enabled computer, smart phone, and smart device, the vision of cloud computing will still be far too abstract for most to understand.

And this will subsequently reinforce the corporate and organizational mind’s natural desire to hold back until others have developed the knowledge base and best practices needed to bring their community to the point where implementing an IT outsourcing strategy will be to their benefit, and not a step in their undoing.

In fact, we need to train the IT community to be critical, to learn more about cloud computing, and question their role in the future of cloud computing.  How else can we expect the knowledge level to rise to the point IT managers will have confidence in this new service technology?

And You Thought it was About Competitive Advantage?

Yes, the cloud computing bandwagon is overflowing with snappy topics such as:

  • Infrastructure agility
  • Economies of scale
  • Enabling technology
  • Reduced provisioning cycles
  • Relief from capital expense
  • Better disaster recovery
  • Capacity on demand
  • IT as a Service
  • Virtual everything
  • Publics, privates, and hybrids
  • Multi-resource variability
  • Pay as you go

Oh my, we will need a special lexicon just to wade through the new marketing language.  The main goals of cloud computing, in our humble opinion, are:

  • Data center consolidation
  • Disaster recovery
  • IT as a Service

Cloud computing itself will not make us better managers and companies.  Cloud computing will serve as a very powerful tool to let us meet our organizational goals more efficiently, more quickly, and more effectively.  Until we have the confidence cloud computing will serve that purpose, it is probably a fairly significant risk to jump on the great marketing data dazzling us on PowerPoint slides and power presentations.

We will Adopt Cloud Computing, or Something Like It

Now to recover my cloud computing evangelist enthusiasm.  I do deeply believe in the word – the word of cloud computing as a utility, as a component of broadband communications, as all of the bullets listed above.  It will take time, and I warmly accept the burden of responsibility to further codify the realities of cloud computing, the requirements we need to fulfill as an industry to break out of the “first mover phase,” and the need to establish a roadmap for companies to shift their IT operations to the cloud.

Just as with email, it is just one of those things you know is going to happen.  We knew it in the early days of GRID computing, and we know it now.  Let’s focus our discussion on cloud computing more on the “how” and “when,” rather than the “wow” and “ain’t it cool.”

Now, as I dust off a circa-1980 set of slides discussing the value of messaging, and how it would support one-to-one, one-to-many, and many-to-many forms of interactive and non-interactive communications, it is time for us to provide a similar Introduction to Cloud.

Get the pulpit ready

Government Clouds Take on the ESBaaS

Recent discussions with government ICT leadership related to cloud computing strategies have all brought the concept of Enterprise Service Bus as a Service into the conversation.

Now ESBs are not entirely new, but in the context of governments they make a lot of sense.  In the context of cloud computing strategies in governments they make a heck of a lot of sense.

Wikipedia defines an ESB as:

In computing, an enterprise service bus (ESB) is a software architecture construct which provides fundamental services for complex architectures via an event-driven and standards-based messaging engine (the bus). Developers typically implement an ESB using technologies found in a category of middleware infrastructure products, usually based on recognized standards.

Now if you actually understand that – then you are no doubt a software developer.  For the rest of us, this means that with the ESB pattern, participants engaging in a service interaction communicate through a service or application “bus.”  This bus could be a database, virtual desktop environment, billing/payments system, email, or other service common to one or more agencies.  The ESB is designed to handle relationships between users through common services and a standardized data format.

New services can be plugged into the bus and integrated with existing services without any changes to the core bus service. Cloud users and applications developers will simply add or modify the integration logic.

Participants in a cross-organizational service interaction are connected to the Cloud ESB, rather than directly to one another, including: government-to-government, citizen-to-government, and business-to-government. Rules-based administration support will make it easier to manage ESB deployments through a simplified template allowing a better user experience for solution administrators.
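
To make the bus pattern a bit more concrete, here is a minimal, purely illustrative sketch of the idea (not any particular ESB product or government system).  Services subscribe to topics on a shared bus and exchange standardized messages, so a new agency can be plugged in without touching the services already connected.  The agency names, topic, and message fields are all invented for the example.

```python
# Minimal illustration of the ESB pattern: participants talk to the bus,
# never directly to each other. Agency names and message fields are invented.

class ServiceBus:
    def __init__(self):
        self.subscribers = {}  # topic -> list of handler callables

    def subscribe(self, topic, handler):
        """Plug a new service into the bus without changing the bus itself."""
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        """Deliver a standardized message to every service listening on a topic."""
        for handler in self.subscribers.get(topic, []):
            handler(message)

bus = ServiceBus()

# A hypothetical land registry listens for citizen address changes...
bus.subscribe("citizen.address_changed",
              lambda msg: print("Land registry updates parcel records for", msg["citizen_id"]))

# ...and a tax agency can be added later with no change to the existing services.
bus.subscribe("citizen.address_changed",
              lambda msg: print("Tax agency updates mailing address for", msg["citizen_id"]))

# The civil registry publishes one standardized message to the bus.
bus.publish("citizen.address_changed", {"citizen_id": "A-1024", "new_address": "..."})
```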

The Benefits to Government Clouds

In addition to fully supporting a logical service-oriented architecture (SOA), the ESBaaS will enhance or provide:

  • Open and published solutions for managing Web services connectivity, interactions, services hosting, and services mediation environment
  • From a development and maintenance perspective, the Government Cloud ESB allows agencies and users to securely and reliably share information between applications in a logical, cost effective manner
  • Government Cloud ESBs will simplify adding new services, or changing existing services, with minimal impact to the bus or other interfacing applications within the IT environment
  • Improvements in system performance and availability by offloading message processing and isolating complex mediation tasks in a dedicated ESB integration server

Again, possibly a mouthful, but if you can grasp the idea of a common bus providing services to a lot of different applications or agencies, allowing sharing of data and interfaces without complex relationships between each participating agency, then the value becomes much clearer.

Why the Government Cloud?

While there are many parallels to large companies, governments are unique in the number of separate ministries, agencies, departments, and organizations within the framework of government.  Governments normally need to share a tremendous amount of data between agencies, and in the past this was extremely difficult due to organizational differences, lack of IT support, or individuals who simply did not want to share data with other agencies.

The result of course was many agencies built their own stand-alone data systems, without central coordination, resulting in a lot of duplicate data items (such as an individual’s personal profile and information, business information, land management information, and other similar data).  Most often, there were small differences in the data elements each agency developed and maintained, resulting in either corrupt or conflicting data.

The ESB helps identify a method of connecting applications and users to common data elements, allowing the sharing of both application formats and, in many cases, database data sets.  This allows not only efficiency in software/applications development, but also a much higher level of standardization and common data sharing.

While this may be uncomfortable for some agencies, most likely those which do not want to share their data with the central government or use applications that are standardized with the rest of government, it also supports a very high level of government transparency, a controversial but essential goal of all developing (and developed) governments.

As governments continue to focus on data center consolidation and the great economical, environmental, and enabling qualities of virtualization and on-demand compute resources, integration of the ESBaaS makes a lot of sense. 

There are some very nice articles related to ESBs on the net which may help you better understand the concept, or give you some additional ideas.

Let us know your opinion or ideas on ESBaaS

Disaster Recovery as a First Step into Cloud Computing

Organizations see the benefits of cloud computing, however many are simply mortified at the prospect of re-engineering their operations to fit into existing cloud service technology or architectures.  So how can we make the first step?

We (at Pacific-Tier Communications) have conducted 103 surveys over the past few months in the US, Canada, Indonesia, and Moldova on the topic of cloud computing.  The surveys targeted both IT managers in commercial companies, as well as within government organizations.

The survey results were really no different than most – IT managers in general find cloud computing and virtualization an exciting technology and service development, but they are reluctant to jump into cloud for a variety of reasons, including:

  • Organization is not ready (including internal politics)
  • No specific budget
  • Applications not prepared for migration to cloud
  • Lots of other reasons

The list and reasoning for not going into cloud will continue until organizations get to the point they cannot avoid the topic, probably around the time of a major technology refresh.

Disaster Recovery is Different

The surveys also indicated another consistent trend – most organizations still have no formal disaster recovery plan.  This is particularly common within government agencies, including those state and local governments surveyed in the United States.

IT managers in many government agencies had critical data stored on laptop computers or desktops, or in most cases kept their organization’s operating data in a server closet with either no backup, or onsite backup to a tape system with no offsite storage.

In addition, the central or controlling government/commercial IT organization had either no specific policy for backing up data, or in the worst case no means of backing up data (central or common storage system) available to individual branch or agency users.

When asked whether they would support development of automated backup and individual workstation backup to prevent data loss and reinforce availability of applications, should cloud storage or even dedicated storage become available with reasonable technical ease and affordable cost, the IT managers agreed, most enthusiastically, that they would.
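
As a rough sketch of what such automated workstation backup could look like, the example below archives a local folder and pushes it to cloud object storage.  It assumes an S3-compatible store reachable through the boto3 library, and the folder path and bucket name are placeholders rather than recommendations of any particular provider.

```python
# A minimal sketch of automated off-site backup to cloud object storage.
# Bucket name and source folder are hypothetical; credentials are assumed to be
# configured in the environment. Any S3-compatible store could stand in here.

import datetime
import tarfile
import boto3

SOURCE_DIR = "/home/agency-user/documents"   # hypothetical workstation data
BUCKET = "agency-offsite-backup"             # hypothetical bucket

def run_backup():
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    archive_name = f"workstation-backup-{stamp}.tar.gz"

    # Bundle the directory into a compressed archive.
    with tarfile.open(archive_name, "w:gz") as tar:
        tar.add(SOURCE_DIR, arcname="documents")

    # Push the archive off-site; a cron job or scheduled task would call this daily.
    s3 = boto3.client("s3")
    s3.upload_file(archive_name, BUCKET, archive_name)

if __name__ == "__main__":
    run_backup()
```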

Private or Public – Does it Make a Difference?

While most IT managers are still worshiping at the shrine of IT Infrastructure Control, there are cracks appearing in the “Great Walls of IT Infrastructure.”  With dwindling IT budgets, and explosive user and organizational demand for IT utility, IT managers are slowly realizing the good old days of control are nearly gone.

And to add additional tarnish to pride, the IT managers are also being faced with the probability at least some of their infrastructure will find its way into public cloud services, completely out of their domain.

On the other hand, it is becoming more and more difficult to justify building internal infrastructure when the quality, security, and utility of public services often exceeds that which can be built internally.  Of course there are exceptions to every rule, which in our discussion includes requirements for additional security for government sensitive or classified information.

That information could include military data, citizen identification data, or other similar information that, while securable through encryption and partition management, politically (particularly in cases where the data could possibly leave the borders of a country) may not be possible to extend beyond the walls of an internal data center.

For most other information, it is quickly becoming a simple exercise in financial planning to determine whether or not a public storage service or internal storage service makes more sense. 

The Intent is Disaster Recovery and Data Backup

Getting back to the point, with nearly all countries, and in particular central government properties, being on or near high capacity telecom carriers and networks, and the cost of bandwidth plummeting, the excuses for not using network-based off-site backups of individual and organization data are becoming rare.

In our surveys and interviews it was clear IT managers fully understood the issue, need, and risk of failure relative to disaster recovery and backup.

Cloud storage, when explained and understood, would help solve the problem.  As a first step, and assuming a successful first step, pushing disaster recovery (at least on the level of backups) into cloud storage may be an important move ahead into a longer term move to cloud services.

All managers understood the potential benefits of virtual desktops, SaaS applications, and use of high performance virtualized infrastructure.  They did not always like it, but they understood within the next refresh generation of hardware and software technology, cloud computing would have an impact on their organization’s future.

But in the short term, disaster recovery and systems backup into cloud storage is the least traumatic first step ahead.

How about your organization?

The Bell Tolls for Data Centers

In the good old days (the late 90s and most of the 2000s) data center operators loved selling individual cabinets to customers.  You could keep your prices high for the cabinet, sell power by the “breakered amp,” and try to maximize cross connects through a data center meet-me room.  All designed to squeeze the most revenue and profit out of each individual cabinet, with the least amount of infrastructure burden.

Forward to 2010.  Data center consolidation has become an overwhelming theme, emphasized by US CIO Vivek Kundra’s mandate to force the US government, as the world’s largest IT user, to eliminate most of its more than 1,600 federal government owned and operated data centers (consolidating into about a dozen), and further promote efficiency by adopting cloud computing.

The Gold Standard of Data Center Operators Hits a Speed Bump

Equinix (EQIX) has a lot of reasons and explanations for their expected failure to meet 3rd quarter revenue targets.  Higher than expected customer churn, reducing pricing to acquire new business, additional accounting for the Switch and Data acquisition, etc., etc., etc…

The bottom line is – the data center business is changing.  Single cabinet customers are looking at hosted services as an economical and operational alternative to maintaining their own infrastructure.  Face it, if you are paying for a single cabinet to house your 4 or 5 servers in a data center today, you will probably have a much better overall experience if you can migrate that minimal web-facing or customer-facing equipment into a globally distributed cloud.

Likewise, cloud service providers are supporting the same level of Internet peering as most content delivery networks (CDNs) and Internet Service Providers (ISPs), allowing the cloud user to relieve themselves of the additional burden of operating expensive switching equipment.  The user can still decide which peering, ISP, or network provider they want on the external side of the cloud; however, the physical interconnections are no longer necessary within that expensive cabinet.

The traditional data centers are beginning to experience the move to shared cloud services, as is Equinix, through higher churn rates and lower sales rates for those individual cabinets or small cages.

The large enterprise colocation users and CDNs continue to grow larger, adding to their ability to renegotiate contracts with the data centers.  Space, cross connects, power, and service level agreements favor the large footprint and power users, and the result is data centers are further becoming a highly skilled, sophisticated commodity.

The Next Generation Data Center

There are several major factors influencing data center planners today.  Those include the impact of cloud computing, the emergence of containerized data centers, the need for far greater energy efficiency (often using PUE – Power Usage Effectiveness – as the metric), and the industry drive towards greater data center consolidation.
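
For readers new to the metric, PUE is simply the total power drawn by the facility divided by the power delivered to the IT equipment, so values approaching 1.0 mean less energy lost to cooling and distribution.  A quick worked example with made-up numbers:

```python
# PUE = total facility power / IT equipment power (illustrative numbers only).
total_facility_kw = 1500.0   # servers + cooling + lighting + power distribution
it_equipment_kw = 1000.0     # power actually reaching the IT load

pue = total_facility_kw / it_equipment_kw
print(f"PUE = {pue:.2f}")    # 1.50 -> one third of the facility power never reaches the IT load
```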

Hunter Newby, CEO of Allied Fiber, strongly believes: “Just as in the last decade we saw the assembly of disparate networks into newly formed common, physical layer interconnection facilities in major markets we are now seeing a real coordinated global effort to create new and assemble the existing disparate infrastructure elements of dark fiber, wireless towers and data centers. This is the next logical step and the first in the right direction for the next decade and beyond.”

We are also seeing data center containers popping up along the long fiber routes, adjacent to traditional breaking points such as in-line amplifiers (ILAs), fiber optic terminals (locations where carriers physically interconnect their networks either for end-user provisioning, access to metro fiber networks, or redundancy), and wireless towers. 

So does this mean the data center of the future is not necessarily confined to large 500 megawatt data center farms, and is potentially something that becomes an inherent part of the transmission network?  The computer is the network, the network is the computer, and all other variations in between?

For archival and backup purposes, or caching purposes, can data exist in a widely distributed environment?

Of course latency within the storage and processing infrastructure will still be dependent on physics for the near term.  Actually, for end user applications such as desktop virtualization, there really isn’t any particular reason that we MUST have that level of proximity…  And there probably are ways we can “spoof” the systems into thinking they are located together, and there are a host of other reasons why we do not have to limit ourselves to a handful of “Uber Centers…”

A Vision for Future Data Centers

What if broadband and compute/storage capacity become truly insulated from the user?  What if Carr’s ideas behind The Big Switch are really the future of computing as we know it, and our interface to the “compute brain” is limited to dumb devices, and we no longer have to concern ourselves with anything other than writing software against a well publicized set of standards?

What if the next generation of Equinix is a partner to Verizon or AT&T, and Equinix builds a national compute and storage utility distributed along the fiber routes that is married to the communications infrastructure transmission network?

What if our monthly bill for entertainment, networking, platform, software, and communications is simply the record of how much utility we used during the month, or our subscription fee for the month? 

What if wireless access is transparent, and globally available to all mobile and stationary terminals without reconfiguration and a lot of pain?

No more “remote hands” bills, midnight trips to the data center to replace a blown server or disk, dealing with unfriendly or unknowledgeable  “support” staff, or questions of who trashed the network due to a runaway virus or malware commando…

Kind of an interesting idea.

Probably going to happen one of these days.

Now if we can extend that utility to all airlines so I can have 100% wired access, 100% of the time.

Data Centers Hitting a Wall of Cloud Computing

Equinix lowers guidance due to higher than expected churn in its data centers and price erosion on higher end customers.  Microsoft continues to promote hosted solutions and cloud computing.  Companies such as Lee Technologies, CirraScale, Dell, HP, and SGI are producing containerized data centers to improve efficiency, cost, and manageability of high density server deployments.

The data center is facing a challenge.  The idea of a raised floor, cabinet-based data center is rapidly giving way to virtualization and highly expandable, easy-to-maintain container farms.

The impact of cloud computing will be felt across every part of life, not least the data center, which faces a degree of automation not yet seen.

Microsoft CEO Steve Ballmer believes “the transition to the cloud [is] fundamentally changing the nature of data center deployment.” (Data Center Dynamics)

As companies such as Allied Fiber continue to develop visions of high density utility fiber ringing North America, with the added potential of dropping containerized cloud computing infrastructure along fiber routes and power distribution centers, AND the final interconnection of 4G/LTE/XYZ towers and metro cable along the main routes, the potential of creating a true 4th public utility of broadband with processing/storage capacity becomes clear.

Clouds Come of Age

Data center operators such as Equinix have traditionally provided a great product and service for companies wishing to either outsource their web-facing products into a facility with a variety of Internet Service Providers or Internet Exchange Points providing high performance network access, or eliminate the need for internal data center deployments by outsourcing IT infrastructure into a well-managed, secure, and reliable site.

However the industry is changing.  Companies, in particular startup companies, are finding there is no technical or business reason to manage their own servers or infrastructure, and that nearly all applications are becoming available as cloud-based SaaS (Software as a Service) hosted applications.

Whether you are developing your own virtual data center within a PaaS environment, or simply using Google Apps, Microsoft Hosted Office Applications, or other SaaS, the need to own and operate servers is beginning to make little sense.  Cloud service providers offer higher performance, flexible on-demand capacity, security, user management, and all the other features we have come to appreciate in the rapidly maturing cloud environment.

With containers providing a flexible physical apparatus to easily expand and distribute cloud infrastructure, as a combined broadband/compute utility, even cloud service providers are finding this a strong alternative to placing their systems within a traditional data center.

With the model of “flowing” cloud infrastructure along the fiber route to meet proximity, disaster recovery, or archival requirements, the container model will become a major threat to the data center industry.

What is the Data Center to Do?

Ballmer:

“A data center should be like a container – that you can put under a roof or a cover to stop it getting wet. Put in a slab of concrete, plumb in a little garden hose to keep it cool, yes a garden hose – it is environmentally friendly, connect to the network and power it up. Think of all the time that takes out of the installation.”

Data center operators need to rethink their concept of the computer room.  Building a 150 Megawatt, 2 million square foot facility may not be the best way to approach computing in the future.

Green, low powered, efficient, highly virtualized utility compute capacity makes sense, and will continue to make more sense as cloud computing and dedicated containers continue to evolve.  Containers supporting virtualization and cloud computing can certainly be secured, hardened, moved, replaced, and refreshed with much less effort than the “uber-data center.”

It makes sense, will continue to make even more sense, and if I were to make a prediction, will dominate the data delivery industry within 5~10 years.  If I were the CEO of a large data center company, I would be doing a lot of homework, with a very high sense of urgency, to get a complete understanding of cloud computing and industry dynamics.

Focus less on selling individual cabinets and electricity, and direct my attention to better understanding cloud computing and the 4th Utility of broadband/compute capacity.  I wouldn’t turn out the lights in my carrier hotel or data center quite yet, but this industry will be different in 5 years than it is today.

Given the recent stock volatility in the data center industry, it appears investors are also becoming concerned.

The Reality of Cloud Implementation Part 1 – Hosted Applications

As a business consultant providing direction and advice to both government clients and commercial clients, several topics continue to drive discussion not only on short term IT strategy, but also longer term innovative contributions cloud computing can offer the organization.

However to get the conversation moving forward, managers contemplating major architectural change to their IT organizations need to find a good reference or pilot project to justify the expense and contribute to change.  Not the preferred approach, but a reality.

One easy IT project is the move from workstation-based applications, primarily office automation suites, to server-based applications.  The choice is between hosting applications within a private (enterprise) network, or outsourcing the application to a commercial provider such as Microsoft Live Office or Google Apps.

Hosted applications make a lot of sense – for most users.  It is a great idea to offload the burden of desktop application administration from IT managers when possible, with an expectation of the following:

  1. Greater control over intellectual property (files are stored on a central file server, not on individual hard drives and computers)
  2. Greater control over version and application code updates
  3. Greater control over security, anti-virus, and anti-spam definitions
  4. Application standardization (including organizational templates and branding)
  5. Better management of user licenses (and eliminating use of unauthorized or copied software)

If we look at profiles of most organizational users, the vast majority are office workers who normally do not need to travel, access files or applications from home, or stay on call 24 hours a day.  Thus we can assume, while at the office, computers are connected to a high performance LAN, with high bandwidth and throughput within the organization.

If that assumption is correct, and the organization implements either an enterprise-hosted or commercially hosted solution (Google or Microsoft as examples), then those individual workstations can also stop keeping files on local drives (files can all be available and backed up on a file server), and can use web-based applications for most activities.

The user’s relationship with the network, applications, and intellectual property is channeled through a workstation or web interface.  This also enables users, through use of VPNs and other access security, to use any compatible interface available when connecting to applications and files.  This includes home computers and mobile devices – as long as the data is retained on the host file server, and a record is created of all users accessing the data for both security and network/computer resource capacity management.

NOTE:  As a frequent traveler I also spend a considerable amount of time in airplanes, airports, and areas without easy access to the Internet or my file servers.  I do keep an image of MS Office on my laptop, and do have a very large hard drive, and do have a library of SD chips and flash drives for use when un-tethered from my web apps.  I don’t see this changing in the near future – however I am probably in a very small minority of professional road warriors who still justify use of local images.  Most do not.

An Unscientific Review of Web-Based Office Automation Applications

First, I am writing this blog entry using Microsoft’s Live Writer, a web/cloud-based application available for blog writers.  It is one application available within the Microsoft “Live-Everything” suite of web-based utilities, which include office automation and social networking applications.

The Live Writer application connects with my blog provider (WordPress), downloads my blog profile, and uses that as a what-you-see-is-what-you-get editing interface.  I feel as if I am typing directly into my blog, without the need to understand HTML commands or other manual features.

Adding video, tables, tags, hyperlinks, and SEO tools is effortless.

Going further into my Microsoft Live Office account I can upload, download, create, edit, and store documents in all MS Office formats, with the main free apps including Word, Excel, PowerPoint, and OneNote.  Mail, calendars, web sites, blogs – a variety of different utilities for personal and potentially professional use.

It is easy to share documents, create collaboration groups, and integrate MS Messenger-driven conferencing and sharing among other connected colleagues.  All available as a free environment for any user without the need to buy MS Office products for your personal computer.  Other commercial products offer a lot more utility, however as a basic test environment, the performance of MS Live Office is more than adequate for probably 95% of office workers worldwide.

Face it, most of us rarely use anything beyond the most basic features of any office automation product, and purchasing licenses for individual office automation suites for each organizational user really only benefits the vendor.

Google Docs, and the Google Apps engine provide similar features to the Microsoft Suite, and some additional unique features not currently available (or easily noticed) on the Live Office sites.  At a high level, Google provides network users:

  • Documents (word processing, spreadsheets, presentations)
  • Forms
  • Drawing/graphics
  • Templates
  • Blogs
  • Analytics
  • Lots of other stuff

In my absolutely unscientific testing of both Google and Microsoft web-based applications, I did not find a single feature which I normally use in preparing presentations, documents, and spreadsheets that could not be reproduced with the online edition.

If that is true for most users, then we can probably look toward a future where cloud-based and hosted office automation applications begin to replace software loaded on individual workstations.

The Danger of Easy Outsourcing

In a world of Service Oriented Architectures (SOA), and close inter-relationships of data, care is needed to ensure we do not create pilot “islands of unconnectable data.”  Today, nearly all data is connectable, whether tables and forms within an email message, SMS messages, spreadsheets, databases, or any other potential SaaS application.

A word we need to keep in our IT vocabulary is “PORTABILITY.”  Anything we type into an application is a candidate for logging, enquiry, statistics, reporting, or other use of data.  This is a concern when using SaaS applications for not only office automation, but any other hosted application. 

Any and all data we create must be available to any other application which can consume it, or integrate it into organizational or industry community-of-interest applications.  We will look into the SaaS portability question in part 2 of this series.
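
One practical test of portability is whether everything typed into a hosted application can be pulled back out in a neutral, machine-readable form.  The sketch below is purely illustrative (the record layout is invented), but it shows the kind of export any SaaS candidate should make trivial:

```python
# Illustrative only: exporting records from a hosted application into neutral
# formats (JSON and CSV) so any other system can consume them later.
# The record structure is invented for the example.

import csv
import json

records = [
    {"doc_id": 1, "title": "Q3 Budget", "owner": "finance", "format": "spreadsheet"},
    {"doc_id": 2, "title": "Land Use Plan", "owner": "planning", "format": "document"},
]

with open("export.json", "w") as f:
    json.dump(records, f, indent=2)

with open("export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)
```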

Indonesia’s Wireless Vision Goes High Speed

In Los Angeles we are pretty happy with our Android phones, iPhones, and other smart handheld devices. We can buy an EVDO card for our laptops, and now 4G cards are starting to pop up in some locations. In Jakarta people laugh at such nonsense. With high speed wireless infrastructure covering over 95% of the addressable Indonesian population, the country has leap-frogged not only America, but also much of Asia in delivering high speed wireless service.

If you take a walk through Jakarta’s Mall Ambassador you are presented with a dizzying array of high speed wireless access options for both smart phones and USB flash modems – and oh yes, even EVDO if that is what you really want. So you select your option: is it HSDPA? HSPA? HSPA+? In Jakarta you can easily buy HSPA+ flash modems and base stations that actually deliver between 21~42Mbps to an end user device.

While the highest speeds may not be affordable to the masses, nearly all smartphones and base stations are more than adequate for web browsing and streaming media. In fact, Indonesia has the largest number of mobile FaceBook users in the world, and that number continues to grow at an astonishing rate, as more Indonesians invest in internet-enabled devices as a tool for their future.

But let’s go beyond the city limits of Jakarta, and look at what this means to other rural and remote parts of the country.

If 95% of the population is covered by wireless antennas, and all of those antennas are capable of supporting at least some level of Internet access, then the need for laying copper cable to end users in remote locations becomes less important. An HSDPA base station that connects to a 7.2Mbps data stream can easily connect a LAN of dumb terminals (NetBooks) to a school in remote parts of Sumatra or Papua. eLearning, including remote transmission of lectures, lessons, podcasts, or other means of delivering knowledge becomes possible, giving a level academic playing field to anybody in the country.
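
Some quick back-of-the-envelope arithmetic (with assumed, illustrative figures rather than measurements) shows why a single base station can serve a small school lab:

```python
# Back-of-the-envelope sharing of one 7.2 Mbps HSDPA downlink across a small
# lab of terminals. All figures are illustrative assumptions.
downlink_mbps = 7.2
terminals = 20
concurrency = 0.5   # assume roughly half the terminals pull data at any moment

per_active_terminal_mbps = downlink_mbps / (terminals * concurrency)
print(f"~{per_active_terminal_mbps:.2f} Mbps per active terminal")  # ~0.72 Mbps
```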

City offices, commercial businesses, and even individual homes can connect to the HSDPA signal, allowing Internet access with the same or better performance many users experience with cable modems or organizational LANs connecting to a local ISP or carrier. Add a bit of cloud computing offering a suite of hosted SaaS applications and secure storage in a data center available to users throughout the country, and we have the beginnings of national access to the 4th Utility (the marriage of broadband access and cloud computing resources) in Indonesia.

But probably the most interesting, and useful, example of delivering Internet access to those who need it most is the WarNet. The WarNet is the Indonesian version of an Internet Café. In many rural communities and urban inner-city areas people do not have the money to afford buying their own computer, or do not have the ability to connect to the Internet from their homes or offices. The WarNet may connect a small Internet kiosk to wireless Internet in a remote location, offer some basic printing services, and that kiosk becomes a social, educational, business, and entertainment hub for small communities.

Schools could follow the same model as WarNets, connecting to broadband wireless through a local base station and extending an access LAN to student workstations and terminals. Again, with eLearning those terminals can be dumb, with the applications and student working storage on a data center hosted platform.

High speed broadband wireless is effectively bringing the Internet to nearly all Indonesians. Now the effort needs to be making access devices more affordable and more available, as well as producing high quality content and content delivery into the wireless networks. As most of the wireless networks are still not exceeding ~30% of their transmission capacity at peak, there is ample room for growth.

Backbone fiber networks owned by the wireless carriers and wholesale providers will continue to expand, enhancing the wireless operator’s ability to increase their capacity to meet the potential of future wireless technologies such as LTE and 4G. And Indonesians will continue to approach the Internet’s technical edge.

Not bad Indonesia… not bad at all

A Cloud Computing Epiphany

One of the greatest moments a cloud evangelist indulges in occurs at that point a listener experiences an intuitive leap of understanding following your explanation of cloud computing. No greater joy and intrinsic sense of accomplishment.

Government IT managers, particularly those in developing countries, view information and communications technology (ICT) as almost a “black” art. This is unlike the US, Europe, Korea, Japan, or other countries where the Internet and network-enabled everything have diffused into the core of Generation “Y-ers,” Millennials, and Gen “Z-ers.” The black art gives IT managers in some legacy organizations the power they need to control the efforts of people and groups needing support, as their limited understanding of ICT still sets them slightly above the abilities of their peers.

But, when the “users” suddenly have that right brain flash of comprehension in a complex topic such as cloud computing, the barrier of traditional IT control suddenly becomes a barrier which must be explained and justified. Suddenly everybody from the CFO down to supervisors can become “virtual” data center operators – at the touch of a keyboard. Suddenly cloud computing and ICT becomes a standard tool for work – a utility.

The Changing Role of IT Managers

IT managers normally make marginal business planners. While none of us like to admit it, we usually start an IT refresh project with thoughts like, “what kind of computers should we request budget to buy?” Or “that new ‘FuzzPort 2000’ is a fantastic switch, we need to buy some of those…” And then spend the next fiscal year making excuses why the IT division cannot meet the needs and requests of users.

The time is changing. The IT manager can no longer think about control, but rather must think about capacity and standards. Setting parameters and process, not limitations.

Think about topics such as cloud computing, and how they can build an infrastructure which meets the creativity, processing, management, scaling, and disaster recovery needs of the organization. Think of gaining greater business efficiencies and agility through data center consolidation, education, and breaking down ICT barriers.

The IT manager of the future is not only a person concerned about the basic ICT food groups of concrete, power, air conditioning, and communications, but also concerns himself with capacity planning and thought leadership.

The Changing Role of Users

There is an old story of the astronomer and the programmer. Both are pursuing graduate degrees at a prestigious university, but from different tracks. By the end of their studies (this is a very old story), the computer science major focusing on software development found his FORTRAN skills were actually below the FORTRAN skills of the astronomer.

“How can this be?” cried the programmer. “I have been studying software development for years, and you studying the stars?”

The astronomer replied, “You have been studying FORTRAN as a major for the past three years. I have needed to learn FORTRAN and apply it in real application to my major, studying the solar system, and needed to learn to code better than you just to do my job.”

There will be a point when the Millennials, with their deep-rooted appreciation for all things network and computer, will be able to take our Infrastructure as a Service (IaaS) and use it as their tool for developing great applications, driving their business into a globally wired economy and community. Loading a LINUX image and a suite of standard applications will give the average person no more intellectual stress than a “Boomer” sending a fax.

Revisiting the “4th” Utility

Yes, it is possible IT managers may be the road construction and maintenance crews of the Internet age, but that is not a bad thing. We have given the Gen Y-ers the tools they need to be great, and we should be proud of our accomplishments. Now is the time to build better tools to make them even more capable. Tools like the 4th utility which marries broadband communications with on-demand compute and storage utility.

The cloud computing epiphany awakens both IT managers and users. It stimulates an intellectual and organizational freedom that lets creative people and productive people explore more possibilities, with more resources, with little risk of failure (keep in mind that with cloud computing you are potentially just renting your space).

If we look at other utilities as a tool, such as a road, water, or electricity – there are far more possibilities to use those utilities than the original intent. While a road may be considered a place to drive a car from point “A” to point “B,” it can also be used for motorcycles, trucks, bicycles, walking, a temporary hard stand, a temporary runway for airplanes, a stick ball field, or a street hockey rink – at the end of the day it is a slab of concrete or asphalt that serves an open-ended scope of use, with only structural limitations.

Cloud computing and the 4th utility are the same. Once we have reached that cloud computing epiphany, our next generations of tremendously smart people will find those creative uses for the utility, and we will continue to develop and grow closer as a global community.

Communities in the Cloud

In the 1990s community of interest networks (COINs) emerged to take advantage of rapidly developing Internet protocol technologies. A small startup named BizNet on London’s Chiswell Street developed an idea to build a secure, closed network to support only companies operating within the securities and financial industries.

BizNet had some reasonable traction in London, with more than 100 individual companies connecting within the secure COIN. Somewhat revolutionary at the time, and it did serve the needs of their target market. Management was also simple, using software from a small company called IPSwitch and their soon to be globally popular “What’s Up” network management and monitoring utility.

However, simplicity was the strength of BizNet. While other companies favored strong marketing campaigns and a lot of flash to attract companies to the Internet age, BizNet’s thought leaders (Jez Lloyd and Nick Holland) relied on a strong commitment to service delivery and excellence, and their success became viral within the financial community based on the confidence they built among COIN members.

As networks go, so did BizNet, which was purchased by Level 3 Communications in 1999; the COIN network was subsequently dismantled in favor of integrating the individual customers into the Level 3 community.

Cloud Communities

Cloud computing supports the idea of a COIN, as companies can not only build their “virtual data center” within a Platform as a Service/PaaS model, but also develop secure virtual interconnections among companies within a business community – not only within the same cloud service provider (CSP), but also among cloud service providers.

In the “BizNet” version of a COIN, dedicated connections (circuits) were needed to connect routers and switches to a central exchange point run by BizNet. BizNet monitored all connections, reinforcing internal operations centers run by individual companies, and added an additional layer of confidence that helped a “viral” growth of their community.

Gerard Briscoe and Alexandros Marinos delivered a paper in 2009 entitled “Digital Ecosystems in the Clouds: Towards Community Cloud Computing.” In addition to discussing the idea of using cloud computing to support an outsourced model of the COIN, the paper also drills deeper into additional areas such as the environmental sustainability of a cloud community.

As each member of the cloud community COIN begins to outsource their virtual data center into the cloud, they are able to begin shutting down inefficient servers while migrating processing requirements into a managed virtual architecture. Even the requirement for managing high performance switching equipment supporting fiber channel and SAN systems is eliminated, with the overall result allowing a significant percentage of costs associated with equipment purchase, software licenses, and support agreements to be rechanneled to customer or business-facing activities.

Perhaps the most compelling potential feature of community clouds is the idea that we can bring processing between business or trading partners within the COIN to near zero, as the interaction between members is on the same system, and will not lose any velocity due to delays induced by going through switching, routing, or short/long distance transmission through the Internet or dedicated circuits.

Standards and a Community Applications Library

Most trading communities and supply chains have a common standard for data representation, process, and interconnection between systems. This may be a system such as RosettaNet for the manufacturing industry, or other similar industry specifications. Within the COIN there should also be a central function that provides the APIs, specifications, and other configurations such as security and web services/interconnection interface specs.

As a function of developing a virtual data center within the PaaS model, standard components supporting the COIN such as firewalls, APIs, and other common applications should be easily accessible for any member, ensuring from the point of implementation that joining the community is a painless experience, and a very rapid method of becoming a full member of the community.

A Marriage of Community GRIDs and Cloud Computing?

Many people are very familiar with projects such as SETI At Home and the World Community GRID. Your desktop computer, servers, or even storage equipment can contribute idle compute and storage capacity to batch jobs supporting everything from searching for extraterrestrial life to AIDS research. You simply register your computer with the target project, download a bit of client software, and the client communicates with a project site to coordinate batch processing of work units/packets.
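
The same loop could, in principle, serve a community cloud trading spare capacity: a thin client fetches a work unit, processes it with idle cycles, and reports the result back to a coordinator. The sketch below is a generic illustration of that pattern; the coordinator URL and payload format are hypothetical, and this is not the actual SETI At Home or World Community GRID client.

```python
# Generic sketch of a volunteer-computing work loop: fetch a work unit, process
# it with idle capacity, return the result. The coordinator URL and payload
# format are hypothetical, not any real project's API.

import json
import time
import urllib.request

COORDINATOR = "https://coordinator.example.org/api"  # hypothetical endpoint

def fetch_work_unit():
    with urllib.request.urlopen(f"{COORDINATOR}/next-work-unit") as resp:
        return json.loads(resp.read())

def process(work_unit):
    # Stand-in for the real computation (signal analysis, protein folding, ...).
    return {"work_unit_id": work_unit["id"], "result": sum(work_unit["samples"])}

def report(result):
    req = urllib.request.Request(
        f"{COORDINATOR}/results",
        data=json.dumps(result).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

while True:
    unit = fetch_work_unit()
    report(process(unit))
    time.sleep(5)  # be polite to the coordinator between batches
```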

Now we know our COIN is trying to relieve members from the burden of operating their own data centers – at least those portions of the data center focusing on support of a supply chain or trading community of interest. And some companies are more suited to outsourcing their data center requirements than others. So if we have a mix of companies still operating large data centers with potential sources of unused capacity, and other members in the community cloud with little or no onsite data center capacity, maybe there is a way the community can support itself further by developing the concept of processing capacity as a currency.

As all individual data centers and office LAN/MAN/WANs will have physical connections to the cloud service provider (IaaS provider) through an Internet service provider or dedicated metro Ethernet connection, the virtual data centers being produced within the PaaS portion of the CSPs will be inherently connectable to any user, or any facility, within the COIN. Of course that assumes security management will protect non-COIN connected portions of the community.

Virtually, those members of the community with excess capacity within their own networks could then easily contribute their spare capacity to the community for use as non-time-critical compute resource, or for supporting “batch” processing. Some CSPs may even consider buying that capacity to provide members, either in the COIN or outside of the COIN, an additional resource available to their virtual customers as low cost, low performance batch capacity, much like SETI at Home or the Protein Folding Project uses spare capacity on an as-available basis. Much like selling your locally produced energy back into a power GRID.

We Have a New, Blank Cloud White Board to Play With

The BizNet COIN was good. Eleven years after BizNet was dissolved, the concept remains valid, and we now have additional infrastructure that will support COINs through community clouds, with enabling features that extend far beyond the initial vision of BizNet. CSPs such as ScaleUp have built IaaS and PaaS empowerment for COINs within their data center.

Cloud computing is an infant. Well, maybe in Internet years it is rapidly heading toward adolescence, but it is still pretty young. Like an adolescent, we know it is powerful, getting more powerful by the day, but few people have the vision to wrap their heads around what broadband, cloud computing, the diffusion of network-enabled knowledge into the basic education system, and the continuation of Moore’s, Metcalfe’s, and other laws of industry and physics will bring.

COINs and community clouds may not have been in the initial discussions of cloud computing, but they are here now. Watching a Slingbox feed in a Jakarta hotel room connected to a television in Burbank was probably not a vision shared by the early adopters of the Internet – and cloud computing will make similar un-thought of leaps in utility and capabilities over the next few years.

However, in the near term, do not be surprised if you see the entire membership of the New York Stock Exchange and NASDAQ operating from a shared cloud COIN. It will work.

Expanding the 4th Utility to Include Cloud Computing

A lot has been said over the past couple of months about broadband as the fourth utility, with the same status as roads, water, and electricity. In America, the next generation will have broadband network access as an entitlement. But is it enough?

Carr, in “The Big Switch,” discusses cloud computing being analogous to the power grid. The only difference is that for cloud computing to be really useful, it has to be connected. Connected to networks, homes, businesses, SaaS, and people. So the next logical extension for a fourth utility, beyond simply referring to broadband network access as a basic right for Americans (and others around the world – it just happens that as an American, for purposes of this article, I’ll refer to my own country’s situation), should include additional resources beyond simply delivering bits.

The “New” 4th Utility

So the next logical step is to marry cloud computing resources, including processing capacity, storage, and software as a service, to the broadband infrastructure. SaaS doesn’t mean you are owned by Google, it simply means you have access to those applications and resources needed to fulfill your personal or community objectives, such as having access to centralized e-Learning resources to the classroom, or home, or your favorite coffee shop. The network should simply be there, as should the applications needed to run your life in a wired world.

The data center and network industry will need to develop a joint vision that allows this environment to develop. Data centers house compute utility, networks deliver the bits to and from the compute utility and users. The data center should also be the interconnection point between networks, which at some point in the future, if following the idea of contributing to the 4th utility, will finally focus their construction and investments in delivering big pipes to users and applications.

Relieving the User from the Burden of Big Processing Power

As we continue to look at new home and laptop computers with quad-core processors, more than 8 gigs of memory, and terabyte hard drives, it is hard to believe we actually need that much compute power resting on our knees to accomplish the day-to-day activities we perform online. Do we need a quad core computer to check Gmail or our presentation on Microsoft Live Office?

In reality, very few users have applications that require the amounts of processing and storage we find in our personal computers. Yes, there are some applications such as gaming and very high end rendering which burn processing calories, but for most of the world all we really need is a keyboard and screen. This is what the 4th utility may bring us in the future. All we’ll really need is an interface device connecting to the network, and the processing “magic” will take place in a cloud computing center with processing done on a SaaS application.

The interface device is a desktop terminal, intelligent phone (such as an Android, iPhone, or other wired PDA device), laptop, or anything else that can display and input data.

We won’t really care where the actual storage or processing of our application occurs, as long as the application’s latency is near zero.

The “Network is the Computer” Edges Closer to Reality

Since John Gage coined those famous words while working at Sun Microsystems, we’ve been edging closer to that reality. Through the early days of GRID computing, software as a service, and virtualization – added to the rapid development of the Internet over the past 20 years, technology has finally moved compute resource into the network.

If we are honest with ourselves, we will admit that for 95% of computer users, a server-based application meets nearly all our daily office automation, social media, and entertainment needs. Twitter is not a computer-based application, it is a network-enabled server-based application. Ditto for Facebook, MySpace, LinkedIn, and most other services.

Now the “Network is the Computer” has finally matured into a utility, and at least in the United States, will soon be an entitlement for every resident. It is also another step in the globalization of our communities, as within time no person, country, or point on the earth will be beyond our terminal or input device.

That is good
