Digital Africa 2010 and Cloud Computing in Developing Countries

At the Digital Africa Summit 2010 in Kampala, Uganda, discussion is rightly focused on both telecommunications policy and economic development. Cloud computing is a topic heard in sidebar discussions, although it has yet to hit the mainstream of conference programming.

We will bring a series of reports from Digital Africa – it is a very exciting group of people who truly have the best interests of Africa as their key objective. Kicked off by Dr. Gilbert Balibaseka Bukenya, Vice President of Uganda, the conference also included ministers of communications from Uganda, Niger, Cameroon, and Burkina Faso. Other nations are also well represented, with delegates from the private sector, government, and education.

With that many politicians, you would think protocol would prevent any level of innovation or open discussion. Not the case – it was a very cooperative environment.

Why is cloud important in developing countries?

It is a reasonable question, and it has a reasonable answer. The basic requirements in developing countries (beyond clean water and food) are infrastructure, education, jobs, and eGovernment (including banking). Nothing works without the infrastructure in place. In countries with unstable electricity and limited telecom infrastructure, this has to be a high priority.

When building out the basic infrastructure in countries with a tremendous amount of sunlight, wind or solar energy makes a lot of sense. It is far more sustainable than running diesel generators, and, as an unfortunate byproduct of global warming, more sunny days are available each year to provide power.

In rural areas we are talking about enough power to provide electricity for schools, internet kiosks or cafes, and wireless access points in city centers. 15kW would do it, and that is not an unreasonable target if we are looking at low-powered netbooks and terminals that do not carry a large local burden of processing power, memory, storage, and high-performance video applications.
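As a sanity check on that 15kW figure, here is a minimal back-of-the-envelope sketch in Python – every device count and wattage below is an illustrative assumption, not a measured figure:

    # Rough power budget for a rural connected site (all numbers are illustrative)
    SITE_BUDGET_W = 15_000  # assumed 15kW available from solar/wind

    loads_w = {
        "netbooks (40 x ~25W)": 40 * 25,
        "thin-client terminals (20 x ~50W)": 20 * 50,
        "wireless access points (10 x ~15W)": 10 * 15,
        "small server and networking gear": 1_500,
        "lighting and other overhead": 2_000,
    }

    total_w = sum(loads_w.values())
    print(f"Total load: {total_w}W of a {SITE_BUDGET_W}W budget")
    print(f"Headroom: {SITE_BUDGET_W - total_w}W")

Even with generous overhead, a site full of low-powered clients fits comfortably inside the budget – which is the whole argument for thin terminals over full workstations.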

According to several presentations at Digital Africa, there is strong evidence that for each 10% of a population in Africa gaining access to mobile or Internet technologies, there is a corresponding 1.8% increase in that nation’s GDP. This is evidence that simply bringing Internet access and education to the rural and unwired population will increase national wealth, and quality of life, by an annual 1.8%.
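To make the claim concrete, here is a minimal sketch applying that 1.8%-per-10% figure to a GDP index – whether the effect compounds across each 10% step is my assumption for illustration, and the adoption levels are invented:

    # Apply the "1.8% GDP increase per 10% penetration" figure from the
    # Digital Africa presentations to a hypothetical GDP index of 100.
    UPLIFT_PER_10PCT = 0.018

    def gdp_index(penetration_pct: float, base: float = 100.0) -> float:
        steps = penetration_pct / 10.0  # number of 10% adoption steps
        return base * (1 + UPLIFT_PER_10PCT) ** steps  # assume compounding

    for penetration in (10, 20, 30, 50):
        print(f"{penetration}% access -> GDP index {gdp_index(penetration):.1f}")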

Bring the cable to the school, wire up a netbook-based LAN, connect via wireless to a local access point, and you have an entry-level connected school – an entry-level school that can access Stanford classes online from rural areas of Niger. Once that is available, and children are able to absorb that wired intellectual exposure into their own tacit knowledge, we are creating a much more level playing field.

OK, let’s drop the physical fiber runs and electricity planning for just a moment. We’ll save that for a future article.

Cloud Computing Driving the Community

If we can build a data center in a couple of national locations with stable power, and with international or local funding build out a basic data center infrastructure, then with a bit of creativity and planning we can expect infrastructure virtualization (IaaS) to be a basic component of the data center.

That means utility processing, storage, and memory available for the community. With a bit of further planning we can add one or more good PaaS models on top of the infrastructure, and we have a resource that can be used to host academic applications, business applications, and government applications. Remember, these are the early days of development – in most cases there is no infrastructure to start with, so we can design this as a best practice from Day 1.

Take the burden of infrastructure away from the schools, startup companies, and existing SMEs, and offer a virtual data center utility to serve both their office automation and IT needs, as well as granting access to the global marketplace.

A Novel Idea – the Mobile Data Center

UConnect is a project run by several independent souls who want to bring education to small rural school children in Uganda. With a panel truck lined with computers and a server hosting a wide variety of eLearning applications, UConnect drives to schools and lets the children work on computers for a couple of hours each week. It is a project bringing education to areas where just a year ago there would be no opportunity for children to be exposed to either computer technologies or formal education materials.

This is creativity, and a refusal to let the children grow up in a world where they are completely out of touch with their global community counterparts. A technology baby step for us, a giant leap for Ugandan children. But not good enough. We need to inspire children to succeed, and to do that children need exposure to the same intellectual tools as a child in Calabasas, California.

Cloud computing can, should, and will be part of that plan. It makes sense.

Developing Disaster Recovery Models with Cloud Computing

How does a small or medium business ensure it can meet the basic needs for disaster recovery and business continuity? Whether it be Internet-facing applications, or Enterprise-facing applications and data, one of the most important issues faced by small companies is the potential loss of information and applications needed to run their operations.

Disaster recovery and business continuity. Recovery point objectives and recovery time objectives. Backing up data to offsite locations, and potentially running mirrored processing sites – it is an expensive business requirement to fulfill, particularly for budget-conscious small and medium-sized companies.
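For readers new to the acronyms: the recovery point objective (RPO) is essentially the worst-case window of lost data, driven by how often you replicate. A minimal sketch, with hypothetical backup strategies:

    # Worst-case data loss (RPO) implied by backup frequency (hypothetical strategies)
    from datetime import timedelta

    strategies = {
        "nightly tape shipped offsite": timedelta(hours=24),
        "hourly snapshot to a remote site": timedelta(hours=1),
        "continuous replication to a cloud provider": timedelta(seconds=5),
    }

    for name, interval in strategies.items():
        # If the primary site fails just before the next backup runs,
        # everything written since the last backup is lost.
        print(f"{name}: worst-case RPO = {interval}")

The cost curve traditionally rises steeply toward the bottom of that list – which is exactly where cloud-based replication aims to flatten it.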

Christoph Streit, founder of Hamburg-based ScaleUp Technologies, believes cloud computing may offer a very cost-effective, powerful solution for companies needing not only to protect their company’s data, but also reduce their recovery point objectives to near zero.

“In a traditional disaster recovery model the organization must have an exact duplicate of their hardware, applications, and data in the disaster recovery location,” explains Christoph. “With cloud computing models it is possible to replicate applications virtually, spinning up capacity as needed to meet the processing requirements of the organization in the event a primary processing location becomes unavailable.”

ScaleUp did in fact demonstrate their ability to replicate databases between data centers in an October 2009 test with Cari.net, where ScaleUp was able to bring up a VPN appliance and replicate data and applications between Germany and Cari.net’s data center in San Diego, California.

While there may be compliance issues with personal data under European data protection laws, nearly every company and organization around the world participates in a global marketplace. This means applications and data serving the global market cannot be considered local, and the next logical step is to extend access and presentation of the company’s network presence as close to the network edge (the customers) as possible.

Some companies may have physical network capacity in multiple geographies, others may look to companies such as ScaleUp to develop relationships with other cloud service providers to allow “federated” relationships.

Until a true industry standard is determined to define data structures and protocols to use between cloud infrastructure and platform providers, it is probably easiest for relationships to develop between companies using the same cloud platform as a service (PaaS) application. Such is the case with ScaleUp and Cari.net, who used a common platform provided by 3Tera’s AppLogic.

The cloud service provider industry will provide a tremendous service to small and medium businesses, which normally cannot afford near-zero recovery time and recovery point objectives. Whether it is real-time replication of entire databases, subsets of databases, or simply parsing correlated data from edge locations at regular intervals, disaster recovery modeling is changing.

In some cases a backup location can be established simply by logging into a cloud service provider and opening an account with a credit card – or through a very fast negotiation with the service provider. Certainly not without cost, but potentially at a much lower cost of operation than models requiring physical data center space, hardware, and operations staff at each location.

The important lesson for small companies is that a disaster – whether a physical event such as a fire in the data center, or data corruption – may limit or prevent the company’s ability to continue operations. Adding cloud services to the disaster recovery model may provide a very powerful, simplified, and cost-effective way to protect your business.

3tera and AppLogic SWAG Moves to the Cloud Computing Retro Collection

CA and 3tera have announced CA’s acquisition of the innovative cloud computing Infrastructure as a Service vendor. This is a great thing for Computer Associates, and perhaps a bit sad for the cloud community in general. Why? It is hard to put into words the energy and enthusiasm felt when walking into 3tera’s Aliso Viejo office. A tight group of committed entrepreneurs and innovators, with a bit of cockiness due to the unique stature they held in the cloud computing community.

Not that Computer Associates is a bad company. In fact, they have always been one of the best kept secrets in business and enterprise software. Rock solid systems, professional sales and engineering – just not as well known to the broader community as other large enterprise systems vendors.

AppLogic brought the cloud community many firsts. The first to integrate IPv6 into their provisioning system. The first to really simplify the drag and drop provisioning process. Perhaps the first to really test and prove the concept of globally distributed processing and disaster recovery models. And they are really great guys.

Bert, Peter, Sean, and the rest of 3tera’s public face spent a tremendous amount of time supporting the community through participation in training events and community organizations such as the Convergence Technology Council of California and the Any2 Exchange community – all not only with good community spirit, but also with strong thought leadership to motivate the community into learning more about cloud computing and the future of information technology.

We will deeply miss 3tera, and hope the team will eventually regroup with a new set of ideas, and lead us into another generation of technology that will further enhance the industry’s ability to deliver a true, global, massively distributed cloud computing reality.

Computer Associates will bring value to the cloud community as well. With the power of CA’s organization behind recent acquisitions such as 3tera, Oblicore, NetQoS, Orchestria, Platinum Technology, Netreon, and others related to process, database, and large data set management, the stage is set for increased competition in the cloud service industry. CA has the ability to provide a broad understanding of all aspects of enterprise and Internet-facing tools equal to or better than IBM, Microsoft, or any other full-service integrator.

We will look forward to seeing the product of 3tera integration into the CA family, and hope the innovation and enthusiasm 3tera’s team brought to the cloud community is not swallowed up into a large company bureaucracy.

Southern California Tech Jobs Make a Comeback

2009 was a horrible year for job seekers, and even those holding on to existing jobs. No bonuses, no promotions, layoffs, and nobody hiring. And SoCal successfully beat most of the United States in unemployment claims, by several percentage points, attaching painful and empirical fact to the grim situation.

But that does appear to be changing. Slowly changing, but it is looking better for job seekers in the region. A recent scrape of job openings for Los Angeles and Orange Counties yielded some pretty strong job titles:

  • Director of Engineering – Marina Del Rey
  • Chief Integration Engineer – El Segundo
  • Director, information technology – San Clemente
  • VP Global Services – Los Angeles
  • Customer Services Director – El Segundo
  • Lead Systems Engineer – Los Angeles
  • Senior Industrial Director – Irvine
  • Smart GRID Architect – Rosemead
  • HL7 Integrator – Los Angeles
  • Disaster Recovery Manager – Irvine
  • Manager, Operations Systems – Van Nuys, CA
  • Systems Architecture Engineer – Huntington Beach, CA

And the list goes on… About 350 good positions listed in my 25 January search.

One additional exciting trend in the job stack is the high number of positions in manufacturing industries. While the services market is great, manufacturing spawns input into the supply chain, which adds a lot of downstream value to those companies increasing or expanding their business operations.

Dust Off that Resume

The time is near, and technology-savvy job seekers will reap rewards if they are prepared for the next boom in business expansion. Cloud computing, unified messaging, IT operations, data center consolidation, process automation, green technologies – corporate jargon to some, but areas with increasing demand for qualified candidates.

Cloud computing and data center consolidation are quite interesting, admittedly because they are new and exciting trends in the IT community.

The Dot.Com era taught us painful lessons on the value of investment money. The venture community sat back after 2002 and made a decision to actually perform a bit of due diligence prior to throwing money at PowerPoint companies and paper ideas. At least those which were not using private equity with large investments in real estate.

The Dot.OMG era is now just about at an end, and some of the lessons learned are focused on the execution of business plans and intelligent use of capital and operational expenses – while building business.

IT has gone from being a “darling” of the internet age to a very powerful means of adding tremendous business value through globalization of markets and real-time transaction processing to support the global economy and marketplace. The only problem was that, to support that IT engine, technical managers tried to solve their processing challenges by throwing more disk, processors, and bandwidth at their requirements.

The next age will be one where companies refocus their energy on developing their business, and begin to expect processing and IT to be more of a utility than an exceptional part of their business. Welcome to data center outsourcing, virtualization, cloud computing, and Software as a Service/SaaS. Recover the costs of expensive and inefficient data centers.

So those engineers and sales staff still hanging out in the Communicator’s Bar, get ready to be sized for your next retro-logo polo shirt. The time is now to put aside the fantasy of re-entering the telecom community to deal in bilateral telephone minutes, and to get ready to support thought leadership strategies that bring customers into commercial data center outsourcing models – or to sell them on consolidating their in-house operations into enterprise clouds.

Look at the tech job listings again. Companies are begging for IT and tech visionary managers to solve their growth and development pain. Begging.

2010 is going to be a great year in SoCal, so let’s get out there and make sure it does not pass us by, and does not require our companies to go elsewhere to attract talent. We’ve got the talent right here, and we need to put it back to work.

Maintain or Refresh – the IT Dilemma Meets Cloud Computing

Emerging technologies have always forced business decision-makers to decide if they will embrace a new technology as a first-mover, or if they will maintain their existing technologies. Each brings a risk – does the cost of maintaining existing technology result in higher maintenance and operational expenses, or does the cost of embracing and acquiring new technology put an unwarranted capital and process change burden on the organization?

Many years ago (~15) the Northern Telecom (Nortel) DMS 100/250/300/500 line of digital telephone switches represented one of the finest technologies for digital communications. The cost was high, but the technology promised telecom carriers everything they would need to operate their networks well into the next generation – a generation not yet associated with a real time horizon, at least in marketing PowerPoint slides. Buy a DMS 500, and you would be running it for a couple of decades.

Then, seemingly overnight, the Internet matured, with communications applications such as Voice over Internet Protocol (VoIP), Skype, Vonage, and other Internet-enabled utilities. Suddenly the DMS, 5ESS, 4ESS, NEAX, DSC – all became obsolete, replaced by simple Internet-friendly communication applications or Internet Protocol-based “soft switches” which managed telephony over the Internet protocol with a form factor about the size of a mini-refrigerator, and 100 times the switching capacity.

And, as with all soon-to-be-obsolete technologies, the cost of maintaining the legacy system, finding spare parts for it, and even finding operators for it rapidly hit a point of extreme risk. The old telephone switches are now most often found in landfills, gone forever.

Traditional telecommunication transmission protocols such as SDH and SONET began falling to Ethernet, and within a period of about five years, from 2003 to 2008, the “legacy” telephone technologies began to quickly fade into historical Wikipedia entries.

The Cloud Computing Analogy

We are entering a period of “plentitude” in cloud computing. The “Law of Plentitude” is loosely defined as a threshold of acceptance (of a process, technology, system, etc.) beyond which the risk of not participating becomes greater than the risk of adopting at the point of emergence. In technology we normally place the “Law of Plentitude” at around 15~20% diffusion into a selected environment, community, industry, or organization.

For example, when the fax was first introduced there was a single machine. By itself it is not useful, as you have nobody on a distant end to fax images to. With two fax machines it is more useful, with a community of two. The law of exponents begins at 4 users – the formula N(N-1)/2 – and you end up with an addressable community of 6 potential relationships. And it continues growing, as the sketch below shows.
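A quick sketch of that growth curve – the standard N(N-1)/2 pairwise-connection count – with arbitrary community sizes:

    # Potential relationships in a community of n fax machines: n*(n-1)/2
    def potential_relationships(n: int) -> int:
        return n * (n - 1) // 2

    for n in (1, 2, 4, 10, 100):
        print(f"{n} machines -> {potential_relationships(n)} potential relationships")

Four machines yield 6 relationships, ten yield 45, and a hundred yield 4,950 – the value of joining the community grows far faster than the community itself.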

At “plentitude,” you are at risk if you have not acquired a fax machine, because your community has now adopted fax machines to the point where you need one to communicate – or you find yourself in jeopardy of losing your place in the community.

It can now be argued that cloud computing is quickly starting to reach a level of “plentitude.” Communities of interest are emerging within clouds, allowing near zero-latency in user-to-user transaction time. Think of a financial trading community. Zero-latency means zero transaction delays. At some point if you are not in the zero-latency community, your operation is at risk of either losing business, or being expelled by other members of the community who do not want to deal with your latency.

Think of companies outsourcing their IT infrastructure into a commercial cloud service provider, or even building their own internal enterprise cloud infrastructure. If all things are equal, and the cloud-enabled company is able to recover the cost of building its own data center, reduce operational expenses, and greatly increase its ability to expand and contract its processing capacity, then it may have more resources left over to increase research and development or product production.

Think of the guys who were running DMS 500s in 2009, vs. their competitors who were running much more powerful, and cheaper soft switches. We can produce a roll call of regional telephone companies who closed their doors over the past few years because they simply did not have the ability to compete with next generation technology.

The Cloud Computing “Plentitude” Target

The trick of course is to try and plan your refresh, through a well-managed business case and review, to as close to the plentitude “risk threshold” as possible. This will ensure you do not fall prey to a bad technology, are able to see the industry trend towards adopting a new technology, and that your competition does not leave you suffering through a last minute technology refresh.

Cloud computing and data center outsourcing may not be the ultimate technology refresh, and still have a number of issues to resolve (security, compliance, data center stability, etc.). However, the trend is clear: companies are outsourcing into commercial cloud service providers, and enterprise virtualization is on the mind of every IT manager and CFO in the business community.

We hope.

If your company or organization has not yet started the review process, the technology refresh process, and the planning process to determine if and when cloud adoption is the right thing for your company, we would strongly encourage that process to begin. Now.

If nothing else, you owe it to yourself and your organization to ensure they are not caught on the bad side of plentitude.

Trouble at the Telecom and Communicator’s Bar

Have you heard the news? Unemployment is skyrocketing, companies are closing, there’s no investment money for startups, and the sky is falling, the sky is falling? Don’t I know, as the layoff frenzy hit my own home, that it is a scary economic place to take a swim… Sharks, really hungry sharks, circling with an eye to take every last cent you have been able to hide.

And the outlook remains bleak. The New York Times reports that Europe is suffering in youth unemployment – even more than the US. 42.9% unemployment in Spain, 28% unemployment in Ireland, an EU average of 20.7%. It makes California look like the “promised land.”

And, California may actually be the “promised land.” California still attracts the best of global engineering to the Silicon Valley, and the most creative minds in communications and entertainment to Los Angeles. Whether you are a European, Chinese, Indian, or even Canadian, Silicon Valley and LA offer an environment that is unsurpassed around the world. Our universities embrace people from other cultures and countries, and our ability to support entrepreneurs draws not only students, but the best engineers and thought leaders from around the world.

Back at the Communicator’s Bar

There are still tables with discussions reviewing the indignities of being laid off by struggling companies. There are still discussions with the whine of people talking about the “damn foreigners” who are here stealing our jobs. Still “barflies” slopped over the bar worrying about their Audi payments and how their ARM mortgages have put them under water.

Then there are other bars with tables full of Americans, and a scattering of foreigners, talking about fun stuff. Fun stuff like cloud computing, virtualization, globalization, distributed computing, “the network is the computer,” “the computer is the network,” and how the carriers will return to their roots of providing high quality “big, fat, dumb” telecom pipes. The talk is of how we can finally start putting all this intellectual property – on which we’ve spent billions producing PowerPoint slides – into reality.

Green is here

Virtualization is here

Data Center outsourcing is here

2010 is a blank whiteboard set up to codify the thought leadership and technology spawned in the waning years of the 200x decade and put it into business plans and CAPEX budgets.

2010 is the year we aggressively deliver Internet-enabled technology to every man, woman, and child in the world who has a desire to live a life beyond killing their own food for dinner. Here is a funny thought – if a radical 8-year-old in one currently scary country is able to Yahoo chat or Facebook their way into discussions and relationships with kids in California and Beijing, doesn’t it make just a little sense that the desire to blow each other up would be diluted, even just a little?

If the guy living next to me is producing a telecom switch that is head and shoulders above what is currently on the market, do I really care if his brain was conceived in Hanoi?

2010 is also the beginning of a true period of globalization. That doesn’t mean our hillbilly friends in Duluth, Minnesota have to quit drinking 3.2 beer and hanging out at setup bars watching Vikings reruns; it means that the hillbilly’s kid can participate in a lecture series online from Stanford or MIT. The kid might eventually invent a pickup truck that runs on pine cones, and a 3.2 beer that is actually palatable.

Embrace 2010

If for no other reason than the simple fact you have no other choice, consider all the great ideas being pumped out by companies like 3tera, the Google borg, Microsoft, VMware, and all the other companies with tremendous innovative ideas. Never before in our history have so many new intellectual and business tools been put on the shelf at the same time. Never before have we had such good reason to consider implementing those ideas (yes, I am a tree hugger and do believe in global warming).

So, even if you are currently living in a car under a bridge near your former upscale Orange County community – shave, wash your car, take a shower at the beach, and let’s get our depression and anger behind us, and our tacit knowledge back into the business saddle. The young guys still need our experience to get their feet on the ground, and we need them to ensure we will have social security in the future.

Welcome 2010 – you have taken a long time to arrive

John Savageau, Honolulu

Business and Social Frog Soup – are we ready for the next decade?

Over the past couple of years I have written several stories with “frog soup” as a main theme. The idea of being in cold water, and not recognizing the degree-by-degree increase of heat in the water until at some point we are cooked, is the danger of being a cold-blooded animal. Business may follow a similar course.

In business we can follow the route of “this is the way we’ve always done it, and it works, so there is no reason to change our processes or strategies.” Innovations like virtualization or cloud computing hit the headlines, and many say “it is a cool idea, but we want the security and hands-on confidence of running our own servers and applications.”

In the United States many telecom companies continue to build business cases based on “milking” telephone settlement minutes, bilateral relationships, and controlling telecom “pipes.” Internet service providers (ISPs) continue holding on to traditional peering relationships, holding out for “paid peering,” doing everything possible to attain market advantage based on traffic ratios.

Nothing new, same ideas, different decade.

It is international frog soup.

In Vietnam the government is currently planning to build an entirely new information infrastructure, from the ground up, based on the most cutting edge telecom and data/content infrastructure. Children in Hanoi go to school at 7 a.m., take a quick lunch break, hit the books till around 5 p.m., take another break, and finish their day at study sessions till around 9 p.m.

Concentration – mathematics, physics, and language.

The children are being exposed to Internet-based technologies, combining their tacit experience and knowledge of global interconnected people with a high degree of academic sophistication.

In the United States children go to school for, at most, 6 hours a day, graduating with (on average) little capability in math or language – although we do have deep knowledge of metal detectors and how to smoke cigarettes in the restrooms without being caught. In Los Angeles, some locations cannot even hit a 50% graduation rate among high school students.

And oddly enough, we appear to be comfortable with that statistic.

Perhaps our approach to business is following a similar pattern. We become used to approaching our industry, jobs, and relationships on a level of survival, rather than innovation. We may not in some cases even have the intellectual tools to apply existing technology to the potential of functioning in a global economy. Then we are surprised when an immigrant takes our job or business.

Some universities, such as Stanford, aggressively recruit students from foreign countries, as they cannot attract enough qualified students from the United States to meet their desired academic threshold. And once they graduate from Stanford, they find their way into Silicon Valley startups, with an entrepreneurial spirit that is beyond the scope of many American graduates.

Those startups have the intellectual and entrepreneurial tools to compete in a global economy, using innovative thinking unbound by traditional processes and relationships, and are driving what used to be America’s center of the global innovation world. Except that Silicon Valley is now merely located in America, representing the center of a global innovative community. Possibly due to the availability of increasingly cheaper American labor?

Frog Soup

Us Americans – we are getting lazy. Innovation to us may mean how we manipulate paper, and has nothing to do with manufacturing and business innovation. We are starting to miss the value of new products, new concepts, and execution of business plans which end up in production of goods for export and domestic use. We believe concentration on services industries will drive our economy into the future, based on products and other commercial goods imported into our country.

Except for the painful reality that we do not have a young generation with the intellectual tools to compete with kids in Hanoi who are on a near-religious quest to learn.

The temperature is rising, and our standing as a country and economic force in the global community is being diluted every day.

Time to put away the video games and get back to work. No more “time outs,” only time to roll up our sleeves and learn, innovate, learn, innovate, and innovate some more. Forget comfort, we are nearly soup.

A Cloudy Future for Networks and Data Centers in 2010

The message from the VC community is clear – “don’t waste our seed money on network and server equipment.” The message from the US Government CIO was clear – the US Government will consolidate data centers and start moving towards cloud computing. The message from the software and hardware vendors is clear – there is an enormous investment in cloud computing technologies and services.

If nothing else, the economic woes of the past two years have taught us we need to be a lot smarter on how we allocate limited CAPEX and OPEX budgets. Whether we choose to implement our IT architecture in a public cloud, enterprise cloud, or not at all – we still must consider the alternatives. Those alternatives must include careful consideration of cloud computing.

Cloud 101 teaches us that virtualization efficiently uses compute and storage resources in the enterprise. Cloud 201 teaches us that content networks facing the Internet can make use of on-demand compute and storage capacity in close proximity to networks. Cloud 301 tells us that a distributed cloud gives great flexibility to both enterprise and Internet-facing content. The lesson plan for Cloud 401 is still being drafted.

Data Center 2010

Data center operators traditionally sell space based on cabinets, partial cabinets, cages, private suites, and in the case of carrier hotels, space in the main distribution frame. In the old days revenue was based on space and cross connects; today it is based on power consumed by equipment.

If the intent of data center consolidation is to relieve the enterprise or content provider of unnecessary CAPEX and OPEX burden, then the data center sales teams should be gearing up for a feeding frenzy of opportunity. Every public cloud service provider from Amazon down to the smallest cloud startup will be looking for quality data center space, preferably close to network interconnection points.

In fact, in the long run, if the vision of cloud computing and virtualization is true, then the existing model of the data center should be seen as a three-dimensional set of objects within a resource grid – not entirely dissimilar to the idea set forth by Nicholas Carr in his book “The Big Switch.”

Facilities will return to their roots of concrete, power, and air-conditioning, adding cloud resources (or attracting cloud service providers to provide those resources), and the cabinets, cages, and private suites will start being dismantled to allow better use of electrical and cooling resources within the data center.

Rethinking the Data Center

Looking at 3tera’s AppLogic utility brings a strange vision to mind. If I can build a router, switch, server, and firewall into my profile via a drag and drop utility, then why would I want to consider buying my own hardware?

If storage becomes part of the layer 2 switch, then why would I consider installing my own SAN, NAS, or fiber channel infrastructure? Why not find a cloud service provider with adequate resources to run my business within their infrastructure, particularly if their network proximity and capacity is adequate to meet any traffic requirement my business demands?

In this case, if the technology behind AppLogic and other similar Platform as a Service (PaaS) is true to the marketing hype, then we can start throwing value back to the application. The network, connectivity, and the compute/storage resource becomes an assumed commodity – much like the freeway system, water, or the electrical grid.

Flowing the Profile to the User

Us old guys used to watch a SciFi series called “Max Headroom.” Max Headroom was a fictional character who lived within the “Ether,” able to move around through computers and electrical grids – and pop up wherever in the network he desired. Max could also absorb any of the information within computer systems or other electronic intelligence sources, and deliver his findings to news reporters who played the role of investigative journalists.

We are entering an electronic generation not too different from the world of Max Headroom. If we use social networking, or public utility applications such as Hotmail, Gmail, or Yahoo Mail, our profile flows to the network point closest to our last request for application access. There may be a permanent image of our data stored in a mother ship, but the most active part of our profile is parsed to a correlation database near our access point.

Thus, if I am a Gmail user and live in Los Angeles, my correlated profile is available at a Google data cache someplace with proximity to Los Angeles. If I travel to Hong Kong, then Gmail thinks “Hmmm…, he is in HK; we should parse his Gmail image to our HK cache, and hope he gets the best possible performance out of the Gmail product from that point.”

I, as the user, do not care which data center my Gmail profile is cached at; I only care that my end user experience is good and I can get my work done without unnecessary pain.
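A minimal sketch of that “follow the user” decision – the cache sites, coordinates, and the crude distance metric are all invented for illustration; a real system weighs far more than geography:

    # Pick the cache site nearest the user's last access point (illustrative only)
    import math

    CACHE_SITES = {  # hypothetical cache locations: (latitude, longitude)
        "Los Angeles": (34.05, -118.24),
        "Hong Kong": (22.30, 114.17),
        "Frankfurt": (50.11, 8.68),
    }

    def nearest_cache(user_lat: float, user_lon: float) -> str:
        def distance(site: str) -> float:
            lat, lon = CACHE_SITES[site]
            return math.hypot(lat - user_lat, lon - user_lon)  # crude planar metric
        return min(CACHE_SITES, key=distance)

    print(nearest_cache(34.0, -118.0))  # user at home in LA -> Los Angeles
    print(nearest_cache(22.0, 114.0))   # same user traveling -> Hong Kong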

The data center becomes virtual. The application flows to the location needed to do the job and make me happy. XYZ.Com, who does my mail day-to-day, must understand their product will become less relevant and ineffective if their performance on a global scale does not meet international standards – standards being set by companies who are using cloud computing on a global, distributed model to do the job.

2010 is the Year Data Centers Evolve to Support the Cloud

The day of a 100sqft data center cage is rapidly becoming as senseless as buying a used DMS250. The cost in hardware, software, peopleware, and the operational expense of running a small data center presence simply does not make sense. Nearly everything that can be done in a 100sqft cage can be done in a cloud, forcing the services provider to concentrate on delivering end user value, and leaving the compute, storage, and network access to utility providers.

And when the 100sqft cage is absorbed into a more efficient resource, the cost – both electrical/mechanical and financial (including environmental costs) – will drop by nearly 50%, given the potential for better data center management using strict hot/cold aisle separation, hot or cold aisle containment, containers – all those things data center operators are scrambling to understand and implement.

Argue the point, but by the end of 2010, the ugly data center caterpillar will come out of its cocoon as a better, stronger, and very cloudy utility for the information technology and interconnected world to exploit.

A Cloud Computing Wish List for 2010

A cloud spot market allows commercial cloud service providers the ability to announce surplus or idle processing and storage capacity to a cloud exchange. The exchange allows buyers to locate available cloud processing capacity, negotiate prices (within milliseconds), and deliver the commodity to customers on-demand.

Cloud processing and storage spot markets can be privately operated, controlled by industry organizations, or potentially government agencies. Spot markets frequently attract speculators, as cloud capacity prices are known to the public immediately as transactions occur.

The 2010 cloud spot market allows commercial cloud service providers to support both franchise (dedicated service level agreement) customers and on-demand customers, and allows customers to automatically move their applications and storage to the providers offering the best pricing and service levels based on pre-defined criteria.

I don’t really care whose CPUs and disks I am using; I really only care that it is there when I want it, offers adequate performance, has proximity to my end users, and meets my pricing expectations.
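A minimal sketch of how such an exchange might fill a bid from the cheapest announced capacity – the providers, prices, and greedy matching rule are all invented for illustration:

    # Toy cloud spot market: fill a buyer's bid from the cheapest surplus offers
    offers = [  # (provider, surplus CPU-hours, price per CPU-hour) - hypothetical
        ("ProviderA", 500, 0.12),
        ("ProviderB", 200, 0.08),
        ("ProviderC", 800, 0.15),
    ]

    def fill_bid(cpu_hours: int, max_price: float):
        fills, remaining = [], cpu_hours
        for provider, capacity, price in sorted(offers, key=lambda o: o[2]):
            if remaining == 0 or price > max_price:
                break  # offers are price-sorted; nothing cheaper remains
            take = min(capacity, remaining)
            fills.append((provider, take, price))
            remaining -= take
        return fills, remaining

    fills, unfilled = fill_bid(cpu_hours=600, max_price=0.13)
    for provider, hours, price in fills:
        print(f"{provider}: {hours} CPU-hours at ${price:.2f}/hour")
    print(f"Unfilled: {unfilled} CPU-hours")

A real exchange would add bidding, settlement, and service level guarantees on top, but the core commodity-matching idea is that simple.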

Cloud Storage Using SSDs on the Layer 2 Switch

Content delivery networks/CDNs want to provide end users the best possible performance and quality – often delivering high volume video or data files. Traditionally CDNs build large storage arrays and processing systems within data centers, preferably adjacent to either a carrier hotel meet-me-room or Internet Exchange Point/IXP.

These arrays are sometimes supported by bundles of 10 Gigabit ports connecting the storage to networks and the IXP.

There has been a lot of recent discussion on topics such as Fibre Channel over Ethernet (FCoE) and Fibre Channel over IP (FCIP). Not good enough. I want the SSD manufacturers and the switch manufacturers to produce an SSD card with a form factor that fits into a slot on existing Layer 2 switches. I want a petabyte of storage directly connected to the switch backplane, allowing unlimited data transfer rates from the storage card to network ports.

Now a cloud storage provider does not have to buy 50 cabinets packed with SAN/NAS systems in the public data center, only slots in the switch.

IPv6

3tera got the ball rolling with IPv6 support in AppLogic. No more excuses. IPv6 support first, then add IPv4 support as a failover to IPv6. That is the basic criterion underlying all other design issues. No IPv6 – then shred the design.

Cloud Standardization

Once again the world is being held hostage by equipment and software vendors posturing to make their product the industry standard. The user community is not happy. We want spot markets, the ability to migrate among cloud service providers when necessary, and a basis for future development of the technology and industry.

The IP protocols were developed through the efforts of a global community dedicated to making the Internet grow into a successful utility. Almost entirely supported through a global community of volunteers, the Internet Engineering Task Force and innovators banded together and built a set of standards (RFCs) for all to use when developing their hardware and applications.

Of course there were occasional problems, but the proof of their success is the Internet as it is today.

Standardization is critical in creating a productive development environment for cloud industry and market growth. There are several attempts to standardize cloud elements, and hopefully there will be consolidation of those efforts into a common framework.

Included in the efforts are the Distributed Management Task Force/DMTF Open Cloud Standards Incubator, Open Grid Forum’s Open Cloud Computing Interface working group, The Open Group Cloud Work Group, The Open Cloud Manifesto, the Storage Network Industry Association Cloud Storage Technical Work Group, and others.

Too many to be effective, too many groups serving their own purposes, and we still cannot easily write cloud applications without finding the lower levels of cloud X as a Service (XaaS) proprietary.

What is on your 2010 wish list?

Happy Cloud New Year!

Cloud Computing Expo Kicks Off in Santa Clara – The Cloud Opportunity Window is Now Officially Open

Having gone through a couple of decades worth of technology conferences, a familiar cycle occurs. For the first couple of years, technology-related conferences are attended by engineers and operations people. Only after the technology has passed a couple of feasibility gates and begun to hit the business cycle do sales and marketing people take over. Cloud is now officially past the engineering phase and well into the sales phase – and the business community is scrambling to understand the implications of a virtualized world.

At the Cloud Computing Conference and Expo in Santa Clara, California, the opening keynote session venue was completely filled, with the organizer (SYS-CON Events) obliged to quickly expand the audience into two overflow rooms, in addition to mounting displays in hallways adjacent to the main ballroom. According to the conference organizer, more than twice as many people signed up and attended the conference than planned. And cloud “buzz” is electric within the halls.

Cloud computing is here, the industry innovation machine is spooling, and the “nay-sayers” are starting to quiet down as the reality of cloud computing is articulated, codified, and presented in a format that has finally gone past the high level “concepts” of recent cloud expos and conferences.

This must be true, because the hallways are now filling with people wearing suits, ties, and polo shirts with snappy logos. Engineers still roam the halls, identifiable by their blue jeans, T-shirts, and backpacks filled with gadgets and computers. The ratio is about 50:50, indicating cloud service providers are now attending conferences for the purpose of business development, rather than to simply share ideas and further develop cloud technology as an R&D community.

The Opening Keynote – Cloud Myth-Busting

Richard Marcello, President, Technology, Consulting, and Integration Services at Unisys kicked off the conference with a keynote speech entitled “The Time is Right for Enterprise Cloud Computing.” The presentation followed a familiar model in the public (non-engineering and technician audience) conditioning of a new technology – “the Nine Myths of Cloud Computing.” A very good presentation (really), which drilled into common misconceptions of cloud computing. This type of approach is useful when giving an instructional presentation, with statements such as:

  • Myth #9 – Cloud computing is brand new – a revolution
  • Myth #8 – All clouds are the same
  • Myth #7 – Cloud computing is about technology
  • Myth #5 – Cloud computing is not reliable
  • And so on…

Do a search and replace of “cloud computing” with “Internet” and you could pose the same myths, with the discriminating factor being one of how you present the response in breaking each myth. Yes, it is marketing and borderline cliché, but it does go far in visualizing cloud computing to the new attendees from the business side of our industry.

Marcello did present one eloquent response to the myth “The Internal data center is more secure than the cloud.” He showed a slide which had three separate applications creating data. The data is stored in a storage cloud, as well as being manipulated in a service cloud. Data going into the cloud service (processing), and into the storage cloud is brought into a single stream, which cannot be intercepted by a “sniffer” or other device, and the actual data instances are only recognizable by the application using the data. To all others attempting to intercept the data, it appears as “water running through a pipe.”

Actually, not a bad analogy.

Marcello went on to describe his taxonomy of the “real time access engine,” which controls the data streams into each application or storage device, providing security within an enterprise, industry, or organizational community of interest. However, the most important message delivered during his speech was the idea that cloud computing will “generate new business models and ideas that none of us have yet envisioned.”

But, That’s Not What I Designed…

This message is strong. All engineers have gone through the experience of creating a product, and then observing the product being used by people for activities never envisioned by the creator. Imagine the continuing astonishment of the originators of the Internet. A simple tool for distributed applications and network survivability, and it is now the basis for nearly all communications, entertainment, business, and social interaction between humans throughout the world.

What will cloud computing bring us in the future? What will smart kids – who are going through an education system with complete immersion in the global Internet cloud as a normal part of life – be able to see in a potential global model of data and application virtualization? Much as the early days of the internet represented a mere tip of the future network “iceberg,” what we see in cloud computing today is just the tip of what virtualization of compute and storage resources will ultimately become.

What will happen when SSDs (solid state disks) become part of the layer 2 switching backplane (slapping an SSD card into a switching slot, making Fibre Channel over Ethernet obsolete overnight)? An entire content delivery network and system currently using 100 cabinets of servers and disk reduced to a single card in a switch…

Integration with IPv6. Standardization in cloud services allowing formation of cloud spot markets and interoperability.

We have a lot of questions to throw both at the engineers, as well as the business visionaries attending the conference. Welcome sales and marketing folks, welcome to the new age of cloud computing.

John Savageau, Long Beach (From the Cloud Computing Conference and Expo, Santa Clara, California)
