Are Public Mail Systems a Danger in Developing Countries?

Over the past two years I’ve interviewed dozens of government ICT managers in countries throughout Asia, the Caribbean, and Europe.  One of the surprising items collected during the interviews is the large number of government employees – some at the highest levels – using public mail systems for their professional communications.

While this might appear a non-issue to some, others might find it both a security issue (a foreign commercial company processing and storing government correspondence) and an identity issue (an XXX@gmail.com or XXX@yahoo.com address used while communicating as a government employee or official).

Reasons provided in interviews for why government employees are using commercial email systems include:

  • Lack of timely provisioning by government ICT managers
  • Concerns over lack of privacy within a government-managed email system
  • Desire to work from home or while mobile, and the government system does not support remote or web access to email (or the perception this is the case)
  • Actual mail system performance is better on public systems than on internal government-operated systems
  • Government ICT systems have a high internal transfer cost, even for simple utilities such as email

and so on.

When pressed further, many were not aware of the risk that government correspondence processed through public systems could result in copies being stored on storage systems located in other countries.  Depending on the country, that stored email could easily be provided to foreign law enforcement agencies under lawful warrants – thus exposing potentially sensitive information for exploitation by a foreign government.

Are Public Email Accounts Bad?

Not at all.  Most of us use at least one personal email address on a public mail system, some of us many.  Public systems allow on-demand creation of accounts, and if desired allow individuals to create anonymous identities for use on social media or other public networks.

Public addresses can separate an individual’s online identity from their “real world” identity, allowing higher levels of privacy and anonymous participation in social media or other activities where the user wishes not to have their full identity revealed.

The addresses are also quite simple to use, cost nothing, and are in use around the world.

Governments are also starting to make better use of commercial or public email outsourcing, with the City of Los Angeles being one of the more well-known projects.  The City of LA has service level agreements with Google (their outsourcing company), assuring security and confidentiality, as well as operational service levels.

This is no doubt going to be a continuing trend, with public private partnerships (PPPs) relieving government users from the burden of infrastructure and some applications management.  With the US CIO Vivek Kundra aggressively pushing the national data center consolidation and cloud computing agenda, the move towards hosted or SaaS applications will increase.

There are many benefits here as well, including:

  1. Hosted mail systems keep copies of mail in redundant storage – far safer than a single copy of mail downloaded to an individual PC from a POP server
  2. Access from any Internet connected workstation or computer (of course assuming good passwords and security)
  3. Standardization among organizational users (both for mail formatting and client use)
  4. Cheaper operating costs

To address recent budget and human resource challenges, the City of Orlando moved its e-mail and productivity solution to the cloud (application and cloud hosting services provided by Google).  The City has realized a 65 percent reduction in e-mail costs and provided additional features to increase the productivity of workers. (CIO Council, State of Public Sector Cloud Computing)

For developing countries this is probably a good thing – they get all the features and services of best-in-class email systems, while significantly reducing the cost and burden of developing physical data center facilities.

But in the meantime, as that strategy and vision is defined, the use of public or cloud-hosted email services in many developing countries is one of convenience.  We can only hope that commercial email providers safeguard data processed through government users’ personal accounts – used for communicating all levels of government information – with the same service level agreements offered to large customers such as the City of LA or City of Orlando.

Papua Struggles to Level the Internet Playing Field

The Warnet** was full. Students and adults shared a few old computers running the Windows XP operating system, connecting to Facebook, MySpace, Gmail, and other sites. A few looked at web pages from universities scattered around the world, and a few simply indulged in the escape of online gaming. This is Jayapura, Papua, Indonesia – the provincial capital of Indonesia’s eastern-most province, just a couple of miles from the international border with Papua New Guinea.

Internet access is accomplished via satellite connections, mostly provided by the national PTT Telkom Indonesia through their “Speedy” Internet DSL service. However, “Speedy” is best considered a simple branding term – unrelated to the reality of Internet access that is limited by around 83 Mbps of satellite capacity serving the needs of a city totaling more than 350,000 people. That is not likely to change any time soon, as the Palapa fiber optic ring is still on the drawing board, and satellite coverage and capacity over the Papua region is limited.
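The arithmetic behind that constraint is worth spelling out. The division below is simplistic (not every resident is online at once), but it frames the scale of the gap; the figures are the approximate ones cited above:

```python
# Rough per-capita bandwidth math for Jayapura, using the approximate
# figures from the text: ~83 Mbps of shared satellite capacity for a
# city of ~350,000 people. Illustrative only.

city_capacity_bps = 83_000_000   # ~83 Mbps total satellite capacity
population = 350_000

per_person_bps = city_capacity_bps / population
print(round(per_person_bps))  # roughly 237 bits per second per resident
```

Even if only a small fraction of residents are online simultaneously, the shared pool is orders of magnitude below what a single home broadband line delivers in a wired city.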

Connecting to Skype via the hotel WiFi connection (Aston Hotel in Jayapura), you can get a relatively decent video call – depending on the time of day (normally between 0100 and 0700). Not HD quality, but movement is good, and audio quality is good. You wouldn’t want to be downloading files via email, or web surfing on high density pages; however, if the computer is basically idle and the network not heavily in use, you can get the call.

Other Internet access is available through PT Indosat and the mobile carriers; however, each has its own limitations, whether by location, cost, or services offered to users.

Moving to West Papua

Manokwari, the provincial capital of West Papua, is a different experience. Internet bandwidth to the city is very limited, to the point getting any level of Internet access is considered good. However, while in Manokwari, sitting outside the Blue and White Warnet at 0500 in the morning, connecting to a prepaid WiFi access point – I was able to call home using Skype. Lots of clipping and echo, a few rounds of “hey, say that again, the connection is not very good,” and a bit of frustration, but at 0500 I called home.

The Blue and White Warnet is probably among the best public access points in Manokwari. It also serves as a mini community center, hangout location for young people, and café. For young people with dreams of a successful, happy life, the Warnet provides a healthy opportunity to explore other parts of the world. They can build their dreams of education, job opportunities, and travel to parts of the world which seem like a science fiction novel compared to their surroundings of jungle and poverty.

Whether it is Jayapura, or Manokwari, or any other remote area in this huge country, the message is clear – “we need more, better, and faster Internet.” Students and young people understand their global competition is children from cities like Sunnyvale or Seoul, where access to the vast world of Internet knowledge and opportunity is taken for granted, at speeds to individual homes exceeding the entire access capacity of their province.

And yet a crowd gathers at the Blue and White Warnet every day and evening. And students continue to squeeze every bit of value possible from their low-speed Internet connections, grasping at threads of dreams that they may someday become full members of the connected global community.

As mentioned in earlier posts on the Warnet culture in western Indonesia, the Warnets in general have no problems with users accessing pornography or trying to hack – most users are genuinely trying to use the resource to learn more, and get a brief glimpse into a better life.

Palapa Ring – East

Bringing the Palapa Fiber Optic Ring to eastern Indonesia is an essential key to connecting the major islands back to western Indonesia and the rest of the world. While satellite capacity begins to run dry, the hope of bringing a high performance fiber system to the shores of Papua would enable bandwidth needed to bring modern eGovernment, education, and capacity for private industry to fully join the global economy, subsequently improving quality of life for all citizens.

As a neutral cable, Palapa Ring – East will also promote competition among Indonesia’s carriers and service providers to extend their networks to Papua and West Papua, bringing better price competition, quality of service (including customer service), and variety of services. An Internet Exchange Point (IXP) in the major cities will boost local content and communications performance, without traffic having to make the trek from Papua to Jakarta and back to access locally hosted content.

That is the good news. The bad news is that Palapa Ring – East only exists on PowerPoint slides and in meeting discussions. A great idea, which everybody appears to want, but with no schedule and no solid plan for the project. It will happen someday; we just do not know what day that might be.

The Developing World Needs Access

Whether Burma, Laos, North Korea, Somalia, or any other developing region, Internet access is best considered a human right – one withheld by the government, or limited by technical capability shortfalls within the country. With a child growing up in a city such as Burbank (California) having global Internetworking technologies and applications diffused into their life from nearly the time they can walk, the digital divide in 2010 has continued to expand to new extremes.

While those hanging out at the Blue and White café are able to use Facebook, some eLearning applications, Twitter, chats, and email – 20 miles into the jungle is a completely different story. The access is cut off, and for hundreds of villages located throughout Papua, Internet is simply not available.

Within the city center of cities such as Jayapura you do have a scattering of good buildings, and within some of the new settlement areas outside of the city better infrastructure is being produced.  However, for the most part, people struggle every day to learn, to earn, and to meet the most basic requirements in Maslow’s Hierarchy of Needs.

Imagine if you were sitting in Costa Mesa (California) and you could not connect to the Internet. Imagine that it is simply not available in your area. Hard to imagine. Today each child born and raised in Papua and West Papua is burdened with an environment that simply does not give them the intellectual tools to compete with children in Jakarta, Burbank, or any other wired city.

And in Burbank we consider it a crime if our cable TV provider has less than 100 HD channels available, or every sporting event on the planet available in real time.

Instant communications, instant access to information, instant access to thousands of applications and utilities that make life better – a right of all citizens. In reality there are no immediate communications during disasters, no support for people when they are sick or injured, no WebMD, Wikipedia, or Yahoo Answers. Just “not available here.” As an Internet user sitting in a hotel room where Internet is simply not available – my next opportunity to connect is 0500 tomorrow morning – and having lived in a wired world for most of the past 25 years, this is a very strange experience.

An experience that is considered normal for everybody in Manokwari.


NOTE: Wireless access is available through companies such as Telkomsel. They have deployed 3G services to both cities mentioned in this article using flash modems, although the services are more expensive than most can afford for any level of large data transfer, not to mention the cost of user equipment (handsets and mobile/laptop computers). Again, expensive satellite connections must be paid for, and as always the end user carries the cost. But it is a step forward.

**Warnet = A Warnet is similar to an Internet café; however, it is normally a small room with around 10 small computer workstations connected to the Internet.  In many locations in Indonesia, the Warnet is the only place people can access the Internet, as most cannot afford their own computer, or Internet access in their area is simply not available.

We should also note that mobile telephony is available nearly everywhere in Indonesia, with the exception of remote villages within the interior of locations such as Papua.  As 3G wireless technology continues to extend into more and more remote locations, handsets are highly likely to become the dominant Internet access device, with the only real limitation being that the connection will ultimately be completed using satellite links.

John Savageau

From Manokwari, West Papua, Indonesia

Government Clouds Take on the ESBaaS

Recent discussions with government ICT leadership related to cloud computing strategies have all brought the concept of Enterprise Service Bus as a Service into the conversation.

Now ESBs are not entirely new, but in the context of governments they make a lot of sense.  In the context of cloud computing strategies in governments they make a heck of a lot of sense.

Wikipedia defines an ESB as:

In computing, an enterprise service bus (ESB) is a software architecture construct which provides fundamental services for complex architectures via an event-driven and standards-based messaging engine (the bus). Developers typically implement an ESB using technologies found in a category of middleware infrastructure products, usually based on recognized standards.

Now if you actually understand that – then you are no doubt a software developer.  For the rest of us, this means that with the ESB pattern, participants engaging in service interaction communicate through a services or application “bus.” This bus could be a database, virtual desktop environment, billing/payments system, email, or other services common to one or more agencies. The ESB is designed to handle relationships between participants using common services and a standardized data format.

New services can be plugged into the bus and integrated with existing services without any changes to the core bus service. Cloud users and applications developers will simply add or modify the integration logic.

Participants in a cross-organizational service interaction are connected to the Cloud ESB, rather than directly to one another, including: government-to-government, citizen-to-government, and business-to-government. Rules-based administration support will make it easier to manage ESB deployments through a simplified template allowing a better user experience for solution administrators.
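The plug-in behavior described above can be sketched in a few lines of code. This is a minimal, illustrative publish/subscribe bus – not any particular ESB product – and the agency names and message topic are hypothetical:

```python
# Minimal sketch of the ESB pattern: services register on a shared bus
# and exchange standardized messages, so a new service plugs in without
# changing the bus or the other participating services.

class ServiceBus:
    def __init__(self):
        self._handlers = {}   # topic -> list of subscriber callbacks

    def subscribe(self, topic, handler):
        self._handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        # Deliver the message to every service subscribed to this topic.
        return [handler(message) for handler in self._handlers.get(topic, [])]

bus = ServiceBus()

# Two hypothetical agencies consume one shared citizen-profile format.
def tax_agency(msg):
    return f"tax: updated record for {msg['citizen_id']}"

def land_registry(msg):
    return f"land: updated record for {msg['citizen_id']}"

bus.subscribe("citizen.profile.updated", tax_agency)
bus.subscribe("citizen.profile.updated", land_registry)

# A third agency can subscribe later with no change to the bus itself.
print(bus.publish("citizen.profile.updated", {"citizen_id": "A-1001"}))
```

The point of the sketch is the integration logic: adding or changing a service touches only that service's subscription, never the bus or the other agencies.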

The Benefits to Government Clouds

In addition to fully supporting a logical service-oriented architecture (SOA), the ESBaaS will enhance or provide:

  • Open and published solutions for managing Web services connectivity, interactions, services hosting, and services mediation environment
  • From a development and maintenance perspective, the Government Cloud ESB allows agencies and users to securely and reliably share information between applications in a logical, cost-effective manner
  • Government Cloud ESBs will simplify adding new services, or changing existing services, with minimal impact to the bus or other interfacing applications within the IT environment
  • Improvements in system performance and availability by offloading message processing and isolating complex mediation tasks in a dedicated ESB integration server

Again, possibly a mouthful, but if you can grasp the idea of a common bus providing services to a lot of different applications or agencies, allowing sharing of data and interfaces without complex relationships between each participating agency, then the value becomes much clearer.

Why the Government Cloud?

While there are many parallels to large companies, governments are unique in the number of separate ministries, agencies, departments, and organizations within the framework of government.  Governments normally share a tremendous amount of data between agencies, and in the past this was extremely difficult due to organizational differences, lack of IT support, or individuals who simply did not want to share data with other agencies.

The result of course was many agencies built their own stand-alone data systems, without central coordination, resulting in a lot of duplicate data items (such as an individual’s personal profile and information, business information, land management information, and other similar data).  Most often, there were small differences in the data elements each agency developed and maintained, resulting in either corrupt or conflicting data.

The ESB helps identify a method of connecting applications and users to common data elements, allowing the sharing of both application format and in many cases database data sets.  This allows not only efficiency in software/applications development, but also a much higher level of standardization and common data sharing.
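The duplicate-record problem above is concrete enough to sketch. Here two hypothetical agencies historically stored the same citizen under slightly different field names; mapping both legacy formats onto one canonical schema is the kind of standardization the common data elements enable. All field names and values are invented for illustration:

```python
# Two legacy agency records describing the same citizen, with
# divergent field names (a hypothetical but typical situation).
tax_record = {"citizen_id": "A-1001", "name": "BUDI SANTOSO", "addr": "Jl. Merdeka 5"}
land_record = {"id": "A-1001", "full_name": "Budi Santoso", "address": "Jl. Merdeka 5"}

def to_canonical(record):
    """Map either agency's legacy format onto one shared schema."""
    return {
        "citizen_id": record.get("citizen_id") or record.get("id"),
        "name": (record.get("name") or record.get("full_name")).title(),
        "address": record.get("addr") or record.get("address"),
    }

# Both legacy records now resolve to the same canonical citizen record.
print(to_canonical(tax_record) == to_canonical(land_record))  # -> True
```

Once every agency reads and writes the canonical form through the bus, the small per-agency differences that caused corrupt or conflicting data disappear.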

While this may be uncomfortable for some agencies – most likely those which do not want to share their data with the central government, or use applications standardized with the rest of government – it also supports a very high level of government transparency.  A controversial, but essential, goal of all developing (and developed) governments.

As governments continue to focus on data center consolidation and the great economical, environmental, and enabling qualities of virtualization and on-demand compute resources, integration of the ESBaaS makes a lot of sense. 

There are some very nice articles related to ESBs on the net which may help you better understand the concept, or give some additional ideas.

Let us know your opinion or ideas on ESBaaS

Managing Disasters with Internet Utilities

Fire season is here. Southern California fire departments and forestry services are urging residents to cut back brush on their properties and create “defensible space” between the dry chaparral and their homes. Local news stations have spooled up their resources to bring fire-related journalism to the population. And we have already seen extreme technology such as DC-10s and 747s dumping insane amounts of Phos-Chek and water to quickly knock down fires which have popped up early in the season.

Southern California has fires, just as Kansas has tornadoes and Florida has hurricanes. Disasters are a natural part of nature and life. How we deal with natural disasters, our ability to survive and overcome challenges, and how we restore our communities defines our society.

Technology tools in place or being developed are having a major impact on our ability to react, respond, and recover from disaster. In the early stages of any disaster, communication is key to both survival and response. As nearly every person in the world is now tethered to a wireless device, the communication part is becoming much easier, as even the most simple handset will support basic features such as text messaging and voice communications.

Getting the Message Out

Over the past 25 years the world has adopted Internet-enabled communications in a wide variety of formats for everything from email to citizen journalism. It is hard to find an event occurring anyplace in the world that is not recorded by a phone camera, YouTube video, blog, or real time broadcast.

In the 2008 Santa Barbara Tea Fire, students from UC Santa Barbara used Twitter to warn fellow students and local residents to get out of the fire’s path as it raced through 2,000 acres and more than 210 houses within the city limits. While it is not possible to put a statistic on the value of Twitter for evacuations and emergency notification, interviews with students following the fire revealed many received their initial notification through Twitter lists, and indicated they were able to get out of areas consumed by the fire (while screaming their heads off to others in the neighborhood to get out) before public safety officials were able to respond.

NOTE: I was driving through Santa Barbara (along the ‘101) during the initial phase of the fire, and can personally verify the fire moved really, really fast through the city. It looked like lava streaming out of a volcano, and you could see houses literally exploding as the fire hit them and moved through… I wasted no time myself getting through the city and on the way to LA.

This article will not review all the potential technologies or software becoming available for emergency notifications; however, we will look at the basic utility enabling all the great stuff happening to keep our citizens safe: the Internet.

Internet’s Utility is Now Bigger than Individuals and Companies

We all remember the infamous interview with Ed Whitacre, former CEO at AT&T.

Q: How concerned are you about Internet upstarts like Google, MSN, Vonage, and others?

A: How do you think they’re going to get to customers? Through a broadband pipe. Cable companies have them. We have them. Now what they would like to do is use my pipes free, but I ain’t going to let them do that because we have spent this capital and we have to have a return on it. So there’s going to have to be some mechanism for these people who use these pipes to pay for the portion they’re using. Why should they be allowed to use my pipes?

The Internet can’t be free in that sense, because we and the cable companies have made an investment and for a Google or Yahoo or Vonage or anybody to expect to use these pipes [for] free is nuts!

This statement clearly indicates many in the Internet network and service provider business do not yet get the big picture of what this “4th Utility” represents. The Internet is not funny cat videos, porn, corporate web sites, or Flickr. Those features and applications exist on the Internet, but they are not the Internet.

Internet, broadband, and applications are a basic right of every person on the planet. The idea that two network administrators might have an argument at a bar, and subsequently consider the possibility of “de-peering” a network based on personalities or manageable financial considerations borders on being as irresponsible as a fire department going on strike during a California wildfire.

As a utility, the Internet has value. Just as electricity, water, or roads. The utility must be paid for either before or after use; however, the utility cannot be denied to those who need the service. When a city grows and attracts more traffic, residents, and commerce, the intent is normally not to restrict or control the process – you build better roads and better infrastructure, and the people eventually pay the price of that growth through taxes and utility bills. The 4th Utility is no different. When it gets oversubscribed, it is the carrier’s responsibility to build better infrastructure.

Disputes between network administrators, CFOs, or colocation landlords should never present a risk that SMS, Twitter, email, or other citizen journalism could be blocked, resulting in potential loss of life, property, and quality of life.

Communicating in the Dangerous Season

Fire season is upon us. As well as riots, traffic congestion, government crackdowns, take-downs, and other bad things people need to know about so they can react and respond. The Internet delivers CalTrans traffic information to smart phones, SMS, and web browsers to help us avoid gridlock and improve our quality of life. Twitter and YouTube help us understand the realities of a Tehran government crackdown, and Google Maps helps guide us through the maze of city streets while traveling to a new location.

We have definitely gone well past the “gee whiz” phase of the Internet, and must be ready to deal with the future of the Internet as a basic right, a basic utility, and essential component of our lives.

Net neutrality is an important topic – learn more about network neutrality, and weigh in on how you believe this utility should be envisioned.

The Utility and Pain of Internet Peering

In the early 1990s TWICS, a commercial bulletin board service provider in Tokyo, jumped on the Internet. Access was very poor by modern Internet standards; however, at the time 128 kbps over frame relay (provided by Sprint International) was unique, and in fact represented the first truly commercial Internet access point in Japan.

The good old boys of the Japanese academic community were appalled, and did everything in their power to intimidate TWICS into disconnecting their connection, to the point of sending envelopes filled with razor blades to TWICS staff and the late Roger Boisvert (*), who through Intercon International KK acted as their project manager. The traditional academic community did not believe anybody outside of the academic community should ever have the right to access the Internet, and were determined to never let that happen in Japan.

Since the beginning, the Internet has been a dichotomy of those who wish to control or profit from the Internet, and those who envision the potential and future of the Internet. Internet “peering” originally came about when academic networks needed to interconnect their own “Internets” to allow interchange of traffic and information between separately operated and managed networks. In the Internet academic “stone age” of the NSFNet, peering was a normal and required method of participating in the community. But if you were planning to send any level of public or commercial traffic through the network, you would violate the NSFNET’s “acceptable use policy” (AUP) preventing use of publicly-funded networks for anything other than academic or government use.

Commercial Internet Exchange Points such as the CIX, and eventually the NSF-supported Network Access Points (NAPs), popped up to accommodate the growing interest in public access and commercial Internet. Face it, if you went through university or the military with access to the Internet or Milnet, and then jumped into the commercial world, it would be pretty difficult to give up the obvious power of interconnected networks bringing you close to nearly every point on the globe.

The Tier 1 Subsidy

To help privatize the untenable growth of the NSFNet (due to “utility” academic network access), the US Government helped pump up American telecom carriers such as Sprint, AT&T, and MCI by handing out contracts to take over control and management of the world’s largest Internet networks, which included the NSFNet and the NSF’s international Connection Managers bringing the international community into the NSFNet backbone.

This allowed Sprint, AT&T, and MCI to gain visibility into the entire Internet community of the day, as well as take advantage of their own national fiber/transmission networks to continue building up the NSFNet community on long term contracts. With that infrastructure in place, those networks were clear leaders in the development of large commercial internet networks. The Tier 1 Internet provider community is born.

Interconnection and Peering in the Rest of the World

In the Internet world Tier 1 networks are required (today…), as they “see” and connect with all other available routes to individual networks and content providers scattered around the world – millions and millions of them. The Tier 1 networks are also generally facility-based network providers (they own and operate metro and long distance fiber optic infrastructure) which, in addition to offering a global directory for users and content to find each other, also allow traffic to transit their networks on a global or continental scale.

Thus a web hosting company based in San Diego can eventually provide content to a user located in Jakarta, with a larger network maintaining the Internet “directory” and long distance transmission capacity to make the connection either directly or with another interconnected network located in the “distant end” country.

Of course, if you are a content provider, local Internet access provider, regional network, or global second tier network, this makes you somewhat dependent on one or more “Tier 1s” to make the connection. That, as in all supply/demand relationships, may get expensive depending on the nature of your business relationship with the “transit” network provider.

Thus, content providers and smaller networks (something less than a Tier 1 network) try to find places to interconnect that will allow them to “peer” with other networks and content providers, and wherever possible avoid the expense of relying on a larger network to make the connection. Internet “Peering.”

Peering Defined (Wikipedia)

Peering is a voluntary interconnection of administratively separate Internet networks for the purpose of exchanging traffic between the customers of each network. The pure definition of peering is settlement-free or “sender keeps all,” meaning that neither party pays the other for the exchanged traffic; instead, each derives revenue from its own customers. Marketing and commercial pressures have led to the word peering routinely being used when there is some settlement involved, even though that is not the accurate technical use of the word. The phrase “settlement-free peering” is sometimes used to reflect this reality and unambiguously describe the pure cost-free peering situation.

That is a very “friendly” definition of peering. In reality, peering has become a very complicated process, with a constant struggle between the need to increase efficiency and performance on networks, to gaining business advantage over competition.

Bill Norton, long-time Internet personality and evangelist, has a new web site called “DrPeering,” which is dedicated to helping Internet engineers and managers sift through the maze of relationships and complications surrounding Internet peering. Not only the business of peering, but also in many cases the psychology of peering.

Peering Realities

In a perfect world peering allows networks to interconnect, reducing the number of transit “hops” along the route from points “A” to “B,” where either side may represent users, networks, applications, content, telephony, or anything else that can be chopped up into packets, 1s and 0s, and sent over a network, giving those end points the best possible performance.
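Route selection makes this preference mechanical. In common BGP practice, routes learned from customers and peers are typically assigned a higher local preference than routes learned from paid transit, so traffic takes the shorter, cheaper path when one exists. The sketch below is a toy model of that policy – the preference values and route tuples are illustrative, not a real BGP implementation:

```python
# Toy model of BGP-style route preference: given several learned routes
# to the same prefix, prefer higher local preference first, then the
# shorter AS path. The numeric values follow a common (assumed)
# convention: customer > peer > transit.

LOCAL_PREF = {"customer": 300, "peer": 200, "transit": 100}

def best_route(routes):
    # routes: list of (relationship, as_path_length) tuples
    return min(routes, key=lambda r: (-LOCAL_PREF[r[0]], r[1]))

routes_to_prefix = [
    ("transit", 4),   # reachable via a Tier 1, four AS hops away
    ("peer", 2),      # reachable directly over a peering session
]
print(best_route(routes_to_prefix))  # -> ('peer', 2)
```

The customer-over-peer-over-transit ordering also reflects economics: customer routes earn revenue, peer routes are settlement-free, and transit routes cost money.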

Dr Peering provides an “Intro to Peering 101~204,” reference materials, blogs, and even advice columns on the topic of peering. Bill helps “newbies” understand the best ways to peer, the finances and business of peering, and the difficulties newbies will encounter on the route to a better environment for their customers.

And once you have navigated the peering scene, you realize we are back to the world of who wants to control, and who wants to provide vision. While on one level peering is determined by which vendor provides the best booze and most exciting party at a NANOG “Beer and Gear” or after party, there is another level you have to deal with as the Tier 1s, Tier 1 “wanna-be networks,” and global content providers jockey for dominance in their defined environment.

At that point it becomes a game, where personalities often take precedence over business requirements, and the ultimate loser will be the end user.

Another reality: large networks would like to eliminate smaller networks wherever possible, as well as control content within their networks. Understandably so; it is a natural business objective to gain advantage in your market and increase profits by rubbing out your competition. In the Internet world that means a small access network, or content provider, will budget its cost of global “eyeball or content” access based on the availability of peering within its community.

The greater the peering opportunity, the greater the potential of reducing operational expenses. Less peering, more power to the larger Tier 1 or regional networks, and eventually the law of supply and demand will result in the big networks increasing their pricing, diluting the supply of peers, and increasing operational expenses. Today transit pricing for small networks and content providers is on a downswing, but only because competition is fierce in the network and peering community supported by exchanges such as PAIX, LINX, AMS-IX, Equinix, DE-CIX, and Any2.

At the most basic level, eyeballs (users) need content, and content has no value without users. As the Internet becomes an essential component of the life of everybody on the planet, and in fact becomes (as the US Government has stated) a “basic right of every citizen,” the existing struggle for Internet control and dominance among individual players becomes a hindrance or roadblock in the development of network access and compute/storage capacity as a utility.

The large networks want to act as a value-added service, rather than a basic utility, forcing network-enabled content into a tiered, premium, or controlled commodity. Thus the network neutrality debates and controversy surrounding freedom of access to applications and content.

This Does Not Help the Right to Broadband and Content

There are analogies provided for just about everything. Carr builds a great analogy between cloud computing and the electrical grid in his book “The Big Switch.” The Internet itself is often referred to as the “Information Highway.” The marriage of cloud computing and broadband access can be referred to as the “4th Utility.”

Internet protocols and technologies have become, and will continue to be reinforced as a part of the future every person on our planet will engage over the next generations. This is the time we should be laying serious infrastructure pipe, and not worrying about whose content should be preferred, settlements between networks, and who gives the best beer head at a NANOG party.

At this point in the global development of Internet infrastructure, much of the debate surrounding peering – paid or unpaid – amounts to noise. It simply retards the development of global Internet infrastructure, and may eventually slow the velocity of innovation in all things Internet the world craves to bring us into a new generation of many-to-many and individual communications.

The Road Ahead

All is not lost. There are visionaries such as Hunter Newby aggressively pushing development of infrastructure to “address America’s need to eliminate obstacles for broadband access, wireless backhaul and lower latency through new, next generation long haul dark fiber construction with sound principles and an open access philosophy.”

Oddly, as a lifelong “anti-establishment” evangelist, I tend to think we need better controls by government over the future of Internet and Internet vision. Not by the extreme right wing nuts who want to ensure the Internet is monitored, regulated, and restricted to those who meet their niche religions or political cults, but rather on the level of pushing an agenda to build infrastructure as a utility with sufficient capacity to meet all future needs.

The government should subsidize research and development, and push deployment of infrastructure much as it did with the Interstate Highway System and the electrical and water utilities. You will have to pay for the utility, but you will not, as a user, be held hostage to the utility, and you will have competition in utility access.

In the Internet world, we will only meet our objectives if peering is made a necessary requirement, and is a planned utility at each potential geographic or logical interconnection point. In some countries such as Mongolia, an ISP must connect to the Mongolia Internet Exchange as a requirement of receiving an ISP license. Why? Mongolia needs both high performance access to the global Internet – as well as high performance access to national resources. It makes a lot of sense. Why give an American, Chinese, or Singaporean money to send an email from one Mongolian user to another Mongolian user (while in the same country)? Peering is an essential component of a healthy Internet.
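The economics behind that licensing rule can be sketched in a few lines of Python. The traffic volume and prices below are purely illustrative assumptions, not real figures, but they show why exchanging domestic traffic at a local IX, rather than hauling it over international transit, lowers an ISP's operating cost.

```python
def monthly_savings(domestic_mbps, transit_usd_per_mbps, ix_port_usd):
    """Transit dollars avoided each month by exchanging domestic traffic
    at a local Internet exchange instead of over international transit.
    All figures are illustrative assumptions, not real prices."""
    transit_cost = domestic_mbps * transit_usd_per_mbps
    return transit_cost - ix_port_usd

# Hypothetical numbers: 500 Mbps of domestic traffic, $20/Mbps/month for
# international transit, and a flat $2,000/month IX port and cross-connect.
print(monthly_savings(500, 20, 2000))  # 8000
```

The larger the share of traffic that stays domestic, the faster the IX port pays for itself, which is exactly the incentive the Mongolian licensing requirement institutionalizes.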

The same applies to Los Angeles, Chicago, Omaha, or any other location where there is proximity between the content and user, or user and user. And peering as close to the end users as technically possible supports all the performance and economic benefits needed to support a schoolhouse in Baudette (Minn), without placing an undue financial burden on the local access provider based on predatory network or peering policies mandated by regional or Tier 1 networks.

We’ve come a long way, but are still taking baby steps in the evolution of the Internet. Let’s move ahead with a passion and vision.

(*) Roger Boisvert was a friend for many years, both during my tenure as a US Air Force officer and telecom manager with Sprint based in Tokyo (I met him while he was still with McKinsey and a leader in the Tokyo PC User’s Group), and afterwards through different companies, groups, functions, and conferences in Japan and the US. Roger was murdered in Los Angeles nine years ago, and his loss is felt throughout the Internet community, not only in Japan but around the world.

Data Center Consolidation and Cloud Computing in Indonesia

2010 brings great opportunities and challenges to IT organizations in Indonesia. Technology refresh, aggressive development of telecom and Internet infrastructure, and the deployment of “eEverything” are shaking the ICT industry. Even the most steadfast division-level IT managers are beginning to recognize the futility of trying to maintain their own closet “data center” in a world of virtualization, cloud computing, and the drive to improve both data center economics and data security.

Of course there are very good models on the street for data center consolidation, particularly on government levels. In the United States, the National Association of State Chief Information Officers (NASCIO) lists data center consolidation as the second highest priority, immediately after getting better control over managing budget and operational cost.

In March the Australian government announced a (AUD) $1 billion data center consolidation plan, with standardization, solution sharing, and developing opportunities to benefit from “new technology, processes or policy.”

Minister for Finance and Deregulation Lindsay Tanner noted Australia currently has many inefficient data centers, very suitable candidates for consolidation and refresh. The problem of scattered or unstructured data management is “spread across Australia, (with data) located in not just large enterprise data centres, but also in cupboards, converted offices, computer and server rooms, and in commercial and insourced data centers,” said Tanner.

“These are primarily older data centres that are reaching the limits of their electricity supply and floor space. With government demand for data center ICT equipment rising by more than 30 per cent each year, it was clear that we needed to reassess how the government handled its data center activities.”

The UK government also recently published ICT guidance related to data center consolidation, with a plan to cut government-operated data centers from 130 to around 10–12 facilities. The guidance includes the statement “Over the next three-to-five years, approximately 10-12 highly resilient strategic data centers for the public sector will be established to a high common standard. This will then enable the consolidation of existing public data centers into highly secure and resilient facilities, managed by expert suppliers.”

Indonesia Addresses Data Center Consolidation

Indonesia’s government is in a unique position to take advantage of both introducing new data center and virtualization technology, as well as deploying a consolidated, distributed data center infrastructure that would bring the additional benefit of strong disaster recovery capabilities.

Much like the problems identified by Minister Tanner in Australia, today many Indonesian government organizations – and commercial companies – operate ICT infrastructure without structure or standards. “We cannot add additional services in our data center,” mentioned one IT manager interviewed recently in a data center audit. “If our users need additional applications, we direct them to buy their own server and plug it in under their desk. We don’t have the electricity in our data center to drive new applications and hardware, so our IT organization will now focus only on LAN/WAN connectivity.”

While all IT managers understand that disaster recovery planning and business continuity are essential, few have brought DR from PowerPoint to reality, leaving much organizational data on individual servers, laptops, and desktop computers – all at risk of theft, loss, or failure of single-disk systems.

That is all changing. Commercial data centers are being built around the country by companies such as PT Indosat, PT Telekom, and other private companies. With the Palapa national fiber ring nearing completion, all main islands within the Indonesian archipelago are connected with diverse fiber optic backbone capacity, and additional international submarine cables are either planned or in progress to Australia, Hong Kong, Singapore, and other communication hubs.

For organizations currently supporting closet data centers, or local servers facing the public Internet for eCommerce or eGovernment applications, data centers such as the Cyber Tower in Jakarta offer both commercial data center space and supporting interconnections for carriers – including the Indonesia Internet Exchange (IIX) – in a model similar to One Wilshire, The Westin Building, or 151 Front in Toronto. These facilities offer ample space for outsourcing data center infrastructure (particularly for companies with Internet-facing applications), as well as power, cooling, and management for internal infrastructure outsourcing.

The challenge, as with most other countries, is to convince ICT managers that it is in their company or organization’s interest to give up the server. Rather than focus their energy on issues such as “control,” “independence (or autonomous operations),” and avoiding the pain of “workforce retraining and reorganization,” ICT managers should consider the benefits of outsourcing their physical infrastructure into a data center, and further consider the additional benefits of virtualization and public/enterprise cloud computing.

Companies such as VMWare, AGIT, and Oracle are offering cloud computing consulting and development in Indonesia, and the topic is rapidly gaining momentum in publications and discussions within both the professional IT community, as well as with CFOs and government planning agencies.

It makes sense. As with the cloud computing initiatives being driven by the US and other governments, consolidating not only data centers but also IT compute resources and storage makes a lot of sense, particularly if the government has difficulty standardizing or writing web services to share data. Add a distributed cloud processing model, where two or more data centers with cloud infrastructure are interconnected, and we can start to drive recovery time and recovery point objectives close to zero.

This is not just for government users: a company located in Jakarta is able to develop a disaster recovery plan by simply backing up critical data in a remote location, such as IDC Batam (part of the IDC Indonesia group). As an example, the IDC Indonesia group operates four data centers located in geographically separate parts of the country, all interconnected.

While this does not support a zero recovery time objective, it does allow companies to lease a cabinet or suite in a commercial data center and, at a minimum, install disk systems adequate to meet their critical data restoral needs. It also opens up decent data center colocation space for emerging cloud service and infrastructure providers, all without the burden of legacy systems to refresh.

In a land of volcanoes, typhoons, earthquakes, and man-made disasters, Indonesia has a special need for good disaster recovery planning. Through the effort to consolidate organization data centers, the introduction of cloud services in commercial and government markets, and high capacity interconnections between carriers and data centers, the basic elements needed to move forward in Indonesia are now in place.

Giving Ourselves a Broadband Facelift for the 2010 Matrix

Of all the memories the telecom community has of the 80s and 90s, one of the most vivid is the sight of long haul fiber optic cable systems being buried throughout the United States. A product of deregulation, competition, and the birth of the Internet, American telecom companies saw a desperate need for greatly increasing transmission capacity, and responded with investments in long haul fiber, metro fiber, and digital switching needed to meet all visions of what we knew in those wonderful days of innovation.

Globally, broadband Internet, 3G + wireless, and the convergence of everything from entertainment to telephony into digital formats is driving not only Internet technologies, but also physical telecom transmission systems to the threshold of existing capacity. This explosive growth in information and communications technologies creates an interesting dilemma for telecom companies.

Do we spend our efforts finding ways to control the use of existing capacity? Or do we acknowledge the fact that our network-enabled global community is not likely to get any smaller, and that the world now needs our telecom thought leadership both to greatly expand what we already have and to aggressively invest in developing transmission technology that will enable, not restrict, growth in all things digital?

Not a US-Only Challenge

When a child in South Africa, Hanoi, or Denpasar has equal access to Hulu TV, Skype video chats, and eLearning systems from either a fixed workstation or mobile phone, it can be argued technology is serving the purpose of enabling and providing a new generation with the intellectual tools they need to flatten the geographic and political barriers we have lived with since the beginning of time.

All great, benevolent thoughts. Our children may need the tools to correct the problems we’ve created through irresponsible use of fossil fuels, exploitation of natural resources, human transmitted disease, war, and creation of toxic “stuff” that continues to restrict our planet’s ability to create an acceptable quality of life for all.

Face it: educated people in general do not make as many BIG mistakes as those who blindly follow others out of ignorance or lack of exposure to a wide variety of knowledge. Internet and telecom-enabled technologies may empower some people who thrive on physical or ideological control, but that power is diluted as more of the population brings its own knowledge of fact and gains exposure to a liberal dosage of different perspectives.

Or in other words, we can hope primary school students from different countries and cultures who meet each other through chatting or cooperative educational projects will be more likely to collaborate on useful endeavors in later life than those who are only exposed to a narrow view of society, culture, ideologies, and leadership.

Getting to the Vision

All this is great. An altruistic, warm, and fuzzy view of the future. Getting our vision to reality requires a tremendous amount of work. The current caretakers of industry and leadership do not have all the intellectual tools needed to keep up with a developing generation of children who were birthed in the Internet Age.

However we (the current caretakers) are pretty good at building things. Among those things are fiber optic transmission systems spanning oceans, continents, cities, and now even homes. We are good at building wireless transmission towers, and are still pretty good at building devices that can connect all this fiber, tower, and wireless infrastructure together.

And the younger generation is beginning to envision ways to exploit the transmission “matrix” that is beyond the comprehension of our current caretaker generation.

“The world is becoming one, big, ubiquitous, homogeneous system because of ‘the network,’ and the network exists and needs to exist because it exists (in other places) already. This is the justification to build. It is a self-fulfilling chain reaction.” (Hunter Newby, CEO, Allied Fiber)

The Republicans in the US like to scream the need for Americans to “Drill Baby Drill,” exploiting domestic sources of fossil fuels, reducing our dependence on foreign sources for energy. In the telecom industry we are beginning to feel the need to “Dig Baby Dig.”

We need to increase our ability to continue delivering the network transmission capacity required to give our next generation the tools needed to really make a “Matrix-enabled” future, rather than spend our efforts scrambling, as in the energy analogy, to control or reduce our dependence on existing sources of telecom capacity.

How it is Going to Happen

In the US, for the past 30 years deregulation has allowed the telecom industry to build their infrastructure without any oversight other than what local or state governments impose for licensing and access to rights of way. Most debates have surrounded topics such as net neutrality, control over markets, or conduct of both content and users connecting to the Internet.

The US National Science Foundation inadvertently created the current, sometimes restrictive environment within the US Internet community by passing control of the NSFNet backbone to a select few commercial providers (AT&T, MCI, and Sprint). This award increased incentives for carriers to control their part of the US Internet space, and reduce incentives to aggressively build out physical capacity needed to meet the exponentially increasing demands for bandwidth and capacity.

That transition did not come close to meeting the infrastructure requirements needed to support the convergence of everything that can, does, should, and will travel over Internet Protocol (IP) networks over the next 25 or 30 years. While there are some positive developments in the local loop (FiOS, LTE, WiMAX, U-verse, etc.), Newby cautions that in the US there is a dearth of the long haul and metro capacity needed to string all the local initiatives together.

The answer is to dig. Dig more conduits around the United States and Canada, drop the highest existing capacity fiber cabling within the conduits, connect wireless towers supporting LTE/4G+ to the high capacity backbone, connect buildings and homes, and develop new even higher capacity transmission technologies to parallel or exceed similar models of growth such as Moore’s Law and Metcalfe’s Law.

But to give us the space needed to develop those technologies, for now, dig baby dig. Give fiber optic long haul, metro, and local digs the same tolerance we give to filling potholes and expanding lanes on a freeway system – while in the background we hope our leadership designs high speed rail, better road construction materials, and better ways to move from point “A” to point “B.”

Consider broadband, hyper-band, and uber-band development the true 4th Utility justifying extreme social priority, without which we will suffer the same fate as losing electricity, water, and roads. As with roads, everything we do going into the future will ride the broadband “matrix,” and without enough available lanes we will reduce ourselves to a frustrating gridlock of intellectual, business, and social development.

Dig baby dig…

NOTE: I was first introduced to the concept of the “Matrix” in the early 1990s, when a friend of mine suggested I read a book by John S. Quarterman entitled “The Matrix: Computer Networks and Conferencing Systems Worldwide.” Twenty years later, it remains as enlightening a view of the Internet – what the Internet could and should be, and a look into its future – as anything I have ever read on the topic. It takes William Gibson and Neal Stephenson and translates their fiction into a reality which continues to become part of our day-to-day lives. Or maybe it gave both authors additional ideas needed for them to develop their fiction…

Expanding the 4th Utility to Include Cloud Computing

A lot has been said over the past couple of months about broadband as the fourth utility, with the same status as roads, water, and electricity. For the next generation of Americans, broadband network access will be an entitlement. But is it enough?

Carr, in “The Big Switch,” discusses cloud computing as analogous to the power grid. The only difference is that for cloud computing to be really useful, it has to be connected: to networks, homes, businesses, SaaS, and people. So the next logical extension of the fourth utility, beyond simply referring to broadband network access as a basic right for Americans (and others around the world; it just happens that as an American, for purposes of this article, I’ll refer to my own country’s situation), should include additional resources beyond simply delivering bits.

The “New” 4th Utility

So the next logical step is to marry cloud computing resources, including processing capacity, storage, and software as a service, to the broadband infrastructure. SaaS doesn’t mean you are owned by Google; it simply means you have access to those applications and resources needed to fulfill your personal or community objectives, such as delivering centralized e-Learning resources to the classroom, the home, or your favorite coffee shop. The network should simply be there, as should the applications needed to run your life in a wired world.

The data center and network industry will need to develop a joint vision that allows this environment to develop. Data centers house compute utility, networks deliver the bits to and from the compute utility and users. The data center should also be the interconnection point between networks, which at some point in the future, if following the idea of contributing to the 4th utility, will finally focus their construction and investments in delivering big pipes to users and applications.

Relieving the User from the Burden of Big Processing Power

As we continue to look at new home and laptop computers with quad-core processors, more than 8 gigs of memory, and terabyte hard drives, it is hard to believe we actually need that much compute power resting on our knees to accomplish the day-to-day activities we perform online. Do we need a quad-core computer to check Gmail or edit a presentation on Microsoft Live Office?

In reality, very few users have applications that require the amounts of processing and storage we find in our personal computers. Yes, there are some applications such as gaming and very high end rendering which burn processing calories, but for most of the world all we really need is a keyboard and screen. This is what the 4th utility may bring us in the future. All we’ll really need is an interface device connecting to the network, and the processing “magic” will take place in a cloud computing center with processing done on a SaaS application.

The interface device is a desktop terminal, intelligent phone (such as an Android, iPhone, or other wired PDA device), laptop, or anything else that can display and input data.

We won’t really care where the actual storage or processing of our application occurs, as long as the application’s latency is near zero.

The “Network is the Computer” Edges Closer to Reality

Since John Gage coined those famous words while working at Sun Microsystems, we’ve been edging closer to that reality. Through the early days of GRID computing, software as a service, and virtualization – added to the rapid development of the Internet over the past 20 years, technology has finally moved compute resource into the network.

If we are honest with ourselves, we will admit that for 95% of computer users, a server-based application meets nearly all our daily office automation, social media, and entertainment needs. Twitter is not a computer-based application; it is a network-enabled, server-based application. Ditto for Facebook, MySpace, LinkedIn, and most other services.

Now the “Network is the Computer” has finally matured into a utility, and at least in the United States, will soon be an entitlement for every resident. It is also another step in the globalization of our communities, as in time no person, country, or point on the earth will be beyond the reach of our terminal or input device.

That is good.

Developing Countries in the Cloud

Developing countries may be in a great position to take advantage of virtualization and cloud computing. During a recent visit to Indonesia, it was clear the government is struggling both with building a national ICT (Information and Communications Technology) plan, and with consolidating a confusing array of servers and small data centers governed by a dearth of policies managing the storage and protection of data.

When we consider the need for data protection, covering both physical and information security, it is clear that data is being decentralized without adequate modeling for either end-user performance or data management. Addressing this is essential to giving the nation the tools needed to implement eGovernment projects, and to fully understanding the implications ICT planning will have for the future economic and social growth of the country.

Considering an E-Government Option Using Cloud Computing

If, as in the case of Indonesia, each governmental organization ranging from the Ministry of Education, to the Ministry of Agriculture, to individual licensing and tax administration offices is running on servers which may in fact be plugged into normal wall outlets under a desk, you can see we have both a challenge and a great opportunity to create a powerful new ICT infrastructure to lead the country into a new information-based generation.

Let’s consider education as one example. Today, in many developing countries, there is very limited budget available for developing an ICT curriculum. Classrooms consolidate several different classes (year groups), and even text books are limited. However, in many, if not most developing countries, more than 95% of the population is covered by mobile and cellular phone networks.

This means that while there may be limited access to text books, with a bit of creativity we can bring technology to even the most remote locations via wireless access. This was very apparent during a recent conference (Digital Africa), where nearly every country present, including Uganda, Rwanda, Mali, and Chad all indicated aggressive deployments of wireless infrastructure. Here are a couple of simple ideas on the access side:

  1. Take advantage of low cost solar panels to provide electricity and battery backup during daylight hours
  2. Take advantage of bulk discounts, as well as other international donor programs to acquire low cost netbooks or “dumb terminals” for delivery to remote classrooms
  3. Install wireless access points or receivers near the ubiquitous mobile antennas, and where necessary subsidize the mobile carriers to promote installation of data capacity within the mobile networks
  4. Take advantage of E-Learning programs that provide computer-based training and lessons
  5. Centralize the curriculum and student management programs in a cloud-based software as a service (SaaS) model, hosted in a central or distributed cloud architecture
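The solar and terminal items above lend themselves to a quick feasibility estimate. The sketch below is a back-of-the-envelope Python calculation; every number in it (terminal wattage, sun hours, panel rating, system efficiency) is an illustrative assumption to be replaced with local data.

```python
import math

def panels_needed(terminals, watts_per_terminal, hours_of_use,
                  sun_hours, panel_watts, system_efficiency=0.7):
    """Rough count of solar panels needed to run a classroom of low-power
    terminals, sized against average daily sun hours. The efficiency
    factor covers battery, charge controller, and wiring losses."""
    daily_load_wh = terminals * watts_per_terminal * hours_of_use
    daily_yield_wh = panel_watts * sun_hours * system_efficiency
    return math.ceil(daily_load_wh / daily_yield_wh)

# 10 netbooks drawing ~15 W each, used 6 hours/day, with 5 sun hours
# and 100 W panels (all assumed values).
print(panels_needed(10, 15, 6, 5, 100))  # 3
```

Even with conservative assumptions, a remote classroom of low-power terminals comes down to a handful of panels, which is what makes the model above plausible for donor-funded deployments.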

Now, we can further consider building out two or three data centers in the country, allowing for both load balancing and geographic data backup, with cloud storage, cloud processing, and a high capacity fiber optic backbone interconnecting the facilities. Again, this is not out of the question, as nearly all countries have, or are developing, a fiber backbone that interconnects major metropolitan areas.

So, starting with our eLearning SaaS model, let’s add a couple more simple applications.

If we can produce terminals and electricity for small schools anyplace in the country, why can’t we extend the same model to farmers (eAgriculture), local governments, and individuals through use of “Internet Kiosks” or cafes, possibly located near village offices or police stations? We can, and in fact that is a model being used in countries such as Indonesia, where Internet cafes and kiosks called “WarNets” dot the countryside and urban areas. Many WarNets supplement their electricity with solar energy, and provide Internet access via either fixed lines or wireless.

Cloud Computing Drives the Country

While some may reject the idea of complete standardization of both government and commercial applications at a national level, we can also argue that standardization and records management of the education system may in fact be a good thing. In addition, a student or adult in Papua (Indonesia) who gains the necessary intellectual skills through local eLearning programs is then able to spend the weekend watching videos or reading through transcripts from the Stanford Education Program for Gifted Youth, the Center for Innovation, or the Entrepreneur series.

The real payoff comes when a nation is able to take advantage of an economy of scale in which compute capacity is a utility, available to all government agencies at a fixed cost, and is able to develop a comprehensive library of SaaS applications that are either developed locally or made available through international agencies such as UNDP, the World Bank, USAID, and others.

With effective use of SaaS, and integration of the SaaS applications on a standardized database and storage infrastructure, agencies and ministries with small, inefficient, and poorly managed infrastructure have the opportunity to consolidate into a centrally located, professionally managed and supported national ICT infrastructure that allows not only the government to operate, but also supports the needs of individuals.

With a geographically distributed processing and data center model, disaster recovery becomes easier: high performance interconnecting backbones allow data mirroring and synchronization, reducing recovery time and recovery point objectives to near zero.
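One rough way to reason about those objectives: the worst-case recovery point is approximately the sync or backup interval plus any replication lag. A few lines of Python can illustrate the difference between nightly backups and near-continuous mirroring (the intervals below are assumed, not measured):

```python
def worst_case_rpo_minutes(sync_interval_min, replication_lag_min):
    """Worst-case recovery point objective: data written just after the
    last successful sync is lost, so RPO ~= sync interval plus lag."""
    return sync_interval_min + replication_lag_min

# Nightly batch backup vs. near-continuous mirroring between two
# interconnected data centers (illustrative numbers).
print(worst_case_rpo_minutes(24 * 60, 30))  # 1470 minutes (nightly backup)
print(worst_case_rpo_minutes(1, 0.5))       # 1.5 minutes (frequent mirroring)
```

The high capacity backbone matters precisely because it is what makes the short sync interval in the second case affordable.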

The US CIO, Vivek Kundra, who manages the world’s largest IT organization (the United States Government), is a cloud believer. Kundra supports the idea of both national and local government standardization of applications and infrastructure, and in fact in a recent Government Technology News interview said he’s “moving forward with plans to create a storefront where federal government agencies could easily acquire standard, secure cloud computing applications.”

This brings a nation’s government to the point where online email, office automation, graphics, storage, database, and hosting services are a standard item that is requested and provisioned in near real time, with a secure, professionally managed infrastructure. It is a good vision of the future that will provide tremendous utility and vision for both developed and developing countries.

I am thinking about a school in Papua, Indonesia. The third year class in Jakarta is no longer in a different league from its counterpart in Papua, as students in both places are using the same lessons available through the national eLearning system. It is a good future for Indonesia, and a very good example of how cloud computing will help bring developing countries into a competitive, global society and economy.

Wiring Indonesia with WARNETs, Wifi Hotspots, and Mobile

Jakarta is a city of cafes, coffee shops, and mobile phones. With mobile penetration hitting nearly 62% of the population, the world’s 4th most populous nation represents a huge market, and tremendous infrastructure challenges. With more than 50% of the country earning less than $50/month, the percentage of people with access to mobile phones and the Internet is astonishing.

WarNet in Samarinda, Indonesia

This is very apparent when driving through villages well under the poverty line, such as those you pass on the way from Balikpapan to Samarinda (in eastern Borneo, East Kalimantan Province). A large percentage of the “homes” you pass would not have a prayer of keeping water out during a heavy rainstorm, yet you will see many, if not most, of the residents carrying a mobile phone.

Most of the mobile phones are pre-paid, meaning of course the user pays up front for the handset and phone minutes. Even so, the poorest people have access to handsets.

The next interesting item is the ubiquitous “WarNet.” WarNet is a combination of two words, Warung (café) and Internet. While not as widespread as mobile phones, nearly every village has one or two WarNet rooms, which (from my observation) have most of their terminal stations filled with users.

As a large percentage of the population lacks disposable income needed to purchase their own computer, or Internet access, the WarNet is the only place young people (and older folk) are able to access and take advantage of either computers or network-enabled communications.

Strolling the streets of Samarinda after 2200, in an entirely unscientific poll, I counted about two WarNets per city block in the downtown area. A similar stroll earlier in Batam (a free port near Singapore) yielded similar results. Jakarta came in only slightly lower, probably because my unscientific strolling poll was confined to a relatively opulent area with more WiFi hotspots available at coffee shops such as Starbucks and the Coffee Bean and Tea Leaf, where patrons carry their own laptop computers.

Off the main roads, however, the count dropped to a scarce one WarNet per city block.

WarNets are Not Just for Fun

While the Korean Internet café experience of the 1990s was fueled by insatiable demand for higher performance multi-user gaming networks, the Indonesian experience appears much broader in scope. According to Mr. Ibenk, an official with Kominfo (the Indonesian government’s national ICT organizer), WarNets serve the community by providing exposure and low cost Internet access for students and business people, as well as access to social media and entertainment.

WarNet in downtown Batam, Indonesia

“Access to a WarNet costs users less than 3,000 Rp (Indonesian Rupiah, around $0.35) per hour. While still a reasonably high cost for a poor user, nearly everybody can afford at least a couple of hours per week to access the network,” added Ibenk.

WarNets are used by students, professionals, and, from my observation, a lot of foreign tourists trekking through Jakarta and more remote locations alike. Students spend a lot of time on the Internet, and it appears schools encourage use of WarNets: some students use them to access research materials and write reports (most WarNets also offer sideline services such as printing, copying, and faxing), and as one student told me, they are now even submitting some homework assignments through the Internet.

You may question why this would be necessary, and the answer is simple: most schools in poor sections of Jakarta and most rural areas do not have sufficient budget to build ICT into their facilities or curriculum. However, both students and teachers know that for a child to be competitive in the new wired world, they need exposure to Internet technologies to gain skills critical to their future success in a global economy.

Porn, hacking, and other nefarious uses of WarNets

While it may seem unbelievable, most WarNet operators claim that although use of WarNets to access pornography and conduct illegal activities does occur, it is probably at a level much lower than we’d expect. “Niki,” a former WarNet operator in Sumatra now working as an ICT manager in Jakarta, explained: “Indonesia is a Muslim majority country. Muslims may have a stricter social manner than in some other countries, and thus the negative uses of WarNets may be lower than you would expect.”

I am not sure that is entirely true; however, most of the WarNets I visited during the past 10 days in Indonesia appeared to be meeting the objectives noted above: just a lot of people chatting, researching, doing email, or using word processing programs (including Google Docs and Microsoft Office Live). Cloud computing, whether the users know it or not, has made a very positive contribution to the community by providing applications and online storage that would not have been available just a couple of years ago.

WarNets are a Positive Contributor to Indonesia

A report by Rudi Rusdiah of APWKomtel claims WarNets account for more than 40% of all Internet access in Indonesia. I’d believe the number is actually higher, given the number of WarNets I observed in rural areas throughout Java, Sumatra, and Kalimantan.

Rusdiah’s report includes a listing of the positive social impacts of WarNets, including:

  • Extending public Internet access to serve people with no computer or Internet access at home;
  • Providing value-addition to small and medium businesses in the community, strengthening the economy by creating employment and business opportunities;
  • With the support of the Ministry of Industry and Trade, setting up of Warsi (Warung Informasi or Information Centers) near small traditional industry clusters;
  • Providing Internet access and literacy to the small businesses in the community and cluster;
  • Promoting products and services beyond local and traditional markets, to national and global reach;
  • With Open University and OSOL, programs to promote the use of IT as a tool for education;
  • Providing tourists, travelers and commuters with Internet access.

In a world where many governments struggle to bring broadband Internet to every home as a public utility, developing nations need to exercise great creativity in delivering “any” Internet access to the community. The WarNet provides that utility, and the creativity of Indonesians in finding ways to deliver Internet to nearly every community in the country through satellite, microwave, mobile phones, DSL, and telephone access should be applauded.

It is not the final solution, but with the world’s fourth most populous nation getting wired, we can expect a lot of new ideas from a lot of motivated Indonesians in the near future.
