Lightweight Programming Models and Cost-Effective Scalability

The final pattern, but certainly not my last blog on Web 2.0… But before we get all emotional, let’s focus on the task at hand. The final pattern I will be discussing is “Lightweight Programming Models and Cost-Effective Scalability”, which is about delivering services, not packaged software, and scaling them cheaply.

Innovation within Web 2.0 is developing so rapidly that it’s no surprise that with each passing year much more can be done for less. Companies across the web development industry are increasingly designing their services around lightweight models and cost-effective scalability, but what does that mean exactly? And how can the everyday web developer capitalize on it?

In a world where users’ requirements are always changing, and where developers are under pressure to meet the high needs and expectations of consumers, it is vital that applications and services can be delivered faster and updated without downtime. There are a number of success stories built on lightweight programming models, such as RSS and Google Maps’ simple AJAX (Asynchronous JavaScript and XML) interface.
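To make this concrete, here is a minimal sketch of the kind of lightweight AJAX-style call such services are built on. The `/weather` endpoint and its JSON fields are made up for illustration:

```javascript
// Minimal AJAX-style helper: ask the server for a small piece of JSON and
// hand the parsed result to a callback, without reloading the page.
// parseWeather is the pure part; fetchWeather wires it to XMLHttpRequest
// in the browser. The "/weather" endpoint is hypothetical.
function parseWeather(jsonText) {
  var data = JSON.parse(jsonText);
  return data.city + ": " + data.tempC + " degrees";
}

function fetchWeather(city, onDone) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/weather?city=" + encodeURIComponent(city));
  xhr.onload = function () {
    if (xhr.status === 200) onDone(parseWeather(xhr.responseText));
  };
  xhr.send(); // asynchronous: the rest of the page keeps working
}
```

The point is the loose coupling: the page and the server agree only on a tiny JSON shape, nothing more.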

With considerable changes over the years in cost, reusability, process and strategy, it has come to be expected that services should be “doing more with less”. In addition, new services should adopt a cost-effective and scalable business and development model so they are open to efficient and effective expansion. This concept has seen massive growth in recent years, like nothing seen before in any industry. For an exceptional example, just look at how many apps are available in Apple’s iTunes Store, then check again in 6 months, or even count how many new ones appear between the time you go to sleep and when you wake up again! More importantly, this scalability is cheap and effective.

Tim O’Reilly suggests that “Lightweight Programming Models” are the obvious way forward. The three significant lessons from this design pattern he recommends when implementing Web 2.0 services are:

  1. Support lightweight programming models that allow for loosely coupled systems.
  2. Think syndication, not coordination.
  3. Design for “hackability” and remixability.

A perfect example of Lightweight Programming Models and Cost-Effective Scalability is Windows Live SkyDrive, part of Microsoft’s Windows Live suite of Web 2.0 offerings. Providing 25 GB of free storage, SkyDrive, despite some limitations (individual files can be no bigger than 50 MB each), allows you to store any type of file in a Private, Public, or Shared folder.

No one except you can access your Private folders, which are protected by your Windows Live SkyDrive credentials, while anyone on the internet can view your Public folders, and you can invite others to see Shared folders. Microsoft, being one of the largest companies on the planet, applies this pattern extremely well: the service is easy to maintain, easy to build upon, and user friendly for the consumer.

Competitors to SkyDrive include Dropbox and Cloud Drive, among others, each with their own advantages and disadvantages. What really sets SkyDrive apart from the rest is that it is already tied in with your primary email account at Live. However, it does fall short on some features offered by the aforementioned services, such as streaming stored audio files, performing automated backups, and synchronizing data between two computers.

In a nutshell, the philosophy behind developing for Web 2.0 using “Lightweight Programming Models and Cost-Effective Scalability” is that “less is more”. Its objectives are simplicity and efficiency. By designing light, adaptable applications, companies are able to respond quickly to market needs; in the world of Web 2.0, success depends on the overall user experience and satisfaction.


Musser, John (n.d.) Web 2.0 Principles and Best Practices. Retrieved 11th May.

O’Reilly, Tim (2007) What is Web 2.0: Design Patterns and Business Models for the Next Generation of Software. Retrieved 11th May.

Microsoft (2012) Why SkyDrive? Retrieved 11th May.



Filed under Web 2.0

Web 2.0 – Leveraging the long tail

Another complicated phrase, right? What the heck is leveraging the long tail? Let’s start from the beginning… “The Long Tail” was a phrase first coined in 2004 by Chris Anderson, and later popularized as one of O’Reilly’s Web 2.0 patterns.

O’Reilly describes it as “the collective power of the small sites that make up the bulk of the web’s content.”

As mentioned, the term wasn’t originally aimed at the web, but in recent times it has been used to describe the strategies internet companies use to leverage the online market.

Anderson defined “The Long Tail” as a statistical curve showing the advantage that web-based companies with a massive range of items have over traditional brick-and-mortar retail stores, whose limited shelf space forces them to focus on the mass market.
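A toy model makes the shape of that curve concrete. Suppose item k sells in proportion to 1/k — a Zipf-like curve, and purely my assumption for illustration. Then the “tail” outside the bestsellers still adds up to a surprisingly large share of sales:

```javascript
// Toy long-tail model: item k sells in proportion to 1/k. tailShare
// computes the fraction of total sales coming from everything outside
// the top `head` items.
function tailShare(totalItems, head) {
  var total = 0, tail = 0;
  for (var k = 1; k <= totalItems; k++) {
    var sales = 1 / k;
    total += sales;
    if (k > head) tail += sales;
  }
  return tail / total;
}
// With 100,000 items, everything outside the top 100 bestsellers still
// accounts for more than half of all sales under this toy model.
```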

The idea is that traditional retail stores have worked out that stocking low-volume items is just not worth the shelf, storage and labor costs required to distribute them. A website doesn’t have this problem: there is no shelf space required, simply a virtual shopping center that can order products on demand. Imagine if, say, Target or Kmart could remove the costs of real estate, staff and inventory and give those savings back to the consumer.

Another key advantage online retailers have over standard storefront retailers is that when you purchase a product, you are generally offered recommendations, with links based on your purchase that encourage you to look at several other items. Most notably, companies selling books, videos and music, where the range of products is vast, have benefited significantly from this approach. Amazon, iTunes, and eBay are great examples of this.
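The recommendation idea itself is simple enough to sketch: count which items show up in the same orders. This is only a cartoon of what Amazon-scale systems actually do:

```javascript
// "Customers who bought X also bought...": for every order containing
// `item`, tally the other items in that order, then recommend the most
// frequent companions.
function alsoBought(orders, item, topN) {
  var counts = {};
  orders.forEach(function (order) {
    if (order.indexOf(item) === -1) return;
    order.forEach(function (other) {
      if (other !== item) counts[other] = (counts[other] || 0) + 1;
    });
  });
  return Object.keys(counts)
    .sort(function (a, b) { return counts[b] - counts[a]; })
    .slice(0, topN);
}
```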

In 2006, Anderson pointed out that the long tail accounts for between 25 and 40 percent of sales.

As one of the largest industries on the planet, the tourism and travel industry boasts many websites that use this Web 2.0 pattern well. With over five million visitors a month, has been successful by offering travelers a unique way to plan their trips, providing detailed information on numerous countries and cities around the world, as well as forums and newsletters offering travel advice and opinions from other travelers.

Additionally, they offer an innovative accommodation system with a vast range of options, from hotels to hostels, and they also provide flight bookings, travel insurance and ultimately anything catering to your travel needs. Now, what has this got to do with a Web 2.0 pattern? Lonely Planet follows some of the best practices of “leveraging the long tail” by offering travel services different from the countless other travel websites: instead of a system based purely on entering dates and looking up prices to your destination, users can choose the country they would like to travel to and then search for an activity that interests them. provides services that most storefront travel agencies simply can’t offer, such as:

More selection – Lonely Planet offers books on top destinations.

Lower price – less overhead, since there are no storefronts to run.

Scalability – they can sell more ‘items’ simply by expanding the online ‘store’: no extra shelving is required, and no physical delivery or inventory needs to be held, since they are selling a service or acting as an agent between two parties.

Wisdom of crowds – they encourage user contributions in the form of feedback, reviews, rankings and ratings.

Algorithmic data management – is a great example of a site that helps customers find similar products based on their clicks, promoting its products in a ‘you might like…’ window based on the area of the world you are showing interest in.

In a nutshell, online retailers have a great advantage when “leveraging the long tail” of Web 2.0: without the inventory and additional costs required to host a traditional store, they can offer additional services and products to their consumers.



Anderson, Chris (2009) The Long Tail of Travel. Retrieved 4th May.

 (2012) Retrieved 4th May.


O’Reilly, Tim (2005) What Is Web 2.0? Retrieved 4th May.


Filed under Web 2.0

Web 2.0 – Perpetual Beta

When people hear the word “beta” they instantly think of a new or incomplete application that is buggy, unstable and prone to frequent crashes. In many cases, though, quite the opposite is true, and “beta” isn’t always a bad thing. So much so that some Web 2.0 applications remain in “perpetual beta”, and most everyday users won’t even notice that their favorite apps are, in fact, constantly in beta.

In simple terms, O’Reilly’s “Perpetual Beta” pattern refers to an application or piece of software that remains in constant development: although the application may contain a complete set of features with minimal bugs present, new features and updates are regularly being applied.

According to O’Reilly, when it comes to Web 2.0,

“The users must be treated as co-developers”, thus, “the product is developed in the open with new features slipstreamed in on a monthly, weekly or even daily basis.”

While many users associate software updates with those annoying popup messages at the bottom of the screen telling us we have updates to apply, some Web 2.0 applications can apply updates as frequently as every half hour without us even knowing.

Web 2.0 has become such a crucial component of the internet era that big companies such as Google have devised ways to update their services to enhance the user experience while avoiding large updates and downtime. Simply by slapping a “beta” label on a service or application, they are essentially employing everyday users as real-time testers. Why do this? Easy: it allows them to receive feedback on flaws and bugs in the system that they might miss without real-life users.
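One common way services slipstream a new feature to just a slice of their users is a percentage rollout. The sketch below is purely illustrative — not how any particular company does it:

```javascript
// Illustrative "perpetual beta" rollout: deterministically bucket each
// user id into 0-99 with a small hash, and enable the beta feature only
// for buckets below the rollout percentage. The same user always gets
// the same answer, so their experience stays stable while the
// percentage is ramped up from 1% towards 100%.
function bucketFor(userId) {
  var h = 0;
  for (var i = 0; i < userId.length; i++) {
    h = (h * 31 + userId.charCodeAt(i)) % 1000000007;
  }
  return h % 100;
}

function inBeta(userId, rolloutPercent) {
  return bucketFor(userId) < rolloutPercent;
}
```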

In the past, “perpetual beta” was most commonly used by developers and tinkerers learning new skills and testing applications, which has enormous advantages for reducing problems in an application. In recent years it has become quite common with large software companies such as Facebook and Twitter. The example I find particularly interesting is Google, which also uses “perpetual beta” and has copped quite a lot of mixed opinions on the topic.

Pretty much every application developed by Google over the years has been in a state of “perpetual beta”, from Gmail and Google Apps to its web browser, Google Chrome. Google’s approach of leaving products in this testing phase has been advantageous in many ways, but also frustrating for some users, with applications at times staying in beta for as long as five years.

“Analysts have dubbed Google’s approach “perpetual beta.” Under this strategy, Google launches early versions of new products to see what sticks with consumers. The problem is that some of these experiments aren’t sticking — especially when users have to pay for products”

No matter the gripes people may have with “perpetual beta”, this method is here to stay, and the old days of buying software, installing it, and running it over and over until the next version or an update was released are becoming less and less common. As the internet evolves, every time we visit a website we are downloading newly available updated content without even knowing it.


Filed under Web 2.0

Web 2.0 – Software Above the Level of a Single Device

As discussed in my previous blog, “Rich User Experiences”, you have already learnt how powerful Web 2.0 and the applications built on it are becoming, and how they now reach far beyond our everyday desktop computers; hence the ability to reach target markets has increased dramatically.

In 2010 an astounding five billion devices were connected to the internet: an incredible number of ways to reach users and target markets. These five billion devices were certainly not limited to traditional desktops; they included mobile phones, tablets and any number of other devices. Clearly, one version of an application cannot suit all devices, due to the variance in their features. Even more amazing was a prediction made by Cisco in 2011:

“By the end of 2012, the number of mobile-connected devices will exceed the number of people on Earth, and by 2016 there will be 1.4 mobile devices per capita. There will be over 10 billion mobile-connected devices in 2016… exceeding the world’s population at that time (7.3 billion).”


Today I will be discussing the next pattern of Web 2.0 identified by O’Reilly in 2004: “Software Above the Level of a Single Device”. Even those unfamiliar with how certain applications work would agree that applications limited to a single device are less valuable than those designed to work across multiple platforms. In an era of ubiquitous computing, the PC is definitely no longer the one and only device we use to connect to the internet; the result is billions of internet-connected devices of all shapes and sizes.

So what does this have to do with “Software Above the Level of a Single Device”, you ask? The whole idea of this core pattern is that web applications should be tailored to meet the needs of individual devices, by focusing on the most important aspects of the service and then customizing it to the resources available on each device. By doing so, the application can deliver a rich and tailored service that can be used efficiently without the need for a PC.

The benefits of ‘Software Above the Level of a Single Device’ include:

  • Opens new markets
  • Access to your applications anywhere
  • Ability for location and context awareness
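At its crudest, tailoring a service per device starts with working out what the device is. Here is a toy sketch using the User-Agent string; real services use far more robust detection, but the principle is the same:

```javascript
// Crude sketch of "the same service, tailored per device": pick a
// layout variant from the User-Agent string.
function layoutFor(userAgent) {
  var ua = userAgent.toLowerCase();
  if (ua.indexOf("iphone") !== -1 || ua.indexOf("android") !== -1) {
    return "mobile";   // small screen: trimmed-down interface
  }
  if (ua.indexOf("ipad") !== -1 || ua.indexOf("tablet") !== -1) {
    return "tablet";   // touch input, but room for more controls
  }
  return "desktop";    // full-featured interface
}
```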

One service that is a perfect example of this pattern is iTunes: a proprietary digital media player application used for organizing and playing music and video files on a Mac or PC, with content usually synced to an iPod, iPhone, iPad or Apple TV.

The application works extremely well at seamlessly bridging the gap between the handheld device and a massive web back-end, with the PC acting as a local cache and control station. Although iTunes was not the first service of this type, and there have been many attempts to bring web content to portable devices, it has been by far the most successful, and was one of the first such applications designed from the ground up to span multiple devices.

Ultimately, iTunes and other services that extend their applications across many different devices demonstrate clearly that the PC is no longer the only access device for internet applications; an application designed only for the PC is considerably less valuable than those that expand onto multiple platforms. In the end, the main goal for any designer of a Web 2.0 application should be to produce an efficient and usable application for internet services across all platforms, and to push the limits of the technology.


Brodkin, Jon (2012) Mobile Internet devices will outnumber humans this year, Cisco predicts. Retrieved 20th April.

O’Reilly, Tim (2007) Software Above the Level of a Single Device. Retrieved 20th April.


Filed under Web 2.0

Rich User Experiences For The Win!!

This week’s blog, boys and girls, is on the fourth pattern in Tim O’Reilly’s “Design Patterns and Business Models”.

Firstly, let’s cover the terms that will come up in this blog. “Rich user experiences” refers to the combination of GUI-style (graphical user interface) applications and multimedia content.

Creating web-based applications that give the user an experience similar to a desktop application is a growing need in the world of Web 2.0. A collection of technologies integral to this type of experience is AJAX (Asynchronous JavaScript and XML), a set of technologies used together to create a rich user experience.
When you hear the term Rich Internet Applications (RIAs), it refers to web-based applications that have many features and characteristics of desktop applications, generally hosted in a web page using browser plug-ins such as Adobe Flash, Java and Microsoft Silverlight. The key element that makes these applications so successful is their ability to combine the best elements of a desktop interface with web pages, generating a richer, more engaging user experience that improves user satisfaction and increases productivity.
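The essence of the AJAX-driven rich experience is updating one region of the page while everything else stays put. Here is a bare-bones sketch of that idea; in a browser, the new content would arrive via an asynchronous request rather than being passed in directly:

```javascript
// Update one region of the view from fresh server data, leaving the
// rest of the "page" untouched - the heart of a rich internet
// application. The function is pure: it returns a new view object
// instead of mutating the old one.
function applyUpdate(view, region, newContent) {
  var next = {};
  for (var key in view) next[key] = view[key];  // copy the current view
  next[region] = newContent;                    // swap only one region
  return next;
}
```

Compare this with the Web 1.0 model, where the whole page (header, footer and all) would be rebuilt and re-sent for every change.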

Gmail, which is of course Google’s online email application, facilitates a rich user experience similar to that of a desktop mail program, while being accessible from anywhere and having strong database search ability. It was created in 2004 with web-based use, large storage space and a powerful search tool in mind.

Another service that provides a rich user experience, along with a powerful community of sharing and visual dialogue through an RIA, is Flickr, a photo organizer, poster and comment gatherer, which was also started in 2004.

Not only do users share and embed personal photographs through it, the service is also widely used by bloggers to host images that they embed in blogs and social media. Flickr is another perfect example of utilizing Web 2.0 to create a rich user experience.

I myself cannot predict where the web will be years from now, but one thing that is for sure is that Rich Internet Applications will only continue to grow and play a large role in forming the web. In the not too distant future, I believe there really will be no distinction between “browser-based apps” and “desktop apps”; I don’t believe one or the other will die or win, but rather that a mixture of both is what really makes the next generation of software compelling.


O’Reilly, Tim (2005) Design Patterns and Business Models for the Next Generation of Software. Retrieved March 29th.

Ward, James (2007) What is a Rich Internet Application? Retrieved March 29th.


Filed under Web 2.0

Innovation in Assembly

We live in a world that is growing so rapidly, with development proceeding at an astronomical rate, that it’s not hard to believe the world wide web we all use so frequently is developing quickly too. These days, large technology companies are giving business and personal users the opportunity to customize their own websites and contribute to others like never before. This is the concept of Innovation in Assembly.

Stop! OK, what is Innovation in Assembly?! This notion is about the way Web 2.0 applications can be used as a platform to build on. Still don’t get it? Quite simply, the main principle behind this core pattern is organizations developing new and innovative ideas by modifying or building upon pre-existing ones.

If you look at it in these terms: why start a development project from scratch when the hard development work may have already been done, and your company can simply build upon the application and modify it to your individual business needs?

Among the many benefits of this platform strategy, a key one is letting your business get a more complete idea of how certain services are used and attach them to other applications easily by using an API. API? Yet another confusing term… API stands for “Application Programming Interface”: a mechanism that allows other developers to use the data and functionality of one application in another.

Being the largest online video community on the planet, allowing users from all corners of the globe to share and watch videos of all kinds, YouTube is a perfect example of Innovation in Assembly. Contributors ranging from teenagers to the CEOs of major organisations, and everyone in between, can upload their clips and share them with others, as well as edit them on their YouTube page.

What makes the YouTube API such an effective vehicle for Innovation in Assembly is the ability to integrate YouTube video content and functionality into your own website, software application or device. By building on an already well-developed platform, you gain the ability to control the YouTube player, as well as how YouTube videos look on your site.
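The very simplest form of building on YouTube is its embed URL scheme. This little helper just assembles the iframe markup; the width and height defaults are arbitrary choices of mine:

```javascript
// Build the HTML for embedding a YouTube video via its /embed/ URL -
// the simplest way of putting YouTube's player inside your own page.
function youtubeEmbed(videoId, width, height) {
  return '<iframe width="' + (width || 560) +
         '" height="' + (height || 315) +
         '" src="' +
         encodeURIComponent(videoId) +
         '" frameborder="0" allowfullscreen></iframe>';
}
```

Drop the returned string anywhere in your page and YouTube hosts, serves and plays the video for you — that is the platform doing the heavy lifting.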

Costing nothing to use, YouTube’s APIs are one of the easiest and most cost-effective ways of using Innovation in Assembly to reach people. Additionally, there is the added benefit of knowing that YouTube will continue to improve and add features as time goes on.


HowStuffWorks (2012) How to Leverage an API for Conferencing. Retrieved 23rd March.

YouTube (2012) What are the YouTube APIs and Tools? Retrieved 23rd March.


Filed under Web 2.0

“Data is the next Intel Inside”

In recent times, data has without a doubt become one of the most valuable resources in our everyday lives, not to mention for some of the biggest technology companies on the planet. No matter which way you look at it, data has become one of, if not the, highest priorities for companies to protect and obtain at almost any cost.

Every day, clients, consumers, and business and personal users from every corner of the globe are creating, obtaining and sharing valuable data online. It is through the use of Web 2.0 that people have been able to create a vastly growing global resource community, connecting with each other and gaining information instantaneously.

Of the large technology companies that have had success in harvesting and gathering raw data, Google has been possibly the most effective. One method Google has used to gather data is the implementation of a multitude of free services. Everyone from personal to business users loves the concept of a free service. Why? Because it’s free… Google knows this all too well, and uses it extremely effectively to lure people into generating the kinds of data it finds useful.

Without the mass of data collected and stored over the years, these new services would not have been possible!

One of the services Google uses in this way to gather data is Google Analytics: a relatively simple service, launched by Google in late 2005, which lets you record how people found your website, how many visited it on any given date, how they explored it, and ultimately how you can enhance the visitor experience. The overall draw of the service is that you can improve your website’s return on investment long-term.
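Conceptually, analytics services gather this data with a tiny “beacon” request: the page asks for a one-pixel image whose URL carries the measurements. The endpoint and parameter names below are invented for illustration — they are not Google Analytics’ real protocol:

```javascript
// Conceptual sketch of an analytics beacon: encode the measurements
// into the query string of a tiny image request. The host and the
// parameter names (site, page, ref) are made up for illustration.
function beaconUrl(siteId, page, referrer) {
  return "" +
    "site=" + encodeURIComponent(siteId) +
    "&page=" + encodeURIComponent(page) +
    "&ref=" + encodeURIComponent(referrer || "direct");
}
// In a browser, the page would fire it with something like:
//   new Image().src = beaconUrl("my-site-id", location.pathname, document.referrer);
```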

Correct me if I’m wrong, but I doubt any business trying to use its website as a selling point and a business grabber would fail to benefit from this data. The potential your business holds in its hands, by having data compiled for you for free that shows how your customers interact with your website, is very valuable. Used effectively, it gives your webmaster the tools to tweak and improve the website to retain current clients as well as draw in new ones.

However, as with all free services, there is generally a catch, especially when you’re dealing with a data-hungry technology giant like Google. By using Google Analytics you are essentially giving Google access to data from thousands of websites, giving it up-to-date information on the latest trends across the web.

A rather interesting statement I stumbled upon in a Google Analytics case study was by a Kintek web developer named James:

“If data is the next ‘Intel Inside’ then Google Analytics is the operating manual for website operators trying to understand how to improve the usefulness and success of their site.”

I think the main point to consider, though, is this: it is nothing new that Google’s ethics around privacy have been questioned in the past, and even though Google clearly states that the data will not be used for its own benefit unless you agree to share it, there are always considerations when you do not have control of your own data.


Google (2012) Privacy Policy. Retrieved March 18th.

James (2009) Google Analytics Case Study – Data is the next ‘Intel Inside’. Retrieved March 18th.

O’Reilly, Tim (2007) Google Admits “Data is the Intel Inside”. Retrieved March 18th.


Filed under Web 2.0