No, I don’t want to update!

Automatic updates aren’t always a good thing.

Software updates. We all love them, don't we? Unlike any other industry I can think of, users of software expect to get updates on a regular basis. When Microsoft included Windows Update with Windows 98, it heralded a new era of self-updating software. But what happens when you are forced to update software? Microsoft has always been pretty conservative when it comes to forcing users to install updates: prompting, prompting again, and providing administrators with means of delaying updates (much to the dismay of web developers, as many institutions are still using IE6, which is 9 years old). Google, on the other hand, takes the opposite approach with its web browser, Chrome, which updates itself so quietly you'd never notice. This is fine with me, as Google have continued to add features and make the browser better and faster.

So what happens when (as seems to happen with all software) the browser starts to get a bit bloated and therefore needs a beefier machine to run on? Will I be so pleased when my browser suddenly takes longer to start up? Of course, two things redeem Google here: firstly, I would be free to downgrade and turn off automatic updates, and secondly, they give it away for free, so it's not as if I would have lost anything, except perhaps the time invested in learning and configuring the browser (which rules, by the way; I use it on all my PCs and can't recommend it highly enough).

I want my money back

When you pay for a piece of software, and the publisher changes it considerably to its detriment and then forces the update upon you, then I have a problem. This was the case with Team Fortress 2. What started out as a fun, well-balanced, classic class-based game of Red vs. Blue has slowly morphed into a terrible, badly balanced game of class-based luck that has more in common with Monopoly than its excellent predecessor, Team Fortress Classic. Class upgrades and in-game purchases are not what I originally bought into, and it's certainly not how the game was sold to me when I paid for it almost 3 years ago. Yet here I am, 3 years later, unable to play the game I originally purchased, because that game no longer exists. Instead I have a considerably different game where they expect you to pay extra to get the best weapons. Can you imagine if your TV manufacturer insisted you upgrade to a 3D TV and then burdened you with paying extra for silly 3D glasses you'd have to wear for the rest of your TV-viewing years? Maybe there is an analogy here with the switch-off of analogue TV, or when motorists were forced to stop using leaded petrol – but those changes were in the interests of the public good, and the decisions were no doubt made after lots of discussion by elected members of government.

I am struggling to think of any instance where this sort of forced update would be acceptable. Security patches are a different matter, because they rarely take away features, and so I fully support them. Of course, when the updates are actually good, I don't mind (TFC, for example, gained teleporters midway through its life) – but one person's great new feature could just as easily ruin the game for someone else. There is no easy solution; I'm sure Valve would argue it's impossible to support every version ever released and still have people playing against each other. Had they released a whole new game called Team Fortress 3, people would have accused them of cashing in, as they did when Left 4 Dead 2 was released.

So no easy answers, but as cloud computing becomes more commonplace in business, I don't see this problem going away.

The application bandwagon?

A couple of years ago I remember reading in the technology press about how desktop software was dead, and that the web was the future. Skip forward to today and I still hear the same thing, only with the word ‘web’ replaced with ‘cloud’ – cloud being a buzzword, simply meaning some server, somewhere.

I liked that promise: in a world where Microsoft Outlook takes about 10 times longer to load than Google Chrome and Gmail combined, web-based software certainly seems to me to be the future. So I was surprised today when I discovered that the BBC is launching some new pieces of client software for mobile phones. These days, all the major web sites have their own ‘app’ – The Guardian, RadioTimes, WordPress, even the White House, to name just a few examples – yet nearly all of those apps could work just fine in the web browser. Does this mean the web application is dead? I pondered this, and came to the conclusion that no, web apps aren't dead. The fact that The Guardian and the White House haven't released an application for Windows or Mac OS X tells me that this is just a mobile thing, and that installable applications are perhaps easier to use than web sites, with their caching, gestures and smooth animations. The BBC's applications will feature high-quality Flash, something Apple don't allow inside the iPhone's web browser, so that's probably their reasoning.

Installable applications (on the iPhone in particular) create a desktop presence: you can bet that people who've installed The Guardian's app visit The Guardian more often than those who have a bookmark buried away somewhere, or who type the address manually, so there is added benefit to the content provider. Of course, as the Windows Quick Launch area taught us, too much branding on the desktop can get annoying, so maybe it is a fad after all, and we'll all be using web-based mobile apps in 2 years' time?

Is that client software? Google cleverly pretends to install client-side software, when it is, in fact, just a web application.