The future of commerce is so close

A couple of months ago Apple Pay was finally enabled for my bank account and credit card, which meant I could at last pay for things either by placing my phone on the card reader, or by bashing my watch against it instead.

I look forward to a day when I can leave the house with nothing in my pockets – my watch will have a cellular radio to keep me in contact with the people who matter, it will unlock my car and my house, and it will let me pay for things. While the first two on that list might be a few years away (decades, at the rate I upgrade cars), my watch can actually make payments today – how cool is that?

The reality is somewhat less cool. The problem is the payment limit. Since Apple Pay uses the existing 'contactless' payment systems, it's also hampered by the same £20 limit. While this limit makes sense for a contactless debit card (there is zero authentication), both the Apple Watch and the iPhone are secure: the iPhone asks for your fingerprint, and the watch asks for a PIN when you first put it on – as long as it stays in contact with your wrist, it remains authorised for Apple Pay.

This authentication is also a hindrance – why would I fiddle about trying to get my phone to detect my fingerprint (while everyone in the queue stares at me), or roll up three layers of sleeve to get my watch recognised, when I can whip out my wallet and tap my debit card? The key point is that I still have to carry my wallet in case the shop in question doesn't support contactless, or the amount comes to over £20. Don't get me wrong, Apple Pay is much better than entering a PIN; it's just not as fast as tapping your card.

So what needs to happen? I'd like to see the limit raised for Apple Pay purchases to something more reasonable. Most cash machines allow you to take out £300 in a day, so why not allow the same for an arguably more secure system such as Apple Pay – while keeping the existing contactless card limits where they are, of course.

Is it time to stop assuming users have Flash installed?

Remember when the iPad first came out in 2010, and the first thing everybody said was that it was doomed because it didn't have Flash? Well, it turns out most web site owners were able to accommodate its absence, and these days even Android tablets and phones lack the once ubiquitous browser plugin. If you've browsed the web on an iOS or Android device (the chances are you have), you'll know that in the vast majority of cases everything has continued to work as normal. Staples of the Internet from BBC News to YouTube keep on working – when it comes to video at least (if you want to run Farmville sans plugin then you're out of luck).

So when setting up my MacBook with a clean installation of OS X 10.9, I decided to see if it was possible to live without Flash. My guess was that it would be, and why not? One less thing installed on your system means a reduced attack surface for malware, fewer processes running (and hence longer battery life), and, in my experience, fewer browser hangs.

I was wrong, however – instead of using "feature detection" (as good web developers should) to determine whether the browser supports the alternative to Flash video, HTML5 video, it seems the vast majority of sites employ user agent sniffing and will only show you the non-Flash version if you're on a known mobile device. I kept being asked to install Flash, even though my iPad works just fine without it. User agent sniffing is the reason why sites designed for IE6 will ask you to "upgrade" if you visit in IE11 – I can forgive any web developer working back when IE6 came out in 2001 for following what was then standard industry practice, but user agent sniffing is now generally considered outdated, so why are so many sites still doing it when it comes to playing video?
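
For contrast, here's a rough sketch of what feature detection looks like in practice (TypeScript, with hypothetical placeholder player functions): the page asks the browser whether it can play HTML5 video, rather than guessing from its user agent string.

```typescript
// A minimal sketch of feature detection for HTML5 video, as opposed to
// user agent sniffing. The player functions are hypothetical placeholders.

function canPlayHtml5Video(mimeType = 'video/mp4; codecs="avc1.42E01E"'): boolean {
  const video = document.createElement("video");
  // canPlayType returns "", "maybe" or "probably"; anything non-empty means
  // the browser has at least some support for this codec.
  return typeof video.canPlayType === "function" && video.canPlayType(mimeType) !== "";
}

function showHtml5Player(): void {
  // Placeholder: a real site would inject a <video> element here.
  console.log("Using the HTML5 <video> player");
}

function showFlashFallback(): void {
  // Placeholder: a real site would embed the Flash player here.
  console.log("Falling back to the Flash player");
}

// Decide based on what the browser can actually do, not which device it claims to be.
if (canPlayHtml5Video()) {
  showHtml5Player();
} else {
  showFlashFallback();
}
```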

Is App Centric Old Fashioned?

So the in thing these days is for tech pundits to declare that an app-centric ecosystem is old fashioned, and that what people really want is a people-centric ecosystem; witness Facebook Home, and Windows Phone 8 before it.

That's all well and good, but do app developers want their brand to be subsumed into another system? I think most developers like having their icon on the home screen, like being able to design their app as they see fit, and wouldn't be happy if the operating system simply surfaced their content as part of a people-centric approach.

Ecosystems are nothing without the support of developers and big brand apps. Everyone wants their cut of the 'mind share' pie.