Apple, MacWorld, User Experience, and the Multicore Crisis
Posted by Bob Warfield on January 16, 2008
Looking over the parachute drops of information from MacWorld, I was struck by some underlying themes. I won’t bore you with a recitation of the huge amount of surface-level activity: there are plenty of better, more firsthand places to get that. But some of those firsthand sources brought out patterns I’m familiar with.
First, the multicore crisis bit. I’ve written about it before, but let me recap. What is the multicore crisis? It is a wave of change unleashed by the fact that microprocessors have stopped getting faster every 18 months. Instead of gaining faster clock speeds, with free benefits for all at scarcely any effort, we get more cores. That ain’t bad, but it takes considerable effort on the software end to take advantage of the additional cores, and for the most part we are far from keeping up with their availability. For emphasis, here is a graph of Intel clock speeds that vividly shows just how long the curve has been flat:
2007 was another year in which the curve stayed flat.
What does this have to do with Apple and MacWorld? Well, in a simple vein, it was the multicore crisis checking in that caused Mathew Ingram to write, “Hey, Steve–you broke the Internet.” He was remarking on how Twitter was virtually unusable for hours. Twitter has become something of an unwilling canary in the coal mine: if something is hot and getting traffic, Twitter seems bound to go down. Why? Because it is a victim of the Multicore Crisis. The system’s architecture isn’t scaling. It may be a software problem, i.e. the system is not designed to take advantage of enough CPUs, or an infrastructure problem, i.e. it can only take advantage of the CPUs Twitter has physically bought and installed in its data center. Both can be overcome. Software can be made to take advantage of many more processors, and services like those Amazon and others offer let you scale up to many more CPUs on short notice without having to buy physical hardware. Failure to provide for both contingencies is succumbing to the Multicore Crisis.
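The two remedies above share one prerequisite: the service has to be written so that requests can fan out across workers at all. A minimal sketch, assuming a hypothetical request handler (this is not Twitter’s actual architecture):

```python
from concurrent.futures import ThreadPoolExecutor

def handle_request(req_id):
    """Hypothetical request handler; imagine real work here."""
    return f"response-{req_id}"

requests = list(range(100))

# Because requests are independent, capacity is just a knob: raise
# max_workers -- or, in the cloud model, point the same code at more
# machines provisioned on short notice -- and throughput grows.
with ThreadPoolExecutor(max_workers=8) as pool:
    responses = list(pool.map(handle_request, requests))

assert len(responses) == 100
```

A service that instead handles every request in one serial path gets nothing from extra cores or extra machines, which is the failure mode on display at MacWorld.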
Twitter was not unique. Mathew’s blog, having been Techmemed, was very slow to come up when I tried to access the article. He mentions that Fake Steve Jobs got creamed and couldn’t make CoveritLive work (Zoli mentions CoveritLive was CoveritDead). The Apple store was down at one point too.
Scoble tells a similar story: Engadget was up but very slow, Qik’s MacWorld channel was up and down, and Mogulus was slow to unreachable. Live video was hard to come by, and TUAW was fairly unreachable. A couple of sites passed muster, including TechCrunch (bravo!) and MacRumorsLive. TechCrunch hammers Twitter for being down. Again. If, as its pundits like to think, Twitter will play a significant role in reporting events, it needs to work all the time. It is, after all, a communication channel. Moreover, it’s a communication channel under constant scrutiny.
This brings me to a point I want to make about the Multicore Crisis and The Big Switch (what Nick Carr calls the trend toward Cloud Computing). These two megatrends are combining to change which core competencies it takes to succeed. Once upon a time, it was enough just to be able to lash together all the myriad pieces needed to create a web application with a good user design. You could count on Moore’s Law to make machines faster, and your customer growth was slow enough that scalability could be comfortably pushed out into the future as a high-quality problem to deal with if you succeeded. That’s no longer the case. New ideas catch on virally on the web for a variety of reasons, not the least of which is that so many more people are on the web, interconnected in so many more ways than simple e-mail, search, and web browsing.
There is another, more subtle manifestation of all this, and the new MacBook Air exemplifies it: in the multicore era, user experience is the new black for hardware. Why? Well, in the old days, everyone wanted to upgrade every two years. For a while, I bought a new PC every year, and it was worth it: the new machines were significantly faster than the old. In a world where the upgrade cycle is so short, you want to buy cheap hardware. Result? Dell wins big. They’re the best at building hardware cheaply, so you can buy it more often and get that speed. Dell was driven by the Need for Speed, and by the relative ease with which Moore’s Law delivered it.
Times have changed. In an era when you probably won’t upgrade every two years, let alone every year, it makes sense to look at something other than speed. I have an idea: how about looking at the user experience? Is the machine sexier? Does it do cool things? I love the Air’s ability to “borrow” a disk drive via WiFi from a nearby machine, as well as its ability to handle iPhone-like gestures on its touchpad. Combining Apple’s trademark radically uber-cool industrial design with genuine usability innovation is a winning formula. If it gets you to buy a new machine when you would otherwise be happy to stand pat, Apple wins. The fact that so much of what one does on a computer is via the Internet, combined with the rise of very effective virtualization software, has radically lowered the barriers to PC/Windows users buying a Mac as well. The latter is the Big Switch component.
That’s two significant changes brought on by the Multicore Crisis and The Big Switch. What is your company doing to get ahead of these trends before some competitor uses them to ride right over your business?