I’ve come to realize that the iPhone platform is really pretty crappy in a lot of ways. And these ways are mostly not due to hardware limitations, but rather artificial limitations put in place by Apple. And mostly these are limitations which have been put in place For Our Own Protection, and which have been, shockingly, praised from many quarters.
Well, there was also a crowd who praised the announcement that developers would only be able to write Web applications and ridiculed those who wanted native ones.
Apple’s focus and attention seem to be on the iPhone, and the sentiment coming out of Cupertino is that the iPhone is good, all of the stupid, crippling restrictions on how it works are good, and Apple always knows best.…This is the same keynote, let’s remember, where high-up Apple people ridiculed the idea that anyone would ever have a legitimate reason to run applications in the background. Unless that application is made by Apple, of course. And then they came up with their brilliant idea of push notifications, which totally replace the need for background processes, unless you’re writing a music player, or a Web browser, or a GPS logger, or a terminal emulator, or a file downloader, or….
I think most of what Apple has done is defensible. With a new platform and limited engineering resources, a case can be made for a conservative approach that starts with a very closed platform and slowly opens it up. You don’t maximize the device’s potential, but you prevent any bad surprises from occurring. Theoretically, by controlling everything, you can keep the quality of the experience high while you build market share. You can get away with this for a while because there are no significant competitors. This is not the approach I would have taken—I like Ash’s idea of third-party developers stepping in to do what Apple won’t or hasn’t yet—but I can easily believe that Apple thinks it’s a good idea, and they may be right.
On the Mac side, Apple encouraged developers to write 64-bit Carbon applications, but then quietly removed this option. Developers would have been better off following the conventional wisdom, that Carbon was a transitional API and Cocoa was the future, than listening to Apple’s explicit statements to the contrary. At WWDC 2006, Steve Jobs declined to demonstrate Leopard’s top secret features because “We don’t want our friends to start their photocopiers any sooner than they have to.” Once Leopard shipped, we saw that there were no such features. At WWDC 2008, Jobs looked sickly and Apple PR claimed that he just had a “common bug,” though he eventually admitted off the record that this wasn’t true.
Ash now worries about what Apple has planned for the future of the Mac. It could be bleak. That seems like a crazy idea, but Apple is known for betting (often successfully) on crazy ideas. I don’t think Apple would go that far, but it’s frightening that it would be possible.
I think the bottom line is that, because of the way Apple has behaved, people don’t trust it as much. This makes them less willing to give it the benefit of the doubt. And it increases uncertainty, which makes it difficult to plan. Mac developers were encouraged to learn how to write Web applications when a Cocoa-based SDK was just around the corner. It ended up being better to act based on supposition (that there would be an SDK) and experiment with a jailbroken phone, than to do what Apple had recommended. I’m not suggesting that Apple should reveal all the details or make commitments prematurely, but in most cases I think the spinning is counterproductive. I would prefer candor. If the reality doesn’t match the rhetoric, people will find out. They could be unhappy that they were talked down to and misled, or they could appreciate being told the straight story, even if it’s less than insanely great.