Archive for August 3, 2008

Sunday, August 3, 2008

Unit Testing Roadblocks

Daniel Jalkut:

I’m ashamed to admit that this is how many a unit test has been put off. Other laziness incubators include the friction of adding a suitable unit test bundle target to a project, and the difficulty of deciding how to factor your unit tests so that they make sense in the context of your project.

[…]

But now you’re bound to run into a vexing question: “how the heck do I debug this thing?”. Since unit tests are generally built into a standalone bundle, there’s nothing for Xcode to run. But when you come across a failing unit test and you can’t figure out why, you find yourself wishing you could step through the code just as you might in an application.

Jalkut points to various tips for unit testing with Xcode. That’s how I originally wrote and ran my unit tests (using ObjcUnit, since Xcode didn’t yet have built-in testing support). Now I’m taking a different approach: using Python and PyObjC to write the tests and py.test to run them. You may prefer not to deploy an application written in Python or Ruby, but that’s no reason not to take advantage of those languages during development. The strengths of dynamic languages are a good fit for writing and debugging unit tests, and the weaknesses don’t matter so much in the context of testing.
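To illustrate the style (all names here are hypothetical, not from my actual test suite), py.test lets you write tests as plain functions with bare assert statements, no test-case classes or boilerplate. With PyObjC installed you would import your Objective-C classes directly and exercise them the same way; the sketch below uses a stand-in Python class so the pattern is visible on its own:

```python
# A minimal sketch of the py.test style: plain functions named test_*,
# bare assert statements, no setup boilerplate.
# With PyObjC you could instead write, e.g.:
#   from Foundation import NSString
#   s = NSString.stringWithString_("hello")
# and test the bridged Objective-C object with the same asserts.

class TitleFormatter:
    """Hypothetical stand-in for an Objective-C class under test."""
    def format(self, title):
        # Trim surrounding whitespace, then capitalize each word.
        return title.strip().title()

def test_format_strips_whitespace():
    f = TitleFormatter()
    assert f.format("  hello world ") == "Hello World"

def test_format_capitalizes():
    f = TitleFormatter()
    assert f.format("unit testing") == "Unit Testing"
```

Running `py.test` in the directory discovers and runs both functions; a failing assert drops you into an ordinary Python traceback, and `py.test --pdb` puts you straight into the debugger at the point of failure, which is exactly the step-through experience that is awkward to get with a standalone Objective-C test bundle.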

Welcome to iPhone: Your Crappy Mac of Tomorrow, Today!

Mike Ash:

I’ve come to realize that the iPhone platform is really pretty crappy in a lot of ways. And these ways are mostly not due to hardware limitations, but rather artificial limitations put in place by Apple. And mostly these are limitations which have been put in place For Our Own Protection, and which have been, shockingly, praised from many quarters.

Well, there was also a crowd who praised the announcement that developers would only be able to write Web applications and ridiculed those who wanted native ones.

Apple’s focus and attention seems to be on the iPhone, and the sentiment coming out of Cupertino is one that the iPhone is good, all of the stupid, crippling restrictions on how it works are good, and Apple always knows best.…This is the same keynote, let’s remember, where high-up Apple people ridiculed the idea that anyone would ever have a legitimate reason to run applications in the background. Unless that application is made by Apple, of course. And then they came up with their brilliant idea of push notifications, which totally replace the need for background processes, unless you’re writing a music player, or a Web browser, or GPS logger, or a terminal emulator, or file downloader, or….

I think most of what Apple has done is defensible. With a new platform and limited engineering resources, a case can be made for a conservative approach that starts with a very closed platform and slowly opens it up. You don’t maximize the device’s potential, but you prevent any bad surprises from occurring. Theoretically, by controlling everything, you can keep the quality of the experience high while you build market share. You can get away with this for a while because there are no significant competitors. This is not the approach I would have taken—I like Ash’s idea of third-party developers stepping in to do what Apple won’t or hasn’t yet—but I can easily believe that Apple thinks it’s a good idea, and they may be right.

What’s not so defensible is what Apple has been saying these past few years. It’s been spinning like crazy. They introduced the iPhone as a platform that included Cocoa and lots of great developer technologies, but soon it became clear that these were only for Apple’s use. First there was, “Cingular doesn’t want to see their West Coast network go down because some application messed up.” As far as I can tell this was just FUD. Then: “You can write amazing Web 2.0 and AJAX apps that look exactly and behave exactly like apps on the iPhone. And these apps can integrate perfectly with iPhone services.” Turns out there were significant look and behavior differences compared with native applications, and nearly all of the phone’s data and services were off limits. It was reasonable for Apple not to have an SDK ready at that time. It was not reasonable to suggest that Web applications, which we already knew would be supported, were something new and “innovative” and “a very sweet solution.” Since Steve Jobs said that this was something Apple had just “come up with,” some people assumed that there would be a JavaScript API or perhaps a widget environment. In fact, there was nothing. The touted integration ended up being that Google Maps and YouTube URLs would open in those applications rather than in Safari. Then, finally, the SDK was announced, and developers saw that far more was missing from the OS than the Mac desktop patterns and sounds. How would iPhone applications be developed and deployed? With music, Apple had given the appearance of being against DRM, but for applications it delivered a scheme that was even more restrictive.

On the Mac side, Apple encouraged developers to write 64-bit Carbon applications, but then quietly removed this option. Developers would have been better off following the conventional wisdom, that Carbon was a transitional API and Cocoa was the future, than listening to Apple’s explicit statements to the contrary. At WWDC 2006, Steve Jobs declined to demonstrate Leopard’s top secret features because “We don’t want our friends to start their photocopiers any sooner than they have to.” Once Leopard shipped, we saw that there were no such features. At WWDC 2008, Jobs looked sickly and Apple PR claimed that he just had a “common bug,” though he eventually admitted off the record that this wasn’t true.

Ash now worries about what Apple has planned for the future of the Mac. It could be bleak. It seems like a crazy idea, but Apple is known for betting (often successfully) on crazy ideas. I don’t think Apple would go that far, but it’s frightening that it would be possible.

I think the bottom line is that, because of the way Apple has behaved, people don’t trust it as much. This makes them less willing to give it the benefit of the doubt. And it increases uncertainty, which makes it difficult to plan. Mac developers were encouraged to learn how to write Web applications when a Cocoa-based SDK was just around the corner. It ended up being better to act based on supposition (that there would be an SDK) and experiment with a jailbroken phone, than to do what Apple had recommended. I’m not suggesting that Apple should reveal all the details or make commitments prematurely, but in most cases I think the spinning is counterproductive. I would prefer candor. If the reality doesn’t match the rhetoric, people will find out. They could be unhappy that they were talked down to and misled, or they could appreciate being told the straight story, even if it’s less than insanely great.

VMware and OS X

Gus Mueller:

Why is this so important to me? As a developer (and specifically an indie developer) setting up and testing my applications on a clean install of Mac OS X can be a pain in the ass. I’m not the type to have multiple machines for this purpose since I can’t stand the clutter. Plus, once I’ve run one of my apps on the clean system, it will leave little bits of debris around the file system in the form of preferences.

I’ve been using multiple Macs and SuperDuper, but using virtual machines would be better in some ways. They would be most useful for testing with older versions of the OS, some of which require older hardware that I might not want to keep around. How long after a version of Mac OS X Server is discontinued does ADC continue to provide serial numbers?