Everything Is Broken
If I had to guess, I’d say I probably work around hundreds of bugs in an average week, and thousands in a bad week. It’s not unusual for me to run into a hundred new bugs in a single week. But I often get skepticism when I mention that I run into multiple new (to me) bugs per day, and that this is inevitable if we don’t change how we write tests. Well, here’s a log of one week of bugs, limited to bugs that were new to me that week. After a brief description of the bugs, I’ll talk about what we can do to improve the situation.
See also: Will Thompson (tweet).
This is how I feel using Apple’s software lately, but I guess the grass isn’t greener.
Update (2017-05-15): See also: Cédric Luthi.
macOS 10.12.5 bug has broken Calendar’s coolest custom alert--the one that lets you open a file automatically. Appppllllle!!! <shakes fist>
Last summer I attempted to start blogging some of the more significant bugs I would run into (because, like everyone else, it feels like bugs are everywhere, every day, these days), but Blogging is Hard™ and Blogging Bugs is Even Harder™, and stuff happens, and I've only managed two entries. I really should get working on my pile of iOS bugs (at least the non-Siri-related ones)….
Warning: old fart here.
IMHO that's because the number of software layers between us -- the users -- and the silicon has increased manifold. When I started tinkering with computers, you were most of the time bound to a single platform running a simple OS that talked more or less directly to the machine. More often than not, you used ASCII terminals.
Programs were simpler, and while a good coder was expected to be intimate with the innards of his or her machine's processor and video card, the knowledge required to make stuff was smaller. Programs were self-contained -- everything on disk.
Nowadays, even the humblest 'hello, world!' requires *remote dependencies*, meaning more attack surface for bugs. Programming languages come with fancy, ready-made frameworks that have surely been battle-tested... but again, more space for bugs. And on, and on, and on....
The bonus is that today's coder has ready access to a lot of shiny new toys -- I needed some Perlin noise texture generation capability and found ready-made libraries, for instance -- but code complexity increases, and so do the bugs we have to put up with.
MaxL: And on top of that, there's so much more software in our lives now than in the days of ASCII terminals, so all of those issues you identified with individual “pieces” of software are multiplied by 10 or 100 or 1000 “programs” we use/encounter in daily life.
Warning: Yet another old fart.
As per MaxL, older software had to be built from the ground up. On the PC in the '80s, all software cost a lot. Buying (say) a database tool like BTRIEVE cost real money (or it felt like it). But bugs were fixed, and we had to pay for the upgrade.
Now, everything is f r e e e e ! ! For example, databases like MySQL, SQLite, etc. Multiply that by 1,000 for every type of software that might exist. We're living in a golden age for FREE, but FREE means half-assed.
By half-assed, I mean that since no one is paying money to most of the developers who build these free tools, those same devs:
a) only do the interesting stuff: I get that.
b) give up halfway through: sure.
c) don't test: we all think our code is perfect, and with no boss, why test?
d) skip help and documentation: no one is paying for that.
I love having all this free stuff. We never had it this good. And I can build a program, standing on the shoulders of all that free software, that impresses the old me. The caveat is that my 25 lines of new code leverage literally a million lines of somewhat reasonable (but not perfect) free code.
Getting back to Dan Luu: the problem is not the bugs themselves; Dan should be mad at the system that created "free" code. That free code is being used in all the paid software, i.e. Windows, macOS, and almost every application. We are all building on what came before, and we are not paying for it.
since no one is paying money to most of the developers who build these free tools
This is actually a bit of a myth in the open source world, especially when it comes to higher-end tools like MySQL. Look into it and you'll probably find that many of the top developers of many apps and tools are employed by companies like IBM, Red Hat, or SUSE, or even the US government.
The idea that open source = nothing but bedroom coders isn't true.
@Lou Killdozer: That argument would be way more compelling if Linux wasn't a lot more reliable than OS X :-)