Years ago, in an episode of the Permaculture Podcast, the host joked that Computer Science was less science and more superstition and witchcraft. That had a familiar ring to it.
Several years earlier, I had started a new job at a company that was almost entirely Windows-based (all my previous jobs had been primarily Unix). One day I was trying to track down a complicated problem, and I asked my boss whether there were any more diagnostic tools at hand that would help. He responded, “Would the problem go away if the computer is rebooted?” “Most likely, but that won’t fix the root prob-” “Then reboot the computer. It’s Windows, people are used to that. Don’t waste your time debugging.” I verbally agreed, but in my head I thought, “How are things supposed to get better if we do that?”
Fast forward two decades… our lives have become awash in pervasive and increasingly complicated devices. Every day we encounter strange problems: spinners that never stop, devices that won’t connect, blank screens, mysterious crashes. Why these happen is a topic for a different day. But how do you troubleshoot them? Google the problem and you’ll find numerous articles that are all variations on the same theme: reboot it, reinstall it, reset it to factory defaults.
I may be exaggerating a bit… but not much; here’s an example and another. I know I’ve seen others, and I’m sure you have too. Feel free to post more in the comments.
But the upshot is that nobody knows what is actually going wrong. All these so-called “troubleshooting” steps are just guesses and workarounds for a wide variety of bugs that are unlikely to ever be fixed. Nobody cares. Just keep rebooting until it works, or until there’s an update which breaks other things. The next big version will change everything anyway, so why bother fixing it? Buy the latest and greatest gadget, and repeat ad nauseam.
There’s an old saying that I first heard a couple of decades ago:
Good. Fast. Cheap. Pick two.
I was listening to an NPR story about Moore’s Law. At first I was thinking that it gave us computers that are “fast” and “cheap”. But we never really got the “fast”. The workstation on my desk 25 years ago felt just as fast as the one I’m sitting at now. Ah, but that old Sun 3/50 didn’t have to do nearly as much as my current workstation, which is true. That old workstation didn’t have color, virtual desktops, animated 3D icons, streaming audio and video, or bloated web and email programs. But somehow I got my work done just as quickly. What’s happening is that another law is cancelling out the “fast” part of Moore’s Law: Wirth’s Law, which says that software is getting slower more rapidly than hardware is getting faster.
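To make that concrete, here’s a toy back-of-the-envelope sketch in Python. The doubling periods are illustrative assumptions on my part, not measurements: if hardware capability doubles every couple of years but software’s appetite doubles just as fast, the speed you actually perceive goes nowhere.

```python
# Toy sketch: Moore's Law vs. Wirth's Law over 25 years.
# Both doubling periods below are assumed for illustration, not measured.
YEARS = 25
DOUBLING_PERIOD = 2  # years per doubling, for both hardware and software demands

hardware_speedup = 2 ** (YEARS / DOUBLING_PERIOD)  # raw capability: ~5800x
software_bloat = 2 ** (YEARS / DOUBLING_PERIOD)    # resource demands: ~5800x

perceived_speedup = hardware_speedup / software_bloat
print(f"Hardware: {hardware_speedup:,.0f}x faster")
print(f"Software demands: {software_bloat:,.0f}x heavier")
print(f"What the user feels: {perceived_speedup:.1f}x")  # -> 1.0x, i.e. no faster
```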
Case in point: 25 years ago, when I fired up Emacs (which served as my text editor, mail and Usenet reader, and web browser), the 4 megs of virtual memory it used had a noticeable impact on the other users of the machine. Nowadays Emacs counts as a lightweight. Right now my email client (Thunderbird) is taking up 1.2 gigs of virtual memory!
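If you want to check your own machine, here’s a minimal sketch of that measurement. It assumes the third-party psutil package is installed and uses “thunderbird” as an example process name; it reports virtual memory size (VMS), which is the figure I’m quoting above.

```python
# Minimal sketch: sum the virtual memory (VMS) of every process whose name
# contains a given string. Requires the third-party psutil package
# (pip install psutil); "thunderbird" is just an example name.
import psutil

def virtual_memory_of(name_fragment: str) -> int:
    """Total virtual memory, in bytes, of processes matching name_fragment."""
    total = 0
    for proc in psutil.process_iter(attrs=["name", "memory_info"]):
        try:
            name = proc.info["name"] or ""
            mem = proc.info["memory_info"]
            if mem is not None and name_fragment.lower() in name.lower():
                total += mem.vms
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # process exited or is off-limits; skip it
    return total

if __name__ == "__main__":
    vms = virtual_memory_of("thunderbird")
    print(f"Thunderbird virtual memory: {vms / 2**30:.2f} GiB")
    # For scale: 1.2 GiB versus the old 4 MiB Emacs is roughly a 300x increase.
    print(f"Growth over a 4 MiB Emacs: {vms / (4 * 2**20):.0f}x")
```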
Moore himself acknowledged that his law has its limits, and some people place that limit about 10 years in the future. Thus far, Moore’s Law has managed to just barely keep up with Wirth’s Law. So what happens when the former tops out? I seriously doubt the latter has any limits, as I have yet to see a limit on human wastefulness and incompetence.
I guess we’ll need to go back to the trinity listed above. Maybe we need to start doing something about “good”: stop adding new bells and whistles, go back and fix bugs, make software more reliable and more informative when something does fail, and generally reduce the frustration everyone feels when using computers. In other words, do the opposite of what we’ve been doing. It’s a massive challenge, and, by and large, unfamiliar territory.
I know this is probably another one of my utopian dreams and may never happen, but it would be nice if, for once, I could encounter someone using a computer and not feel the urge to apologize on behalf of my profession.