Earlier this month, Ars Technica ran an article on attempting to use a Macintosh IIsi to do actual work in the modern day. I was never really part of the classical Mac world—even our school computer labs were either Apple IIgs machines or 32-bit x86 systems. (While I wasn’t really aware of it at the time, based on what I remember of the labs I think they were actually DOS machines on a Novell NetWare network.)
What I found interesting was that this wasn’t the first time they’d done a stunt like this; four years or so ago they tried the same thing with a G3 PowerBook running Mac OS 9. This was interesting both because the author there had much more trouble dealing with Mac OS 9 than the newer author had with System 7, and because it linked to Ars Technica’s comprehensive reviews of OS X 10.0 from its public beta period and initial release, and those initial reviews were unkind indeed.
The thing that jumped out at me the most from looking through these articles was the way classic Mac OS was presented as a system designed to be aggressively micromanaged and configured within an inch of its life. I first seriously entered the Mac world around 2008, when the transition to Intel architecture was just becoming entrenched with consumers. By then, the common stereotype was that Apple systems were designed to make basically all design and interaction decisions for the user, and that personalizing one’s computing experience was generally considered a Bad Thing by the community. That stereotype very clearly had no basis before OS X, and in the run-up to the initial OS X releases there was a palpable sense of panic in the writing that the Mac OS 9 faithful were going to be relegated to a niche market not worth serving in the new world.
In retrospect, I should have recognized some of this in advance. Around the time I was learning to develop in and for Mac OS X, I encountered Andrew Plotkin’s silly user-experience diary of installing and acclimating to Mac OS X 10.1. A lot of the trouble he had revolved around setting up his interaction model so he could work the way he wanted to. Rereading it now, I’m struck by the way his “One True Way” of organizing applications in the Apple Menu matched the way I organized my Start Menu, back in the days when you organized your Start Menu rather than running a search engine over it.
I did not, and still don’t, have much interest in learning about the internals and idioms of classic Mac OS. Everything I have seen about it reminds me of what I consider the “awkward era” of personal computing, running from about 1995 through about 2005. By that point we knew, more or less, what we wanted a personal computer to be, but the only machines that could actually do it were workstation-level systems that, while they weren’t exactly filling entire rooms, still ended up costing tens or even hundreds of thousands of dollars. After the home computer apocalypse of the early 1990s, Apple and Microsoft ended up evolving their systems towards an ideal obviously set by systems like SGI’s Unix workstations—but they needed to preserve the things that made their computers attractive to their users along the way. The stories of Mac OS from System 7 through OS X 10.4 or so and of Windows from Windows 95 through Windows XP are both stories of that evolution, with the steps necessary to reach that endpoint taken in very different orders.
In the Windows case, XP was acknowledged as the moment when the business-oriented NT line of operating systems finally became compatible enough with consumer hardware and Win9x-era software to let consumers run an NT kernel, with all the benefits that kernel was already known to have. In the Mac OS 9 to OS X transition, by contrast, it seems undeniable that something real was lost.