We join our tale only to find it already underway, so a potted history is warranted. Since the end of the one-hundred and ninety-sixth Decade Domini, it has been possible to order kits from magazines such as Electronics World to build computers. Not just any computers: these are small enough to fit in the home and cost only a few thousand dollars. Barely a quarter of a century previously, IBM's chairman had reputedly been confident that a world market existed for "around five computers."
Commodore's Personal Electronic Transactor becomes the first off-the-shelf machine to make the leap from "programmable calculator" to "microcomputer", almost certainly because it doesn't fit in anyone's pocket. Seventeen years later the company would be bankrupt and its latest machine (of which more anon) would be on the slate for powering a fridge.
All of the above is, of course, in the past even as far as this story is concerned. To understand what it was like to grow up as a geek, I want to take you to Swansea in the Autumn of 1982.
I am not yet a year old, but the seeds of my geekiness are being sown. At the same time as the smiley :-) is invented at Carnegie Mellon University, new Welsh company Dragon Data Limited are unveiling their product. It is called the Dragon 32, and before the first anniversary of my birth one of these machines sits on a table in my house.
Forget GUI, DOS, WYSIWYG, WWW, FTP, or virtually any other acronym from the computing industry of 2002. Forget preemptive multitasking, symmetric multiprocessing and pipeline-burst cache. The Dragon 32 features a 0.9MHz MC6809E processor and 32 kilobytes of random access memory, and although there is what is technically a modem inside, its only job is to load and save data on audio cassette. The computer is controlled by a version of the BASIC programming language, provided by a small Seattle-based software house known as Microsoft.
Immediately we can see why the geeks of yesteryear have the upper hand over latter-day propellerheads. Anyone who owns one of these machines, or a different computer of the same period, needs at least a rudimentary knowledge of programming just to bootstrap the system. Computers are nowhere near as widespread as they will be at the turn of the millennium (only some eighteen years hence), so the owners of the systems are naturally computer hobbyists. Very few computers are used solely for writing letters or calculating the tax return. Indeed, if you want to use your digital accumulator to "do the books" for a business, you have to declare the fact to HM Customs and Excise, who must audit your software to make sure it isn't going to cheat them.
Although I won't own one of his products for over a decade and a half, now would be a good time to mention good old Sir Clive and his eponymous company, Sinclair Research. The year is 1984 and, unfortunately for comedy's sake, I don't become a Big Brother until next January. A couple of years back, the Bee Gees' Living Eyes was famously pressed as one of the world's first demonstration CDs. Already one of the most successful home computers is the legendary ZX Spectrum, famed for a huge 48k RAM bank, a blisteringly fast 3.5MHz Zilog Z80 processor, an annoying rubbery keyboard (later models would do away with it) and an eye-popping eight-colour graphics display. The Spectrum still uses the audio cassette for long-term storage, although a version with an integrated diskette drive will be produced by Amstrad in the next few years. However, the balding, bespectacled head of the most successful company since DeLorean has bigger fish to fry.
Stepping in early to the next generation of microcomputers - and trying in vain to forestall jokes from particle physicists about the actual size of a "Quantum Leap" - is the latest addition to Sinclair's arsenal, the QL. With its Motorola MC68008 processor and 128k of RAM (a full megabyte is addressable), this is possibly the first home computer to feature the glorious capability to multitask (share CPU time between two or more processes). It also moves away from the audio cassette, storing data instead on a purpose-designed tape system called the Microdrive. Psion provide Sinclair with a suite of office software, and the machine comes with a full operating system (QDOS), although this is still accessed through a version of BASIC. Unsurprisingly for a great British innovation, the computer does not sell. Existing Sinclair customers are incensed that the supposedly fantastic new machine can't play their existing games, while no-one else sees any point in paying out more money for a souped-up machine when their existing 64k Commodore can do everything they need. If they think the QL is on steroids, they're going to cry "Computer Growth Hormone" when they see what's around the corner...
Again, a huge (though not quantum) leap in our tale brings me via 1985 to 1993. For it is in the latter of these years that I will first own a machine launched by Andy Warhol and Debbie Harry on an unsuspecting New York public in the former. The manufacturers are long-time geek buddies Commodore, and the micro is the Amiga, the Betamax to IBM's VHS. The PC has been around for a little while, but it's an ugly, clunky beige box with a silly little monotasking 8088 processor and no graphical display of note. The Amiga, meanwhile, prods buttock with a 7MHz Motorola MC68000, a 4096-colour display, full stereo sound and a relatively new addition to the home computer world: Commodore note how sexy Apple's computers are and bless their new acquisition with a Graphical User Interface. This revolutionary software, designed originally at the copying company, finally removes the micro from the hands of the intellectual elite and opens the computing world to any shaven ape with a minimal attention span.
The end of that last paragraph may seem a little cruel, but it is true. No longer does a computer user need to know how a computer works in order to get their job done. The geek fraternity is diluted and marginalised, programming is now seen as a blacke magick rather than a necessity, and the multitasking operating systems mean that even among programmers, very few still learn the machine language they could until recently almost speak fluently. By providing a layer of abstraction between the computer and the software, operating systems such as the Amiga's Intuition or Mac OS introduce that most unwieldy and satanic of artifacts, the Application Programming Interface. Previously simple tasks such as drawing to the screen are now replaced by tediously long lists of function calls and kernel procedures. Granted, wheel reinvention is significantly reduced, but so is the chance of one person ever fully understanding the workings of a computer.
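To make the contrast concrete, here is a minimal sketch in C. The direct write mirrors how an 8-bit micro's screen was simply memory (on the Commodore 64, the BASIC line POKE 1024,1 put a character in the top-left corner in exactly this way); the "os_" functions are hypothetical stand-ins conveying the shape, though not the letter, of an API like Intuition.

    /* A minimal sketch of the shift. The screen array and the "os_"
     * calls are illustrative stand-ins, not any real machine or API;
     * only the shape of the two approaches is the point. */
    #include <stdio.h>
    #include <stdint.h>

    /* Era one: bare hardware. The screen is just memory, so putting a
     * character on it is a single store. */
    static uint8_t fake_screen[1000];   /* stand-in for video RAM */

    void draw_direct(int offset, uint8_t ch) {
        fake_screen[offset] = ch;       /* one store, job done */
    }

    /* Era two: a windowing OS owns the display, so every access goes
     * through its API. These four functions are hypothetical stubs. */
    typedef struct { int id; } Window;

    Window *os_open_window(int w, int h) {
        static Window win = { 1 };
        (void)w; (void)h;
        return &win;
    }
    void os_set_pen(Window *win, int colour)       { (void)win; (void)colour; }
    void os_write_pixel(Window *win, int x, int y) { (void)win; (void)x; (void)y; }
    void os_close_window(Window *win)              { (void)win; }

    int main(void) {
        draw_direct(0, 'A');                    /* the old way: one line */

        Window *win = os_open_window(320, 200); /* the new way: open...  */
        os_set_pen(win, 1);                     /* ...configure...       */
        os_write_pixel(win, 10, 10);            /* ...finally draw...    */
        os_close_window(win);                   /* ...and tidy up        */

        printf("screen byte 0 now holds '%c'\n", fake_screen[0]);
        return 0;
    }

Even with do-nothing stubs, the API route demands a handle, a pen and a teardown call before a single pixel lands - which is rather the point.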
With the greater processing power, and the need for greater abstraction in software design, programming shifts away from assembly language and interpreted BASIC towards compiled languages such as C, Pascal and (in the scientific community) Fortran. To a limited extent it is now possible to take code designed for one computer, say an Amiga, and build it on another such as a PC or a NeXT. This further reduces the need to understand the target computer's architecture (and no, knowing the endianness of your machine does not count).
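A small illustration of both halves of that claim (a sketch leaning on the modern stdint.h header rather than anything period-accurate): the C source below compiles unchanged for a big-endian 68000 or a little-endian Intel machine, and the one architectural detail it can sniff out is precisely the one portable code is supposed to ignore.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Store a two-byte value, then peek at whichever byte the
         * machine chose to put first in memory. */
        uint16_t probe = 0x0102;
        uint8_t first = *(uint8_t *)&probe;

        if (first == 0x01)
            printf("big-endian, e.g. a Motorola 68000 Amiga\n");
        else
            printf("little-endian, e.g. an Intel-based PC\n");
        return 0;
    }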
Because the major players in the programming world no longer have to worry about conserving every bit of memory, and some (such as the Free Software Foundation) are occupied more with making software portable than with making every cycle count on any specific architecture, program bloat begins to creep in. Unnecessary functionality (the Blinky Light disorder, after "We know it's working, we can see the blinky lights") and inefficient programming - in use of both store and processor time - are the order of the day, and the big O gives way to the big S with a line through it.
Despite technical inferiority and aesthetic sadism, the PC survives the collapse of the Amiga market and the marginalisation of Apple (who are at the moment letting NeXT sow the seeds of their resurgence) to become the dominant home computer architecture of the nineties.