The Legs Race
I was musing on the notion of evolution and dear old John Brunner's quip in his visionary novel of a future dominated by an [inter]Net and its corruptibility: he named the rogue files worms, but they were viruses by another name. In The Shockwave Rider, one of Brunner's protagonists describes human evolution through three stages: first is the legs race, then comes the arms race, and finally -- if we survive stage two -- we enter the brains race.
Evolution is a funny old thing. Haters and disbelievers point to all manner of contentious areas, but love it or hate it, the average man on the Clapham omnibus knows all he thinks he needs to know about Darwinian natural selection when he quotes the catchphrase: survival of the fittest. In the minds of such glib quoters is, of course, the legs race. Fittest means -- must mean, to their undeveloped and under-challenged intellectual analysis -- fitness; fitness of a physical nature: faster, stronger, smarter (because intelligence is a physical commodity too). How wrong they are in their simplistic view. Fittest does not mean what they think -- or want to think -- it means.
Subtract the superlative from fittest and we are left with a simple adjective with a number of similar meanings, only one of which concerns physical prowess: fit to race, for example. The Darwinian sense is suitedness to an environment. A square peg may be physically perfect, but it will never be fit for a round hole. Likewise, a tropical orchid would never be fit for survival in a desert. In other words, evolution has nothing at all to do with races, be they legs, arms, or brains. Brunner's genius was in his whimsical usage. Human post-evolutionary development has fuck-all to do with Darwinian theory.
So ... I was thinking about all this while I waited for my newly reconstituted PC to go through the terrible birthing pains of installing Windows. Jesus H. Christ! A modern PC can copy several DVDs -- and burn them to new media -- in less time than Windows takes to install a paltry few gigabytes onto a virgin hard drive. The new PC is pretty nifty. It's off the scale on the Windows rating, and it runs like the starship Enterprise with a wasp up its arse. But this is where I began to think about evolution and to wonder.
There's an old saw about computing power, popularly known as Moore's law: an ad hoc rule, based only on historical observation, which asserts that computer processing power will double every eighteen months. This "law" has held true for three decades. But as a Joanie-cum-lately to the computer age, with only fifteen years of Windows-based PC experience and another ten years before that on an 8-bit 8080-based machine, I have come fully to appreciate the maxim concerning computer bloat-age.
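Do the arithmetic on that claim: thirty years at one doubling every eighteen months is twenty doublings, and 2 to the power of 20 is roughly a million. A million-fold increase in raw processing power. Hold that figure in mind while reading what follows.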
Bloat is a new law concerning computer "evolution". Back in the days when a "personal computer" had a mere sixteen kilobytes of memory -- for the operating system, the application, and whatever data the application needed to operate on -- software writers had to be ingenious in their economy of resources. Over the years I have bought, or built, ever faster, more powerful, and more memory-stacked machines, and yet their performance has not advanced in line with expectation. Sure, they have been a bit quicker; sure, the graphics have been glitzier and richer, slicker and faster.
My new system boasts a six-core processor and eight gigabytes of RAM. The graphics card is not the hottest item on the market, but it easily copes with all I can throw at it -- including using the machine as a television. But here paranoia sets in. How long will it be before the software wonks make my machine obsolete? Not long at all, is my fear. Twenty years ago, when I first looked into building my own PC, I convinced myself that I would never need a hard drive with more than twenty megabytes. How hysterically risible is that?
Of course the software guys will want to write new programs to exploit the potential of the latest hardware, but do they have to be such precious cunts in the process? 99% of every software routine we will ever need was written in the 1980s, and -- probably -- all the most useful stuff ever written would fit in the memory of my twelve-year-old mobile phone, which is not very large by modern standards. Year after year, I have seen games gain version numbers almost as fast as the calendar, and every time the new, plus-one, version offers a small quantum jump in graphics performance and costs a massive, light-year-sized jump in memory and processing requirements.
I'm not talking total bollocks here. I wrote some pretty cool programs in 1989 on my Amstrad PCW with its massive 256 kilobytes of (total) memory; there was no hard drive. 256 kB was all you had, and half of that was occupied by the operating system. I even wrote a simple flight simulation, complete with an animated runway you had to steer at to land on -- this on a system designed neither for graphics nor animation: the elements were all keyboard characters taken from the 256-character extended ASCII set. I also wrote programs ranging from text-based adventures (mazes with bells, whistles, and things to collect and use) to database/spreadsheet-style things for use as appointment diaries and so on.
The sad truth is that, in computing terms, evolution is going backwards. The faster and more resource-rich the hardware becomes, the lazier and more complacent the software engineers become. I smell capitalism at its worst. If the latest software does not make last year's hardware sweat, then it isn't doing its bit for the [silicon] economy. In case I need to explain myself: the silicon economy can only function if the consumer perceives -- and desires -- the need for greater computing power. The flip side of the equation depends on the "idleness" of the software sector, who ignore the resource-economic solutions of their forebears and instead re-invent simple routines that use significantly more memory than the effective routines devised by their predecessors. This would not be so bad if it wasn't for the fact that in one of the most commercially profitable sectors of the software industry -- computer games -- a title is routinely relaunched with a plus-one number and is seriously broken, because the bastards behind the coding threw out (or ignored) all the code of the previous incarnation -- which worked -- and instead wrote a whole new pile of crap: flawed not only because it is untested, but also because it hogs twice as much memory as the code it replaces.
In short, the whole computer and computer software industry is a giant con-trick. This game/office application/whatever needs a better computer to run, so you buy one. As soon as the hardware companies have sold enough, they release their next -- bigger, better, faster -- product, and the software wonks rush to release their product as version x+1, taking great care to ensure that version x+1 needs double the memory, graphics, and processing power of its predecessor.
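Compound that con over a few cycles: if each version really does double its requirements, then version x+5 demands thirty-two times the resources of version x -- in exchange for five of those small quantum jumps in actual capability.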
Hell. It's not evolution, is it? It's a rip-off is what it is. If evolution -- by any definition -- were involved, by now we'd have computer games offering seriously life-like virtual reality, and business software would be so smart there would be no need for humans!
Yup. Computing in the 21st century is still, very much, in the legs race.