As the Christmas season of 2002 approaches, and I anticipate the refinement of facilities in my newly completed computer museum, I start to take stock of the real meaning of the cyber revolution. Having taught a course about a year ago on virtual reality and its implications, I find new insights as I view the hundreds of winking, flashing, and beeping machines that reside on my shelves. The first thing I notice is the ergonomic factor: the interface between human and machine as defined by countless tinkerers over the last forty-odd years, all trying to define what a computer is. All kinds of keyboards (not necessarily QWERTY) invite the human touch, along with many mysterious "F" keys. Terminals at first, and later video screens of increasing resolution, beckon. Tactile devices (mice, trackballs, joysticks, and other digital tools) permit not only the entry of data but real interaction. With increasing sensitivity, these things react to me, and the networked ones react to each other. That is, of course, the real qualitative difference that the last generation has made: this new reality may be one of two things. 1) It is a new way of extending our intellects and senses. 2) It is merely a new way of realizing what we already possess.

My museum really takes off from the ending point of the legendary 1971 Charles and Ray Eames exhibit at IBM in New York. In that exhibit the emphasis was on number crunching and data entry, and the graphic display was barely a topic, since machines of the day were still struggling with the size of the bit-word (the 8-16-32-64 bit ladder had not been standardized). Even when 8-bit words first emerged in the earliest minicomputers in the '70s, not enough information could be moved to create true graphics.
The new computer museum is curious in that the whole picture is presented in a span of time defined by less than one lifetime, unlike museums of art, sculpture, or even the automobile. In addition, everything there is the result of a small number of revolutionary devices: the vacuum tube, the transistor, the integrated circuit, and the microprocessor (I am indebted to Alfred Chandler, author of Inventing the Electronic Century, for this insight). Suddenly the telescoping of notions of progress, from cycles of centuries down to months, hits home. The museum is a real, accelerating time machine. Even the process of miniaturization is not perceived as a linear one: the original Compaq luggable and the Osborne 1 show a move in that direction at the outset, and many Japanese electronics firms were already marketing so-called "pocket computers" (really glorified calculators with replaceable, dedicated ROM chips).
What can one learn from a visit to this place? Most importantly, from the genesis of the computer (a word which initially meant a person or persons who sat in a room and did calculations), all the inventors, scientists, geeks, nerds, and other computer aficionados were dealing with the man-machine problem from the start. The new cognitive artifact had to serve as well as teach. It had to embody but not eclipse human nature. It had to surpass but not replace people. Lastly, the whole perceptual epiphany that occurred with the advent of immersive technology had to be assimilated in a constructive way by the man on the street. Modern cognitive scientists like Sherry Turkle have been investigating the impact of the emergence of multiple personalities in chat rooms and the ultimate fragmentation of a single personality in extended confrontation with virtual reality. Others have discussed the difference in perception between the film and video realms, the real physical differences that occur in front of a video monitor. Probably no one could have predicted that the computer would change the nature of perceived reality and offer up this version as one of choice (do I see the ghosts of Descartes and Bishop Berkeley rising up in consternation?).
Seeing the development of all of this cultural complexity in one room was an overwhelming experience for me, even though I created it. It was a grotesque extension of the pseudo-intimacy that arises when a single person logs on to the internet looking for book bargains and winds up reading an article on Tibetan chant. Virtual space appears limitless. Physical time has been conquered: why get in your car to go to the bookstore when there is Amazon.com? I also find that this room has a reassuring familiarity about it: a comfortable hangout for old friends, even though they may be made of silicon and metal and have voices like rubber bands.
A few other things struck me as I wandered in the sea of blinking lights and buzzing drives: this history is also a story of the evolution of industrial design. We have become complacent in the me-too sea of beige towers and slate-slab laptops (Apple excluded), but in the past, when there were hundreds of computer manufacturers, there were also hundreds of interfaces. Some of the earliest micro- and minicomputers had unmarked keys arranged in a decorative but nonstandard array, and the user put a cardboard template over the keys, selecting what each key would do in each program, loaded from cassette tape. In different countries, even the placement of critical keys (like the return/enter key) was geared to the way the written and spoken language looked on paper (I had an experience with a Fujitsu OASYS all-in-one machine with a Japanese OS where it took me a week to find the enter key). The look of some of these dinosaurs (particularly the SONY and Apple machines) summons up visions of the really great sci-fi films of the fifties. Even machines like the 1975 IMSAI 8080 (of WarGames fame) or the early SYM-1 would probably not be recognized as computers by anyone under the age of 30. One of my favorites is the Husky Hunter, a military pocket computer that was waterproof and could be dropped three stories and still compute. It looked like the machine that would be on the passenger seat of a Hummer. Although this design revolution is really an American thing, machines from the UK, Russia, Japan, and the Scandinavian countries added amazing features like built-in printers, cassette drives, some kind of networking via the telephone handset, and so on.
So, what were the developments? First there were mainframes and minicomputers: they were always reached by terminals that plugged into the serial port (all data in a line, one bit at a time). Networking was accomplished the same way. The desktop PC put the terminal and CPU together, but the interface still involved what many call "mysterious typing" - commands invented by dominatrix cyber-Nazis dedicated to making sure those commands could never be remembered. With the Xerox Alto a "desktop" was created, a simulacrum of a real desk with files, drawers, and so on. The visual validation of virtual space brought the average Joe into this mysterious universe, in which the measurements of Anna Nicole Smith and the IRS tax form of Henry Kissinger look basically similar in machine language. As I gaze at the Atari 520 ST, the Amiga 1000, or the Apple Lisa, I am struck by the simplicity of the desktop. Also, full color has yet to appear: the Amiga offers only a limited number of bit planes (N planes yield 2 to the Nth simultaneous colors). I am also amazed by the stability of these machines: they really don't crash, they don't have elaborate shutdown routines, and, in the case of many S-100 machines, they are not that slow, even though the CPUs run at 4 MHz, because the bus speed (the rate in and out of the ports to peripherals) is high.
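To make the bit-plane arithmetic concrete, here is a minimal sketch of my own (not code from any of these machines; the function name and loop are purely illustrative) showing how the number of planes sets the size of the palette on a planar display.

import itertools

# Minimal sketch: on a planar display, each bit plane contributes one bit of
# every pixel's color index, so N planes can address 2**N simultaneous colors.
def colors_from_bitplanes(planes: int) -> int:
    """Return how many simultaneous colors N bit planes can index."""
    return 2 ** planes

for planes in range(1, 7):
    print(planes, "bit plane(s) ->", colors_from_bitplanes(planes), "colors")
# 5 planes give 32 colors, roughly the ceiling of the early Amiga's standard modes.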
Those familiar with computer lore know about widgets and shared libraries: these are objects used by more than one software manufacturer to make a program user-friendly and run on a particular platform. As a result, most software looks the same today: on the left is "File" (Open, Print, Save, Save As, and so on), then Edit, and so forth. Before the days of widgets, proprietary software had to generate its own desktop and look. Much of this approaches peasant art, or what Peter Wollen (in Raiding the Icebox) calls tourist art. I did not expect to be so entranced with the variety of desktop worlds, including the virtual worlds of early computer games. All that stuff for the Commodore VIC-20 and 64, the Atari 800XL, or the Timex Sinclair had a distinctive look, much unlike today's, in which Lara Croft of Tomb Raider will look the same on a PlayStation or an Xbox.
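For a sense of how a shared widget library hands every program the same look, here is a minimal sketch using Python's Tkinter toolkit (my own illustration, standing in for any shared library rather than the historical toolkits mentioned above; the menu labels are simply the familiar ones).

import tkinter as tk

# Minimal sketch: a standard "File" menu built from the toolkit's shared
# Menu widget. Any application using the same library gets the same look for free.
root = tk.Tk()
menubar = tk.Menu(root)
file_menu = tk.Menu(menubar, tearoff=0)
for label in ("Open...", "Print...", "Save", "Save As..."):
    file_menu.add_command(label=label, command=lambda choice=label: print(choice))
menubar.add_cascade(label="File", menu=file_menu)
root.config(menu=menubar)
root.mainloop()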
There is a last gedankenblitz that appeared to me like the Ghost of Christmas Yet to Come: the whole computer thing went from an obscure, limited, geek pursuit (remember the conclusion of the ARPANET gurus 40 years ago that 30 networked computers would be quite enough) to the greatest explosion in the expansion of mass perception, complete with a real generation gap in perception (I remember a conversation with my 85-year-old mother last year in which she proclaimed, "The senior citizens are NOT interested in computers," after a bunch of them had wrecked a PC donated to the local senior citizen center). I was reminded of the Amish, who decry anything not mentioned in the Bible.
What I did not expect from this almost decade-long excursion into vintage computer collecting was the realization that human nature has an almost limitless capacity for expansion. We depend on our limitations to give us a sign. Then we develop a space shuttle that travels beyond our atmosphere, a computer that calculates ocean tides, cell phones that surf the internet, or submarines that chart the ocean floor. It is my own personal view that the major contribution of modern Western culture will be the cornucopia of technology that we have spilled upon the land. The Italians had their Renaissance, the British their Colonialism and Industrial Revolution, but the Americans of the McDonald's burger, hip hop, and the TV talk show will leave a legacy of techno-evolution of the species: we will find out what we can really do, at last.