The Ecphorizer

The "Whither Computers" Game
George Towner

Issue #51 (November 1985)



Everybody knows that the introduction of personal computers has created a whole new class of computer users. Several million people now have in their homes the kind of computing power that ten years ago belonged only to large companies and institutions.


I believe that in the long run, the principal impact of the PC revolution will be on the design and application of computers themselves. Most big mainframes today perform the same kinds of work that they did in the 1950s; but the little machines are doing things that Turing and Von Neumann never dreamed of. The new class of computer users has spawned a new class of software and hardware designers.

As a result, speculating about the next stage of computer applications is one of the hottest games in Silicon Valley today. What can these clever gadgets do? I enjoy such speculations, and my profession gives me a slight edge in making them. So this article represents my current thinking in the "Whither Computers" game.

It seems to me that all the present uses of computer hardware fall into three categories: information handling, event controlling, and personal interaction. Many applications bridge these classes. Of course it's possible that a whole new kind of computer usage will pop up some day. But while we're waiting for that to happen, there is a lot of room for innovation just within the three categories.

Information handling is what most people think of when you mention computers. The machine takes in symbols — usually, words and numbers — and puts out the same kind of things. In between it "processes" this "data" by calculation, rearrangement, storage and retrieval. What you are now reading, for example, was arranged in the form you see by an Apple III computer.

Yet the possibilities of information handling are limited only by the ways we human beings can encode information. Language and numbers were the traditional modes, but new ones are springing up. Magnetic strips and bar codes, for instance, have become a standard way that computers identify physical objects. In effect, proper names have been added to the computer lexicon.

In a related area, image processing has now blossomed. Design (CAD/CAM), drawing, and publishing programs help the user create images on paper to any degree of complexity and finish. TV serials and even whole movies are ground out by computers. Machines routinely "enhance" satellite images. Yet much more is possible. Once you treat a letter (for example) as an image rather than as a word stream, you can imagine many useful ways it could be analyzed, manipulated, edited, and filed by a computer.

A similar explosion is waiting to happen in audio processing. Sounds can be treated as one-dimensional images and processed the same way. Some progress has been made in synthesizing music and speech, as well as "enhancing" such things as noisy recordings. But treating a phone call as an audio image that can be handled by a database program (for example) is yet to come.
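
To make the analogy concrete, here is a minimal sketch in Python (the sample rate, tone, and filter width are arbitrary choices made purely for illustration) of a recording treated as nothing more than a one-dimensional array of numbers, "enhanced" by the same neighbor-averaging filter one might run across a row of image pixels.

    import numpy as np

    # A sound, to the computer, is just a one-dimensional array of samples:
    # here, one second of a 440 Hz tone buried in noise, at an assumed 8 kHz rate.
    rate = 8000
    t = np.arange(rate) / rate
    noisy_tone = np.sin(2 * np.pi * 440 * t) + 0.3 * np.random.randn(rate)

    # The same neighbor-averaging filter used to smooth a row of image pixels
    # "enhances" the recording by averaging the noise away.
    window = np.ones(5) / 5
    smoothed = np.convolve(noisy_tone, window, mode="same")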

Event controlling was actually the first widespread application for proto-computers. Industrial "process controllers" were the earliest machines that made decisions on the basis of a stored program. Today most production lines would cease to function without computers. In the consumer area, computers are increasingly visible in transport. Aircraft depend on them; in fact there are military planes now that are too unstable to be flown by mere human reflexes. And of course computers are the latest thing in cars and in city traffic control.

Anyone who has a VCR or autofocus camera is familiar with computerized appliances. A friend of mine recently installed a "smart" household thermostat; it took him two days to learn how to program all its functions. Thus the possibilities in this field seem to be limited only by the patience of the user. But I believe that the future here depends on integrating smart appliances — at least the non-portable ones — into existing "user-friendly" computer systems. My friend's thermostat and his VCR (plus a dozen other appliances yearning to be smart) could more conveniently be controlled from a single keyboard. This kind of development, however, would require designing new computer "peripherals," an area that many hardware engineers find messy. Much of the potential in this area has already been explored in science fiction treatments of the "electronic house."
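
As a sketch only, the "single keyboard" might look something like this in Python; the appliance names and their one-line commands are invented for illustration, and a real system would of course need real drivers and peripherals behind each entry.

    # One command loop dispatching to whatever appliances happen to be attached.
    appliances = {
        "thermostat": lambda setting: print(f"Thermostat set to {setting} degrees."),
        "vcr":        lambda setting: print(f"VCR programmed to record at {setting}."),
    }

    def command(line):
        """Parse a line like 'thermostat 68' and hand it to the right appliance."""
        name, _, setting = line.partition(" ")
        handler = appliances.get(name)
        if handler is None:
            print(f"No appliance called '{name}' is connected.")
        else:
            handler(setting)

    command("thermostat 68")
    command("vcr 8:00 Thursday")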

Personal interaction with computers had a sudden vogue in video games. But after all the wham-bam had died down, many people discovered they were more comfortable interacting with a computer than with other human beings. I suspect that a lot of amateurs write programs not because they need the results, but because they enjoy that peculiar kind of intercourse with something resembling another mind. As "expert systems" and high-level languages become more humanoid, they will become an increasingly important factor in the "quality" of our computer usage.

"Interactive entertainment" is a current buzzword for amusements in which viewers participate in and control the action. For example, a mystery story on the TV screen stops periodically and asks you (as one of the characters) what to do next. It then branches to a sequence of events appropriate to your reply. You can watch (play?) it dozens of times without ever repeating the same plot.

This is a powerful idea, and one which is just beginning to unfold. I see the branches becoming finer and less perceptible, until the interaction seems continuous; the plotting becoming less B-movie and more sophisticated; and ultimately the computer building most of the story as it goes along, instead of merely branching to canned segments. I see new computer languages in which the author's creations and the user's interactions can be expressed. I see something approaching classical literature, but with a whole new discursive dimension. It would be a welcome change from today's television, which force-feeds simplistic ideas in one direction to utterly passive viewers.
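
A minimal sketch of such a branching amusement, with an invented five-scene mystery standing in for the real thing, might look like this in Python:

    # A toy branching mystery: each scene shows some text, asks the viewer
    # what to do, and jumps to whichever scene matches the reply.
    # The story content here is made up purely for illustration.
    story = {
        "start": ("A scream comes from the library. Do you 'investigate' or 'hide'?",
                  {"investigate": "library", "hide": "closet"}),
        "library": ("The butler stands over the body. Do you 'accuse' or 'search'?",
                    {"accuse": "arrest", "search": "clue"}),
        "closet": ("You hear footsteps pass by. The mystery goes unsolved.", {}),
        "clue": ("A torn note names the gardener. Case closed.", {}),
        "arrest": ("The butler confesses on the spot. Case closed.", {}),
    }

    scene = "start"
    while True:
        text, choices = story[scene]
        print(text)
        if not choices:
            break
        reply = input("> ").strip().lower()
        scene = choices.get(reply, scene)  # an unrecognized reply repeats the scene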

On a less dramatic level, there is plenty of room for computers to become companions. Horrors! you say. But if a dog or a goldfish (or a teddy bear), why not a smart machine? After all, everybody loved R2D2. At the very least, I can imagine a screen in the corner of the living room displaying the image of a lop-eared mutt. From time to time it scratches a flea, barks for attention, plays with its ball. When you talk to it, it wags its tail and sits up. For many people this would serve 90% of their needs for a pet, without hair in the carpet or a pooper-scooper.
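
A toy version of that mutt, again only a sketch whose behaviors and timing are invented for illustration, could be written in a few lines of Python:

    import random
    import time

    # Left alone, the dog picks an idle behavior at random; spoken to
    # (typed at, in this sketch), it wags its tail and sits up.
    IDLE = ["scratches a flea", "barks for attention", "plays with its ball",
            "naps in the corner"]

    def run_pet(turns=5):
        for _ in range(turns):
            said = input("Say something to the dog (or press Enter to ignore it): ").strip()
            if said:
                print("The dog wags its tail and sits up.")
            else:
                print(f"The dog {random.choice(IDLE)}.")
            time.sleep(0.5)

    if __name__ == "__main__":
        run_pet()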

So this is how I play the "Whither Computers" game. It's fun because no matter how blue-sky your imagination, events will prove that you were conservative. If you don't believe me, just re-read this article ten years from now. 


GEORGE TOWNER, glassy-eyed from staring at too many computer screens, tries in this issue to convince you that these machines are worth something. Our readers, people of Mensa intelligence, know better. You can read about George's latest book here! Ed. Note 2009: And of course things have taken turns even George couldn't have imagined in 1985; for example, you're reading this article thanks to your web browser. Before you lies the end product of several processes: scanning the original printed page into an image file, translating that image into editable text, putting the text into a form your browser can read, and finally copying the file to a server whose address is ecphorizer.com. All this using an old Apple Macintosh computer and a couple of software programs.
