The Ecphorizer

The Future of Computers
Bruce Rowe

Issue #70 (September 1987)


Back in the old days (the 60's) computers were large (and expensive) machines. It was big business at both ends. Big business built the machines, and big business used them. It made sense, since only a business with mountains of repetitious work to be done could get the machines to pay for themselves.

Only a cloistered priesthood understood the things. Science fiction seized the concept, carried it to its logical conclusion, and created robots and androids. Everyone assumed, however, that it would be centuries before a computer as powerful as the human brain could be made as small as the brain itself.

In the Seventies, things changed considerably. The concept of a "personal computer" was born and small companies sprang up making them. "Big Business" ignored these upstarts since the machines did not seem to be good for more than a game of "Pong." But these little things began to be surprisingly useful. Adding machine and slide rule manufacturers went into a decline as the price of calculators came down. Swiss watchmakers changed their emphasis from accurate timekeeping to fashion in order to justify their higher prices. The old Kettering ignition system in cars, successful for fifty years, now had a microprocessor controlling it. Some (mostly the people who made the things) predicted that one day every home would have a computer to keep the checkbook, mailing list, and recipe box. While this never came to pass, microprocessors became so useful they kept popping up everywhere.

Enough of this ancient history. What was happening was that the hardware was getting better, faster, and cheaper at an accelerating rate. The lag in software development was not noticed as much — but it was there then and is becoming very apparent now. At the user level, a computer is a simple thing. All you do is push buttons on a keyboard, don't you? A violin is a simple thing too. All you do is pull and push the bow against the strings. With both the computer and the violin, the problem is similar: it takes a while to learn to use the thing.

The problem that hardware manufacturers are running into is that every time the capabilities increase, the instructions become correspondingly more complicated to write. And the capabilities are increasing at an accelerating rate. Apple came out with the Macintosh, by all accounts a wonderful machine. It was about a year, though, before the software caught up. In 1982 Intel unveiled the 286 CPU. Six months ago the 386 CPU hit the market. About eighteen months from now, Intel is planning to come out with the 486 CPU. Each new upgrade represents a big jump in capability and speed. What makes all these upgrades largely academic is that the operating system for the 286 CPU was just unveiled a month ago, five years behind the hardware. The operating system, for those who don't know, is the set of "laws" that tell all the internal components how to talk to each other and work together. It is the operating system that all our word processors, spreadsheets, and databases are built on. Without an operating system to make use of them, all the improvements in RAM, clock speed, etc., are no more a measure of a machine's value to society than an I.Q. score is for a human.

The solution, as I see it, is the self-programming computer, or what is commonly called "Artificial Intelligence." After all, isn't that what happens with humans?
