Computer Predictions

RIVER BENDER - October 2005

It's a bad idea to say something can't or won't ever be done, but history is full of bad predictions, especially in technology and computers. We'll mention some of the bummers and follow up with a few current predictions. You might end up wondering if anybody can make an accurate prediction.

Guess who said, "I think there is a world market for maybe five computers"? It was allegedly Thomas Watson, longtime head of IBM, in 1943, but I suspect it's just an urban legend, much like Bill Gates's alleged comment in the '80s that "Nobody needs more than 640K of RAM."

As you know, ENIAC was the first general-purpose electronic computer. Here's a prediction that appeared in Popular Mechanics in 1949: "Where a calculator on the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and weigh only 1.5 tons." Thank goodness we did better than that.

It was the microchip that revolutionized the size of computers but in 1968 an engineer at the Advanced Computing Systems Division of IBM commented on the microchip by saying "But what...is it good for?"

Western Union had a wonderful opportunity to get into the early telephone business, but it passed after the following comment appeared in an internal memo in 1876: "This 'telephone' has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us."

British scientist Lord Kelvin (William Thomson) was a pessimist in 1899 when he predicted, "Radio has no future. Heavier-than-air flying machines are impossible. X-rays will prove to be a hoax."

Here's one I love: "So we went to Atari and said, 'Hey, we've got this amazing thing, even built with some of your parts, and what do you think about funding us? Or we'll give it to you. We just want to do it. Pay our salary, we'll come work for you.' And they said, 'No.' So then we went to Hewlett-Packard, and they said, 'Hey, we don't need you. You haven't got through college yet.'" That was Apple Computer founder Steve Jobs on his attempts to get Atari and HP interested in his and Steve Wozniak's personal computer.

Even smart people screw up: "There is not the slightest indication that nuclear energy will ever be obtainable. It would mean that the atom would have to be shattered at will." Albert Einstein in 1932.

OK, enough about past predictions that went awry. What do the experts tell us about the future and should we believe them? I recall in my AT&T days mentioning "bubble memory" to my college professor and all he said was "I'll believe it when I see it." It turned out he was right. Bubble memory from Bell Labs apparently died on the vine. Too often predictions are overly ambitious.

"Hard drives are living on borrowed time and will be replaced with solid-state Flash memory," says Samsung's semiconductor CEO Dr. Chang-Gyu Hwang. Sure they will, but when? I've been hearing about flash memory for years, and yes, it's in cell phones and other portable gadgets, but nowhere near the original predictions. There's no question we've got to get rid of mechanical hard drives just as we did tape drives, but will they be replaced with flash memory?

People continue to talk about Moore's Law, popularly quoted as computer speed doubling every 18 months (Gordon Moore's original observation was actually about the number of transistors on a chip). The pace shows little sign of slowing, and nowadays we see desktop PCs with gigahertz processors, gigabytes of memory, and tens of gigabytes of hard drive, with prices dropping. But you know, I don't see all that much increase in speed when I turn on my PC. Windows XP is so bloated that it probably uses up all the extra speed and memory. I'm sure the race will continue between faster speed and larger software, and we'll keep talking about Moore's Law, but some folks may eventually say, "Whoa, I want off the fast track."
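It's worth seeing how quickly that "doubling every 18 months" compounds. Here's a small illustrative Python sketch (the 18-month figure is the popular version of the law, not a precise engineering constant):

```python
# Illustrative arithmetic only: how "doubling every 18 months" compounds.

def doublings(years, months_per_doubling=18):
    """How many doublings fit into the given span of years."""
    return (years * 12) / months_per_doubling

def growth_factor(years, months_per_doubling=18):
    """Overall speed multiplier after that many doublings."""
    return 2 ** doublings(years, months_per_doubling)

# Over a single 18-month period, speed exactly doubles:
print(growth_factor(1.5))   # 2.0

# Over 10 years, the doublings compound to roughly a hundredfold:
print(round(growth_factor(10)))   # 102
```

That hundredfold figure is why a decade of Moore's Law feels so dramatic on paper, even if bloated software eats much of it in practice.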

There's a lot of talk about making computers faster by using "quantum computing." Present computers handle information in binary form, representing everything as zeroes and ones, and have circuit elements that can be either turned off (0 state) or on (1 state). In quantum computing, the elements representing data are called quantum bits, or qubits, and can be in a combination of both state 0 and state 1 at once. This enables a number of possibilities to be computed simultaneously, thus speeding up processing. Is this going to happen any time soon? Who knows?
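To get a feel for why qubits are interesting, here's a toy classical simulation in Python — an illustration only, not real quantum hardware. It shows the standard bookkeeping: n qubits are described by 2 to the nth power amplitudes, and the chance of measuring a given bit pattern is the squared magnitude of its amplitude.

```python
# Toy classical simulation of qubit bookkeeping (illustration only).
# n qubits are described by 2**n amplitudes; the probability of
# measuring each bit pattern is the squared magnitude of its amplitude.

def equal_superposition(n):
    """Amplitudes for n qubits in an equal mix of all 2**n states."""
    size = 2 ** n
    amp = (1 / size) ** 0.5          # each amplitude is 1/sqrt(2**n)
    return [amp] * size

def measurement_probabilities(amplitudes):
    """Probability of observing each basis state (these sum to 1)."""
    return [abs(a) ** 2 for a in amplitudes]

state = equal_superposition(3)       # 3 qubits -> 8 amplitudes at once
probs = measurement_probabilities(state)
print(len(state))                    # 8
print(round(sum(probs), 10))         # 1.0
```

Notice that adding one more qubit doubles the number of amplitudes the machine holds "at once" — that exponential growth is the source of the speed-up claims, and also why simulating quantum computers on ordinary PCs gets hard so fast.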