Science fiction has long inspired real-world technology, but have the authors of sci-fi stories finally run out of steam? Stuart Andrews investigates
From the earliest days of Jules Verne and HG Wells, science fiction and technology have enjoyed a mutually beneficial relationship. Sci-fi stories and novels expressed man’s desire to conquer space, find new worlds or explore the ocean depths, and while man would probably have landed on the moon or launched deep-sea expeditions without them, these tales inspired those who made such giant leaps.
In turn, real-world technology has inspired the science-fiction writer. After all, it’s science fiction that charts what happens when humanity meets high technology, asking what will happen, where it will take us, and what we’ll find when we get there. This is as true of computer technology as it was of the space race. Perhaps even more so.
The geek and hacker cultures that have powered so much of the PC and internet revolution are hugely sci-fi literate. Writers and experts have even swapped roles, with academics and software engineers becoming sci-fi writers, and the writers earning names as futurologists.
In this feature, we’ll explore how science fiction has motivated trends and products in computing, and catch a glimpse of where this relationship might take us in the future.
Visions of the future
Does sci-fi really have that great an impact on the technology that emerges from the labs of the world’s biggest technology companies? Labs that are so well funded (Microsoft alone spent $8 billion on research last year) that they can afford to scoop up the brightest talent emerging from MIT and beyond? Indeed it does, according to Bruce Hillsberg, director of storage systems at IBM Research in Almaden. For him, the value of science fiction is that it “paints visions of the future that cause people to think about possibilities beyond what is possible today”.
Hillsberg believes the fact so many hi-tech visionaries are sci-fi fans, tied to fiction’s power to stimulate creative thought processes, means that an interest in the genre can lead to real breakthroughs. “I don’t believe sci-fi necessarily sets the agenda for researchers,” said Hillsberg. “That is, I don’t think most researchers try to invent what they read about or see in movies. Rather, they try to move science or technology forward, and sci-fi can consciously or unconsciously help them think outside the box.”
History bears out his theory. Do a little digging and you’ll be surprised to find how many big names in the computing world are sci-fi fans: Apple’s Steve Wozniak, Netscape’s Marc Andreessen, Tim Berners-Lee, Google’s Sergey Brin and the GNU project’s founder Richard Stallman, to name only a few of the tech elite. Microsoft co-founder Paul Allen has even helped fund a museum of science fiction in Seattle.
To Hal and back
It wasn’t long into the history of computing that the sci-fi greats began to see technology’s potential. During the 1950s, Isaac Asimov wrote a sequence of stories featuring Multivac, a huge, artificially intelligent computer, culminating in the classic The Last Question – a tale that tracks the evolution of Multivac and the human race.
Asimov recognised that computers would grow both smaller and more powerful, with Multivac transforming from a sprawling giant into an entity that exists outside of space and time. He merely underestimated the timescale – Asimov thought it would take thousands of years for Multivac to shrink to a vaguely mobile form.
Computing owes an even greater debt to Asimov’s contemporary, Arthur C Clarke. In his work on the 1968 film and novel 2001: A Space Odyssey, Clarke created HAL, the model for all future dedicated, logical, mildly psychotic AI. In creating HAL, Clarke and director Stanley Kubrick sought guidance from Marvin Minsky, co-founder of the Artificial Intelligence Laboratory at MIT. In turn, the film would inspire a new generation of engineers and designers, including a young Rodney Brooks, who would go on to be director of that same institution.
In the book HAL’s Legacy: 2001’s Computer as Dream and Reality, Brooks describes the movie as “a revelation, because I grew up in a place without a lot of technology and I was really interested in AI and then to see that movie, it told me that there were other people in the world with the same sort of weird ideas that I had.” For Brooks, “the film really inspired me and pushed me to push my whole life towards Artificial Intelligence”.
Clarke also influenced the man who would go on to create the World Wide Web. In a 1997 interview with Time magazine, Tim Berners-Lee mentions a youthful fascination with Clarke’s 1964 short story Dial F for Frankenstein, where computers networked together pass a critical threshold and learn to think autonomously. In the interview, Berners-Lee makes it clear that he doesn’t see the web as the fulfilment of Clarke’s prophecy, but he does see it as having emergent properties with the potential to transform society – and 12 years later, he’s been proven right.
Full Article | PC Pro