Staring at a face reflected in the gleaming surface of a new iPad, or hearing Siri's synthesized voice answer questions on an iPhone, it's easy to imagine that there's something more than microchips and wires inside those smart machines. There isn't. But scientists envision a day when computing devices and their software will not only be as smart as the humans who designed them but be able to upgrade themselves. Then what happens?
At a recent and somewhat whimsical panel discussion in Austin, Texas, a trio of robotics experts disagreed on whether events are leading inexorably to a Hollywood-style battle between robots and their creators. But they raised an issue to grapple with today: whether humans should treat their increasingly lifelike machines humanely.
The relentless advance of technology is driven largely by the increasing power of microprocessors. As that power grows, so does the reach of the researchers pushing the limits of artificial intelligence. It's just a matter of time before the Jeopardy!-dominating capabilities of IBM's Watson supercomputer are a routine feature of cheap PCs; at that point, panelist William Hertling said, hobbyists and scientists together will develop machines that rewrite their own software and eliminate their own shortcomings.
It's impossible to predict where that sort of "technological singularity" might lead. But Hertling and fellow panelist Daniel Wilson — who have written apocalyptic novels about the future of artificial intelligence — said we will have plenty of time to avert a global robot rebellion. Panelist Chris Robson, a mathematician who once worked for Hewlett-Packard on self-modifying hardware, wasn't so sure, saying the era of machines that improve themselves has already arrived.
In the meantime, all three agreed, smart machines pose moral quandaries that earlier gadgets didn't. It's not that machines have rights that must be respected. It's that there's something corrosive about abusing a device that mimics a living creature. As Wilson put it, society might consider it okay to smash a toaster but not an interactive toy with a synthetic personality, even if it doesn't really have feelings. Tolerating that kind of abuse risks raising a generation of people inured to cruelty.
The downside is that the more we treat machines as our friends, the easier it will be for them to enslave us. But then, smartphones appear to have done that already.
© 2012 the Los Angeles Times