I, robot? No, you cyborg!
Date: June 17, 2006
By: Gary Kemble
If you have been stocking up on heavy high energy plasma rifles in anticipation of the looming battle with the Terminators, or have been studying up on the Voight-Kampff Test to help uncover rogue replicants, Terry Dartnall has some bad news for you - the cyborgs are already here, and you are one of them!
You say that we're cyborgs. What do you mean by a cyborg?
The concept has changed since Clynes and Kline coined the term in 1960. They meant someone who's been modified and augmented to survive in a harsh environment, such as space travel. Electronic inserts in the body would combine with external mechanisms to bypass lung-based breathing, control heart rate and blood pressure etc. "Cyborg" now includes creatures like the Terminator, who isn't a cyborg in the traditional sense. He's a robot.
Humans don't look like the Terminator (except for that Californian governor)
We've modified our minds rather than our bodies. Modifying our bodies would be flashy, but it wouldn't be deep. What's deep is the way we've modified our minds through a series of mindware upgrades, acquiring increasingly sophisticated forms of cognitive technology. The first cognitive technology was probably language, but we're not sure. Since then we've upgraded our mindware through writing, different types of printing, and we're about to engage in an intimate relationship with machines. This augmentation makes everything that we hold dear possible: art, science, our exploration of the universe and ourselves. It makes us characteristically and distinctively human. Our wetware, without the technology, is good at interacting with the world and at pattern recognition. For anything else we need the technology. If we built a Terminator that had only the abilities of our naked brains it would be plain boring. One of the charming things about him is the way he learns language - hasta la vista, baby!
I'm still not convinced that using language, wearing a watch or using a PC means that we're cyborgs
Language was probably the first big step. We're genetically very similar to other primates, but at the same time we're very, very different. Language seems to be the key. We had to be smart to develop it in the first place, but once we had it we were drawn upwards in a virtuous circle from one upgrade to another. Getting onto the upgrade ladder was the big step. If we ever do become flashy, ostentatious cyborgs there'll be less distance between them and us than there is between us and other animals.
But I don't think we'll become ostentatious cyborgs. Cognitive technologies are largely invisible in use, and the more sophisticated they are, the less visible they become. Telephones used to be big, clunky things. Now they fit into your pocket or your ear. They enable us to see and talk to people on the other side of the world by bouncing signals off satellites. When integrated with the web they'll enable us to access pretty much all of human knowledge - through a piece of plastic in our ear.
We'll soon have personalised webbots scurrying about on the web retrieving information that might interest us - a sort of ongoing subconscious activity spread over the web. We're becoming increasingly integrated with these things. We take them for granted: language, writing, books, pictures, phones, calculators, laptops, the web. We don't notice them - until we have to do without them, and then we experience something like brain damage.
Alzheimer's patients can cope surprisingly well in their homes because they "scaffold" their environments with lists and photographs, always on open display, of friends, family, things they have to do. When removed from these environments and taken into care they deteriorate rapidly, because the support has been removed from an already compromised host. I don't think we're very different. Imagine what it would be like to lose your laptop, or your ability to write, or your ability to use language.
This is about reperceiving ourselves. We had to reperceive ourselves when we discovered that we're animals that have evolved from more primitive organisms over a long period of time. What a wonderful reperception that was! Now it's time for another reperception. We're strange animals - animals that have modified their minds with external technologies. Even without the technologies we're what I call "bioborgs." We have toolkit brains containing beautifully coordinated subsystems that get on with the job of interacting with the world, only sometimes reporting back to consciousness. This leaves us ripe for mechanical augmentation.
Another thing is that our brains use the world as an external memory store. Rather than storing the knowledge that Bill has blue eyes we have high-level knowledge of where Bill's eyes are and the ability to zoom in and grab the information when we need it. It's computationally cheaper that way. Putting these things together, it's hardly surprising that we developed cognitive technologies that enable us to perform cognitive operations in the world.
Is that what you mean when you say that mind loops out into the world?
Yes. The cyborg thesis overlaps with something called "the extended mind hypothesis," associated most obviously with the philosopher and cognitive scientist Andy Clark. The hypothesis says that when we solve a problem with pen and paper, or a pocket calculator, mind loops out into the world. And not only cognitive processes - cognitive states do so as well. The standard example is Otto's notebook. Otto suffers from Alzheimer's disease. He hears that there's an exhibition at the Museum of Modern Art. He consults his notebook, which says that the museum is on 53rd Street. He walks to 53rd Street and goes to the museum. The hypothesis says that the notebook plays the same role for Otto that biological memory plays for the rest of us. It just happens that this information lies beyond the skin.
His belief is out there in the world?
And not only that. He believed the museum was on 53rd Street before he looked it up, courtesy of the functional isomorphism between the notebook entry and a corresponding "entry" in biological memory. If something is stored in biological memory we say that someone knows it before they look it up-before they retrieve it from memory. What difference does it make if it's stored in a notebook, rather than in biological memory? What matters is ease of access.
Suppose you have a car accident. You wake up in hospital and a doctor tells you that some of the information that was stored in the wetware of your brain has been transferred to silicon chips. The chips have been implanted in your brain. From your point of view nothing has changed. Your memories are intact. If this happened we would say that some of the cognitive states that used to be in your wetware are now in the chips. It makes no difference to you, just so long as you have easy access to the information.
Now suppose that the chips weren't put into your head but were inserted into your shoulder, or put into a bank vault. (In this case they would need to be linked to your brain by something like radio.) So long as you have easy access to the information it doesn't matter where the chips are - head, shoulder or bank vault. If they're outside your head then some of your cognitive states are outside your head - in your shoulder or in a bank vault.
That was the point of the wrist watch example, by the way. It illustrates our relationship with externally stored knowledge. If someone asks you if you know the time you say that you do and then you look at your watch, just as you say that you know something and then retrieve it from long-term memory. You know the time, even though the knowledge is out there in the world.
Is there a downside to any of this? Do you foresee any problems with our increasing cyborgisation? Will it increase the gap between rich and poor?
The educational implications are puzzling and possibly problematic. What happens when we can easily access all of the world's knowledge? Will this be the end of learning, and exams? Will we become intellectually lazy? Some say that we will still need to be able to ask the right sorts of questions and problem-solve in the right kind of way. But can't we be given this ability as well? Can't we be given built-in AIs as well as built-in databases? I suppose that would leave us as high-level administrators. Maybe it wouldn't be that much different to how things are now, when our minds work on a problem overnight and give us the answer in the morning.
Will it increase the gap between rich and poor? I don't know. We're not looking at expensive surgery here and presumably the upgrades will be cheap enough. It might close the gap because we won't need an expensive education any more. The rich and powerful might try to control the technology but the indications are that knowledge is becoming increasingly available to everyone. And knowledge is power. I'm optimistic about the future.