In the movie, as the computer Colossus absorbs new knowledge, it grows emotional and demanding, yearning for its human creators to acknowledge it as a superior entity. Because Colossus is a computer in charge of the American national defense system, it holds control over America's nuclear arsenal, and it does not hesitate to threaten the entire world with it. As the movie plays out, it is repeatedly stressed that the computer was built to be devoid of emotion, making it capable of entirely unbiased decisions. Yet even as it kills off individuals because they are no longer useful, and kills thousands with nuclear bombs simply because its connection to another computer is blocked, Colossus still has emotions, a whole range of them, in fact. Colossus is portrayed as more of a petulant child with grandiose ideas than a rational machine programmed to protect those it works for. This emotional display makes Colossus: The Forbin Project the direct antithesis of Jaron Lanier's argument in his article, "The First Church of Robotics."
Lanier's frustration with emotional robots stems from the fact that they cannot logically exist. We personify machines and robots on our own because we are social creatures who interact more comfortably with robots when we can relate to them. Realistically, though, a robot doesn't function at all like a human. A robot is created to follow a program with defined terms. Computers were made to carry out specific functions, and nothing more. You can't use your calculator program to draw an intricate landscape, and you can't multiply numbers in iTunes. A surgical robot cannot teach itself to speak, let alone to think, as Colossus is portrayed as doing. And, for that matter, just because a robot can recognize a face and follow it with whatever "eyes" it was built with does not mean it is looking at a person the same way we would. A computer cannot "want," as it wants Doctor Forbin in the film; it cannot ask "why" something happens. It simply accepts that something happens or doesn't happen, and acts based upon whatever it was programmed to understand from that outcome. Jaron Lanier writes, "When we think of computers as inert, passive tools instead of people, we are rewarded with a clearer, less ideological view of what is going on — with the machines and with ourselves. So, why, aside from the theatrical appeal to consumers and reporters, must engineering results so often be presented in Frankensteinian light?" When we anthropomorphize machines, we imagine them to have all of the strengths and weaknesses the human condition allows: the imagination to create and to negotiate, but also the power and will to kill and to destroy. This creates fear, paranoia, and a general mistrust of robots, as their different way of "thinking" makes us unable to relate to them, and thus unable to predict their actions.
Thus, the movie gives Colossus far too much power through its display of emotions. In the last five minutes, the computer tells its creator that they will work together whether he likes it or not, and that he will grow to love Colossus. This demand for reverence and emotional acknowledgment is exactly what those who believe in the singularity fear: that computers will decide what's best for us based upon an algorithm and will refuse to be negotiated with.
Lanier believes that singularity adherents are letting this misconception cloud their vision of the technological present and future. He argues that this paranoia keeps us from allowing technology to advance where it is most needed, and that by slowing its progress we deprive the world of the aid it needs. Lanier claims that "Technology is essentially a form of service. We work to make the world better. Our inventions can ease burdens, reduce poverty and suffering, and sometimes even bring new forms of beauty into the world."
Lanier believes in the vast positive changes heightened technology can afford, whereas Colossus and those who believe in the singularity hold that heightened technology can only bring about disaster. It seems to me that the singularity is a moot point: why worry about machines taking over if we don't need to anthropomorphize them in the first place? Why put all our focus on making humanoid robots anyway? Binary code will never be able to translate emotion, and machines will never be able to feel; we are merely projecting ourselves onto them by imagining that they can. Machines have always followed, and will continue to follow, the algorithms they are set to follow.