It’s bad enough that we consume too much. Now we’ve gone ahead and created other beings who have the potential to do the same: robots. That’s right, I give you the self-replicating robot.
Whether you like it or not, robots are going to be produced. So, you can either fight it or help ensure that they will have as little impact on the environment as possible. I’ll go with the latter.
Robots are slowly being integrated into our lives. They regularly appear in movies, on television and in books. They clean our floors, entertain our children, defuse bombs and explore distant planets. Some are even modeled after their makers, acting as greeters and guides. In the future, robots may assist humans with special needs, or perform duties dangerous or undesirable to us. In short, as depicted in the movie I, Robot, robots could eventually become as pervasive as the automobile. While this future is most likely far off, current events dictate that we act now.
Now I have to admit, watching the online video of these simplistic little buggers replicating themselves is cool. But, as I watch, I become concerned. Not only should one ask, "What resources will humans use to create an abundance of advanced robots in the first place?" but also, "How will these robots go about reproducing on their own?" Will they be dependent on humans to "feed" them the ingredients necessary for replication, or will they be autonomous in this respect? I mean, will they start creating their own robot factories? You can see where I'm going with this: robots and humans, along with the rest of Earth's species, competing for already depleted resources.
So, the question is: What set of guiding principles do we hard-code into robots now so they're most likely included in robots produced in the future? To put it another way, do we want the Terminator or the likes of the machines in The Matrix? Or do we want Data, C-3PO and Marvin? I thought so.

Fortunately, we have a starting point: Isaac Asimov's Laws of Robotics. If these sound familiar, perhaps you've seen the movie Bicentennial Man or the aforementioned I, Robot. Fans of Asimov's robot series and Foundation novels can probably recite them from memory:
1. A robot may not harm a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
Asimov later added a Zeroth Law, which supersedes the original three:
0. A robot may not injure humanity, or, through inaction, allow humanity to come to harm.
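For the sake of argument, here's a minimal sketch in Python of what "hard-coding" such a hierarchy might look like. Everything in it is hypothetical and invented purely for illustration (the Action flags, the permitted() check, the law ordering); no real robot runs on a few lines of Python, and the genuinely hard part, deciding whether an action actually "harms a human," is exactly what this toy waves away.

```python
# Toy illustration only: Asimov's laws as an ordered list of checks.
# A proposed action is described by a handful of hypothetical flags,
# and it is rejected as soon as any law in the list flags a violation.

from dataclasses import dataclass


@dataclass
class Action:
    # Hypothetical yes/no descriptions of a proposed action.
    harms_humanity: bool = False   # Zeroth Law concern
    harms_human: bool = False      # First Law concern
    disobeys_order: bool = False   # Second Law concern
    endangers_self: bool = False   # Third Law concern


# The laws, listed from highest to lowest priority, each paired with a
# predicate that returns True when the action would violate it.
LAWS = [
    ("Zeroth", lambda a: a.harms_humanity),
    ("First",  lambda a: a.harms_human),
    ("Second", lambda a: a.disobeys_order),
    ("Third",  lambda a: a.endangers_self),
]


def permitted(action: Action) -> bool:
    """Return True only if the action violates none of the laws."""
    for name, violates in LAWS:
        if violates(action):
            print(f"Rejected: violates the {name} Law")
            return False
    return True


# Example: an order that would harm a human is refused, because the
# First Law check sits above obedience in the list.
print(permitted(Action(harms_human=True, disobeys_order=False)))  # -> False
print(permitted(Action()))                                        # -> True
```

The point of the sketch isn't the code, it's the shape of the problem: the laws only work if something can reliably fill in those flags, and nothing in them says a word about the environment.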
Assuming robots are programmed with these laws: Will they be sufficient to prevent the destruction of our planet's ecosystems and life as we know it? Or will we need additional laws, specific to the environment? I mean, would the Zeroth Law require that robots not engage in any activity that harms the environment, and thus humanity? Would they even intervene in human affairs and stop us from mucking things up? How would they treat non-human life?
These are questions that need answers today, people! Cuz I don't want future generations living underground.