I just read an article in the Science Times by Cornelia Dean about current research into “intelligent robots” that could potentially “behave more ethically in the battlefield than humans currently can.” The article quotes a researcher named Ronald C. Arkin, from Georgia Tech. Arkin’s basic hypothesis is that the stressors of wartime — including, but not limited to, fear, anger, anxiety, the loss of colleagues, and the handling of corpses — are likely to cause human soldiers to abuse noncombatants and to violate the Geneva Conventions and rules of engagement. Okay, war is hell, and we all know that it often brings out the worst in us. But are autonomous, armed robots, entrusted with the power to make real-time battlefield decisions, really a viable answer? (I’d like to point out that the article I read and all of its scientific contributors were very even-handed about the potential moral and ethical issues here; nevertheless, the research is funded and underway.)
Why pursue autonomous robots when we have solid “drone” technology already in place? The point is not to relieve ourselves of the potentially damning moral and ethical dilemmas of wartime, is it?! I don’t think so. The point is to reduce the number of dead, disfigured, and mentally destroyed human beings that wars churn out by the thousands. What we want to do is live up to the standards we’ve tried to put in place. Replacing actual human beings with “autonomous robots” isn’t likely to do this. What I think it would do is deaden us to the very grim realities that we are trying to avoid. And just in terms of logic, doesn’t drone technology already eliminate, in at least as much theory as that supporting Arkin’s research, the very stressors that he claims cause us humans to behave immorally and unethically in times of war in the first place?! The point of drones is to replace the precious lives of actual human beings with metal. But the decisions are still in our hands. Replacing those decisions with more metal is, in my opinion, antithetical to the principles we’re trying to uphold.