GPL & Asimov’s First Law
Ars Technica reports on an open source project called GPU. The purpose of this project is to provide an infrastructure for distributed computing (i.e. sharing CPU cycles). The developers of this project are apparently pacifists, and they’ve modified the GPL (the GNU General Public License, which is the primary license for open source software) to make that clear. One of the developers explains it thusly: “The fact is that open source is used by the military industry. Open source operating systems can steer warplanes and rockets. [This] patch should make clear to users of the software that this is definitely not allowed by the licenser.”
Regardless of what you might think about the developers’ intentions, the thing I find strangest about this is the way they’ve chosen to communicate their desires. They’ve modified the standard GPL to include a “patch” which is supposedly for no military use (full text here). Here is what this addition says [emphasis mine]:
PATCH FOR NO MILITARY USE
This patch restricts the field of endeavour of the Program in such a way that this license collides with paragraph 6 of the Open Source Definition. Therefore, this modified version of the GPL is no more OSI compliant.
The Program and its derivative work will neither be modified or executed to harm any human being nor through inaction permit any human being to be harmed.
This is Asimov’s first law of Robotics.
This is astoundingly silly, for several reasons. First, as many open source devotees have pointed out (and as the developers themselves even note in the above text), you’re not allowed to modify the GPL. As Ars Technica notes:
Only sentences after their patch comes the phrase, “Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.” This is part of the GPL, and by modifying the license, the developers seem to run afoul of it. The Free Software Foundation has already contacted them about the matter.
Next, Asimov’s laws of robotics were written for autonomous beings called robots. This might seem obvious to some, but apparently not to the developers, who have applied them to software. As Ars notes: “Code is not an autonomous agent that can go around bombing people or hauling them from burning buildings.” Also, Asimov always alluded to the fact that the plain English statements of the laws (which are what the developers used in their “patch”) just gave you the basic idea of what each law did; the code that implemented this functionality in his robots was much more complex.
Third, we have a military for a reason, and its purpose extends far beyond bombing the crap out of people. For example, many major disasters are met with international aid delivered and administered by… military transports and personnel (there are many other examples, but this is a common one that illustrates the point well). Since this software is not allowed, through inaction, to permit any human being to be harmed, wouldn’t the military be justified in using it, if not actually required to? Indeed, this “inaction” clause seems like it could cause lots of unintended consequences.
Finally, Asimov created the laws of robotics in a work of fiction as a literary device that allowed him to have fun with his stories. Anyone who has actually read the robot novels knows that they’re basically just an extended exercise in subverting the three laws (eventually even superseding them with a “zeroth” law). He set himself some reasonable-sounding laws, then went to town finding ways to get around them. For crying out loud, he had robots attempting to murder humans throughout the series. The laws were created precisely to demonstrate how foolish it was to have such laws. Granted, many fictional stories with robots have featured Asimov’s laws (or some variation), but that’s more of an artistic homage (or parody, in a lot of cases). It’s not something you put into a legal document.
Ars notes that not all the developers agree on the “patch,” which is good, I guess. If I were more cynical, I’d say this was just a ploy to get more attention for their project, but I doubt that was the intention. If they were really serious about this, they’d probably have been a little more thorough with their legalese. Maybe in the next revision they’ll actually mention that the military isn’t allowed to use the software.
Update: It seems that someone on Slashdot has similar thoughts:
Have any of them actually read I, Robot? I swear to god, am I in some tiny minority who doesn’t believe that this book was all about promulgating the infallible virtue of these three laws, but was instead a series of parables about the failings that result from codifying morality into inflexible dogma?
And another commenter does too:
From a plain English reading of the text “the program and its derivative work will neither be modified or executed to harm any human being nor through inaction permit any human being to be harmed”, I am forced to conclude that the program will not through inaction allow any human being to be harmed. This isn’t just silly; it’s nonsensical. The Kwik-E-Mart’s being robbed, and the program, through inaction (since it’s running on a computer in another state, and has nothing to do with a convenience store), fails to save Apu from being shot in the leg. Has it violated the terms of its own license? What does this clause even mean?
Heh.