Starting another class 3 thread since the readings/viewings were so broad; this one will focus on just a subset of the subjects: governance and the ethical responsibility of technologists.
The “Unstoppable Code” video was quite an evocative manifesto. One line that Andreas (the speaker) was excited about: “Governance is the killer app”. In my opinion, governance has been “the killer app” for centuries. Governance is a technology for decision making and smart contracts are a further iteration of tools that aid this technology.
Later in the video Andreas discusses how different governing bodies have different morals and laws, and points out how therefore not all laws can be followed at once without contradiction. He then encourages smart contract developers to freely break laws when writing their unstoppable code. But at what point is this a new kind of governance versus a lack of governance if laws and norms set by governments can be ignored? When is this governance and when is this anarchy?
When is the right time for individual smart-contract developers to consider themselves so morally superior to a government’s laws that they should break them? When are we talking about ethics and when are we talking about ego?
I’d like to broaden this conversation to the wide moral responsibilities of technologists, in response to “The Moral Character of Cryptographic Work”.
I read the first parts of this paper instead of cutting directly to the suggested pages 25-27. I started off very skeptical of it, almost offended that it drew a comparison between cryptography and nuclear weapons — tools with the power to destroy life outright and leave long-lasting devastation.
The author introduced the paper with a story about nuclear arms and Bertrand Russell, suggesting that since Russell and other scientists were part of creating these technologies, they had a moral imperative to speak out about their potential harms. However, there is an alternative and more general moral reading of this story: these scientists had power and respect, and therefore a platform to voice what needed to be heard for the public good. Their responsibility to stand vocally for what was right came from their power, rather than from their position in science.
The power of cryptographers is different and more subtle. Most people do not know what cryptography is, or who cryptographers are, in the way that the general (western) public knew about and respected famous scientists like Russell following the second world war.
Despite this, once into the depths of the paper, I did appreciate how the author framed the social importance of cryptography.
Framing: “Surveillance is an instrument of power… While surveillance is nothing new, technological changes have given governments and corporations an unprecedented capacity to monitor everyone’s communication and movement.”
The author also discusses how drone attacks and other acts of warfare follow information gathered by surveillance.
This all raises questions for me about the extent to which people who work on generic tools, like cryptography, without working on their applications, have a moral imperative to consider all of those possible applications. How can a mathematician imagine every way their creation could be used for evil? Should they distract themselves with all the possible uses of one of their creations instead of doing their work? Did Satoshi know that prediction markets for assassinations (e.g. on Augur) would one day operate on the blockchain, and should that have mattered? If humans are set on creating weapons for evil and your tool is a small contribution to one of those weapons, is that a problem? The weapon likely could have been built anyway on top of other tools. For example, assassination markets on Augur use blockchain technology, but if neither Augur nor blockchains existed, these markets would form elsewhere.
Instead of asking these questions, the author goes on to focus on the crippling effects of a potential surveillance state, like that described in dystopian novels, or the one present in North Korea. Do cryptographers have the moral responsibility to build the tools to shield individuals from surveillance? This draws the more general question: Do privileged technologists like us have the moral responsibility to build tools for good, just because we are the ones who can? I think yes. I wonder what others think…