Blockchain Ethics

(class 3 readings) governance and ethical responsibilities of technologists

Starting another class 3 thread, since the readings/viewings were so broad. This one will focus on a subset of the subjects: governance and the ethical responsibility of technologists.

The “Unstoppable Code” video was quite an evocative manifesto. One line that Andreas (the speaker) was excited about: “Governance is the killer app”. In my opinion, governance has been “the killer app” for centuries. Governance is a technology for decision making and smart contracts are a further iteration of tools that aid this technology.
Later in the video Andreas discusses how different governing bodies have different morals and laws, and points out how therefore not all laws can be followed at once without contradiction. He then encourages smart contract developers to freely break laws when writing their unstoppable code. But at what point is this a new kind of governance versus a lack of governance if laws and norms set by governments can be ignored? When is this governance and when is this anarchy?
When is the right time for individual smart-contract developers to consider themselves so morally superior to a government’s laws that they should break them? When are we talking about ethics and when are we talking about ego?

I’d like to broaden this conversation to the wide moral responsibilities of technologists, in response to “The Moral Character of Cryptographic Work”.
I read the first parts of this paper instead of cutting directly to the suggested pages 25-27. I started off very skeptical of this paper, almost offended that it drew a comparison between nuclear weapons, tools that have the power to ultimately destroy life and leave long-lasting devastation, and cryptography.
The author introduced the paper with a story about nuclear arms, and Bertrand Russell, and suggested that since he and other scientists were a part of creating technologies, they had a moral imperative to speak out about their potential harms. However, there is an alternative and more general moral reading to this story of Bertrand Russell and other scientists of his time speaking out. That is that they had power and respect, and therefore a platform to voice what needed to be heard for the public good. Their moral responsibility to vocally stand for what was right was due to their power, rather than their position in science.
The power of cryptographers is different and more subtle. Most people do not know what cryptography is, or who cryptographers are, in the way that the general (western) public knew about and respected famous scientists like Russell following the second world war.
Despite this, once into the depths of the paper, I did appreciate how the author framed the social importance of cryptography.
Framing: “Surveillance is an instrument of power… While surveillance is nothing new, technological changes have given governments and corporations an unprecedented capacity to monitor everyone’s communication and movement.”
The author also discusses how drone attacks and other acts of warfare follow from information gathered by surveillance.
This all raises questions for me as to what extent people who work on generic tools, like cryptography, without working on their applications, have a moral imperative to consider all of their possible applications. How can a mathematician imagine all possible ways that their creation can be used for evil? Should they distract themselves with all of the possible uses of one of their creations instead of doing their work? Did Satoshi know that prediction markets for assassinations (e.g. on Augur) would one day operate on the blockchain, and should that have mattered? If humans are to create weapons for evil and your tool is a small contribution to their weapon, is that a problem? The evil weapon likely could have been created anyway building on top of other tools. For example, assassination markets on Augur use blockchain technology, but if neither Augur nor blockchains existed, these markets would form elsewhere.
Instead of asking these questions, the author goes on to focus on the crippling effects of a potential surveillance state, like that described in dystopian novels, or the one present in North Korea. Do cryptographers have the moral responsibility to build the tools to shield individuals from surveillance? This raises the more general question: Do privileged technologists like us have the moral responsibility to build tools for good, just because we are the ones who can? I think yes. I wonder what others think…


The unstoppable code video makes me itch all over. I can see that some of you really like it, and I know that Rhys likes it too. It should make for good discussions - I hope. It seems to me that Andreas is busy painting a picture of evil powers and casting “we” (Ethereum users) as the ones to fight them. It never seems to occur to him that he could be the evil one. He uses language like: “the cloud: an international surveillance engine where you put your data on other people’s servers, so they can rape your privacy and make billions”.

Andreas is a moral relativist - so am I. From that I draw the conclusion that we need to muddle through using much-maligned committee meetings. As Churchill once said: “Democracy is the worst form of government - except for all the others.” Amen! Andreas’ conclusion is that we need unstoppable code: one true way of doing things that will never again be subject to committee muddling. It’s almost as if he is saying we don’t need five haircuts, only one. He bad-mouths North Korea, but unstoppable code is North Korea. (Just exaggerating to get a reaction.)

Oh, and then he claims the only thing preventing Ethereum from success is evil forces. Well, no, Ethereum has to prove itself, Darwin-style, just like all other new technologies and solutions.

I agree with aberke. The “moral character of cryptographic work” frames the issue within our really existing complex network of power, whereas Andreas has an evil dark power running the world that “we”, the heroes, are out to combat.

Casey’s focus on the centralization-decentralization tension makes sense to me. In addition to his argument, it still needs to be shown that decentralized systems can scale. As for Facebook’s Libra master plan: yes, Facebook is deliberately trying to claim the advantages of centralization and decentralization simultaneously. That vagueness is productive for them, but in my view unethical.