An indictment of economists and engineers
A piece on morality and software engineering.
Recently I had an exchange on Twitter with a friend I originally met at an un-conference. That exchange inspired this post, because the subject is important to me.
So, here’s what happened: He was responding to a tweet about how our smart TVs are watching us watch TV - and I mean this quite literally. They record video and audio, sending our private data, images and recordings to the manufacturers, or heaven knows where.
He expressed that he - especially as a developer - feels particularly helpless, because he spends his days working for corporations doing exactly the thing he despises: building bare-minimum products that are just useful enough to attract users in order to harvest their private data and sell it, when he would much rather spend his time and effort playing for the other team.
I responded - not very delicately - that there is always the option of saying NO. By this I meant saying NO to requirements that make me build a system that tracks its users. Saying NO to storing data beyond the point where it’s actually needed. In short: saying NO to building a system that victimizes its users.
He pointed out that he “has fought this fight many times” and that the only thing he achieved was “straining his interpersonal relationships” (I’m paraphrasing; this is not a literal quote!). I share his observation.
At the root of this discussion lies an interesting moral question that we as software engineers should be well versed in and have a strong opinion on (but - unfortunately - usually don’t): At what point do I refuse to implement functionality that goes against my moral convictions?
Morality refers to an individual’s own principles about what they deem right and wrong. It stands in contrast to ethics, which refers to rules provided by an external source like religion and/or society. It is for this reason that we have immediate influence on our morals, but only little influence over ethics. So, how do we apply this distinction to the situation at hand?
My friend and I agree that mining, storing and trading our customers’ private data is morally wrong! But being morally wrong does not make something ethically wrong. Our globally connected society has decided that it is ethically justified to have people unwittingly trade their private data in exchange for (mostly free) services.
I have gotten to know most engineers as idealistic creatures. We actually believe that we can make the world a better place, and that we know how. We love taking on the world’s complex problems to provide solutions, and we do so with little regard for our own gain. The fact that there are billions of lines of code published under permissive open source licenses like MIT or CC speaks to the depth of our idealism.
If we implement functionality that infringes on our customers’ interests, like unnecessary tracking, boundless data collection and retention, or downright theft of personal information, we do so because our hand is forced. And if it is, it’s usually economists doing the forcing. They have been taught two fallacies in their business schools: firstly, that it is ethically right to trade on the unconscious ignorance of their customers and to take (and in many cases downright steal) personal information in order to trade it for additional financial gain. And secondly, that what is ethically right must be morally right. It’s a neat trick: erasing the distinction between ethical and moral wrongdoing removes the remorse for immoral actions.
If this sounds like an indictment of economists, then you are right on the money (pun intended). But if we indict economists, we must leave some space at the defendants’ table. If economists are indicted for the lies they learnt at fancy business schools, then we too, the ones who do know better, deserve to be indicted alongside them. Not necessarily for the things we’ve built (which might enslave entire generations and their data to unaccountable corporations), but for letting ourselves be blinded, persuaded, forced or bullied into doing something we knew to be against our moral convictions.