"If you are neutral in situations of injustice, you have chosen the side of the oppressor."
-- Desmond Tutu
Programming is not -- and has never been -- apolitical or amoral.
From 2015 to 2018, the world's largest tech companies spent a collective $582 million lobbying the United States Congress. Tech is, and has always been, shaped by the moral and ideological goals of particular people and groups.
Tech is shaped by the government through regulation and standardisation, but the government is also shaped by the tech industry. Algorithmic gerrymandering has been a problem in the United States for decades now, and researchers are looking to correct it with -- what else -- different algorithms. This close-knit relationship between government and computing has existed as long as there have been computers. For instance...
Konrad Zuse, the creator of the world's first programmable computer and first high-level programming language, Plankalkül, was financed by the Nazi German government. He never disavowed their support or expressed regret for furthering the Nazi war effort.
Alan Turing, the father of theoretical computer science and namesake of the Turing machine and Turing test, cracked Nazi codes, shortening WWII and saving an estimated 14-21 million lives. Turing was then chemically castrated by the British government for the crime of being homosexual and later committed suicide.
ENIAC, the world's first general-purpose digital computer, was used by the United States government to "produce ballistics tables and refine hydrogen bomb designs".
Child labor is still commonly used to mine the materials that go into electronics. Apple, Google, Dell, Microsoft, and Tesla have all been accused of relying on child labor to source their raw materials.
Social media giants Facebook, Twitter, and YouTube were famously used to organise protests during the Arab Spring of 2010-2012, leading to the toppling of several regimes in the region.
Facebook has been used to facilitate genocide in Myanmar. Facebook has also been criticised for allowing pro-rape groups, providing a platform to promote anti-semitic terrorism, and picking and choosing which political groups and parties to allow on their website.
Employee mistreatment has led to so many suicides at Chinese electronics factories -- where products like the iPhone are made -- that factory operators have installed suicide nets to discourage workers from jumping off the roofs of buildings.
A recent study seems to confirm what many have begun to suspect -- that YouTube's "algorithmic amplification" can lead to radicalisation, pushing controversial content and leading viewers down a rabbit hole of outspoken conservative -> alt-lite -> alt-right videos and channels.
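To make "algorithmic amplification" concrete, here is a deliberately crude toy model -- not YouTube's actual system, whose details are not public -- in which a recommender does nothing but maximise predicted engagement, and engagement is assumed to peak just beyond the viewer's current taste. Those two assumptions alone are enough to pull the viewer steadily toward more extreme content.

```python
# Toy model of engagement-driven recommendation drift.
# Every number and assumption here is made up for illustration, not a real system.

# A hypothetical catalogue of videos, each with an "intensity" score in [0, 1],
# where higher values stand in for more provocative content.
catalogue = [i / 100 for i in range(101)]

def predicted_engagement(intensity, taste):
    """Assumed engagement model: viewers engage most with content slightly
    more intense than what they already like."""
    return 1.0 - abs(intensity - (taste + 0.1))

taste = 0.2  # the viewer starts with fairly mild preferences
for step in range(51):
    # Greedy recommender: maximise predicted engagement, and nothing else.
    pick = max(catalogue, key=lambda v: predicted_engagement(v, taste))
    # Watching nudges the viewer's taste toward what was recommended.
    taste = 0.9 * taste + 0.1 * pick
    if step % 10 == 0:
        print(f"step {step:2d}: recommended intensity {pick:.2f}, taste is now {taste:.2f}")
```

Run it and the recommended intensity climbs from 0.30 to 0.80, even though no line of the code mentions radicalisation; the drift is an emergent property of optimising for engagement alone.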
Amazon scrapped a resumé-screening tool because of its sexist biases and has had issues with automatically delisting LGBT works as "pornography", in addition to their horrific history of labor abuses.
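The resumé-screening failure is worth making concrete, too. According to press reports, the tool learned to penalise resumés containing the word "women's" because the historical hiring data it was trained on did. The sketch below is a fabricated, bare-bones stand-in for that mechanism (a keyword hire-rate lookup, nothing like the real model): train on biased decisions and the bias comes back out as a seemingly neutral score.

```python
# Fabricated example of learned bias -- not Amazon's data or model.
# The "training data" encodes a historically biased hiring process:
# equally qualified candidates were hired less often when the word
# "women's" appeared on their resume.
history = (
    [("women's", False)] * 70 + [("women's", True)] * 30 +
    [("other", True)] * 60 + [("other", False)] * 40
)

# "Training": score each keyword by its historical hire rate.
hire_rate = {}
for keyword in ("women's", "other"):
    outcomes = [hired for kw, hired in history if kw == keyword]
    hire_rate[keyword] = sum(outcomes) / len(outcomes)

print(hire_rate)  # {"women's": 0.3, 'other': 0.6}

# "Screening": resumes are filtered by the learned score. Nobody wrote an
# explicitly sexist rule, but the model reproduces the old process's bias.
def passes_screen(keyword, threshold=0.5):
    return hire_rate[keyword] >= threshold

print(passes_screen("women's"))  # False -- filtered out
print(passes_screen("other"))    # True  -- passed through
```

No one intended the sexism; it was inherited from the data.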
Tesla and Uber are confronting a real-life trolley problem in their programming of autonomous vehicles. Are the passengers' lives more valuable? Or pedestrians'? Should the government dictate the ethics of driverless cars? Or should a person be allowed to personalise their vehicle's "ethics settings"?
The code we write is biased, whether we intend it to be or not. Refusing to acknowledge this fact does nothing to solve the problems it creates. We must actively work to counter our biases and use code for good, not evil.