Not long after I published "King Google's Song", a moderator removed the #blacklivesmatter tag I had included. It has since been restored, and I wholeheartedly thank them for that. However, it being Juneteenth, I'm in the mood to do something I don't usually do: explain my art. The poem is abstract, so if a mod didn't get it, then there are probably many out there who didn't. What follows is the note I sent the Dev team disputing the tag's removal. It's been edited and expanded a bit, because I wrote the original in about 20 minutes while sleep-deprived. (Sorry!) It is, more than a little bit, a screed. An enjoyable one, I hope.
I do not believe in singular interpretations of art, let alone in an author's fictitious ability to prescribe art's meaning. Presenting this piece as an attempt to do exactly that would be a terrible insult to me.
Aside from this, my preferred way to celebrate Juneteenth is to read some of the narratives of former slaves. The US government collected thousands of them. Enslaved people were people, and the best way to understand their humanity is to read their life stories in their own words.
Ahoy.
I'm disputing the removal of the #blacklivesmatter tag from my poem "King Google's Song". The message associated with the removal said that the mod didn't understand the poem's relevance to Black Lives Matter. I do not take the removal personally, because the poem is somewhat abstract, but I tagged the poem for what it is. The tag should be reinstated.
At a base level, the poem is about a poor black woman being subjected to a Google interview. She is dismissed by the interviewer because she does not fit his preconceptions about who a programmer must be. (That is, a white man who runs certain software and can do whiteboard interviews in his sleep.)
Her name, Katherine Clay, is more than just alliteration. It is an allusion to famous African-Americans in science history, Katherine Johnson and Roy L. Clay, Sr. To be honest, I used Johnson's name because it fit the pattern, not because she had much to do with computer science. The point was that cutting people out because of the color of their skin means that companies and computing as a whole are constantly missing out on potential greats.
Deriving the poem from "King Herod's Song" connects Katherine to the trial of Jesus Christ, an impoverished miracle worker, before King Herod, a wealthy autocrat, as depicted in the musical Jesus Christ Superstar. Before the interview, Katherine has had to perform God-like feats because that was the only way she was ever going to get King Google's attention in the first place. Once she has made it into the room at the start of the poem, her skills are met with open, sarcastic hostility. Her identity as a programmer is met only with doubt. She is called a liar in spite of what everyone knows about her and everything she has ever accomplished for her community before she had to participate in this hazing ritual at the mercy of a rich white man.
There are ways to take the Jesus analogy further, and there are imperfections in the way the poem captures it, but I will mention only one more point because it is so indirect. The header image above the poem is an 1860 woodcut depicting the Massacre of the Innocents. This is the Bible story in which King Herod's father tried to kill Jesus as a baby by ordering the murder of every child under the age of two. The culling of black people's opportunity begins when they are born, long before the in-person interview. The domination of software development by white programmers was a deliberate cycle initiated by white men of the past. (It later became more open to white women, but white women are not diversity.) Deliberate self-interest in that legacy, which gives white programmers birthright career advantages, looms large over the white programmers who perpetuate it in the present.
It is wrong for companies to cut people out of opportunities because they don't have certain backgrounds. We know that this practice is widespread. The Google Interview called out by the poem's title is literally designed to emphasize interviewers' biased gut feelings about candidates' performance under artificial pressure, even though the actual job is nothing like that. Most software jobs do not require the onerous mathematical specialization it overtly gauges. No software job requires the fuzzier qualities it subtly gauges. For example, it filters for people who have the time and inclination to endlessly drill as training for the Google Interview, which is widely acknowledged to mean white men. It also filters for people who already have comfortable, high-class, low-stress living arrangements, which is widely acknowledged in America to mean white people and particularly white men. Further, the interview process usually begins with a referral from an employee at these monolithically white companies, and white people are widely acknowledged to have monolithically white social networks. The Google interview process tests for previous success in an environment where opportunities to succeed have been monopolized by the white race, and we all know it.
Software engineers operate in a nightmarish, Kafkaesque, white-supremacist hellscape. People in power launder this fact with the thin wash of "meritocracy" provided by technical interviews. If this "meritocracy" were going to increase the diversity of tech companies, it would have happened already. It has not. The technical teams at Big N companies remain monolithically white, and the share of black employees in technical roles has not increased by more than one or two percentage points in the last decade. Some companies, like Google, have even changed their reporting to hide the exact racial makeup of their technical teams. Amazon does the same, cravenly lumping together its well-paid, white tech employees and its systemically exploited, marginalized non-white warehouse workers. Apple didn't even publish a diversity report for 2019. (Peep the "December 2018" fine print.) This all comes down to institutionalized racism in hiring. To them, I say: Black Lives Matter. Hire black people. We are already ready.
I do not reject the poem's potential appeal to other communities (such appeal as it would ever get). The point that interviewers favor candidates who match their preconceptions is obviously broader than the black woman I chose as the perspective character. Gay people, for one, are also widely discriminated against by tech companies. But much like Black Lives Matter itself: while there is pain all over, reading "King Google's Song" evokes black pain for me right now.
I tagged the poem #blacklivesmatter because the poem says "Black Lives Matter." Please put it back.
Thank you for your consideration,
Mike Overby