Image by Katharina Brenner

It’s no surprise that the internet is straight. Its basic building blocks are binaries, its moguls are predominantly male, and its algorithms, trained on heteronormative data, are destined to replicate traditional values.

Over the past decade, civil rights groups and individual LGBTQIA+ users have spoken out against the marginalization of the queer community online, which ranges from the seemingly “innocuous,” like shadowbanning queer posts on Instagram, to the dangerous, like limiting access to healthcare and HIV/AIDS information.

In 2018, the watchdog group Citizen Lab found that internet filters manufactured by the Canadian company Netsweeper misidentified and blocked non-pornographic queer content in over 30 countries. In 2020, the ACLU of Northern California took on the case of two queer musicians who claimed Facebook had censored multiple LGBTQ ads. In two reports released in 2022, the Global Project Against Hate and Extremism (GPAHE) found that large technology platforms including Amazon, Google, Facebook, YouTube, and Twitter were letting conversion therapy disinformation run rampant, especially in languages other than English.

Although these tech giants are rarely held responsible, the repeated discrimination against the queer community — by other users and by the platforms themselves — indicates a rampant virtual hostility towards LGBTQIA+ folks. To pretend this phenomenon is coincidental, rather than deliberate, is to misunderstand the fundamental structure of the internet as a whole.

Queerness was conflated with pornography by the alt-right long before the inception of the world wide web. Now, this false conflation is codified (literally), claims film and media studies scholar Alexander Monea in The Digital Closet: How the Internet Became Straight, published by MIT Press. Monea’s book — a social history of anti-pornography rhetoric, an analysis of machine vision algorithms, a testimony to the repeated marginalization of the LGBTQIA+ community, and an exploration of heteronormative design — traces the history of pornographic censorship to situate the current virtual landscape in a larger cultural context. Throughout, Monea demands that internet service providers (ISPs) and social media platforms stop censoring non-pornographic LGBTQIA+ content, while also advocating for broader sexual liberation online. The book’s comprehensive examination of exclusion and discrimination towards the queer community online, situated in a web of conservative policy and heteronormative pornography, reminds us that digital infrastructure not only reinforces bias but can also establish a harmful online world order with physical repercussions.

It was evangelical conservatives, anti-porn feminists, and the alt-right that initially formed a unified front in the war on pornography, sowing the seeds for one of many ways that biases became embedded in the machine vision algorithms that dictate how we navigate content online today. These unlikely bedfellows set the groundwork for online ‘morality’ policing, undertaken nowadays by the National Center on Sexual Exploitation (NCOSE) in the United States. Such context is crucial to understanding the escalating battles over online pornography that have taken place over the past two years, as Monea demonstrates.

In March 2021, Utah Governor Spencer Cox signed a measure that would automatically enable filters blocking pornography and other “material harmful to minors” on all tablets and smartphones sold in the state. This law, which will go into effect if five other states adopt similar policies, appears as a modern-day equivalent of the 1873 Comstock Act, which banned the mailing of pornography, contraceptives, and other sexually explicit content. Just as the Comstock Act was wielded as a weapon for the persecution of gay publishers and consumers across the country throughout the mid-twentieth century, contemporary anti-pornographic rhetoric may be used as a tool for LGBTQIA+ discrimination while failing to curb the production and dissemination of heteronormative pornography, on or offline.

At every level, Monea demonstrates how the coders, code, and moderators responsible for web-scale censorship reify heteronormativity, reproduce this historical LGBTQIA+ exclusion online, and extend the digital infrastructure of common tube sites. From the stock image long used to test the efficacy of image compression algorithms — a Playboy centerfold of Swedish model Lena Söderberg — to the pervasive sexism in the Google workplace, the social context in which code is developed is laden with bias, which unsurprisingly appears in the code itself. The most prominent machine learning content filter, Google Cloud Vision API, “exacerbates sexualization of women’s and female-presenting bodies,” writes Monea, while reinforcing gender binaries. Furthermore, the heteronormative pornographic browsing experience serves as a foundation for digital design, with lightweight controls, discretion, and support for common, repeated actions becoming a mainstay of large-scale platforms and ever-gamified websites looking to increase the addictive properties and usability of their pages. Unsurprisingly, both the form and content of digital infrastructure are averse to non-normative bodies or modes of being online.

Once a form of hypermedia distinct from other digital design traditions, pornography now operates in tandem with search engines, advertisements, and social media platforms. To compensate for the repetitive nature of heteronormative erotic offerings, digital pornography relies on overstimulation, a tactic now employed across the internet by companies hoping to achieve a monopoly in the attention economy. Nowadays, such platforms require only passive engagement in exchange for entertainment, much like early tube sites designed to capture the attention of a partially engrossed viewer.

Repetition of heteronormative content, in pornographic and non-erotic contexts alike, forces invisibility upon the queer community, which often has material consequences. LinkedIn’s alleged censorship of queer content (as Eye on Design’s editors also recently observed while trying to promote a story on a queer publisher) drastically limits the network of LGBTQIA+ professionals. YouTube’s repeated demonetization of trans, gay, and lesbian content has disproportionately impacted smaller-scale content creators, who cannot appeal the ruling without evidence of at least 1,000 views in the past seven days. For queer creators looking to find community or create content online, this form of financial censorship can be entirely inhibitive. This problem has been taken up by queer artists such as Zach Blas, whose transCoder organizes words outside of WordNet’s semantic biases, and trans artist Courtney Demone, whose #DoIHaveBoobsNow? project, a collection of photos of Demone’s breasts while transitioning, confronts the censorship of feminine bodies on Facebook. Still, Monea argues larger-scale change is necessary to build a freer and safer online space.

The Digital Closet concludes by suggesting action items that are at once achievable and urgent. Monea demands vigilance and accountability through data collection, open public discourse on sexual speech, more and better evidence on the impact of sexual speech, an anti-censorship commitment, better adjudication mechanisms, AI explicability, “human algorithm” explicability, and the ability for users to opt out. In addition, Monea outlines a more revolutionary response to the digital closet, with the hope of creating a genuinely free digital space. To this end, he suggests defunding the police, legalizing sex work on and offline, making sex a concern of the welfare state, direct action through community organizing, transitioning communications infrastructure and/or social media platforms into public utilities, and creating a queer digital utopia, entirely distinct from the internet as we know it today.

In many ways, Monea advocates for building a new internet from scratch, arguing that the current one is perhaps broken beyond repair. While I struggle to imagine a new online world that fully abandons its history, I do believe designers and engineers have a responsibility to encourage an online world in which all bodies may be free. To do so may involve rethinking the act of navigation itself, so that users are not nudged through narrow tunnels but rather find the freedom to explore.