Digital Rights
Compared to other internet users, sex workers pour a disproportionate amount of time and resources into advocating for access: arguing with social media platforms to preserve our accounts, and maintaining the online visibility that sustains our networks with each other and our sources of income. For many of us, the internet is our workplace and a vital source of community and information sharing.
However, governments are pushing platforms to adopt stricter verification and content moderation policies under the guise of online safety. These policies push sex workers out of online spaces (e.g. through deplatforming) and sometimes demand our legal identities, compromising our security.
Data protection is also of serious significance to sex workers, as the linking of our sex work identities with our legal identities can lead to further discrimination such as loss of access to banking and payment processors.
Deplatforming, censorship, shadowbanning, classifying our advertising as X18+ or R18+ (aka Class 2 content), using AI to out us, and writing Terms and Conditions to exclude us are not ways to make the internet a safer place.
Background
Online technologies have the potential to improve income, health, education and community connection for sex workers.
In recent years, and particularly since the beginning of the COVID-19 pandemic, many sex workers have taken up online forms of sex work as we navigate an increasingly precarious labour environment and the need to diversify our income streams. Sex workers also use online tools to create and distribute health information and harm reduction resources, share safety strategies and build community connections.
Alongside the expansion of sex work into online spaces, in-person sex work also increasingly depends on internet technologies. Like other workers and businesses, sex workers rely on them for everyday business activities such as advertising, social media, communications, payment processing, banking and administration, along with other services that have moved to digital or virtual delivery.
‘Online safety’ or sex worker erasure?
Over the last decade, sex workers have experienced wide-scale loss of access to online tools and services. This has largely been a result of law and policy seeking to hold the tech sector responsible for what governments perceive as online and real-world harms, and the tech sector’s responses to those changes. Examples include the Allow States and Victims to Fight Online Sex Trafficking Act 2018 (US) and the Stop Enabling Sex Traffickers Act 2018 (US) (the laws known as FOSTA-SESTA), as well as the United Kingdom’s Online Safety Act 2023 (UK), and the Online Safety Act 2021 (Cth) in so-called Australia. These laws essentially force tech services to implement crude content moderation practices that result in the deplatforming of sex workers, and the loss of the digital tools we use in the course of conducting business and keeping our communities safe, informed and connected.
Australia’s Online Safety Act 2021 (Cth) also established the Office of the eSafety Commissioner, a non-elected regulator with powers to develop and implement industry codes and standards for:
- social media services (e.g. X, formerly Twitter)
- search engines (e.g. Google and Bing)
- relevant electronic services (e.g. WhatsApp)
- designated internet services (everything else, e.g. Dropbox, ChatGPT).
Industry codes and standards cover things like how adult content (also known as Class 2 content) is regulated, how illegal or non-permitted content is detected and removed, and how tech companies should check users’ ages (age verification or age assurance). Scarlet Alliance has engaged with the Office of the eSafety Commissioner on behalf of sex workers in unceded Australia – outlining how bad tech regulation hurts our income, safety and community. These concerns have largely been dismissed or ignored.
Threats to Privacy and Encryption
Proposed eSafety industry codes and standards (aka rules) require online platforms to implement measures to detect, remove, disrupt and deter illegal content such as child sexual abuse material (CSAM) and pro-terror content.
For messaging and cloud storage platforms, this means content must either be scanned during transmission (which is impossible where end-to-end encryption is used) or scanned on the sender's and/or receiver's device before or after sending (known as client-side scanning). This has massive privacy and practical implications. Scanning technologies include:
- AI detection, where an AI model reviews the content and determines whether it is illegal (e.g. whether a video depicts a child or an adult)
- Hashing, where a unique fingerprint (hash) of the content is matched against a database of known illegal content; if the hash matches, the content is flagged.
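To illustrate the hashing approach described above, here is a minimal sketch in Python. It assumes a simple cryptographic hash (SHA-256) and a hypothetical database of known hashes; deployed systems such as Microsoft's PhotoDNA or Apple's NeuralHash instead use perceptual hashes, which also match near-duplicate images.

```python
import hashlib

# Hypothetical database of hashes of known prohibited content.
# (Here it contains only the SHA-256 digest of the bytes b"test",
# purely for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest appears in the database."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES
```

Note that changing a single byte of a file produces a completely different cryptographic hash, which is why real systems rely on fuzzier perceptual matching, and that fuzziness in turn is one source of the false positives described below.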
Both methods have been widely criticised by privacy and security researchers, digital rights advocates and human rights groups around the world. In 2021 Apple announced that it would scan iCloud photos for CSAM, only to abandon the plan, conceding in late 2023 that it could not be implemented effectively. In 2022 Google flagged as CSAM a photo a father had taken of his child to send to a doctor for medical advice; it reported him to police and blocked his Google account. Even after police cleared him of any wrongdoing, Google refused to reinstate the account.
AI, algorithmic bias and shadowbanning
Tech services often use algorithms (AI technologies) to detect, hide and remove sex worker accounts and content. These new technologies are simply not able to make fair decisions – they can’t distinguish between sex and sex education, or between consensual and non-consensual porn. They are not developed to take into account that sex work is decriminalised in many parts of the world, and that in some areas sex workers have anti-discrimination protections. They replicate anti-sex work laws, even where such laws do not exist.
Research shows that practices such as shadowbanning and deplatforming have significant consequences for sex workers, including harm to mental health, discouragement from political activity, barriers to circulating health and safety information, and reduced ability to work independently and autonomously.
The use of algorithms to discriminate against sex workers online has real-world consequences. Airbnb is known to use algorithms and flawed tech to identify and exclude users based on presumptions about risk or illegality; PayPal's algorithms automatically flag ‘sexually oriented’ materials and services; and Stripe proactively shuts down sex work business accounts, citing ‘financial risk’.
Algorithms or AI used by tech companies to detect if someone is a sex worker could include:
- tracking travel movements
- linking credit card data
- profiling based on age, gender, ethnicity, and consumer data
- face matching from scanning sex worker advertisements.
Any algorithm or AI system has baked-in assumptions and discrimination about who is ‘risky’ or ‘deviant’ and worthy of being banned. There have also been reports of friends and family members of sex workers having their accounts taken down.
The Australian Government is currently reviewing the Online Safety Act 2021. Scarlet Alliance will make a submission to this review; check this page for updates or follow our socials to find out more about our submission and how tech law and policy impacts sex workers in Australia and internationally.
The internet is a workplace
Regulation should intend to empower internet users of all ages to have safer spaces online. Sex worker online presence is not in competition with the interests of children and young people. Many sex workers are also parents and carers, whose livelihood depends on the existence of age-appropriate online spaces for all internet users.
Further Reading
Tech discrimination in unceded Australia
It’s sex discrimination: banks strip brothels and escort agencies of their rights Crikey (20 May 2020)
Sex workers fear a new wave of deplatforming — and the proposed Online Safety Bill ABC News (20 February 2021)
Rushed through Parliament: sex workers’ concerns about the Online Safety Act SBS News (1 March 2021)
How Banks are Exploiting a Loophole to Legally Discriminate against Sex Workers Junkee (10 November 2021)
Australian sex workers have been banned from Linktree. What they are doing is not illegal Vice (21 January 2022)
High Risk Hustling: Payment Processors, Sexual Proxies, and Discrimination by Design City University New York Law Review (2023)
Threats to Privacy and Encryption
Local and international organisations urge Australia’s eSafety Commissioner against requiring the tech industry to scan users’ personal files and messages Digital Rights Watch (20 December 2023)
Apple’s Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy Wired (31 August 2023)
A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal The New York Times (21 August 2022)
Google refuses to reinstate man’s account after he took medical images of son’s groin (23 August 2022)
Inside Apple’s impossible war on child exploitation (7 September 2023)
AI, algorithmic bias and shadowbanning
This is how AI bias really happens — and why it’s so hard to fix MIT Technology Review (4 February 2019)
Instagram’s murky “shadow bans” just serve to censor marginalised communities The Guardian (9 November 2019)
‘What exactly is shadow banning?’ Bustle (2 August 2020)
Posting into the Void Hacking//Hustling (October 2020)
Report says shadowbanning is real–and it’s suppressing sex workers Daily Dot (16 Oct 2020)
Censorship of Marginalised Communities on Instagram Salty Algorithmic Bias Collective (27 September 2021)
“There is no standard”: investigation finds AI algorithms objectify women’s bodies The Guardian (8 February 2023)
FOSTA-SESTA and tech discrimination
Sex workers are canaries in the free speech coal mine Buzzfeed News (8 April 2018)
Erased: The impact of FOSTA-SESTA and the removal of Backpage on sex workers Hacking//Hustling (2020)
“I’ve Never Been So Exploited”: The consequences of FOSTA-SESTA in Aotearoa New Zealand Anti-Trafficking Review (2020)
Are you ready to be surveilled like a sex worker? Wired (27 June 2022)
Sex Workers Have Been Banned From Airbnb for Years. Will You Be Next? The Nation (26 November 2022)
Financial Discrimination and the Adult Industry Free Speech Coalition (May 2023)
Maximising tech safety for sex workers
Digital Security: The Smart Sex Workers Guide Global Network of Sex Work Projects (October 2021)