Digital Rights

Compared to other users of internet technology, sex workers pour a disproportionate amount of time and resources into advocating for access: arguing with social media platforms to preserve our accounts and maintaining the online visibility that sustains both our networks with each other and our source of income. For many of us, the internet is our workplace, and a vital source of community and information sharing.

However, governments are pushing platforms to adopt stricter verification and content moderation policies under the guise of online safety. These policies push sex workers out of online spaces (e.g. deplatforming) and sometimes demand our legal identities, compromising our security.

Data protection is also critical for sex workers, as the linking of our sex work identities with our legal identities can lead to further discrimination, such as loss of access to banking and payment processors.

Deplatforming, censorship, shadowbanning, classification of our advertising as X18+ or R18+ (aka Class 2 content), using AI to out us, or writing Terms and Conditions to exclude us, are not ways to make the internet a safer place.

Background

Online technologies have the potential to improve income, health, education and community connection for sex workers.

In recent years, and particularly since the beginning of the COVID-19 pandemic, many sex workers have taken up online forms of sex work as we navigate an increasingly precarious labour environment and the need to diversify our income streams. Sex workers also use online tools to create and distribute health information and harm reduction resources, share safety strategies and build community connections.

Alongside this expansion of sex work into online spaces, in-person sex work also increasingly relies on internet technologies. Like other workers and businesses, sex workers depend on them for everyday business activities such as advertising, social media, communications, payment processing, banking, business administration and other services that have moved to digital or virtual delivery.

‘Online safety’ or sex worker erasure?

Over the last decade, sex workers have experienced wide-scale loss of access to online tools and services. This has largely been a result of law and policy seeking to hold the tech sector responsible for what governments perceive as online and real-world harms, and of the tech sector’s responses to those laws. Examples include the Allow States and Victims to Fight Online Sex Trafficking Act 2018 (US) and the Stop Enabling Sex Traffickers Act 2018 (US) (together known as FOSTA-SESTA), the United Kingdom’s Online Safety Act 2023 (UK), and the Online Safety Act 2021 (Cth) in so-called Australia. These laws effectively force tech services to implement crude content moderation practices that result in the deplatforming of sex workers and the loss of the digital tools we use to conduct business and keep our communities safe, informed and connected.

Australia’s Online Safety Act 2021 (Cth) also gave the Office of the eSafety Commissioner, a non-elected regulator, powers to develop and implement industry codes and standards for:

  1. social media services (e.g. X, formerly Twitter)
  2. search engines (e.g. Google and Bing)
  3. relevant electronic services (e.g. WhatsApp)
  4. designated internet services (everything else, e.g. Dropbox, ChatGPT).

Industry codes and standards cover things like how adult content (also known as Class 2 content) is regulated, how illegal or non-permitted content is detected and removed, and how tech companies should check users’ ages (age verification or age assurance). Scarlet Alliance has engaged with the Office of the eSafety Commissioner on behalf of sex workers in unceded Australia – outlining how bad tech regulation hurts our income, safety and community. These concerns have largely been dismissed or ignored.

Threats to Privacy and Encryption

Proposed eSafety industry Standards and Codes (aka rules) would require online platforms to implement measures to detect, remove, disrupt and deter illegal content such as child sexual abuse material (CSAM) and pro-terror content.

Under these rules, messaging and cloud storage platforms must either scan content during transmission (which is impossible on platforms that use end-to-end encryption) or scan it on the sender’s and/or receiver’s device before or after sending (known as client-side scanning). This has massive privacy and practical implications. Scanning technologies include:

  1. AI detection, where a machine-learning model reviews content and determines whether it is illegal (e.g. whether a video depicts a child or an adult)
  2. Hashing, where a unique fingerprint (hash) of the content is compared against a database of hashes of known illegal content; if the hash matches, the content is flagged (see the sketch below).
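
As a rough illustration of hash matching, the sketch below compares a file’s cryptographic hash against a hypothetical blocklist. Everything here is invented for illustration: deployed systems use curated databases maintained by child-safety bodies and perceptual hashes (such as PhotoDNA) that tolerate resizing and re-encoding, rather than plain SHA-256.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of hashes of known prohibited content.
# Placeholder entry only; real databases are curated by third parties
# and use proprietary perceptual hashes, not plain SHA-256.
KNOWN_BAD_HASHES = {
    "3f5a9c0e-placeholder-hash-value",
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: Path) -> bool:
    # Exact-match hashing only catches unmodified copies of
    # already-known material: changing a single byte of a file
    # produces a completely different hash.
    return sha256_of_file(path) in KNOWN_BAD_HASHES
```

This sketch also shows why the proposal collides with end-to-end encryption: an encrypted service only ever sees ciphertext, so a comparison like this would have to run on the user’s device instead (client-side scanning), with the privacy implications described above.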

Both methods have been widely criticised by privacy and security researchers, digital rights advocacy organisations and human rights groups around the world. Apple announced in 2021 that it would scan iCloud photos for CSAM, but later abandoned the plan, explaining in 2023 that it could not be implemented without endangering users’ security and privacy. In 2022, Google flagged as CSAM a photo a father had taken of his child to send to a doctor for medical advice, reported him to police and blocked his Google account. Even after police cleared him of any wrongdoing, Google refused to reinstate his account.

AI, algorithmic bias and shadowbanning

Tech services often use algorithms (including AI technologies) to detect, hide and remove sex worker accounts and content. These technologies are simply not able to make fair decisions: they cannot distinguish between sex and sex education, or between consensual and non-consensual porn. They are not designed to take into account that sex work is decriminalised in many parts of the world, or that in some places sex workers have anti-discrimination protections. They replicate anti-sex work laws, even where such laws do not exist.
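
As a deliberately crude sketch of why such systems misfire, the toy keyword filter below (blocklist and example posts invented for illustration) flags health promotion and sex education exactly as it would flag advertising, because keyword matching carries no understanding of context.

```python
# A deliberately naive keyword filter, of the kind that sits beneath
# many crude moderation pipelines. Blocklist and posts are invented.
BLOCKED_TERMS = {"sex", "escort", "porn"}

def is_flagged(post: str) -> bool:
    """Flag any post containing a blocked term, regardless of context."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return not BLOCKED_TERMS.isdisjoint(words)

posts = [
    "Free safer sex supplies at the community clinic this week",  # health info
    "Sex education workshop for parents, Thursday 6pm",           # education
]

for post in posts:
    # Both posts are flagged even though neither is advertising or
    # adult content: the filter cannot tell sex work from sex education.
    print(is_flagged(post), "-", post)
```

Production systems use machine-learning classifiers rather than keyword lists, but the same context-blindness surfaces as the shadowbanning and wrongful takedowns described below.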

Research shows that practices such as shadowbanning and deplatforming have significant consequences for sex workers, including harm to mental health, discouragement from political activity, disrupted circulation of health and safety information, and reduced ability to work independently and autonomously.

The use of algorithms to discriminate against sex workers online has real-world consequences. Airbnb is known to use algorithms and flawed tech to identify and exclude users based on presumptions about risk or illegality; PayPal’s algorithms automatically flag ‘sexually oriented’ materials and services; and Stripe will proactively shut down sex work business accounts, claiming ‘financial risk’.

Algorithms or AI used by tech companies to detect if someone is a sex worker could include:

  • tracking travel movements
  • linking credit card data
  • profiling based on age, gender, ethnicity, and consumer data
  • face matching from scanning sex worker advertisements.

Any algorithm or AI system carries baked-in assumptions and discrimination about who is ‘risky’ or ‘deviant’ and deserving of a ban. There have also been reports of friends and family members of sex workers having their accounts taken down.

The Australian Government is currently conducting a review of the Online Safety Act 2021. Scarlet Alliance will make a submission to this review; check this page or follow our socials for updates on our submission and on how tech law and policy impacts sex workers in Australia and internationally.

The internet is a workplace

Regulation should aim to empower internet users of all ages with safer spaces online. Sex workers’ online presence is not in competition with the interests of children and young people. Many sex workers are also parents and carers, whose livelihoods depend on the existence of age-appropriate online spaces for all internet users.

Further Reading

AI, algorithmic bias and shadowbanning
Maximising tech safety for sex workers

Digital Security: The Smart Sex Worker’s Guide, Global Network of Sex Work Projects (October 2021)