Submission on Safe and responsible AI in Australia, Department of Industry, Science and Resources

As communities that experience the everyday consequences of digital discrimination and algorithmic bias, sex workers are uniquely positioned to provide input into technology governance and regulation. Because sex work has historically been criminalised, sex workers have been marginalised through well-established discriminatory barriers relying on markers such as gender, race, migration status, marital status, asset ownership or lack thereof, familial genealogy, class, education, citizenship, ability to present identity documents, proof and regularity of income, source of income, and position in society as measured by having a respected ‘job’.

These structural barriers, many of which amount to human rights violations, are now embedded in the delivery of systems and services through both human and algorithmic decision-making. Against this, decades of campaigning to dismantle these barriers have led to significant progress towards the realisation of human rights for sex workers and other marginalised communities. Sex workers are therefore highly invested in whether and how historic discrimination will be perpetuated and reintroduced through the widespread use of AI and automated decision-making (ADM) in Australia.

Big tech companies are aware that bias exists in the automated decision-making they use to identify and hide or remove content deemed ‘inappropriate’; that this bias impacts specific marginalised communities; and that it may mean their products and services are operating in breach of Australian law.

Over the past decade, sex workers in Australia and overseas have also experienced accelerating loss of access to technological tools and services, largely as a result of a global trend towards laws and policies that oblige private entities to respond to what governments envision as online and real-world harms. The most far-reaching of these measures, the United States legislation Allow States and Victims to Fight Online Sex Trafficking Act 2017 (commonly referred to as FOSTA-SESTA), has yet to generate a successful conviction on behalf of a victim/survivor of human trafficking. It has, however, been found to diminish the ability of sex workers across the globe to work independently, through loss of income, loss of access to health information and harm reduction strategies, and loss of access to peer support and networking.

A survey of more than 200 sex workers and adult entertainment performers in the United States (where sex work is mostly criminalised) identified shadow-banning and algorithmic bias as negatively impacting mental health, and as having a chilling effect on the sharing of health and safety information and on engagement in political speech.

The final recommendation of this submission to the Department of Industry, Science and Resources states: “Scarlet Alliance, sex worker peer organisations and peer-led organisations from other marginalised communities affected by AI/ADM bias be resourced to provide information to our communities on data privacy and security and navigating emerging AI/ADM technologies.”