Anti-Trafficking Review

ISSN: 2286-7511
E-ISSN: 2287-0113

The Anti-Trafficking Review promotes a human rights-based approach to anti-trafficking. It explores trafficking in its broader context including gender analyses and intersections with labour and migrant rights.

DOI: 10.14197/atr.201220141

Editorial: Between Hope and Hype: Critical evaluations of technology’s role in anti-trafficking

Jennifer Musto, Mitali Thakor, and Borislav Gerasimov

Abstract

Over the past decade, scholars, activists, and policymakers have repeatedly called for an examination of the role of technology as a contributing force to human trafficking and exploitation. Attention has focused on a range of issues from adult services websites and the use of social media to recruit victims and facilitate trafficking to the utilisation of data analytics software to understand trafficking and identify ‘hotspots of risk’. This article introduces the Special Issue of Anti-Trafficking Review devoted to the role of technology in (anti-)trafficking. It outlines the main assumptions in the field, critiques some of the proposed ‘solutions’, and briefly presents the articles included in the issue. It concludes that the factors that enable and sustain human trafficking are varied and complex and require political will – not tech solutionist fixes.

Please cite this article as: J Musto, M Thakor, and B Gerasimov, ‘Editorial: Between Hope and Hype: Critical evaluations of technology’s role in anti-trafficking’, Anti-Trafficking Review, issue 14, 2020, pp. 1-14, https://doi.org/10.14197/atr.201220141.

The year 2020 may well be remembered as the year of COVID-19, an unprecedented moment when a pandemic upended myriad facets of political, social, and economic life. Speculative forecasts aside, at the time of writing, this much is clear: in a relatively short period of time, a novel coronavirus has sealed off borders, restricted travel, and curtailed in-person gatherings at school, workplaces, and conference venues. Whatever meaning, however fraught, was attached to the notion of ‘business (and we would add politics and life) as usual’ before the spread of the virus has been indefinitely suspended, and global public attention daily trained to tracking confirmed cases, tallying death counts, and taking stock of the virus’s disruptive social, political, and economic effects.

The links between technology and anti-trafficking—the focus of this Special Issue of Anti-Trafficking Review—and COVID-19 may seem topically distant and their analytical connections not readily apparent. However, by situating COVID-19 as an analytical launch pad into the Special Issue, our aim is to spark creative interdisciplinary approaches in tracking how distinctive global phenomena constitutively overlap in moments of social and economic disruption. And, more pointedly, we hope to better understand how issues framed as exceptional give rise to solutions,[1] including state and non-governmental solutions augmented by technology, which may further contribute to structural vulnerabilities.

Consider one COVID-19 example that dovetails with sex work, technology, and anti-trafficking politics. As travel bans, border containment efforts, and a mix of mandatory and voluntary quarantines continue apace, the upending of various industries and businesses has left many workers reeling. Workers ineligible for paid leave and lacking worker protections are especially vulnerable, including (though not limited to) people in the sex trades. In the absence of meaningful state assistance, some groups have taken to crowdfunding and found other ways to help sex workers impacted by the pandemic, for instance by raising money and sharing advice and resources.[2]

The use of technology in these instances reveals the resiliency of sex workers organising to help people access critically important resources and ease financial losses. However, the bitter irony is that sex workers’ use of technology—to advertise services, screen clients, share information with peers, and bank online—has come under intense scrutiny, not to mention criminal sanction, on the heels of a decade’s worth of legislative and advocacy efforts to disrupt trafficking online by shuttering sites and holding platforms liable for activities presumed to facilitate trafficking. The passage of the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) in the United States in 2018 amplified extant anti-prostitution efforts posturing as anti-trafficking protection with sweeping, censorious, and harmful effects on sex workers in the US and beyond.[3]

In our current moment, widely viewed as unprecedented, we also wonder: might an already constrained situation worsen for people in the sex trade as well as for workers in the gig economy, manufacturing, or service industries such as tourism and hospitality? How might technology exacerbate already precarious labour arrangements? And what analytical insights from past research documenting anti-trafficking and technology might be brought to bear in mitigating current and future vulnerabilities?

These questions are not completely speculative. They draw on more than a decade’s worth of critical trafficking studies highlighting how anti-trafficking/anti-slavery ‘cures’ produce injurious effects that are sometimes worse than the ‘epidemic’ itself—to use, albeit critically, the language of media outlets that frequently characterise human trafficking as an ‘epidemic’.[4] Human trafficking is also commonly analogised as a form of slavery, a discursive move that elevates its exceptional status.[5] Framing a complex phenomenon like trafficking as exceptional authorises ‘uncompromising [calls to] action’[6] to address it, such as rigid border controls, and innovative solutions, including technological ones, that heighten state and humanitarian surveillance efforts.[7] What this research also draws our attention to is that efforts to stave off a crisis—whether the threat of human trafficking or a virus-induced public health emergency—can obscure structural factors that shape vulnerability and contribute to inequalities. People who endure structural vulnerabilities during more typical moments—for instance, migrants, refugees, ethnic and racial minorities, sex workers, and incarcerated, homeless, and working-class people—often face intensified conditions of constraint and economic precarity in the face of extraordinary situations.[8] Moreover, exceptional state and non-state actions generated in response to crises in general[9] and human trafficking in particular can contribute to intensified forms of surveillance for groups deemed ‘at risk’.[10] Such surveillance is made possible through data and technology—themes this Special Issue takes up.

Technology and Anti-Trafficking

Over the past decade, scholars, activists, and policymakers have repeatedly called for an examination of the role of technology as a contributing force to human trafficking and exploitation. Attention has focused on a range of issues from adult services websites and the use of social media to recruit victims and facilitate trafficking[11] to the utilisation of data analytics software to understand trafficking and identify ‘hotspots of risk’.[12] For many anti-trafficking stakeholders, technology, assumed to be a contributing force to exploitation, can be reworked and ‘transformed from a liability into an asset’.[13] Yet the idea that technology can be harnessed to address human trafficking relies on limited data and a number of assumptions.[14]

Just as facts are contested in human trafficking policy and research,[15] there is added contestation where technology is concerned. In contrast to anti-trafficking stakeholders’ unscrutinised optimism about technology, which pitches its benefits in uniformly positive terms, researchers have begun to seriously grapple with the assumptions that underlie discussions about technology and anti-trafficking, for instance whether anti-trafficking efforts augmented by technology are effective, or whether such efforts instead do more harm than good.[16] Moreover, as critical scholars have pointed out, assumptions that vex the understanding of trafficking are mirrored and magnified in the understanding of technology-facilitated trafficking too.[17] These include uninterrogated claims that trafficking occurs mainly in the sex industry, that women in the sex trades are especially vulnerable while men are empowered, and that the general public has a central role to play in identifying victim-survivors.

Less understood are the ways in which power and technology cohere in anti-trafficking policy and practice and to what effect. Investigating these questions is further complicated by the fact that definitions of technology vary widely. We understand technology as a range of techniques that structure and are structured by power and expertise.[18] We also understand technology as ‘co-produced’,[19] which is to say, its practical form and ultimate meaning are indelibly tied to discourses, institutions, and arrangements of power that authorise its development and use. Understanding technology as equal parts technical, political, and social is instructive in demonstrating how an uncritical embrace of technological solutions for complex social problems can strengthen the repressive, controlling arm of the state, as several of the contributions to this Special Issue illuminate. It further helps to map the uneven benefits of technology for different actors, for instance, when tech solutions benefit corporations more than workers, or where technical fixes hailed as innovative fail to address poor working conditions, bad labour migration regimes, and business demand for profits.

Platform Regulation and Tech Solutionism

Politicians, law enforcement, and users of social media like Facebook and Instagram have issued urgent calls for technology companies to take action to ‘clean up’ their platforms.[20] These demands are premised on the notion that technology companies bear responsibility for monitoring activities and content deemed illicit. The prevailing assumption is that companies are not doing enough to regulate their platforms but ought to. Though there are mounting demands for non-state actors and entities to regulate their platforms, as Tarleton Gillespie has pointed out, technology companies have actively promoted the political and discursive framing of their sites as ‘platforms’ in order to skirt the regulatory obligations required of telecommunications providers, while securing many of the protections of free speech legislation.[21] In fact, companies’ profit motive discourages them from regulating their platforms, protecting user privacy, or being meaningfully accountable to users. Yet such critiques have done little to squelch a tide of data entrepreneurs who have gotten in on the business of digital disruption, presenting technologies like apps as capable of ‘solving’ slavery/trafficking.

Notable too is that data-driven ‘disruption’ leverages ideas of moral entrepreneurship. Kelly Gates has argued that tech solutionist ‘moral entrepreneurs’ present themselves as rescuers who respond to humanitarian problems by reframing those problems as technological ones.[22] Tech solutionism is driven by moral appeals claiming that technology will cleanly and uncomplicatedly solve the problems wrought by complex issues like human trafficking. In this regard, tech solutionism echoes findings from critical anti-trafficking scholarship suggesting that anti-trafficking has become a ‘rescue industry’.[23]

One of the key modes of data entrepreneurship we see regarding trafficking is the proposal to harness ‘big data’.[24] Recent work in Science and Technology Studies has turned a critical eye toward data science and data collection techniques,[25] insights that hold important lessons for researchers and advocates whose work explores trafficking and exploitation. While state-sponsored data classification schemes, such as racial categorisations, have come under intense scrutiny in the sociological and historical scholarship,[26] scholars have recently called attention to the ways non-governmental organisations and corporations also harvest ‘big data’ from users.[27] Mark Andrejevic and Kelly Gates argue that today’s state policing agencies hold a prevailing attitude of ‘collect-everything’ in their approaches to data collection.[28] This attitude assumes that problems can best be solved with the aggregation of maximum information. We see this collect-all approach to big data presented in proposed solutions to trafficking. These new forms of data collection involve subtle and sometimes intimate forms of surveillance,[29] collecting user information to generate algorithmic identity profiles.

In the anti-trafficking field, digital worker reporting apps perpetuate the illusion that the collection of more worker data will present self-evident solutions to labour exploitation.[30] But, as Andrejevic and Gates caution, large-scale databases ‘can generate patterns that have predictive power but not necessarily explanatory power’.[31] Data generated by apps, worker reporting tools, and automation are also laundered through a human rights ‘indicator culture’ that gives them a veneer of accuracy and objectivity,[32] despite sizeable gaps in the data, which may also be taken out of context.

It is also critical to note that the big data collection proposed by data entrepreneurs requires a massive expansion of surveillance infrastructure. That the creation of a data-oriented infrastructure has been lauded by figures in the anti-trafficking/anti-slavery movement[33] puts into sharp focus how visions of slaves’ ‘liberation’ may end up authorising the creation of a surveillance humanitarianism infrastructure to address trafficking,[34] all the more notable in an environment where little if any regulation exists to oversee it. Ironically, for tech solutionist data entrepreneurs, the ‘freedom’ of some will require the unfreedom—through the removal of privacy safeguards—of others. Furthermore, the tech ‘solutions’ to trafficking are to be developed by corporate actors and implemented by individual consumers, not through state-level policies, thus enacting a classic neoliberal attitude toward the management of socio-economic issues. In this way, neoliberal capitalism, although sometimes acknowledged as creating the inequalities leading to trafficking, is also positioned as the means to solving it.[35]

Networked Governance

Anti-trafficking efforts augmented by technology and backed by anti-trafficking policies also draw attention to shifting governance norms.[36] Prior to the 2018 passage of FOSTA, there were numerous attempts to raid and shutter sites like Craigslist, Backpage, MyRedbook, and others presumed to facilitate sexual exploitation online.[37] By upending part of Section 230 of the Communications Decency Act of 1996,[38] FOSTA has advanced a model of governance that makes the enforcement of anti-trafficking laws not only the job of law enforcement but also of a diffuse network of platforms and websites.[39]

An anti-prostitution strategy camouflaged as anti-trafficking protection, FOSTA illuminates two networked effects that were in development before its passage but have been further extended following it: first, the rise of networked neo-abolition policy and practice across borders; and second, the advancement of a networked policing strategy reliant on third-party actors to anticipatorily police networks, pre-emptively analyse, filter, and scrub content presumed to be linked to commercial sex, and cooperate with law enforcement.[40] This has led to heightened vulnerabilities for sex workers and trafficked persons in the United States and beyond, as two of the contributions in this Special Issue demonstrate.

More broadly, the anti-trafficking movement itself has become a counter-network to the trafficking it seeks to address, uniting a diverse group of actors, including state and municipal authorities, international organisations, philanthropies, women’s rights groups, trade unions, celebrities, religious leaders, and corporations. As Thakor and boyd have argued, technology-facilitated trafficking is destabilising, and anti-trafficking agencies deploy new technologies in attempts to stabilise networks.[41] Yet, while this anti-trafficking network demands transparency and accountability from technology companies for their potential involvement in trafficking, it has continued to operate in its own ‘accountability vacuum’[42] and remained surprisingly immune to calls for transparency and accountability for the rights violations of migrants, sex workers, and other marginalised groups that it has promoted. As many of the articles in this Special Issue show, such transparency and accountability are just as necessary in technology-facilitated anti-trafficking measures.

This Special Issue

The articles featured in this Special Issue offer sharp analyses of the ideologies of intervention and governance that have bolstered tech solutionism in anti-trafficking efforts. The issue opens with an article by Sanja Milivojevic, Heather Moore, and Marie Segrave, who trace the development of the discourse surrounding technology and (anti-)trafficking from the early 2000s to the present day, in which technology is framed as both part of the cause of and a solution to trafficking. They analyse and critique four main assumptions about the role of technology in anti-trafficking efforts. The authors conclude with a call to anti-trafficking stakeholders to look past technology and re-focus their efforts on advocating for humane migration policies, decent work, and economic opportunities for all.

The next three articles examine different types of apps developed with the goal of preventing or combating exploitation. Stephanie Limoncelli analyses three apps aimed at encouraging ethical consumption by providing information to consumers about the risks of trafficking, exploitation, and child labour associated with various everyday products. Limoncelli notes multiple problems with these apps, such as the obscure methodologies used to rank companies and products, limited or outdated sources, and contradictory information about the companies or products consumers are advised to choose or avoid. Furthermore, the author argues, these apps reinforce neoliberal ideologies about the limited role governments should play in regulating businesses by locating the responsibility for the eradication of exploitation with individual consumers rather than with collective action by workers.

Apps concerned with the views of workers are the subject of the next article, by Laurie Berg, Bassina Farbenblum, and Angela Kintominas. On the basis of a literature review and expert consultations, the authors present the limitations of so-called ‘digital worker reporting’ tools—apps through which global brands aim to collect information from hard-to-reach workers about their working and living conditions. While these are often touted as an efficient and cost-effective way to gather data directly from workers, the authors note a number of limitations, some of which have plagued traditional social audits for decades. These include that digital tools may not capture data from a representative cohort of workers and that the data may be vague or superficial. A challenge specific to digital tools is that the collection of data creates new risks for workers’ wellbeing and safety. On the whole, the authors conclude, digital worker reporting tools have limited or no benefits for workers. They also emphasise that technological tools cannot address the structural causes of worker exploitation, such as the drive for business and shareholder profit and consumer demand for cheap goods and services.

As a counterpoint to these business-driven apps, in the next article, Annie Isabel Fukushima highlights how an app can be useful when it is developed by, for, and with migrant workers. She showcases the app Contratados (Contracted), developed by a migrant rights organisation in the US, which allows migrant workers to find work, rate employers, share resources, and seek support. She conceptualises the app as an example of a ‘migrant futurity’—a vision of the future as imagined and enacted by migrants—as opposed to the ‘homeland futurity’ of surveillance and border control currently enacted by the US and many other governments. Using primary data collected from migrant workers and survivors of violence and trafficking in the ‘tech city’ of San Francisco, Fukushima argues that technology can be used to both help and harm migrant workers.

The next two articles analyse the impact of a relatively new, and highly controversial, measure to reduce human trafficking in the sex industry—the closure of websites hosting sex work advertisements. Samantha Majic analyses the closure of two such websites—MyRedbook, used by female sex workers and their clients, and Rentboy, used by male sex workers and their clients. Her analysis reveals that while concerns about human trafficking were cited as a reason for the closure of the former, such discourse was absent in the latter case. This reflects long-standing stereotypes about female sex workers as helpless and vulnerable victims and male sex workers as free and empowered agents. Furthermore, while only sex workers and their allies expressed outrage at the closure of MyRedbook, not only sex workers, but also LGBT people, their advocates, and civil liberties groups reacted to the closure of Rentboy. Majic critiques the LGBT movement’s ‘respectability politics’ and urges it to show greater solidarity with sex workers and other marginalised groups, given the fragile gains of the movement and the opportunities and constraints that technological developments offer in the pursuit of gender, racial, and sexual justice.

The final thematic article, by Erin Tichenor, examines the impact of the closure of another adult advertisements website, Backpage, on sex workers in New Zealand following the passage of FOSTA. Drawing on twenty interviews with sex workers in Auckland, Tichenor shows how the closure of Backpage allowed a local platform, NewZealandGirls, to hike up its prices and force unfavourable conditions on sex workers, who had little choice but to accept them. These findings further demonstrate how technology allows overzealous US ‘anti-trafficking’ policy to extend far beyond the country’s borders. Tichenor concludes by calling for anti-trafficking measures that prioritise community well-being and empowerment rather than those that strengthen the carceral state’s hold on people’s lives.

The first of the three short articles that conclude the issue also examines the impact of FOSTA. Danielle Blunt and Ariel Wolf present the findings of a community-based, sex worker-led survey that asked sex workers about their experiences since the closure of Backpage and adoption of FOSTA. The vast majority of research participants stated that their financial situation has deteriorated, as has their ability to access community and screen clients. The authors conclude that FOSTA is just the latest example of the US government using anti-trafficking policy and restrictions on technology to police already marginalised people.

In the next short article, Isabella Chen and Celeste Tortosa reflect on their experience providing legal and social support to twenty Venezuelan women who were trafficked to Austria. In particular, Chen and Tortosa describe how the women were trafficked through the use of social media and chat apps. They also share how the digital evidence from online interactions between the women and their traffickers was used in the investigation and successful prosecution of the case. They warn, however, that this was not the case for all women their NGO supports, and thus digital evidence, and technology more broadly, have only limited application in anti-trafficking efforts.

The final article, by Kate Mogulescu and Leigh Goodmark, describes how some victims of human trafficking in the sex industry in the US are prosecuted alongside traffickers and put on sex offender registries. The result? Both a criminal record and an indefinite digital mark that limits their ability to find a job, settle in a new community, and see their children. The authors conclude with a call for a careful, critical look at the system of sex offender registries and, more broadly, policing and prosecution strategies, including in cases of human trafficking, in the United States.

Conclusion

Although the articles in this Special Issue examine different aspects of the ‘trafficking-technology nexus’, they ultimately converge around several main points. First, the role of technology as either a facilitator or disruptor of human trafficking remains poorly understood and largely based on ideology, political agendas, and limited evidence: more often than not, it simply repeats long-standing erroneous assumptions about sex work, migration, and precarious labour. Secondly, the currently available technological ‘solutions’ have limited, if any, benefit for the trafficked persons, migrants, and low-wage workers they purport to help; rather, they benefit technology corporations, reinforcing the very neoliberal capitalism that creates and exacerbates people’s vulnerability to trafficking. Finally, anti-traffickers’ obsession with technology is a smoke-screen that obscures the role of gender discrimination, labour market deregulation, restrictive migration policies, and crucially, the rise of networked responses that pass as humanitarian yet are inextricably tied to a surveillance capitalist system[43] that exploits people’s personal data for profit. Not only do these systems and approaches create the conditions—including networked vulnerabilities—that exacerbate inequalities and expose people to the risks of trafficking; they also draw precious attention and limited resources away from measures capable of preventing trafficking and exploitation: decent work; gender, economic, and racial justice; the free movement of people; and social protections grounded in transparency and accountability. Such prevention and protection efforts demand political will, not tech solutionist cures.

These insights also hold some lessons, even if speculative, in accounting for the effects of technology in response to COVID-19. First, community-based actions, tech or otherwise, are uniquely positioned to prevent exploitation. Relatedly, a robust public health response is needed to contain the spread of the virus and to mitigate its widespread effects. Yet in the absence of a coordinated global response, we see a surfacing of philanthrocapitalist-backed techno-solutionist fixes[44] and calls to enlist ‘big tech companies’ for support.[45] Placing trust in tech firms whose platforms have provided the technical blueprint for state surveillance efforts[46] and cloud-supported immigration enforcement,[47] and which have compromised users’ privacy in exchange for advancing facial recognition technologies,[48] is of limited value. It is likewise short-sighted to assume that tech companies are equipped to pick up the slack of an otherwise unresponsive state if such efforts are not accompanied by meaningful measures to address the social, political, and economic barriers that make it hard for people both to avoid the virus in the first place and to survive its devastating financial effects.

As the articles in this Special Issue show, reliance on technological solutions does not necessarily translate into improved conditions for trafficking victims and other vulnerable communities. Indeed, if unaccompanied by wider socio-political shifts to address structural vulnerabilities, tech interventions may limit ameliorative efforts or, worse, create barriers to obtaining meaningful relief.

Jennifer Musto is an Associate Professor of Women’s and Gender Studies at Wellesley College. She is an interdisciplinary scholar whose research explores the laws, technologies, and modes of governance designed to respond to human trafficking and sex work in the United States. Her book, Control and Protect: Collaboration, carceral protection, and domestic sex trafficking in the United States (University of California Press, 2016) examines state, non-state, and technology responses to domestic sex trafficking situations in the US and she has lectured and published widely on these topics. Email: jmusto@wellesley.edu

Mitali Thakor is an Assistant Professor of Science in Society at Wesleyan University. She is an anthropologist of technology with interests in feminist and critical race studies of surveillance, policing, artificial intelligence, and robotics. Her book project, provisionally titled Facing the Child: The digital policing of child pornography, is an ethnographic study of the global network of experts tasked with the policing and content moderation of child abuse images. Email: mthakor@wesleyan.edu

Borislav Gerasimov is Communications and Advocacy Coordinator at the Global Alliance Against Traffic in Women and the Editor of Anti-Trafficking Review. He holds a degree in English Philology from Sofia University St. Kliment Ohridski, Bulgaria and has previously worked at women’s rights and anti-trafficking organisations in Bulgaria and the Netherlands. He has also been involved in the work of organisations supporting Roma youth, LGBTI people, people living with HIV/AIDS, and sex workers. Email: borislav@gaatw.org

Notes:

[1]      J Musto, Control and Protect: Collaboration, carceral protection, and domestic sex trafficking in the United States, University of California Press, Oakland, 2016; J Quirk, The Anti-Slavery Project: From the slave trade to human trafficking, University of Pennsylvania Press, Philadelphia, 2011.

[2]      The Red Umbrella Fund has published a list of such efforts as of 31 March 2020: Red Umbrella Fund, ‘Sex-workers’ resilience to the COVID crisis: a list of initiatives’, 31 March 2020, https://www.redumbrellafund.org/covid-initiatives.

[3]      J Musto et al., ‘FOSTA-SESTA, Networked Neo-Abolition, and Sexual Humanitarian Scope Creep’, Presentation Paper, Law and Society Association, Washington D.C., June 2019; B Chapman-Schmidt, ‘“Sex Trafficking” as Epistemic Violence’, Anti-Trafficking Review, issue 12, 2019, pp. 172-187, https://doi.org/10.14197/atr.2012191211.

[4]      See, for example: J Galucci, ‘Human Trafficking Is an Epidemic in the U.S. It’s also big business’, Fortune, 14 April 2019, https://fortune.com/2019/04/14/human-sex-trafficking-us-slavery.

[5]      Quirk; see also: I Grewal, Saving the Security State: Exceptional citizens in twenty-first-century America, Duke University Press, Durham, 2017.

[6]      Quirk.

[7]      Musto, 2016.

[8]      Consider another example linking COVID-19 to forced labour practices. Amid consumers’ panic buying of hand sanitiser and face masks, state officials in Hong Kong and New York conscripted incarcerated people to produce these high-demand items. Some commentators have framed prison labour as akin to slave labour, see: H Grant, ‘Vulnerable Prisoners “Exploited” to Make Coronavirus Masks and Hand Gel’, The Guardian, 12 March 2020, https://www.theguardian.com/global-development/2020/mar/12/vulnerable-prisoners-exploited-to-make-coronavirus-masks-and-hand-gel, and J McKinley, ‘Cuomo’s Fix for Sanitizer Shortage: 100,000 Gallons Made by Prisoners’, New York Times, 9 March 2020, https://www.nytimes.com/2020/03/09/nyregion/coronavirus-newyork-sanitizer.html.

[9]      Concerns about COVID-19 have informed state surveillance efforts, including the rise of biometric surveillance. See Y N Harari, ‘The World After Coronavirus’, Financial Times, 20 March 2020, https://www.ft.com/content/19d90308-6858-11ea-a3c9-1fe6fedcca75.

[10]     Musto, 2016; Grewal; Chapman-Schmidt.

[11]     See: M Latonero et al., The Rise of Mobile and the Diffusion of Technology-Facilitated Trafficking, University of Southern California, 2012; M Latonero et al., Human Trafficking Online: The role of social networking sites and online classifieds, University of Southern California, 2011; V Greiman and C Bain, ‘The Emergence of Cyber Activity as a Gateway to Human Trafficking’, Journal of Information Warfare, vol. 12, no. 2, 2013, pp. 41-49.

[12]     M Latonero et al., Technology and Labor Trafficking Project: Framing document, University of Southern California, 2014.

[13]     Organization for Security and Co-Operation in Europe, ‘Using Technology to Combat Trafficking in Human Beings: OSCE Alliance against Trafficking conference explores how to turn a liability into an asset’, OSCE, 9 April 2018, https://www.osce.org/secretariat/416744.

[14]     J Musto, ‘The Limits and Possibilities of Data-Driven Anti-Trafficking Efforts’, Georgia State University Law Review, forthcoming, 2020; J Musto and d boyd, ‘The Trafficking-Technology Nexus’, Social Politics, vol. 21, no. 3, 2014, pp. 461-483, https://doi.org/10.1093/sp/jxu018; see also the contributions of Milivojevic et al. and Limoncelli in this Special Issue.

[15]     S Majic, ‘It’s Blue and It’s Up to You! Policy narratives and anti-trafficking awareness in the United States’, forthcoming, 2020.

[16]     Musto, 2020; see also the contribution of Milivojevic et al. in this Special Issue.

[17]     Musto and boyd, p. 15.

[18]     This perspective has a long tradition in Science and Technology Studies, where scholars have argued that technological artifacts are not neutral or objective, but political (L Winner, ‘Do Artifacts Have Politics?’, Daedalus, vol. 109, no. 1, 1980, pp. 121-136) and intimately shaped by social relations.

[19]     S Jasanoff, States of Knowledge: The co-production of science and social order, Routledge, New York, 2004.

[20]     Most recently, journalistic coverage has focused on the traffic in child pornography and abuse images, with a callout of companies’ apparent failure to properly remove such images. See, for example: M H Keller and G J X Dance, ‘The Internet Is Overrun With Images of Child Sexual Abuse. What went wrong?’, New York Times, 29 September 2019, https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html.

[21]     T Gillespie, ‘The Politics of “Platforms”’, New Media & Society, vol. 12, issue 3, 2010, pp. 347-364, https://doi.org/10.1177/1461444809342738.

[22]     K Gates, Our Biometric Future: Facial recognition technology and the culture of surveillance, NYU Press, New York, 2010; see also K Gates, ‘Identifying the 9/11 “Faces of Terror”: The promise and problem of facial recognition technology’, Cultural Studies, vol. 20, no. 4-5, 2006, pp. 417-440, https://doi.org/10.1080/09502380600708820.

[23]     L M Agustín, Sex at the Margins: Migration, labour markets and the rescue industry, Zed Books, London, 2007; see also G Soderlund, ‘Running from the Rescuers: New U.S. crusades against sex trafficking and the rhetoric of abolition’, NWSA Journal, vol. 17, no. 3, 2005, pp. 64-87, https://doi.org/10.1353/nwsa.2005.0071, and the contribution of Milivojevic et al. in this Special Issue.

[24]     See, for example: D Thorpe, ‘The New Sheriff in Human Trafficking Is Wielding Big Data’, Forbes, 11 October 2018, https://www.forbes.com/sites/devinthorpe/2018/10/11/the-new-sheriff-in-human-trafficking-is-wielding-big-data/#6b70e5857520.

[25]     See, for example: T Boellstorff, ‘Making Big Data, in Theory’, First Monday, vol. 18, no. 10, 2013, https://doi.org/10.5210/fm.v18i10.4869; M Andrejevic, ‘The Big Data Divide’, International Journal of Communication, vol. 8, 2014, pp. 1673-1689; L Gitelman (ed.), Raw Data is an Oxymoron, MIT Press, Cambridge, 2013.

[26]     G Bowker and S L Star, Sorting Things Out: Classification and its consequences, MIT Press, Cambridge, 1999.

[27]     See, for example: d boyd and K Crawford, ‘Critical Questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon’, Information, Communication & Society, vol. 15, no. 5, 2012, pp. 662-679, https://doi.org/10.1080/1369118X.2012.678878; J Cheney-Lippold, We Are Data: Algorithms and the making of our digital selves, NYU Press, New York, 2017; D Lyon, ‘Surveillance, Snowden, and Big Data: Capacities, consequences, critique’, Big Data & Society, 2014, pp. 1-13, https://doi.org/10.1177/2053951714541861; J van Dijck, ‘Datafication, Dataism and Dataveillance: Big data between scientific paradigm and ideology’, Surveillance & Society, vol. 12, no. 2, 2014, pp. 197-208, https://doi.org/10.24908/ss.v12i2.4776.

[28]     M Andrejevic and K Gates, ‘Big Data Surveillance: Introduction’, Surveillance and Society, vol. 12, no. 2, 2014, pp. 185-196, https://doi.org/10.24908/ss.v12i2.5242.

[29]     K E C Levy, ‘Intimate Surveillance’, Idaho Law Review, vol. 51, no. 3, 2015, pp. 679-693.

[30]     See the contribution by Berg et al. in this Special Issue.

[31]     Andrejevic and Gates, p. 186.

[32]     S E Merry, ‘Measuring the World: Indicators, human rights, and global governance: with CA comment by John M. Conley’, Current Anthropology, vol. 52, no. S3, 2011, pp. S83-S95, https://doi.org/10.1086/657241.

[33]     For instance, at the launch of the Global Human Trafficking Hotline Network by Google Ideas in April 2013, researcher Kevin Bales noted that the hotline may help to get a better quantitative handle on the modern slavery problem, observing ‘Every image, every second of film is data, that we can use to find and root out to reach into those hidden places, open them up, find the people in slavery, help them to step up to their own liberation.’ (Jennifer Musto, fieldnotes, April 2013).

[34]     M Latonero, ‘Stop Surveillance Humanitarianism’, New York Times, 11 July 2019, https://www.nytimes.com/2019/07/11/opinion/data-humanitarian-aid.html.

[35]     K Kempadoo, ‘The Modern-Day White (Wo)Man’s Burden: Trends in anti-trafficking and anti-slavery campaigns’, Journal of Human Trafficking, vol. 1, issue 1, 2015, pp. 8-20, https://doi.org/10.1080/23322705.2015.1006120.

[36]     Musto, 2016.

[37]     Ibid.; see also: M Thakor and d boyd, ‘Networked Trafficking: Reflections on technology and the anti-trafficking movement’, Dialectical Anthropology, issue 37, 2013, pp. 277-290, https://doi.org/10.1007/s10624-012-9286-6, and the contribution of Limoncelli in this Special Issue.

[38]     A provision that gave internet providers and publishers immunity from being held liable for content posted by users linked to criminal activity occurring on their networks.

[39]     Musto, 2020.

[40]     Musto et al., 2019.

[41]     Thakor and boyd.

[42]     A T Gallagher, ‘Editorial’, Anti-Trafficking Review, issue 1, 2012, pp. 2-9, p. 3, https://doi.org/10.14197/atr.2012111.

[43]     S Zuboff, The Age of Surveillance Capitalism: The fight for a human future at the new frontier of power, Public Affairs, New York, 2019.

[44]     W Knight and L Matsakis, ‘Jack Ma Offers to Supply the US with Covid-19 Tests and Masks’, Wired, 13 March 2020, https://www.wired.com/story/jack-ma-supply-us-covid-19-tests-masks.

[45]     D A Kessler, ‘How to Fix the Coronavirus Testing Mess in 7 Days’, New York Times, 13 March 2020, https://www.nytimes.com/2020/03/13/opinion/coronavirus-testing.html.

[46]     A Mitchell and L Diamond, ‘China’s Surveillance State Should Scare Everyone’, The Atlantic, 2 February 2018, https://www.theatlantic.com/international/archive/2018/02/china-surveillance/552203.

[47]     K Ferrari, ‘How Marc Benioff and Salesforce Profit from ICE Camps’, East Bay Majority, 11 October 2019, https://eastbaymajority.com/marc-benioff-salesforce-ice-homeland-security.

[48]     M Murgia, ‘Microsoft Quietly Deletes Largest Public Face Recognition Data Set’, Financial Times, 6 June 2019, https://www.ft.com/content/7d3e0d6a-87a0-11e9-a028-86cea8523dc2.