Symposium: What Works? 5 Lessons Learned on Slavery Risk for Policy Actors
One year ago, the UK National Audit Office released Reducing modern slavery, a landmark review of the governance and effectiveness of key parts of the UK’s anti-slavery activities. Launching that report, Amyas Morse, the head of the Office, noted that:
The campaign to drive out modern slavery is in the early stages. So far it is helping to establish the scale and international nature of this issue. To combat modern slavery successfully, however, government will need to build much stronger information and understanding of perpetrators and victims than it has now.
Good policymaking requires the wise allocation of scarce resources. And understanding risk is central to allocating resources and ensuring policies and programming will be more effective and more efficient.
In this Symposium on Modelling Slavery Risk, leading anti-slavery experts discuss the benefits and limits of an innovative model for predicting risk of slavery at the individual and national levels. Beyond the science, the Symposium also offers five lessons for policy actors:
1. We are on the verge of breakthroughs allowing risk-informed policies and programming
The paper by Jacqueline Joudo Larsen and Pablo Diego-Rossel that motivated this Symposium demonstrates the feasibility of modelling slavery risk. As they acknowledge, and the contributions from Bernard Silverman, Laura Gauer Bermudez and Shannon Stewart, and Kelly Gleason explore, there are limits to our understanding of risk right now, and to the predictive capabilities of the model Joudo Larsen and Diego-Rossel have developed.
But those limits can be overcome. Joudo Larsen and Diego-Rossel’s research points to the possibility that individual factors such as age, gender, employment status, and feelings about household income and about one’s life may be predictors of slavery risk. As research continues, our certainty about whether these or other factors are predictors will increase.
And as our understanding of the reliability and strength of such factors in predicting slavery outcomes improves, policies, programmes and interventions can be better tailored and targeted. For policymakers, this means that anti-slavery work will become both more effective and more efficient, making the case for investment in anti-slavery efforts ever easier to make.
2. Limits on data quality, sharing and modelling are holding us back
The Symposium contributions also make clear, however, that limits on data quality and modelling continue to hold us back. Joudo Larsen and Diego-Rossel arguably use the best available global estimates of slavery at the national level. Combined with their model, that data gives us arguably the most powerful predictive system available right now: but it is also a system that produces striking oddities, seeming to predict, for example, that the number of victims of modern slavery in the United States lies anywhere between “-0.5 million and 4 million”. Clearly, policymakers need data and models that give them greater clarity. As the contributions to this Symposium make clear, this work is an important step in that direction, but we have some distance to go.
Even where good data is available, there are real and continuing barriers to sharing it. As forthcoming and collaborative as they have been, key parts of the data and methods that Joudo Larsen and Diego-Rossel rely on are proprietary. Business remains a key source of funding for slavery risk analysis, and may in fact become a bigger player as states impose new reporting and due diligence obligations on companies. If the result is a fragmented evidence base, trapped behind corporate walls, our understanding of slavery risk will be held back. Anti-slavery investments will be less effective and less efficient. Everyone loses out.
A more effective approach would be to encourage the development of common methodologies and open data, and to invest in systems for data sharing and collective learning. To invest, in other words, in science.
3. Steps are being taken to reduce those constraints
The good news here is that the trend is clearly in the right direction. This Symposium itself is testament to the emergence of a cadre of practitioner-scholars with serious statistical credentials who are helping to strengthen the scientific foundations of policy and practice in the field.
The Call to Action to End Forced Labour, Modern Slavery and Human Trafficking includes important commitments to data sharing (paras 1(ii), 1(vi), 2(ii)). Delta 8.7 has recently begun rolling out country data dashboards to make the best available evidence available worldwide.
The International Conference of Labour Statisticians (ICLS) recently adopted new survey standards that will lead to better, more comparable data on forced labour prevalence in the years ahead.
And the Alliance 8.7 Pathfinder process will allow countries such as Albania, Chile, Madagascar, Nepal, Nigeria, Peru, Sri Lanka, Tunisia, Uganda and Viet Nam to benefit quickly from new scientific developments.
4. The field is ripe for digital disruption
Nonetheless, the pace of progress is slow – too slow to meet the target set out in Target 8.7 of the Sustainable Development Goals. The survey standards set out by the ICLS will deliver high-quality, comparable data – but only in five to ten years. And the survey methods underlying this approach, and the existing Global Estimates, cost millions of dollars to execute. This is one reason that governments have begun experimenting with other techniques, such as multiple systems estimation (MSE) – but MSE is also constrained by what data is already available.
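To give a sense of what MSE involves, a minimal sketch follows: the simplest form of multiple systems estimation is two-list capture-recapture, shown here using Chapman’s bias-corrected estimator. The list sizes are invented for illustration and are not real victim data; real MSE applications typically combine more than two lists and model dependence between them.

```python
def chapman_estimate(n1: int, n2: int, overlap: int) -> float:
    """Chapman's bias-corrected capture-recapture estimate of total
    population size from two partially overlapping lists.

    n1, n2  -- number of individuals on each list
    overlap -- number of individuals appearing on both lists
    """
    return (n1 + 1) * (n2 + 1) / (overlap + 1) - 1

# Hypothetical example: 400 victims known to an NGO, 300 known to
# police, with 60 individuals appearing on both lists.
estimate = chapman_estimate(400, 300, 60)
print(f"Estimated total population: {estimate:.0f}")
```

The intuition is that the smaller the overlap between two independently compiled lists, the larger the hidden population is likely to be – which is also why MSE is constrained by the quality and independence of the administrative lists already available.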
Like many high-cost, slow-moving analytical processes, slavery prevalence estimation and risk analysis is thus ripe for digital disruption. Already the Global Fund to End Modern Slavery is experimenting with using social media and mobile technology for survey dissemination, promising significant cost reductions. Delta 8.7 has successfully used machine learning to estimate official development commitments to fight slavery. And computational analysis offers a path for rapid acceleration of the kinds of modelling begun by Joudo Larsen and Diego-Rossel.
For these reasons, Delta 8.7, the Computing Community Consortium, the Alan Turing Institute, Rights Lab and Tech Against Trafficking will convene Code 8.7, a two-day conference bringing together the anti-slavery community and the computational science, artificial intelligence, machine learning and tech communities to think about how to accelerate our understanding of modern slavery and what works to fight it.
5. There’s no time to lose
If governments are serious about meeting Target 8.7, around 9,000 people need to be removed each day from the ranks of those affected by forced labour, modern slavery and human trafficking, according to the best available estimates. Right now, we simply don’t know how close we are to achieving that aggressive rate of reduction.
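The “around 9,000 people per day” figure can be checked with back-of-envelope arithmetic, assuming (as the 2017 Global Estimates report) roughly 40.3 million people in modern slavery and the 2030 deadline of the Sustainable Development Goals, about twelve years away at the time of writing:

```python
# Rough check of the daily rate of reduction implied by Target 8.7,
# assuming 40.3 million people affected (2017 Global Estimates) and
# roughly 12 years remaining until the 2030 SDG deadline.
victims = 40_300_000
days_remaining = 12 * 365
per_day = victims / days_remaining
print(f"Required reduction: about {per_day:,.0f} people per day")
```

The result is on the order of 9,000 per day, consistent with the figure cited above; the exact number depends on the prevalence estimate and the time window assumed.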
The pieces in this Symposium make clear that we are on the verge of significant scientific breakthroughs in understanding what is likely to make someone vulnerable to modern slavery – and in tailoring programming and policies accordingly. Achieving those breakthroughs requires continued investment in the science of anti-slavery, and a willingness to think laterally – and digitally.
And ultimately it will require the kind of honest debate and scientific rigour that all the contributors to this Symposium have so admirably modelled. At Delta 8.7, we look forward to continuing to provide a space for that debate, and for translating it for policy actors.