Refugee resettlement organizations bear the important responsibility of ensuring refugees flourish in their new locations, making decisions “without regard to race, religion, nationality, sex, or political opinion” as prescribed by the U.S. Code. The recent adoption of automated decision-making (ADM) systems in this process, however, may be at odds with that requirement because of the systems’ potential for bias.
ADM systems, a form of artificial intelligence, perform decision-making tasks previously carried out by humans. At present, at least half of the agencies handling refugee resettlement use this technology to aid the process.
Notably, in 2018, the refugee resettlement organization HIAS collaborated with researchers to build Annie MOORE, an ADM system that suggests placements likely to optimize employment prospects for arriving refugees. The system, however, relied heavily on short-term data, such as employment outcomes 90 days post-arrival, while neglecting critical areas such as long-term employment, physical and mental health, education, and household earnings.
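To make the underlying idea concrete: at its core, a placement tool of this kind treats resettlement as an assignment problem, pairing cases with locations to maximize a predicted outcome. The sketch below is purely illustrative and is not the actual Annie MOORE implementation; the scores are invented, and the use of SciPy's Hungarian-algorithm solver is an assumption about how such a matching could be computed.

```python
# Illustrative sketch only: a toy assignment problem in the spirit of
# matching-based placement tools. All numbers are invented; real systems
# estimate such scores from historical outcome data.
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical predicted 90-day employment probabilities:
# rows = refugee cases, columns = candidate host locations.
scores = np.array([
    [0.60, 0.35, 0.20],
    [0.40, 0.55, 0.30],
    [0.25, 0.45, 0.50],
])

# Maximize total predicted employment by minimizing the negated scores.
rows, cols = linear_sum_assignment(-scores)
print(list(zip(rows, cols)))               # chosen case -> location pairs
print(round(scores[rows, cols].sum(), 2))  # total predicted employment
```

The sketch also illustrates the article's central concern: whatever short-term metric fills the score matrix (here, 90-day employment) is exactly what the matching optimizes, so gaps in that data flow directly into placement decisions.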
Subsequent systems such as RUTH have been implemented, but limited data remains a concern: as the system’s developers concede, the data for the past 15 fiscal years are incomplete.
While ADM technologies can streamline the refugee resettlement process, reliance on limited data can propagate inequality. This concern is underscored in a February 2023 working paper from Harvard Business School, whose authors argue that current resettlement algorithms could present considerable legal complications due to their differential impacts on refugees based on age, education, and country of origin.
Similarly, Recital 71 of the EU’s General Data Protection Regulation directs those who govern ADM systems to prevent discriminatory effects. While the U.S. has not yet passed comprehensive data privacy legislation, Title 8 of the U.S. Code provides sufficient authority to regulate resettlement agencies’ decision-making methods if those methods discriminate against individuals on the basis of nationality.
This potential for bias is further underscored in legal cases such as Bauserman v. Unemployment Insurance Agency, in which the court upheld the plaintiffs’ right to seek damages over an ADM system’s alleged violation of due process.
To avoid legal pitfalls and potential biases, resettlement agencies must ensure that their use of ADM systems complies with the law and does not discriminate against the refugees these systems are intended to help.