As the humanitarian sector continues to adopt a range of new technologies, we, as humanitarians, need to confront the framework of power and politics within which technology is developed and deployed. Technologists and humanitarians can no longer avoid the centrality of ethics and politics in the use of tech solutions.
Earlier this year, I worked on a landscape review of emerging technologies in the humanitarian sector, examining strategies to incorporate ethics and humanitarian principles into the design, development, and implementation of emerging technologies within the sector. This global study included interviews with leading humanitarian organisations, ethics workshops, a horizon-scanning exercise, and engagement with stakeholders in research and implementation.
What became clear through this study was that: a) consistent use of emerging technology is already taking place within the humanitarian sector, and b) concerns over the ethical use of emerging technology were paramount to humanitarian organisations, but they were struggling to devise guidelines that would ensure rapid technological adoption was informed by ethical design.
Throughout this study, humanitarians expressed concerns regarding the misuse of data and data colonialism, and stressed the need for technological solutions to be designed more closely with the communities they seek to assist.
- Participant in the ethics workshop: “We need to create systems that prioritise human interests over organisational agendas.”
Digital harm is real and is being experienced by communities in crisis. There are two ways in which we can think about digital harm. The first concerns the use of technology for war or other destructive aims. For example, while AI solutions seek to improve efficiency in humanitarian settings, AI has also become embedded in global military operations. AI-based drone warfare has been used in Ukraine, and in Gaza there have been reports of AI-generated ‘kill lists’ causing immense civilian harm (1). This raises serious questions about the use of certain types of technology, their sourcing, and their other uses.
The second kind of digital harm concerns the misuse of data collected through digital tools and applications. In one of our workshops, participants recalled instances where data collected by humanitarian organisations was ‘requested’ by host governments. Quite understandably, there are well-founded concerns that such data could be used to harm certain communities or individuals. It is also important to note that the communities most likely to be harmed by these data privacy breaches were rarely involved in the processes that determined the design and use of data systems.
Participants recognised that private technology companies are driven by profit and market penetration rather than any broader humanitarian or altruistic concerns. Technologies used in the humanitarian sector were often developed in other contexts, and it was unclear how data and experiences from humanitarian programming were informing, or being used by, tech companies in other spaces. While this danger was recognised, we rarely came across technology case studies that adequately discussed it. Most material focused only on the specific use case of the technology, devoid of the context of its development.
While ‘cross-cutting’ technologies, i.e. technologies that had been used in other regions or contexts, were the most developed, they were also less likely to cater to the localised needs of each community. We found that in the majority of cases, community involvement came towards the end of the design process, if at all. There were very few instances where active community engagement had led to the redesign of a technology, or where technology was co-developed with humanitarians, let alone with the communities that we serve. Localisation agendas continue to centre external actors, which risks reproducing the same global power structures and hierarchies that localisation seeks to replace (2). Our deep dives found instances where externally designed systems failed to account for community structures and beliefs, limiting the potential benefit of emerging technology (3). For example, in research on early warning systems we found an instance from Pakistan where communities felt a sense of responsibility for, and belonging to, their ancestral lands, and therefore did not engage with evacuation requests.
It is abundantly clear that localisation agendas which do not place communities at the heart of the design process, and which remain driven by external actors such as donors and global technology companies, will not deliver the benefits hoped for from the use of technology.
As Kaurin argues in ‘Why the localisation of aid requires the localisation of technology’ (2021) (4), ‘Improving accessibility and creating opportunities for local individuals, not necessarily to get them to use certain tools or the social media platforms we want them to use because it’s easier for us, but rather to choose the right tools and build their own, in their own language, and on their own terms’ is integral to actioning localisation more broadly in humanitarian action.
Conclusion
The use of technology in humanitarian crises is widespread and is reshaping the way humanitarian organisations function. At the same time, humanitarian organisations are struggling to build organisational guidelines that can guide the ethical design and use of technology, especially given the rapid advances in the field. To develop these frameworks, organisations will need to better conceptualise how technology is politicised, and to recognise that it is not an apolitical magic bullet for implementation and efficiency concerns. Addressing the power structures that underpin technology design and use is crucial to developing an ethical, inclusive approach to integrating new technologies into humanitarian programming.
In conclusion, I will leave you with three key takeaways.
- The war aims of technology are real. Humanitarian organisations and researchers must look closely at the context in which a specific technology was developed and its potential alternative uses. Enhanced due diligence and ethical reviews are critical to ensuring humanitarian organisations do not work with companies that are responsible for driving the crises that humanitarians are responding to.
- Humanitarian organisations can no longer look at technology as an apolitical tool. The use of tech is fraught with politics, and looking only at the technical elements can lead to harm to communities in crisis. Further research on, and consideration of, the ethical implications of the relationships humanitarians build with the technology sector is vital.
- Community-led action is crucial to identifying technologies that can have a real impact on crisis-affected communities. Localisation agendas continue to centre external actors in ways that undermine the very principles of localisation. More needs to be done to centre communities, and their experiences and perceptions of emerging technologies, in technological development and humanitarian implementation.
To find out more about our emerging technology project, take a look at UKHIH's Systems Innovations Partnerships.
References
- Abraham, ‘Lavender: The AI machine directing Israel’s bombing spree in Gaza’; Wallace-Wells, ‘What War by AI actually looks like’
- Mulder, The paradox of externally driven localisation: a case study on how local actors manage the contradictory legitimacy requirements of top-down bottom-up aid, 2023
- Paillé, Pauline, James Besse, Hampton Toole, Chryssa Politi, Shruti Viswanathan, Eunice Namirembe, Jyoti Nayak, Sergi Martorell, Iain McLaren, Christopher Tyson, Charlie Wilkening, and Jacob Ohrvik-Stott, Emerging Technologies in the Humanitarian Sector: Technology Deep Dive Series. Santa Monica, CA: RAND Corporation, 2024. https://www.rand.org/pubs/research_reports/RRA3192-1.html.
- Kaurin, Why the localisation of aid requires the localisation of technology, 2021
Shruti Viswanathan is a data governance and digital ecosystem expert with over 13 years of experience working at the intersection of governance, service delivery, social protection and the use of technology. Co-production of knowledge and tools with local actors is a key focus of her work, ensuring that it is grounded in the needs and aspirations of the end users. In 2024, Shruti worked with UKHIH on a research study exploring the use of emerging technologies in the humanitarian sector: opportunities and ethical challenges.
All views expressed belong to the author and do not reflect the position of UKHIH.