Jennifer Chan, MD/MPH, is an Associate Faculty Member of the Harvard Humanitarian Initiative (HHI) and Lead Instructor for Humanitarian Technologies at the Humanitarian Academy at Harvard University. She is also an Assistant Professor and Director of Global Emergency Medicine at Northwestern University’s Feinberg School of Medicine. Her recent activities have focused on humanitarian technologies and crisis mapping with a focus on field operations. As a consultant, she helps evaluate open source technology organizations such as Ushahidi, trains emerging practitioners in humanitarian technologies, and researches the interface between humanitarian agencies and volunteer & technical communities.
Q: Can you tell us a bit about your work at the intersection of technology and humanitarian assistance?
A: The area of work I’ve focused on over the past couple of years is commonly referred to as crisis mapping in the field of humanitarian technology. Humanitarian technology refers to the application of technology to address humanitarian issues, including both acute onset disasters and complex ongoing humanitarian emergencies.
I’ve worked in high-tech, low-tech and near no-tech settings, and this has pushed me to continually think about and re-explore the relationship between information, privacy, security, and ethics. As a researcher and consultant I’ve worked with NGOs and other agencies to help them strategically and programmatically integrate ICT into complex humanitarian situations, including disasters, conflict settings, and drought-prone regions. This has included assessing when [the use of technology] is appropriate, and how it can be used safely.
Q: From an ethics perspective, what do groups designing development projects with a technology component need to keep in mind?
A: A key ethical perspective to keep in mind is the “do no harm” principle. Codes of conduct, when revised and applied to crisis mapping projects, provide very important guidance, but this is still a work in progress. A concise, practical, and widely accepted set of ethical guidelines, along with best practices that take existing principles and standards into account, has yet to be developed.
ICT projects have the potential for faster, more accurate and wider inclusion of information, but they also present vulnerabilities that can cause risk and potential harm. Information shared using technology can be accidentally misrepresented or misinterpreted, carelessly distributed without consent, and even intentionally manipulated for a multitude of purposes that are not in line with the good intentions of the ICT project. And there are additional vulnerabilities that lie within the technology itself. But the flip side is that these types of obstacles are frequently surmountable with a good framework for learning, iteration, and revision embedded in the project.
Working with technology, it is helpful to recognize and accept the tenet that access to information is related in many ways to power. Understanding that first is key. Beyond thinking about how technology can influence the design of a project, organizations should seek to understand the existing information ecosystem by asking: What decisions would be made, or how would behavior change, on the basis of the information gathered with technology? Where are the vulnerabilities of information collection, sharing and communication that exist with the intended users and beneficiaries of the project, and with the technology itself?
In my evaluations of these types of projects, I frequently see organizations thinking about how they’ll use technology before developing a nuanced understanding of what information is desired, or how the information collected would be used. Rather infrequently is there a comprehensive assessment of information as it relates to privacy, risk and harm. I often recommend that this process be inverted and that efforts be made to include as many users as possible in the discussion. This speaks to the importance of user-centered design.
Q: What best practice have you seen in the use of ICT in development projects that incorporates sound ethical principles or guidelines?
A: In my experience, many of the emerging ethical principles and guidelines in these projects are focused on information, security and privacy. On the design side, organizations need to be mindful of the various planning phases and how much time it takes to understand the information security and privacy context. There needs to be an assessment at the country level of what information and privacy guidelines need to be considered. In addition, organizations should outline steps for the project, ask stakeholders to recognize and abide by codes of conduct, and link these activities to ethical principles that relate to ICT as well as to the organization’s mission.
It’s also critical to engage stakeholders at the local level to understand what is important to consider when it comes to information and privacy. In certain contexts a change in region within a country could mean a significant change in risk. Engaging the right people in the design process is essential to developing a sound security protocol. This can require a lot of resources, planning and advocacy for adequate funds to achieve these important goals, but will serve the project better over the long term.
Q: What about the importance of enabling a user to “opt in” or “opt out” in the context of a humanitarian mapping project?
A: With any crisis mapping project, the first thing you need to do is consider the purpose of integrating ICT. But before you collect information you need to ask yourself and the community members involved: Why use a map? What’s the anticipated purpose? And within this exercise, also ask whether or not this has the potential to do harm in addition to good.
Ideally, you would reach out to all of the people in the information ecosystem. You would ask local communities: This is what we’d like to do, what do you think about it? Find out what people’s feelings are. Is this ability to share information beneficial or uncomfortable to them? These conversations reveal whether there are under-recognized or unanticipated challenges.
Q: Do ethical considerations change depending on whether you’re dealing with an area where there is high access to technology versus one with low or no access?
A: To some degree, yes, and to some degree no. What it really comes down to is an understanding of the information flow. You could have online-based mapping or curation of crowdsourced information, but if that information goes into a print format and is shared with communities, or if meetings take information from an online source and then folks talk about it, then you’re looking at an online-offline information pattern. Sharing information in an offline environment does not necessarily remove the risks associated with it.
To ensure information security and privacy, it’s useful for organizations to map out entire information communications systems, from high- to low- and no-tech communications, and then build in a vulnerability and risk model that engages the community in this process to make this a participatory practice. Once you have this framework, then challenge the assumptions around the risk model. Could there be risk in collecting information by itself, even in an offline environment? Having this framework facilitates community engagement, and allows analysis at levels from global to local.
Q: What else is important to consider?
A: Scenario planning is important from a programmatic design standpoint. Whether you’re using FrontlineSMS or another ICT platform, scenario planning and simulation are valuable to understand vulnerabilities, risk, and unexpected privacy issues. It also can demonstrate how much time is required to start – often much more than people realize.
Having a budget line item for a simulation and practicum period is also very useful. Taking policy-level discussions into simulation to implement protocols and understand what happens in the last mile helps strengthen the implementation phase of the program. You can learn from mistakes and ideally mitigate any potential negative effects on beneficiaries. Finally, simulations and a practicum period give the people involved in the project time to learn the tool and, more importantly, the process, and hopefully come to a deeper understanding of how ICT fits within the larger project goal. Organizations that do this are much more successful than many of the implementations we’re seeing.
Q: What is the significance of organizations like United Methodist Communications incorporating ICT4D into their project design?
A: It’s important for groups thinking about engaging in ICT-related activities, particularly those that are new to it, to remember that no matter what tool or technology approach you take, these interventions ultimately come down to access to information and the benefit and power associated with it. That’s why it’s important to have a thoughtful process around what that means, and what access to information entails for different people. We frequently assume that there are inherent benefits associated with increasing access to ICT — but are there also risks and harm associated with it as well?
We have to remember that when it comes to privacy and security in crisis mapping or ICT4D, this is a work in progress. Every organization should be part of the larger conversation about what works and what doesn’t—not only as listener, but as contributor. There’s still a ways to go.
For more information about best practice in privacy and security in ICT implementations, see:
Chamales, George, Lea Shanley, and Aaron Lovell. Towards Trustworthy Social Media and Crowdsourcing. The Wilson Center, Washington, D.C. 2013.
Collaborative Learning Projects. The Do No Harm Handbook (the framework for analyzing the impact of assistance in conflict). Collaborative for Development Action, Inc., Cambridge, MA. 2004.
The International Committee of the Red Cross. Professional Standards for Protection Work. 2013 edition. Chapter 6, p. 77.