Creating a Humanitarian Data Ecosystem
More data is now produced each year than in all of prior human history combined. The humanitarian sector, which once struggled with data scarcity, is now overwhelmed by the volume of data collected for even a single project.
Primary and secondary data, along with media-based sources, generate information on a scale that no single organization or analyst can handle alone. Making proper use of this abundance has clear advantages: analysts gain access to near-real-time information and can make better-informed decisions.
Secondary Data Review (SDR)
There are quite a few tools available to process humanitarian data today. Quantitative data from surveys, for instance, is processed and often shared through HDX, a UN OCHA initiative. Most qualitative data, however, is unstructured; processing and analyzing it requires a significant investment of time and resources, leaving a large part of the data unprocessed or underutilized.
Data Entry and Exploration Platform (DEEP)
Following the Nepal Earthquake in 2015, Ewan Oglethorpe (Executive Director at Data Friendly Space) landed in Kathmandu to lend a helping hand. Ewan, a data scientist with roots in Silicon Valley, joined the team of crisis responders at the tented camp in the UN country office compound. He realized that the UN's system for analyzing the available data was rudimentary and could benefit from more advanced technology. During the crisis, Ewan, with a small tech team, worked in consultation with humanitarian experts and analysts to create the first version of the Data Entry & Exploration Platform (DEEP).
“There are several software solutions available to manage and process qualitative data, including NVivo, MAXQDA and DEEP. Both ACAPS and UNHCR are piloting a project in DEEP, a platform specifically developed by and for humanitarian actors to process substantial amounts of unstructured data. Users can upload a variety of sources (news articles, PDFs, Word documents etc.) and tag/categorize them using custom analytical frameworks. Catalogued information can then be exported into Excel or Word for further analysis.”
Source: Pilot — Joint Processing of Qualitative Data on the Rohingya Crisis, Humanitarian Response, an OCHA Service (May 2018)
Since its inception, DEEP has been used for Secondary Data Review (SDR) and the tagging of large datasets in more than 1,200 projects, supporting humanitarian responses across all humanitarian sectors worldwide.
Data Friendly Space and DEEP
Data Friendly Space (DFS) is currently the technical supervisor and host of DEEP. DFS has been implementing DEEP projects in collaboration with several major humanitarian organizations and supports the DEEP governing board, whose members include UNICEF, UNHCR, UN OCHA, OHCHR, the International Federation of Red Cross and Red Crescent Societies, ACAPS, IDMC, Okular-Analytics, JIPS and iMMAP. Since its first project in 2018, DFS has continued to build both its own capacity and that of its partners to make a lasting impact.
Today, with more than 85,000 annotated humanitarian response documents hosted on the platform, DEEP is in a unique position to leverage NLP models to fuel much faster responses to humanitarian crises.
Machine Learning to Expedite Humanitarian Action
Since its inception, DFS has been focused on creating data-centric applications to support humanitarian organizations in extracting actionable insights from their data and fulfilling their missions.
With the development of new NLP models, DFS aims to automate the secondary data reviews currently carried out by content-tagging teams, a lengthy manual process, and to enable humanitarian stakeholders to respond rapidly to any crisis by concentrating on the analysis of data rather than data acquisition alone.
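The tagging step that DFS aims to automate can be understood as multi-label text classification: each excerpt from a source document is assigned one or more categories from an analytical framework. As a minimal sketch only (this is not DEEP's actual model, and the framework categories and keyword lists below are invented for illustration), a naive keyword-based tagger might look like this; trained NLP models learn such associations from annotated documents instead of relying on fixed keywords:

```python
# Toy multi-label tagger: assigns framework categories to a text excerpt by
# keyword matching. Hypothetical framework, invented for illustration only.
FRAMEWORK = {
    "Health": {"cholera", "clinic", "vaccination", "disease"},
    "Shelter": {"tents", "camp", "housing", "shelter"},
    "Food Security": {"malnutrition", "harvest", "food", "rations"},
}

def tag_excerpt(text: str) -> list[str]:
    """Return every category whose keywords appear in the excerpt."""
    words = set(text.lower().split())
    return sorted(cat for cat, keys in FRAMEWORK.items() if words & keys)

excerpt = "Cholera cases are rising in the camp and rations are running low."
print(tag_excerpt(excerpt))  # ['Food Security', 'Health', 'Shelter']
```

A learned model replaces the keyword sets with statistical associations mined from the tens of thousands of annotated documents already on the platform, which is precisely why DEEP's annotation history is valuable training data.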
DFS concentrates on the intersection of AI-powered data automation and human knowledge, applying each where it can strengthen the other in the analysis. DFS' NLP innovation team has grown to five specialized engineers spread across Europe.
In addition to the existing NLP team, iMMAP, a DEEP board member, has initiated a research partnership with the ISI Foundation, a prestigious private research institution based in Turin, Italy, whose work is rooted in Complex Systems Science. The ISI Foundation has appointed Nicolò Tamagnone to work exclusively on implementing NLP features in DEEP.
Open-source Technology
As a leading global non-profit, DFS is dedicated to keeping its technologies accessible to emergency response and development organizations. To reach this goal and maximize its impact, DFS develops DEEP as open-source technology.
To learn more about DFS, its projects and DEEP, please sign up for our newsletter at www.datafriendlyspace.com
Or write to us at hello@datafriendlyspace.com
Written by: Rishi Jha — Communications & Partnerships — Data Friendly Space