Expert Panel April 19-30: Health IT for Disaster Relief and Rebuilding: Lessons from post-earthquake Haiti.

By Joaquin Blaya, PhD Moderator | 19 Apr, 2010 Last edited by Robert Szypko on 03 Aug 2011

Hi everyone,
We are pleased to officially open our first GHDonline Expert Panel, “Health IT for Disaster Relief and Rebuilding: Lessons from post-earthquake Haiti.” Below are descriptions of the projects each panelist is carrying out in Haiti. We thank the panelists for this opportunity to interact with them, and MobileActive.org for blogging about this event; a summary will be available at www.mobileactive.org.

We encourage all members to send their questions and comments to this amazing panel.

John Brooks (MSF/Doctors Without Borders)
Scope of work:
After the earthquake I left for Port-au-Prince on January 28, my seventh trip to Haiti in the past 18 months, and I stayed until the end of February. The main objectives of my field assignment were to adapt the OpenMRS-based data collection and reporting system to the post-earthquake setting and to establish a system for capturing follow-up variables for patients in post-operative care, including procedures for capturing and reporting surgical complications. MSF's emergency response teams are highly experienced and extremely capable, but they are accustomed to MSF's tried-and-true systems for data collection and activity reporting, which are based on paper registers located in each service and ward. Because the OpenMRS project was a pilot program that had been deployed only in Haiti, most of the arriving MSF staff were not familiar with it, and it was difficult to explain the utility of a robust relational database in that setting, especially in French.

The main challenges of the project in the post-earthquake setting were training the many new expatriate staff to complete the data collection forms correctly, especially when any coding was required, and training the hospital staff in the critical importance of maintaining the integrity of the patient identification system. There were lots of other challenges, like computer viruses and short-circuited wiring buried in the mud of the tent floor after a rainstorm, but overall the systems for data collection and reporting had stabilized by the time of my departure. I can't speak for all of MSF, but I can say that my experience in Haiti has taught me that we need to assess how we can integrate data management and better informatics tools into disaster response and critical emergency settings.

Ed Jezierski (INSTEDD)
Problems:
- Transitioning technologies that were rapidly deployed after the earthquake into the hands of locals
- Deploying technologies in a way that supports overall health systems strengthening and local ownership, rather than a proliferation of unintegrated efforts
Solution --- always in progress!
- Working with local developers to help implement changes to systems or provide basic technical support
- Working with telecommunication companies to provide more than one-off solutions
- Working with major partners (Partners In Health, I-TECH), the mHealth Alliance, and others
Scale/Scope
- Port-au-Prince wide or national
Lessons Learned
- Technology investments need more supporting data (e.g., on protocol adherence, efficacy, and efficiency) before traditional health funders see their value as obvious
- Opportunities to leapfrog inefficiencies are being missed, and as funding leaves they may never materialize
- The influx of capital has created a competitive environment in Haiti for local organizations participating in reconstruction

Josh Nesbit (FrontlineSMS)
Working alongside a number of partners, FrontlineSMS:Medic helped coordinate the 4636 Project, an effort to create an emergency communications channel after the 2010 earthquake in Haiti. Teaming up with the Office of Innovation at the US Department of State, technology groups (Ushahidi, InSTEDD, CrisisMappers at Tufts, CrowdFlower, Samasource, Sahana Foundation, and others), Haitian mobile operators, and aid organizations, we created a system to process text messages communicating needs from the ground. Using crowd-sourced translation, categorization, and geo-tagging, reports were created for first responders within 10 minutes of receiving an SMS. Over 70,000 messages were received in the first month of operation, focusing relief efforts for thousands of Haitians.
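To make the workflow concrete, here is a schematic sketch in Python of the translate, categorize, geo-tag, and report steps described above. The data shapes and step functions are illustrative assumptions, not the actual 4636 implementation.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Callable, Optional, Tuple

    # Schematic sketch only: the step functions stand in for the crowd-sourced
    # translation, categorization, and geo-tagging performed by volunteers.

    @dataclass
    class IncomingSMS:
        sender: str
        body: str                 # usually Haitian Creole
        received_at: datetime

    @dataclass
    class ResponderReport:
        original: IncomingSMS
        english_text: str
        category: str                             # e.g. "medical", "water", "trapped"
        location: Optional[Tuple[float, float]]   # (lat, lon) if geo-coding succeeded

    def triage(sms: IncomingSMS,
               translate: Callable[[str], str],
               categorize: Callable[[str], str],
               geocode: Callable[[str], Optional[Tuple[float, float]]]) -> ResponderReport:
        """Run one message through the pipeline and produce a report for responders."""
        english = translate(sms.body)
        return ResponderReport(sms, english, categorize(english), geocode(english))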

We are currently planning new pilots with Partners in Health -- I will share details with the GHDonline community as the scope of work solidifies. The projects may incorporate OASYS (a new messaging module for OpenMRS), PatientView (an SMS-based, lightweight EMR), or modules under development.

Replies

 

Alice Liu Replied at 3:29 PM, 19 Apr 2010

Hi John, Ed, and Josh. Thanks for sharing your thoughts and experiences with us, and thanks to the GHD project for setting up this discussion. I was also in Haiti post-earthquake: in late January with Mercy Corps, and again in March/April for a one-month assignment on a USAID agriculture project.

To respond to one comment about local developers: if you haven't yet made contact with the Haitian ICT association, I can put you in touch with their VP. They represent about 40 organizations spanning various segments of ICT (telcos, MNOs, ISPs, software developers, hardware resellers, etc.), mostly based in Port-au-Prince. There's also the Presidential Commission on ICT, which I couldn't fit into my scope of work, but I have contact info for one or two people on that commission if you need it.

Also, regarding this statement: "Technology investments need more supporting data (e.g., on protocol adherence, efficacy, and efficiency) before traditional health funders see their value as obvious":

One thing's for sure - with all this money going into Haiti, you know there's going to be an audit at some point, so an M&E strategy and M&E systems of some kind will be needed. The trick will be how to do this without burdening an overburdened medical team even more.

Myles Clough Replied at 1:09 AM, 20 Apr 2010

In the run-up to this panel the point was made that standardized reporting was too important to neglect. This panel would seem to be a good starting point for that process. One can assume there will be another disaster somewhere unexpected at some unknown but not-distant time in the future. Following that event there will be a large number of emergency disaster relief agencies providing medical services, including surgery, for the victims.
Can you (the panelists) agree on the following -
A standard reporting system for disaster relief treatment should be available to all responders. They may not all use it but they all could and they should be encouraged/expected to.
All agencies likely to send medical representatives should be aware of the reporting system and teach their staff to use it as part of the pre-mission briefing. Who would set this up?
The reporting system would be very cheap, very robust and very simple. Something of the order of an SMS message to a hub.
What is the minimum data-set which should be supplied?
What candidates for a reporting system do you propose, and why do you support that one over others?

There are some sneaky issues here. MSF sounds like it is well organized and expects its doctors to keep good records, probably better records than would be proposed in the "reporting system" we are discussing. Would MSF doctors have to report twice, or would they use a general system because it is universal even though it isn't as complete as they would like? If they don't contribute to the universal system then patients treated in the emergency would likely be lost to follow-up when they come back to a non-MSF treatment center. The point that is exercising the doctors at this stage is that they don't know what was done at earlier stages; one of the key requirements is that we all contribute to the same data pool. You have to balance between providing an elaborate system that won't be used and a simple one which doesn't have enough information.
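Purely as a straw man for the "SMS message to a hub" idea, a minimal data-set and parser might look like the sketch below; the fields and the pipe-delimited layout are illustrative assumptions, not a proposal anyone has agreed to.

    from dataclasses import dataclass
    from typing import Optional

    # Straw-man minimum data-set for a single-SMS field report; the field names
    # and pipe-delimited layout are assumptions for discussion only.

    @dataclass
    class FieldReport:
        site: str          # reporting facility or team code
        patient_id: str    # locally assigned identifier
        age_sex: str       # e.g. "34M"
        diagnosis: str     # short free text or code
        procedure: str     # intervention performed, if any
        disposition: str   # admitted / transferred / discharged / died

    def parse_report(sms: str) -> Optional[FieldReport]:
        """Parse a message like 'SITE-A|P-0042|34M|open tib fx|ex-fix|admitted'."""
        parts = [p.strip() for p in sms.split("|")]
        if len(parts) != 6:
            return None    # reject malformed messages rather than guess
        return FieldReport(*parts)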

Bill Lober Replied at 9:59 AM, 20 Apr 2010

Myles raises an excellent point with his comment about "reporting twice", which brings to mind the costs and benefits of primary vs. secondary data collection.

The informatics side of our work in Haiti (since 2005) has been about implementation of facility-level EMR systems, AND the secondary use of information from those systems to support M&E and surveillance. The latter approach is less focused, requires more infrastructure, delivers considerably more benefit for clinical care, and may be more sustainable. Apples and oranges, humanitarian aid vs. rebuilding, perhaps. That "secondary use" piece, vs. primary reporting, is an important component of sustainability (we think...)

Also, I'd be curious what the panelists think about the system envisioned above, where "all responders... could and should... (use a standardized reporting system)."

That's a contrast to the way things work in the non-disaster mode in the States, where 1) we have all the problems of non-standardization, uneven implementation, etc., but we also have 2) local decision making and local ownership of systems, local capacity development, some inherent sustainability based on (admittedly biased & skewed) funding mechanisms, etc. I hate to use the States as an example of anything good (!), but my point is that external implementation of a standardized system in time of disaster is somewhat at odds with local capacity development. 10 years ago, there was a dictum in the US public health surveillance community that systems had to be used in time of non-crisis to be useful in a time of crisis. The evolution of syndromic surveillance in the US and EU from bioterrorism to episodic outbreak to pandemic flu would (to me) show some progress in that direction.

Bill

Background: I-TECH, in Haiti, provides TA to support the Ministry of Health's (MSPP) network of EMR systems - presently just under 70 sites deployed - with a mix of local, web, and mobile technologies. Not a big change in deployment since the earthquake, except that we lost a bunch of sites, and added some new ones.

John Brooks Replied at 12:22 PM, 20 Apr 2010

I'll respond to Myles's e-mail first because he asks about MSF
specifically and maybe Josh and/or Ed can follow up. I'm not sure it's
possible on this panel to "agree on... a standard reporting system for
disaster relief treatment." There are so many components; in MSF's work
alone the response has run the gamut from trauma surgery and general
primary and mental health care to non-food-item distribution and
installation of latrines and water sources. Reports of these activities
are relevant to a variety of actors on the ground, but no singular
reporting system would meet all the needs. So is it easier to break it
down into reporting "modules" where each activity has its own reporting
standard? Possibly. I'll start with the data collection standards MSF uses
for surgery:

All five European operational centers within MSF are running field
projects in Haiti at the moment. Systems for data collection and reporting
are different for each section, but just after the earthquake all sections
agreed to report surgical activity in a 20-column (i.e., roughly 20
reporting variables) Excel spreadsheet. These data, ranging from patient
and surgeon initials to cause of injury and intervention type, give a
good overview of activity but provide little support for the analysis of
surgical outcomes. There are no intersectional standards for post-op data
collection, nor is there any EMR system in place for multi-site access to
patient treatment history. When a patient is transferred within MSF or to
an external center, the summary of hospitalization or care is written by
hand on paper. There are currently no plans that I know of to implement an
intersectional EMR system within MSF for disaster response or any other
activity for that matter except maybe the standard database used for HIV.
There are also no projects within MSF that I know of which use SMS-based
reporting or handheld devices for data collection.
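As a rough illustration, one row of that intersectional spreadsheet might look something like the sketch below; only the variables named above (patient and surgeon initials, cause of injury, intervention type) are grounded in the actual tool, and the remaining fields are stand-ins for the other columns.

    from dataclasses import dataclass
    from datetime import date

    # Rough illustration of one row of the intersectional surgical activity
    # spreadsheet; fields beyond the four named in the text are assumptions.

    @dataclass
    class SurgicalActivityRow:
        operation_date: date
        project_site: str
        patient_initials: str
        surgeon_initials: str
        cause_of_injury: str          # e.g. "building collapse", "road traffic"
        intervention_type: str        # e.g. "debridement", "external fixation"
        anaesthesia_type: str = ""    # assumed additional column
        complication_noted: str = ""  # assumed additional column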

The epidemiologists who work with MSF in PaP are routinely part of health
and surveillance cluster meetings, and MSF reports to these clusters
according to the standards they have set. There are some standard MSF
Excel spreadsheets used to collect weekly tallies of cases within
outpatient and inpatient centers (they do not contain individual patient
data), but these tallies must be re-entered in the tools provided by the
surveillance and health clusters.

I certainly understand the merit of discussing the standard variables
which should be used for activity reporting, and if it would be helpful I
can describe the specific variables we use in MSF-Paris for surgical
activity and post-op follow-up. I myself am not in a position to comment
on which platforms are better than others. We had very good success with
OpenMRS during a 6-month pilot in the MSF trauma surgery hospital which
was destroyed by the earthquake, and the database itself is still
functioning well but the quality of reporting in that system is currently
limited by form completion errors.


John Brooks
Médecins Sans Frontières (MSF)/Doctors Without Borders
www.doctorswithoutborders.org

Hamish Fraser, MBChB, MRCP, MSc Moderator Emeritus Replied at 2:29 PM, 20 Apr 2010

Great discussion so far. I would like to comment on Myles's point about a standard disaster reporting system. In principle I think it is right to aim for a core data set that is essential for managing patients and also for reporting and surveillance. Doing this in practice is challenging even within one widespread organization. The approach I would suggest is to work out what the deliverables are for this process (number of people who have had different surgeries, ability to find out what surgery a patient had at future visits, tracking urgent supplies, etc.) and then design the data collection form around that. Agreeing on a limited core set is the best approach, and then projects can decide what tools to use to collect and manage that. Sites that need or want to collect more data can do that but still report the core.

With OpenMRS we plan to report core values to national systems so others can benefit from that (we already do this with aggregated data for HIV). That way we should avoid the need for double collection. Working with I-Tech in Haiti we are going to build a standard individual patient summary that will allow a particular patient's record to be requested and shared in a secure fashion. That is an alternative to a national database of everyone who has been assessed or treated.
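To give a sense of what such a shareable summary might contain, here is a minimal sketch; the fields and structure are illustrative assumptions, not the actual PIH / I-TECH specification.

    # Illustrative only: a minimal shareable patient summary plus the "core values"
    # that would roll up to national reporting. Field names are assumptions, not
    # the PIH / I-TECH specification.
    patient_summary = {
        "summary_id": "example-0001",
        "demographics": {"sex": "F", "birth_year": 1976, "commune": "Carrefour"},
        "encounters": [
            {
                "date": "2010-02-03",
                "site": "field-hospital-A",
                "diagnosis": "open tibial fracture",
                "procedure": "external fixation",
                "follow_up_needed": True,
            }
        ],
        "core_report": {"surgeries": 1, "inpatient_days": 12},
    }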

Bill Lober Replied at 2:40 PM, 20 Apr 2010

Yes, Hamish, that reporting of core values to national systems (as both PIH and I-TECH - and others - already do), is exactly what I meant by secondary use.

The use of Emergency Department, school absenteeism, primary care visit, and other data in the US and EU for influenza surveillance (the secondary use of individual health data for population health monitoring), is a great example.

Glad you raised the point about standardized sharing of individual-level records as well. Some public health uses (notifiable condition reporting, case surveillance, quality improvement) rely on individual level reporting rather than population level reporting.

Bill

Om G Replied at 3:29 PM, 20 Apr 2010

It is common lately for me to hear that it's 'too difficult' for the relief community to come up with interoperable reporting mechanisms.

Now is the perfect time: as technology advances quickly into this space, development will leapfrog current mechanisms, and a common frame of reference can only enable more meaningful uses of data.

Even an attempt to codify will give others something to aim at, or start from.

Om

Kurt JEAN-CHARLES Replied at 4:05 PM, 20 Apr 2010

Hi All!

Very interesting initiative here! And a special thanks to Bill for inviting me.

I am a member of the Haitian IT Association and I run a software company involved in different IT projects related to public health needs in Haiti (see www.mesi.ht). My team has also launched a website (www.noula.ht) and a call center accessible through the 177 short code, and I've been in touch with some of the panelists about partnering around this initiative.

As a Haitian IT practitioner, my concern about (health) IT initiatives before the earthquake was data usage at every level of decision making, and the sustainability (from both a cultural and a technological point of view) of our efforts in an environment with limited access to technology and weak public capacity (leadership). That's a serious challenge, but some of us still think we can win this battle, although we do not always agree on the best path to take.

After the earthquake, there is more than ever an opportunity (and a risk of getting it wrong) to show the value of IT and to strengthen local capacities by providing efficient tools that make sense in our environment. For me that means challenges in:

- transferring ownership (tools and processes) to locals;

- showing results (not IT), stressing the value of processes and data transparency as much as possible;

- partnering with a local university or NGO as early as the needs-analysis phase, if possible, to get their vision and establish good practices at that level;

- finding local champions (IT and non-IT evangelists).

Clay Heaton Replied at 4:13 PM, 20 Apr 2010

I agree completely with Om. I built a Ruby on Rails EMR for the HHI field hospital in Fond Parisien in under a week while sitting on site in the pharmacy there.

It runs on a donated computer and has both web and iPhone interfaces. It is used to track pharmacy data, warehouse supplies, patient medical records, medication history, separated children, patient families, volunteers and their skills and travel schedules, and other information relevant to the aforementioned items.

Staff on site can wander the tent rows with iPhones and access up-to-date patient histories at each tent. They've also used the iPhones to manage warehouse distributions of goods (wash buckets, soap, etc.). The iPhone interface is built on jQTouch and launches from the home screen just like a normal app, mimicking native app interfaces.

We're currently at > 4500 records of patient medical events (surgery, physical therapy, medication, transfers, discharge readiness, prosthetics, etc.) for over 600 patients, nearly 5000 records of drugs removed from the pharmacy for over 500 different medications, over 400 different types of warehouse supplies, and tracking of nearly 650 volunteer staff. It also has an electronic Wish List that staff use to track supplies they wish to request from incoming donors. All of the data is standardized and stored in a sqlite3 database.

I remotely back up the database daily via DropBox and provide a state-side clone of it for returned-volunteers to access if they need information about the work they did while on site or if HQ administrators need data for grant proposals, etc.
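The backup step itself can be as small as the sketch below (the paths are hypothetical); it simply copies the sqlite3 file into a Dropbox-synced folder once a day.

    import sqlite3
    from datetime import date
    from pathlib import Path

    # Minimal sketch of a daily backup into a Dropbox-synced folder.
    # DB_PATH and BACKUP_DIR are hypothetical paths on the donated server.
    DB_PATH = Path("emr/production.sqlite3")
    BACKUP_DIR = Path.home() / "Dropbox" / "emr-backups"

    def nightly_backup() -> Path:
        """Copy the live database to a dated file that Dropbox will sync off-site."""
        BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        dest = BACKUP_DIR / f"emr-{date.today().isoformat()}.sqlite3"
        src, dst = sqlite3.connect(str(DB_PATH)), sqlite3.connect(str(dest))
        try:
            src.backup(dst)  # SQLite's online backup keeps the copy consistent mid-write
        finally:
            src.close()
            dst.close()
        return dest

    if __name__ == "__main__":
        print("Backed up to", nightly_backup())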

As for reporting: the operations people on site and in the HHI office contact me from time to time to ask if I can add features or put a report together for them. It usually only takes me minutes to write new reporting modules, SSH to the server in Haiti, and have an upgraded system in place for them.

The frameworks to get these apps off of the ground are free, easy to understand, very easy to deploy, and stable -- we haven't had a single crash of the EMR in the 2 months it has been in use. If the power goes out, as it sometimes does down there, the server reboots itself, relaunches the EMR application, and reattaches to the network. Believe it or not, I also have not had a single person complain that it was difficult to use or even ask me how to perform a task in the system... it is simple and intuitive enough that the staff train each other as they rotate in and out of the camp.

With regards to scalability, I won't worry about the system being overloaded until we have tens of thousands of patients, and at that point, I probably will simply migrate the data to a MySQL database (on the same donated computer) and put a few code optimizations in place. I don't think we'll get there this time around, though.

The great part about all of this is that it's all built around the same simple framework. Ruby on Rails is so ubiquitous that I needed only post a message to the proper Google Group to receive near-instant feedback for tricky programming hurdles I encountered while putting the system together.

Similar systems could be developed, deployed, and localized with ease in more controlled environments. HHI and UChicago plan to convene a task force to examine the performance of the data management systems after hospital activity subsides. I'm working on performance enhancements to the system, will remove some of the Fond Parisien specific modules, and will release the code as open source after the review. In the meantime, I'm happy to help people understand the technology and/or how the HHI/UChicago team uses it.

Kurt JEAN-CHARLES Replied at 4:48 PM, 20 Apr 2010

Although I do not particularly support the idea of one standardized reporting system, I think it is vital to reach a consensus on the definition of the core values (to use Hamish's words) that any system should be able to provide under different situations.

I mean that we should think about having different sets of indicators, defined by health category, for normal or special (natural disaster, health alert, epidemic... green, yellow, red...) conditions at the country level, and make sure that these core values cover the minimal need for decision making and high-level reporting. This approach would be useful for the short term and the long term, as it looks further than disaster relief.
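As a purely illustrative sketch of that idea (the categories, condition levels, and indicator names below are invented for discussion, not an agreed list):

    # Illustrative only: core indicators defined per health category and per alert
    # condition, so routine and disaster reporting share the same definitions.
    CORE_INDICATORS = {
        ("maternal_health", "green"): ["deliveries", "antenatal_visits"],
        ("maternal_health", "red"):   ["deliveries", "maternal_deaths", "referrals_out"],
        ("trauma", "green"):          ["new_trauma_cases"],
        ("trauma", "red"):            ["new_trauma_cases", "major_surgeries",
                                       "post_op_infections"],
    }

    def indicators_for(category: str, condition: str) -> list:
        """Return the minimum set a facility should report under a given condition."""
        return CORE_INDICATORS.get((category, condition), [])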

We've gone pretty far for HIV with the EMRs from I-TECH and PIH and the aggregated data on MESI, but we need to go further and make sure that we can reuse the same capacity for other health programs.

This doesn't mean there would be no place for a single national system, but it would make it possible to divide the scope of work, between operational and governance perspectives, across interoperable and specialized systems designed around clear high-level specs adopted by the local authorities.

Om G Replied at 6:51 PM, 20 Apr 2010

Clay,

Let's make an appliance!

Om

Clay Heaton Replied at 7:05 PM, 20 Apr 2010

Email me and I'll show you what I've done and we can discuss.

Cheers,
Clay

Myles Clough Replied at 11:07 PM, 20 Apr 2010

Clay's solution is magnificent and I would be most interested to acquire the surgery tracking system, or at least the data structure. I agree that designing a system is the easy part. I set up a trauma reporting system on a PDA using HandBase which worked entirely to my satisfaction and to no one else's!
To expand Clay's system to one usable by everyone who responds to a disaster requires all the responders to have an iPhone (and presumably an App or two). This won't happen. I think it is reasonable to expect that virtually all responders will have a mobile phone of some sort but nothing pre-coordinated and no expensive purchases.
Looking down the road, we are hoping to develop a global trauma data collection system which is usable in all LMICs. The disaster database is (in our view) an urgent special case; if the community can develop a system which is robust enough to work in that situation, we think that an expanded version with a fuller dataset could work all over the world. So we, at any rate, would like to see a system that the average overworked emergency doctor or orthopaedic resident in sub-Saharan Africa could and would use.
That rules out iPhones!
"We" is an ad-hoc group of orthopaedic surgeons (and other docs) who are involved in orthopaedics and trauma from an international perspective and are concerned that there is no usable data. Without a system of data collection the size of the problem cannot be reported.

Clay Heaton Replied at 12:01 AM, 21 Apr 2010

...
> To expand Clay's system to one usable by everyone who responds to a disaster requires all the responders to have an iPhone (and presumably an App or two). This won't happen. I think it is reasonable to expect that virtually all responders will have a mobile phone of some sort but nothing pre-coordinated and no expensive purchases.
...

> That rules out iPhones!

Not to get into too much technical detail here, but jQTouch runs in
any browser that supports JavaScript. In Haiti, I implemented it to be
device agnostic - iPhones are not required to use the mobile site, nor
are any other apps. Any modern-ish phone with a browser would do, as
the JavaScript gracefully degrades. It would be very simple to add
additional device specific code to accommodate screens of different
resolution -- probably no more than 2-3 hours per specific device. The
Rails and jQTouch frameworks are accommodating enough that it took me
only 1/2 day to build the "iPhone" interface after the main browser-
based views were complete.

The jQTouch framework supports caching, too, which would allow people
to sync a phone and access the data away from the network, though I
did not implement that because Fond Parisien is lucky enough to have
wireless blanketing the hospital. They also were lucky enough to
receive a donation of iPhones in FP, which was the reason I mimicked
the interface in HTML.

You are correct that a more widely distributed system would require
both alternative input (eg. SMS) and syncing between nodes. I imagine
that either could be done in short order, especially if one of the
existing SMS bridges were employed.

I'm not trying to evangelize for the system that I built -- only
trying to point out that it is quick and relatively easy to do so.
When you can have a functioning prototype of your own design up and
running in just a few days, why not give it a shot? It can be a
strawman for iterative design if nothing else. A quick Google search
for Ruby on Rails or jQTouch will yield more technical details.

Cheers,
Clay

Chris Wilson Replied at 10:13 AM, 21 Apr 2010

Hi Myles and all,

On Tue, 20 Apr 2010, GHDonline (Myles Clough) wrote:

> Looking down the road on this we are hoping to develop a global trauma
> data collection system which is usable in all the LMICs. The disaster
> database is (in our view) an urgent special case; if the community can
> develop a system which is robust enough to work in that situation we
> think that an expanded version with an fuller dataset could work all
> over the world. So we, at any rate, would like to see a system that the
> average overworked emergency doctor or orthopaedic resident in
> sub-Saharan Africa could and would use. That rules out iPhones!

This is something that Aptivate might be interested in developing as an
open source application, e.g. for phones that support Java ME, which
should cover the widest range of handsets.

We'd want a group of real users willing to test it regularly and give us
feedback, e.g. every two weeks. Ideally we'd like to see it being used
live, with feedback based on real data and conditions.

Please could we have a "show of hands" of who would be interested in
taking part in such a group, and has sufficient time to (a) help write the
specifications and (b) test the software and report back every couple of
weeks?

This is only a "might happen" at this stage, not a definite commitment, as
we'd need to find resources to manage the project and write the software.

Cheers, Chris.
--
Aptivate | http://www.aptivate.org |
The Humanitarian Centre, Fenner's, Gresham Road, Cambridge CB1 2ES

Aptivate is a not-for-profit company registered in England and Wales
with company number 04980791.

Amy Madore Replied at 10:27 AM, 21 Apr 2010

Hi all,
Carlos Luis Sánchez Bocanegra's reply to this thread was posted as a separate discussion due to a minor technical glitch on our end. We're working to resolve this issue. In the meantime, please see Carlos' question below, and we look forward to your continued participation in this discussion! Cheers, Amy, on behalf of the GHDonline Team

Carlos: Really an interesting thread, would you please show references about how jqtouch toy has been used?

Clay Heaton Replied at 10:35 AM, 21 Apr 2010

Sure - I'll create an instance of the application with some test data and send the login info to the group in the next day or two.

Cheers,
Clay

On Apr 21, 2010, at 10:27 AM, GHDonline (Amy House) wrote:
>
> Carlos Luis Sánchez Bocanegra's reply ...
> Carlos: Really an interesting thread, would you please show references about how jqtouch toy has been used?"

Joaquin Blaya, PhD Moderator Replied at 4:08 PM, 23 Apr 2010

Ed,
In looking at the project description, it looks great in terms of showing the vision of the project and the extent of its objectives. The one thing I wanted to ask is whether you could specify which technologies were used, because that wasn't clear to me, and what you thought their immediate impact was.

Thanks,
Joaquin

Neal Lesh Replied at 5:21 PM, 28 Apr 2010

Hi everybody,

Thanks for this terrific discussion!

I'm writing to see if the panelists or others with experience from this disaster might write a bit more about what we should be doing to prepare for future ones. What were the biggest positive impacts of the systems deployed? Looking back, are there some services or applications that stand out as being especially useful in the weeks that followed the disaster? For example, was it in fact helpful to try to connect missing people via SMS-based systems? Was there much success at trying to track patient encounters over time, or were things too chaotic? Alternatively, maybe we should focus on simple EMRs for emergency surgical teams, or perhaps on EMR systems that are easier to restart after a disaster? I imagine the answer may be “all of the above,” but it would be great to hear some thoughts on what you think the priorities should be, or to hear about things that might seem useful in theory but really were much less so.

I’m curious also about how eHealth and mHealth technologies were perceived. Was there a strong interest in using computer or phone-based technologies among responders or the population, or perhaps a sense that there wasn’t time to experiment with new things?

Thanks very much,
neal

Om G Replied at 5:59 PM, 28 Apr 2010

One huge lesson learned was that SMS and tweets emerged as a means for tech-enabled citizens to participate in disaster response.

Crowdsourcing, both within Haiti and externally, produced remarkable results.

Data aggregation and representation is the next part of closing that loop.

I think we'll see synergistic effects from every combined source that is fed back to the wider community.

My interest in 'interoperability standards' is derived from this belief and consequent projects to facilitate community reporting.

So, as far as 'lessons learned', a *lot* is still taking shape.

There were calls a while back for a single comprehensive list of required features for EHR/EMR/clinical and practice management systems. I do agree that a single product shouldn't try to fit every need, but that list would certainly make it easier to create a modular system adaptable to many situations, or at least a framework that others can develop for.

Now it is easier than ever to implement... and getting easier.

Seems like there are bigger 'needs' out there than to fuss over minor differences.

Mikhail Elias Replied at 6:10 PM, 28 Apr 2010

In terms of next steps, I would suggest perhaps taking a more structured approach to cataloging the lessons learned and the unresolved problems that need attention going forward. Moving sustainably beyond the 'fuzzy front end' means bringing better knowledge management capabilities to bear.

In particular, if this exercise is meant to inform future planning efforts, it would be invaluable to document and analyze the problem areas in more detail in order to help drive future requirements analysis.

For example, people may have run into a host of team communication problems arising from various factors - environmental, organizational, technical, etc. Not all of these will necessarily have an IT solution, but documenting all of them helps identify constraints on future technical solutions.

Having the problem analysis organized into functional and non-functional (i.e. infrastructure) domains will also help with future planning -- as will standardizing these domains into a reusable taxonomy.

Having concise, well-defined problem statements that can be understood without a lot of additional context will help solution developers better target their efforts.

Since there may not be a formal business or systems analysis process envisioned, one approach might involve creating a wiki or similar knowledge-sharing site where stakeholders can contribute to this type of collaborative problem analysis.

In general, the tendency is to jump prematurely into solution-thinking, before completing a rigorous analysis of the in-scope problem domains. This may be exacerbated when operating in a reactive, fire-fighting mode.

The importance of solid problem analysis cannot be overemphasized. It promotes better verification and validation of future requirements, and ensures better prioritization, especially when extended groups of stakeholders are involved. Problem-focused requirements analysis is too often neglected in the SDLC, even though all the top reasons for project failure inevitably relate to poor requirements management.

This process also helps level-set in terms of establishing a common vocabulary and giving definitions to otherwise ambiguous concepts; this is especially useful when not all stakeholders are participating in the initial phase of work, and may need to be brought up to speed later.

John Brooks Replied at 10:57 AM, 29 Apr 2010

Hi all
It's been exciting to read what others have built to confront the medical
informatics hurdles in post-earthquake Haiti. As I wrote in my first
e-mail on this panel, MSF as a whole has not deployed any standard
informatics tools in the disaster response beyond technology which was
available 20+ years ago, namely Excel spreadsheets. That being
said, MSF has a proven track-record in its ability to respond to
catastrophes in a variety of settings, and now it's up to the info-tech
experts within the organization to convince the operational directors that
the use of handheld tools and wireless transmission of data have an added
value in emergency response. I think it brings up some interesting
questions:

- What tools can be used in the first two weeks of a disaster response
when there is no guaranteed cell-network service? What happens when there
is no electricity available or there are frequent, prolonged interruptions
to electrical supply?
- Who are the users of the new handheld tools? I have some sense of what
data should be collected in the first phase of a surgical response, but
I'm really curious about how it would work. Is it realistic to expect that
surgeons who are treating many 10's of patients a day for 20-hour days are
the ones who will be entering the data in a cell phone or Android? The
nurses in this situation don't have any free time either. Do I recommend
that a new position dedicated to data collection be added to the patient
care team? How have other actors confronted this issue?

It's very clear to me from my time in Haiti before and after the
earthquake that an emergency response has very distinct phases. The first
24-72 hours are the most chaotic but also the most important in terms of
reducing mortality; the tools deployed during this phase must be as
reliable as a pen and paper. The next week or so sees the influx of
medical teams who have been able to prepare, both psychologically and
logistically, for the disaster response and in the case of MSF always
include experienced emergency response personnel. After ten days, the
large imports of hospital supplies have likely entered the scene and at
this phase more robust and redundant wireless networks could be deployed
which support EMR and are as much a part of the logistical supply-chain as
tents, water purification systems and medicine.

I strongly agree that defining some standards for reporting would benefit
the patient population at large. Though I'm not medical, I think it would
be tough to argue that a standard and controlled system of transferring
medical history from provider to provider would not have a positive impact
on continuity of care, especially in a transient population who have
undergone major surgical interventions which require close follow-up. Even
so, patient data, even in aggregate form, can be used for
politically-motivated ends and in MSF's case could jeopardize the
all-important independence of its interventions if misused. Maybe it's
possible to start the list of standard variables which would be used in
the reporting scheme? I look to others for suggestions.

Thanks
John

John Brooks
Médecins Sans Frontières (MSF)/Doctors Without Borders

Hamish Fraser, MBChB, MRCP, MSc Moderator Emeritus Replied at 11:41 AM, 29 Apr 2010

Thanks John for this great post, I think these are really critical questions for the community. We are going to be discussing some of these issues at the Global PHAT meeting in Boston on Saturday.

When I was in Haiti last week I spent a lot of time talking to people about what they need and how we could better use information. One consistent message was that the surgeons needed to be able to document the diagnosis and procedure performed so that they, or more often someone else, could provide follow-up care. This also applies to physical therapists providing rehabilitation care, who need to know what happened and what was done. So I think that there should be someone who can coordinate and document care as part of the early teams. That would bring immediate benefits in terms of tracking supply requirements and patient load, and then longer-term benefits for individual patient care. Use of standard paper forms matched to an electronic system (with both web and cell phone form options) would allow redundancy. At first maybe you just have a ring binder with case forms, and then that starts to get entered into the EMR as staff and connectivity allow. These paper forms could be photographed or scanned for back-up copies and maybe uploaded or shipped out to a more stable location on CD for offsite data entry. I think we are looking for consistency of data and redundancy of technologies and data management strategies.

Currently pneumonia and tuberculosis are major concerns; tracking outbreaks and the spread of disease is clearly important for these and for water-borne illnesses. There has been some progress in disease surveillance tools, now on paper and cell phone, but I don't have solid data on how well they are working in Haiti yet.

John also raises important concerns for ownership and sharing of medical data. At present there are no effective laws or regulations for this in Haiti. Individual organizations have their own procedures and rules but this is definitely a gray area. Maybe there need to be default disaster response rules for data ownership which apply if effective national rules are absent or difficult to enforce. An important but contentious issue.

Aaron Beals Replied at 3:16 PM, 29 Apr 2010

Great discussion, everyone.

John mentioned that "MSF as a whole has not deployed any standard informatics tools in the disaster response beyond technology which was available in the 20+ years ago, namely Excel spreadsheets". There's related thread in the community, started just before this panel kicked off, in which Chris Lee describes a similar (Google Docs -- I'm assuming Spreadsheet + forms) solution they used for patient discharge management:

http://www.ghdonline.org/tech/discussion/patient-tracking-in-haiti/

Looking forward to a time (and I realize I'm taking a leap here, but I'm optimistic that what we're driving toward will come to fruition) when there are some standards for reporting -- do you think it will be OK to still use these simple tools in immediate post-disaster management, so long as they are distilled afterward into reports?

Hamish Fraser, MBChB, MRCP, MSc Moderator Emeritus Replied at 4:31 PM, 29 Apr 2010

Hi Aaron
I think it is important that people can rapidly put together new tools
for situations like this. Google Docs are obviously better than Excel
if you have good connectivity most of the time, with their ability to
share data and provide secure backup. Unfortunately, like most Internet-
based systems today, they are not designed for low bandwidth and
satellite latency. PIH/ZL had to abandon their use after the earthquake
due to these issues.

So going forward we need simple tools designed for this type of
environment. Interestingly, MS Remote Desktop is very good over slow
connections, including satellite and GPRS. It would be worth exploring
similar open source solutions also.

Hamish Fraser
Partners In Health &
Brigham and Women's Hospital

Brylie Oxley Replied at 4:34 PM, 29 Apr 2010

> John Brooks: I strongly agree that defining some standards for reporting would benefit the patient population at large... Maybe it's possible to start the list of standard variables which would be used in the reporting scheme? I look to others for suggestions.

The Canadian Physician Health Programs (CPHP) have a list of common indicators which serves as an example of a standard schema for patient data: http://bit.ly/caOJra

Sections I, V, and VI might prove useful in emergency situations, while collecting data from other sections (such as spiritual tradition) might be a higher (political) risk and/or lower priority.

Perhaps the CPHP indicators could be used in conjunction with other standardized resources to form a global or regional health RDF specification.
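For example, a minimal sketch using the rdflib library (the namespace and property names are invented for illustration, not the CPHP schema):

    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, XSD

    # Minimal sketch of expressing a few patient indicators as RDF triples.
    # The namespace and property names are invented, not the CPHP schema.
    EX = Namespace("http://example.org/health-indicators#")
    g = Graph()
    g.bind("ex", EX)

    patient = URIRef("http://example.org/patient/0001")
    g.add((patient, RDF.type, EX.Patient))
    g.add((patient, EX.ageYears, Literal(34, datatype=XSD.integer)))
    g.add((patient, EX.primaryDiagnosis, Literal("open tibial fracture")))
    g.add((patient, EX.procedurePerformed, Literal("external fixation")))

    print(g.serialize(format="turtle"))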
--
Brylie Oxley
GNU.media Intern
The Woolman Semester

Clay Heaton Replied at 4:53 PM, 29 Apr 2010

Depending on the number of people who need access to the files, Dropbox works very well when there is low bandwidth. Files automatically sync to/from the server when there is a connection and the web-site tracks document versions.

The hospital in Fond Parisien struggled with Google Docs after they were shared with more than a handful of people. We tried to use Google Forms via Google Spreadsheets to collect some volunteer data in an organized manner, but those with access to the backend sorted, copied/pasted, and moved columns to the point where the forms broke. It was a classic case of too many cooks in the kitchen, so to speak.

This raises the point that it is as important to assess who needs access to the documents as it is to find a good collaborative-editing solution. Managerial processes tend to degrade when staff rotate frequently. New team members frequently have new ideas about how to manage data and often singlehandedly enact those plans, to the detriment of data consolidation and normalization. This may affect volunteer organizations more than permanently-staffed organizations, of course.

Clay

Frances Ryan Replied at 12:19 PM, 30 Apr 2010

Hello all,
 
I just checked out Dropbox and it looks like a virus infected my contacts list. 
 
Fran Ryan

Hamish Fraser, MBChB, MRCP, MSc Moderator Emeritus Replied at 9:59 PM, 30 Apr 2010

Hi Clay, these are important insights. Spreadsheets are of course horrible tools for data collection and storage, especially with multiple users and many items. To use a simple example (as you know), even MS Access is a much better small system for data storage than Excel, because it has rules to ensure data integrity and data types. Too much flexibility is a disaster for data storage but is fine for analysis of a pre-existing data set. The biggest challenge in setting up the OpenMRS-TB system in Rwanda last year was cleaning the 250-column spreadsheet with 200+ patient records.

My intention is to build systems that do standard tasks well and are flexible, but with the data integrity checks and analysis tools built in. It is a balancing act getting it right!
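To illustrate the point, even a lightweight sqlite3 store can enforce the data types and integrity rules a flat spreadsheet cannot; the table and column names in this sketch are invented, not the OpenMRS or Access schema.

    import sqlite3

    # Sketch only: a lightweight store that enforces data types and integrity
    # rules a flat spreadsheet cannot. Table and column names are invented.
    conn = sqlite3.connect("field_register.db")
    conn.execute("PRAGMA foreign_keys = ON")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS patient (
        patient_id  TEXT PRIMARY KEY,
        sex         TEXT CHECK (sex IN ('M', 'F', 'U')),
        age_years   INTEGER CHECK (age_years BETWEEN 0 AND 130)
    );
    CREATE TABLE IF NOT EXISTS encounter (
        encounter_id  INTEGER PRIMARY KEY AUTOINCREMENT,
        patient_id    TEXT NOT NULL REFERENCES patient(patient_id),
        encounter_dt  TEXT NOT NULL,      -- ISO-8601 date/time
        diagnosis     TEXT NOT NULL,
        intervention  TEXT
    );
    """)
    conn.commit()
    conn.close()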

Om G Replied at 8:28 AM, 4 May 2010

Let's not forget new offline+online application tools like Air and
Silverlight.

Easy to use, adaptable, and well capable of handling database
interactions with the simplicity of web page development.

Thanks Hamish for pointing out that Google Sheets is not the best means
to capture data. Since it is 'free' and has become somewhat ubiquitous,
many people flock to it. There are some neat drop-ins for mapping, which
could be very useful IMO, but the possibility of developing an Open
framework that accomplishes the necessary tasks is too tantalizing to
let go.

As if we needed another disaster... Louisiana proves that the need will
not dissipate and it really is up to us to take the best of what's
available to produce tools *now*.

A 'Crisis Camp' is an excellent means to get the technical side of
things accomplished, all they require is a sufficiently detailed
description of the objective.

By handing off the project to the next set of coders, a Crisis Camp
project can literally travel around the globe following the waking hours
of coder teams, and thus progress is blazingly fast.

OpenMRS is an outstanding start. Is it enough? Can we begin the actual
chipping away at the marble of this 'sculpture' ?

Health is a place where other orgs cannot argue about priority and any
discussion about adopting compatible formats would be short if "Health
IT" provided an extensible framework.

Consensus might even be approachable on a core set of features from
which specialty modules can be added.

Are we ready to start?

Om

Om G Replied at 8:30 AM, 4 May 2010

Aaron,

Wouldn't you prefer an online+offline system that is designed to suit
the purpose? "Simple tools" can be created with relative ease and made
available if we but ask.

Om G Replied at 9:11 AM, 4 May 2010

John, MSF rocks!
>
> - What tools can be used in the first two weeks of a disaster response
> when there is no guaranteed cell-network service?

Solar-charged smartphones can capture data, upload to a laptop, or wait
for cell connectivity.
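A store-and-forward sketch of that idea (the outbox schema and the send() hook are assumptions for illustration): observations are written locally no matter what, and pushed whenever a connection appears.

    import json
    import sqlite3

    # Store-and-forward sketch: capture locally regardless of network state,
    # push when a connection appears. Outbox schema and send() are assumptions.
    DB = sqlite3.connect("outbox.db")
    DB.execute("CREATE TABLE IF NOT EXISTS outbox "
               "(id INTEGER PRIMARY KEY, payload TEXT, sent INTEGER DEFAULT 0)")

    def capture(record: dict) -> None:
        """Save an observation locally regardless of network state."""
        DB.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(record),))
        DB.commit()

    def flush(send) -> int:
        """Try to transmit queued records; leave them queued if sending fails."""
        rows = DB.execute("SELECT id, payload FROM outbox WHERE sent = 0").fetchall()
        count = 0
        for row_id, payload in rows:
            try:
                send(json.loads(payload))  # e.g. HTTP POST once the laptop or cell link is up
            except Exception:
                break                      # still offline; try again later
            DB.execute("UPDATE outbox SET sent = 1 WHERE id = ?", (row_id,))
            DB.commit()
            count += 1
        return count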

> What happens when there
> is no electricity available or there are frequent, prolonged interruptions
> to electrical supply?

Paper forms configured to mirror the phone-based forms can be available
and set up for OCR to capture and encode checkboxes, some text, simple
vital sign graphs, EKGs, and images like annotations on an anatomical figure.

> - Who are the users of the new handheld tools? I have some sense of what
> data should be collected in the first phase of a surgical response, but
> I'm really curious about how it would work. Is it realistic to expect that
> surgeons who are treating many 10's of patients a day for 20-hour days are
> the ones who will be entering the data in a cell phone or Android?

For extreme cases of emergent activity, the simplest, best solution
might be a voice recording device. I hesitate to suggest 'inventing'
technology to meet every need, but an USB Thumb Drive with unique rfid
identifier and single button 'record' function could be kept with the
patient and plugged in to download for transcription when time permits.

In your example, there is no time for detailed charting, however even an
Android device could be configured to take a photograph and record a
brief summary of the patient condition+treatment. We are entering the
realm of Terabyte hard drives that are weather and heat 'proof' making
some uses of technology potentially more durable than paper.

I'd sure like to see implementation of 'ad-hoc' networks basically
transferring data from phone to phone or phone to phone-connected
laptop. This has been researched and isn't a huge technical hurdle.

It would certainly improve response and scale nicely into the
progressive phases of a recovery effort.

Om

Anat Rosenthal, PhD Replied at 3:45 PM, 4 May 2010

Hi all,

Thank you for taking part in this fascinating panel.

Although the Expert Panel part of the discussion has officially ended, we invite you to continue the conversation.

We wish to thank our panelists: John Brooks, Josh Nesbit and Ed Jezierski for their contribution.

A blog post summarizing the discussion so far will soon be available at www.mobileactive.org.

Best,
Anat

Sophie Beauvais Replied at 12:26 PM, 16 Aug 2010

Dear All,

A peer-reviewed discussion brief with key references and recommendations for the panel: "Health IT for Disaster Relief and Rebuilding: Lessons from post-earthquake Haiti" is now available in the discussion and here: http://www.ghdonline.org/tech/discussion/ghdonline-health-it-expert-panel-apr...

Also, researchers at elrha (enhancing learning & research for humanitarian assistance) just published a study, "Professionalising the Humanitarian Sector" (full text: http://www.elrha.org/professionalisation), which notably calls for the creation of a professional association for humanitarian workers. Thoughts?

Thank you, Sophie

Mikhail Elias Replied at 2:54 PM, 18 Aug 2010

Regarding disaster relief, I would recommend focusing first on
standardization of best practices, then secondarily on cataloguing and
harmonizing data collection practices.

Standard protocols for disaster relief should ideally be based on
research-driven evidence, and should produce a set of stable baselined
requirements for downstream implementation of data collection
practices.

The disaster relief protocols and their associated data collection
practices should answer two key questions -

Why is each specific piece of data important?
What will each specific piece of data be used for?

It's better to take a demand-driven (requirements-based) versus a
supply-driven (data-based) approach. This has the benefit of not just
optimizing the collection of data in the face of scarce resources, but
of supporting better coordination of activities across stakeholders.

For this to be successful, I would also suggest greater investment in
training and formalization around requirements analysis as routine
project activities. Formalization should include establishing metrics
to demonstrate the value of requirements analysis to projects.

This may potentially also serve as one basis for professionalization
of the humanitarian services sector.