What computing and informatics tools will help Haiti?

A great deal has been written about how the combination of social media and cell phone technology has become a powerful fundraising mechanism in the wake of the disastrous earthquake in Haiti on Tuesday. But that one example got me thinking about other kinds of computational tools that could help provide an accurate picture of both the reality near the epicenter of the quake and the ripple effects throughout the region and across the globe.

I posted a question about this via Twitter with the hashtag #DistributedExpertise and got some interesting responses via Facebook and email that further fueled my own thinking. Some are examples of applications already providing vital information about the situation on the ground, while others could be created to provide useful ongoing coverage, especially as earthquake survivors emigrate to the US and elsewhere. I’m going to split the responses into a couple of posts. This one will focus on breaking news coverage, the next will look at lessons from Katrina and the 2004 tsunami, and the final post will focus on tools for local coverage.

Breaking news coverage

The responses included searchable databases of victims, such as this Haitian Earthquake Registry, which pulls information from a database maintained by the International Committee of the Red Cross, among other sources.
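To give a sense of what such a registry does under the hood, here is a minimal sketch of merging person records from several sources into one searchable list. The field names, sources, and sample records below are hypothetical; they are not the actual schema used by the Haitian Earthquake Registry or the Red Cross.

```python
# Hypothetical sketch: aggregate person records from multiple registries
# and make them searchable by name. Field names and data are invented.
from dataclasses import dataclass

@dataclass
class PersonRecord:
    full_name: str
    last_known_location: str
    status: str   # e.g. "missing", "found", "unknown"
    source: str   # which registry the record came from

def search(records, query):
    """Case-insensitive substring match on names."""
    q = query.lower()
    return [r for r in records if q in r.full_name.lower()]

# Records pulled from two (imaginary) source registries
records = [
    PersonRecord("Jean Baptiste", "Port-au-Prince", "missing", "registry_a"),
    PersonRecord("Marie Baptiste", "Léogâne", "found", "registry_b"),
]

for hit in search(records, "baptiste"):
    print(hit.full_name, hit.status, hit.source)
```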

The New York Times is being widely applauded for its interactive map of the region with embedded audio and video.

Those seeking to track the seismological data on the quake would do well to consult Wolfram Alpha, which mapped and charted the quake and the aftershocks.

The Wolfram Alpha chart neatly summarizes information that is provided in greater depth by the US Geological Survey. In addition to the seismological reports, there is a podcast in which one of the agency's experts answers questions.
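For anyone who wants to pull the raw numbers themselves, here is a rough sketch of querying the USGS earthquake catalog for the main shock and its aftershocks. It assumes the current USGS FDSN event web service, and the epicenter coordinates, search radius, and date window are approximations.

```python
# Rough sketch: query the USGS FDSN event service for quakes near the
# Haiti epicenter in the week after January 12, 2010. Coordinates,
# radius, and magnitude cutoff below are approximations.
import json
import urllib.parse
import urllib.request

params = {
    "format": "geojson",
    "starttime": "2010-01-12",
    "endtime": "2010-01-20",
    "latitude": 18.46,     # approximate epicenter latitude
    "longitude": -72.53,   # approximate epicenter longitude
    "maxradiuskm": 200,
    "minmagnitude": 4.0,
    "orderby": "time",
}
url = ("https://earthquake.usgs.gov/fdsnws/event/1/query?"
       + urllib.parse.urlencode(params))

with urllib.request.urlopen(url) as resp:
    feed = json.load(resp)

# Print a simple table: time (ms since epoch), magnitude, place
for event in feed["features"]:
    props = event["properties"]
    print(props["time"], props["mag"], props["place"])
```

A chart like the Wolfram Alpha summary could then be built from the magnitudes and timestamps in that feed.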

Cyberjournalist.net points to another great map by USA Today that includes radio buttons you can click to overlay different kinds of information.

Georgia Popplewell, the invaluable managing director of Global Voices Online, has a great Twitter list that aggregates posts from Haiti.

The unfortunately named site Chartporn has a running catalog of interesting and useful data visualizations related to the disaster.

Services for donors and volunteers

Al Tompkins at the Poynter Institute points to Charity Navigator as a way to find credible organizations to receive donations.

Have you seen any interesting applications that should be included in this list?

