Investigative Data Harvest – The Rise of the New Woodwards & Bernsteins


(c) Wikimedia Commons


These days sees the worldwide roll-out of the movie that shows how Hollywood views WikiLeaks and the figure behind it, Julian Assange. In Germany, the title “The Fifth Estate” was translated as “Fifth Power”, a reminder that traditionally power is divided and balanced between the executive, the legislature and the judiciary, with a fourth, controlling level on top: journalism and the media.

While the internet has opened new avenues of communication, participation and transparency, it has severely devalued journalism. On blogs and social platforms every user is a communicator and often also a journalist. Established media are struggling with this new competition, with impacts on economic performance and an overall loss of quality.

However, there is a newly emerging group of internet-savvy journalists who say: “The internet is our exit strategy out of times of crisis.” It offers a unique chance to conduct investigative research, gather information on complex and highly sensitive issues, prepare suitable graphics and publish them. Nothing about this is illegal or breaks any law; it is just a matter of passion and education. Thus, the media could reconquer their waning position as the fourth power in society and start to thrive again.

The eighth World Conference of Science Journalists, WCSJ 2013 in Helsinki, addressed this newly unfolding technique. Dino Trescher, member of the German Science Writers TELI, helped to organize the session “Data Explored – The Code That Underpins the Future of Science Journalism” (http://wcsj2013.org/data-explored-code-underpins-future-journalism/) and reports here on the outcome and findings, in a short overview and a more in-depth view with a toolbox and cases such as grand-style tobacco smuggling, the OffshoreLeaks tax fraud, and an environmental investigation. The steps he and the co-chairmen of this session recommend are:

Mining, Filtering, Visualisation, Story.
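To make the four steps concrete, here is a minimal sketch of how they might look in code. The data set, column names and filtering threshold are invented for illustration; they are not from the session or the cases it discussed.

```python
# Mining -> Filtering -> Visualisation -> Story, on a tiny invented
# data set of public contracts (an assumption, not real data).
import csv
import io
from collections import Counter

# 1. Mining: read raw records (an in-memory CSV stands in for a
#    downloaded open-data file).
raw = io.StringIO(
    "recipient,amount\n"
    "Acme Corp,120000\n"
    "Acme Corp,95000\n"
    "Beta Ltd,40000\n"
    "Acme Corp,300000\n"
    "Gamma GmbH,15000\n"
)
records = list(csv.DictReader(raw))

# 2. Filtering: keep only contracts above a chosen threshold.
big = [r for r in records if int(r["amount"]) >= 50000]

# 3. Visualisation: aggregate by recipient and render a crude
#    text bar chart (one '#' per 100,000).
totals = Counter()
for r in big:
    totals[r["recipient"]] += int(r["amount"])
for name, total in totals.most_common():
    print(f"{name:12s} {'#' * (total // 100000)} {total}")

# 4. Story: the aggregate points to the lead -- after filtering,
#    a single recipient dominates the contract volume.
```

In practice the mining step would pull from an open-data portal or a leaked document set, and the visualisation would use a charting library, but the shape of the pipeline stays the same.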

Trescher and colleagues claim that “unique stories are often hidden in mountains of data, documents and statistics and are distributed all over the internet”. This makes it possible to discover unknown facts that interrelate and reveal new contexts. Journalists who are capable of penetrating the net and interpreting the facts come up with new layers of truth. They thus render a valuable service to democracy, one that can make up for the lost capacity as gatekeeper and information broker. An example presented by the authors:

Keith Ng, a freelance journalist from Wellington, New Zealand, extracted data from a self-service counter in a social security office and thereby accessed an enormous amount of government data. The files from New Zealand’s Ministry of Social Development included names and addresses of children living in protected care, investigators and clients in fraud investigations, invoices, and medical prescriptions: the beginning of what became the government’s largest ever security breach. But Ng did not sell the story to the media; he blogged about it. Instead of being paid for a few lines, he put a link at the bottom of the story leading to a donation page on GiveALittle.co.nz, a decision that earned him a lot of money.

“Success in this new field depends on data-driven narration”, the authors point out. This requires the adequate use of new media, visualisation through more or less interactive charts, innovative narratives and interactive websites.

One of the worldwide leaders in data journalism is the British Guardian. “Comment is free, but facts are sacred”, Simon Rogers is quoted as saying. Some years ago people just wanted to say what they thought. “Now I think it’s, increasingly, people want to find out what the facts are.”

From stock prices and soccer results to traffic and weather reports, statistics have always been key in the media. Nowadays, however, more and more statistics are freely available through governments, large organisations like the World Bank, or the investigative portal WikiLeaks. Computers and smart software make it easier to rummage through mountains of data and display the results graphically.

The aggregation of data and networks between major publications are the future, the authors suggest: cooperation between newspapers across borders, multinational search pools, or working with foundations and universities. The data reveal that since 9/11 governments have put more money than ever before into the military-industrial complex. Another pillar of society that needs to be probed is the technological-scientific-industrial complex.

The harvest of data is becoming a joint effort. The Guardian regularly publishes the data sources behind its stories and asks readers to join in.

Adrian Holovaty, founder of EveryBlock, offers another example of how data search changes and improves journalism. “Say a newspaper has written a story about a local fire. Being able to read that story on a cell phone is fine and dandy”, he says. But what he really wants is to be able to explore the raw facts of that story, one by one: “With layers of attribution, and an infrastructure for comparing the details of the fire—date, time, place, victims, fire station number, distance from fire department, names and years experience of firemen on the scene, time it took for firemen to arrive—with the details of previous fires. And subsequent fires, whenever they happen.”
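Holovaty’s point is that a story’s facts become far more useful once they are stored as structured records rather than prose, because records can be queried and compared across stories. A small sketch of that idea, with invented field names and sample incidents:

```python
# Structured-data sketch of Holovaty's fire example. The fields and
# the three incidents are illustrative assumptions, not real data.
from dataclasses import dataclass
from datetime import date

@dataclass
class FireIncident:
    when: date
    place: str
    station: int            # fire station number
    response_minutes: int   # time it took for firemen to arrive
    victims: int

incidents = [
    FireIncident(date(2013, 3, 2), "Elm St", 12, 6, 0),
    FireIncident(date(2013, 7, 19), "Oak Ave", 12, 9, 1),
    FireIncident(date(2013, 9, 5), "Elm St", 7, 4, 0),
]

# A query that a prose archive cannot answer directly:
# average response time of station 12 across all recorded fires.
station12 = [i for i in incidents if i.station == 12]
avg = sum(i.response_minutes for i in station12) / len(station12)
print(f"Station 12 average response: {avg:.1f} min")
```

Each new story about a fire would simply append another record, and comparisons with previous and subsequent fires fall out of the same query.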

That British and US media seem to be leading the way through the data jungle is no coincidence: in the USA, data is readily available through freedom-of-information legislation. An additional thrust is provided by the so-called open-data policies in the US and the UK, consistently pushed for by clearly audible online and civil-rights activists.

Thanks to this, the LA Times was able to chart the performance of almost the entire teaching staff of Los Angeles: it published the official ratings of 6,000 primary school teachers at 470 public elementary schools in the city. The basis of assessment is the performance of students in standardised final exams. Readers of the LA Times can now see any teacher’s record, with information on career and “effectiveness”.

A convincing technique with convincing examples. However, one question remains: what can data journalism do for science journalists? This paper is open for discussion!

Sources

Data Journalism Handbook:
http://datajournalismhandbook.org/1.0/en/
free download, also available in Spanish

Free Online Data Journalism Course
http://datadrivenjournalism.net/

For Training & Promotion: European Journalism Centre Netherlands
http://ejc.net/

Datenjournalist.de (in German):
http://datenjournalist.de/datenjournalismus-im-oktober-september-2013/

DATA JOURNALISM READER, SHORT VERSION
WCSJ_reader_dataexplored_shortversion_trescher (1)

LONG VERSION
WCSJ_reader_data explored_longversion_tobacco case, offshoreleaks tax fraud_trescher (1)

About Wolfgang C. Goede

Wolfgang C. Goede is a science journalist based in Munich, Germany. He is a board member of the German Association of Science Writers TELI.