A recent issue of Science magazine features a news article about seven scientists in Italy who are facing manslaughter charges for failing to predict the danger of an earthquake that killed 308 people. The scientists were part of a risk committee of earth scientists who testified in 2009 that incipient tremors were not evidence of an oncoming earthquake. According to Science, “They agreed that no one can currently predict precisely when, where, and with what strength an earthquake will strike.” These are all accurate statements from a scientific point of view. The problem lies in translating them for decision-makers and stakeholders, including the people of the town of L’Aquila, Italy.
The lead scientist “maintained that he and his scientific colleagues had a responsibility to provide the ‘best scientific findings’ and that it is ‘up to politicians’ to translate the scientific findings into decisions” (Science). This is the linear model of science policy at its worst, literally costing lives because of the mismatch between scientific and political paradigms of risk management, or, as Cash et al. describe it, the “loading dock” model of simply delivering scientific results and hoping that the public sphere will pick them up and use them. To the scientists, risk and uncertainty are quantifiable metrics that are difficult to translate into social action. To decision-makers and the public, risk is a socially mediated, multidimensional value that depends on more than just probabilities. Uncertainty has long been a sticking point at the intersection of earth science and policy, on topics such as climate change. However, Cash et al. demonstrate how bringing scientists and decision-makers together from the beginning helped improve the utility of climate models for end-users. They write, “Scientists began to understand that managers were comfortable making decisions under uncertainty, and managers began to understand the concerns scientists had about making scientific claims in the face of uncertainty.” This was clearly not the case with the Italian scientists and decision-makers.
At first glance, this case provokes outcry from scientists afraid of losing the public’s trust and being put on trial, literally. While it may be presumptuous to actually put scientists on trial for a failure to communicate with decision-makers, the case calls into question the implicit “social contract of science” that has justified basic scientific research since the end of WWII. Sheila Jasanoff told a group of ASU graduate students last spring that, “Scientists have become arrogant, and have not explained to the people why they deserve support... The Enlightenment was not a historical event. It is a process, a mission, a continuous duty to explain yourself” (personal communication, 11 February 2011; not an exact quote, but very close). Jasanoff lays out an alternative to the linear model of science policy that she calls “technologies of humility”. In contrast to calls for “more science” to reduce uncertainty, Jasanoff writes that, “what is lacking is not just knowledge to fill the gaps, but also processes and methods to elicit what the public wants, and to use what is already known”. The abstract of her paper states, “governments should reconsider existing relations among decision-makers, experts, and citizens in the management of technology. Policy-makers need a set of ‘technologies of humility’ for systematically assessing the unknown and the uncertain”. Jasanoff and other Science and Society scholars have been writing about the failures of the linear science policy model in predicting risk since the 1980s, when the risk-management paradigm began to crumble in the wake of seemingly “unpredictable” human-technology disasters like Chernobyl. Today we face critical policy issues, from climate change to toxic chemicals, that fundamentally depend upon an understanding of environmental science, but understanding the science is not enough. We need a new model of science policy that incorporates the needs of decision-makers and stakeholders from the start, not after it’s too late.
About the author: Marci Baranski is a doctoral student in ASU’s Biology and Society program.


If a crime has been committed, intent is usually considered quite relevant. Intent may explain why the charges here are manslaughter rather than some form of murder.
My concern in this specific case, given the evidence available to me (limited by my lack of Italian fluency), is that one group involved in the decision-making appears to have foisted responsibility onto another group. If the scientists were making the decision to evacuate or not (as opposed to offering a recommendation on whether to evacuate, or an estimate of possible tremors), then their responsibility seems clearer.
Regardless of legal liability (which seems to be assigned after the fact in this case), I don't expect this matter to do anything to clarify responsibilities, obligations, and liabilities in the solicitation and use of science advice.