Some excellent recommended reading, annotated by PHC QSV Pillar physicians.
Fischhoff B. Hindsight ≠ foresight: the effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance. 1975;1(3):288-299. The classic paper. Knowledge of an outcome distorts our understanding of the event and contributes to overconfidence in our ability to predict the outcomes of future events. This is the primary flaw in retrospective analyses of patient safety events. Read >
Paget MA. The unity of mistakes: a phenomenological interpretation of medical work. Temple University Press; 1988. An existential view of medical work in which things going wrong are an intrinsic part of clinical practice; probes the “complex sorrow” that can result. Read >
Weick KE. The collapse of sensemaking in organizations: the Mann Gulch disaster. Administrative Science Quarterly. 1993;38(4):628-652. doi:10.2307/2393339. Read >
Rochlin G. Safe operations as a social construct. Ergonomics. 1999;42(11):1549-1560. One of the very first papers I read on safety. Based on empirical work on organizations that manage complex, technically hazardous operations. Safety is understood as a dynamic, interactive and intersubjective action that is “not readily measured in terms of safety culture…and particularly vulnerable to disruption or distortion by well-meant but imperfectly informed interventions aimed at eliminating or reducing ‘human error’ that do not take into account the importance of the processes by which the construction of safe operation is created and maintained.” Read >
Dekker SWA. Why doctors are more dangerous than gun owners: a rejoinder to error counting. Human Factors. 2007;49(2):177-184. Read >
Woods DD, Dekker S, Cook R, Johannesen L, Sarter N. Behind human error, 2nd ed. 2010. London: CRC Press. Essential reading. Goes behind the ‘human error’ label: “Human error is cited over and over as a cause of incidents and accidents, and solutions are thought to lie in changing people or their role in the system. For example, we should reduce the human role with more automation, or regiment human behavior by stricter monitoring, rules or procedures. But in practice, things have proved not to be this simple. The label ‘human error’ is prejudicial and hides much more than it reveals about how a system functions or malfunctions.” Read >
Borys D, Else D, Leggett S. The fifth age of safety: the adaptive age. Journal of Health and Safety Research and Practice. 2009; 1(1): 19-27. Read >
Antonsen S. Safety culture assessment: a mission impossible? Journal of Contingencies and Crisis Management. 2009;17(4):242-254. Read >
Waring J. Constructing and re-constructing narratives of patient safety. Social Science and Medicine. 2009;69(12):1722-1731. Read >
Hollnagel E. Coping with complexity: past, present and future. Cognition, Technology & Work. 2012;14(3):199-205. Read >
Cook R, Rasmussen J. “Going solid”: a model of system dynamics and consequences for patient safety. Quality and Safety in Health Care. 2005;14:130-134. Read >
Mesman J. Resources of strength: an exnovation of hidden competences to preserve patient safety. In: A Socio-Cultural Perspective on Patient Safety. E. Rowley & J. Waring (eds.). 2012. Aldershot: Ashgate Publishing Ltd. Read >
Wears RL, Hunte GS. Seeing patient safety ‘Like a State’. Safety Science. 2014;67:50-57. Read >
Woods DD. Four concepts for resilience and the implications for the future of resilience engineering. Reliability Engineering and System Safety. 2015;141:5-9. Read >
Braithwaite J, Wears RL, Hollnagel E. Resilient health care: turning patient safety on its head. International Journal for Quality in Health Care. 2015;1-3. doi:10.1093/intqhc/mzv063. Read >