ZHW Annual Meeting on Risk and Security Management: The human factor
A human being is a risk factor: when accidents occur, the human factor is more often than not among the causes. How can this knowledge be put to use? That was the theme of a meeting at the Zurich University of Applied Sciences in Winterthur.

By Michael Breu

We all want to be safe – sine cura, a condition that leaves us free of anxiety. "But 100 per cent security does not exist – nor can it," says Heinrich Kuhn, Director of the Competence Centre for Security and Risk Prevention (1) at the Zurich University of Applied Sciences in Winterthur (ZHW). And Martin V. Künzli, a member of ZHW's Executive Board, sums the matter up with the title of Klaus Dörner's bestseller: "Irren ist menschlich" (to err is human).

In 11 of 22 cases, human responsibility

Being wrong can have fatal consequences. The accident at the Chernobyl nuclear power station in 1986 was essentially the result of human failure; likewise the loss of the space shuttle Columbia last year and the plane crash near Überlingen two years ago. "When the human factor enters the equation, potentially critical situations can develop," says Ernst G. Zirngast of SwissRe, who also spoke at the Winterthur meeting, "Human Factor. Challenges and Chances for Risk and Security Management". Zirngast has consolidated data for the petrochemical domain: "We examined 22 major incidents that occurred over a period of ten years, with collective damages to the oil-refining industry of 3.3 billion US dollars. In 11 of these incidents the human factor was decisive in the escalation of the incident into major damage." Protective shields built in to limit the extent of damage are not very effective: "In the 22 incidents we looked at, 79 protective shields were broken through, which pointed to latent errors. 20 of the incidents were a direct result of human failure."
Strictly follow the rules

This is why, in the 1990s, SwissRe, in collaboration with the Institute of Work Psychology at ETH Zurich, worked out a concept for investigating human influence that brings the security culture into focus. "A key problem in occupational systems is how insecurity is dealt with," explains Gudela Grote, ETH Professor for Work and Organisational Psychology (2). "As an open and complex system, one is faced with a multitude of internal and external disturbances that can prevent system goals from being met." Grote differentiates between two ways of dealing with insecurity: minimising it and mastering it. "Any attempt to minimise insecurity requires complex, central planning systems and a reduction of operative leeway through regulation and automation in order to 'plan away' insecurity on the one hand, and to establish close links between centralised planning and decentralised implementation on the other." With the strategy of mastering insecurity, we must bid farewell to the myth of totally foreseeable planning, "whereby, at the same time, it becomes possible to deal constructively with the limits of planning and to focus on fostering decentralised autonomy." To ground this in real-world data, Grote investigated teamwork in the cockpit of an aircraft and in a hospital operating theatre. Put succinctly: as far as team coordination is concerned, it makes sense to adhere strictly to the rules.

Security culture at Swiss

This is confirmed by Jürg V. Schmid, Vice-President of Security at Swiss and an experienced captain of many passenger planes. He points to the numerous checklists that pilots have to work through during flights. "In order to increase the security of civil aviation worldwide, the human factor must be accorded central importance.
A detailed study of one of Europe's biggest airlines shows that, apart from the individual, the socio-cultural aspect – the working relationship and atmosphere between crew members – also plays a considerable part in critical situations, either aggravating or alleviating them." For this reason, Swiss welcomes the introduction of routine reporting procedures for occurrences relevant to security – similar to the system in place at the Cantonal Hospital of Basle.
That procedure was introduced by Daniel Scheidegger, chief anaesthetist and Professor at the University of Basle. "In the past, not enough attention was probably paid to patients' safety as a basic qualitative characteristic of a treatment process," says Scheidegger. "This, despite the fact that up to ten per cent of patients are victims of an occurrence during their stay in hospital – and probably as out-patients, too." With the Critical Incident Reporting System, all occurrences are now documented and evaluated. "The implementation of such a procedure implicitly leads to a change in the error culture of the area where it is implemented. The Critical Incident Reporting System can lastingly promote learning from mistakes by identifying potentially critical weak points." Nevertheless, risk and security management remain primarily methods that embody a hierarchical structure, sums up Heinrich Kuhn of ZHW: "At the same time, a risk and security culture is the determining dynamic that diffuses risk and security management in a horizontal dimension."
Footnotes: