11th Scandinavian Conference on
SYSTEM & SOFTWARE SAFETY Stockholm, November 21-22, 2023

System and software safety in electronic systems is becoming increasingly important in many industries and in critical societal infrastructure. Systems are becoming ever more complex, connected and autonomous, and the amount of software continues to grow. This poses many challenges even for mature organizations, requiring approaches that go beyond established best practices. Many organizations face the same kinds of challenges, so sharing experiences becomes essential.

The Scandinavian conference on safety-critical systems and software has become a central meeting place for Scandinavian safety experts from industry, public organizations and academia. The conference is an opportunity to share experiences and make new contacts. There will be a plenary overview day, followed by a second day with parallel sessions. The parallel sessions will be in-depth workshops and tutorials on different challenges, techniques, standards and methods. As in previous years, we anticipate a healthy mix of participants, with presentations from different industries and academia.

This year’s keynote speakers are Nancy Leveson, MIT; Ibrahim Habli, University of York; and Lena Kecklund, MTO Säkerhet AB. Nancy Leveson will also give a workshop/tutorial on Monday 20 November.

We would like to invite proposals for presentations and workshops on all safety-related topics, e.g., safety in infrastructure, artificial intelligence, electrification of vehicles, agile development, design- and architectural decisions, autonomy, and quality assurance.

Do you want to share your experiences with, and connect with, the safety experts of Scandinavia?

You are invited to register. The early bird period ends October 6, and final registration closes November 15, 2023.

Please see past conferences for details of previous editions.

Keynotes

The Need for a New Paradigm in System Safety Engineering

by Nancy Leveson, MIT

Abstract

The traditional assumptions about the causes of accidents/losses have served us well for several hundred years. But the world has changed enough that these assumptions are no longer true. Previously unparalleled levels of complexity and new technology, particularly computers, have created new causes of accidents. Traditional models of causality, and the tools based on them, are not effective in understanding and preventing these new types of losses. In this talk, I will explain why something new is needed and suggest that systems theory can provide the basis for an expanded model of causality (called STAMP) that better fits today’s world.

Biography

Nancy Leveson is Jerome C. Hunsaker Professor of Aeronautics and Astronautics at MIT. Her Ph.D. was in Computer Science, but she has also studied math, cognitive psychology, and management. Dr. Leveson conducts research on all aspects of system safety including modeling and analysis, design, operations, management, and human factors and the general area of system engineering. Her research is used in a wide variety of safety-critical industries including aerospace, transportation, chemical plants, nuclear power, healthcare, and many others. She has been involved in many accident investigations in a variety of industries. One particular common element throughout all her work is an emphasis on applying systems theory and systems thinking to complex systems.

Dr. Leveson has received many honors, most recently the 2020 IEEE Medal for Environmental and Safety Technologies. She was elected to the National Academy of Engineering in 2000. She is the author of three books: Safeware: System Safety and Computers (1995), Engineering a Safer World (2012), and a new general textbook titled Introduction to System Safety Engineering (2023).


Assuring the ethics of AI and autonomous systems

by Prof. Ibrahim Habli, University of York

Abstract

An assurance case is a structured argument, typically produced by safety engineers, to communicate confidence that a critical or complex system, such as an aircraft, will be acceptably safe within its intended context. Assurance cases often inform third party approval of a system. One emerging proposition within the trustworthy AI and autonomous systems (AI/AS) community is to use assurance cases to instil justified confidence that specific AI/AS will be ethically acceptable when operational in well-defined contexts. This talk brings together the assurance case methodology with a set of ethical principles to structure a principles-based ethics assurance argument pattern. The principles are justice, beneficence, non-maleficence, and respect for human autonomy, with the principle of transparency playing a supporting role. The argument pattern—shortened to the acronym PRAISE—is described. The objective of the proposed PRAISE argument pattern is to provide a reusable template for individual ethics assurance cases, by which engineers, developers, operators, or regulators could justify, communicate, or challenge a claim about the overall ethical acceptability of the use of a specific AI/AS in a given socio-technical context.

Biography

Ibrahim Habli is Professor of Safety-Critical Systems and Deputy Head of the Department of Computer Science at the University of York with responsibility for research. He currently leads the research activities of the £12M Assuring Autonomy International Programme (AAIP), funded by Lloyd’s Register Foundation, which is developing and applying methods for assuring AI-based autonomous systems in multiple industries. He is the PI on the UKRI-funded project AR-TAS (Assuring Responsibility for Trustworthy Autonomous Systems) and a Co-I on the UKRI TAS Node in Resilience (REASON). In 2015, he was awarded a Royal Academy of Engineering Industrial Fellowship and collaborated with the National Health Service in England on evidence-based methods for assuring complex digital interventions. His research on safe and ethical assurance of AI systems has involved multidisciplinary collaborations including with law, economics, and philosophy.


The Future of System Safety – How to Apply HTO (Human-Technology-Organization)

by Lena Kecklund, MTO

Abstract

The increasing use of digital technology is changing the circumstances and conditions for human work, in the workplace as well as in society. The systems that humans must master become more and more complex, and increased automation leads to greater dependency on machines or robots to carry out parts of, or all of, the work.

Systems that fail to interact well with humans lead to poor safety and inadequate work environments, which in turn impact negatively on business development. The inclusion of the human factor is paramount and should be seen as a strength rather than a weakness. This talk addresses how to develop successful, effective, safe, and sustainable digital systems, with a focus on principles, methods, and approaches to improve working life.

System safety must therefore be based on addressing the interaction between humans, technology and organizations. You will learn how to apply the HTO framework in different domains.

Biography

Lena Kecklund, Ph.D., is a senior consultant and CEO at MTO Säkerhet. Lena's areas of expertise include safety culture, the interaction between humans, technology and organizations (HTO), and safety management. She has extensive experience from a number of safety-critical operations such as rail, road, aviation, shipping, patient safety and industry. Lena works, among other things, with investigation, training and advice to managers in Sweden and internationally. She was responsible for MTO Säkerhet's participation in the revision of the European railway safety legislation carried out on behalf of the European Union Agency for Railways (ERA). Lena has participated in more than fifteen accident investigations for the Swedish Accident Investigation Authority. She has written the book "Den (o)mänskliga faktorn" with Professor Emeritus Bengt Sandblad.

Program Committee

Even-André Karlsson, SW Management Consultant, Addalot Consulting AB (Conference Chair)
Gunnar Andersson, Board Member, SER
Fredrik Asplund, Doctor, Mechatronics, KTH
Olof Bridal, Senior Specialist, Volvo Group Trucks Technology
Barbara Gallina, Associate Professor, Mälardalen University
Hans-Martin Heyn, Senior Lecturer, Chalmers
Malin Levin, Deputy Director, SAFER Vehicle and Traffic Safety Centre
Per Johannessen, Pilot Safety Manager, Volvo Autonomous Solutions
Jesper Lindell, Technology Manager – Critical Systems, Combitech
Peter Sandberg, Chief Architect, Alstom
Miroslaw Staron, Associate Professor, University of Gothenburg
Staffan Skogby, Senior System Architect, Pictor Consulting AB
Martin Törngren, Professor, Mechatronics, KTH/ICES
Fredrik Warg, Senior Researcher, RISE