Design of Formal Languages and Interfaces: “Formal” Does Not Mean “Unreadable”

Maria Spichkova
DOI: 10.4018/978-1-4666-4623-0.ch015

Abstract

This chapter provides an introduction to work that aims to apply the achievements of engineering psychology to the area of formal methods, focusing on the specification phase of a system development process. Formal methods are often designed to satisfy only two criteria: the method must be sound, and it must yield a representation that is concise and elegant from a mathematical point of view, with no attention paid to readability, usability, or tool support. As a consequence, most engineers treat formal methods as something that is theoretically important but practically too hard to understand and to use, even though small changes to a formal method can make it considerably more understandable and usable for an average engineer.
Chapter Preview

Introduction

There are many definitions of human factors, but most of them focus solely on human-machine operation in terms of system and program usability, i.e. on the parts of a system that are seen by the (end-)user, not by requirements, specification, and verification engineers. Nevertheless, many problems that arise during the engineering phase are exactly the same as those that arise when using the final version of a system, for a simple reason that is often forgotten: engineers, even those working on verification or formal specification, are humans too and have the same abilities and weaknesses as people working in any other field, from the arts to construction. Moreover, developing a safety-critical system with formal methods imposes much harder constraints and more stress than using a finished software application (e.g., playing an entertainment application or typing a personal e-mail on a smartphone), because of the consequences of a mistake: a typo in an e-mail can lead to a misunderstanding that is easy to clear up, whereas a specification or verification error in the development of a safety-critical system, such as a fly-by-wire system for aircraft or a pre-crash safety function for vehicles, can cost many human lives.

Nowadays, research on human factors and the Human-Computer Interface (HCI) concentrates mostly on the development of entertainment or everyday applications, but it was initiated and elaborated precisely because of mistakes in the use and development of safety-critical systems. For example, among the most widely cited HCI-related accidents in safety-critical systems are the massive radiation overdoses delivered by the Therac-25 (a radiation therapy machine used to treat cancer), which led to the deaths and serious injuries of patients who received many times the normal dose of radiation (Miller, 1987; Leveson & Turner, 1993). The causes of these accidents were software failures as well as problems with the system interface.

The Therac-25 was an extension of two previous models, the Therac-6 and the Therac-20, but the upgrade was unsafe: the software was not correctly updated and adapted to the extensions in the system architecture. In this model, in contrast to the previous ones, the manufacturer combined two operating modes, a low-energy mode and a high-energy mode, in a single machine. In the high-energy mode a filter plate must be placed between the patient and the beam source so that the radiation beam is applied correctly. Because of software failures, the high-energy mode was sometimes activated in the Therac-25 without the filter plate in place. Similar failures had also occurred in the older models, but there they did not lead to overdoses because hardware interlocks blocked the beam. In the Therac-25 the manufacturer replaced these hardware interlocks with software checks, and this resulted in lethal overdoses.
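
To illustrate the kind of weakness described above, the following minimal sketch (written in Python, with purely hypothetical names and a deliberately simplified mode logic that do not come from the actual Therac-25 software) shows how a software-only interlock reduces the safety check to an internal flag, which can silently get out of step with the physical state of the machine, whereas the hardware interlocks of the earlier models blocked the beam regardless of what the software believed.

    # Hypothetical sketch only: the names and the simplified mode logic are
    # invented for illustration and do not reproduce the actual Therac-25 code.
    class BeamController:
        def __init__(self):
            self.mode = "low"             # "low" or "high" energy mode
            self.filter_in_place = False  # the software's belief about the hardware

        def select_high_energy(self, filter_in_place):
            # If the operator changes the settings faster than this flag is
            # refreshed, the software's belief and the real plate position diverge.
            self.mode = "high"
            self.filter_in_place = filter_in_place

        def fire_beam(self):
            # A software check in place of a hardware interlock: it consults only
            # the internal flag, not the physical position of the filter plate.
            if self.mode == "high" and not self.filter_in_place:
                raise RuntimeError("interlock: high-energy mode without filter plate")
            return "beam delivered"

The point of the sketch is not the code itself but the design decision it embodies: a check that inspects only the software's own state offers no protection once that state is wrong, and the over-reliance on software checks in place of hardware interlocks was identified in later analyses as a key contributing factor to these accidents (Leveson & Turner, 1993).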

The HCI-related problem with this machine was that the Therac-25 in some cases displayed system states incorrectly and showed only cryptic error codes instead of full warning or error messages; moreover, these codes were not even well documented. As a result, the operators of the Therac-25 were unable to recognise a dangerous error situation and continued the treatment even after the machine showed messages that did not look like a warning or a signal to stop the treatment. Combined with very little training, this left the operators unaware of the importance of following the safety guidelines, and consequently they violated many of them. In some cases, the operators continued the treatment even when the video and audio monitoring, which were the only means of observing the patient in the separate treatment room, were not working. These accidents have shown that studying human errors and their causes should be a significant part of software and system engineering, at least in the case of safety-critical systems.

An appropriate system interface that allows correct human-computer interaction is just as important as the correct, error-free behaviour of the developed system: even if the system we develop behaves perfectly, this does not help much if the system interface is unclear to the user or too complicated to be used properly. According to statistics presented in (Dhillon, 2004), humans are responsible for 30% to 60% of the total errors that directly or indirectly lead to accidents, and in the case of aviation and traffic accidents, 80% to 90% of the errors were due to humans. Thus, it is necessary to take human factors into account when developing safety-critical systems.
