1. Introduction
Throughout their industrial experience, the authors have seen several examples of systems whose security suffered from a lack of proper consideration of the complex interplay between technical systems and their social environment. Further, even within the confines of a large technical system, security depends not only on purely technical measures or mastery of technical designs, but also on the ability and willingness of design and support personnel to engage at a deeper level. The authors' current work concentrates on improving industrial practices, using elements of the proposition presented here.
In the continuous drive to improve offerings and decrease costs, companies are increasingly dependent on complex information and communication technology (ICT) systems. While the design and deployment of such systems represent a challenge in themselves, up to 50% of those systems do not live up to their original expectations (Lippert & Davis, 2006), adding frustration and reputational damage to lost investment and missed revenues.
An analysis of the reasons for such a lack of success shows that failures can often be attributed to a lack of social adoption of the new system. This lack of adoption often originates in inappropriately designed and applied security measures (Cranor & Garfinkel, 2005) that are either too lax (so that they expose vulnerabilities), too stringent (so that they inspire creative rejection), or appropriate in strength but ignorant of established practices. Note that such security measures are quite often designed in full accordance with requirements or specifications, yet they miss the importance of the social context of practical application (Lippert & Davis, 2006).
A system that fails to achieve adoption represents a business loss, but a system that is not fully or willingly adopted represents a significant security vulnerability, particularly if its users set out to circumvent security controls through creative social practices. For example, even the most sophisticated access control provides no security if users employ their access cards according to their perception of social relationships (and value systems) rather than according to security policies (Collins, 2007), or if the PIN code for a credit card is shared (Lacohée, Cofta, Phippen, & Furnell, 2008).
This phenomenon of 'unintended consequences' is best described in terms of affordance, a concept coined by Gibson (1986) and popularised in the field of HCI and design by Norman (1988), who applied it to everyday artefacts. Norman defined affordance as “the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used.” Affordance therefore determines what an ICT system can be used for, following the intentions of its users, while specification and design concentrate on how the system is intended to be used by its original designers. The disparity between these two sets of intentions creates tension that eventually undermines system adoption.
While the 'answer' to such 'user challenges' of 'unintended consequences' may lie partly in better education, improved usability or more stringent supervision, the underlying truth is that the deployment of an ICT system is both a cause and an enabler of planned change (Lippert & Davis, 2006) and should therefore be designed with its immediate social environment in mind. Successful technologies owe a large part of their success to fulfilling or enhancing an existing human need, or fitting well into an already well-established social context. As with other types of change, any unsubstantiated demand for a radical change of social practices will be met with rejection, creative re-use or even abuse. Therefore, a successful socio-technical approach to design should take into account the social relationships and practices that surround a given system, leading to improved acceptance rates.