A Framework to Build Process Theories of Anticipatory Information and Communication Technology (ICT) Standardizing

Kalle Lyytinen, Thomas Keil, Vladislav Fomin
Copyright: © 2010 | Pages: 40
DOI: 10.4018/978-1-60566-946-5.ch008


Standards have become critical to information and communication technologies (ICTs) as these technologies have become complex and pervasive. We propose a process theory framework to explain anticipatory standardizing outcomes post hoc, viewing the standardizing process as networks of events. Anticipatory standards define future capabilities for ICT ex ante, in contrast to the ex post standardizing of existing practices or capabilities through de facto standardization in the market. The theoretical framework offers: a) a lexicon in the form of an ontology and typology of standardizing events; b) a grammar, or a set of combination rules, for standardizing events to build process representations; c) an analysis and appreciation of the contexts in which standardizing unfolds; and d) a logic that yields theoretical explanations of standardizing outcomes based on the analysis of process representations. We show how the framework can help analyze standardization data as networks of events and explain standardizing outcomes, and we illustrate the plausibility of the approach by applying it to wireless standardization.
Chapter Preview


Over the past decade, successful standard-setting has become critical for innovation, while Information and Communication Technologies (ICTs) have become networked, ubiquitous, and complex (David, 1995; Mansell & Silverstone, 1996). ICTs are technologies dedicated to information processing; in particular, they involve the use of computers and software to convert, store, protect, process, transmit, and retrieve information (Wikipedia, 2005). Recently, traditional standard-setting mechanisms have become rife with problems: they do not respond well to the increased scope, pace, and complexity of technological and market change associated with ICTs (Garud, et al., 2002; Schmidt & Werle, 1998; Werle, 2000). This is the case, in particular, with the exponential growth of anticipatory ICT standards—standards that embed significant technological or process innovations into the technical specification—and which are “intended to guide the emergence of new technologies and consequently indicate far ahead in advance of the market’s ability to signal the features of products that users will demand” (David, 1995, p. 29). Anticipatory standards define future capabilities for ICTs, in contrast to recording and stabilizing existing practices or capabilities de facto. Failures with anticipatory ICT standardizing are common (Steinmueller, 2005; Markus, et al., 2006), and our ability to explain these failures with the existing body of knowledge is poor.

In this article we advance process theorizing of ICT standardizing—the mission of describing, revealing, understanding, and explaining the processes, features, and outcomes of ICT standardizing (Weick, 1995). To this end we formulate a theoretical framework that helps construct plausible, generalizable, and valid explanations of why and how certain ICT standardizing outcomes emerged (Weick, 1989). The framework posits that ICT standardizing can be seen as a network of events that create and coordinate the adoption of institutionally bound and contextualized technological repertoires (capabilities) among a set of heterogeneous actors. We draw upon Actor Network Theory (ANT) and Social Construction Of Technology (SCOT) studies (Howcroft, et al., 2004) to explicate these necessary theoretical constructs.

ICT standardizing is viewed in this study as the collective engineering of technical specifications (David, 1995; Steinmueller, 2005; Baldwin & Clark, 2005). By drawing upon SCOT (Bijker, 1987), we view anticipatory standardizing as technology framing—sense-making—which at the same time builds durable socio-technical networks (Callon & Law, 1989; Latour, 1995). The framework's analysis of the event networks of such engineering, sense-making, and negotiation activities offers: a) a lexicon in the form of an ontology and typology of standardizing events; b) a grammar, or set of combination rules, for events to build process representations; c) an analysis of the contexts in which events unfold; and d) a set of logical rules that yield explanations of standardizing outcomes.

The proposed framework is not a predictive process theory of ICT standardizing outcomes. First, it is not a theory of anticipatory standards as ready-to-adopt, fixed artifacts; instead, it moves toward theorizing standardizing as a stream of social, political, and design events that connect ideas, artifacts, people, and institutions to yield a specific technical specification. Like all process theories, it cannot be used to accurately predict standardizing outcomes; rather, it is used to analyze why specific processes took place in the way they did and why certain outcomes emerged (Mohr, 1982; Markus & Robey, 1988; Langley, 1999), so as to anticipate outcomes of future standardization situations. Second, it is not a complete process theory: in its current state, it offers theoretical constructs for composing statements that help understand and explain concrete standardizing outcomes.
