Science-Base Research for Advanced Interoperability

H. T. Goranson, Beth Cardier
DOI: 10.4018/978-1-4666-5142-5.ch013

Abstract

Studies show that enterprises are severely constrained by their management structures, and that those constraints become more vexing as information technologies are adopted. This is increasingly true as “interoperability engineering” advances: the enterprise becomes capable of doing simple, ordinary things better, but its form becomes less adaptive and less agile as external firms are integrated using lowest-common-denominator standards. The net result is that enterprises are now worse off because of the constraints of past integration decisions. A radical advance is required, one based on breakthroughs in the underlying science used by enterprise engineers. This chapter describes one advanced form of enterprise that current research could make possible and uses it to illustrate the enterprise engineering tools that would be needed. It then suggests an agenda for fundamental research to support those goals.

Introduction

Enterprises are the means of large-scale collaboration, and so have been with us as long as civilization itself. What enterprises can accomplish depends on the technology used to engineer and manage them. Thus the history of such technology is punctuated by events such as Italy’s introduction of arithmetic accounting (as double-entry bookkeeping) in the 13th century (Devlin, 2000); before that, the ‘technology’ of representing value in the abstraction of bankable currency changed the nature of cooperative business. Just as language is a shaper of thought, certain modeling technologies can be seen as drivers of collaborative structure.

In the modern era, US military research introduced system-wide process metrics during World War II to integrate operations in large manufacturing enterprises. This shift depended on work into the science of coupled process modeling sponsored through (what is now) Wright-Patterson Air Force Base. Defense sponsorship of process modeling in the three decades ending in the 1980s produced other scientific foundations that can be traced directly to the general increase in worldwide manufacturing productivity in those years. A vast array of accounting tools and best management practices resulted (Walker & Wickam, 1986).

Few practitioners, be they managers or enterprise engineers, appreciate this history and the food chain it demonstrates:

  • First, basic scientific insights into abstraction, modeling and computation are developed.

  • These enable an innovative enterprise arrangement or operation.

  • Learning from this, practitioners and their suppliers develop engineering rules of thumb that are considered best practices.

  • Enterprises adopt these practices in how they are structured, inevitably producing limits to capability and agility that go largely unnoticed because all competitors use the same infrastructure tools.

Starting in the 1980s, the Pentagon confronted some of these limits. As the world’s largest buyer of complex manufactured systems, it was also the most far-reaching and practiced enterprise engineer. The so-called Military-Industrial Complex was then structured by some very restrictive laws, acquisition practices, and legal precedents. The resulting structures were vastly more costly than they could have been, producing weapon systems that were suboptimal; in some cases, systems simply could not be designed, manufactured, and fielded at any cost.

One example was an air-to-air missile that was essential to military strategy. The US version was not good enough; we knew what to build, and an adversary had a superior system. But the enterprise, itself a manufactured system, had to be designed and engineered simultaneously with the product. Changes to the missile and to the system that created the missile were bound in destructive loops. For instance, integrating key components took much longer than the technology cycles of the individual components; meanwhile, the management processes used to track changes and (partly) adapt to them were consuming 80% of the project cost (Winner, Pennell, Bertrand, & Slusarczuk, 1988).

In response, Congress funded a Defense Manufacturing Office at the Defense Advanced Research Projects Agency (DARPA), and work on the science base for what is now called interoperability focused on product modeling fundamentals (features, abstractions, and logics). Most of this work was performed on classified aerospace programs and quietly transferred to vendors serving the civil sector. Some was channeled through SEMATECH, the consortium addressing the enterprise interoperability crisis in the semiconductor industry. Once again, the food chain was clear: start with the science, the basics of abstraction and its calculus; only then can significant change be reflected in tools and practices for the engineers (Goranson, 1999).

The ideas presented here were initiated in studies for a planned civil research agency intended to carry that DARPA work forward: a sort of National Institutes of Health for manufacturing.
