A Model-Driven Engineering Method for DRE Defense Systems Performance Analysis and Prediction

Katrina Falkner, Vanea Chiprianov, Nickolas Falkner, Claudia Szabo, Gavin Puddy
Copyright: © 2014 |Pages: 26
DOI: 10.4018/978-1-4666-6194-3.ch012

Abstract

Autonomous, Distributed Real-Time Embedded (DRE) defense systems are typically characterized by hard constraints on space, weight, and power. These constraints strongly affect the non-functional properties of the final system, especially its performance. System execution modeling tools permit early prediction of the performance of model-driven systems; however, the focus to date has been on practical aspects and on creating tools that work in specific cases, rather than on the process and methodology applied. In this chapter, the authors present an integrated method for performance analysis and prediction of model-driven DRE defense systems. They present both the tools to support the process and a method to define these tools. The authors explore these tools and processes within an industry case study from a defense context.

Background

Mission-critical Distributed Real-time and Embedded (DRE) systems, such as naval combat systems or mission systems, have life-cycles measured in decades (Falkner, 2013). The design complexity and cost of such long-lived, large systems continue to grow, while business owners continue to seek improvements in the return on investment for such projects. Although an understanding of both functional and non-functional aspects of the system design is important, issues associated with the non-functional aspects are of greater concern for resource-constrained platforms, such as submarines or autonomous vehicles. With strict budget allocations for the space, weight, and power of the various systems installed on such platforms, any early insight into the performance of these systems, and their corresponding deployment and budget considerations, becomes crucial.

Prediction of software performance has developed from early approaches based on abstract models to approaches based on Model-Driven Engineering (MDE) (Woodside, 2007a). MDE operates through the definition of Domain Specific Modeling Languages (DSMLs), which are used to develop models that encapsulate the essential requirements of the problem space at a high level of abstraction, using abstractions that fit the problem domain and are hence more understandable to domain experts. MDE follows a process by which these models are transformed, either manually or automatically, through stages of increasing specificity and detail, eventually resulting in an executable software system.
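To make this staged transformation concrete, the following sketch (all names hypothetical, not taken from the chapter or any particular MDE toolchain) lowers a toy domain-level component model through a platform-aware refinement and finally emits skeleton code:

```python
from dataclasses import dataclass

# Stage 1: abstract, domain-level model (an instance of a toy DSML).
@dataclass
class ComponentModel:
    name: str
    publishes: list[str]   # topics this component emits
    subscribes: list[str]  # topics this component consumes

# Stage 2: a more specific, platform-aware model.
@dataclass
class DeployableComponent:
    name: str
    ports: dict[str, str]  # port name -> direction ("in" / "out")

def refine(model: ComponentModel) -> DeployableComponent:
    """Model-to-model transformation: map domain concepts onto platform ports."""
    ports = {t: "out" for t in model.publishes}
    ports.update({t: "in" for t in model.subscribes})
    return DeployableComponent(model.name, ports)

def generate_code(dep: DeployableComponent) -> str:
    """Model-to-text transformation: emit an executable skeleton."""
    lines = [f"class {dep.name}:"]
    for port, direction in dep.ports.items():
        handler = "publish" if direction == "out" else "handle"
        lines.append(f"    def {handler}_{port}(self, msg): ...")
    return "\n".join(lines)

tracker = ComponentModel("TrackManager", publishes=["track"], subscribes=["sensor"])
print(generate_code(refine(tracker)))
```

Real MDE chains perform the same two kinds of step (model-to-model refinement, then model-to-text generation), only with far richer metamodels and tool support.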

MDE techniques are typically applied to the development of application software components, but may also be used to model and solve the configuration and deployment phases, as well as system execution emulation, testing, and analysis. System Execution Modeling (SEM) (Hill, 2010), a recent development from research into measurement-based performance prediction, provides detailed early insight into the non-functional characteristics of a DRE system design. A SEM-based approach supports the evaluation of overall (software) system performance, incorporating component interactions and the performance impact of third-party software such as middleware. These approaches are based upon simple models of resource consumption by the component’s “business logic” (Hill, 2010; Paunov, 2006) and support detailed performance modeling of software systems, enabling performance predictions through the execution of representative source code, with behavior and workload models deployed on realistic hardware test-beds.
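The core SEM idea of standing in for unwritten business logic with a stub of equivalent resource cost can be sketched minimally as follows (illustrative only; the function names and parameterization are assumptions, and the chapter's tooling is far richer than a busy-loop):

```python
import time

def emulate_workload(cpu_ms: float, payload_bytes: int) -> bytes:
    """Stand-in for a component's business logic: consume roughly
    cpu_ms milliseconds of CPU time, then produce a payload of the
    modeled size. SEM tools parameterize such stubs from behavior
    and workload models rather than from finished application code."""
    deadline = time.process_time() + cpu_ms / 1000.0
    while time.process_time() < deadline:
        pass  # busy-wait: the CPU cost specified by the workload model
    return bytes(payload_bytes)

def timed_call(cpu_ms: float, payload_bytes: int) -> tuple[float, int]:
    """Measure wall-clock service time of the emulated component."""
    start = time.perf_counter()
    out = emulate_workload(cpu_ms, payload_bytes)
    return time.perf_counter() - start, len(out)

elapsed, size = timed_call(cpu_ms=5.0, payload_bytes=1024)
print(f"service time: {elapsed * 1000:.1f} ms, payload: {size} B")
```

Running many such stubs across a realistic test-bed, and recording their measured service times, is what allows end-to-end performance to be predicted before the real components exist.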

SEM and MDE may be used in combination to support the emulation of system components and performance models, enabling performance data to be used to redesign and reconfigure the system before any construction of the corresponding real system (Falkner, 2013). This is becoming increasingly important with the trend towards open DRE systems, which must support frequent and rapid evolution, changes in component integration and communication partners, and responses to run-time changes (Trombetti, 2005). The use of MDE, DSMLs, automatic code generation, and off-the-shelf technologies has enabled the SEM approach to abstract away the development complexities of DRE systems, while still providing performance insight at the level of detail required for the performance evaluation of mission-critical systems.

Key Terms in this Chapter

Domain Specific Modeling Language (DSML): Is a computer language for creating models that are specific to a certain domain. It offers expressive power focused on a particular problem domain through appropriate notation and abstractions.

Model Driven Engineering (MDE): Is a field of software engineering that uses models for documenting, executing, visualizing, and analyzing software and systems.

Distributed Real-Time Embedded (DRE) System: Is a computer system with a dedicated function, embedded as part of a larger device, whose components are located in a network and which must guarantee response within strict time constraints.

Software Development Method: Consists of a set of modeling conventions (i.e., a modeling language) and a process.

System Execution Modeling (SEM): Offers detailed early insight into the non-functional characteristics of a DRE system design. A SEM-based approach supports the evaluation of overall (software) system performance, incorporating component interactions and the performance impact of third-party software such as middleware. These approaches are based upon simple models of resource consumption by the component’s “business logic” and support detailed performance modeling of software systems.

Model Based (Software) Performance Prediction: Is the process of predicting (in early phases of the life cycle) and evaluating (at the end), based on performance models, whether the software system satisfies the user’s performance goals.
