Manufacturing Complexity Analysis with Fuzzy AHP

Kouroush Jenab (Society of Reliability Engineers - Ottawa, Canada), Sam Khoury (East Carolina University, USA) and Ahmad R. Sarfaraz (California State University, Northridge, USA)
Copyright: © 2012 |Pages: 16
DOI: 10.4018/jsds.2012040103

Abstract

Budgeting, resource allocation, and planning in manufacturing systems are important issues that can be managed with complexity measures. Manufacturing processes have several primary areas of complexity that may not be measured precisely under uncertainty. Therefore, this study reports a Fuzzy Analytic Hierarchy Process (FAHP) model for evaluating process complexity that accounts for uncertainty and manufacturing process technology. The model can rank manufacturing processes by their relative complexities. An illustrative example covering several processes demonstrates the application of the model.
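To make the ranking idea concrete, the sketch below shows one common FAHP variant: pairwise comparisons expressed as triangular fuzzy numbers, fuzzy weights derived by Buckley's geometric-mean method, and a centroid defuzzification to produce a crisp ranking. This is an illustration under stated assumptions, not the article's exact formulation; the processes and comparison judgments are hypothetical.

```python
# Sketch of fuzzy AHP ranking with triangular fuzzy numbers (TFNs).
# A TFN (l, m, u) holds the lower bound, most likely value, and upper bound
# of an uncertain pairwise judgment. Data below are hypothetical.

def fuzzy_weights(matrix):
    """Buckley's geometric-mean method: fuzzy row geometric means, normalized."""
    n = len(matrix)
    gm = []
    for row in matrix:
        l = m = u = 1.0
        for (a, b, c) in row:
            l *= a; m *= b; u *= c
        gm.append((l ** (1 / n), m ** (1 / n), u ** (1 / n)))
    total = tuple(sum(g[i] for g in gm) for i in range(3))
    # Fuzzy division swaps the bounds of the divisor: l/u_tot, m/m_tot, u/l_tot.
    return [(g[0] / total[2], g[1] / total[1], g[2] / total[0]) for g in gm]

def centroid(t):
    """Defuzzify a TFN by its centroid, (l + m + u) / 3."""
    return sum(t) / 3.0

# Hypothetical fuzzy pairwise comparisons of three manufacturing processes
# with respect to a single complexity criterion (reciprocals on the lower
# triangle, identity (1,1,1) on the diagonal).
M = [
    [(1, 1, 1),         (1, 2, 3),       (2, 3, 4)],
    [(1/3, 1/2, 1),     (1, 1, 1),       (1, 2, 3)],
    [(1/4, 1/3, 1/2),   (1/3, 1/2, 1),   (1, 1, 1)],
]

scores = [centroid(w) for w in fuzzy_weights(M)]
ranking = sorted(range(len(M)), key=lambda i: -scores[i])
print(ranking)  # most to least complex: [0, 1, 2]
```

In a full FAHP model, weights like these would be computed for each criterion in the hierarchy and aggregated into an overall complexity score per process; the article's approach may use a different weight-derivation scheme (e.g., extent analysis).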

2. Literature Review

Considering the broad definition of complexity, it can be used as a yardstick for budgeting, resource allocation, and comparative analysis. However, quantitative analysis of complexity has not received the scholarly attention and recognition that other seminal concepts, such as flexibility or robustness, have received (Bashir & Thomson, 2001; El-haik & Yang, 1999; Rodriguez-Toro et al., 2003; Smith & Jenks, 2006). The literature on complexity can be classified into computational, software, manufacturing and operations, project, enterprise, and information systems complexities.

Computational complexity is a branch of computational theory that deals with computational problems: problem instances, representing problem instances, decision problems as formal languages, function problems, and measuring the size of an instance. In this regard, Chakraborty and Choudhury (1999) studied two computing operations (i.e., addition and multiplication) with different processing times and proposed a statistical approach based on a weighting system. Dehmer et al. (2006) investigated an algorithm to measure the structural similarity of generalized graphs and showed that their algorithm was more efficient than the classical approaches used by Kaden (1982). De Reyck and Herroelen (1996) studied the relationship between the hardness of a problem instance and the topological structure of its network by measuring complexity. Liu et al. (2007) analyzed the complexity of computing an AU measure within the D–S theory framework and delineated the conditions that affect computational complexity. However, Huynh and Nakamori (2010) critiqued the results of Liu et al.'s (2007) work and rectified several mistakes in the formulation of the developed F-algorithm.
