Sensitivity Analysis of Laser Cutting Based on Metamodeling Approach

Toufik Al Khawli, Urs Eppelt, Wolfgang Schulz
DOI: 10.4018/978-1-4666-8823-0.ch020

Abstract

In this work, the utility of a metamodel for gaining valuable information about the optimization of a laser cutting process with a CW laser source is analyzed. The underlying simulation is characterized by a high-dimensional input parameter set; each parameter has its own range, and together the parameters and their ranges span the full parameter domain. The quality criteria are analyzed and used as objective functions to optimize the process. Simulation results can only build up process understanding if they are presented in their entirety, together with their origin in the parameter domain. For this purpose, a metamodeling concept is presented that takes the simulation results and generates a process map clearly indicating the process domains. To gain further insight, the Elementary Effect method is applied to screen for the parameters that exhibit the greatest impact on the process.
Chapter Preview

Introduction

Nowadays, global interests are strongly shaped by the economic, ecological, and social trends of recent decades, such as globalization, global warming, the shortage of energy resources, and financial crises (Schmidt et al., 2011). Due to these trends, several complexities arise in production. For example, shorter product lifecycles, higher global price competition, and faster-changing market demands put production companies under pressure to make their workflows more efficient and more scalable, and to provide reliable and cost-effective responses to the unpredictable changes that take place in a global market (Schuh et al., 2011).

Production planning and scheduling require a large amount of human information processing and decision-making, and complex manufacturing systems and processes make the tasks of today’s planners and schedulers even more demanding (Heilala et al., 2010). Decisions in manufacturing, engineering, and production management require considering the effect of multidimensional parameters on preselected process criteria. The relationship between interdependent parameters and criteria is, first, very difficult to establish and, second, too complex for the human mind to handle all at once. Recently, a modern approach to help planners and schedulers run production more efficiently has gained considerable importance (Gasser et al., 2011): the use of simulation-based decision support systems in the field of production technology (Mönch, 2007).

Simulation applications have positively influenced both research departments and production environments. In research departments, they enable a better understanding of production processes, since they allow multi-physical interactions and complex material models to be decomposed so that their individual influence on the final processing result can be studied. In production environments, on the other hand, simulations serve as a tool for planning and optimizing either a single production step or even whole process chains (Otto et al., 2011).

The conventional approach to modeling and simulating manufacturing processes has so far been to perform several sets of individual simulations. Pieces of information or rules are then extracted directly via data analysis of these single simulation results. The central goal of this scientific data analysis is to understand and extract the behavior of the process from a sample data set that contains the input parameters and the output criteria and that characterizes the system as a high-dimensional function (Fayyad et al., 1996). Each individual simulation is characterized by a set of parameters in a high-dimensional parameter space. If the time required to run the simulation model increases, the number of parameter space dimensions rises, or the simulation model becomes highly nonlinear, the data analysis procedure requires a very large data set that is, due to time constraints, difficult to obtain. Furthermore, when the simulation time increases, conventional analysis tasks such as optimization, sensitivity analysis, or design space exploration become impractical because of the high number of evaluation runs. Although the available computational power has been increasing rapidly over the years, it is still not feasible to run full numerical simulations of multiphysics applications at a reasonable computational cost. For example, it takes Ford Motor Company about 36 to 160 hours to run one crash simulation (Gu, 2001); the total computation time required for optimizing such a task with only 50 iterations would therefore be around 75 days to 11 months, which is unacceptable in practice. Additionally, it is important to note that the parameter-criteria relationship is revealed only by a set of discrete data and may thus already be indicated by appropriate scatter diagrams mapping parameters onto criteria.

To overcome these difficulties, researchers have recently been using approximation models that mimic the behavior of the expensive simulation up to a required accuracy. These models, also known as metamodels, are not only faster but also far less expensive to evaluate than the physical simulation models, which would otherwise have to be queried again and again for every operation on the model. Once a fast predictive metamodel is available, a sensitivity analysis or an evolutionary multidimensional optimization can easily be performed by using the metamodel as an emulator or data generator.
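To illustrate this emulator idea, the following minimal Python sketch (not taken from the chapter) replaces the expensive simulation with a cheap analytic placeholder, fits a radial basis function metamodel to a sparse sample of runs, and then evaluates the surrogate densely; the placeholder function, parameter meanings, and sample sizes are illustrative assumptions only.

# Minimal surrogate-modeling sketch: a cheap analytic function stands in for an
# expensive laser-cutting simulation, and an RBF interpolant acts as the emulator.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_simulation(x):
    # Placeholder for a costly simulation; x[:, 0] ~ laser power, x[:, 1] ~ speed
    return np.sin(3.0 * x[:, 0]) * np.exp(-x[:, 1]) + 0.5 * x[:, 0] * x[:, 1]

rng = np.random.default_rng(1)
x_train = rng.uniform(0.0, 1.0, size=(60, 2))     # sparse set of simulation runs
y_train = expensive_simulation(x_train)

metamodel = RBFInterpolator(x_train, y_train, kernel='thin_plate_spline')

# The emulator can now be evaluated densely (e.g. for a process map or inside an
# evolutionary optimizer) at negligible cost compared to the full simulation.
p, v = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
grid = np.column_stack([p.ravel(), v.ravel()])
process_map = metamodel(grid).reshape(p.shape)
print(process_map.shape)                           # (200, 200)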

Key Terms in this Chapter

Metamodel: A simple and fast approximation model that mimics the behavior of an expensive simulation up to a required accuracy. Metamodels can replace the complex simulation model in conventional engineering applications such as optimization, sensitivity analysis, and design space exploration.

EE Method: The Elementary Effect method, also known as the Morris method, is a global sensitivity analysis technique used to screen the most influential and non-influential parameters of a high-dimensional or highly complex computational model.
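As an illustration only, the following numpy sketch estimates elementary effects with a simplified one-at-a-time design (the full Morris method uses randomized trajectories over a discrete grid); the toy model, number of repetitions, and step size are assumed values.

# Simplified elementary-effect screening for a toy 3-parameter model.
import numpy as np

def f(x):
    # Toy model: x[0] strong, x[1] moderate, x[2] negligible influence
    return 4.0 * x[0] + 2.0 * x[1] ** 2 + 0.01 * x[2]

k, r, delta = 3, 20, 0.1            # number of inputs, repetitions, step size
rng = np.random.default_rng(0)
ee = np.zeros((r, k))

for t in range(r):
    x = rng.uniform(0.0, 1.0 - delta, size=k)   # random base point in [0, 1]^k
    y0 = f(x)
    for i in range(k):                          # one-at-a-time perturbations
        xp = x.copy()
        xp[i] += delta
        ee[t, i] = (f(xp) - y0) / delta         # elementary effect of input i

mu_star = np.abs(ee).mean(axis=0)   # overall importance of each parameter
sigma = ee.std(axis=0)              # indicator of nonlinearity / interactions
print(mu_star, sigma)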

DOE Sampling: Design of Experiments sampling techniques enable designers to select discrete data sets that contain both the inputs and outputs of a physical system in order to estimate or extract its characteristics and dependencies.
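A minimal sketch of such a space-filling design, here a Latin hypercube generated with scipy.stats.qmc and scaled to hypothetical laser-cutting parameter ranges (the parameter names and bounds are placeholders, not values from the chapter):

# Latin hypercube DOE sketch; parameter names and ranges are hypothetical.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=7)
unit_sample = sampler.random(n=50)                 # 50 points in [0, 1)^3

# Scale to physical ranges: laser power [kW], cutting speed [m/min], focus [mm]
lower = [1.0, 1.0, -3.0]
upper = [6.0, 10.0, 3.0]
design = qmc.scale(unit_sample, lower, upper)      # shape (50, 3)

print(design[:3])   # first three parameter sets to be handed to the simulation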

RBFN: A radial basis function network (RBFN) is an artificial neural network that uses radial basis functions as its basis functions. It is mainly used to perform multidimensional interpolation from a discrete data set containing the inputs and outputs of a model and is well known for producing accurate interpolations of complex nonlinear responses.
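The mechanics can be sketched in a few lines with a Gaussian kernel, where the output-layer weights follow from a single linear solve; this is a didactic sketch with an assumed kernel width and toy data, not the implementation used in the chapter:

# Didactic Gaussian RBF network: interpolation weights from a linear solve.
import numpy as np

def rbfn_fit(x_train, y_train, eps=0.3):
    # Kernel matrix Phi[i, j] = exp(-||x_i - x_j||^2 / (2 * eps^2))
    d2 = np.sum((x_train[:, None, :] - x_train[None, :, :]) ** 2, axis=-1)
    phi = np.exp(-d2 / (2.0 * eps ** 2))
    return np.linalg.solve(phi, y_train)     # one weight per training center

def rbfn_predict(x_new, x_train, weights, eps=0.3):
    d2 = np.sum((x_new[:, None, :] - x_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * eps ** 2)) @ weights

rng = np.random.default_rng(3)
x_train = rng.uniform(0.0, 1.0, size=(40, 2))
y_train = np.sin(4.0 * x_train[:, 0]) + x_train[:, 1] ** 2   # toy response

w = rbfn_fit(x_train, y_train)
x_test = rng.uniform(0.0, 1.0, size=(5, 2))
print(rbfn_predict(x_test, x_train, w))      # interpolated predictions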
