1. Introduction
In today’s dynamic production environments, business success often requires simultaneously pursuing high quality, short throughput times, and low cost. These goals are identified as competitive capacities and used to measure manufacturing performance (Machuca et al., 2011; Yang and Pan, 2004; Cua et al., 2001). The pursuit of ongoing quality improvement has been studied heavily since the quality revolution of the 1980s (Crosby, 1979). Ferdows and De Meyer (1990) propose that manufacturing capabilities build upon each other and that quality is the foundation required for sustainable business performance. The study of variability amplification has been of interest to supply chain management scholars since the first study on demand amplification appeared in the late 1950s. Forrester (1958) proposes that the decisions made by a company depend on the flow of information, materials, money, manpower, and capital equipment among its upstream and downstream partners. He provides an initial explanation and observational evidence that a small, sudden change in retail sales produces increasing variability in order rates, factory output, warehouse inventory, and unfilled orders throughout the supply chain. The most influential studies are by Lee et al. (1997a, b), who coin the term “bullwhip effect” for the phenomenon of order quantity variance increasing as orders move upstream from end customers to suppliers in a supply chain. They identify four main operational causes of the bullwhip effect and discuss options for mitigating them.
Since the 1990s, a considerable body of literature on the bullwhip effect has emerged, with the majority focusing on quantifying the demand-order process through different types of supply chains (Agrawal et al., 2009). Such variability amplification, however, is not unique to demand-order management; quality changes in a supply chain exhibit similar behavior. In any manufacturing system, the quality of outputs is significantly influenced by the quality of inputs and by the activities conducted at each stage of the production process (Baiman et al., 2000; Fernandes et al., 2017). As Taguchi’s quality model suggests, quality generally does not drop suddenly; instead, quality loss occurs progressively as variation increases within the specification limits (Upadhayay and Vrat, 2016). This processing noise gradually degrades production quality over time and, ultimately, can increase the variance in final product quality. As a result, the variance in quality increases as materials move downstream from the supplier through successive stages of the supply chain. Without stable quality as a foundation of manufacturing and supply chain systems, performance gradually deteriorates, and additional production and quality problems are likely to emerge over time (Ferdows and De Meyer, 1990; Zu and Cui, 2013).
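The mechanism described above can be illustrated numerically. The following sketch is not the paper’s model; it simply assumes each production stage adds independent noise to an on-target quality characteristic, so variance accumulates downstream, and it evaluates the resulting expected quality loss with Taguchi’s quadratic loss function, E[L] = k(σ² + (μ − T)²). The stage count, noise levels, and loss coefficient k are all illustrative choices.

```python
import random
import statistics

def simulate_chain(n_items=10000, n_stages=4, input_sd=0.5, stage_sd=0.3, seed=42):
    """Pass a quality characteristic (target T = 0) through successive
    stages, each adding independent processing noise; record the variance
    of the population after every stage."""
    rng = random.Random(seed)
    items = [rng.gauss(0.0, input_sd) for _ in range(n_items)]
    variances = [statistics.pvariance(items)]
    for _ in range(n_stages):
        items = [x + rng.gauss(0.0, stage_sd) for x in items]  # stage noise
        variances.append(statistics.pvariance(items))
    return variances

def taguchi_expected_loss(variance, mean=0.0, target=0.0, k=1.0):
    """Taguchi quadratic loss for the whole population:
    E[L] = k * (sigma^2 + (mean - target)^2)."""
    return k * (variance + (mean - target) ** 2)

variances = simulate_chain()
# Variance, and hence expected quality loss, grows stage by stage:
assert all(later > earlier for earlier, later in zip(variances, variances[1:]))
```

Because the stage noises are independent, the output variance is simply the input variance plus the sum of the stage-noise variances, which is the sense in which quality variance amplifies as materials move downstream.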
This amplification is magnified as throughput time lengthens. Throughput time is the total time a product requires to pass through a manufacturing process, including inspection time, moving time, waiting/storage time, and processing time. With longer throughput time, the uncertainty at each stage of a supply chain increases; consequently, the variance in quality also increases. The objective of this paper is therefore to demonstrate the influence of throughput time on quality variance amplification and on the quality loss borne by supply chain members, which ultimately affects business performance. To do so, we consider a two-echelon supply chain with a manufacturer and a supplier. A time series model is developed to identify the relationship between the variance in quality incurred by the manufacturer and the variance in quality experienced by its supplier. Additionally, the supply chain members’ expected quality costs are compared.
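A generic illustration of the throughput-time effect, under stated assumptions rather than the paper’s actual time series model: if the quality deviation drifts like a random walk while an item is in process, its variance grows linearly with throughput time, so the downstream (manufacturer) echelon inherits the supplier’s variance and adds its own. All parameter values (`noise_rate`, throughput times, input variance, loss coefficient `k`) are hypothetical.

```python
def quality_variance(input_variance, throughput_time, noise_rate):
    """Variance of the quality characteristic after `throughput_time`
    periods, assuming independent per-period noise of variance `noise_rate`
    (random-walk accumulation: variance grows linearly with time)."""
    return input_variance + noise_rate * throughput_time

def expected_quality_cost(variance, k=1.0):
    """Quadratic (Taguchi-style) expected quality cost for an on-target process."""
    return k * variance

# Supplier echelon: raw-input variance 0.2, throughput time 5 periods.
supplier_var = quality_variance(0.2, throughput_time=5, noise_rate=0.05)
# Manufacturer echelon: receives the supplier's output, adds 8 more periods.
manufacturer_var = quality_variance(supplier_var, throughput_time=8, noise_rate=0.05)

# Longer cumulative throughput time -> larger variance and quality cost downstream.
assert manufacturer_var > supplier_var
assert expected_quality_cost(manufacturer_var) > expected_quality_cost(supplier_var)
```

Shortening either echelon’s throughput time reduces the variance, and therefore the expected quality cost, for every downstream member, which is the managerial intuition the paper formalizes.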