Relationship Between Performance Error and Human Information Processing

Mitsuhiko Karashima (Tokai University, Japan)
Copyright: © 2008 | Pages: 7
DOI: 10.4018/978-1-59904-889-5.ch145

Abstract

Human information processing (HIP) performance using working memory can be assessed by two types of indicator when an HIP task is carried out: error occurrence and the HIP time taken to complete the task. Errors are classified into those caused by the task requirement exceeding some human limitation and those caused by carelessness even though the human's limitations still leave enough capacity to do the task (Reason, 1990). The former is regarded as an error caused by a lack of the HIP ability needed to do the required information processing. The latter is regarded as an error caused by a temporary reduction of some HIP ability, such as attention. Although many kinds of factors generate errors, from the viewpoint of HIP an error can be considered to arise from the relationship between the required quantity or quality of the information processing and the HIP ability, so the characteristics of HIP can be considered to influence error generation directly. In this chapter, the characteristics of HIP related to error are illustrated with the results of experiments (Karashima, Okamura & Saito, 1994; Karashima & Saito, 2001).

Key Terms in this Chapter

Working Memory Resource Capacity: Working memory consists of one main system and two slave systems. Every system has a workspace for information processing and storage, and each workspace has limited capacity. When the amount of information processing and storage in a system nears full capacity, interference between processing and storage occurs easily; this interference degrades information-processing performance or causes the loss of stored information.

Error Ratio: The error ratio is calculated as the number of error occurrences divided by the number of trials in an HIP task. It is measured as one of the indices of performance, mental workload, and fatigue in the task.
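The calculation above is a simple ratio; a minimal sketch (the function name and trial counts are illustrative, not from the chapter):

```python
def error_ratio(errors: int, trials: int) -> float:
    """Error ratio = number of error occurrences / number of trials."""
    if trials <= 0:
        raise ValueError("trials must be positive")
    return errors / trials

# e.g., 3 errors observed over 60 trials of an HIP task
print(error_ratio(3, 60))  # → 0.05
```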

Working Memory: Working memory is a theoretical framework in cognitive psychology that describes the processes for processing and storing information. Baddeley & Hitch (1974) proposed the model of working memory. It consists of one main system, the "central executive," and two slave systems, the "phonological loop" and the "visuospatial sketchpad."

1/f Fluctuation: 1/f fluctuation is also called 1/f noise or pink noise. Its power spectral density is proportional to the reciprocal of the frequency. It occurs in many fields, and it is reported that people tend to find time series with 1/f fluctuation comfortable. The name comes from its being intermediate between white noise (1/f^0 noise) and red noise (1/f^2 noise).
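The spectral definition above can be checked numerically. A sketch (assuming NumPy; the spectral-shaping construction is one common way to synthesize pink noise, not a method from the chapter): white noise is shaped in the frequency domain so that amplitude scales as 1/sqrt(f), giving power proportional to 1/f, and the log-log slope of the resulting spectrum is then estimated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2**16

# Shape complex white noise in the frequency domain:
# amplitude ∝ 1/sqrt(f) gives power spectral density ∝ 1/f.
freqs = np.fft.rfftfreq(n, d=1.0)
spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
spectrum[1:] /= np.sqrt(freqs[1:])
spectrum[0] = 0.0  # remove the DC component
pink = np.fft.irfft(spectrum, n=n)

# Estimate the spectral slope: for 1/f noise, log-power vs
# log-frequency should have a slope close to -1.
psd = np.abs(np.fft.rfft(pink)) ** 2
slope = np.polyfit(np.log(freqs[1:]), np.log(psd[1:]), 1)[0]
print(slope)  # slope ≈ -1 for pink noise
```

For comparison, leaving the spectrum unshaped yields white noise (slope ≈ 0), while dividing by f instead of sqrt(f) yields red noise (slope ≈ -2).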

FLOWM (FLuctuation of Working Memory Resource Capacity): FLOWM was proposed to explain performance phenomena, such as the variations of error occurrence and HIP time, that could not be explained without contradiction by conventional models such as Baddeley's, in which working memory resource capacity is constant. FLOWM is a new hypothesis that adds the concept of fluctuation over time to the concept of working memory resource capacity in the conventional working memory models.
