Below please find a list of definitions for the term that you selected, drawn from multiple scholarly research resources.

What is Distributed Memory?

Encyclopedia of Information Science and Technology, Second Edition
Distributed memory means that memory is associated with individual processors and that each processor can address only its own memory. Some authors refer to this type of system as a multicomputer, reflecting the fact that the building blocks in the system are themselves small computer systems complete with processor and memory.
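Because each processor can address only its own memory, programs on distributed-memory machines typically move data between processors by explicit message passing. The sketch below is illustrative rather than part of the chapter; it assumes an MPI library (for example MPICH or Open MPI) and shows one process receiving, via a message, a value it cannot read directly from another process's memory.

/* Minimal sketch of message passing on a distributed-memory system.
 * Each MPI rank owns its own memory; data moves only via explicit messages. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int local_value = rank * 10;   /* lives only in this process's memory */

    if (rank == 0 && size > 1) {
        int received;
        /* Rank 0 cannot read rank 1's memory directly; it must receive a message. */
        MPI_Recv(&received, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("Rank 0 received %d from rank 1\n", received);
    } else if (rank == 1) {
        MPI_Send(&local_value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}

Such a program would typically be compiled with mpicc and launched with mpirun -np 2; each rank keeps its own private copy of local_value, which is the defining property of a distributed-memory (multicomputer) system.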
Published in Chapter:
Parallel and Distributed Visualization Advances
Huabing Zhu (National University of Singapore, Singapore), Lizhe Wang (Institute of Scientific Computing, Forschungszentrum Karlsruhe, Germany), and Tony K.Y. Chan (Nanyang Technological University, Singapore)
DOI: 10.4018/978-1-60566-026-4.ch482
Abstract
Visualization is the process of mapping numerical values into perceptual dimensions, rendering data as visible phenomena and thereby conveying insight. Through these visible phenomena, the human visual system can recognize and interpret complex patterns and detect meaning and anomalies in scientific data sets. Another role of visualization is to display new data in order to uncover new knowledge. Hence, visualization has emerged as an important tool widely used in science, medicine, and engineering. As a consequence of our increased ability to model and measure a wide variety of phenomena, the data generated for visualization are far beyond the capability of desktop systems. In the near future, we anticipate collecting data at the rate of terabytes per day from numerous classes of applications. These applications process huge volumes of data produced by ever more sensitive and accurate instruments, for example, telescopes, microscopes, particle accelerators, and satellites (Foster, Insley, Laszewski, Kesselman, & Thiebaux, 1999). Furthermore, the rate at which data are generated is still increasing. Therefore, visualizing large data sets imposes greater requirements on a variety of resources. For most users, it becomes difficult to satisfy all of these requirements on a single computing platform, or for that matter, in a single location. In a distributed computing environment, various resources are available, for example, large-volume data storage, supercomputers, video equipment, and so on. At the same time, high-speed networks and the advent of multi-disciplinary science mean that the use of remote resources becomes both necessary and feasible (Foster et al., 1999).
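As a small illustration of the mapping step the abstract describes, and not code from the chapter itself, the sketch below converts scalar data values into colors using a simple linear blue-to-red colormap; the value range, the colormap, and the sample values are assumptions chosen only for this example.

/* Illustrative sketch: mapping numerical values into a perceptual dimension
 * (color), using an assumed linear blue-to-red colormap over a known range. */
#include <stdio.h>

typedef struct { unsigned char r, g, b; } Color;

/* Map a scalar in [vmin, vmax] to a color: low values blue, high values red. */
static Color map_scalar_to_color(double v, double vmin, double vmax)
{
    double t = (v - vmin) / (vmax - vmin);   /* normalize to [0, 1] */
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    Color c;
    c.r = (unsigned char)(255.0 * t);
    c.g = 0;
    c.b = (unsigned char)(255.0 * (1.0 - t));
    return c;
}

int main(void)
{
    double samples[] = { 0.0, 2.5, 5.0, 7.5, 10.0 };   /* hypothetical data */
    for (int i = 0; i < 5; i++) {
        Color c = map_scalar_to_color(samples[i], 0.0, 10.0);
        printf("value %5.2f -> rgb(%3u, %3u, %3u)\n", samples[i], c.r, c.g, c.b);
    }
    return 0;
}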