Power and Performance Management of GPUs Based Cluster

Yaser Jararweh (Department of Computer Science, Jordan University of Science and Technology, Irbid, Jordan) and Salim Hariri (Department of Electrical and Computer Engineering, University of Arizona, Tucson, AZ, USA)
Copyright: © 2012 | Pages: 16
DOI: 10.4018/ijcac.2012100102

Abstract

Power consumption in GPU-based clusters has become the major obstacle to the adoption of high-productivity GPU accelerators in the high-performance computing industry. The power consumed by the GPU chips represents about 75% of the total power consumption of a GPU-based cluster. This is because the GPU cards are often configured for peak performance and consequently remain active all the time. In this paper, the authors present a holistic power and performance management framework that reduces the power consumption of a GPU-based cluster while keeping system performance within an acceptable, predefined threshold. The framework dynamically scales the GPU cluster to adapt to variations in the requirements of the incoming workload and increases the idleness of the GPU devices, allowing them to transition to a low-power state. The proposed framework demonstrated 46.3% power savings for GPU workloads while maintaining cluster performance, and its overhead on normal application/system operations and services is insignificant.
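
To make the scaling mechanism concrete, the following is a minimal host-side sketch of such a control loop, assuming a hypothetical scheduler interface: it samples the pending GPU workload, computes how many devices must stay active to keep the per-GPU load under a predefined threshold, and pushes the remaining devices toward a low-power state. It is illustrative only and not the authors' implementation; the queue trace, the max_jobs_per_gpu threshold, and the set_gpu_state() helper are invented stand-ins for the scheduler and driver calls a real cluster manager would use.

```cpp
// Illustrative sketch only: a simplified scaling loop in the spirit of the
// framework described above, not the authors' implementation. The workload
// trace, the per-GPU job threshold, and set_gpu_state() are hypothetical
// stand-ins for the scheduler and driver interfaces of a real cluster.
#include <cstdio>
#include <vector>

enum class GpuState { Active, LowPower };

struct GpuNode { int id; GpuState state; };

// Hypothetical: a canned trace of pending GPU jobs stands in for the queue
// length a real manager would read from the cluster scheduler.
int sample_queue_length(int step) {
    static const int trace[8] = {2, 9, 17, 24, 12, 4, 0, 0};
    return trace[step % 8];
}

// Hypothetical: a real manager would issue a driver call here to move the
// device between active and low-power states; the sketch just logs it.
void set_gpu_state(GpuNode& node, GpuState s) {
    node.state = s;
    std::printf("  GPU %d -> %s\n", node.id,
                s == GpuState::Active ? "active" : "low-power");
}

// Number of GPUs that must stay active so that no active GPU sees more than
// max_jobs_per_gpu queued jobs (the performance threshold of this sketch).
int required_active_gpus(int queue_len, int max_jobs_per_gpu, int total) {
    int needed = (queue_len + max_jobs_per_gpu - 1) / max_jobs_per_gpu;
    if (needed < 1) needed = 1;          // keep at least one GPU warm
    if (needed > total) needed = total;
    return needed;
}

int main() {
    std::vector<GpuNode> cluster = {
        {0, GpuState::LowPower}, {1, GpuState::LowPower},
        {2, GpuState::LowPower}, {3, GpuState::LowPower}};
    const int max_jobs_per_gpu = 6;

    for (int step = 0; step < 8; ++step) {      // one pass per sampling period
        int queue_len = sample_queue_length(step);
        int target = required_active_gpus(queue_len, max_jobs_per_gpu,
                                          static_cast<int>(cluster.size()));
        std::printf("step %d: queue=%d, active GPUs needed=%d\n",
                    step, queue_len, target);
        // Activate the first `target` GPUs; push the rest to low power.
        for (int i = 0; i < static_cast<int>(cluster.size()); ++i) {
            GpuState desired = i < target ? GpuState::Active
                                          : GpuState::LowPower;
            if (cluster[i].state != desired) set_gpu_state(cluster[i], desired);
        }
    }
    return 0;
}
```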
Article Preview

2. GPU Platform, CUDA, and GPU-Based Cluster System

Modern GPUs, such as the Tesla 10 series and Fermi, are very efficient at handling scientific applications because of their highly parallel structure. This efficient parallel structure, together with the GPU memory hierarchy, makes GPUs more effective than traditional CPUs for many data-parallel applications.
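
As a minimal illustration of this data-parallel execution model (an assumed example, not one taken from the article), the CUDA program below performs a SAXPY update in which each GPU thread processes exactly one array element; the array size and launch configuration are arbitrary choices for the sketch.

```cuda
// Minimal data-parallel CUDA example (SAXPY): one array element per thread.
// Sizes and launch parameters are illustrative, not taken from the article.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread computes one element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers.
    float* hx = new float[n];
    float* hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device buffers and host-to-device copies.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    std::printf("y[0] = %f\n", hy[0]);   // expect 4.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```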
