1. Introduction
A data center is a facility where computers and related equipment such as peripherals are maintained to supply computing services to institutions. These could be universities, companies, national labs, hospitals, research centers, government organizations and others. The size of a data center is determined by the number of servers, their capacity, the number of personnel, the number of users, and so on. Thus, we have smaller data centers, as in universities, and larger ones, as in conglomerates like Google and Amazon. In this paper, we focus on an important parameter, greenness, which refers to the tendency of data centers to be environment-friendly. This is measured by factors such as carbon footprint and power usage effectiveness. Carbon footprint characterizes carbon dioxide emissions into the atmosphere, and it is desirable to keep this value low. Power usage effectiveness (PUE) is the ratio of the total energy consumed by the whole facility to the energy actually consumed by its IT equipment, i.e., its computers and peripherals (Belady et al. 2008; Forest et al. 2008). A PUE of 1.0, while challenging to achieve, is considered ideal, and it is advisable to keep PUE close to this value. Taking into account such factors, we address the issue of greening data centers with a perspective on cloud computing, which arrived less than a decade ago.
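The PUE ratio defined above can be sketched in a few lines of code; the function name and the kWh figures below are illustrative, not drawn from the paper.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by
    IT equipment energy. A value of 1.0 is the ideal case in which
    all energy delivered to the facility reaches the IT equipment."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a facility drawing 1,800 kWh in total while its
# servers and peripherals consume 1,000 kWh has a PUE of 1.8, meaning
# 800 kWh went to overhead such as cooling and lighting.
print(pue(1800, 1000))  # → 1.8
```

The gap between the computed value and 1.0 is exactly the overhead (cooling, lighting, power distribution losses) that greening efforts try to reduce.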
Data centers have experienced rapid growth in the past decade (Koomey, 2007; Koomey 2011; US EPA 2007). This growth has connected the world, but the growth of data centers and the Internet is not without challenges, as exposed by Glanz (2012) and Greenberg et al. (2009). Many of these challenges are being addressed by research and commercial organizations. One potential step forward is the shift from internal data centers to cloud computing. Cloud computing offers a service-based, pay-as-you-go approach rather than a product-based approach requiring a capital investment (Armbrust et al. 2009). The cloud is predicted to be a disruptive technology for the management of organizations. Its promises have been lower capital outlays, with payment only for the services that are used, while eliminating the overhead of owning a data center. These promises are very attractive to start-up businesses, but for larger organizations such as banks, security and privacy issues may outweigh the cost savings of the cloud. In between large banks and start-ups fall the rest of the organizations that rely on a data center for their computing needs and want to balance the cost savings of the cloud against strict privacy and security requirements.
Besides the potential cost savings of operating on the cloud (depending on the situation or organization), the cloud typically offers better energy efficiency (Schulz, 2009; Sun, 2009). With reference to energy efficiency, the PUE metric used in the data center industry is a crucial one, similar to the miles-per-gallon metric in the transportation industry. As stated earlier, a perfectly efficient system would have a PUE of one (Stanley et al. 2007). This hypothetical situation would occur when the non-IT portion of the facility power could be lowered to zero, so there would be no costs for cooling or accessory lighting. In many cases the larger cloud providers have achieved a PUE value of near one, while many internal data centers, such as those in our university, have an average PUE of approximately two. This higher PUE value reflects the additional total facility power consumed by cooling and peripheral lighting. Future cloud or internal data centers may have the potential to achieve an effective PUE below one by reusing waste heat to warm other sections of the building or through district heating (Patterson et al. 2010). An area of future research would be the use of an industrial ecological approach to data centers that reuse waste heat and incorporate air-side economizing, or free cooling.
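The waste-heat-reuse idea above can be made concrete with a simplified accounting sketch: crediting reused energy against the facility total before dividing by IT energy, in the spirit of the energy-reuse accounting discussed by Patterson et al. (2010). The function name and the numbers are illustrative assumptions, not values from the paper.

```python
def effective_pue(total_kwh: float, it_kwh: float, reused_kwh: float) -> float:
    """Simplified reuse-adjusted efficiency: net facility energy after
    crediting reused waste heat, divided by IT equipment energy.
    Unlike plain PUE, this can fall below 1.0 when the energy reused
    (e.g., for district heating) exceeds the facility overhead."""
    if it_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return (total_kwh - reused_kwh) / it_kwh

# Hypothetical example: 2,000 kWh total facility energy, 1,000 kWh of
# IT load, and 1,100 kWh of waste heat reused elsewhere in the building.
print(effective_pue(2000, 1000, 1100))  # → 0.9
```

In this sketch the 1,000 kWh of overhead is more than offset by the 1,100 kWh of reused heat, which is how an effective value below one becomes possible even though the raw PUE (2.0 here) never can be.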