Abstract

A large portion of the power consumed by datacenters in small and medium-sized businesses is wasted, as machines mostly run idle. Operators commonly accept this because many countermeasures, such as layout redesign or hardware upgrades, are considered too costly. In this work we analyze how much power can be saved if an exact optimization problem is set up to determine the cost-minimizing allocation of resources. Since the exact solution cannot be found in a timely manner due to the nonlinear nature of the energy consumed by a system in operation, we evaluate the quality of several approximations in terms of power savings, using numerical evaluations based on real workload traces, and compare them against a resource allocation policy commonly used in datacenters today.
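To illustrate the kind of allocation problem described above, the following sketch sets up a toy cost-minimizing assignment of workloads to machines and compares an exhaustive exact solution against a simple spread-the-load baseline. The power model (P_IDLE, P_PEAK, the nonlinear utilization curve) and the baseline policy are assumptions for illustration only; the paper's actual formulation and traces are not reproduced here.

    # Illustrative sketch only: the abstract does not specify the exact model.
    # Assumed power model: an active machine draws a fixed idle power plus a
    # dynamic share that grows nonlinearly with utilization; unused machines
    # are switched off and draw nothing.
    from itertools import product

    P_IDLE = 100.0    # assumed idle draw per active machine (watts)
    P_PEAK = 250.0    # assumed draw at full utilization (watts)
    CAPACITY = 1.0    # normalized capacity of each machine

    def machine_power(util):
        """Nonlinear utilization-to-power curve (assumed, not from the paper)."""
        if util == 0.0:
            return 0.0                      # machine powered off
        return P_IDLE + (P_PEAK - P_IDLE) * util ** 1.5

    def total_power(assignment, demands, n_machines):
        utils = [0.0] * n_machines
        for job, machine in enumerate(assignment):
            utils[machine] += demands[job]
        if any(u > CAPACITY for u in utils):
            return float("inf")             # infeasible packing
        return sum(machine_power(u) for u in utils)

    def exact_min_power(demands, n_machines):
        """Exhaustive search over all job-to-machine assignments.
        The search space is n_machines ** n_jobs, which is why the exact
        problem cannot be solved in a timely manner at datacenter scale."""
        best = (float("inf"), None)
        for assignment in product(range(n_machines), repeat=len(demands)):
            cost = total_power(assignment, demands, n_machines)
            if cost < best[0]:
                best = (cost, assignment)
        return best

    def spread_policy(demands, n_machines):
        """Baseline resembling a common 'spread the load' allocation policy."""
        assignment = tuple(i % n_machines for i in range(len(demands)))
        return total_power(assignment, demands, n_machines), assignment

    if __name__ == "__main__":
        demands = [0.3, 0.2, 0.25, 0.15, 0.1]   # toy slice of a workload trace
        print("exact :", exact_min_power(demands, n_machines=4))
        print("spread:", spread_policy(demands, n_machines=4))

Because the objective is nonlinear and the assignment space grows exponentially with the number of workloads, the exhaustive search above is only feasible for toy instances, which is the motivation for the approximations evaluated in the paper.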
