Eneracloud: A Reinforcement Learning–Based Energy-Aware Task Scheduling Framework For Sustainable Cloud Data Centers
Abstract
The rapid proliferation of cloud computing infrastructures has been accompanied by significant growth in the energy consumption of large-scale data centers, creating an urgent need for energy-efficient task scheduling. Most current work on cloud scheduling emphasizes a single performance metric, such as response time or throughput, and rarely takes energy into account. To remedy this, the paper proposes EneraCloud, a reinforcement learning based energy-aware cloud task scheduling framework. The adopted algorithm, Employee-Selection Decision Tree (ESDT), represents the task scheduling problem as a sequential decision-making process in which a learning-based agent assigns tasks to virtual machines (VMs) based on current resource utilization, expected power cost, and service-level agreement (SLA) constraints.
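The sequential decision formulation above can be sketched in miniature as follows. This is an illustrative assumption, not the paper's implementation: the state, the per-VM value table, and the epsilon-greedy selection policy are all hypothetical stand-ins for whatever ESDT actually learns.

```python
import random
from dataclasses import dataclass

@dataclass
class State:
    vm_utilization: list   # current CPU utilization of each VM, in [0, 1]
    task_demand: float     # normalized resource demand of the pending task

def choose_vm(state: State, values: dict, epsilon: float = 0.1) -> int:
    """Epsilon-greedy action selection over candidate VMs (assumed policy).

    `values` maps a VM index to its learned value estimate; the dependence
    on the full state is omitted here for brevity.
    """
    n_vms = len(state.vm_utilization)
    if random.random() < epsilon:
        return random.randrange(n_vms)  # explore: random VM
    # exploit: pick the VM with the highest learned value
    return max(range(n_vms), key=lambda vm: values.get(vm, 0.0))
```

At each scheduling step the agent observes the state, picks a VM, and later receives a reward combining energy cost and SLA outcomes, closing the learning loop.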
A utilization-based mathematical energy consumption model and a multi-objective reward function are introduced to trade off energy savings against QoS. We evaluate the proposed approach using trace-driven simulations on real-world workload traces from the Google Cluster dataset and compare it with popular scheduling heuristics: First-Come-First-Serve (FCFS), Round Robin (RR), and Min-Min. The experimental results demonstrate that EneraCloud achieves a significant reduction in total energy consumption while maintaining similar or higher SLA compliance, indicating its suitability for green cloud resource management.
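A minimal sketch of a utilization-based power model and multi-objective reward of the kind the abstract describes. All constants (the idle and peak power draws, the weights `alpha` and `beta`) and function names are illustrative assumptions, not values from the paper; a common choice in the literature is a linear model P(u) = P_idle + (P_max - P_idle) * u.

```python
P_IDLE = 100.0   # watts drawn by an idle host (assumed)
P_MAX = 250.0    # watts drawn at full utilization (assumed)

def power(utilization: float) -> float:
    """Linear utilization-based power model: P(u) = P_idle + (P_max - P_idle) * u."""
    u = min(max(utilization, 0.0), 1.0)  # clamp to [0, 1]
    return P_IDLE + (P_MAX - P_IDLE) * u

def reward(energy_joules: float, sla_violations: int,
           alpha: float = 1.0, beta: float = 50.0) -> float:
    """Multi-objective reward: penalize both energy use and SLA violations.

    alpha and beta weight the two objectives; tuning them shifts the
    energy/QoS trade-off.
    """
    return -(alpha * energy_joules + beta * sla_violations)

# Example: a host at 60% utilization for 10 seconds
energy = power(0.6) * 10.0              # joules = watts * seconds
r = reward(energy, sla_violations=0)
```

Integrating `power(u)` over each scheduling interval yields the total energy term that the agent's reward discourages, while the SLA penalty keeps the policy from simply packing every task onto the slowest host.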