Autonomous cell activation for energy saving in cloud-RANs based on dueling deep Q-network

Authors:

Highlights:

Abstract

The heterogeneous cloud radio access network (H-CRAN) is a promising technology for handling the traffic density expected in 5G communication networks. One of the main challenges in H-CRANs is minimizing energy consumption. In this paper, a deep reinforcement learning method is used to minimize energy consumption. First, we propose an autonomous cell activation framework and customized physical resource allocation schemes to balance energy consumption and QoS satisfaction in C-RANs. We formulate the cell activation problem as a Markov decision process (MDP). To solve it, we develop a dueling deep Q-network (DQN)-based autonomous cell activation framework that satisfies user QoS demands and minimizes energy consumption with the minimum number of active RRHs under varying traffic demand. Simulation results illustrate the effectiveness of our proposed solution in minimizing network energy consumption.
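The abstract's dueling DQN decomposes the action-value function into a state-value stream and an advantage stream. As a minimal illustration of the standard dueling aggregation (the exact network architecture used in the paper is not given here, so this sketch assumes the common mean-subtracted form), the two streams can be combined as:

```python
def dueling_q_values(value, advantages):
    """Combine a scalar state value V(s) with per-action advantages A(s, a)
    using the mean-subtracted dueling aggregation:
        Q(s, a) = V(s) + A(s, a) - mean over a' of A(s, a').
    Subtracting the mean advantage makes the V/A decomposition identifiable."""
    mean_adv = sum(advantages) / len(advantages)
    return [value + adv - mean_adv for adv in advantages]


# Hypothetical example: one state value and advantages for two cell-activation
# actions (e.g. "activate RRH" vs. "keep RRH sleeping").
q = dueling_q_values(1.0, [0.5, -0.5])
print(q)  # [1.5, 0.5]
```

In a full agent, `value` and `advantages` would be the outputs of two neural-network heads sharing a common feature extractor; the aggregation step above is what distinguishes a dueling DQN from a plain DQN.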

Keywords: Cell activation, Deep reinforcement learning, Energy efficiency, Dueling deep Q-network, H-CRAN

Article history: Received 25 March 2019, Revised 1 December 2019, Accepted 6 December 2019, Available online 14 December 2019, Version of Record 24 February 2020.

DOI: https://doi.org/10.1016/j.knosys.2019.105347