Designing energy-efficient communication protocols is one of the main challenges in wireless sensor networks. This work presents an adaptive radio scheduling scheme that employs a reinforcement learning algorithm to reduce energy consumption while preserving other network performance metrics. Using a decentralized online approach, each node determines its most beneficial radio schedule by dynamically adapting to its own traffic load and to its neighbors’ communication activities. We compare our approach with other learning-based MAC protocols as well as conventional MAC approaches and show that, under different simulation scenarios and traffic conditions, our protocol achieves better trade-offs in terms of energy consumption, latency, and throughput.
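The decentralized online adaptation described above can be illustrated with a minimal sketch, assuming a Q-learning formulation in which each node scores the wake-up slots of its frame and reinforces slots where traffic was actually served; the class name, reward shape, and parameters are illustrative assumptions, not the paper's exact algorithm.

```python
import random

class SlotScheduler:
    """Hypothetical per-node learner: picks which frame slot to stay awake in."""

    def __init__(self, n_slots, alpha=0.1, epsilon=0.1):
        self.q = [0.0] * n_slots   # value estimate per frame slot
        self.alpha = alpha         # learning rate
        self.epsilon = epsilon     # exploration probability

    def choose_slot(self):
        # epsilon-greedy: occasionally explore a random slot,
        # otherwise pick the slot with the highest estimated value
        if random.random() < self.epsilon:
            return random.randrange(len(self.q))
        return max(range(len(self.q)), key=lambda s: self.q[s])

    def update(self, slot, packets_handled, energy_cost):
        # Illustrative reward: traffic served minus energy spent listening;
        # this lets idle slots accumulate negative value over time
        reward = packets_handled - energy_cost
        self.q[slot] += self.alpha * (reward - self.q[slot])
```

In this sketch a node that repeatedly observes traffic in one slot will converge to waking up in that slot and sleeping through the rest, which is the energy/latency trade-off the protocol targets.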
Title of host publication: Proceedings of the 7th International Conference on Internet and Distributed Computing Systems, IDCS 2014, September 22-24, 2014, Calabria, Italy
Editors: G. Fortino, G. Di Fatta, W. Li, S. Ochoa, A. Cuzzocrea, M. Pathan
Place of publication: Berlin
Publication status: Published - 2014
Series: Lecture Notes in Computer Science