Authors
Nam H Chu, Dinh Thai Hoang, Diep N Nguyen, Nguyen Van Huynh, Eryk Dutkiewicz
Publication date
2022/2/14
Journal
IEEE Internet of Things Journal
Volume
10
Issue
7
Pages
5778-5793
Publisher
IEEE
Description
Unmanned-aerial-vehicle (UAV)-assisted data collection has been emerging as a prominent application due to its flexibility, mobility, and low operational cost. However, under the dynamics and uncertainty of Internet of Things (IoT) data collection and energy replenishment processes, optimizing the performance of UAV collectors is a very challenging task. Thus, this article introduces a novel framework that jointly optimizes the flying speed and energy replenishment for each UAV to significantly improve the overall system performance (e.g., data collection and energy usage efficiency). Specifically, we first develop a Markov decision process to help the UAV automatically and dynamically make optimal decisions under the dynamics and uncertainties of the environment. Although traditional reinforcement learning algorithms, such as Q-learning and deep Q-learning, can help the UAV to obtain the optimal policy, they …
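The abstract frames the UAV's speed and recharge decisions as a Markov decision process solved with Q-learning. As a rough illustration of the tabular Q-learning baseline the authors mention (not the paper's actual state/action spaces or reward design, which are assumptions here), a minimal sketch:

```python
import random

# Minimal tabular Q-learning sketch. The actions (flying speeds and a
# recharge decision) and states are illustrative placeholders, not the
# paper's actual MDP formulation.
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
actions = ["fly_slow", "fly_fast", "recharge"]
q_table = {}  # (state, action) -> estimated long-term value


def q_value(state, action):
    return q_table.get((state, action), 0.0)


def choose_action(state):
    # Epsilon-greedy: explore with probability EPSILON, else exploit.
    if random.random() < EPSILON:
        return random.choice(actions)
    return max(actions, key=lambda a: q_value(state, a))


def update(state, action, reward, next_state):
    # Standard Q-learning temporal-difference update:
    # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    best_next = max(q_value(next_state, a) for a in actions)
    td_target = reward + GAMMA * best_next
    q_table[(state, action)] = q_value(state, action) + ALPHA * (
        td_target - q_value(state, action)
    )
```

A deep Q-learning variant, which the abstract contrasts with, would replace `q_table` with a neural network approximating `q_value`, useful when the state space (e.g., UAV position, battery level, queue lengths) is too large to enumerate.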