Meta Reinforcement Learning for UAV-Assisted Energy Harvesting IoT Devices in Disaster-Affected Areas



Bibliographic Details
Main Author: Marwan Dhuheir (19170898)
Other Authors: Aiman Erbad (14150589), Ala Al-Fuqaha (4434340), Abegaz Mohammed Seid (19170901)
Published: 2024
Date: 2024-03-18
DOI: 10.1109/ojcoms.2024.3377706
Related link: https://figshare.com/articles/journal_contribution/Meta_Reinforcement_Learning_for_UAV-Assisted_Energy_Harvesting_IoT_Devices_in_Disaster-Affected_Areas/26324875
Rights: CC BY 4.0 (open access)
Subjects:
Engineering
Aerospace engineering
Electrical engineering
Information and computing sciences
Artificial intelligence
Distributed computing and systems software
Machine learning
Energy harvesting
UAVs positions
energy consumption
meta-reinforcement learning
UAVs
strategic locations
Internet of Things
Autonomous aerial vehicles
Optimization
Data collection
Energy consumption
Trajectory
Disasters
Type: Journal contribution (published version)
Description: <p dir="ltr">Over the past decade, Unmanned Aerial Vehicles (UAVs) have attracted significant attention due to their potential in emergency-response applications, including wireless power transfer (WPT) and data collection from Internet of Things (IoT) devices in disaster-affected areas. UAVs are more attractive than traditional techniques due to their maneuverability, flexibility, and low deployment costs. However, using UAVs for such critical tasks comes with challenges, including limited resources, energy constraints, and the need to complete missions within strict time frames. IoT devices in disaster areas have limited resources (e.g., computation, energy), so they depend on the UAVs’ resources to accomplish vital missions. To address these resource problems in a disaster scenario, we propose a meta-reinforcement learning (RL)-based energy harvesting (EH) framework. Our system model considers a swarm of UAVs that navigates an area, providing wireless power to and collecting data from IoT devices on the ground. The primary objective is to enhance the quality of service for strategic locations while allowing UAVs to dynamically join and leave the swarm (e.g., for recharging). In this context, we formulate the problem as a non-linear programming (NLP) optimization problem that maximizes the total energy harvested by the IoT devices and determines the optimal trajectory paths for the UAVs, subject to constraints on the maximum mission duration, the UAVs’ maximum energy consumption, and the minimum data rate required for reliable transmission. Given the complexity and combinatorial nature of the formulated problem, and the difficulty of obtaining the optimal solution with conventional optimization methods, we propose a lightweight meta-RL solution that solves the problem by learning the system dynamics.
We conducted extensive simulations and compared our approach with two state-of-the-art baselines, a traditional RL approach represented by a deep Q-network (DQN) algorithm and a Particle Swarm Optimization (PSO) algorithm, as well as a greedy solution. Our simulation results show that the proposed meta-RL algorithm improves IoT energy harvesting over the DQN, PSO, and greedy solutions by 25%, 32%, and 45%, respectively. The results also demonstrate that our approach outperforms the competing solutions in efficiently covering strategic locations with a high satisfaction rate and high accuracy.</p><h2>Other Information</h2><p dir="ltr">Published in: IEEE Open Journal of the Communications Society<br>License: <a href="http://creativecommons.org/licenses/by/4.0/" rel="nofollow" target="_blank">http://creativecommons.org/licenses/by/4.0/</a><br>See article on publisher's website: <a href="https://dx.doi.org/10.1109/ojcoms.2024.3377706" target="_blank">https://dx.doi.org/10.1109/ojcoms.2024.3377706</a></p>
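The NLP formulation described in the abstract can be sketched in generic form. The notation below is illustrative only, since the record does not give the paper's symbols: \(\mathbf{q}_u(t)\) denotes the trajectory of UAV \(u\), \(E_i^{\mathrm{EH}}\) the energy harvested by IoT device \(i\), \(T\) the mission duration, \(E_u\) the energy consumed by UAV \(u\), and \(R_i\) the data rate of device \(i\):

```latex
\begin{aligned}
\max_{\{\mathbf{q}_u(t)\}} \quad & \sum_{i \in \mathcal{I}} E_i^{\mathrm{EH}} \\
\text{s.t.} \quad & T \le T_{\max}, \\
& E_u \le E_{\max} \quad \forall u \in \mathcal{U}, \\
& R_i \ge R_{\min} \quad \forall i \in \mathcal{I}.
\end{aligned}
```

Because both the harvested energy and the achievable data rate depend non-linearly on the UAV–device distances induced by the trajectories, the problem is non-convex and combinatorial, which is why the abstract motivates a learning-based (meta-RL) solution rather than a conventional solver.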