Smart Resource Allocation for Mobile Edge Network in IoT Using Game Theory
DOI: https://doi.org/10.24312/ucp-jeit.03.01.543
Keywords: Internet of Things, Game theory, Resource allocation, Mobile Edge Computing
Abstract
Emerging from years of research and development, the modern era of computing recognizes the Internet of Things (IoT) as one of the most empowering technologies for connecting the digital and physical worlds. IoT has introduced advancements that are transforming the world; however, it still faces constraints that limit its effectiveness in various application areas, including computing power, resource allocation, reliability, and time consumption. Achieving acceptable latency for task operations on IoT devices requires appropriate allocation of mobile edge computing (MEC) resources, based on task size, delivery latency, and service latency. Without such allocation, handling the billions of data requests originating from a growing number of base stations is infeasible. This research proposes a general mechanism for allocating computing and caching resources to enable efficient scheduling in cellular networks. A game-theoretic approach, used to model minimization problems, is employed: the wireless network is analyzed as a system in which each node is a player with its own strategies, contributing toward the desired overall performance. Simulation results show that the proposed technique has great potential to improve resource allocation; in the non-cooperative subgame, each additional IoT device increases the number of requests handled by the MEC server. The proposed system allocates IoT resources efficiently, improves performance, and reduces latency.
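The abstract describes the game-theoretic mechanism only at a high level. As a concrete illustration of the kind of non-cooperative formulation it alludes to, the following is a minimal Python sketch in which each IoT device (a player) chooses how much MEC compute to demand, and sequential best-response updates drive the system toward a Nash equilibrium. The server capacity, utility function, and all parameters below are illustrative assumptions, not the model from the paper.

```python
import math

# Illustrative sketch only: the paper's actual game model is not given in the
# abstract, so the capacity, utility form, and parameters here are assumptions
# chosen to demonstrate IoT devices competing non-cooperatively for MEC compute.

CAPACITY = 10.0       # assumed MEC server capacity (normalized CPU units)
N_DEVICES = 4         # assumed number of competing IoT devices
BENEFIT_WEIGHT = 2.0  # assumed weight on the benefit of allocated compute
GRID = [CAPACITY * i / 1000 for i in range(1001)]  # candidate demand levels

def utility(own_demand, others_load):
    """Concave benefit from compute minus an M/M/1-style delay cost.

    The delay term 1 / (CAPACITY - total_load) grows without bound as the
    aggregate demand approaches the server's capacity.
    """
    total = own_demand + others_load
    if total >= CAPACITY:
        return float("-inf")  # overloading the server yields unbounded delay
    return BENEFIT_WEIGHT * math.log(1.0 + own_demand) \
        - own_demand / (CAPACITY - total)

def best_response(others_load):
    """Maximize this device's utility over the grid, rivals held fixed."""
    return max(GRID, key=lambda d: utility(d, others_load))

# Gauss-Seidel best-response dynamics: each device in turn re-optimizes its
# demand given the others' current demands. A fixed point of this process is
# a Nash equilibrium; for concave utilities like this one it typically
# converges in a handful of sweeps.
demands = [0.0] * N_DEVICES
for _ in range(200):
    previous = list(demands)
    for i in range(N_DEVICES):
        demands[i] = best_response(sum(demands) - demands[i])
    if max(abs(a - b) for a, b in zip(previous, demands)) < 1e-6:
        break

print("equilibrium demands:", [round(d, 3) for d in demands])
print("total load: %.3f of capacity %.1f" % (sum(demands), CAPACITY))
```

Because every device faces the same assumed utility, the sketch converges to a symmetric equilibrium in which the aggregate demand stays strictly below capacity: each device internalizes part of the congestion cost it imposes, which is the intuition behind using a non-cooperative game for MEC resource allocation.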
License
Copyright (c) 2025 UCP Journal of Engineering & Information Technology

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.