Deep Reinforcement Learning-Based Task Scheduling and Resource Allocation for Vehicular Edge Computing: A Survey

Document Type

Article

Source of Publication

IEEE Transactions on Intelligent Transportation Systems

Publication Date

1-1-2025

Abstract

With the development of intelligent transportation systems, vehicular edge computing (VEC) has come to play a pivotal role by bringing computation, storage, and analytics closer to vehicles. VEC represents a paradigm shift toward real-time data processing and intelligent decision-making, overcoming challenges associated with latency and resource constraints. In VEC scenarios, the efficient scheduling and allocation of computing resources is a fundamental research area, enabling real-time processing of vehicular tasks and intelligent decision-making. This paper provides a comprehensive review of the latest research on Deep Reinforcement Learning (DRL)-based task scheduling and resource allocation in VEC environments. First, the paper outlines the development of VEC and introduces the core concepts of DRL, shedding light on their growing importance in the dynamic VEC landscape. Second, the state-of-the-art research on DRL-based task scheduling and resource allocation is categorized, reviewed, and discussed. Finally, the paper discusses current challenges in the field, offering insights into the promising future of VEC applications within the realm of intelligent transportation systems.

ISSN

1524-9050

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Disciplines

Computer Sciences

Keywords

Vehicular edge computing, deep reinforcement learning, task scheduling, resource allocation

Indexed in Scopus

no

Open Access

no
