Show simple item record

dc.contributor.author: Zhang, Wenhan
dc.contributor.author: Feng, Mingjie
dc.contributor.author: Krunz, Marwan
dc.date.accessioned: 2024-01-09T21:15:58Z
dc.date.available: 2024-01-09T21:15:58Z
dc.date.issued: 2023-11-17
dc.identifier.citation: Zhang, W., Feng, M., & Krunz, M. (2023). Latency Estimation and Computational Task Offloading in Vehicular Mobile Edge Computing Applications. IEEE Transactions on Vehicular Technology.
dc.identifier.issn: 0018-9545
dc.identifier.doi: 10.1109/tvt.2023.3334192
dc.identifier.uri: http://hdl.handle.net/10150/670642
dc.description.abstract: Mobile edge computing (MEC) is a key enabler of time-critical vehicle-to-everything (V2X) applications. Under MEC, a vehicle has the option to offload computationally intensive tasks to a nearby edge server or to a remote cloud server. Determining where to execute a task necessitates accurate estimation of the end-to-end (E2E) offloading delay. In this paper, we first conduct extensive measurements of the round-trip time (RTT) between a vehicular user and edge/cloud servers. Using these measurements, we present a latency-estimation framework for optimal task offloading. The propagation delay, measured by the RTT, is divided into two components: one that follows a trackable trend (baseline) and another that is quasi-random (residual). For the baseline component, we first cluster measured RTTs into several groups based on signal strength indicators, and for each group we develop a Long Short-Term Memory (LSTM) regression model. For the residual component, we provide a statistical prediction approach that combines the Epanechnikov kernel with moving-average functions. Predicted propagation delays are incorporated into virtual simulations to estimate the transmission, queuing, and processing delays, hence accounting for the full E2E delay. Based on the estimated E2E delay, we design a task offloading scheme that minimizes the offloading latency while maintaining a low packet loss rate. Simulation results show that the proposed offloading strategy can reduce the E2E delay by approximately 60% compared to a random offloading scheme while keeping the packet loss rate below 3%.
dc.description.sponsorship: NSF
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.rights: © 2023 IEEE.
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Electrical and electronic engineering
dc.subject: Computer Networks and Communications
dc.subject: Aerospace Engineering
dc.subject: Automotive Engineering
dc.subject: Delays
dc.subject: E2E delay
dc.subject: latency prediction
dc.subject: LSTM
dc.subject: mobile edge computing
dc.subject: Packet loss
dc.subject: Predictive models
dc.subject: Servers
dc.subject: Task analysis
dc.subject: task offloading
dc.subject: V2X applications
dc.subject: Vehicle dynamics
dc.title: Latency Estimation and Computational Task Offloading in Vehicular Mobile Edge Computing Applications
dc.type: Article
dc.identifier.eissn: 1939-9359
dc.contributor.department: Department of Electrical and Computer Engineering, University of Arizona
dc.identifier.journal: IEEE Transactions on Vehicular Technology
dc.description.note: Immediate access
dc.description.collectioninformation: This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
dc.eprint.version: Final accepted manuscript
dc.source.journaltitle: IEEE Transactions on Vehicular Technology
dc.source.beginpage: 1
dc.source.endpage: 16
refterms.dateFOA: 2024-01-09T21:16:01Z
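The abstract describes predicting the quasi-random residual component of the RTT by combining the Epanechnikov kernel with moving-average functions. As an illustration only (the paper's actual window length, kernel bandwidth, and weighting scheme are not given in this record, so the parameters below are hypothetical), a kernel-weighted moving-average residual predictor might be sketched as:

```python
import numpy as np

def epanechnikov_weights(lags, bandwidth):
    """Epanechnikov kernel K(u) = 0.75 * (1 - u^2) for |u| <= 1, else 0."""
    u = np.asarray(lags, dtype=float) / bandwidth
    w = 0.75 * (1.0 - u ** 2)
    w[np.abs(u) > 1.0] = 0.0
    return w

def predict_residual(residuals, window=8, bandwidth=10.0):
    """Predict the next residual as a kernel-weighted moving average of
    the most recent `window` residual samples (newest sample has lag 1,
    so it receives the largest kernel weight)."""
    recent = np.asarray(residuals[-window:], dtype=float)
    lags = np.arange(len(recent), 0, -1, dtype=float)  # oldest -> largest lag
    w = epanechnikov_weights(lags, bandwidth)
    if w.sum() == 0.0:
        # All samples fall outside the kernel support; fall back to the mean.
        return float(recent.mean())
    return float(np.dot(w, recent) / w.sum())
```

With a constant residual series the predictor returns that constant, and with the parameters above more recent samples always receive larger weights, which matches the smoothing-plus-recency intent described in the abstract.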


Files in this item

Name: Latency_Estimation_and_Computa ...
Size: 3.251 MB
Format: PDF
Description: Final Accepted Manuscript

This item appears in the following Collection(s)
