Importance-Aware Fresh Delivery of Versions over Energy Harvesting MACs
We consider a scenario where multiple users, powered by energy harvesting, send version updates over a fading multiple access channel (MAC) to an access point (AP). Version updates with random importance weights arrive at each user according to an exogenous arrival process, and a new version renders all previous versions obsolete. As energy harvesting imposes a time-varying peak power constraint, it is not possible to deliver all the bits of a version instantaneously. Accordingly, the AP adopts the objective of minimizing a finite-horizon time-averaged expectation of the product of the importance weight and a convex increasing function of the number of bits of a version that remain to be transmitted at each time instant. This objective promotes importance-aware delivery of as many bits as possible, as soon as possible. In this setup, the AP optimizes the objective subject to an achievable rate-region constraint of the MAC and energy constraints at the users, by deciding the transmit power and the number of bits to be transmitted by each user. We obtain a Markov Decision Process (MDP)-based optimal online policy for the problem and derive structural properties of the policy. We then develop a neural network (NN)-based online heuristic policy by training an NN on the optimal offline policy derived for different sample paths of the energy, version arrival, and channel power gain processes. Via numerical simulations, we observe that the NN-based online policy performs competitively with respect to the MDP-based online policy.
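To make the stated objective concrete, the following is a minimal sketch of one way it could be written; the symbols $T$ (horizon), $U$ (number of users), $w_u(t)$ (importance weight), $b_u(t)$ (remaining bits), and $f(\cdot)$ (convex increasing function) are assumed notation for illustration, not necessarily the paper's own.

```latex
% Hedged sketch of the objective described in the abstract, under assumed
% notation: T is the finite horizon, U the number of users, w_u(t) the
% importance weight of user u's current version at time t, b_u(t) the bits
% of that version still to be transmitted, and f(.) a convex increasing
% function.
\begin{equation*}
  \min \; \mathbb{E}\!\left[\, \frac{1}{T} \sum_{t=1}^{T} \sum_{u=1}^{U}
      w_u(t)\, f\bigl(b_u(t)\bigr) \right]
\end{equation*}
% The minimization is over the transmit powers and per-slot bit allocations,
% subject to the MAC achievable rate-region constraint and the
% energy-harvesting constraints at each user, as stated in the abstract.
```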