Latency-Aware Service Placement on the Fog--Edge--Cloud Continuum via Integer Programming

Abstract

Modern distributed applications increasingly span the fog--edge--cloud continuum, demanding placement strategies that balance latency constraints with resource limitations and operational dependencies. This paper presents a mixed-integer linear programming (MILP) formulation for latency-aware service placement that minimizes end-to-end response time while satisfying multi-dimensional constraints, including CPU and memory capacities, anti-affinity rules, deployment dependencies, and observability requirements. We model microservice pipelines as directed acyclic graphs in which services communicate through well-defined interfaces, and candidate hosting locations exhibit heterogeneous computational capacity, network latency, and operational reliability. Our formulation introduces binary decision variables for placement assignments and auxiliary variables for residency enforcement, with constraints that capture resource consumption, resiliency policies, progressive rollout dependencies, and monitoring infrastructure placement. We establish theoretical properties of the formulation, including NP-hardness of the decision problem, a proof that the objective function upper-bounds critical-path latency under frequency weighting, and polynomial-time solvability for restricted topologies. Through a detailed smart-factory scenario involving a five-service computer-vision pipeline, we illustrate how the MILP model produces provably optimal placements while respecting all operational constraints. The formulation's versatility enables DevOps teams to encode complex placement policies declaratively, facilitating automated orchestration across increasingly heterogeneous infrastructure continua.
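To make the placement model concrete, the sketch below enumerates assignments of services to nodes on a tiny hypothetical instance, keeping only assignments that respect per-node CPU capacity and picking the one that minimizes frequency-weighted communication latency. This is a brute-force stand-in for a MILP solver, not the paper's formulation; all service names, capacities, and latencies are illustrative assumptions.

```python
from itertools import product

# Hypothetical toy instance (illustrative values, not from the paper).
services = {"detect": 2, "track": 1}        # service -> CPU demand
nodes = {"edge": 2, "cloud": 4}             # node -> CPU capacity
latency = {("edge", "edge"): 1, ("edge", "cloud"): 20,
           ("cloud", "edge"): 20, ("cloud", "cloud"): 2}
edges = [("detect", "track", 5)]            # (src, dst, call frequency)

def best_placement():
    """Enumerate every service-to-node assignment, discard those that
    exceed a node's CPU capacity, and return the feasible assignment
    with minimum frequency-weighted inter-service latency."""
    names = list(services)
    best, best_cost = None, float("inf")
    for assign in product(nodes, repeat=len(names)):
        place = dict(zip(names, assign))
        # Capacity constraint: total demand on each node <= its capacity.
        used = {n: 0 for n in nodes}
        for s, n in place.items():
            used[n] += services[s]
        if any(used[n] > nodes[n] for n in nodes):
            continue
        # Objective: sum of call frequency * pairwise node latency.
        cost = sum(f * latency[(place[a], place[b])] for a, b, f in edges)
        if cost < best_cost:
            best, best_cost = place, cost
    return best, best_cost
```

On this instance, co-locating both services on the cloud node is feasible and cheapest; anti-affinity rules or rollout dependencies from the paper's formulation would appear as additional feasibility checks inside the loop.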
