Gist: Akamai announces AI Grid, combining NVIDIA infrastructure with orchestration across 4,400 edge locations for distributed AI inference. The company frames the move as a way to reduce latency and inference cost by routing workloads between edge locations and centralized GPU clusters.
Signal reason: Reinforces the broader industry narrative around distributed compute and where inference workloads will run.
