Gist: Akamai announces AI Grid, combining NVIDIA infrastructure with orchestration across 4,400 edge locations for distributed AI inference. The company frames the move as a way to reduce latency and inference cost by routing workloads between edge and centralized GPU clusters.
Signal reason: The content announces a new AI Grid orchestration capability and its deployment across Akamai's network.
