On March 16, NVIDIA and T-Mobile announced they are working with Nokia and a growing ecosystem of developers to bring physical AI applications to distributed edge AI networks. The collaboration demonstrates how next-generation AI-RAN infrastructure can transform the wireless network into a platform for distributed, high-performance edge AI computing, creating a foundation for developers to deploy vision AI agents that understand the physical world across cities, utilities and industrial worksites using the NVIDIA Metropolis platform.
NVIDIA’s AI-RAN portfolio includes NVIDIA ARC-Pro, built on the NVIDIA RTX PRO 4500 Blackwell Server Edition for power-constrained cell sites and the NVIDIA RTX PRO 6000 Blackwell Server Edition for higher-capacity mobile switching offices (MSOs).
T-Mobile was the first in the U.S. to pilot NVIDIA’s AI-RAN infrastructure with Nokia’s anyRAN software and is now working with select NVIDIA physical AI partners, demonstrating how cell sites and mobile switching offices can support distributed edge AI workloads while continuing to deliver advanced 5G connectivity.
“Telecommunication networks are evolving into the AI infrastructure enabling billions of devices — from vision AI agents to robots and autonomous vehicles — to see, hear and act in real time,” said Jensen Huang, founder and CEO of NVIDIA. “By turning the 5G network into a distributed AI computer with T-Mobile and Nokia, we’re creating a scalable blueprint for the world’s edge AI infrastructure.”
“Turning networks into distributed AI computing platforms to unlock the full potential of Physical AI will require ultra-low latency and space time coherency at the network edge for billions of endpoints, and that’s what we’ve built at T-Mobile,” said Srini Gopalan, Chief Executive Officer of T-Mobile. “With the first nationwide 5G Standalone and 5G Advanced network, we are uniquely positioned to help power a future where intelligent systems don’t wait on the cloud but rely on intelligent networks that allow them to act in real time.”
The Mobile Network as the Nervous System for Physical AI
The transition to AI-RAN built on NVIDIA accelerated computing addresses a critical bottleneck in scaling physical AI: the lack of low-latency, secure and ubiquitous connectivity. While Wi-Fi is limited in reach and security, T-Mobile’s 5G Standalone network provides the wide-area coverage and guaranteed quality of service required for complex AI agents to operate in busy city intersections, industrial facilities and rural areas.
This architecture enables physical AI to offload heavy computation from the device to the nearest edge location. Shifting heavy processing to the network edge allows developers to streamline hardware requirements for individual cameras and robots, making it possible to cost-effectively scale sophisticated AI models across billions of interconnected devices.

Leading Developers Bring Reasoning and Vision AI to the Edge
A growing ecosystem of developers is collaborating with NVIDIA and T-Mobile to deploy physical AI agents that drive real-time action, built with the NVIDIA Metropolis Blueprint for video search and summarization (VSS) on T-Mobile’s distributed edge network. Pilot use cases include:
- Smart City Operations: LinkerVision, Inchor and Voxelmaps are testing integrated computer vision-based “City Operations Agents” and a digital twin that can perceive, simulate and optimize traffic light timing, targeting 5x faster incident response times for the City of San Jose.
- Automated Utility Inspection: Levatas and Skydio are automating the inspection of hundreds of thousands of miles of transmission lines over 5G with NVIDIA compute to detect and resolve anomalies such as leaning power poles, corrosion and thermal hotspots 5x faster. They are now evaluating AI-RAN infrastructure to further reduce costs, improve storm recovery time and accelerate the shift from reactive to predictive maintenance.
- Vision-Based Facility Management: Developers such as Vaidio are using the VSS blueprint to build facility management agents that move beyond simple sensors to perform threat detection and failure forecasting, triggering automated workflows to improve facility management.
- Real-Time Industrial Safety: Fogsphere provides safety AI agents for SAIPEM to detect and respond in real time to hazardous events — such as workers under suspended loads or hydrocarbon spills — in high-risk onshore, offshore and drilling construction environments. Fogsphere is now validating how AI-RAN infrastructure can enhance the capabilities and performance of these agents — already running 24/7 without reliance on Wi-Fi — over secure and distributed network compute.
These initiatives reflect T-Mobile’s broader strategy to test and enable edge AI capabilities in collaboration with NVIDIA, Nokia and a diverse ecosystem of software providers, manufacturers and enterprise innovators.
Accelerating Vision AI Agent Development With the Metropolis VSS 3 Blueprint
While more than 1.5 billion cameras capture footage globally, less than 1% of that footage is ever reviewed by humans. NVIDIA is introducing the Metropolis VSS 3 Blueprint to enable agents to reason over video from the edge to the cloud.
Key features of the blueprint’s latest iteration include:
- Agentic Information Retrieval: AI agents can decompose complex natural language queries and search across video footage to find specific events in under five seconds.
- Modular Architecture: A flexible framework allows teams to adapt VSS 3 to diverse environments — from retail stores to warehouses — without overhauling core infrastructure.
- 100x Efficiency: VSS can summarize long-form video up to 100x faster than manual reviews, drastically reducing repetitive tasks and review costs for global physical operations.
Partners using the VSS blueprint to optimize operations and enhance safety across industries include Caterpillar, KION, Hitachi, HCLTech, Siemens Energy, Tulip and Telit Cinterion.
Learn more about the NVIDIA VSS blueprint on the NVIDIA website.