There are few industries today that aren’t touched by artificial intelligence (AI). Networking is very much one of them. It’s scarcely conceivable that any network of any reasonable size – from an office local area network or home router to a global telecoms infrastructure – couldn’t “simply” be improved by AI.
Just take the words of Swisscom’s chief technical officer, Mark Düsener, about his company’s partnership with Cisco-owned Outshift to deploy agentic AI – of which more later – across his organisation. “The goal of getting into an agentic AI world, operating networks and connectivity, is all about reducing the impact of service changes, reducing the risk of downtime and costs – therefore levelling up our customer experience.”
In other words, the implementation of AI results in operational efficiencies, increased reliability and user benefits. Seems simple, yes? But as we all know, nothing in life is simple, and to guarantee such gains, AI can’t be “simply” switched on. And perhaps most importantly, the benefits of AI in networking can’t be fully realised without considering networking for AI.
Starting with Nvidia
It seems logical that any investigation of AI and networking – or indeed, AI and anything – should start with Nvidia, a company that has played a pivotal role in developing the AI tech ecosystem, and is set to do so further.
Speaking in 2024 at a tech conference about how AI has established itself as an intrinsic part of business, Nvidia founder and CEO Jensen Huang observed that the era of generative AI (GenAI) is here and that enterprises must engage with “the single most consequential technology in history”. He told the audience that what was happening was the greatest fundamental computing platform transformation in 60 years, encompassing a shift from general-purpose computing to accelerated computing.
“We’re sitting on a mountain of data. All of us. We’ve been collecting it in our businesses for a long time. But until now, we haven’t had the ability to refine that, then discover insight and codify it automatically into our company’s natural experience, our digital intelligence. Every company is going to be an intelligence producer. Every company is built on domain-specific intelligence. For the very first time, we can now digitise that intelligence and turn it into our AI – the corporate AI,” he said.
“AI is a lifecycle that lives forever. What we want to do is turn our corporate intelligence into digital intelligence. Once we do that, we connect our data and our AI flywheel so that we collect more data, harvest more insight and create better intelligence. This allows us to provide better services or to be more productive, run faster, be more efficient and do things at a larger scale.”
Concluding his keynote, Huang stressed that enterprises must now engage with the “single most consequential technology in history” to translate and condense a company’s intelligence into digital intelligence.
This is precisely what Swisscom is aiming to achieve. The company is Switzerland’s largest telecoms provider, with more than six million mobile customers and 10,000 mobile antenna sites that have to be managed effectively. When its network engineers make changes to the infrastructure, they face a typical challenge: how to update systems that serve millions of customers without disrupting the service.
The solution was partnering with Outshift to develop practical applications of AI agents in network operations to “redefine” customer experiences. That is, using Outshift’s Internet of Agents to deliver meaningful outcomes for the telco, while also meeting customer needs through AI innovation.
But these advantages aren’t the preserve of large enterprises such as telcos. Indeed, from a networking perspective, AI can enable small and medium-sized businesses to gain access to enterprise-level technology that can allow them to focus on growth and eliminate the costs and infrastructure challenges that arise when managing complex IT infrastructures.
Engineering networks for AI
From a broader perspective, Swisscom and Outshift have also shown that making AI work effectively requires something new: an infrastructure that lets businesses communicate and work together securely. And this is where the two sides of AI and networking come into play.
At the event where Nvidia’s Huang outlined his vision, David Hughes, chief product officer of HPE Aruba Networking, said there were pressing issues about the use of AI in enterprise networks, specifically around harnessing the benefits that GenAI can offer. Regarding “AI for networking” and “networking for AI”, Hughes suggested there are subtle but fundamental differences between the two.
“AI for networking is where we spend time from an engineering and data science standpoint. It’s really about [questioning] how we use AI technology to turn IT admins into super-admins so that they can handle their escalating workloads independent of GenAI, which is kind of a load on top of everything else, such as escalating cyber threats and concerns about privacy. The business is asking IT to do new things, deploy new apps all the time, but they’re [asking this of] the same number of people,” he observed.
“Networking for AI is about building out, first of all, the kind of switching infrastructure that’s needed to interconnect GPU [graphics processing unit] clusters. And then a little bit beyond that, thinking about the impact of collecting telemetry on a network and the changes in the way people might want to build out their network.”
And impact there is. A number of companies currently investigating AI within their businesses find themselves asking how to manage the mass adoption of AI in relation to networking and data flows, such as the kind of bandwidth and capacity required to facilitate AI-generated output such as text, image and video content.
This, says Bastien Aerni, vice-president of strategy and technology adoption at global networking and security-as-a-service firm GTT, is causing companies to rethink the speed and scale of their networking needs.
“To achieve the return on investment of AI initiatives, they have to be able to secure and process large amounts of data quickly, and to this end, their network architecture must be configured to support this kind of workload. Utilising a platform embedded in a Tier 1 IP [internet protocol] backbone here ensures low latency, high bandwidth and direct internet access globally,” he remarks.
“What we’re starting to see, and expect more of, is AI computing increasingly taking place at the edge to eliminate the distance between the prompt and the process. Leveraging software-defined wide area network [SD-WAN] services built in the right platform to efficiently route AI data traffic can reduce latency and security risk, and provide more control over data.”
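Aerni’s point about steering AI traffic over the right underlay is, at heart, a path-selection policy. The Python sketch below is a minimal illustration of that idea, assuming hypothetical path names, latency figures and an “ai-inference” traffic class; it shows the principle rather than any vendor’s SD-WAN API.

```python
def pick_path(app_class: str, paths: dict[str, dict]) -> str:
    """Return the underlay to use for a given traffic class (illustrative only)."""
    if app_class == "ai-inference":
        # Latency-sensitive: choose the path with the lowest measured round-trip time.
        return min(paths, key=lambda p: paths[p]["rtt_ms"])
    # Bulk transfers (e.g. model or dataset sync): choose the path with the most headroom.
    return max(paths, key=lambda p: paths[p]["free_mbps"])


# Hypothetical, periodically refreshed measurements for two underlays.
paths = {
    "mpls": {"rtt_ms": 18, "free_mbps": 900},
    "internet-breakout": {"rtt_ms": 9, "free_mbps": 600},
}
print(pick_path("ai-inference", paths))  # -> internet-breakout
print(pick_path("dataset-sync", paths))  # -> mpls
```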
Managing network overload
At the end of 2023, BT revealed that its networks had come under huge strain after the simultaneous online broadcast of six Premier League football matches and downloads of popular games, with the update of Call of Duty Modern Warfare particularly cited. AI promises to add to this headache.
Speaking at Mobile World Congress 2025, BT Business chief technology officer (CTO) Colin Bannon said that in the new, reshaped world of work, a robust and reliable network is a fundamental prerequisite for AI to work, and that it takes effort to stay relevant to meet the ongoing challenges faced by the customers BT serves, mainly international business, governments and multinationals. The bottom line is that network performance to support the AI-enabled world is crucial in a world where “slow is the new down”.
Bannon added that Global Fabric, BT’s network-as-a-service product, was built before AI “blew up” and that BT was thinking about how to deal with a hyper-distributed set of workloads on a network and to be able to make it fully programmable.
Looking at the challenges ahead and how the new network will solve them, he said: “[AI] just makes distributed and more complex workflows even bigger, which makes the need for a fabric-type network even more important. You need a network that can [handle data] burst, and that’s programmable, and that you can [control] bandwidth on demand as well. All of this programmability [is something businesses] have never had before. I’d argue that the network is the computer, and the network is a prerequisite for AI to work.”
The consequence could be constructing enterprise networks that can handle the huge strain placed on utilisation by AI, especially in terms of what is needed for training models. Bannon said there were three key network challenges and scenarios to deal with AI: training requirements, inference requirements and general requirements.
He stated that the dynamic nature of AI workloads means networks need to be scalable and agile, with visibility tools that offer real-time monitoring, issue detection and troubleshooting. As regards specific training requirements, dealing with AI necessitates the movement of large datasets across the network, thus demanding high-bandwidth networks.
He also described “elephant” flows of data – that is, continuous transmission over time and training over days. He warned that network inconsistencies could affect the accuracy and training time of AI models, and that tail latency could impact job completion time significantly. This means robust congestion management is needed to detect potential congestion and redistribute network traffic.
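The congestion-management idea Bannon describes can be illustrated with a simple sketch: watch per-link utilisation and, when a link runs hot, move its largest “elephant” flow onto the least-loaded alternative. The Python below uses hypothetical links, flows and an 80% threshold; real fabric-level congestion control is far more sophisticated, so treat this purely as an illustration.

```python
from dataclasses import dataclass, field


@dataclass
class Link:
    name: str
    capacity_gbps: float
    flows: dict = field(default_factory=dict)  # flow id -> rate in Gbps

    @property
    def utilisation(self) -> float:
        return sum(self.flows.values()) / self.capacity_gbps


def rebalance(links, threshold=0.8):
    """Move the largest flow off any link whose utilisation exceeds the threshold."""
    for hot in [l for l in links if l.utilisation > threshold]:
        flow_id, rate = max(hot.flows.items(), key=lambda kv: kv[1])
        # Pick the least-loaded alternative link as the candidate target.
        target = min((l for l in links if l is not hot), key=lambda l: l.utilisation)
        if (sum(target.flows.values()) + rate) / target.capacity_gbps < threshold:
            del hot.flows[flow_id]
            target.flows[flow_id] = rate
            print(f"moved {flow_id} ({rate} Gbps) from {hot.name} to {target.name}")


links = [
    Link("spine-1", 400, {"training-elephant": 300, "inference": 40}),
    Link("spine-2", 400, {"backup": 10}),
]
rebalance(links)  # -> moved training-elephant (300 Gbps) from spine-1 to spine-2
```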
But AI training models often spell network trouble. And now the conversation is turning from the use of generic large language models (see Preparing networks for Industry 5.0 box) to application- and industry-dedicated small language models.
Focus on smaller models
NTT Data has created and deployed a small language model called Tsuzumi, described as an ultra-lightweight model designed to reduce learning and inference costs. According to NTT’s UK and Ireland CTO, Tom Winstanley, the reason for developing this model has mainly been to support edge use cases.
“[That is] really deployment at the edge of the network to avoid flooding of the network, also addressing privacy concerns, also addressing sustainability concerns around some of these very large language models, being very specific in creating domain context,” he says.
“Examples of that can be used in video analytics, media analytics, and in capturing conversations in real time, but locally, and not deploying it out to flood the network. That said, the flip side of this was there was immense power sitting in some of these central hyper-scale models and capacities, and you also therefore need to find out more [about] what is the right network background, and what is the right balance of your network infrastructure. For example, if you want to do real-time media streaming from a [sports stadium] and do all of the edits on-site, or remotely so as not to have to deploy [facilities] to every single location, then you need a different backbone, too.”
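The edge pattern Winstanley describes – analyse locally, send only compact results upstream – can be sketched in a few lines. The snippet below is illustrative only: summarise_frame_locally is a hypothetical stand-in for on-device inference (it is not NTT’s Tsuzumi API), and the frame sizes are made up.

```python
import json


def summarise_frame_locally(frame_bytes: bytes) -> dict:
    # Hypothetical placeholder for on-device inference with a small model.
    return {"objects": ["person", "ball"], "event": "goal_celebration"}


def process_at_edge(frames: list) -> bytes:
    """Summarise frames on site; only the small JSON payload crosses the WAN."""
    summaries = [summarise_frame_locally(f) for f in frames]
    payload = json.dumps(summaries).encode()
    raw_size = sum(len(f) for f in frames)
    print(f"raw video: {raw_size / 1e6:.1f} MB, uplink payload: {len(payload)} bytes")
    return payload


# Simulate a burst of 25 frames of ~2 MB each that never leaves the site.
process_at_edge([b"\x00" * 2_000_000 for _ in range(25)])
```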
Winstanley notes that his company is part of a wider group that, in media use cases, could offer hyper-directional sound systems supported by AI. “This is looking like a really interesting area of technology that’s relevant for supporter experience in a stadium – dampening, sound targeting. And then we’re back to the connection to the edge of the AI story. And that’s exciting for us. That’s the frontier.”
But returning from the frontier of technology to bread-and-butter business operations, even if the IT and comms team is confident that it can manage any technological issues that arise regarding AI and networking, businesses themselves may not be so sure.
Roadblocks to AI plans
Research published by managed network-as-a-service provider Expereo in April 2025 revealed that despite 88% of UK business leaders regarding AI as becoming important to fulfilling business priorities in the next 12 months, there are a number of major roadblocks to AI plans by UK businesses. These include unreasonable demands from employees, as well as poor existing infrastructure.
Worryingly, among the key findings of Expereo’s Enterprise horizons 2025 study was the general feeling from hundreds of UK technology leaders that expectations within their organisation of what AI can do are growing faster than their ability to meet them. While 47% of UK organisations noted that their network/connectivity infrastructure was not ready to support new technology initiatives, such as AI, in general, a further 49% reported that their network performance was preventing or limiting their ability to support large data and AI projects.
Assessing the key trends revealed in the study, Expereo CEO Ben Elms says that as global businesses embrace AI to transform employee and customer experience, setting realistic goals and aligning expectations will be critical to ensuring that AI delivers long-term value, rather than being viewed as a quick fix.
“While the potential of AI is immense, its successful integration requires careful planning. Technology leaders must recognise the need for robust networks and connectivity infrastructure to support AI at scale, while also ensuring consistent performance across these networks,” he says.
Summing up the state of the industry, Elms states that business is currently at a pivotal moment where strategic investments in technology and IT infrastructure are necessary to meet both current and future demands. In short, reflecting Düsener’s point about Swisscom’s intention to reduce the impact of service changes, cut the risk of downtime and costs, and improve customer services.
Simply switching on any AI system and believing that any answer is “out there” just won’t do. Your network may very well tell you otherwise.