The LTM has been trained on diverse datasets — ranging from SoftBank’s network data to the design, management, and operational know-how the company has accumulated over its lifetime.
"The LTM enables advanced inference in the design, management, and operation of cellular networks," SoftBank says
"Moving forward, SoftBank will further advance its research and development efforts, aiming to implement the LTM into its own operations."
SoftBank says it has also developed specialised AI models by fine-tuning the LTM; these models are designed to optimise base station configurations that enable advanced cellular network operations.
"The fine-tuned models were tasked with predicting configurations for actual base stations that had been excluded from the training phase, and their predictions were later verified by in-house experts to have over 90% accuracy," SoftBank says.
"Compared to manual or partially automated workflows, the LTM-led approach reduces the time to make these changes from days to minutes, and with similar accuracy, indicating the potential for huge operational time and cost savings, in addition to reducing human error."
According to SoftBank, the LTM is expected to serve as a blueprint for network design and to support the development of network optimisation AI agents.
The LTM uses NVIDIA NIM, which delivered significant performance gains across the two specialised use cases, including roughly fivefold improvements in both Time to First Token (TTFT) and Tokens Per Second (TPS).
NVIDIA NIM provides containerised microservices for self-hosting AI models, or access to hosted models, for generative, agentic, and vision applications on NVIDIA GPUs.
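NIM microservices expose an OpenAI-compatible HTTP API, so a self-hosted container can be queried with standard client libraries. The sketch below is a minimal illustration of how TTFT and TPS might be measured from a streaming response; the endpoint URL and model name are placeholders, and counting streamed chunks is only an approximation of tokens, not how SoftBank or NVIDIA report these metrics.

```python
# Minimal sketch: query a self-hosted NIM container via its OpenAI-compatible
# API and time Time to First Token (TTFT) and Tokens Per Second (TPS).
# base_url and model are placeholders; chunk counting approximates token count.
import time
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

start = time.perf_counter()
first_token_at = None
chunks = 0

stream = client.chat.completions.create(
    model="example/telecom-llm",  # placeholder model id
    messages=[{"role": "user", "content": "Suggest a handover parameter set."}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        if first_token_at is None:
            first_token_at = time.perf_counter()
        chunks += 1
end = time.perf_counter()

ttft = first_token_at - start if first_token_at else float("nan")
tps = chunks / (end - first_token_at) if first_token_at and end > first_token_at else 0.0
print(f"TTFT: {ttft:.2f}s, ~TPS: {tps:.1f} (chunk-based approximation)")
```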
In developing its LTM, SoftBank used the NVIDIA DGX SuperPOD for distributed training and says it will continue collaborating with NVIDIA on optimising NIM microservices for inference and on the Aerial Omniverse Digital Twin (AODT) for simulating and validating LTM configuration changes before they are applied.
“SoftBank's AI platform model, the 'Large Telecom Model' (LTM), developed for telecommunications operators, significantly transforms the processes of designing, constructing, and operating communication networks," said Ryuji Wakikawa, Vice President, Head of the Research Institute of Advanced Technology at SoftBank.
"By fine-tuning LTM, it’s possible to build AI models specialised for various processes and deploy them as agents. This not only optimises and automates operational tasks but also enhances network performance through the tuning of wireless devices.
"SoftBank will continue to leverage cutting-edge AI technologies, aiming to deliver unprecedented levels of high-quality communication services to customers.”
Chris Penrose, Vice President of Telecoms at NVIDIA, said that Large Telecom Models are the foundation for simplifying and speeding up network operations by enabling the creation of network AI agents for specialised tasks such as network planning, network configuration, and network optimisation.
"SoftBank’s rapid innovation in developing its new LTM, leveraging NVIDIA AI technologies, sets a powerful example for telecom operators globally to redefine their network operations processes with AI,” Penrose said.