Last updated December 11, 2024
“This breakthrough allows tomorrow’s chips to communicate much like fibre optics,” says Dario Gil, SVP and director of research at IBM.

The growing need for rapid and efficient data management is driving demand for high-speed data transfer in data centres. Generative AI has amplified this need further, with nearly 75% of data centre traffic now occurring within the centres themselves.
To tackle this, IBM has announced a new co-packaged optics technology designed to improve the speed and energy efficiency of data centres for generative AI workloads, and the early results are promising.
Makes Llama 3-70B 5x Faster
To test the efficiency of this technique, the IBM team developed the optical test vehicles OTV-1a and OTV-1b. These were built with lead-free flip-chip and BGA assembly processes and subjected to JEDEC-level stress tests.
Results showed that, through co-optimisation of materials, structures, processes and the supporting ecosystem, samples at a 50 µm pitch maintained low insertion loss through reflow and JEDEC testing.
To evaluate this, the IBM team benchmarked Meta's Llama 3 70-billion-parameter model in a distributed setup combining fully sharded data parallel (FSDP) and tensor parallelism (TP). The data shows that increasing the TP degree can affect throughput by as much as 5x, a sensitivity tied to the bandwidth available between accelerators. A minimal sketch of such a setup appears below.
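The following is a minimal sketch of a 2D-parallel setup of this kind using PyTorch's FSDP and tensor-parallel APIs. The toy model, mesh sizes and layer names are illustrative assumptions, not IBM's actual benchmark configuration.

```python
# Minimal 2D-parallel sketch (FSDP + tensor parallelism) in PyTorch.
# Illustrative only: the model, mesh sizes and layer names are hypothetical,
# not IBM's benchmark code. Run with: torchrun --nproc_per_node=8 sketch.py
import os
import torch
import torch.nn as nn
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.tensor.parallel import (
    parallelize_module, ColwiseParallel, RowwiseParallel,
)

class ToyBlock(nn.Module):
    """Stand-in for one transformer MLP block."""
    def __init__(self, dim=4096, hidden=16384):
        super().__init__()
        self.up = nn.Linear(dim, hidden)    # column-sharded across the TP group
        self.down = nn.Linear(hidden, dim)  # row-sharded across the TP group
    def forward(self, x):
        return self.down(torch.relu(self.up(x)))

def main():
    torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))
    world = int(os.environ["WORLD_SIZE"])
    tp_degree = 4                   # higher TP degree -> more inter-GPU traffic
    dp_degree = world // tp_degree
    # 2D mesh: outer dim for data parallel (FSDP), inner dim for tensor parallel.
    mesh = init_device_mesh("cuda", (dp_degree, tp_degree),
                            mesh_dim_names=("dp", "tp"))

    model = ToyBlock().cuda()
    # Shard the linear layers across the TP sub-mesh.
    model = parallelize_module(model, mesh["tp"],
                               {"up": ColwiseParallel(), "down": RowwiseParallel()})
    # Wrap the TP-sharded model in FSDP over the data-parallel sub-mesh.
    model = FSDP(model, device_mesh=mesh["dp"], use_orig_params=True)

    x = torch.randn(8, 4096, device="cuda")
    loss = model(x).sum()
    loss.backward()                 # gradients reduced over the dp mesh

if __name__ == "__main__":
    main()
```

In a configuration like this, raising the TP degree increases the volume of activation traffic exchanged between accelerators each step, which is why interconnect bandwidth becomes the limiting factor the article describes.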
Energy Savings and Efficiency
The Co-Packaged Optics (CPO) technology uses polymer optical waveguides (PWG) to facilitate high-speed data transmission, replacing traditional copper-based electrical interconnects.
According to IBM’s official announcement, CPO reduces energy consumption in data centres by over five times compared to mid-range electrical interconnects. Additionally, it promises to speed up AI model training, cutting training times for LLMs from months to weeks.
“Generative AI demands more energy and processing power, and co-packaged optics can make these data centres future-proof,” said Dario Gil, SVP and director of research at IBM. “This breakthrough allows tomorrow’s chips to communicate much like fibre optics, enabling faster, more sustainable data handling.”
The introduction of CPO technology is expected to save significant energy during AI model training, potentially equating to the annual energy consumption of 5,000 U.S. homes per model.
Reach Expansion
IBM claims that CPO also extends the reach of interconnect cables from one to hundreds of metres, allowing greater flexibility in data centre layouts. The company’s research highlights that CPO technology can deliver 80 times the bandwidth of current electrical connections.
The design supports dense optical pathways that can transmit terabits of data per second, advancing the scalability of AI processing. The prototypes demonstrated durability under extreme conditions, including high humidity and temperatures ranging from -40°C to 125°C.
This development adds to IBM’s history of technological innovation, which includes breakthroughs such as 2 nm chip technology, the Bee Agent Framework and the Granite models. The company also recently completed over 1,000 generative AI projects in a single year.
The prototypes were developed and tested at IBM facilities in Albany, New York, and Bromont, Quebec, underscoring the company’s leadership in semiconductor research and assembly.
Sanjana Gupta
An information designer who loves to learn about and try new developments in the field of tech and AI. She likes to spend her spare time reading and exploring absurdism in literature.