
The Shift from Copper to Optical Interconnects in AI Data Centers

As AI clusters expand, operators are moving from copper to optical interconnects to overcome physical limits, cutting latency and power consumption by transmitting data as light.

The Bottleneck of Electrical Interconnects

For years, data centers relied predominantly on copper wiring for short-range communication. However, the sheer scale of modern AI clusters has pushed electrical interconnects to their physical limits. As GPU clusters grow to include tens of thousands of accelerators, the heat generation and signal degradation associated with copper become prohibitive.

To maintain the speeds necessary for large language model (LLM) training and inference, the industry is transitioning to optical interconnects. By using light instead of electricity to transmit data, these systems significantly reduce latency and power consumption while dramatically increasing bandwidth. This technological necessity has transformed optical components from peripheral accessories into critical infrastructure for the AI era.
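The power comparison above is usually quantified in energy per bit. A minimal sketch of that metric, using hypothetical module figures purely for illustration (the wattage and line rate below are assumptions, not values from the article):

```python
def picojoules_per_bit(power_watts: float, gigabits_per_second: float) -> float:
    """Energy per transmitted bit in pJ/bit: watts divided by bits per second.

    1 W / 1 Gbps = 1e-12 J/bit = 1 pJ/bit, scaled by 1000 here because the
    rate argument is in gigabits per second.
    """
    return power_watts / gigabits_per_second * 1_000


# Hypothetical example: a 16 W transceiver running at 800 Gbps.
print(picojoules_per_bit(16, 800))  # -> 20.0 pJ/bit
```

Lower is better; vendors track this figure as speeds climb from 400G toward 1.6T, since total fabric power is roughly (pJ/bit) times aggregate traffic.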

The Architecture of the AI Network

The shift toward optics involves several layers of hardware, each served by different industry leaders. The "AI trade" in optics is currently distributed across three primary segments: the physical medium, the laser components, and the network orchestration.

1. The Physical Medium (Fiber Optics)

At the most foundational level is the glass fiber that carries the light. Corning (GLW) remains a central figure here, providing the high-performance optical fiber necessary to link servers across vast data center campuses. The demand is not just for standard fiber, but for specialized cables capable of handling the extreme densities required by AI workloads.

2. Laser Components and Transceivers

To turn electrical signals into light and back again, high-precision lasers and transceivers are required. Companies such as Lumentum (LITE) and Coherent (COHR) are pivotal in this space. These firms produce the photonic integrated circuits and laser sources that power the transceivers. As the industry moves toward 800G and 1.6T speeds, the complexity and value of these components increase.

3. Network Orchestration and Switching

Once the physical paths and light sources are in place, the data must be routed efficiently. Ciena (CIEN) operates in this layer, providing the networking equipment and software that route data packets across the optical network without congestion, so that GPUs are not left idling while waiting for data.

Market Dynamics and Investment Rotation

The financial trajectory of these stocks reflects a broader rotation in the AI trade. Early investors saw massive gains in the "compute" layer (e.g., NVIDIA). Once the market realized that compute requires massive amounts of memory, the trade shifted to HBM providers. Now, the logic suggests that compute and memory are useless if they cannot communicate instantaneously.

This rotation is driven by the realization that the physical constraints of the data center are the next major hurdle. Consequently, the valuation of optics companies is increasingly tied to the deployment schedules of the next generation of AI clusters rather than general cloud spending.

Key Technical and Market Details

  • Copper vs. Optics: Copper is hitting a "power wall" where the energy required to push signals through wires creates unsustainable heat; optics solve this by using photons.
  • Speed Requirements: The industry is transitioning from 400G to 800G and 1.6T transceivers to keep pace with GPU throughput.
  • The "Cluster" Effect: The demand for optics scales non-linearly; as a cluster grows, the number of interconnects required grows faster than the number of GPUs.
  • Primary Tickers: Lumentum (LITE), Ciena (CIEN), Corning (GLW), and Coherent (COHR) are the primary beneficiaries of this infrastructure pivot.
  • Latency Reduction: Optical networking is essential for reducing "tail latency," which can otherwise bottleneck the training of trillion-parameter models.
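The "cluster effect" in the list above can be sketched with a rough switching-fabric model. In a folded-Clos (fat-tree) fabric, a fixed switch radix supports only so many endpoints per tier, so larger clusters need more tiers, and each additional tier adds roughly one more optical link per GPU. The radix and capacity formula below are illustrative assumptions, not figures from the article:

```python
def fabric_links(num_gpus: int, radix: int = 64) -> tuple[int, int]:
    """Rough tier count and total link count for a full-bisection Clos fabric.

    Assumes an L-tier fabric with switch radix r supports about
    r**L / 2**(L-1) endpoints, and that each tier boundary carries
    roughly one link per endpoint (no oversubscription).
    """
    tiers = 1
    while (radix ** tiers) // (2 ** (tiers - 1)) < num_gpus:
        tiers += 1
    total_links = num_gpus * tiers  # ~one link per GPU per tier
    return tiers, total_links


for n in (1_024, 16_384, 131_072):
    tiers, links = fabric_links(n)
    print(f"{n:>7} GPUs -> {tiers} tiers, {links:>7} links "
          f"({links / n:.0f} per GPU)")
```

Under these assumptions, links per GPU rise from 2 at ~1K GPUs to 4 at ~131K GPUs, which is the non-linear demand growth the article describes: optics consumption per accelerator increases as clusters scale.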

Conclusion

The transition toward an optical-centric data center architecture is not a luxury but a requirement for the continued scaling of artificial intelligence. While the initial hype focused on the brain of the AI (the GPU), the current focus has shifted to the nervous system: the optical networks that allow those brains to function in concert. For the AI trade to sustain its momentum, the physical layer of light-based connectivity must evolve as rapidly as the silicon it supports.


Read the Full Business Insider Article at:
https://www.businessinsider.com/optics-stocks-memory-chips-ai-trade-lite-cien-glw-cohr-2026-5