CITIC Securities: China's optical fiber export ratio has significantly increased, continuing to favor the AI computing power sector


Source: CITIC Securities Research

By: Yan Guicheng, Fang Fangbo, Liu Yongxu, Cao Tianyu, Fang Zixiao

In February this year, China exported 3,779.9 tons of optical fiber with a value of 790 million yuan, up 63.6% and 126.8% year over year, respectively. Converted to length, China exported about 25.2 million fiber-kilometers in February, roughly 65% of the country's monthly effective fiber production; counting the fiber embedded in exported cable, the share is higher still. Judging from export value, the pass-through of fiber price increases to earnings is not expected to be very visible in the first quarter. Overall, overseas buyers are rushing to purchase Chinese-made optical fiber, leaving China's fiber suppliers with no worries about selling, so we believe the market need not be overly concerned about domestic telecom operators' centralized fiber procurement. Overseas telecom networks, AI, unmanned aerial vehicles, and other factors are jointly driving a surge in fiber demand and pushing prices higher. The industry is in a high-enthusiasm cycle, and we continue to recommend the fiber sector.

Google's TurboQuant compression algorithm enables near-lossless compression of AI inference memory, significantly reducing the cost of long-context inference. Applications such as on-device AI and AI video generation are expected to benefit, and we remain bullish on the AI industry chain.

In February this year, China exported 3,779.9 tons of optical fiber with a value of 790 million yuan, up 63.6% and 126.8% year over year, respectively; export volume was an increase of 17.6-fold over the 203.5 tons of February 2018, the historical peak of domestic demand. Assuming roughly 0.15 kg per kilometer of fiber (packaging included), China exported about 25.2 million fiber-kilometers in February, roughly 65% of the country's monthly effective fiber production; counting the fiber embedded in exported cable, the share is higher still. Judging from export value, the pass-through of fiber price increases to earnings is not expected to be very visible in the first quarter. Overall, overseas buyers are rushing to purchase Chinese-made optical fiber, leaving China's fiber suppliers with no worries about selling, so we believe the market need not be overly concerned about domestic telecom operators' centralized fiber procurement.
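The ton-to-kilometer conversion above can be checked with simple arithmetic; note that the 0.15 kg per fiber-kilometer figure is the report's own assumption, packaging included:

```python
# Sanity-check the report's ton-to-kilometer conversion. The 0.15 kg per
# fiber-kilometer figure (packaging included) is the report's own assumption.
KG_PER_FIBER_KM = 0.15

export_tons = 3_779.9            # February export volume
export_kg = export_tons * 1_000  # metric tons -> kg

fiber_km = export_kg / KG_PER_FIBER_KM
print(f"{fiber_km / 1e6:.1f} million fiber-km")  # -> 25.2 million fiber-km
```

At about 25.2 million fiber-kilometers, the claimed ~65% share implies monthly effective domestic production of roughly 39 million fiber-kilometers.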

Looking at export destinations in February, the top ten were Côte d'Ivoire, Burkina Faso, Poland, the Philippines, Argentina, Russia, Nigeria, the United States, Panama, and Australia. Export volumes to the three African countries grew significantly, which is likely driven mainly by local network construction; Russia's demand is likely driven mainly by drones, while demand from the United States, Australia, and the Philippines is likely AI-related. Overall, overseas telecom networks, AI, unmanned aerial vehicles, and other factors are jointly driving a surge in fiber demand and pushing prices higher. The industry is in a high-enthusiasm cycle, and we continue to recommend the fiber sector.

The TurboQuant compression algorithm released by Google Research reduces the memory footprint of large language models (LLMs) without sacrificing accuracy, while also improving runtime speed. TurboQuant compresses the "working memory" used during inference, the key-value cache (KV cache), by at least 6 times. On H100 GPUs, attention computed at 4-bit precision runs 8 times faster than a 32-bit baseline, significantly reducing AI operating costs. TurboQuant's most notable property is near-lossless compression with no fine-tuning and no training data required.
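To see why a roughly 3-bit KV cache matters for long-context cost, here is a rough sizing sketch. The model dimensions (32 layers, 8 KV heads, head dimension 128, Llama-3.1-8B-style) are illustrative assumptions, not figures from the report:

```python
# Rough KV-cache sizing. Model dimensions are Llama-3.1-8B-style assumptions
# (32 layers, 8 KV heads via grouped-query attention, head_dim 128), not
# figures from the report.

def kv_cache_bytes(seq_len, n_layers=32, n_kv_heads=8, head_dim=128, bits=16):
    # 2x for keys and values; bits / 8 converts the per-element budget to bytes
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bits / 8

tokens = 104_000                        # a long-context length cited in the benchmarks
full = kv_cache_bytes(tokens, bits=16)  # fp16 baseline
quant = kv_cache_bytes(tokens, bits=3)  # TurboQuant's ~3-bit total budget

print(f"fp16 KV cache: {full / 2**30:.1f} GiB")   # ~12.7 GiB
print(f"3-bit KV cache: {quant / 2**30:.1f} GiB, {full / quant:.1f}x smaller")
```

The pure bit ratio here is 16/3 ≈ 5.3x; the report's "at least 6 times" presumably also counts the normalization metadata that TurboQuant eliminates.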

TurboQuant's optimization target is the size of the key-value cache, and its core is a two-stage process. Stage one, PolarQuant, changes the coordinate system. Traditional quantization operates on Cartesian coordinates (X, Y, Z axes), where the value range of each axis is not fixed, so normalization parameters must be stored alongside the quantized values; PolarQuant instead represents vectors in polar coordinates, whose angular components lie in fixed ranges and can be quantized without per-vector normalization metadata. Stage two, QJL, uses one bit to remove residual error: a Johnson-Lindenstrauss transform is applied to the residual left by stage one, compressing each projected error value into a single sign bit, +1 or -1. A dedicated estimator then combines the high-precision query vectors with the low-precision compressed keys, consuming only that final bit to cancel the systematic bias remaining from stage one. In two steps, TurboQuant achieves near-lossless compression under a total budget of only about 3 bits, with no additional overhead.
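The two-stage idea can be sketched in a few lines of NumPy. This is a toy model, not Google's implementation: stage one below is a plain uniform 2-bit quantizer standing in for PolarQuant (which works in polar coordinates), while stage two follows the QJL recipe described above, keeping one sign bit per random projection of the residual and de-biasing with a sqrt(pi/2) factor when estimating the query-key dot product:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 128, 4096  # key dimension; number of 1-bit JL projections (assumptions)

def stage1_quantize(k, bits=2):
    """Coarse uniform quantizer standing in for PolarQuant."""
    scale = np.abs(k).max() / 2 ** (bits - 1)
    idx = np.clip(np.floor(k / scale), -2 ** (bits - 1), 2 ** (bits - 1) - 1)
    return (idx + 0.5) * scale  # reconstruct at cell midpoints

def stage2_sign_bits(r, S):
    """QJL step: keep only the sign of each random projection of the residual."""
    return np.sign(S @ r), np.linalg.norm(r)

def estimate_dot(q, k_hat, signs, r_norm, S):
    """Exact part from the coarse key, plus a sign-bit correction for the
    residual; sqrt(pi/2) de-biases the +/-1 signs."""
    correction = np.sqrt(np.pi / 2) * r_norm / len(signs) * ((S @ q) @ signs)
    return q @ k_hat + correction

q = rng.standard_normal(d)       # high-precision query
k = rng.standard_normal(d)       # key to be compressed
S = rng.standard_normal((m, d))  # random JL projection matrix

k_hat = stage1_quantize(k)                       # stage 1: coarse key
signs, r_norm = stage2_sign_bits(k - k_hat, S)   # stage 2: 1-bit residual

true, est = q @ k, estimate_dot(q, k_hat, signs, r_norm, S)
print(f"true <q,k> = {true:.2f}, estimate = {est:.2f}")
```

The correction term is what lets the one-bit residual cancel stage one's systematic bias: without it the dot-product error is the full quantization residual, while with it the error shrinks as the number of sign bits grows.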

Google validated TurboQuant on five long-context benchmarks: LongBench, Needle In A Haystack, ZeroSCROLLS, RULER, and L-Eval, with tested models including Gemma, Mistral, and Llama-3.1-8B-Instruct. On LongBench's question answering, code generation, and text summarization tasks, the 3-bit configuration consistently outperforms baseline methods such as KIVI and comes close to full-precision performance. At a 4x compression ratio, TurboQuant's retrieval accuracy holds up to 104,000 tokens, matching full-precision models. In high-dimensional vector search on the GloVe dataset (200 dimensions), TurboQuant beats two leading methods, PQ (product quantization) and RaBitQ, achieving the best recall. In short, TurboQuant achieves near-lossless compression of AI inference memory, significantly lowering the cost of long-context inference; on-device AI and AI video generation are expected to benefit, and continued technological iteration will keep driving upgrades across the compute-power industry chain.

On March 23, Liu Rihong, director of the National Data Bureau, officially announced at the China Development High-Level Forum 2026 annual meeting that the Chinese standard name for the AI core term Token has been determined as "token." Liu said that tokens are "not only the value anchor of the intelligent era, but also the settlement unit connecting technology supply and business demand," and disclosed that model companies set a record by generating more revenue in just 20 days than in the whole of 2025. 100 billion, 1 trillion, 1.4 trillion: this is the three-step jump China's average daily token usage has made within two years.

At present, the communications industry enjoys a dual tailwind: AI technology on one side and new-infrastructure policy support on the other. The compute-power industry chain remains the core investment theme with a relatively high degree of industry momentum. Compute power and chips are the foundational layer of AI industry development and the main high-momentum, high-growth investment theme in the communications industry, and we recommend focusing on them.

Risk factors: changes in the international environment affecting supply-chain security and stability as well as the pace of companies' overseas expansion; tariff impacts exceeding expectations; artificial intelligence industry development falling short of expectations, dampening demand for companies in the cloud-computing industry chain; intensifying market competition driving a rapid drop in gross margins; exchange-rate fluctuations affecting export-oriented companies' foreign-exchange gains and gross margins, including companies in the ICT equipment and optical module/component sectors; the digital economy and Digital China initiative developing more slowly than expected; and telecom operators' cloud-computing businesses, operators' capital expenditure, cloud service providers' capital expenditure, and demand for communications modules and smart controllers all falling short of expectations.
