China's domestic large models are undergoing rapid iterative upgrades; focus on three main investment themes.
Since 2026, China-based large model vendors have focused on upgrading Agent and coding capabilities, racing to release new models. For example, MiniMax's coding ability has improved further: its M2.7 model scored 56.22% on the SWE-Pro test, surpassing Gemini 3.1 Pro, and 55.6% on the VIBE-Pro test for end-to-end complete project delivery scenarios, matching Claude Opus 4.6, with a deeper understanding of how software systems operate. Meanwhile, M2-series models take part in M2.7's training process in scenarios such as reinforcement learning (RL), enabling the models to self-iterate.
CITIC Securities believes the upcoming next-generation DeepSeek model is expected to continue the cost-effective open-source route. In terms of capabilities, it should deliver stronger memory functions and ultra-long context processing; while refining code and Agent capabilities, it will also address multimodal shortcomings, creating new investment opportunities across model vendors, AI applications, and AI infrastructure. CITIC recommends focusing on the following three investment themes:
Model vendors: DeepSeek's next-generation model is expected to work alongside other domestic models, accelerating Chinese AI's global expansion. In addition, advances in model training will further reduce costs, and cheaper tokens will drive an overall increase in global large-model API calls.
AI applications: As domestic models reach parity with leading models, market anxiety over the competing "models vs. applications" narratives should ease, supporting the deployment of AI Agents across thousands of industries and benefiting AI application companies with strong barriers to entry.
AI infrastructure: Lower costs lead to higher usage, benefiting AI infrastructure (AI Infra). Domestic AI infrastructure and domestic models are advancing hand in hand.