Microsoft open-sources three versions of the Harrier text embedding model, with the 27B version topping the multilingual MTEB v2 leaderboard.
According to 1M AI News monitoring, Microsoft has open-sourced a multilingual text embedding model family on Hugging Face called harrier-oss-v1, available in three sizes: 270M, 0.6B, and 27B. The model card shows that the series uses a decoder-only architecture with last-token pooling and L2 normalization, supports context lengths of up to 32,768 tokens, and can be used for retrieval, clustering, semantic similarity, classification, bitext mining, and re-ranking.
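To illustrate what last-token pooling with L2 normalization looks like in practice, here is a minimal sketch using the standard Hugging Face transformers interface. It is not the model's official usage recipe: the repo ID microsoft/harrier-oss-v1-0.6b is a placeholder (the article gives only the family name), and details such as padding side or any recommended query prefixes may differ from what the actual model card specifies.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Placeholder repo ID for illustration; the article names only the
# harrier-oss-v1 family, not an exact Hugging Face path.
MODEL_ID = "microsoft/harrier-oss-v1-0.6b"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
tokenizer.padding_side = "right"  # assumed: last-token pooling below expects right padding
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

def embed(texts):
    # Tokenize a batch; the model card reportedly supports up to 32,768 tokens.
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=32768, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    hidden = out.last_hidden_state                    # (batch, seq_len, dim)
    # Last-token pooling: take the hidden state of the final non-padding token.
    last_idx = batch["attention_mask"].sum(dim=1) - 1
    pooled = hidden[torch.arange(hidden.size(0)), last_idx]
    # L2 normalization: cosine similarity then reduces to a dot product.
    return F.normalize(pooled, p=2, dim=-1)

query = embed(["What is a text embedding model?"])
docs = embed(["Embedding models map text to dense vectors.",
              "Gold futures rose on Tuesday."])
print(query @ docs.T)  # cosine similarities between query and documents
```

Because the embeddings are unit-normalized, ranking documents for retrieval or re-ranking is just a matter of sorting these dot products.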
Multilingual MTEB v2 is an industry-standard multilingual text embedding benchmark that primarily tests tasks such as retrieval, classification, clustering, and semantic similarity. Microsoft’s model card states that the three sizes score 66.5, 69.0, and 74.3 on this benchmark, respectively, and that the 27B version topped the leaderboard on the day it was released. The 270M and 0.6B versions are additionally trained with knowledge distillation from larger embedding models. All three models are released under the MIT license.