Microsoft Open-Sources New Version of Phi-4: Inference Efficiency Up 10x, Runs on Laptops
Jin10 reported on July 10 that earlier that morning Microsoft open-sourced the newest member of the Phi-4 family, Phi-4-mini-flash-reasoning, on its official website. The mini-flash version continues the Phi-4 family's hallmark of small parameter counts with strong performance. It is designed specifically for scenarios constrained by compute, memory, and latency: it can run on a single GPU, making it suitable for edge devices such as laptops and tablets. Compared with the previous version, mini-flash adopts SambaY, an innovative architecture developed in-house at Microsoft, delivering a roughly 10x improvement in inference efficiency and cutting average latency by 2-3x, a significant gain in overall inference performance.