🚨News: Mistral just mass deleted three of their own models
they took magistral (reasoning), pixtral (multimodal), and devstral (coding) and merged them into one single model
mistral small 4
and the timing here is interesting
they also just announced a strategic partnership with NVIDIA to co-develop frontier open source models
so what does small 4 actually look like:
>128 experts in a mixture of experts setup
>only 6B parameters active per token out of 119B total
>256k context window
>apache 2.0 fully open source
>40% faster and 3x more throughput than small 3
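the mixture-of-experts setup above is why only ~5% of the 119B parameters (6B) run per token: a router scores all experts and only the top few actually execute. here's a minimal sketch of that routing step — the 128-expert count comes from the post, but the top-k value, dimensions, and function names are illustrative, not Mistral's actual config:

```python
import math
import random

NUM_EXPERTS = 128  # from the post; 128 experts in the MoE layer
TOP_K = 2          # illustrative; real k is whatever keeps ~6B of 119B params active

def softmax(xs):
    # numerically stable softmax over router logits
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def route(router_logits):
    """Pick the TOP_K highest-scoring experts and renormalize their weights.

    Only these experts run for this token; the rest of the 119B parameters
    stay idle, which is how active compute stays small.
    """
    probs = softmax(router_logits)
    top = sorted(range(NUM_EXPERTS), key=lambda i: probs[i], reverse=True)[:TOP_K]
    norm = sum(probs[i] for i in top)
    return [(i, probs[i] / norm) for i in top]

random.seed(0)
logits = [random.gauss(0.0, 1.0) for _ in range(NUM_EXPERTS)]
chosen = route(logits)  # e.g. [(expert_id, weight), (expert_id, weight)]
```

the token's output is then the weighted sum of just those chosen experts' outputs, which is where the speed/throughput gains over a dense model of the same total size come from.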
but the part worth paying attention to is the reasoning_effort parameter
you can set it to "none" for fast lightweight responses
or crank it to "high" for deep step by step reasoning
same model doing both
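in practice that would look something like a single extra field on the request. this is a hedged sketch assuming an OpenAI-compatible chat completions payload — the `reasoning_effort` values ("none"/"high") come from the post, but the model identifier and exact field placement are assumptions, not confirmed API details:

```python
def build_request(prompt: str, effort: str = "none") -> dict:
    """Build a chat completion payload with a reasoning_effort setting.

    effort="none" -> fast lightweight response
    effort="high" -> deep step-by-step reasoning
    (values taken from the post; payload shape is an assumption)
    """
    if effort not in ("none", "high"):
        raise ValueError("illustrative values only: 'none' or 'high'")
    return {
        "model": "mistral-small-4",      # hypothetical model identifier
        "reasoning_effort": effort,       # the parameter the post describes
        "messages": [{"role": "user", "content": prompt}],
    }

fast = build_request("what's the capital of france?", effort="none")
deep = build_request("prove sqrt(2) is irrational", effort="high")
```

same endpoint, same weights — you're just dialing how much thinking the model does per request instead of swapping to a different model.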
this is a clear signal of where open source AI is headed
companies are done maintaining five different models for five different tasks
one model that adapts based on what you need it to do...
