Trending topics
Bonk Eco continues to show strength amid $USELESS rally
Pump.fun to raise $1B token sale, traders speculating on airdrop
Boop.Fun leading the way with a new launchpad on Solana.

Deedy
VC at @MenloVentures. Formerly founding team @glean, @Google Search. @Cornell CS. Tweets about tech, immigration, India, fitness and search.
If you're a student wondering what you should study in the world of AI, it's still the same: math, physics, chemistry, biology, computer science, engineering.
STEM teaches reasoning, objectivity and how to think. It makes you better at learning everything else. In STEM, ideas win, not institutions.
There will always be career opportunities in these fields for the analytically gifted. It's hard, it's competitive, and struggling to grasp things will hurt your spirits, but that's when you uplevel yourself. And if you get used to constantly upleveling yourself, you will do quite well and be quite satisfied.
Even today, the best AI models are reinforcement learned from human experts. If you build, you can keep pushing the needle.
China (yet again) just dropped the best open weight coding model.
Qwen3-Coder beats Kimi K2, which came out 2 weeks ago, on all coding benchmarks: a crazy 70% on SWE-Bench Verified. And it has 1M context!
Pricing is $1-6 per million input tokens and $5-60 per million output tokens: above K2 but below Sonnet 4. Cheap-ish.
It's on par in throughput with Gemini Flash, Kimi and Sonnet too, at 60-70 tok/s.
Really solid model.
Tremendously fast progress from China, which is dominating the open-weight ecosystem. It raises the bar for OpenAI's much-awaited open-source model drop.
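To make the quoted prices concrete, here is a toy cost calculation. The function name is illustrative, and it assumes the cheapest tier of each quoted range applies (the post doesn't specify how the $1-6 / $5-60 tiers are triggered).

```python
# Per-request cost from per-million-token prices (a sketch; the tier
# chosen for a given request is an assumption, not from the post).
def request_cost(input_tokens: int, output_tokens: int,
                 in_price_per_m: float, out_price_per_m: float) -> float:
    """Return the dollar cost of one request."""
    return (input_tokens / 1e6) * in_price_per_m + \
           (output_tokens / 1e6) * out_price_per_m

# A 50k-token prompt with a 2k-token reply at the cheapest quoted tier:
cost = request_cost(50_000, 2_000, in_price_per_m=1.0, out_price_per_m=5.0)
print(f"${cost:.2f}")  # $0.06
```

At the top of both ranges ($6 in, $60 out) the same request would cost $0.42, so the tiering matters a lot at long context.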

This 3D interactive F1 car race visualizer would have taken at least ~$50k and 3 months to build 5 years ago.
Today, Lovable Agents does it in 30 minutes for under $5. It's like Claude Code, but with backend, auth, etc. handled from the browser. Not just static websites.
Everyone should try at least one crazy idea on it.
🚨 HUGE Immigration News: Trump just proposed a big change to the H-1B: no more pure lottery, but weighted selection by wage.
Based on your job code and location, your base-salary percentile slots you into one of four wage levels. The 85k H-1Bs are selected in decreasing order, IV > III > II > I. If a level has more applicants than remaining slots, a lottery happens within that level.
Data from LCA applications (those selected into the H-1B) shows 13% at level IV and 24% at level III out of ~480k applications. This means mostly level IV applicants will fill the 85k cap.
This has about a 30% chance of becoming law by 2027.
What does this all mean? I outlined the big consequences of this bill below.
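The selection mechanism described above can be sketched in a few lines. This is a minimal illustration of the proposal as the post describes it, not the actual rule text; the function name and count-based framing are mine, and the within-level lottery only decides *who* is picked, not *how many*, so counts alone capture the outcome.

```python
# Sketch of the proposed wage-weighted H-1B selection: fill the cap from
# the highest wage level down; a partial level implies a lottery within it.
def select_h1b(counts_by_level: dict[int, int], cap: int = 85_000) -> dict[int, int]:
    """Return how many applicants are selected from each wage level."""
    selected = {}
    remaining = cap
    for level in (4, 3, 2, 1):  # level IV first, then III, II, I
        n = counts_by_level.get(level, 0)
        take = min(n, remaining)  # lottery within the level if n > remaining
        selected[level] = take
        remaining -= take
    return selected

# Rough numbers from the post: ~480k applications, 13% level IV, 24% level III.
apps = {4: int(480_000 * 0.13), 3: int(480_000 * 0.24)}
print(select_h1b(apps))  # {4: 62400, 3: 22600, 2: 0, 1: 0}
```

On these numbers, all ~62k level-IV applicants get in and the remaining ~23k slots go to a lottery among level-III applicants; levels II and I get nothing, which is the shift the post is pointing at.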
