Spencer Farrar
Partner @theoryvc
Would highly recommend applying to work with @jmj & @seidtweets! They are incredible people and investors!

Jeff Morris Jr. · Jul 29, 2025
🚨 Dream job alert: We’re hiring an Investment Partner at Chapter One in San Francisco & there’s never been a better moment.
SF is ground zero for AI, and we’re in it with a fresh fund ready to deploy.
This is the role I wish existed when I was coming up: real ownership, leading deals, and working with a tight, product-obsessed team.
If you’ve been waiting to take a bigger swing, this is your shot.

Congrats to the Lance team! Excited to be working with you guys!

LanceDB · Jun 24, 2025
Today we’re announcing our $30 million Series A.
This round is led by @Theoryvc with support from @CRV, @ycombinator, @databricks, @runwayml, @ZeroPrimeVC, @swift_vc, and more. Your belief in a future powered by multimodal data brings us one step closer to that reality.

Spencer Farrar reposted
TL;DR: We built a transformer-based payments foundation model. It works.
For years, Stripe has been using machine learning models trained on discrete features (BIN, zip, payment method, etc.) to improve our products for users. And these feature-by-feature efforts have worked well: +15% conversion, -30% fraud.
But these models have limitations. We have to select (and therefore constrain) the features considered by the model. And each model requires task-specific training: for authorization, for fraud, for disputes, and so on.
Given the learning power of generalized transformer architectures, we wondered whether an LLM-style approach could work here. It wasn’t obvious that it would—payments is like language in some ways (structural patterns similar to syntax and semantics, temporally sequential) and extremely unlike language in others (fewer distinct ‘tokens’, contextual sparsity, fewer organizing principles akin to grammatical rules).
So we built a payments foundation model—a self-supervised network that learns dense, general-purpose vectors for every transaction, much like a language model embeds words. Trained on tens of billions of transactions, it distills each charge’s key signals into a single, versatile embedding.
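The thread doesn't spell out the architecture, so as a rough sketch of the idea only: embed each transaction's categorical fields, run the sequence of transactions through a transformer encoder, and pretrain BERT-style by masking a field and predicting it from context. The field names, vocabulary sizes, and masked-field objective below are all illustrative assumptions, not Stripe's actual design.

```python
# Minimal, illustrative sketch of a self-supervised transaction embedder.
# Fields, sizes, and the pretraining objective are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

FIELDS = {"bin": 10_000, "zip": 50_000, "method": 16}  # hypothetical vocab sizes
D = 256  # embedding width

class TxnEmbedder(nn.Module):
    def __init__(self, d=D, layers=4, heads=8):
        super().__init__()
        # one embedding table per categorical field; index 0 is a [MASK] token
        self.emb = nn.ModuleDict({f: nn.Embedding(v + 1, d) for f, v in FIELDS.items()})
        layer = nn.TransformerEncoderLayer(d, heads, 4 * d, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)
        # per-field prediction heads, used only during self-supervised pretraining
        self.head = nn.ModuleDict({f: nn.Linear(d, v + 1) for f, v in FIELDS.items()})

    def forward(self, batch):
        # batch: {field: LongTensor (B, T)} -- T consecutive transactions
        x = sum(self.emb[f](batch[f]) for f in FIELDS)  # sum field embeddings
        return self.encoder(x)  # (B, T, D): one dense vector per transaction

    def pretrain_loss(self, batch, targets, mask):
        # BERT-style self-supervision: some field ids are replaced by 0 ([MASK]);
        # predict the original id from the contextual embedding
        h = self(batch)
        return sum(
            F.cross_entropy(self.head[f](h)[mask[f]], targets[f][mask[f]])
            for f in FIELDS
        )
```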
You can think of the result as a vast distribution of payments in a high-dimensional vector space. The location of each embedding captures rich data, including how different elements relate to each other. Payments that share similarities naturally cluster together: transactions from the same card issuer are positioned closer together, those from the same bank even closer, and those sharing the same email address are nearly identical.
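A toy illustration of that geometry (random vectors standing in for real embeddings): the smaller the perturbation from an anchor vector, the higher the cosine similarity, while an unrelated vector lands near zero.

```python
# Illustrative only: how "closeness" in the embedding space can be measured.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
anchor = rng.normal(size=256)
same_bank = anchor + 0.1 * rng.normal(size=256)    # small perturbation: nearly identical
same_issuer = anchor + 0.5 * rng.normal(size=256)  # larger perturbation: still close
unrelated = rng.normal(size=256)                   # independent: near-orthogonal

print(cosine(anchor, same_bank))    # ~0.99
print(cosine(anchor, same_issuer))  # high, but lower
print(cosine(anchor, unrelated))    # ~0
```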
These rich embeddings make it significantly easier to spot nuanced, adversarial patterns of transactions; and to build more accurate classifiers based on both the features of an individual payment and its relationship to other payments in the sequence.
Take card-testing. Over the past couple of years, traditional ML approaches (engineering new features, labeling emerging attack patterns, rapidly retraining our models) have reduced card testing for users on Stripe by 80%. But the most sophisticated card testers hide novel attack patterns in the volumes of the largest companies, so they're hard to spot with these methods.
We built a classifier that ingests sequences of embeddings from the foundation model and predicts whether a traffic slice is under attack. It leverages the transformer architecture to detect subtle patterns across transaction sequences, and it does all of this in real time, so we can block attacks before they hit businesses.
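A hedged sketch of what such a classifier could look like, assuming the foundation-model embeddings are precomputed: a small transformer pools a window of transaction embeddings through a summary token and emits a probability that the slice is under attack. The class name, layer counts, and pooling scheme are assumptions, not Stripe's implementation.

```python
# Hedged sketch: binary attack detector over sequences of transaction embeddings.
import torch
import torch.nn as nn

class AttackDetector(nn.Module):
    def __init__(self, d=256, layers=2, heads=8):
        super().__init__()
        self.cls = nn.Parameter(torch.zeros(1, 1, d))  # learned summary token
        layer = nn.TransformerEncoderLayer(d, heads, 4 * d, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)
        self.head = nn.Linear(d, 1)

    def forward(self, emb):
        # emb: (B, T, d) -- foundation-model embeddings for a traffic slice
        cls = self.cls.expand(emb.size(0), -1, -1)
        h = self.encoder(torch.cat([cls, emb], dim=1))
        return torch.sigmoid(self.head(h[:, 0]))  # P(slice is under attack)

detector = AttackDetector()
window = torch.randn(4, 128, 256)  # 4 slices of 128 recent transactions each
p_attack = detector(window)        # block the traffic when p exceeds a threshold
```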
This approach improved our detection rate for card-testing attacks on large users from 59% to 97% overnight.
This has an instant impact for our large users. But the real power of the foundation model is that these same embeddings can be applied across other tasks, like disputes or authorizations.
Perhaps even more fundamentally, it suggests that payments have semantic meaning. Just like words in a sentence, transactions possess complex sequential dependencies and latent feature interactions that simply can’t be captured by manual feature engineering.
Turns out attention was all payments needed!