Next step would be connecting your LLMs with others, enabling P2P sharing.
Think of it as a mini (private) decentralized training network, with each peer improving their model incrementally by exchanging updates or gradients.
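That peer-averaging idea can be sketched in a few lines. This is just an illustrative toy (the `Peer` class and function names are made up, and real systems like FedAvg add weighting, compression, and secure aggregation), assuming each peer's model is a flat list of float parameters:

```python
# Toy sketch of P2P model averaging (FedAvg-style).
# Peer and average_params are hypothetical names, not from any library.

from typing import List

class Peer:
    def __init__(self, params: List[float]):
        self.params = params

    def local_update(self, gradient: List[float], lr: float = 0.1) -> None:
        # One local gradient step on this peer's own (private) data.
        self.params = [p - lr * g for p, g in zip(self.params, gradient)]

def average_params(peers: List[Peer]) -> List[float]:
    # Each peer shares its parameters; all adopt the element-wise mean.
    n = len(peers)
    dim = len(peers[0].params)
    avg = [sum(peer.params[i] for peer in peers) / n for i in range(dim)]
    for peer in peers:
        peer.params = list(avg)
    return avg

peers = [Peer([1.0, 2.0]), Peer([3.0, 4.0])]
print(average_params(peers))  # → [2.0, 3.0]
```

Exchanging gradients instead of weights works the same way, just averaging the `gradient` lists before each peer applies `local_update`.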

Oct 15, 04:18
How long until we have a ChatGPT-like model we get to train on our own data, one that can learn from our historical conversations, and that we can run on a paid server or locally (no data leakage)?
I'd most definitely use that.
Not an expert in the field, so these may be utopian dreams, but I'd guess it's technically feasible for small and medium models.