Next step would be connecting your LLMs with others, enabling P2P sharing. Think of it as a mini (private) decentralized training network, with each peer improving their model incrementally by exchanging updates or gradients.
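The exchange step described above can be sketched in a few lines. This is a hypothetical illustration of one sync round in federated-averaging style (each peer trains locally, then merges parameters received from other peers by element-wise averaging); the function and variable names are illustrative, not any specific library's API.

```python
def average_updates(local_params, peer_updates):
    """Merge parameter vectors from peers by simple element-wise averaging.

    local_params: this peer's parameters after its local training step.
    peer_updates: list of parameter vectors received from other peers.
    """
    merged = []
    for i, value in enumerate(local_params):
        total = value + sum(update[i] for update in peer_updates)
        merged.append(total / (1 + len(peer_updates)))
    return merged


# Three peers end a local training step with slightly different weights.
local = [0.10, 0.50, -0.30]
from_peers = [
    [0.12, 0.48, -0.28],  # parameters received from peer A
    [0.08, 0.52, -0.32],  # parameters received from peer B
]

merged = average_updates(local, from_peers)
print(merged)  # each weight becomes the mean across the three peers
```

In a real network each round would also need transport, peer discovery, and some defense against bad updates (e.g. clipping or outlier rejection), but the core incremental-improvement loop is just repeated local training plus a merge like this.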
definikola · Oct 15, 04:18
How long until we have a ChatGPT-like model that we can train on our own data, that learns from past conversations, and that we can pay to host on a server or run locally (no data leakage)? I'd most definitely use that.
Not an expert in the field, though, so these are probably utopian dreams, but I'd guess it's technically possible for small and medium models.