A Peer-to-Peer Decentralized Large Language Model
Trained by everyone. Owned by everyone.
Proof of Training
- Any participant can train the model and validate or accept proposed blocks containing new training iterations.
- Because the model is sharded, a node does not need to load the complete model to perform its share of the computation.
- Peers share training data, and only the highest-quality training iterations are accepted in a new block.
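The acceptance rule above can be sketched in code. This is a minimal, hypothetical illustration, not the actual protocol: the names (`ProposedBlock`, `evaluate_loss`, `validate_block`), the toy one-parameter model, and the "honest claim + held-out improvement" acceptance criterion are all assumptions made for the example.

```python
import hashlib
import json
from dataclasses import dataclass

# Hypothetical sketch of proof-of-training validation. All names and the
# acceptance rule are illustrative assumptions, not the real protocol.

@dataclass
class ProposedBlock:
    proposer: str
    weight_delta: list    # proposed parameter update (toy: a flat list)
    claimed_loss: float   # loss the proposer claims after applying the delta
    prev_hash: str

def evaluate_loss(weights, dataset):
    # Toy stand-in for a real validation pass: mean squared error of a
    # one-parameter linear model y = w * x on a shared held-out batch.
    w = weights[0]
    return sum((w * x - y) ** 2 for x, y in dataset) / len(dataset)

def validate_block(block, current_weights, holdout, tolerance=1e-6):
    """Validators re-run evaluation themselves: accept only if the update
    genuinely improves held-out loss and the proposer's claim is honest."""
    new_weights = [w + d for w, d in zip(current_weights, block.weight_delta)]
    actual = evaluate_loss(new_weights, holdout)
    baseline = evaluate_loss(current_weights, holdout)
    honest = abs(actual - block.claimed_loss) <= tolerance
    improved = actual < baseline
    return honest and improved, actual

def block_hash(block):
    # Chain linkage: each accepted block commits to its predecessor.
    payload = json.dumps([block.proposer, block.weight_delta,
                          block.claimed_loss, block.prev_hash])
    return hashlib.sha256(payload.encode()).hexdigest()

# Among competing valid proposals, accept the highest-quality (lowest-loss) one.
holdout = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # held-out batch for y = 2x
weights = [1.0]
proposals = [
    ProposedBlock("alice", [0.5], evaluate_loss([1.5], holdout), "genesis"),
    ProposedBlock("bob",   [0.9], evaluate_loss([1.9], holdout), "genesis"),
]
valid = [(b, loss) for b in proposals
         for ok, loss in [validate_block(b, weights, holdout)] if ok]
best, best_loss = min(valid, key=lambda t: t[1])
```

The key design point this sketch captures is that validators never trust the claimed loss: they recompute it on a shared held-out batch, so only updates that measurably improve the model can enter a block.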
Features
- Privacy: Sharding the model and distributing computing avoids centralized collection and control of user data or the model itself.
- Accessibility: Incredibly powerful AI is accessible from anywhere in the world, with no need to synchronize or download the complete model locally.
- Customization: The model can be tailored to specific use cases.
- Don't have hardware with sufficient computing power to use the model? Request computing power from peers across the network in exchange for on-chain tokens.
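The sharding idea behind these features can be sketched as layer-wise pipeline partitioning: each peer holds only a contiguous slice of the model's layers, and activations hop from peer to peer. Everything here is an illustrative assumption (the `Peer` class, the toy affine "layers", the two-peer split), not the project's actual partitioning scheme.

```python
# Hypothetical sketch of layer-sharded inference: no single node ever
# loads the complete model, matching the accessibility/privacy claims.

def make_layer(scale, bias):
    # Toy stand-in for a transformer layer: an affine map on a scalar.
    return lambda x: scale * x + bias

class Peer:
    """Holds only its own shard: a contiguous slice of the model's layers."""
    def __init__(self, name, layers):
        self.name = name
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# The full "model": four layers, split across two peers (two layers each).
full_model = [make_layer(2.0, 1.0), make_layer(1.0, -3.0),
              make_layer(0.5, 0.0), make_layer(3.0, 2.0)]
peers = [Peer("peer-0", full_model[:2]), Peer("peer-1", full_model[2:])]

def distributed_forward(x, pipeline):
    # Activations are forwarded peer to peer in pipeline order; a client
    # only needs to reach the first peer, not download any weights.
    for peer in pipeline:
        x = peer.forward(x)
    return x

result = distributed_forward(4.0, peers)
```

Because each peer applies its slice in order, the pipelined result is identical to running all four layers on one machine; the trade-off is network latency per hop in exchange for a per-node memory footprint of only one shard.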
Why?
Certain organizations may fail to deliver AI technologies to everyone, resulting in these technologies being corporatized.
∇ GradientCoin could be a solution for everyone, allowing participants to create and maintain their own AI that cannot be shut down.
- Rather than mining crypto tokens, participants lend their GPU resources to collaboratively build and maintain a robust, community-driven AI.
- Decentralized AI is the future, providing open and affordable access for everyone.