A Peer-to-Peer Decentralized Large Language Model

Trained by everyone. Owned by everyone.

Proof of Training

  • Any participant can train the model and validate or accept proposed blocks containing new training iterations.
  • The model is sharded, so no node has to load the complete model to perform computation.
  • Peers share training data, and only the highest-quality training iterations are accepted into a new block (a sketch of this validation flow follows the list).
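
A minimal sketch of how block validation could work under this scheme, assuming a simple lowest-loss acceptance rule. `TrainingBlock`, `validate_block`, the `reevaluate` callback, and the drift threshold are hypothetical names and values for illustration, not the project's actual API:

```python
from dataclasses import dataclass
import hashlib
import json

@dataclass
class TrainingBlock:
    prev_hash: str        # hash of the previously accepted block
    weights_delta: bytes  # serialized parameter update (hypothetical format)
    eval_loss: float      # proposer's self-reported loss on the shared data

    def block_hash(self) -> str:
        payload = json.dumps({
            "prev": self.prev_hash,
            "delta": hashlib.sha256(self.weights_delta).hexdigest(),
            "loss": self.eval_loss,
        }, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

def validate_block(block: TrainingBlock, best_loss: float, reevaluate) -> bool:
    """Accept a proposed block only if its training iteration improves on
    the best iteration accepted so far."""
    # Re-run the evaluation locally on the shared validation data rather
    # than trusting the proposer's self-reported loss.
    measured = reevaluate(block.weights_delta)
    # Tolerate small numerical drift between heterogeneous nodes
    # (the 1e-3 threshold is an assumption).
    if abs(measured - block.eval_loss) > 1e-3:
        return False
    # Only the highest-quality (lowest-loss) iteration extends the chain.
    return measured < best_loss
```

Re-evaluating on the shared data is what makes acceptance trustless: a block extends the chain only when independent peers measure the same improvement.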

Features

  • Privacy: Sharding the model and distributing computation avoids centralized collection and control of user data and of the model itself (see the sharding sketch after this list).
  • Accessibility: Powerful AI is usable from anywhere in the world without synchronizing or downloading the complete model locally.
  • Customization: The model can be adapted to specific use cases.
  • Don't have hardware with enough computing power to run the model?

    Request computing power from peers across the network in exchange for on-chain tokens (see the compute-rental sketch below).
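
A minimal sketch of the sharding idea behind the Privacy and Accessibility points, using PyTorch. The `Shard` class and the toy layer split are assumptions for illustration; the point is that each node holds and runs only its assigned slice of the model:

```python
import torch
import torch.nn as nn

class Shard(nn.Module):
    """One node's slice of the model: a contiguous range of layers."""
    def __init__(self, layers: list[nn.Module]):
        super().__init__()
        self.layers = nn.ModuleList(layers)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            hidden = layer(hidden)
        return hidden

# Toy 8-layer model; this node holds only layers 2 and 3.
all_layers = [nn.Linear(64, 64) for _ in range(8)]
my_shard = Shard(all_layers[2:4])

# In the live network, activations would arrive from the peer holding
# layers 0-1 and the output would be forwarded to the peer holding 4-5,
# so this node never holds the complete model.
incoming = torch.randn(1, 64)
outgoing = my_shard(incoming)
```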
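And a minimal sketch of requesting compute from peers in exchange for on-chain tokens. The offer format, the pricing unit, and the `send_job` escrow stand-in are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ComputeOffer:
    peer_id: str
    tokens_per_step: int  # price quoted by the peer (hypothetical unit)

def request_inference(prompt: str, offers: list[ComputeOffer],
                      balance: int, send_job) -> str:
    """Rent computation from the cheapest affordable peer."""
    affordable = [o for o in offers if o.tokens_per_step <= balance]
    if not affordable:
        raise RuntimeError("insufficient on-chain tokens for any offer")
    best = min(affordable, key=lambda o: o.tokens_per_step)
    # In a real exchange, tokens would be escrowed on-chain and released
    # once the peer returns a verified result; send_job stands in for
    # that round trip.
    return send_job(best.peer_id, prompt, best.tokens_per_step)
```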