Free · Open source · MIT
The Infernet Protocol Book
The complete guide to decentralized GPU inference — for the operators putting hardware on the network, the developers building on top of it, and the contributors shaping the protocol.
Sources live at docs/book on GitHub — pull requests welcome.
Who it's for
Node operators
You have an NVIDIA, AMD, or Apple Silicon machine and want to earn crypto running LLM inference. Covers hardware sizing, installation, monitoring, and payouts.
Start chapter →
App developers
You want OpenAI-compatible APIs without being locked into a single provider. Covers REST and streaming chat (SSE), the job lifecycle, and error handling, with examples in JS and Python.
Start chapter →
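As a taste of what the streaming chat coverage looks like: OpenAI-compatible endpoints stream responses as Server-Sent Events, each `data:` line carrying a JSON chunk with a content delta, terminated by `data: [DONE]`. A minimal sketch of parsing such a stream body (the sample payload below is illustrative, not taken from a live node):

```python
import json

def parse_sse_chunks(raw: str):
    """Yield content deltas from an OpenAI-style SSE stream body.

    Assumes the standard `data: {...}` framing with a `data: [DONE]`
    terminator used by OpenAI-compatible chat completion endpoints.
    """
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blank separators and non-data fields
        payload = line[len("data: "):]
        if payload == "[DONE]":
            return  # end-of-stream sentinel
        event = json.loads(payload)
        delta = event["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Example stream body: a role announcement, two content deltas, terminator.
stream = (
    'data: {"choices": [{"delta": {"role": "assistant"}}]}\n\n'
    'data: {"choices": [{"delta": {"content": "Hello"}}]}\n\n'
    'data: {"choices": [{"delta": {"content": ", world"}}]}\n\n'
    'data: [DONE]\n\n'
)
print("".join(parse_sse_chunks(stream)))  # Hello, world
```

In a real client you would iterate over the HTTP response line-by-line rather than a pre-collected string; the chapter covers the full request/response lifecycle.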
Protocol contributors
You want to work on the protocol itself: Nostr-style secp256k1 auth, Compute Payment Receipts, multi-chain wallets, and the IPIP-0028 model key hierarchy.
Start chapter →
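For a flavor of the auth material: Nostr-style authentication builds on NIP-01, where an event's id is the SHA-256 of a canonical JSON serialization, and that id is what gets signed with the secp256k1 key. A minimal sketch of the id computation — the kind number and tag shown are placeholders for illustration, not the protocol's actual auth-event schema:

```python
import hashlib
import json

def nostr_event_id(pubkey_hex: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Compute a Nostr (NIP-01) event id: the SHA-256 of the canonical
    JSON array [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, pubkey_hex, created_at, kind, tags, content],
        separators=(",", ":"),  # no whitespace, per NIP-01
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Hypothetical auth event: the kind, tag, and URL below are placeholders,
# not values defined by the protocol.
event_id = nostr_event_id(
    pubkey_hex="ab" * 32,  # placeholder 32-byte pubkey in hex
    created_at=1700000000,
    kind=27235,            # placeholder kind for an HTTP-auth-style event
    tags=[["u", "https://node.example/api/v1/chat/completions"]],
    content="",
)
print(event_id)  # 64-char hex digest; this is what the secp256k1 key signs
```

The actual signing step (a Schnorr signature over this digest) needs a secp256k1 library and is covered in the chapter.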
