Build an LLM Proxy
This project is a fork of the , available at .
Run the DePHY vending machine workers and backend with: `docker compose up`
Ensure all dependencies are installed before running the application.
The LLM Proxy controller enables token-based access to a large language model via the DePHY messaging network. A key feature is the `Transaction` event, defined in `DephyDsProxyMessage`, which serves two critical purposes:
Controller-Issued Transaction: After a user recharges via Solana (verified with `dephy_balance_payment_sdk::pay`), the controller deducts the payment and emits a `Transaction` event. This grants the user a number of tokens for accessing the large language model, proportional to the recharge amount (e.g., 100 lamports = 100 tokens). Example:
Event: `{ user: "user_pubkey", tokens: 100 }`
Purpose: Signals that the user now has tokens available.
Backend-Issued Transaction: When the user interacts with the LLM through the backend and consumes tokens (e.g., during a chat session), the backend emits a `Transaction` event to reflect the deduction. This updates the user’s token balance. Example:
Event: `{ user: "user_pubkey", tokens: -10 }`
Purpose: Indicates 10 tokens were consumed, reducing the user’s balance (both event shapes are sketched below).
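
For illustration, here is a minimal Rust sketch of the two event shapes above. The struct name, field types, and serde usage are assumptions; the authoritative definition is the `Transaction` event in `DephyDsProxyMessage`.

```rust
use serde::{Deserialize, Serialize};

/// Hypothetical sketch of the Transaction payload, mirroring the examples
/// above; the real definition lives in `DephyDsProxyMessage`.
#[derive(Debug, Serialize, Deserialize)]
struct Transaction {
    /// Solana public key of the user whose balance changes.
    user: String,
    /// Positive = tokens granted by the controller,
    /// negative = tokens consumed via the backend.
    tokens: i64,
}

fn main() -> Result<(), serde_json::Error> {
    // Controller-issued: 100 lamports recharged -> 100 tokens granted.
    let granted = Transaction { user: "user_pubkey".into(), tokens: 100 };
    // Backend-issued: 10 tokens consumed during a chat session.
    let consumed = Transaction { user: "user_pubkey".into(), tokens: -10 };

    println!("{}", serde_json::to_string(&granted)?);  // {"user":"user_pubkey","tokens":100}
    println!("{}", serde_json::to_string(&consumed)?); // {"user":"user_pubkey","tokens":-10}
    Ok(())
}
```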
These `Transaction` events are published to the Nostr relay with the `p` tag (e.g., `"dephy_dsproxy-controller"`) and `s` tag (e.g., `"machine_pubkey"`), allowing both the controller and backend to track and synchronize token balances efficiently.
Node: The controller listens for user `Request` events to recharge tokens, then processes the payment and grants tokens in a single step using `pay` via Solana, issuing `Transaction` events.
Backend: The backend subscribes to `Transaction` events to index token balances. When LLM usage consumes tokens, the backend publishes `Transaction` events with negative values, which the controller can log or validate, ensuring synchronized token tracking via the messaging layer.
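
A balance index over these events is just a running sum per user: controller-issued events add tokens, backend-issued events subtract them. The sketch below shows the idea with the hypothetical `Transaction` struct from earlier; it is not the project's actual indexer.

```rust
use std::collections::HashMap;

/// Same hypothetical event shape as the sketch above.
struct Transaction {
    user: String,
    tokens: i64,
}

/// Fold a stream of Transaction events (positive grants from the controller,
/// negative consumption from the backend) into per-user balances.
fn index_balances(events: &[Transaction]) -> HashMap<String, i64> {
    let mut balances = HashMap::new();
    for ev in events {
        *balances.entry(ev.user.clone()).or_insert(0) += ev.tokens;
    }
    balances
}

fn main() {
    let events = vec![
        Transaction { user: "user_pubkey".into(), tokens: 100 }, // recharge
        Transaction { user: "user_pubkey".into(), tokens: -10 }, // chat usage
    ];
    let balances = index_balances(&events);
    assert_eq!(balances["user_pubkey"], 90);
    println!("{:?}", balances); // {"user_pubkey": 90}
}
```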
Uses `pay` to deduct payment and grant tokens:
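
A minimal sketch of that flow, assuming a `pay(user, lamports)` signature and a 1 lamport = 1 token grant; only the existence of `dephy_balance_payment_sdk::pay` comes from the project, everything else here is a placeholder.

```rust
use anyhow::Result;

/// Hypothetical recharge handler: settle the Solana payment through the SDK,
/// then emit a Transaction event granting tokens. The `pay` arguments below
/// are assumed, not the crate's real signature.
async fn handle_recharge(user_pubkey: &str, lamports: u64) -> Result<()> {
    // Deduct the payment on Solana (assumed signature, see note above).
    dephy_balance_payment_sdk::pay(user_pubkey, lamports).await?;

    // Grant tokens proportional to the recharge, e.g. 100 lamports = 100 tokens.
    let tokens = lamports as i64;

    // Publish a Transaction event so the backend can index the new balance.
    publish_transaction(user_pubkey, tokens).await
}

/// Placeholder for the event-publishing step sketched earlier; the real
/// controller would sign and send a Nostr event with the `p` and `s` tags.
async fn publish_transaction(user: &str, tokens: i64) -> Result<()> {
    println!("Transaction {{ user: {user}, tokens: {tokens} }}");
    Ok(())
}
```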