Introduction to the APUS AI Inference Service for the AO Hackathon
Like llama-herder (https://github.com/permaweb/llama-herder), our service is designed to be incredibly easy to use, so you can focus on building amazing things.
This guide will walk you through everything from making your first AI call to integrating our service into a full-stack dApp. Let’s get started!
Wait for the "APM client loaded" logs to appear. If they do not appear after a long time, try entering any command in the AOS console to check whether the system is stuck.
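For reference, a typical setup sequence in the AOS console looks like the sketch below. The `.load-blueprint apm` command and the `@apus/ai-lib` package name are assumptions based on common APM usage, not confirmed by this guide; use the exact commands and package name given in the setup step.

```lua
-- Sketch of the setup sequence in the AOS console (names are assumptions).

-- 1. Load the APM client, then wait for the "APM client loaded" logs.
.load-blueprint apm

-- 2. Install the APUS AI library through APM ("@apus/ai-lib" is a placeholder).
apm.install("@apus/ai-lib")
```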
For questions or support during the hackathon, reach out in the #📥|apus-hackathon-agent channel.
Set ApusAI_Debug to enable inference result logging. If it is left unset, no logs will be shown.
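As a concrete illustration, the hedged sketch below enables debug logging and runs a single inference call. The require path and the ApusAI.infer(prompt, options, callback) signature are assumptions for illustration; check the API reference in this guide for the authoritative names.

```lua
-- Hedged sketch: enable debug logging, then make one inference call.
-- The package name and infer() signature are assumptions for illustration.
ApusAI_Debug = true                      -- leave unset (nil) to keep logs off

local ApusAI = require("@apus/ai-lib")   -- placeholder package name

ApusAI.infer("Summarize the AO protocol in one sentence.", {}, function(err, res)
  if err then
    print("Inference error: " .. tostring(err.message or err))
    return
  end
  -- With ApusAI_Debug set, the inference result is also logged automatically.
  print("Result: " .. tostring(res.data))
end)
```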
Default balance: 5,000,000,000,000 units (equivalent to 5 credits, enough for 5 inference calls). Credits are non-transferable.
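To make the unit math explicit, here is a small sketch under the pricing stated above (one credit, i.e. 1,000,000,000,000 units, per inference call):

```lua
-- Sketch of the credit arithmetic described above.
local UNITS_PER_CREDIT = 1000000000000          -- 1 credit = 10^12 units
local DEFAULT_BALANCE  = 5 * UNITS_PER_CREDIT   -- 5,000,000,000,000 units

-- Each inference call costs one credit, so the default balance covers:
print("Inference calls available: " .. (DEFAULT_BALANCE // UNITS_PER_CREDIT))  -- 5
```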