Understanding the APUS AI Inference Service architecture and limitations
Before you begin, make sure you have the following:

- The aos command-line interface (CLI): this is the primary tool for interacting with the AO ecosystem. If you don't have it installed, open your terminal and install it before continuing.
- If you're new to aos or want a refresher, we highly recommend reading the official AO Cookbook guide on connecting aos with HyperBEAM nodes.
- An aos process: your process ID is your identity on the network and is required to receive credits.
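If you are unsure of your process ID, you can print it from inside the aos prompt. This is a minimal sketch using the standard ao.id field that aos exposes to every process:

```lua
-- Inside the aos prompt: ao.id holds the ID of the current process.
-- Share this ID when requesting credits for the APUS AI service.
print("My process ID: " .. ao.id)
```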
You must connect your aos terminal to our dedicated HyperBEAM node at http://72.46.85.207:8734 to interact with our service; if you try to send messages on the default legacynet, they will not be processed. Start by launching your process as usual:

aos my_process
Inside the aos prompt, add the following address to your process's authorities (the list of addresses whose messages your process treats as trusted):

ao.authorities[2] = "lpJ5Edz_8DbNnVDL0XdbsY9vCOs45NACzfI4jvo4Ba8"
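For context, this is what the authorities list is used for. The sketch below is illustrative only: it assumes the standard Handlers and ao.isTrusted helpers that ship with aos, and the "Some-Action" tag is a placeholder, not part of the APUS protocol.

```lua
-- Illustrative handler: ao.isTrusted(msg) returns true when the message
-- comes from an address listed in ao.authorities (like the one added above).
Handlers.add(
  "trusted-only-example",
  Handlers.utils.hasMatchingTag("Action", "Some-Action"), -- placeholder tag
  function(msg)
    if not ao.isTrusted(msg) then
      print("Ignoring untrusted message from " .. (msg.From or "unknown"))
      return
    end
    print("Trusted message received from " .. msg.From)
  end
)
```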
Next, load the APM client blueprint:

.load-blueprint apm

Wait for the "APM client loaded" logs to appear. If they do not appear after a long time, try entering any command in the AOS console to check whether the system is stuck. Then install the APUS AI package:
apm.install "@apus/ai"
Once the installation finishes, exit aos with ctrl + c and reconnect your process to the APUS HyperBEAM node:

aos my_process --mainnet http://72.46.85.207:8734
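After reconnecting, you can sanity-check the setup from the aos prompt. This is a rough sketch under the assumption that APM-installed packages are loadable with require() by their package name; if that does not hold for your APM version, consult the APM documentation.

```lua
-- Confirm the authority added earlier is present.
print("Authorities: " .. table.concat(ao.authorities, ", "))

-- Assumption: APM packages are require()-able by name after apm.install.
local ok, apusAi = pcall(require, "@apus/ai")
if ok then
  print("@apus/ai loaded successfully")
else
  print('@apus/ai not found; re-run apm.install "@apus/ai"')
end
```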
A few constraints of the current service are worth keeping in mind. The AO ecosystem currently offers two environments: the legacynet and the newer, high-performance HyperBEAM. Our APUS AI Service is built exclusively on HyperBEAM to leverage its unique capabilities for GPU integration.
All requests go through one fixed address, which you use as the Target for all your interactions.
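To make the Target idea concrete, here is a minimal sketch of a raw request from the aos prompt. The APUS_PROCESS_ID value, the "Infer" action, and the "Infer-Response" tag are placeholders for illustration, not confirmed names from the APUS protocol; the @apus/ai package installed earlier is expected to wrap this messaging for you.

```lua
-- Placeholder for the real APUS service process ID from the APUS documentation.
local APUS_PROCESS_ID = "<apus-service-process-id>"

-- Send an inference request; tag names here are illustrative only.
Send({
  Target = APUS_PROCESS_ID,
  Action = "Infer",
  Data = "Summarize the AO protocol in one sentence."
})

-- Handle the (hypothetical) response message coming back from the service.
Handlers.add(
  "apus-response-example",
  Handlers.utils.hasMatchingTag("Action", "Infer-Response"),
  function(msg)
    print("AI says: " .. (msg.Data or ""))
  end
)
```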
The model is also fixed: the service currently runs Gemma3-12B. While our infrastructure supports multiple models, frequently switching between them incurs significant performance overhead. Fixing the model ensures that every developer gets fast and consistent response times.