Frequently Asked Questions
This page answers common questions about the AIsa unified model gateway, compatibility, pricing, and infrastructure.

How does AIsa compare to OpenRouter or LiteLLM?
While AIsa shares the core functionality of an LLM router (aggregating multiple models behind a single API), AIsa goes significantly further by integrating 100+ non-LLM data APIs (Twitter, Financial, Search), an MCP server, and native stablecoin micropayments for autonomous agents. AIsa is specifically built as the infrastructure layer for the agentic economy, where agents not only need to think (LLMs) but also act (data APIs) and transact (Machine-to-Machine Payments).

Do I need to change my existing OpenAI code?
No. AIsa is fully API-compatible with the OpenAI specification. If you are using the official OpenAI Python or TypeScript SDKs, you only need to change your base_url to https://api.aisa.one/v1 and authenticate using an AIsa API Key. All standard parameters like temperature, top_p, and streaming responses work exactly as expected.
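To illustrate the drop-in compatibility described above, here is a minimal sketch using only the Python standard library to build (not send) the same OpenAI-style chat completions request an SDK would produce. The base URL comes from this page; the API key is a placeholder and the model name is a hypothetical example, not a guaranteed AIsa model identifier.

```python
import json
import urllib.request

# Drop-in endpoint change: point requests at AIsa instead of api.openai.com.
AISA_BASE_URL = "https://api.aisa.one/v1"
API_KEY = "YOUR_AISA_API_KEY"  # placeholder; substitute your real AIsa key

# Standard OpenAI-style chat completions payload. The model name is a
# hypothetical example; temperature and stream work as they do with OpenAI.
payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,
    "stream": False,
}

request = urllib.request.Request(
    url=f"{AISA_BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(request.full_url)  # → https://api.aisa.one/v1/chat/completions
```

If you use the official OpenAI SDK instead, the only lines that change are the base_url and api_key passed to the client constructor; the request body above is what the SDK sends on your behalf.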
How is pricing calculated?
AIsa uses a unified usage-based billing system with no subscription fees.
- LLM inference is billed per-token based on the underlying provider’s cost.
- Data APIs (like Search, Financial, or Twitter endpoints) are billed on a flat per-call basis.
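The two billing modes above can be sketched as a simple cost estimator. All rates below are hypothetical placeholders for illustration only, not actual AIsa prices, and the model and endpoint names are made up.

```python
# Hypothetical rate tables; real prices depend on the underlying provider.
LLM_RATES_PER_1K_TOKENS = {"example-model": 0.002}       # USD per 1,000 tokens
DATA_API_FLAT_RATES = {"search": 0.005, "twitter": 0.01}  # USD per call

def llm_cost(model: str, tokens: int) -> float:
    """Per-token billing: token count scaled by the model's per-1K rate."""
    return tokens / 1000 * LLM_RATES_PER_1K_TOKENS[model]

def data_api_cost(endpoint: str, calls: int) -> float:
    """Flat per-call billing for data API endpoints."""
    return calls * DATA_API_FLAT_RATES[endpoint]

# 50,000 LLM tokens plus 20 search calls on one unified bill.
total = llm_cost("example-model", 50_000) + data_api_cost("search", 20)
print(f"${total:.2f}")  # → $0.20
```

The key point is that both meters feed a single usage-based balance, so there is no separate subscription to track for LLM inference versus data APIs.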