Frequently Asked Questions

Quick answers to common questions about Big-AGI. For detailed documentation, see our Website Docs.

Connectivity

Direct Connection lets the browser call the AI provider's API directly, skipping the Big-AGI edge server. It appears as a toggle in each AI service's Advanced settings when your API key is set client-side.

When available, it is a net win: faster, fewer restrictions, more privacy.

  • No 4.5 MB upload limit (Vercel body-size cap does not apply).
  • No 300-second timeout (Vercel function timeout does not apply; call length is bound only by the AI service).
  • More privacy - connection metadata (IP, timestamp, edge region, Vercel telemetry) is not observable by the Big-AGI edge server.
  • Slightly higher downlink usage - when proxied through the edge, Big-AGI sheds repetitive streaming frames; direct streams arrive verbatim, so they consume a bit more bandwidth.
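To illustrate the last point, here is a minimal sketch of the kind of frame coalescing an edge proxy can perform on a streamed response. This is not Big-AGI's actual implementation; the `Frame` type and `maxBatch` threshold are assumptions for the example.

```typescript
// Hypothetical sketch: an edge proxy can merge consecutive streaming
// text deltas into fewer, larger frames before relaying them downstream,
// reducing the bandwidth the client receives. A direct connection skips
// this step, so every provider frame arrives verbatim.
type Frame = { text: string };

function coalesceFrames(frames: Frame[], maxBatch: number): Frame[] {
  const out: Frame[] = [];
  let buffer = '';
  for (const frame of frames) {
    buffer += frame.text;
    // Flush once the buffer reaches the batch threshold.
    if (buffer.length >= maxBatch) {
      out.push({ text: buffer });
      buffer = '';
    }
  }
  // Flush any trailing partial batch.
  if (buffer) out.push({ text: buffer });
  return out;
}
```

For example, three one-character deltas with `maxBatch = 2` would be relayed as two frames instead of three.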

When it is unavailable:

  1. Server-side keys - if the deployment stores API keys in server environment variables, the browser has no credential to send directly.
  2. Provider does not allow CORS - browsers cannot call APIs that block cross-origin requests. Most major providers permit it; Big-AGI sets any required headers.
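When both conditions are met, a direct connection amounts to the browser calling the provider itself. The sketch below shows how such a request could be assembled; the endpoint path and header names follow the common OpenAI-compatible convention and are illustrative, not Big-AGI's internal API.

```typescript
// Illustrative only: build the request a browser would send straight to
// an OpenAI-compatible chat endpoint, bypassing the Big-AGI edge server.
// The endpoint path and header names follow the OpenAI API convention;
// other providers differ.
interface DirectRequest {
  url: string;
  method: 'POST';
  headers: Record<string, string>;
  body: string;
}

function buildDirectChatRequest(baseUrl: string, apiKey: string, payload: object): DirectRequest {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // On a direct connection the client-side key travels straight to
      // the provider; the Big-AGI edge server never sees the request.
      'Authorization': `Bearer ${apiKey}`,
    },
    body: JSON.stringify(payload),
  };
}
```

The provider must still answer the browser's cross-origin preflight for this to work, which is exactly the CORS requirement described above.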

Versions

You can see the version in the app's News section, as shown in the image below.

Version location in Big-AGI

You can open the Deployments section of your Vercel project to see, at a glance, each deployment's status, time, and a link to its source code.

Vercel deployments view

Each deployment links directly to its source code commit.


Missing something? Open an issue or join our Discord.

© 2026 Token Fabrics · Built with passion in San Diego