Welcome to the big-AGI Installation Guide. Whether you're a developer eager to explore, a system integrator, or an enterprise looking for a white-label solution, this guide ensures a smooth setup process for your own instance of big-AGI and related products.
Try big-AGI: you don't need to install anything to play with big-AGI, as long as you have API keys for the model services you want to use. You can access our free instance on big-AGI.com, which runs the latest `main-stable` branch from this repository.
If you want to change the code, have a deeper configuration, add your own models, or run your own instance, follow the steps below.
Prerequisites: Git and a recent Node.js (with npm) installed on your machine.
Steps:
```bash
git clone https://github.com/enricoros/big-AGI.git
cd big-AGI
npm install
npm run dev
```

Your big-AGI instance is now running at http://localhost:3000.
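If you want to pre-set server-side API keys before starting the dev server, you can place them in a `.env.local` file in the project root. The variable names below are examples, not an exhaustive list; verify them against the environment template shipped in the repository:

```bash
# .env.local — example only; check the repository for the exact variable names
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...
```

Keys set this way stay on the server and don't need to be entered in the browser UI.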
The production build is optimized for performance, and starts from the same clone and `npm install` steps as local development.
```bash
# .. repeat the steps above up to `npm install`, then:
npm run build
# start the production server (the `npx` prefix may be optional):
npx next start --port 3000
```
Your big-AGI production instance is now running at http://localhost:3000. Want to pre-enable models, customize the interface, deploy with username/password, or alter the code to your needs? Check out the Customizations Guide for detailed instructions.
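To keep a production instance running across crashes and reboots, one common option (not part of the repository; paths and user are placeholders) is a systemd unit:

```ini
# /etc/systemd/system/big-agi.service — hypothetical unit; adjust paths to your install
[Unit]
Description=big-AGI production server
After=network.target

[Service]
WorkingDirectory=/opt/big-AGI
ExecStart=/usr/bin/npx next start --port 3000
Environment=NODE_ENV=production
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now big-agi`. Any other process supervisor (e.g. pm2) works equally well.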
To deploy big-AGI on a public server, you have several options. Choose the one that best fits your needs.
Install big-AGI on Vercel with just a few clicks.
Create your GitHub fork, create a Vercel project over that fork, and deploy it; or press the button below for convenience.
Deploy on Cloudflare's global network by installing big-AGI on Cloudflare Pages. Check out the Cloudflare Installation Guide for step-by-step instructions.
Containerize your big-AGI installation using Docker for portability and scalability. Our Docker Deployment Guide will walk you through the process, or follow the steps below for a quick start.
```bash
# 1. build the image:
docker build -t big-agi .

# 2A. if you built the image yourself:
docker run -d -p 3000:3000 big-agi
# 2B. or use the pre-built image:
docker run -d -p 3000:3000 ghcr.io/enricoros/big-agi
# 2C. or use docker-compose:
docker-compose up
```
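For the `docker-compose` path, the repository ships its own compose file, which takes precedence; as a minimal sketch of what such a file looks like (service name and `env_file` are assumptions for illustration):

```yaml
# docker-compose.yaml — minimal sketch, not the repository's authoritative file
services:
  big-agi:
    image: ghcr.io/enricoros/big-agi   # or use `build: .` to build locally
    ports:
      - "3000:3000"
    env_file: .env                     # optional: server-side API keys
    restart: unless-stopped
```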
Access your big-AGI instance at http://localhost:3000. If you deploy big-AGI behind a reverse proxy, you may want to check out the Reverse Proxy Configuration Guide.
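As an illustration of a reverse-proxy setup, here is a minimal nginx sketch (the server name is a placeholder; the Reverse Proxy Configuration Guide remains the authoritative reference):

```nginx
# minimal sketch — adapt to your domain and TLS setup
server {
    listen 80;
    server_name example.com;  # placeholder

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        # disable buffering so streamed model responses reach the client promptly
        proxy_buffering off;
    }
}
```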
Deploy big-AGI on a Kubernetes cluster for enhanced scalability and management. Follow these steps for a Kubernetes deployment:
Clone the big-AGI repository:
```bash
git clone https://github.com/enricoros/big-AGI.git
cd big-AGI
```
Configure the environment variables:
```bash
cp docs/k8s/env-secret.yaml env-secret.yaml
vim env-secret.yaml  # edit the file to set your environment variables
```
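For orientation, `env-secret.yaml` is a standard Kubernetes Secret; a minimal sketch follows (the metadata name and key names here are assumptions — keep whatever the repository's template uses, since the deployment references it by name):

```yaml
# sketch of an env secret — the repository's docs/k8s/env-secret.yaml is authoritative
apiVersion: v1
kind: Secret
metadata:
  name: env-secret          # must match the name referenced by big-agi-deployment.yaml
  namespace: ns-big-agi
type: Opaque
stringData:                 # stringData takes plain-text values; the API server encodes them
  OPENAI_API_KEY: "sk-..."  # example key name; set the variables your model services need
```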
Apply the Kubernetes configurations:
```bash
kubectl create namespace ns-big-agi
kubectl apply -f docs/k8s/big-agi-deployment.yaml -f env-secret.yaml
```
Verify the deployment:
```bash
kubectl -n ns-big-agi get svc,pod,deployment
```
Access the big-AGI application:
```bash
kubectl -n ns-big-agi port-forward service/svc-big-agi 3000:3000
```
Your big-AGI instance is now accessible at http://localhost:3000.
For more detailed instructions on Kubernetes deployment, including updating and troubleshooting, refer to our Kubernetes Deployment Guide.
Follow the instructions on the Midori AI Subsystem site for your host OS. After completing the setup process, install the big-AGI Docker backend into the Midori AI Subsystem.
For businesses seeking a fully-managed, scalable solution, consider our managed installations. Enjoy all the features of big-AGI without the hassle of infrastructure management. Contact us at hello@big-agi.com to learn more.
Join our vibrant community of developers, researchers, and AI enthusiasts. Share your projects, get help, and collaborate with others.
For any questions or inquiries, please don't hesitate to reach out to our team.