LocalAI x big-AGI
LocalAI lets you run your AI models locally or in the cloud. It supports text, image, ASR (speech-to-text), speech synthesis, and other model types.
We are deepening the integration between the two products. At the time of writing, the following features are integrated:
Last updated: Feb 21, 2024
Follow the guide at: https://localai.io/basics/getting_started/
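For a quick local setup, LocalAI can also be started with Docker. This is a minimal sketch assuming a CPU-only machine; the image tag and flags change between releases, so check the getting-started guide for the current ones:

```bash
# Start a CPU-only LocalAI instance listening on port 8080
# (image tag is an assumption; see the getting-started guide for the current tag)
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
```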
Point big-AGI to http://localhost:8080, or to the address of your LocalAI cloud instance.
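Before adding the source in big-AGI, you can verify that the address is reachable. LocalAI exposes an OpenAI-compatible API, so listing the served models is a quick sanity check:

```bash
# List the models currently served by the LocalAI instance
curl http://localhost:8080/v1/models
```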
Models 🔄
In addition to using the UI, configuration can also be done using environment variables.
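As a sketch, assuming the variable names below match big-AGI's .env.example (verify them there before relying on them), a server-side default could look like this:

```
# Hypothetical variable names: check big-AGI's .env.example for the exact ones
LOCALAI_API_HOST=http://localhost:8080
# Only needed if your LocalAI instance is protected by an API key
LOCALAI_API_KEY=<your key>
```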
If the running LocalAI instance is configured with a Model Gallery, a Gallery Admin option is also available for managing gallery models.
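As an alternative to the UI, the gallery can also be driven directly through LocalAI's HTTP API. A sketch, with the model identifier as a placeholder:

```bash
# Browse the models available in the configured galleries
curl http://localhost:8080/models/available

# Ask LocalAI to install a gallery model (the identifier below is a placeholder)
curl http://localhost:8080/models/apply \
  -H "Content-Type: application/json" \
  -d '{"id": "model-gallery@my-model"}'
```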
At the time of writing, LocalAI does not publish the model context window size. Every model is therefore assumed to be capable of chatting, with a context window of 4096 tokens.
Please update the src/modules/llms/transports/server/openai/models.data.ts file with the mapping between LocalAI model IDs and their names, descriptions, token limits, etc.
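As an illustration only, an entry in that mapping could take roughly the following shape. The names and types below are hypothetical; the actual structures are defined in models.data.ts:

```typescript
// Hypothetical types, for illustration only; the real ones live in models.data.ts
interface LocalAIModelMapping {
  idPrefix: string;       // matched against the model ID reported by LocalAI
  label: string;          // display name shown in the big-AGI UI
  description: string;    // short human-readable description
  contextWindow: number;  // token limit to use instead of the 4096-token default
}

// Example entry: the model ID and numbers are placeholders
const localAIModelMappings: LocalAIModelMapping[] = [
  {
    idPrefix: 'luna-ai-llama2',
    label: 'Luna AI Llama2 Uncensored',
    description: 'Llama2-based chat model served by LocalAI',
    contextWindow: 4096,
  },
];
```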