Configuration

You can configure your LLMStack installation by editing the .env file in the root directory of the installation. The .env file is a plain text file with key-value pairs that you can edit with any text editor. The available configuration options are listed in the table below, followed by an example snippet:

| Key | Description | Default Value |
| --- | --- | --- |
| SECRET_KEY | Secret key for your installation. This key is used for signing data. Make sure to change it. | Hardcoded random string |
| CIPHER_KEY_SALT | Salt used to encrypt user keys and other sensitive data. | salt |
| DATABASE_PASSWORD | Password for the database user. This is used when setting up the initial database user and for connecting to the database later. | llmstack |
| POSTGRES_VOLUME | Path to the directory where the database data will be stored. By default, data is stored in /tmp, which is ephemeral. Change this to a persistent directory if you want to persist the data. | /tmp/postgres_llmstack |
| REDIS_VOLUME | Path to the directory where the Redis data will be stored. By default, data is stored in /tmp, which is ephemeral. Change this to a persistent directory if you want to persist the data. | /tmp/redis_llmstack |
| WEAVIATE_VOLUME | Path to the directory where the Weaviate data will be stored. By default, data is stored in /tmp, which is ephemeral. Change this to a persistent directory if you want to persist the data. | /tmp/weaviate_llmstack |
| LLMSTACK_PORT | Port on which the LLMStack web server will listen. | 3000 |
| LOG_LEVEL | Log level for the LLMStack web server. | ERROR |
| ALLOWED_HOSTS | Comma-separated list of allowed hosts for the LLMStack API server. If you are running LLMStack on a non-localhost domain, add that host to this list. | localhost |
| CSRF_TRUSTED_ORIGINS | Comma-separated list of trusted origins. If you are running LLMStack on a non-localhost domain, add the domain to this list. | http://127.0.0.1:{LLMSTACK_PORT},http://localhost:{LLMSTACK_PORT} |
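For example, a .env for a deployment served on a custom domain with persistent data directories might look like the sketch below. The domain, port, paths, and secret values are illustrative placeholders; adjust them to your environment.

```bash
# .env — core settings (all values below are placeholders)
SECRET_KEY=replace-with-a-long-random-string
CIPHER_KEY_SALT=replace-with-a-random-salt
DATABASE_PASSWORD=replace-with-a-strong-password

# Persist data outside /tmp so it survives restarts
POSTGRES_VOLUME=/data/llmstack/postgres
REDIS_VOLUME=/data/llmstack/redis
WEAVIATE_VOLUME=/data/llmstack/weaviate

# Web server
LLMSTACK_PORT=3000
LOG_LEVEL=ERROR

# Required when serving LLMStack on a non-localhost domain (example domain)
ALLOWED_HOSTS=localhost,llmstack.example.com
CSRF_TRUSTED_ORIGINS=http://localhost:3000,https://llmstack.example.com
```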

Default Platform Keys

You can set default keys for providers like OpenAI, Cohere, etc. for all apps from the .env file. These keys are used by all apps unless individual users override them from their settings page. To run LLMs locally, you can also set up LocalAI and use it from LLMStack by configuring the LocalAI endpoint and key. The available keys are listed in the table below, followed by an example:

| Key | Description | Default Value |
| --- | --- | --- |
| DEFAULT_OPENAI_API_KEY | Default OpenAI API key for ChatGPT, image generation, Whisper, and other models from OpenAI. | None |
| DEFAULT_DREAMSTUDIO_API_KEY | Default DreamStudio API key to use for all apps for Stability models. | None |
| DEFAULT_AZURE_OPENAI_API_KEY | Default Azure OpenAI API key if the user wants to use Azure OpenAI. | None |
| DEFAULT_COHERE_API_KEY | Default Cohere API key to use for all apps. | None |
| DEFAULT_FOREFRONTAI_API_KEY | Default ForefrontAI API key to use for all apps. | None |
| DEFAULT_ELEVENLABS_API_KEY | Default Eleven Labs API key for the text-to-speech processor. | None |
| DEFAULT_ANTHROPIC_API_KEY | Default Anthropic API key for models like Claude. | None |
| DEFAULT_LOCALAI_API_KEY | Default LocalAI API key for your LocalAI installation. | None |
| DEFAULT_LOCALAI_BASE_URL | Default LocalAI base URL of the installation. | None |
| DEFAULT_AWS_SECRET_ACCESS_KEY | Default AWS Secret Access Key to use for all apps. | None |
| DEFAULT_AWS_DEFAULT_REGION | Default AWS Default Region to use for all apps. | None |
| DEFAULT_AWS_ACCESS_KEY_ID | Default AWS Access Key ID to use for all apps. | None |
| DEFAULT_GOOGLE_SERVICE_ACCOUNT_JSON_KEY | Default Google service account JSON key for Google's Vertex AI offering. | None |
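For example, to provide OpenAI and Anthropic keys to all apps and point LLMStack at a local LocalAI instance, the .env might include entries like the sketch below. All values, including the LocalAI URL and key, are placeholders; substitute your own credentials and endpoint.

```bash
# .env — default provider keys (all values below are placeholders)
DEFAULT_OPENAI_API_KEY=sk-your-openai-key
DEFAULT_ANTHROPIC_API_KEY=sk-ant-your-anthropic-key

# Point LLMStack at a LocalAI instance (example endpoint and key)
DEFAULT_LOCALAI_BASE_URL=http://localhost:8080
DEFAULT_LOCALAI_API_KEY=your-localai-key
```

Individual users can still override any of these defaults with their own keys from their settings page.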