LocalAI

You can run LocalAI on your own machine and use LLMStack to build AI apps on top of locally running open-source LLMs.

Note

Make sure you have configured the LocalAI base URL and API key in LLMStack's Settings. Read more about using LocalAI in our blog post.

LocalAI provides a drop-in replacement for the OpenAI APIs. Refer to OpenAI for processor details and to LocalAI for the list of supported LLMs.
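
As a quick illustration of the drop-in compatibility, here is a minimal sketch that points the standard OpenAI Python client at a LocalAI endpoint. The base URL, API key, and model name below are assumptions for illustration; replace them with the values from your own LocalAI deployment (the same values you enter in LLMStack's Settings).

```python
# Minimal sketch: because LocalAI exposes OpenAI-compatible endpoints,
# the standard OpenAI Python client can talk to it by overriding base_url.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed LocalAI address; match your deployment
    api_key="sk-local",                   # placeholder; LocalAI ignores it unless auth is enabled
)

response = client.chat.completions.create(
    model="gpt-4",  # assumed alias; use whatever model name your LocalAI config exposes
    messages=[{"role": "user", "content": "Hello from a locally running LLM!"}],
)
print(response.choices[0].message.content)
```

LLMStack's OpenAI processors work the same way: once the LocalAI base URL and API key are set in Settings, requests that would normally go to OpenAI are routed to your local models instead.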