
Quickstart

This guide takes you from zero to a running Frona AI instance in a few minutes. Before you start, make sure you have:

  • Docker 24+ and Docker Compose v2
  • An LLM provider — either a cloud API key (Anthropic, OpenAI, etc.) or a local model via Ollama
  • At least 4 GB of available RAM
Clone the repository and change into the Docker Compose example directory:

git clone https://github.com/fronalabs/frona.git
cd frona/examples/docker-compose

Copy the example environment file:

cp env.example .env

Edit .env to set your encryption secret and configure an LLM provider. For example, to use a local model served by Ollama:

OLLAMA_API_BASE_URL=http://ollama:11434/

For all supported providers, see Environment Variables.
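The encryption secret should be a long random value. One common way to generate one is shown below; this is a sketch (it assumes openssl is installed), and the exact variable name to store it under is listed in env.example.

```shell
# Print a 64-character random hex string to use as the encryption secret.
# Copy the output into .env under the secret variable named in env.example.
openssl rand -hex 32
```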

Start the stack in the background:

docker compose up -d

This starts three services:

Service       Description                               Port
frona         Frona server                              3001 (host)
browserless   Headless Chromium for browser automation  internal only
searxng       Meta search engine for web search         internal only
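The containers can take a moment to become ready. As a sketch of a readiness check (assuming the default host port 3001 and that curl is installed), you can poll the port until the server answers:

```shell
# wait_for_url URL [TRIES]: retry every 2 seconds until URL returns any HTTP
# response, or give up after TRIES attempts (default 30). Exits 0 on success.
wait_for_url() {
  url=$1
  tries=${2:-30}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -sS -o /dev/null "$url" 2>/dev/null; then
      return 0
    fi
    i=$((i+1))
    sleep 2
  done
  return 1
}

# Usage after `docker compose up -d`:
#   wait_for_url http://localhost:3001 && echo "Frona is up"
```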

Open http://localhost:3001 in your browser. You’ll see the registration page. Create an account with a username and password.

After logging in, you’ll land on the chat interface. The System Assistant agent is available by default. Type a message to start a conversation.

To try out agent capabilities:

  • Ask it to search the web for something (uses SearXNG)
  • Ask it to remember a fact about you (uses the memory system)
  • Ask it to browse a website (uses Browserless)