
Unleash the Power of Any LLM with bolt.new-any-llm: A Setup Tutorial

Want to dive into the world of AI-powered full-stack web development? bolt.new-any-llm lets you harness the power of your favorite Large Language Models (LLMs) like Groq, OpenAI, Anthropic, and more, all within your browser. This tutorial guides you through setting up bolt.new-any-llm on your local machine, empowering you to build and deploy cutting-edge applications with ease.

1. Initial Setup

Clone the Repository:

git clone https://github.com/coleam00/bolt.new-any-llm.git

Navigate to the Directory:

cd bolt.new-any-llm

Install Dependencies:

pnpm install

2. Configure Environment Variables

  • Create .env.local: Rename the .env.example file to .env.local. This file will store your sensitive API keys.
  • (Optional) Set Debug Level: For more detailed logging during development, uncomment and set VITE_LOG_LEVEL=debug.
  • Important: Never commit your .env.local file to version control. It's already included in .gitignore to prevent accidental exposure of your API keys.
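As a quick sanity check on that last point, you can ask git directly whether the file is ignored before adding real keys (a sketch; run it from the repository root):

```shell
# Sketch: verify .env.local is git-ignored before you add real keys
# (prints a warning if run outside a repo or if the ignore rule is missing)
if git check-ignore -q .env.local 2>/dev/null; then
  echo ".env.local is ignored - safe to add keys"
else
  echo "WARNING: .env.local is NOT ignored by git"
fi
```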

Add API Keys: Open .env.local and add your LLM API keys. Only include keys for the LLMs you intend to use. For example:

GROQ_API_KEY=your_groq_api_key
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
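If you prefer to script this step, a minimal sketch is to write the file with a heredoc (the key names match the template above; the values are placeholders you replace with real keys):

```shell
# Minimal sketch: seed .env.local with only the keys you need
cat > .env.local <<'EOF'
GROQ_API_KEY=your_groq_api_key
VITE_LOG_LEVEL=debug
EOF
grep -c '=' .env.local   # → 2 (one KEY=value pair per line)
```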

3. Run the Development Server

  • Browser Compatibility: If you use Chrome, you'll need Google Chrome Canary to run this locally; it's a developer-focused build of Chrome that ships the latest web-platform features.

Start the Server:

pnpm run dev

4. Testing Your Setup

Run the Test Suite:

pnpm test

This command executes the project's test suite, ensuring everything is functioning correctly.

5. Deployment Options

  • Docker Deployment

Build the Image Directly:

docker build . --target bolt-ai-development  # Development build
docker build . --target bolt-ai-production   # Production build

Or Build with the Helper Scripts:

npm run dockerbuild       # Development build
npm run dockerbuild:prod  # Production build

Run the Application:

docker run -p 5173:5173 --env-file .env.local bolt-ai:development  # Development
docker run -p 5173:5173 --env-file .env.local bolt-ai:production   # Production

Docker Compose with Profiles:

docker-compose --profile development up  # Development environment
docker-compose --profile production up   # Production environment

  • Coolify Deployment

Coolify simplifies Docker Compose deployments:
    1. Import your Git repository as a new project.
    2. Select your target environment (development/production).
    3. Choose “Docker Compose” as the Build Pack.
    4. Configure deployment domains.
    5. Set the custom start command: docker compose --profile production up.
    6. Configure environment variables, including your AI API keys.
    7. Deploy the application.


  • Cloudflare Pages

pnpm run deploy

Make sure you have the necessary permissions and Wrangler is correctly configured for your Cloudflare account.

6. VS Code Integration

  • Dev Containers: The docker-compose.yaml file is compatible with VS Code dev containers.
    1. Open the command palette in VS Code.
    2. Select “Dev Containers: Reopen in Container”.
    3. Choose the “development” profile from the context menu.

Key Considerations

  • Environment Files: Always ensure your .env.local file is correctly configured with your API keys and environment-specific settings.
  • Port Mapping: Port 5173 is exposed and mapped for both development and production environments.
  • Profiles: Utilize Docker Compose profiles (development/production) to manage different deployment scenarios.
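Because every run mode maps the same port, a quick probe before starting a server or container can save a confusing bind failure. A bash-only sketch (the /dev/tcp redirection is a bash feature; tools like lsof or nc work too):

```shell
# Sketch: probe whether anything is already listening on port 5173
# before starting the dev server or mapping it with `docker run -p 5173:5173 ...`
if (exec 3<>/dev/tcp/127.0.0.1/5173) 2>/dev/null; then
  echo "port 5173 is already in use"
else
  echo "port 5173 is free"
fi
```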

By following these steps, you'll have bolt.new-any-llm up and running, ready to explore the exciting possibilities of AI-driven web development. Happy coding!