Camel Chat

Camel Chat is a feature-rich Flutter application that provides a clean interface for chatting with large language models (LLMs) served by an Ollama server, letting you run open-source AI models on your own hardware.

Features

  • Connect to Ollama Servers: Easily connect to any Ollama server, with optional HTTP basic authentication (see the example request after this list).
  • Multiple Model Support: Chat with any model available on your Ollama server.
  • Complete Chat History: View and manage your conversation history.
  • Dark Mode Support: Switch between light and dark themes for comfortable viewing.
  • Custom System Prompts: Define system prompts to set the AI’s behaviour and context.
  • Export Conversations: Export your chats as markdown files for sharing or archiving.
  • Chat Organisation: Auto-generated meaningful titles for your conversations.
  • Responsive UI: Works seamlessly on both mobile and desktop devices.
  • Code Formatting: Proper rendering and formatting of code blocks in responses.
  • Local Storage: All your conversations are stored locally for privacy.
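
Under the hood, these features talk to Ollama’s standard HTTP API. As a rough illustration (not the app’s actual code), a non-streaming chat request with a custom system prompt looks something like the following; the model name, prompt, and credentials are placeholders, and the -u option only matters if your server sits behind an authenticating reverse proxy, since Ollama itself does not require credentials:

    # Send a single, non-streaming chat request to an Ollama server.
    curl http://localhost:11434/api/chat \
      -u username:password \
      -d '{
            "model": "gemma3",
            "messages": [
              {"role": "system", "content": "You are a concise assistant."},
              {"role": "user", "content": "Hello!"}
            ],
            "stream": false
          }'

The response is a single JSON object whose message.content field contains the model’s reply, which a client such as Camel Chat can then render in the conversation view.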

Getting Started

Prerequisites

  • A running Ollama server (local or remote).

Installation

Android

Download and install the APK from the releases page.
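
If you would rather install from a computer with adb than open the APK on the device, the usual command looks like this (the file name is a placeholder for the actual release artifact):

    # Install the downloaded APK onto a connected Android device.
    adb install camel-chat.apk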

Linux

Choose one of the following packages from the releases page (example install commands are sketched after this list):

  • Debian/Ubuntu: Download and install the .deb package.
  • Fedora/RHEL: Download and install the .rpm package.
  • Arch: Download and install the .zst package.
  • Other distributions: Download the AppImage, make it executable and run it.
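
As a rough guide, the corresponding install commands look like the following; the file names are placeholders for the actual release artifacts:

    # Debian/Ubuntu
    sudo apt install ./camel-chat.deb

    # Fedora/RHEL
    sudo dnf install ./camel-chat.rpm

    # Arch
    sudo pacman -U camel-chat.pkg.tar.zst

    # Any distribution: make the AppImage executable and run it
    chmod +x Camel-Chat.AppImage
    ./Camel-Chat.AppImage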

Setting Up Your Ollama Server

  1. Install Ollama from https://ollama.com/.
  2. Pull the models you want to use (e.g., ollama pull gemma3).
  3. Run the Ollama server (see the example commands after these steps).
  4. Connect Camel Chat to your server by entering the URL (e.g., http://localhost:11434/).
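
Put together, a typical setup on the machine running Ollama looks like this (gemma3 is only an example model):

    # Pull a model to make it available for chatting.
    ollama pull gemma3

    # Start the server if it is not already running as a background service.
    ollama serve

    # From another terminal, verify the API is reachable; this lists the
    # locally available models.
    curl http://localhost:11434/api/tags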

Roadmap

Here are some features and improvements planned for future releases:

  • Stream Responses: Implement streaming responses for more interactive conversations (see the note after this list).
  • File Attachments: Upload and process files during conversations.
  • Chat Statistics: View usage statistics and performance metrics.
  • Release on Flathub
  • Windows & macOS Support

Comments

Smee · 2 days ago

Looks interesting, but I can’t get it to run on Debian 12; neither the .deb nor the AppImage works.

  Smee · 2 days ago

  I ran the AppImage in a terminal and got an error about something missing, but the AppImage comes with everything bundled, right? It might be an issue on my end.

  I’ll redo it and paste the error message tomorrow.