Telegram AI Companion: a fun project with Rust, Telegram, and local AI

Hello! 👋

Recently I put together a small but lively pet project: Telegram AI Companion. It's a Telegram bot that chats with you using a local language model via LocalAI. No OpenAI, no clouds: everything runs on your own hardware.

The project's goal isn’t to revolutionize AI, but to provide an educational and fun immersion into Rust, async programming, the Telegram API, and local LLMs. A sort of “companion bot”, but more for the developer than the user :)

If you're curious about:

  • How to connect a Telegram bot to a local LLM

  • How to run Rust projects in Docker

  • How to build a REST API and handle webhooks

  • How to use LocalAI without pain

— you’re welcome!


🧩 Under the hood

Here’s what the bot can do:

✅ Replies to any message in Telegram

✅ Works with LocalAI (and OpenAI, if you want)

✅ Runs via Docker + Docker Compose

✅ Written in Rust using Actix Web

✅ Has a REST API (/chat) — you can attach any UI

✅ Covered by tests and organized in a readable structure


⚙️ How it works

General flow

  1. User sends a message to the bot in Telegram.

  2. Telegram calls our webhook (/telegram/webhook).

  3. The Rust app receives the message and sends it to LocalAI.

  4. Gets a response and sends it back to the user.
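Step 3 boils down to sending the user's text to LocalAI's OpenAI-compatible `/v1/chat/completions` endpoint. Here's a minimal, dependency-free sketch of what that request body looks like; the function name and the hand-rolled JSON escaping are illustrative assumptions, not the project's actual code (which would normally use serde):

```rust
// Sketch of step 3: turning a Telegram message into a request body for
// LocalAI's OpenAI-compatible chat endpoint. Illustrative, not the
// project's real code (a real app would serialize with serde_json).

/// Build the JSON body for `/v1/chat/completions` from the user's text.
fn build_chat_request(model: &str, user_text: &str) -> String {
    // Escape backslashes and quotes so the message stays valid JSON.
    let escaped: String = user_text
        .chars()
        .flat_map(|c| match c {
            '\\' => vec!['\\', '\\'],
            '"' => vec!['\\', '"'],
            other => vec![other],
        })
        .collect();
    format!(
        r#"{{"model":"{model}","messages":[{{"role":"user","content":"{escaped}"}}]}}"#
    )
}

fn main() {
    println!("{}", build_chat_request("mistral", "Hi, who are you?"));
}
```

The app then POSTs this body to LocalAI, pulls the assistant's reply out of the response, and forwards it back to Telegram (step 4).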

Tech stack

  • 🦀 Rust: a language that doesn’t forgive mistakes, but makes you think

  • 🌐 Actix Web: fast web framework

  • 📦 Docker + Compose: everything’s isolated, convenient and reproducible

  • 🧠 LocalAI: alternative to OpenAI, supports GGUF and LLaMa models


🚀 Quick start

  1. Clone the repository:

    git clone https://github.com/di-zed/tg-ai-companion
    cd tg-ai-companion
  2. Download a model (for example, Mistral 7B) into models/ and describe it in a mistral.yaml file next to it:

    cd models/
    wget https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF/resolve/main/mistral-7b-instruct-v0.2.Q4_K_M.gguf

    # models/mistral.yaml
    name: mistral
    backend: llama
    parameters:
      model: mistral-7b-instruct-v0.2.Q4_K_M.gguf
      temperature: 0.7
      top_p: 0.9
      top_k: 40
      n_ctx: 4096
  3. Alternatively, you can use OpenAI, configured in the .env file:

    OPEN_AI_URL=http://localai:8080     # or https://api.openai.com
    OPEN_AI_MODEL=mistral               # or gpt-3.5-turbo / gpt-4 / etc.
    OPEN_AI_API_KEY=your_openai_key     # required if using OpenAI
  4. Run it (don’t forget to edit .env):

    cp .env.sample .env
    cp volumes/root/.bash_history.sample volumes/root/.bash_history
    
    docker-compose up --build
    docker-compose exec rust bash
    cargo run

Now the bot is running on localhost, and LocalAI on localhost:8080.


🤖 How to create a Telegram bot

  1. Open Telegram and find @BotFather

  2. Send the command:

    /newbot
  3. Specify a name and a unique username (it must end in bot, for example: ai_companion_bot)

  4. You’ll get a token that looks like this:

    123456789:AAH6kDkKvkkkT-PWTwMg6cYtHEb3vY_tS1k
  5. Save it in .env under TELEGRAM_BOT_TOKEN:

    TELEGRAM_BOT_TOKEN=your_token_here

Your bot is now ready to receive messages via webhook! 🚀
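Once the token is in .env, every call to the Telegram Bot API is just an HTTPS request to a URL that embeds it. A minimal sketch of how the token turns into a sendMessage endpoint; the function name is an assumption for illustration, not taken from the project:

```rust
use std::env;

// Illustrative sketch: how TELEGRAM_BOT_TOKEN from `.env` becomes a
// Telegram Bot API URL. The function name is assumed, not the project's.

/// Build the `sendMessage` endpoint URL for the given bot token.
fn send_message_url(token: &str) -> String {
    format!("https://api.telegram.org/bot{token}/sendMessage")
}

fn main() {
    // In the real app the token is loaded from `.env`;
    // fall back to a dummy value here so the sketch runs standalone.
    let token = env::var("TELEGRAM_BOT_TOKEN")
        .unwrap_or_else(|_| "123456789:EXAMPLE".to_string());
    println!("{}", send_message_url(&token));
}
```

The same pattern applies to every other Bot API method (setWebhook, getMe, and so on): only the last path segment changes.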


🌍 Exposing the Telegram webhook via ngrok

So that Telegram can reach your local server:

    ngrok http 80

Then:

    curl -X POST "https://api.telegram.org/bot<YOUR_TOKEN>/setWebhook" \
      -H "Content-Type: application/json" \
      -d '{"url": "https://YOUR-ADDRESS.ngrok-free.app/telegram/webhook"}'

🔐 API mode (without Telegram)

You can use the bot as a regular LLM API:

    POST /chat
    Authorization: Bearer YOUR_TOKEN

    {
      "prompt": "Hi, who are you?"
    }

The response comes from LocalAI (or OpenAI, if you enable it in .env).
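The Authorization header check behind this endpoint is simple: strip the Bearer prefix and compare the remainder to the configured token. A std-only sketch; the function name is an assumption for illustration, not the project's actual code:

```rust
// Sketch of the `/chat` authorization check: extract the bearer token
// from the Authorization header and compare it to the expected value.
// The function name is illustrative, not taken from the project.

/// Return `true` only if the header is exactly `Bearer <expected>`.
fn is_authorized(auth_header: Option<&str>, expected: &str) -> bool {
    match auth_header.and_then(|h| h.strip_prefix("Bearer ")) {
        Some(token) => token == expected,
        None => false,
    }
}

fn main() {
    assert!(is_authorized(Some("Bearer secret"), "secret"));
    assert!(!is_authorized(Some("Basic secret"), "secret"));
    assert!(!is_authorized(None, "secret"));
    println!("auth checks passed");
}
```

In the real app this check would sit in front of the handler (in Actix Web, typically as middleware or an extractor), rejecting unauthorized requests with 401 before the prompt ever reaches the model.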


🤖 Why all this?

The goal was simple:

Make a clear, functional, and fun project to level up in Rust, try out local LLMs, and just mess around with Telegram bots.

Now you can build something more serious on top of it: an AI bot with memory, a text generator, a consultant, and so on.


📅 Plans for the future

  • Add support for memory and dialogs

  • Integrate a web interface

  • Support for multiple language models


💬 Conclusion

If you’re just starting out with Rust, or want to try local models without any API keys, this project can be a great starting point.

📝 Note: I didn’t go into all the technical details in this article, to keep it light.

If you're interested in digging deeper into the project's architecture, code structure, or configuration details, check out the README on GitHub — everything is laid out in detail there.

The project is open source: tg-ai-companion on GitHub.


🔗 Useful links

  • 🧠 LocalAI: the engine that runs the local LLM

  • 🦀 Rust Book: the best place to start with the language

  • ☁️ ngrok: for exposing local Telegram webhooks


Thanks for your attention! If the bot replied cheerfully, that's on me. If it's silent, well, blame Telegram or ngrok, as usual 🙂
