# Kowalski (Node.js Telegram Bot)
Kowalski is a simple Telegram bot made in Node.js.
- You can find Kowalski at @KowalskiNodeBot on Telegram.
## Self-host requirements
> [!IMPORTANT]
> You will only need all of them if you are not running it dockerized. Read "Running with Docker" for more information.
- Bun (latest is suggested)
- A Telegram bot (create one at @BotFather)
- FFmpeg (only for the `/yt` command)
- Docker and Docker Compose (only required for Docker setup)
- Postgres
### AI Requirements
- High-end CPU or GPU (~6 GB VRAM)
- If using CPU, enough RAM to load the models (~6 GB with defaults)
## Running locally (non-Docker setup)
First, clone the repo with Git:
```bash
git clone --recurse-submodules https://github.com/ABOCN/TelegramBot
```
Next, inside the repository directory, create a `.env` file. You can copy `.env.example` and fill in your own values; the meaning of each variable is explained in the ".env Functions" section below.

After editing the file, save all changes and run the bot with `bun start`.
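For reference, a minimal `.env` might look like the sketch below. The variable names come from the ".env Functions" section; every value is a placeholder, and details such as how `botAdmins` lists multiple IDs may differ, so treat `.env.example` in the repository as authoritative.

```ini
# Telegram bot token from @BotFather (keep this secret)
botToken=123456789:ABCDEF-your-bot-token

# Telegram user ID(s) allowed to run administrative commands (placeholder)
botAdmins=123456789

# Links shown by the bot
botSource=https://github.com/ABOCN/TelegramBot
botPrivacy=https://example.com/privacy

# Postgres connection string (see .env.example for the exact format)
databaseUrl=postgres://user:password@localhost:5432/kowalski

# AI features stay off unless ollamaEnabled is set to true
ollamaEnabled=false
# ollamaApi=http://localhost:11434
```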
> [!TIP]
> To deal with dependencies, just run `bun install` or `bun i` at any moment to install all of them.
## Running with Docker
> [!IMPORTANT]
> Please complete the above steps to prepare your local copy for building. You do not need to install FFmpeg on your host system.
> [!NOTE]
> Using the `-d` flag when running causes Kowalski to run in the background. If you're just playing around or testing, you may not want to use this flag.
You can also run Kowalski using Docker, which simplifies the setup process. Make sure you have Docker and Docker Compose installed.
### Using Docker Compose
1. Copy the compose file

   Without AI (Ollama):

   ```bash
   mv docker-compose.yml.example docker-compose.yml
   ```

   With AI (Ollama):

   ```bash
   mv docker-compose.yml.ai.example docker-compose.yml
   ```
2. Make sure to set up your `.env` file first!

   > [!TIP]
   > If you intend to set up AI, the defaults for Docker are already included (just uncomment them) and don't need to be changed. Further setup may be needed for GPUs. See the Ollama documentation for more.
3. Run the container

   ```bash
   docker compose up -d
   ```
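To give a rough idea of what such a compose file contains, here is an illustrative sketch assuming the usual three pieces: the bot built from the local Dockerfile, Postgres, and optionally Ollama. The service names, images, and values are assumptions for illustration; the real `docker-compose.yml.example` and `docker-compose.yml.ai.example` files in the repository are what you should actually use.

```yaml
# Illustrative only -- use the provided docker-compose.yml example files.
services:
  kowalski:
    build: .                  # build the bot image from the local Dockerfile
    env_file: .env            # configuration comes from your .env file
    restart: unless-stopped
    depends_on:
      - postgres

  postgres:
    image: postgres:17
    environment:
      POSTGRES_PASSWORD: changeme   # placeholder; keep it in sync with databaseUrl
    volumes:
      - pgdata:/var/lib/postgresql/data

  # Optional, only for AI features; GPU passthrough needs extra
  # configuration (see the Ollama documentation).
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama

volumes:
  pgdata:
  ollama:
```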
### Using Docker Run
If you prefer to use Docker directly, you can use these instructions instead.
1. Make sure to set up your `.env` file first!

2. Build the image

   ```bash
   docker build -t kowalski .
   ```

3. Run the container

   ```bash
   docker run -d --name kowalski --restart unless-stopped -v $(pwd)/.env:/usr/src/app/.env:ro kowalski
   ```
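This only starts the bot itself. If you don't already have Postgres available, one way to run it with Docker is sketched below; the container name, password, and database name are placeholders, and `databaseUrl` in your `.env` must point at an address the bot container can actually reach (for example the host's IP, or both containers joined to the same Docker network).

```bash
# Placeholder names/credentials -- adjust them and mirror them in databaseUrl
docker run -d --name kowalski-db --restart unless-stopped \
  -e POSTGRES_PASSWORD=changeme -e POSTGRES_DB=kowalski \
  -p 5432:5432 -v kowalski-pgdata:/var/lib/postgresql/data postgres:17
```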
> [!NOTE]
> You must set up Ollama on your own if you would like to use AI features.
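If you choose to run Ollama in Docker as well, the commands below (based on Ollama's own documentation, CPU-only; the model name is just an example) are one way to do it. Point `ollamaApi` in `.env` at the resulting endpoint.

```bash
# Start the Ollama server and persist its models in a named volume
docker run -d --name ollama -v ollama:/root/.ollama -p 11434:11434 ollama/ollama

# Pull a model for the bot to use (example model; match your flashModel/thinkingModel settings)
docker exec -it ollama ollama pull llama3.2
```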
## .env Functions
> [!IMPORTANT]
> Take care of your `.env` file and keep it secret (like your passwords): anyone with your bot token can do whatever they want with the bot!
- botSource: Put the link to your bot source code.
- botPrivacy: Put the link to your bot privacy policy.
- maxRetries: Maximum number of retries for a failing command on Kowalski. Default is 5. Once a command fails more times than this, the bot will crash.
- botToken: Put your bot token that you created at @BotFather.
- ollamaEnabled (optional): Enables/disables AI features
- ollamaApi (optional): Ollama API endpoint for various AI features, will be disabled if not set
- handlerTimeout (optional): How long handlers will wait before timing out. Set this high if using large AI models.
- flashModel (optional): Which model will be used for `/ask`
- thinkingModel (optional): Which model will be used for `/think`
- updateEveryChars (optional): The amount of chars until message update triggers (for streaming response)
- databaseUrl: Database server configuration (see `.env.example`)
- botAdmins: Put the IDs of the people responsible for managing the bot. They can use some administrative + exclusive commands on any group.
- lastKey: Last.fm API key, used by the `lastfm.js` functions, e.g. to see who is listening to which song.
- weatherKey: Weather.com API key, used for the `/weather` command.
- longerLogs: Set to `true` to enable verbose logging whenever possible.
> [!NOTE]
> Further, advanced fine-tuning and configuration can be done in TypeScript with the files in the `/config` folder.
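As a purely hypothetical illustration of that kind of tuning (the file name, export, and values below are invented for this example; look at the actual files under `/config` to see which options really exist):

```typescript
// config/example.ts -- hypothetical illustration only; the real /config files
// define the actual option names and values.
export const exampleTuning = {
  maxRetries: 5,         // mirrors the .env option described above
  updateEveryChars: 100  // streaming update granularity, also described above
}
```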
## Troubleshooting
### YouTube Downloading
Q: I get a "Permission denied (EACCES)" error in the console when running the `/yt` command.

A: Make sure `src/plugins/yt-dlp/yt-dlp` is executable. You can do this on Linux like so:

```bash
chmod +x src/plugins/yt-dlp/yt-dlp
```
### AI
Q: How can I disable AI features?

A: AI features are disabled by default, unless you have set `ollamaEnabled` to `true` in your `.env` file. Set it back to `false` to disable them.
## Contributors
Made with contrib.rocks.
## About/License
BSD-3-Clause - 2024 Lucas Gabriel (lucmsilva).
Featuring some components under the Unlicense.