ChatBot-TGLM6B is a chatbot built on the Telegram Bot API and the ChatGLM-6B language model.

Features:
- Private Chat
- Group Chat (requires mention or reply)
- Chat Context (up to 16384 tokens per user)
- Invitation Mode (see admin commands)
- Independent Chat Sessions (see the sketch after this list)
- Auto CUDA Memory Management
- Auto Error Handling
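The sketch below shows, in rough terms, how per-user sessions, the 16384-token context cap, and CUDA memory cleanup could fit together. The class and function names are hypothetical and not the repository's actual code; the only real interface assumed is ChatGLM-6B's documented `model.chat(tokenizer, query, history=...)`.

```python
# Hypothetical sketch: independent per-user sessions with a token budget.
from collections import defaultdict

import torch

MAX_CONTEXT_TOKENS = 16384  # per-user context budget


class ChatSession:
    """Holds one user's (query, response) history, independent of other users."""

    def __init__(self):
        self.history = []

    def trim(self, tokenizer):
        # Naively drop the oldest turns until the history fits the budget.
        def total_tokens():
            text = "".join(q + r for q, r in self.history)
            return len(tokenizer.encode(text))

        while self.history and total_tokens() > MAX_CONTEXT_TOKENS:
            self.history.pop(0)


sessions = defaultdict(ChatSession)  # keyed by Telegram user id


def chat(user_id, query, model, tokenizer):
    session = sessions[user_id]
    session.trim(tokenizer)
    # ChatGLM-6B's model.chat returns the reply plus the updated history.
    response, session.history = model.chat(tokenizer, query, history=session.history)
    torch.cuda.empty_cache()  # crude "auto CUDA memory management"
    return response
```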
Requirements:
- Python 3+
- Git LFS
- ChatGLM-6B (compressed/quantized checkpoints also work; see the loading sketch after this list)
- PyTorch with CUDA (at least 8 GB of GPU memory)
- Transformers
- Telegram API Token
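As a quick environment check, ChatGLM-6B is normally loaded through Transformers as shown below (following the model card); the `./model` path assumes the clone step later in this README.

```python
# Environment sanity check (not part of the repository).
import torch
from transformers import AutoModel, AutoTokenizer

assert torch.cuda.is_available(), "a CUDA-capable GPU is required"

tokenizer = AutoTokenizer.from_pretrained("./model", trust_remote_code=True)
# .half().cuda() loads the model in fp16; on smaller GPUs use a compressed
# (int8/int4) checkpoint instead.
model = AutoModel.from_pretrained("./model", trust_remote_code=True).half().cuda()
model = model.eval()

# One round trip through the model confirms everything is wired up.
response, _history = model.chat(tokenizer, "Hello", history=[])
print(response)
```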
Setup:
- Clone this repository
git clone https://github.com/Lakr233/ChatBot-TGLM6B
cd ChatBot-TGLM6B
- Download the model (a quick sanity check follows the commands)
# skip downloading LFS files during the clone; switch to the int4 checkpoint if needed
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/THUDM/chatglm-6b model
# fetch the actual weight files inside the model directory
cd model
git lfs fetch --all
git lfs checkout
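If the LFS steps are skipped or interrupted, the weight files remain tiny pointer stubs and the model will fail to load. A small optional check (not part of the repository):

```python
# Verify the LFS weights were actually downloaded, not left as pointer files.
from pathlib import Path

weights = sorted(Path("model").glob("*.bin")) + sorted(Path("model").glob("*.safetensors"))
assert weights, "no weight files found under ./model"
for f in weights:
    size_mb = f.stat().st_size / 1e6
    print(f"{f.name}: {size_mb:.0f} MB")
    assert size_mb > 1, f"{f.name} looks like an LFS pointer; rerun `git lfs checkout`"
```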
- Install requirements
pip install -r requirements.txt
- Edit the config inside the code
token = 'aaaaaaaaaa:88888888888888888888888888888888888'
admin_id = ['000000000']
Note: you need to disable the Telegram bot's 'Privacy Mode' (via BotFather) so that replying to the bot works in group chats. A sketch of how these settings are typically wired up follows.
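The sketch assumes the python-telegram-bot (v20+) library; the handler and helper names are illustrative and not the repository's actual code.

```python
# Illustrative wiring only; assumes python-telegram-bot >= 20.
from telegram import Update
from telegram.ext import Application, ContextTypes, MessageHandler, filters

token = 'aaaaaaaaaa:88888888888888888888888888888888888'
admin_id = ['000000000']


def is_admin(update: Update) -> bool:
    # Invitation-mode admin commands would gate on a check like this.
    return str(update.effective_user.id) in admin_id


async def on_message(update: Update, context: ContextTypes.DEFAULT_TYPE):
    query = update.message.text
    # ... feed `query` to ChatGLM-6B and reply with the generated answer ...
    await update.message.reply_text(f"(model reply to: {query})")


app = Application.builder().token(token).build()
# Private chats: respond to every text message.
app.add_handler(MessageHandler(filters.ChatType.PRIVATE & filters.TEXT, on_message))
# Group chats: respond only when the bot is mentioned or replied to.
app.add_handler(MessageHandler(
    filters.ChatType.GROUPS & filters.TEXT & (filters.Entity("mention") | filters.REPLY),
    on_message,
))
app.run_polling()
```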
- Run the bot (an optional keep-alive sketch follows)
python3 ./main.py
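If you want the bot to come back up automatically after a crash, a simple supervisor loop is one option. This is a hypothetical convenience script, not part of the repository:

```python
# Restart main.py whenever it exits with a non-zero status.
import subprocess
import time

while True:
    result = subprocess.run(["python3", "./main.py"])
    if result.returncode == 0:
        break  # clean shutdown
    print(f"bot exited with code {result.returncode}, restarting in 5 seconds")
    time.sleep(5)
```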
This project is licensed under the WTFPL.
2023.3.23