📘 Chatbot with llama3 Service for Your GPU Pod
Use this helper script to install and run a Llama3-powered chatbot inside your GPU pod.
You can then expose it via the Service Exposure feature.
📥 Download the Installer
```bash
wget https://assets.hosted.ai/chatbot_manager.sh && chmod +x chatbot_manager.sh
```
🆘 Command Help
```bash
./chatbot_manager.sh help
```
```bash
Llama3 + OpenWebUI wrapper

Usage:
  ./chatbot_manager.sh help      Show this help
  ./chatbot_manager.sh install   Download and install backend + OpenWebUI
  ./chatbot_manager.sh start     Start backend + OpenWebUI in background
  ./chatbot_manager.sh stop      Stop backend + OpenWebUI
  ./chatbot_manager.sh status    Show running status
  ./chatbot_manager.sh logs      Tail logs for both services
  ./chatbot_manager.sh restart   Restart both services

What it does:
- install:
  - Checks for python3, pip, python3-venv, wget.
  - If missing and apt-get is available, installs them automatically.
  - Downloads chatbotllama.tgz from https://example.com/chatbotllama.tgz
  - Extracts into: /home/ubuntu/chatbotllama
  - Creates two virtualenvs:
    - Backend: /home/ubuntu/chatbotllama/backend/.venv
    - OpenWebUI: /home/ubuntu/chatbotllama/.venv-openwebui
  - Installs Python dependencies for backend + OpenWebUI
  - Leaves you with a .env file at /home/ubuntu/chatbotllama/.env
    (edit HF_TOKEN and any other settings if needed)
- start:
  - Starts backend (Llama3 API) on port 8000
  - Starts OpenWebUI on port 3000
- Logs:
  - Backend: /home/ubuntu/chatbotllama/backend.log
  - OpenWebUI: /home/ubuntu/chatbotllama/openwebui.log
- PID files:
  - Backend: /home/ubuntu/chatbotllama/backend.pid
  - OpenWebUI: /home/ubuntu/chatbotllama/openwebui.pid
```
▶️ Install the packages & requirements
This installs all the required system and Python packages and downloads the chatbot service files, ready to be deployed.
```bash
./chatbot_manager.sh install
```
▶️ Start the chatbot service
Start the chatbot service; this also downloads the model.
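Before the first start, you may need to set your Hugging Face token in the .env file the installer creates at /home/ubuntu/chatbotllama/.env. A minimal sketch, using a throwaway copy of the file so the commands can be run anywhere (the file contents and hf_your_token_here are placeholders; run the sed line against the real file):

```shell
# Sketch: set HF_TOKEN in the installer's .env file.
# A throwaway copy stands in for /home/ubuntu/chatbotllama/.env here.
ENV_FILE="$(mktemp)"                 # stand-in for the real .env path
printf 'HF_TOKEN=\n' > "$ENV_FILE"   # stand-in contents

# Replace the empty HF_TOKEN value with your token (placeholder below)
sed -i 's/^HF_TOKEN=.*/HF_TOKEN=hf_your_token_here/' "$ENV_FILE"

grep '^HF_TOKEN=' "$ENV_FILE"
```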
```bash
./chatbot_manager.sh start
```
📝 Example Output (Explained)
When you start the service, you may see output similar to this:
```bash
[INFO] Starting backend (Llama3 API)...
[INFO] Backend started with PID 2193, log: /home/ubuntu/chatbotllama/backend.log
[INFO] Starting OpenWebUI...
[INFO] OpenWebUI started with PID 2196, log: /home/ubuntu/chatbotllama/openwebui.log
[INFO] Both services should now be running.
[INFO] Open your browser at: http://localhost:3000
```
🌍 Enable External Access Through Service Exposure
Use the Service Exposure feature to expose OpenWebUI's port 3000 (and, if needed, the backend's port 8000) outside the pod.
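Before exposing the service externally, you can confirm locally that both ports are listening. A small sketch using bash's built-in /dev/tcp redirection (the port_open helper is illustrative, not part of chatbot_manager.sh):

```shell
# Check whether the backend (8000) and OpenWebUI (3000) ports accept
# connections, using bash's /dev/tcp virtual paths.
port_open() {
  if (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null; then
    echo "port $1: open"
  else
    echo "port $1: closed"
  fi
}
port_open 8000
port_open 3000
```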
⛔ Stop the Chatbot Service
```bash
./chatbot_manager.sh stop
[INFO] Stopping backend (PID 2012)...
[INFO] Stopping OpenWebUI (PID 2015)...
[INFO] Stop command issued. It may take a few seconds for processes to fully exit.
```
This safely shuts down both the OpenWebUI frontend and the backend API service.
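Under the hood, stop uses the PID files listed in the help output. A self-contained sketch of the same mechanism, with a throwaway sleep process standing in for the real services (the exact internals of chatbot_manager.sh may differ):

```shell
# Illustrative sketch of a PID-file based stop (not the real script).
PID_FILE="$(mktemp)"

sleep 300 &             # stand-in for the backend process
echo $! > "$PID_FILE"   # 'start' records the service PID like this

# 'stop' reads the PID back and signals the process
PID="$(cat "$PID_FILE")"
echo "[INFO] Stopping backend (PID $PID)..."
kill "$PID"
rm -f "$PID_FILE"
```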