RocketChat-Application-Assistant hosts a job-application AI assistant, deployed to Koyeb, that connects Rocket.Chat to an early, non-distributed version of LLMProxy from the paper *LLMProxy: Reducing Cost to Access Large Language Models*. It was originally built to test hosting chatbots on Rocket.Chat and has since been abandoned.
Clone the repository and install dependencies:

```sh
git clone https://github.com/andrewelawrence/RocketChat-Application-Assistant
cd RocketChat-Application-Assistant
pip install -r requirements.txt
```
Install and authenticate the Koyeb CLI, then redeploy the service:

```sh
sudo apt install koyeb
koyeb login
# redeploy example (replace with your service name)
koyeb service redeploy <your-org>/<your-service-name>
```

With the environment set (see `config/.env`), you can run:
```sh
chmod +x test.sh
./test.sh
```

This loads the environment variables and starts the Flask web app locally. If `flaskEnv=dev` and `flaskPage` are set, a simple dev page is available at `/dev` (the default address is `127.0.0.1:5000`; edit `config/.env` to change it); otherwise, POST to `/query`.
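As a quick way to exercise the `/query` route against a locally running `./test.sh` instance, something like the following sketch works. The payload field names (`text`, `user_name`) are assumptions modeled on a Rocket.Chat outgoing-webhook message; check `app.py` for the fields the route actually reads.

```python
"""Minimal sketch of POSTing to the assistant's /query endpoint.

Payload field names are assumptions -- verify against app.py.
"""
import json
import urllib.request


def build_query(text, user_name="tester"):
    """Assemble the JSON body for /query (field names assumed)."""
    return json.dumps({"text": text, "user_name": user_name}).encode()


def post_query(body, url="http://127.0.0.1:5000/query"):
    """Send the request to a locally running ./test.sh instance."""
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()


if __name__ == "__main__":
    print(post_query(build_query("What roles fit my resume?")))
```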
- `app.py`: Flask app, routes (`/query`, `/dev`, `/`)
- `chat.py`: Welcome text and LLM response assembly
- `response.py`: Dispatcher for uploads, resume mode, and general queries
- `llmproxy.py`: Early LLMProxy client
- `utils.py`: AWS DynamoDB session/persistence, Rocket.Chat file handling, helpers
- `config/load_envs.py`: Loads `config/.env` and runs a target script
- `upload.py`: CLI to upload PDFs to the shared RAG session
- `requirements.txt`, `Procfile`, `test.sh`
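The behavior described for `config/load_envs.py` (load `config/.env`, then run a target script) can be sketched roughly as below. The parsing rules and the `runpy` dispatch are assumptions, not a copy of the repo's loader.

```python
"""Rough sketch of a .env loader plus target-script runner, in the
spirit of config/load_envs.py. Parsing details are assumptions."""
import os
import runpy


def load_env_file(path="config/.env"):
    """Parse simple KEY=VALUE lines into os.environ.

    Blank lines and `#` comments are skipped; surrounding quotes
    around values are stripped.
    """
    loaded = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip().strip('"').strip("'")
    os.environ.update(loaded)
    return loaded


def run_with_env(target, env_path="config/.env"):
    """Load the env file, then execute the target script in-process."""
    load_env_file(env_path)
    runpy.run_path(target, run_name="__main__")
```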
- The early LLMProxy implementation used here is from the paper *LLMProxy: Reducing Cost to Access Large Language Models*.