A pure PHP API service with:

- Chat completions via Groq (`/chat`) with optional SSE streaming.
- Utility endpoints (`/health`, `/models`).
- A built-in web playground at `/`.
- `/`: Interactive playground UI.
- `/health`: Returns runtime health metadata.
- `/models`: Returns model aliases and the configured default model.
Request JSON:

```json
{
  "model": "openai/gpt-oss-120b",
  "messages": [{"role": "user", "content": "Hello"}],
  "temperature": 0.7,
  "max_tokens": 1024,
  "stream": false
}
```

The response shape is simplified:

```json
{
  "response": "Assistant reply text"
}
```

Environment variables:

- `GROQ_API_KEY` (required for `/chat`)
- `DEFAULT_MODEL` (optional, default: `openai/gpt-oss-120b`)
- `PORT` (optional, for local/dev)
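As a client-side sketch of the request and response shapes above (the base URL and port are assumptions for a local instance; the payload fields mirror the documented schema):

```python
import json
import urllib.request

def build_payload(prompt, model="openai/gpt-oss-120b", stream=False):
    """Assemble a /chat request body matching the documented schema."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
        "max_tokens": 1024,
        "stream": stream,
    }

def chat(base_url, prompt):
    """POST to /chat and return the simplified 'response' field."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/chat",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))["response"]

# Example (requires a running instance with GROQ_API_KEY set):
# print(chat("http://localhost:8000", "Hello"))
```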
Run locally:

```sh
php -S 0.0.0.0:8000 index.php
```

This repo includes platform config for:

- Heroku (`Procfile`, `project.toml`)
- Render (`render.yaml`)
- Koyeb (`koyeb.yaml`, `Dockerfile`)
- Vercel (`vercel.json`)
- Any Docker-compatible platform (`Dockerfile`)
- Basic per-IP rate limits are enabled on `/chat`.
- Streaming behavior depends on platform proxy compatibility with SSE.
We welcome contributions to improve ChatGpt Api! To contribute:

- Fork the repository.
- Create a new branch (`git checkout -b feature-branch`).
- Make your changes and commit (`git commit -m 'Add new feature'`).
- Push to the branch (`git push origin feature-branch`).
- Open a pull request to the main repository.
Please ensure your code follows the project's coding standards and includes appropriate documentation.
For support, join our community:
- Telegram Bot Channel: @mnbots
- Support Group: @mnbots_support
- Contact Owners: GitHub @mntgxo

Developed by MN BOTS
This project is licensed under the MIT License. See the LICENSE file for details.