🪨 why use many token when few token do trick — Claude Code skill that cuts 65% of tokens by talking like caveman
Updated Apr 12, 2026 · Python
genshijin 原始人 🗿 | An ultra-compressed communication skill for Claude Code / Codex. Based on the Japanese version of caveman, optimized for the redundant expressions specific to Japanese.
We already have caveman system prompts and skills that reduce AI models' token use; why not try baking the style into the model itself with fine-tuning?
websocket-driver caveman chat example.
Make caveman worse by giving it a Claptrap personality. Same 75% token savings. Personality more annoying. 🤖
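The listings above all claim token savings from caveman-style compression. A minimal sketch of the idea in Python: strip filler and politeness words before sending a prompt. The filler list, the `caveman` helper, and the whitespace word count are illustrative assumptions, not the actual rules any of these skills use (real savings would be measured with the model's own tokenizer).

```python
# Hypothetical caveman-style compression: drop filler words so fewer
# tokens reach the model. FILLER is an invented, illustrative word list.
FILLER = {
    "please", "could", "you", "kindly", "i", "would", "like", "to",
    "the", "a", "an", "that", "very", "really", "just",
}

def caveman(text: str) -> str:
    """Keep only content-bearing words: lowercase, punctuation stripped."""
    words = [w.strip(".,?!").lower() for w in text.split()]
    kept = [w for w in words if w and w not in FILLER]
    return " ".join(kept)

verbose = "Could you please refactor the parse function so that it handles empty input?"
short = caveman(verbose)
saving = 1 - len(short.split()) / len(verbose.split())
print(short)
print(f"word saving: {saving:.0%}")
```

A crude whitespace word count stands in for a tokenizer here; the repos above report 65–75% savings on real token counts, which this toy filter does not reproduce.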