feat: make sentence-transformers optional; interactive ccc init #132
Merged
- Move `sentence-transformers` behind `[embeddings-local]` and `[default]` extras (via `cocoindex[sentence-transformers]`), so `pip install cocoindex-code` is LiteLLM-only. Closes #117.
- `ccc init` is now interactive when global settings don't exist: pick provider (sentence-transformers / litellm) and model via a questionary TUI. A new `--litellm-model MODEL` flag skips the prompts and is the non-TTY escape hatch for LiteLLM. Closes #70.
- Change the default sentence-transformers model from `all-MiniLM-L6-v2` to `Snowflake/snowflake-arctic-embed-xs` (lighter, better quality for code).
- The generated `global_settings.yml` now includes a `ccc doctor` reminder and commented-out env-var examples (`OPENAI_API_KEY`, `GEMINI_API_KEY`, `ANTHROPIC_API_KEY`, `VOYAGE_API_KEY`).
- The model test during init runs in the daemon via the existing `DoctorRequest` path; the daemon loads the model once and stays running, so the user's next `ccc index` starts warm.
- The Docker image now installs `cocoindex-code[default]` and pre-caches the new default model. The `COCOINDEX_CODE_EMBEDDING_MODEL` env var is no longer documented for Docker; users mount a `global_settings.yml` or pass `--litellm-model`.
- Extract `check_embedding` + `EmbeddingCheckResult` into `shared.py`; refactor the daemon's `_check_model` to delegate. Error messages in doctor output now include the exception type name (strictly more informative).
- Tests switch to a lighter `paraphrase-MiniLM-L3-v2` model via a new `make_test_user_settings()` helper in `conftest.py`, leaving CI cache costs unchanged.
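The "LiteLLM-only by default" split described above implies a lazy-import guard somewhere in the package. A minimal sketch of that pattern follows; the actual module layout and flag names in cocoindex-code are not shown in this PR, so everything here is illustrative.

```python
# Sketch of an optional-dependency guard for sentence-transformers.
# HAS_LOCAL_EMBEDDINGS and require_local_embeddings() are hypothetical
# names, not taken from the cocoindex-code source.
try:
    from sentence_transformers import SentenceTransformer  # only with [embeddings-local]
    HAS_LOCAL_EMBEDDINGS = True
except ImportError:
    SentenceTransformer = None
    HAS_LOCAL_EMBEDDINGS = False


def require_local_embeddings() -> None:
    """Fail with an actionable hint instead of a bare ImportError."""
    if not HAS_LOCAL_EMBEDDINGS:
        raise RuntimeError(
            "sentence-transformers is not installed; run "
            "`pip install 'cocoindex-code[embeddings-local]'` "
            "or configure a LiteLLM model instead."
        )
```

The point of routing the guard through one helper is that every local-embedding code path produces the same install hint, rather than an ImportError surfacing from deep inside the daemon.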
georgeh0 added a commit that referenced this pull request on Apr 14, 2026
Follow-up to #132: the skill's management.md still said `pipx install cocoindex-code` and described `ccc init` as a purely non-interactive command. Update to reflect:
- Two install styles: `[default]` (batteries included) vs bare (slim, LiteLLM-only).
- First-run `ccc init` is interactive; it prompts for provider/model and runs a test embed via the daemon.
- A `--litellm-model MODEL` flag for non-TTY / scripted use.
- A pointer to `ccc doctor` if the init model test fails.
Summary
Closes #117 (optional sentence-transformers) and #70 (interactive `ccc init`).
- `sentence-transformers` moves behind new `[embeddings-local]` and `[default]` extras (delegated to `cocoindex[sentence-transformers]` so the pin lives in one place). Bare `pip install cocoindex-code` becomes LiteLLM-only.
- `ccc init`: when global settings don't exist, prompts for provider (sentence-transformers / litellm) and model via a `questionary` TUI. A new `--litellm-model MODEL` flag skips the prompts and is the non-TTY escape hatch for LiteLLM. The interactive flow is gated on "global settings missing"; subsequent `ccc init` calls in other projects keep the old behavior.
- The default model changes from `sentence-transformers/all-MiniLM-L6-v2` to `Snowflake/snowflake-arctic-embed-xs` (lighter, better for code retrieval).
- `global_settings.yml` now includes a `ccc doctor` reminder and commented-out env-var examples (`OPENAI_API_KEY`, `GEMINI_API_KEY`, `ANTHROPIC_API_KEY`, `VOYAGE_API_KEY`).
- The init model test sends a `DoctorRequest` via the existing `_client.doctor` path; the daemon loads the model once and stays running, so the user's next `ccc index` starts warm. Output is wrapped in a rich spinner; HF warnings and "Loading weights" progress stream to `daemon.log` instead of cluttering the init output.
- The Docker image installs `cocoindex-code[default]` and pre-caches the Snowflake model. The `COCOINDEX_CODE_EMBEDDING_MODEL` env var is no longer documented for Docker; users mount a pre-written `global_settings.yml` or pass `--litellm-model`.
- Extract `check_embedding` + `EmbeddingCheckResult` into `shared.py`. The daemon's `_check_model` delegates to it; error messages in `ccc doctor` output now include the exception type name (strictly more informative).
- Tests switch to a lighter `paraphrase-MiniLM-L3-v2` model via a new `make_test_user_settings()` helper in `conftest.py`, keeping CI cache costs unchanged. New E2E tests cover the init flows (TTY defaults, `--litellm-model`, non-TTY slim install error, failure-is-non-fatal, flag-rejected-when-settings-exist).
Test plan
- `ccc init` with a bogus `--litellm-model`: FAIL tag + edit-settings hint + project init still completes. ✓