Fix Gemini and update the configs to the new settings.json #196
teron131 wants to merge 2 commits into automazeio:main
Conversation
Updated configuration settings for Factory CLI, replacing legacy 'config.json' with 'settings.json' and modifying model display names and properties.
📝 Walkthrough: Documentation update to FACTORY_SETUP.md. Configuration now uses `~/.factory/settings.json` with camelCase keys.
Sequence Diagram(s): (omitted). Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~3 minutes
🚥 Pre-merge checks: ✅ 3 passed
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@FACTORY_SETUP.md`:
- Line 379: Update the unclear sentence under "Provider Setting": replace the
awkward phrase so it reads clearly, e.g. clarify that `provider: "openai"`
defaults to the response API and supports only OpenAI models (for example:
'`provider: "openai"` defaults to the response API and supports only OpenAI
models'), and ensure the quoted example `"provider":
"generic-chat-completion-api"` remains as the alternative.
🧹 Nitpick comments (1)
FACTORY_SETUP.md (1)
74-77: Clarify the conflict resolution behavior. The warning mentions that configurations in both files should not conflict, but it doesn't explain what happens when they do. Consider adding specific guidance on:
- Which file takes precedence when both exist
- Whether settings.json completely overrides config.json or merges with it
- Recommended migration steps for users with existing config.json files
📝 Suggested clarification
 > [!IMPORTANT]
-> `config.json` is considered legacy. While still supported for backwards compatibility, `settings.json` takes priority. Ensure that configurations in both files do not conflict.
+> `config.json` is considered legacy. While still supported for backwards compatibility, `settings.json` takes priority and will completely override `config.json` if both exist. To migrate, rename your `config.json` to `settings.json` or copy your custom models configuration to the new file.
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
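The migration suggested above could be sketched as follows. This is a dry-run sketch, not a committed part of the doc: `~/.factory` is the default location the walkthrough mentions, but the snippet uses a scratch directory stand-in so it is safe to try before touching a real config.

```shell
# Dry-run of the suggested config.json -> settings.json migration.
# FACTORY_DIR stands in for ~/.factory (the default location in the doc);
# point it at the real directory once you're comfortable with the steps.
FACTORY_DIR="$(mktemp -d)"
echo '{}' > "$FACTORY_DIR/config.json"   # stand-in for an existing legacy config

cp "$FACTORY_DIR/config.json" "$FACTORY_DIR/config.json.bak"   # keep a backup first
mv "$FACTORY_DIR/config.json" "$FACTORY_DIR/settings.json"     # settings.json now takes priority

ls "$FACTORY_DIR"   # → config.json.bak  settings.json
```

Copying (rather than moving) and editing the new file would also work if you want to keep the legacy file around, but per the suggested wording above, `settings.json` would override it anyway.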
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
FACTORY_SETUP.md (1)
329-338: Mention the newly added Gemini 3 Flash model in the selection list. You added `gemini-3-flash-preview` in the config example, but the "Select your model" list omits it. Please add it so the doc stays complete.
✏️ Suggested addition
- - `gemini-3-pro-preview`, `gemini-3-pro-image-preview`, `gemini-2.5-pro`, etc.
+ - `gemini-3-flash-preview`, `gemini-3-pro-preview`, `gemini-3-pro-image-preview`, `gemini-2.5-pro`, etc.
🤖 Fix all issues with AI agents
In `@FACTORY_SETUP.md`:
- Around line 376-380: The current note incorrectly implies "provider": "openai"
only supports OpenAI models; update the wording to say that "provider": "openai"
targets the OpenAI response API format (and therefore can be used with
non-OpenAI backends that implement that format), while Gemini 3 Pro specifically
requires Antigravity auth and "provider": "generic-chat-completion-api"; also
update the Qwen and GLM entries to either switch their provider value to
"openai" (if they use OpenAI response API) or add a parenthetical clarifying
they are using a compatible response-API implementation so readers won't be
confused.
 > [!IMPORTANT]
 > **Gemini 3 Pro Configuration Requirements**:
 > - **Authentication**: Gemini 3 Pro models require **Antigravity** authentication (not Gemini CLI auth)
-> - **Provider Setting**: Must use `"provider": "openai"` in Factory config (Antigravity uses OpenAI API format)
+> - **Provider Setting**: Use `"provider": "generic-chat-completion-api"`. The `"provider": "openai"` setting defaults to the OpenAI response API format and only supports OpenAI models.
 > - **Available in**: VibeProxy v1.0.9+ with CLIProxyAPI 6.5.1+
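To make the corrected note concrete, a custom-model entry for Gemini 3 Pro might look like the sketch below. Only the model id `gemini-3-pro-preview` and the `"provider": "generic-chat-completion-api"` value come from this review; the surrounding key names (`customModels`, `modelDisplayName`) are illustrative camelCase placeholders, not a confirmed Factory CLI schema — check the updated FACTORY_SETUP.md for the exact keys expected in `~/.factory/settings.json`.

```json
{
  "customModels": [
    {
      "modelDisplayName": "Gemini 3 Pro (Antigravity)",
      "model": "gemini-3-pro-preview",
      "provider": "generic-chat-completion-api"
    }
  ]
}
```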
Clarify provider guidance to match non‑OpenAI examples.
The note says "provider": "openai" only supports OpenAI models, but the same doc uses "provider": "openai" for Qwen and GLM entries later. This reads as contradictory and could mislead readers. Consider rewording to indicate it targets the OpenAI response API format (and can be used for compatible backends).
✏️ Suggested clarification
-> - **Provider Setting**: Use `"provider": "generic-chat-completion-api"`. The `"provider": "openai"` setting defaults to the OpenAI response API format and only supports OpenAI models.
+> - **Provider Setting**: Use `"provider": "generic-chat-completion-api"`. The `"provider": "openai"` setting defaults to the OpenAI response API format and is intended for OpenAI-compatible models/endpoints.