feat: Add Groq LLM plugin #128

Open
oboroge0 wants to merge 1 commit into main from feat/groq-llm

Conversation

oboroge0 (Owner) commented on Mar 9, 2026

Summary

  • Add a new LLM plugin that uses Groq's high-speed inference API
  • Uses the OpenAI-compatible API and supports the Llama 3.3/3.1, Mixtral, and Gemma 2 models
  • Add a Groq section (API key and model selection) to the global settings screen
  • Demo mode support (returns a canned response when no API key is set)

Changes

  • plugins/groq-llm/ — plugin body (manifest.json + node.ts)
  • apps/server-ts/src/engine/executor.ts — added the global settings mapping
  • apps/web/components/ui/SettingsModal.tsx — added a Groq section to the settings screen
  • apps/web/components/panels/NodeSettings.tsx — node settings panel support
  • apps/web/lib/constants.ts — added the Groq model list
  • apps/web/locales/{en,ja}.json — added localization strings
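The executor change fills unset node-config keys from a global settings store. A minimal sketch of how such a mapping could work; the map's key names and the `resolveGlobalDefaults` helper are illustrative assumptions, not the repo's actual code:

```typescript
// Hypothetical shape of the global-settings map: for each node type, which
// node-config keys are filled from which global config keys. The key names
// below are assumptions for illustration.
const GLOBAL_SETTINGS_MAP: Record<string, Record<string, string>> = {
  "groq-llm": {
    apiKey: "groq.apiKey", // node-config key -> global-config key
    model: "groq.model",
  },
};

// Fill unset node-config fields from the global settings store; explicit
// per-node values always win over global defaults.
function resolveGlobalDefaults(
  nodeType: string,
  nodeConfig: Record<string, unknown>,
  globalConfig: Record<string, unknown>,
): Record<string, unknown> {
  const mapping = GLOBAL_SETTINGS_MAP[nodeType] ?? {};
  const resolved = { ...nodeConfig };
  for (const [nodeKey, globalKey] of Object.entries(mapping)) {
    if (resolved[nodeKey] === undefined || resolved[nodeKey] === "") {
      resolved[nodeKey] = globalConfig[globalKey];
    }
  }
  return resolved;
}
```

With this pattern, a node with an explicit model keeps it, while a missing API key falls back to the global Groq setting.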

Test plan

  • npm test — all 116 tests pass
  • npm run lint — no new errors
  • Added a Groq LLM node in the editor and confirmed the settings screen renders correctly
  • Set the Groq API key and model in global settings and confirmed they are reflected in the node
  • Confirmed a response comes back from the Groq API when an API key is set

Closes #39 (Groq portion)

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features
    • Added Groq LLM as a new node type with full configuration support including API key, model selection, system prompt, temperature control, and prompt builder.
    • Integrated Groq settings into the global configuration panel.
    • Added multilingual support for Groq LLM (English and Japanese).

Adds an LLM plugin that generates text using Groq's high-speed inference API.
Uses the OpenAI-compatible API and supports the Llama 3.3/3.1, Mixtral, and Gemma 2 models.

- Plugin body (manifest.json + node.ts)
- Global settings support (API key and model)
- Groq section added to the settings screen
- Japanese and English localization

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

coderabbitai Bot commented Mar 9, 2026

📝 Walkthrough

This PR introduces comprehensive Groq LLM provider support by adding a new plugin with manifest and node implementation, UI configuration components across the web application, server-side settings mapping, and English and Japanese localizations.

Changes

Changes by cohort:

  • Server Configuration — apps/server-ts/src/engine/executor.ts: Added a groq-llm entry to GLOBAL_SETTINGS_MAP mapping apiKey and model to groq-prefixed config keys.
  • Web UI, Node Configuration — apps/web/components/panels/NodeSettings.tsx: Added groq-llm node configuration with fields: apiKey (password), model (select), systemPrompt (textarea), promptSections (prompt-builder), and temperature (number).
  • Web UI, Settings Modal — apps/web/components/ui/SettingsModal.tsx: Added a Groq configuration UI block with API key input and model selection, plus a layout adjustment for spacing.
  • Web Configuration & Localization — apps/web/lib/constants.ts, apps/web/locales/en.json, apps/web/locales/ja.json: Added Groq model options to LLM_MODEL_OPTIONS and node descriptions in the English and Japanese locales.
  • Groq Plugin Implementation — plugins/groq-llm/manifest.json, plugins/groq-llm/node.ts: New Groq LLM plugin with a manifest defining metadata, UI configuration, input/output schema, and config schema; node.ts implements the GroqLLMNode class with setup (OpenAI client initialization), execute (prompt construction and API call), and teardown methods.
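Because Groq exposes an OpenAI-compatible API, the plugin can reuse the standard chat-completions request shape against Groq's base URL (`https://api.groq.com/openai/v1`). A minimal sketch of the request construction; `buildChatRequest` is a hypothetical helper for illustration, not the plugin's actual code:

```typescript
// Groq's OpenAI-compatible endpoint: the same /chat/completions shape as
// OpenAI, just with a different base URL and a Groq API key.
const GROQ_BASE_URL = "https://api.groq.com/openai/v1";

interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: {
    model: string;
    messages: { role: "system" | "user"; content: string }[];
    temperature: number;
  };
}

// Illustrative helper: builds the HTTP request an OpenAI-compatible client
// would send; it does not perform the network call itself.
function buildChatRequest(
  apiKey: string,
  model: string,
  systemPrompt: string,
  userPrompt: string,
  temperature = 0.7,
): ChatRequest {
  return {
    url: `${GROQ_BASE_URL}/chat/completions`,
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: {
      model,
      messages: [
        { role: "system", content: systemPrompt },
        { role: "user", content: userPrompt },
      ],
      temperature,
    },
  };
}
```

In practice the plugin reaches the same endpoint by initializing the openai client with this base URL, which is why no Groq-specific SDK is needed.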

Sequence Diagram

sequenceDiagram
    participant Client as Client/UI
    participant Node as GroqLLMNode
    participant OpenAI as OpenAI Client
    participant GroqAPI as Groq API

    Client->>Node: setup(config with apiKey)
    Note over Node: Initialize OpenAI client<br/>with Groq base URL
    Node->>OpenAI: Create client instance
    
    Client->>Node: execute(inputs with prompt)
    Note over Node: Construct full prompt<br/>from sections or raw input
    Node->>Node: Append systemPrompt<br/>and character context
    
    Node->>OpenAI: createChatCompletion()
    OpenAI->>GroqAPI: POST /chat/completions
    GroqAPI-->>OpenAI: response with message
    OpenAI-->>Node: completion result
    
    Note over Node: Extract text response
    Node->>Node: Emit response.generated event
    Node-->>Client: Return { response: text }
    
    Client->>Node: teardown()
    Note over Node: Clear client reference
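The execute path in the diagram, including a demo-mode fallback that still honors the node's event contract, can be sketched as follows (the class shape, `emit` callback, and canned text are illustrative assumptions, not the actual node.ts):

```typescript
// Rough sketch of the execute() flow in the diagram, with a demo-mode
// fallback that still emits response.generated so listeners behave the
// same with or without an API key.
type NodeEvent = { type: string; payload: Record<string, unknown> };

class GroqLLMNodeSketch {
  static DEMO_RESPONSE = "This is a demo response (no API key configured).";
  private client: ((prompt: string) => Promise<string>) | null;
  private emit: (event: NodeEvent) => void;

  constructor(
    client: ((prompt: string) => Promise<string>) | null,
    emit: (event: NodeEvent) => void,
  ) {
    this.client = client;
    this.emit = emit;
  }

  async execute(prompt: string): Promise<{ response: string }> {
    // Demo mode: no client was initialized in setup(), but listeners
    // still receive the same event as in the real path.
    if (!this.client) {
      const text = GroqLLMNodeSketch.DEMO_RESPONSE;
      this.emit({ type: "response.generated", payload: { text } });
      return { response: text };
    }
    const text = await this.client(prompt);
    this.emit({ type: "response.generated", payload: { text } });
    return { response: text };
  }
}
```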

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

  • PR #95: Both PRs modify the same GLOBAL_SETTINGS_MAP in apps/server-ts/src/engine/executor.ts; this PR extends the global-settings mapping infrastructure introduced in the earlier PR.

Poem

🐰✨ Groq joins the LLM family tree,
Swift inference for all to see,
Config and nodes dance side by side,
UI fields prepare for the ride!
⚡ Fast responses, oh what a spree!

🚥 Pre-merge checks: 4 passed, 1 failed

❌ Failed checks (1 warning)
  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 0.00%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them.

✅ Passed checks (4)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The PR title 'feat: Groq LLMプラグインを追加' ("Add Groq LLM plugin") clearly and concisely describes the main change, which aligns with the changeset.
  • Linked Issues Check: ✅ Passed. The PR implements Groq LLM provider support with manifest.json, a node implementation, model-selection options, and UI integration, meeting the #39 requirements.
  • Out of Scope Changes Check: ✅ Passed. All changes are directly related to adding Groq LLM plugin support; no out-of-scope modifications were detected.


coderabbitai Bot left a comment

Actionable comments posted: 4

🧹 Nitpick comments (1)
apps/web/components/ui/SettingsModal.tsx (1)

308-329: Use Tailwind utilities for the new Groq card wrapper.

The new section adds another inline background style in apps/web/**/*.tsx. Please move this wrapper styling to Tailwind so the new UI follows the repo convention.

As per coding guidelines, "apps/web/**/*.tsx: Use Tailwind CSS for styling React components"

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/components/ui/SettingsModal.tsx` around lines 308 - 329, The Groq
card wrapper in SettingsModal.tsx currently uses an inline style background:
'rgba(0,0,0,0.2)'; replace that inline style with Tailwind utilities by moving
the background into the className (e.g., change the div with className "p-3
rounded-lg" and style={{ background: 'rgba(0,0,0,0.2)' }} to include a Tailwind
background like "p-3 rounded-lg bg-black/20" or an equivalent bg-[rgba(...)]
utility) and remove the style prop so styling follows the apps/web/**/*.tsx
Tailwind convention.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@apps/web/components/panels/NodeSettings.tsx`:
- Around line 885-918: The fallback nodeConfigs["groq-llm"] is missing the
maxTokens field declared in plugins/groq-llm/manifest.json; update the groq
entry to include a field with key "maxTokens", type "number", label "Max Tokens"
and a sensible placeholder/default that matches the manifest's type/shape, and
ensure any related TypeScript types used for this config are updated to keep the
manifest and this fallback in sync.

In `@apps/web/locales/ja.json`:
- Line 103: The Japanese locale is missing Groq config labels so the Groq config
panel falls back to English; add the keys expected by NodeSettings.tsx under the
existing "groq-llm" entry: "groqLlm.label", "groqLlm.apiKey", "groqLlm.model",
"groqLlm.systemPrompt", "groqLlm.promptBuilder", "groqLlm.temperature", and
"groqLlm.maxTokens" (translate them into Japanese consistent with the style of
the surrounding locale strings) so NodeSettings.tsx can resolve
nodeConfig.groqLlm.* without fallback.

In `@plugins/groq-llm/manifest.json`:
- Around line 43-47: The manifest currently marks the "apiKey" field as required
which conflicts with plugins/groq-llm/node.ts that supports running in demo mode
when the key is absent; update the manifest's "apiKey" entry so it's not
required (e.g., remove or set "required": false) and optionally clarify in the
"description" that leaving it blank enables demo mode to match the behavior in
node.ts.

In `@plugins/groq-llm/node.ts`:
- Around line 81-84: The demo-mode early return skips emitting the node's event
payload (so listeners miss response.generated); when this.client is falsy, build
the same response object used in the normal path containing
GroqLLMNode.DEMO_RESPONSE and a populated response.generated, emit it using the
same event/emit mechanism the real path uses, then return that response — i.e.,
replace the direct return with code that logs the message, constructs the
response with generated content, calls the normal emit method (the same one used
elsewhere in this class), and finally returns the response.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: af818d9b-2714-4ec5-86f1-f63c60119ff4

📥 Commits

Reviewing files that changed from the base of the PR and between b45d01e and 21eec16.

📒 Files selected for processing (8)
  • apps/server-ts/src/engine/executor.ts
  • apps/web/components/panels/NodeSettings.tsx
  • apps/web/components/ui/SettingsModal.tsx
  • apps/web/lib/constants.ts
  • apps/web/locales/en.json
  • apps/web/locales/ja.json
  • plugins/groq-llm/manifest.json
  • plugins/groq-llm/node.ts

Comment thread apps/web/components/panels/NodeSettings.tsx
Comment on lines +885 to +918
"groq-llm": {
  label: "LLM (Groq)",
  fields: [
    {
      key: "apiKey",
      type: "password",
      label: "API Key",
      placeholder: "gsk_...",
    },
    {
      key: "model",
      type: "select",
      label: "Model",
      options: LLM_MODEL_OPTIONS.groq,
    },
    {
      key: "systemPrompt",
      type: "textarea",
      label: "System Prompt",
      placeholder: "Enter character settings...",
    },
    {
      key: "promptSections",
      type: "prompt-builder",
      label: "Prompt Builder",
    },
    {
      key: "temperature",
      type: "number",
      label: "Temperature",
      placeholder: "0.7",
    },
  ],
},
⚠️ Potential issue | 🟡 Minor

Keep the Groq fallback schema in sync with the manifest.

plugins/groq-llm/manifest.json exposes maxTokens, but this fallback nodeConfigs["groq-llm"] omits it. Until the plugin manifest is available, the editor renders from this fallback, so the Groq node config is incomplete.

💡 Suggested fallback field
       {
         key: "temperature",
         type: "number",
         label: "Temperature",
         placeholder: "0.7",
       },
+      {
+        key: "maxTokens",
+        type: "number",
+        label: "Max Tokens",
+        placeholder: "1024",
+      },
     ],
   },
As per coding guidelines, "Use TypeScript type safety - match manifest types with TypeScript implementations for config, inputs, and outputs"

Comment thread apps/web/locales/ja.json
"anthropic-llm": "Anthropic APIでテキスト生成するLLMノード。\n【使い方】promptに質問を接続。設定でモデルとシステムプロンプトを指定。",
"google-llm": "Google AI APIでテキスト生成するLLMノード。\n【使い方】promptに質問を接続。設定でモデルとシステムプロンプトを指定。",
"ollama-llm": "ローカルOllamaでテキスト生成するLLMノード。\n【使い方】promptに質問を接続。設定でモデルを指定。※Ollama起動が必要",
"groq-llm": "Groqの高速推論APIでテキスト生成するLLMノード。\n【使い方】promptに質問を接続。設定でモデルとシステムプロンプトを指定。",

⚠️ Potential issue | 🟡 Minor

Japanese Groq locale coverage is still incomplete.

This adds the node description, but NodeSettings.tsx resolves Groq field labels from nodeConfig.groqLlm.*. Without those keys here, the Groq config panel falls back to English in the Japanese UI. Please add the Groq label, apiKey, model, systemPrompt, promptBuilder, temperature, and maxTokens entries as well.


Comment thread plugins/groq-llm/manifest.json
Comment on lines +43 to +47
"apiKey": {
  "type": "password",
  "label": "API Key",
  "description": "Your Groq API key",
  "required": true
⚠️ Potential issue | 🟠 Major

apiKey cannot stay required if demo mode is supported.

plugins/groq-llm/node.ts intentionally handles a missing key by entering demo mode, but this manifest marks the same field as required. Any manifest-driven validation/UI will block the no-key flow before the node ever runs.

💡 Suggested manifest fix
     "apiKey": {
       "type": "password",
       "label": "API Key",
-      "description": "Your Groq API key",
-      "required": true
+      "description": "Your Groq API key. Leave empty to use demo mode."
     },

Comment thread plugins/groq-llm/node.ts
Comment on lines +81 to +84
if (!this.client) {
  await context.log("[デモモード] 定型文応答を返します", "info");
  return { response: GroqLLMNode.DEMO_RESPONSE };
}

⚠️ Potential issue | 🟡 Minor

Demo mode should preserve the node's event contract.

This early return skips response.generated, so listeners only work when a real API key is configured. Emit the canned response before returning so demo mode behaves like the normal path.

💡 Suggested fix
     if (!this.client) {
       await context.log("[デモモード] 定型文応答を返します", "info");
+      await context.emitEvent(
+        createEvent("response.generated", {
+          text: GroqLLMNode.DEMO_RESPONSE,
+          model: this.model,
+        }),
+      );
       return { response: GroqLLMNode.DEMO_RESPONSE };
     }

oboroge0 added a commit that referenced this pull request Mar 9, 2026


Development

Successfully merging this pull request may close these issues.

feat: Add new LLM providers (Groq, Mistral, etc.)

1 participant