Conversation
📝 Walkthrough

Optimized the PR list-fetching pipeline by splitting it into a lightweight list fetch and a detailed metadata fetch.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Crawl as Crawl Service
    participant Fetcher as Fetcher
    participant GitHub as GitHub API
    Crawl->>Crawl: Compute lastFetchedAt
    alt PR numbers explicitly provided
        Crawl->>Crawl: Skip PR list fetch, build prsToFetch
    else PR numbers not provided
        Crawl->>Fetcher: pullrequestList(stopBefore: lastFetchedAt)
        Fetcher->>GitHub: GetPullRequestList (UPDATED_AT DESC)
        GitHub-->>Fetcher: { number, updatedAt } list (paginated)
        Fetcher->>Fetcher: Early termination on updatedAt < stopBefore
        Fetcher-->>Crawl: Lightweight PR list
        Crawl->>Crawl: Build prsToFetch from list
    end
    loop For each PR to fetch
        Crawl->>Fetcher: pullrequest(pr.number)
        Fetcher->>GitHub: GetPullRequest (full metadata)
        GitHub-->>Fetcher: Full PR metadata
        Fetcher->>Fetcher: Shape to ShapedGitHubPullRequest
        Fetcher-->>Crawl: PR metadata with files
        Crawl->>Crawl: Save prMetadata
    end
```
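The early-termination step in the diagram can be sketched as follows; the in-memory pages and the `pullrequestList` signature here are illustrative stand-ins, not the real fetcher in batch/github/fetcher.ts:

```typescript
interface PrListItem {
  number: number;
  updatedAt: string; // ISO 8601 UTC, as returned by the GitHub API
}

// Simulated paginated result, newest-first (UPDATED_AT DESC).
const pages: PrListItem[][] = [
  [
    { number: 103, updatedAt: "2024-06-03T00:00:00Z" },
    { number: 102, updatedAt: "2024-06-02T00:00:00Z" },
  ],
  [
    { number: 101, updatedAt: "2024-06-01T00:00:00Z" },
    { number: 100, updatedAt: "2024-05-01T00:00:00Z" },
  ],
];

// Phase 1: lightweight list fetch with early termination.
function pullrequestList(stopBefore?: string): PrListItem[] {
  const result: PrListItem[] = [];
  for (const page of pages) {
    let stopped = false;
    for (const node of page) {
      // ISO 8601 UTC strings sort lexicographically in chronological
      // order, so a plain string comparison is safe here.
      if (stopBefore && node.updatedAt < stopBefore) {
        stopped = true;
        break;
      }
      result.push(node);
    }
    if (stopped) break; // skip all remaining pages entirely
  }
  return result;
}

const updated = pullrequestList("2024-06-01T00:00:00Z");
```

Because the list is sorted newest-first, the first PR older than `stopBefore` proves every later one is older too, which is what lets the loop abandon the remaining pages.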
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs
🚥 Pre-merge checks | ✅ Passed checks (5 passed)
🧹 Nitpick comments (2)
app/services/jobs/crawl.server.ts (1)
96-108: Skipping the list fetch when --pr is specified is effective

The implementation that skips the list fetch when PR numbers are explicitly specified is correct.

One point of concern: updatedAt: '' is used as a placeholder and exists only for type consistency. If updatedAt might actually be used in the future, the empty string could cause unexpected behavior.

📝 Consider separating the types

For clarity, consider either treating the type of prsToFetch as a union of { number: number }[] and { number: number; updatedAt: string }[], or adding a comment making the placeholder explicit.

```diff
 const prsToFetch = prNumberSet
-  ? // when --pr is specified: skip the list fetch
-    (input.prNumbers?.map((n) => ({ number: n, updatedAt: '' })) ?? [])
+  ? // when --pr is specified: skip the list fetch (updatedAt is an unused placeholder)
+    (input.prNumbers?.map((n) => ({ number: n, updatedAt: '' as const })) ?? [])
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@app/services/jobs/crawl.server.ts` around lines 96 - 108, prsToFetch currently uses updatedAt: '' as a placeholder when input.prNumbers is provided; replace this brittle empty-string approach by making the two shapes explicit: when using input.prNumbers map to objects without updatedAt (e.g. { number: n }) and keep the fetched branch returning { number, updatedAt }; update the prsToFetch type (or signature of downstream consumers) to accept a union ({ number: number } | { number: number; updatedAt: string })[], or alternatively set updatedAt to undefined instead of '' and update types accordingly; adjust any consumers of prsToFetch (search for prsToFetch usage and functions that expect updatedAt) and add a short comment next to the prNumberSet branch explaining this is an intentionally minimal shape to avoid using an empty string placeholder.

batch/github/fetcher.ts (1)
1060-1067: Early-termination logic via ISO 8601 string comparison

node.updatedAt < stopBefore is a lexicographic comparison, but the ISO 8601 format (YYYY-MM-DDTHH:mm:ssZ) sorts chronologically under lexicographic comparison, so this implementation works correctly. However, that assumption holds only when both values share the same format (UTC ISO 8601). The GitHub API always returns ISO 8601 and stopBefore is also passed in ISO 8601 format, so this is not a problem here, but consider adding a comment for future maintainability.

📝 Suggested comment

```diff
 let stopped = false
 for (const node of pullRequests.nodes) {
   if (!node) continue
+  // ISO 8601 timestamps can be compared lexicographically for chronological order
   if (stopBefore && node.updatedAt < stopBefore) {
     stopped = true
     break
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@batch/github/fetcher.ts` around lines 1060 - 1067, the early-break check uses string comparison (node.updatedAt < stopBefore) which relies on both values being UTC ISO-8601 strings; add a clear comment above that conditional (near the pullRequests.nodes loop / the node.updatedAt < stopBefore check) stating the assumption that GitHub returns ISO 8601 UTC timestamps and stopBefore is provided in the same format, and note that the lexicographical comparison is intentional and valid under that assumption to aid future maintainers; optionally mention that if formats may differ later they should parse to Date or use a library.
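As a quick illustration of the assumption the suggested comment documents: for two timestamps in the same UTC ISO 8601 layout, string order and chronological order agree (the values below are made up):

```typescript
const a = "2024-05-31T23:59:59Z";
const b = "2024-06-01T00:00:00Z";

// Fixed-width ISO 8601 UTC strings put the most significant field first,
// so string comparison and Date comparison give the same answer.
const lexicographic = a < b;
const chronological = Date.parse(a) < Date.parse(b);
```

Mixed offsets (e.g. a `+09:00` suffix on one side) would break the string comparison, which is exactly why the comment should record the same-format assumption.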
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 98aaeb29-d3fc-4f23-aa43-ba8ba6aab376
📒 Files selected for processing (2)
app/services/jobs/crawl.server.ts
batch/github/fetcher.ts
Change the PR list fetch to a lightweight query returning only number + updatedAt, ordered UPDATED_AT DESC, and stop paging as soon as a PR older than lastFetchedAt appears. PR metadata is fetched individually during the detail fetch.

- GetPullRequestListQuery: lightweight list query (number + updatedAt only)
- GetPullRequestQuery: single-PR metadata query
- shapePullRequestNode(): shared conversion from a PR node to ShapedGitHubPullRequest
- pullrequestList(stopBefore?): lightweight list fetch with early termination
- pullrequest(number): individual PR metadata fetch
- crawl.server.ts: fetch lastFetchedAt first, detect updated PRs via the lightweight list, and fetch PR metadata in parallel during the detail fetch
- When --pr is specified, skip the list fetch entirely

On a repository with 7,600+ PRs, the list fetch for a normal crawl drops from 76+ API calls to a single page.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
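A sketch of what the lightweight list query described above might look like; the exact selection set and page size are assumptions, and only the number/updatedAt fields and the UPDATED_AT DESC ordering come from this PR:

```typescript
// Hypothetical GraphQL document for the lightweight list phase; the real
// query is GetPullRequestList in batch/github/fetcher.ts.
const GetPullRequestList = /* GraphQL */ `
  query GetPullRequestList($owner: String!, $repo: String!, $cursor: String) {
    repository(owner: $owner, name: $repo) {
      pullRequests(
        first: 100
        after: $cursor
        orderBy: { field: UPDATED_AT, direction: DESC }
      ) {
        pageInfo {
          hasNextPage
          endCursor
        }
        # Only the two fields the list phase needs; body, assignees,
        # reviewRequests, etc. are deferred to the per-PR detail query.
        nodes {
          number
          updatedAt
        }
      }
    }
  }
`;
```

Trimming the selection set is what makes a 100-item page cheap enough that one page usually covers everything updated since lastFetchedAt.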
Force-pushed from 9d5ea1d to e75b8fc
Split the transient-error detection that had been confined to handleGraphQLError into isTransientError / getErrorStatus / getPartialData. Added 502/504 retries (up to 2) to graphqlWithTimeout so that all fetcher methods retry transient failures consistently. Pagination methods are layered: after graphqlWithTimeout's retries, handleGraphQLError additionally falls back to reducing the page size.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
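The layered retry described here can be sketched as below; isTransientError and graphqlWithTimeout are named in the commit, but the signatures and the simplified status handling are assumptions:

```typescript
// Treat 502/504 (Bad Gateway / Gateway Timeout) as transient.
function isTransientError(status: number): boolean {
  return status === 502 || status === 504;
}

// Retry a request up to maxRetries times on transient failures;
// non-transient errors propagate immediately.
async function graphqlWithTimeout<T>(
  request: () => Promise<T>,
  maxRetries = 2,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await request();
    } catch (err) {
      const status = (err as { status?: number }).status ?? 0;
      if (!isTransientError(status)) throw err; // fail fast
      lastError = err;
      // Pagination callers layer a further fallback on top of this:
      // once retries are exhausted, they reduce the page size.
    }
  }
  throw lastError;
}
```

With `maxRetries = 2` a request is attempted at most three times, matching the "up to 2 retries" in the commit message.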
- Remove the unnecessary updatedAt: '' from prsToFetch when --pr is specified,
  narrowing the type to Array<{ number: number }>
- Add a comment on the stopBefore comparison in pullrequestList noting the intent
  of the ISO 8601 string comparison
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
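A minimal sketch of the narrowed prsToFetch construction described in this commit, assuming a simplified CrawlInput shape and a stand-in buildPrsToFetch helper (the real code lives inline in app/services/jobs/crawl.server.ts):

```typescript
interface CrawlInput {
  prNumbers?: number[];
}

function buildPrsToFetch(
  input: CrawlInput,
  fetchList: () => Array<{ number: number; updatedAt: string }>,
): Array<{ number: number }> {
  // --pr given: skip the list fetch entirely. The detail phase only
  // needs the PR number, so no updatedAt placeholder is required.
  if (input.prNumbers) {
    return input.prNumbers.map((n) => ({ number: n }));
  }
  // Normal crawl: the lightweight list supplies number + updatedAt,
  // which is structurally assignable to the narrower { number } type.
  return fetchList();
}
```

Typing the result as `Array<{ number: number }>` lets both branches coexist without the brittle `updatedAt: ''` sentinel the review flagged.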
Summary
- Change the PR list fetch to a lightweight query with only number + updatedAt (removing body, assignees, reviewRequests, etc.)
- With orderBy: UPDATED_AT DESC plus early termination, the list fetch for a normal crawl drops from 76+ API calls to a single page
- Add pullrequest(number) for individual PR metadata, run in parallel with the commits/reviews fetches
- Extract shapePullRequestNode() to share the PR-node conversion logic
- Skip the list fetch entirely when --pr is specified

Closes #264
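A hypothetical sketch of the extracted shapePullRequestNode() helper; the real ShapedGitHubPullRequest carries many more fields, so this only shows the conversion pattern:

```typescript
// Illustrative subset of the shaped output type.
interface ShapedGitHubPullRequest {
  number: number;
  title: string;
  authorLogin: string | null;
  updatedAt: string;
}

// Raw GraphQL node shape, as both the list query and the single-PR
// query would return it (field names beyond number/updatedAt assumed).
interface PullRequestNode {
  number: number;
  title: string;
  author: { login: string } | null;
  updatedAt: string;
}

// One shared conversion so the single-PR and batch paths cannot drift.
function shapePullRequestNode(node: PullRequestNode): ShapedGitHubPullRequest {
  return {
    number: node.number,
    title: node.title,
    // Nullable GraphQL associations flatten to plain nullable fields
    authorLogin: node.author?.login ?? null,
    updatedAt: node.updatedAt,
  };
}
```

Funneling both GetPullRequest and any list-detail path through one function is what keeps the shaping logic from being duplicated.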
Test plan
- pnpm validate passes
- crawl --pr 8945 works (list fetch skipped, individual PR fetched successfully)
- crawl works (82 updated PRs detected via the lightweight list, detail fetch succeeds)
- crawl --refresh successfully fetches everything

🤖 Generated with Claude Code
Summary by CodeRabbit
Release notes