feat: implement task output caching strategy #249
Conversation
Implemented the `feat-task-caching` proposal by:

- Adding `ICacheProvider` interface and `MemoryCacheProvider` implementation.
- Extending `TaskStep` with `cache` configuration for dynamic keys and TTL.
- Creating `CachingExecutionStrategy` to handle cache hits, context restoration, and cache misses.
- Updating `TaskRunner` to configure a provider and conditionally wrap the execution strategy.
- Adding comprehensive unit tests for `MemoryCacheProvider` and `CachingExecutionStrategy`.
- Adding end-to-end integration tests for `TaskRunner` caching behavior.
- Resolving linting issues related to explicit generic typing.

Co-authored-by: thalesraymond <32554150+thalesraymond@users.noreply.github.com>
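As a rough orientation, the pieces named above might fit together like this. This is a hedged sketch: the method names on `ICacheProvider` and the `TaskCacheConfig` fields are assumptions inferred from the description and diff excerpts, not the merged API.

```typescript
// Hypothetical sketch of the contract and step-level cache config.
interface TaskResult {
  status: string;
  output?: unknown;
}

interface ICacheProvider {
  get(key: string): TaskResult | undefined;
  set(key: string, value: TaskResult, ttlMs?: number): void;
}

// Cache configuration a TaskStep might carry: a dynamic key builder and a TTL.
interface TaskCacheConfig {
  key: (contextId: string) => string;
  ttlMs?: number;
}

const cacheConfig: TaskCacheConfig = {
  key: (contextId) => `build:${contextId}`,
  ttlMs: 5 * 60_000, // cache results for five minutes
};
```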
Codecov Report: ✅ All modified and coverable lines are covered by tests.
Code Review
This pull request implements a task caching mechanism by introducing the ICacheProvider interface, a default MemoryCacheProvider, and a CachingExecutionStrategy that wraps existing execution logic. It also updates TaskRunner to support cache provider registration and TaskStep to include cache-specific configurations like keys, TTLs, and context restoration functions. Feedback suggests that returning a "skipped" status on cache hits might negatively impact downstream tasks, recommends adding an automatic cleanup mechanism to the memory cache to prevent potential memory leaks, and proposes using local constants for more robust TypeScript type narrowing.
```ts
return {
  ...cachedResult,
  status: "skipped", // or "cached" if we add it to TaskStatus
};
```
Changing the status to `skipped` on a cache hit is likely incorrect. If the original task execution was successful, the cached result should retain its success status so that dependent tasks can proceed normally. Marking it as `skipped` might cause the workflow to stop or skip downstream tasks that depend on this task's success.

If you want to distinguish cache hits, consider adding a `cached: true` property to the `TaskResult` instead of overwriting the status.
```ts
return {
  ...cachedResult,
};
```

```ts
/**
 * An in-memory implementation of ICacheProvider.
 */
export class MemoryCacheProvider implements ICacheProvider {
  private readonly cache = new Map<string, { value: TaskResult; expiresAt?: number }>();
```
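The reviewer's suggestion, preserving the original status and flagging the hit separately, can be sketched as follows. The `TaskResult` shape and the helper name are hypothetical, used only to illustrate the shape of the fix:

```typescript
// Illustrative types; the real TaskResult likely carries more fields.
interface TaskResult {
  status: string;
  cached?: boolean;
}

// On a cache hit, keep the stored status (e.g. "success") untouched and
// only mark that the result came from the cache.
function fromCacheHit(cachedResult: TaskResult): TaskResult {
  return {
    ...cachedResult,
    cached: true,
  };
}
```

Downstream steps that gate on `status === "success"` then behave the same for cached and freshly executed results, while callers that care can still check the `cached` flag.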
The MemoryCacheProvider lacks a mechanism to clean up expired entries automatically. If many tasks use caching with TTLs but are never re-accessed, the cache Map will grow indefinitely, leading to a memory leak.
Consider implementing a periodic cleanup (e.g., using setInterval) or using a data structure that supports automatic eviction (like an LRU cache).
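One way the suggested periodic cleanup could look, sketched under assumed names (the provider's internals are inferred from the diff excerpt above, not the actual PR code). A timer sweeps expired entries, and lazy eviction on read acts as a backstop between sweeps:

```typescript
// Sketch of a MemoryCacheProvider with a periodic sweep of expired entries.
class MemoryCacheProvider {
  private readonly cache = new Map<string, { value: unknown; expiresAt?: number }>();
  private readonly sweeper: ReturnType<typeof setInterval>;

  constructor(sweepIntervalMs = 60_000) {
    this.sweeper = setInterval(() => this.evictExpired(), sweepIntervalMs);
    // In Node.js, unref() keeps this timer from holding the process open.
    const timer = this.sweeper as unknown as { unref?: () => void };
    timer.unref?.();
  }

  set(key: string, value: unknown, ttlMs?: number): void {
    const expiresAt = ttlMs !== undefined ? Date.now() + ttlMs : undefined;
    this.cache.set(key, { value, expiresAt });
  }

  get(key: string): unknown {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (entry.expiresAt !== undefined && Date.now() > entry.expiresAt) {
      this.cache.delete(key); // lazy eviction on read as a backstop
      return undefined;
    }
    return entry.value;
  }

  private evictExpired(): void {
    const now = Date.now();
    // Deleting during Map iteration is safe per the ECMAScript spec.
    for (const [key, entry] of this.cache) {
      if (entry.expiresAt !== undefined && now > entry.expiresAt) {
        this.cache.delete(key);
      }
    }
  }

  dispose(): void {
    clearInterval(this.sweeper);
  }
}
```

An LRU cache with a size bound (e.g. `lru-cache` on npm) would additionally cap memory for entries that never expire.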
```ts
if (config?.dryRun) {
  strategy = new DryRunExecutionStrategy<TContext>();
} else if (this.cacheProvider) {
  strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
}
```
To ensure better type safety and avoid potential issues with property access narrowing in TypeScript, it's recommended to capture this.cacheProvider in a local constant before checking and using it. This avoids the need for non-null assertions and improves robustness against future changes, aligning with repository standards for handling potentially null or undefined values.
```diff
- if (config?.dryRun) {
-   strategy = new DryRunExecutionStrategy<TContext>();
- } else if (this.cacheProvider) {
-   strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
- }
+ const cacheProvider = this.cacheProvider;
+ if (config?.dryRun) {
+   strategy = new DryRunExecutionStrategy<TContext>();
+ } else if (cacheProvider) {
+   strategy = new CachingExecutionStrategy(strategy, cacheProvider);
+ }
```
References
- In TypeScript, avoid using the non-null assertion operator (!). Instead, use explicit checks or safe fallbacks like the nullish coalescing operator (??) to handle potentially null or undefined values. This improves robustness against future changes that might break assumptions about the value's existence.
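A small illustration of why the local-constant pattern matters for narrowing. The `Runner` class below is hypothetical; the point is that TypeScript does not keep a class property narrowed inside a callback, while a `const` capture stays narrowed:

```typescript
// Hypothetical minimal runner demonstrating narrowing of an optional field.
class Runner {
  constructor(private cacheProvider?: { get(key: string): unknown }) {}

  lookup(key: string): unknown {
    const provider = this.cacheProvider; // narrowed once, stays narrowed
    if (provider === undefined) {
      return undefined;
    }
    // Inside this arrow function, `this.cacheProvider` would widen back to
    // possibly undefined; the local `provider` keeps its non-undefined type,
    // so no `!` assertion is needed.
    return [key].map((k) => provider.get(k))[0];
  }
}
```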
Implemented task caching logic based on the `feat-task-caching` OpenSpec proposal. This includes the `ICacheProvider` contract, an in-memory provider, the `CachingExecutionStrategy` logic, integration into the `TaskRunner`, and full test coverage. The OpenSpec change was also successfully archived.

PR created automatically by Jules for task 1671360838857665010 started by @thalesraymond