feat: implement task output caching strategy#249

Closed
thalesraymond wants to merge 1 commit into main from feat-task-caching-1671360838857665010

Conversation

@thalesraymond
Owner

Implemented task caching logic based on the feat-task-caching OpenSpec proposal. This includes the `ICacheProvider` contract, an in-memory provider, the `CachingExecutionStrategy` logic, integration into the `TaskRunner`, and full test coverage. The OpenSpec change was also successfully archived.


PR created automatically by Jules for task 1671360838857665010 started by @thalesraymond

Implemented the `feat-task-caching` proposal by:
- Adding `ICacheProvider` interface and `MemoryCacheProvider` implementation.
- Extending `TaskStep` with `cache` configuration for dynamic keys and TTL.
- Creating `CachingExecutionStrategy` to handle cache hits, context restoration, and cache misses.
- Updating `TaskRunner` to configure a provider and conditionally wrap the execution strategy.
- Adding comprehensive unit tests for `MemoryCacheProvider` and `CachingExecutionStrategy`.
- Adding end-to-end integration tests for `TaskRunner` caching behavior.
- Resolving linting issues related to explicit generic typing.
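
Taken together, the pieces listed above could be wired roughly as follows. This is a sketch inferred from the PR description; every interface and signature here (`TaskResult`, `ExecutionStrategy`, the exact shape of the `cache` config) is an assumption, not the actual code:

```typescript
// Sketch only: types inferred from the PR description, not the real ones.
interface TaskResult {
  status: "success" | "failure" | "skipped";
  output?: unknown;
}

interface ICacheProvider {
  get(key: string): Promise<TaskResult | undefined>;
  set(key: string, value: TaskResult, ttlMs?: number): Promise<void>;
}

interface TaskStep<TContext> {
  name: string;
  // Assumed shape of the `cache` configuration: a dynamic key plus optional TTL.
  cache?: { key: (context: TContext) => string; ttlMs?: number };
}

interface ExecutionStrategy<TContext> {
  execute(step: TaskStep<TContext>, context: TContext): Promise<TaskResult>;
}

// Decorator that consults the cache before delegating to the wrapped strategy.
class CachingExecutionStrategy<TContext> implements ExecutionStrategy<TContext> {
  constructor(
    private readonly inner: ExecutionStrategy<TContext>,
    private readonly provider: ICacheProvider,
  ) {}

  async execute(step: TaskStep<TContext>, context: TContext): Promise<TaskResult> {
    if (!step.cache) return this.inner.execute(step, context); // caching not requested
    const key = step.cache.key(context);
    const hit = await this.provider.get(key);
    if (hit) return hit; // cache hit: skip execution, return the stored result
    const result = await this.inner.execute(step, context); // cache miss: run and store
    await this.provider.set(key, result, step.cache.ttlMs);
    return result;
  }
}
```

Under this reading, `TaskRunner` wraps its normal strategy in `CachingExecutionStrategy` only when a provider has been registered, as described above.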

Co-authored-by: thalesraymond <32554150+thalesraymond@users.noreply.github.com>
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

@sonarqubecloud

sonarqubecloud bot commented Apr 3, 2026

@codecov

codecov bot commented Apr 3, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


Contributor

@gemini-code-assist bot left a comment


Code Review

This pull request implements a task caching mechanism: it introduces the `ICacheProvider` interface, a default `MemoryCacheProvider`, and a `CachingExecutionStrategy` that wraps the existing execution logic. It also updates `TaskRunner` to support cache provider registration and extends `TaskStep` with cache-specific configuration (keys, TTLs, and context restoration functions). The feedback below argues that returning a "skipped" status on cache hits could break downstream tasks, recommends automatic cleanup of expired entries in the memory cache to prevent a memory leak, and proposes using a local constant for more robust TypeScript type narrowing.

Comment on lines +32 to +35

```typescript
return {
  ...cachedResult,
  status: "skipped", // or "cached" if we add it to TaskStatus
};
```
Contributor


Severity: high

Changing the status to skipped on a cache hit is likely incorrect. If the original task execution was successful, the cached result should retain its success status so that dependent tasks can proceed normally. Marking it as skipped might cause the workflow to stop or skip downstream tasks that depend on this task's success.

If you want to distinguish cache hits, consider adding a cached: true property to the TaskResult instead of overwriting the status.

```typescript
      return {
        ...cachedResult,
      };
```
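
The flag-based alternative can be illustrated with a short sketch; the `cached` property is hypothetical and not part of the PR's `TaskResult` as written:

```typescript
type TaskStatus = "success" | "failure" | "skipped";

interface TaskResult {
  status: TaskStatus;
  cached?: boolean; // hypothetical provenance flag, not in the PR's TaskResult
}

// Preserve the original status so dependents of a successful task still run;
// record the cache hit in a separate field instead of overwriting status.
function markCacheHit(cachedResult: TaskResult): TaskResult {
  return { ...cachedResult, cached: true };
}
```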

```typescript
/**
 * An in-memory implementation of ICacheProvider.
 */
export class MemoryCacheProvider implements ICacheProvider {
  private readonly cache = new Map<string, { value: TaskResult; expiresAt?: number }>();
```
Contributor


Severity: medium

The MemoryCacheProvider lacks a mechanism to clean up expired entries automatically. If many tasks use caching with TTLs but are never re-accessed, the cache Map will grow indefinitely, leading to a memory leak.

Consider implementing a periodic cleanup (e.g., using setInterval) or using a data structure that supports automatic eviction (like an LRU cache).
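
A minimal sweep for the first option might look like this. The entry shape mirrors the `Map` in the excerpt above; the scheduling comment is an assumption about a Node environment:

```typescript
type CacheEntry = { value: unknown; expiresAt?: number };

// Remove every entry whose TTL has elapsed; returns the number of evictions.
// Deleting the current entry while iterating a Map is safe in JavaScript.
function sweepExpired(cache: Map<string, CacheEntry>, now: number = Date.now()): number {
  let removed = 0;
  for (const [key, entry] of cache) {
    if (entry.expiresAt !== undefined && entry.expiresAt <= now) {
      cache.delete(key);
      removed += 1;
    }
  }
  return removed;
}

// In Node this could be scheduled with, e.g.,
//   setInterval(() => sweepExpired(cache), 60_000).unref();
// so the timer does not keep the process alive. A size-bounded LRU cache
// is the other common way to cap memory growth.
```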

Comment on lines 211 to 215

```typescript
if (config?.dryRun) {
  strategy = new DryRunExecutionStrategy<TContext>();
} else if (this.cacheProvider) {
  strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
}
```
Contributor


Severity: medium

To ensure better type safety and avoid potential issues with property access narrowing in TypeScript, it's recommended to capture this.cacheProvider in a local constant before checking and using it. This avoids the need for non-null assertions and improves robustness against future changes, aligning with repository standards for handling potentially null or undefined values.

Suggested change

```typescript
// before
if (config?.dryRun) {
  strategy = new DryRunExecutionStrategy<TContext>();
} else if (this.cacheProvider) {
  strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
}

// after
const cacheProvider = this.cacheProvider;
if (config?.dryRun) {
  strategy = new DryRunExecutionStrategy<TContext>();
} else if (cacheProvider) {
  strategy = new CachingExecutionStrategy(strategy, cacheProvider);
}
```
References

1. In TypeScript, avoid the non-null assertion operator (`!`). Instead, use explicit checks or safe fallbacks such as the nullish coalescing operator (`??`) to handle potentially null or undefined values. This improves robustness against future changes that might break assumptions about a value's existence.
