
feat: implement task output caching strategy#245

Closed
thalesraymond wants to merge 1 commit into main from feat/task-caching-10490132154278659963

Conversation

@thalesraymond
Owner

Implemented the Task Output Caching capability as detailed in the OpenSpec /feat-task-caching proposal.
Included changes:

  • ICacheProvider contract
  • MemoryCacheProvider implementation
  • TaskCacheConfig within TaskStep
  • CachingExecutionStrategy core functionality
  • Modified the TaskRunner.ts execution pipeline to wrap the standard execution strategy when caching is configured
  • Unit, integration, and end-to-end tests

PR created automatically by Jules for task 10490132154278659963 started by @thalesraymond
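Based on the change list above, a task might opt into caching roughly as sketched below. The field names `key` and `ttl` follow the `step.cache.key(context)` and `step.cache.ttl` accesses quoted in the review comments; everything else (the `BuildContext` type, the key format) is a hypothetical illustration, not the PR's actual API:

```typescript
// Hypothetical TaskCacheConfig shape; `key` and `ttl` follow the
// step.cache.key(context) and step.cache.ttl accesses seen in the review.
interface TaskCacheConfig<TContext> {
  key: (context: TContext) => string | Promise<string>;
  ttl?: number; // milliseconds; omit for no expiration
}

// Illustrative context type (an assumption, not from the PR).
interface BuildContext {
  commitSha: string;
}

// Example step configuration: cache keyed by commit, valid for 5 minutes.
const cacheConfig: TaskCacheConfig<BuildContext> = {
  key: (ctx) => `build:${ctx.commitSha}`,
  ttl: 5 * 60 * 1000,
};
```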

Implemented the Task Output Caching specification. Added the `ICacheProvider`
interface and a `MemoryCacheProvider` implementation with TTL expiration logic.
Added the `CachingExecutionStrategy`, which decorates task execution by serving
cached results when available, skipping redundant executions, and properly
resolving context-mutating restore functions.
Updated `TaskStep` with `cache` configurations and integrated the strategy dynamically
into the `TaskRunner` lifecycle via `setCacheProvider()`.
Added comprehensive tests for memory cache operations and integration workflows.

Co-authored-by: thalesraymond <32554150+thalesraymond@users.noreply.github.com>
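A minimal sketch of the decorator shape described above. The class and interface names (`ICacheProvider`, `CachingExecutionStrategy`) come from the PR, and the cache-key/TTL accesses mirror the snippets quoted in the review; the exact method signatures and the `TaskResult`/`TaskStep` shapes are assumptions:

```typescript
// Assumed result and step shapes; the real definitions in the PR may differ.
interface TaskResult {
  status: "success" | "failure";
  output?: unknown;
}

interface ICacheProvider {
  get(key: string): Promise<TaskResult | undefined>;
  set(key: string, value: TaskResult, ttl?: number): Promise<void>;
}

interface IExecutionStrategy<TContext> {
  execute(step: TaskStep<TContext>, context: TContext): Promise<TaskResult>;
}

interface TaskStep<TContext> {
  cache?: {
    key: (context: TContext) => string | Promise<string>;
    ttl?: number;
  };
}

// Decorator: serve from cache on a hit, otherwise delegate to the inner
// strategy and store successful results.
class CachingExecutionStrategy<TContext> implements IExecutionStrategy<TContext> {
  constructor(
    private readonly innerStrategy: IExecutionStrategy<TContext>,
    private readonly cacheProvider: ICacheProvider
  ) {}

  async execute(step: TaskStep<TContext>, context: TContext): Promise<TaskResult> {
    if (!step.cache) return this.innerStrategy.execute(step, context);
    const key = await step.cache.key(context);
    const cached = await this.cacheProvider.get(key);
    if (cached) return cached;
    const result = await this.innerStrategy.execute(step, context);
    if (result.status === "success") {
      await this.cacheProvider.set(key, result, step.cache.ttl);
    }
    return result;
  }
}
```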
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

@sonarqubecloud

sonarqubecloud bot commented Apr 3, 2026

@codecov

codecov bot commented Apr 3, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


Contributor

@gemini-code-assist bot left a comment


Code Review

This pull request introduces a task caching system by adding a CachingExecutionStrategy, an ICacheProvider interface, and a MemoryCacheProvider implementation. Review feedback highlights the need to prevent dry runs from polluting the cache by implementing a read-only mode. Additionally, it is recommended to wrap cache operations in try-catch blocks to ensure workflow reliability and to implement a background cleanup process in the memory provider to prevent memory leaks from expired entries.

Comment on lines +215 to +217
    if (this.cacheProvider) {
      strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
    }
Contributor


high

The CachingExecutionStrategy is applied even during a dry run. This causes the results of the DryRunExecutionStrategy (which are typically placeholders or mock data) to be persisted in the cache, potentially corrupting it for subsequent real executions. Consider passing a flag to the strategy to disable cache writes during dry runs.

Suggested change

    - if (this.cacheProvider) {
    -   strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
    - }
    + if (this.cacheProvider) {
    +   strategy = new CachingExecutionStrategy(strategy, this.cacheProvider, !!config?.dryRun);
    + }

Comment on lines +10 to +13
    constructor(
      private readonly innerStrategy: IExecutionStrategy<TContext>,
      private readonly cacheProvider: ICacheProvider
    ) {}
Contributor


medium

To prevent dry run results from polluting the cache, consider adding a readOnly flag to the constructor. This allows the strategy to still benefit from cache hits during a dry run without writing mock results back to the cache.

Suggested change

    - constructor(
    -   private readonly innerStrategy: IExecutionStrategy<TContext>,
    -   private readonly cacheProvider: ICacheProvider
    - ) {}
    + constructor(
    +   private readonly innerStrategy: IExecutionStrategy<TContext>,
    +   private readonly cacheProvider: ICacheProvider,
    +   private readonly readOnly: boolean = false
    + ) {}

Comment on lines +24 to +25
    const key = await step.cache.key(context);
    const cachedResult = await this.cacheProvider.get(key);
Contributor


medium

Cache retrieval operations are currently not protected against failures. If the cacheProvider throws an error (e.g., due to a connection issue with an external cache), the entire task execution will fail. It is recommended to wrap cache interactions in a try-catch block to ensure that caching remains an optional optimization and does not compromise the reliability of the workflow execution.

    const key = await step.cache.key(context);
    let cachedResult: TaskResult | undefined;
    try {
      cachedResult = await this.cacheProvider.get(key);
    } catch {
      // Fallback to execution if cache retrieval fails
    }

Comment on lines +39 to +41
    if (result.status === "success") {
      await this.cacheProvider.set(key, result, step.cache.ttl);
    }
Contributor


medium

Ensure that cache writes are skipped when the strategy is in readOnly mode (e.g., during a dry run). Additionally, wrapping the set operation in a try-catch block prevents cache storage failures from affecting the overall task result.

    if (result.status === "success" && !this.readOnly) {
      try {
        await this.cacheProvider.set(key, result, step.cache.ttl);
      } catch {
        // Ignore cache storage failures
      }
    }

    /**
     * A default in-memory implementation of ICacheProvider.
     */
    export class MemoryCacheProvider implements ICacheProvider {
      private cache = new Map<string, { value: TaskResult; expiry: number | null }>();
Contributor


medium

The MemoryCacheProvider lacks a background cleanup process for expired entries. Entries are only removed when they are explicitly accessed via get(). This can lead to a memory leak if many keys with a TTL are set but never retrieved. Consider adding a periodic cleanup task (e.g., using setInterval) or using a more robust caching utility with built-in expiration support.
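One way to implement the suggested periodic cleanup is sketched below. This is an illustration of the reviewer's `setInterval` idea, not the PR's actual code; the `TaskResult` shape, sweep interval, and `dispose()` method are assumptions. The timer is `unref`'d so it does not keep a Node.js process alive on its own:

```typescript
// Assumed result shape; the real definition in the PR may differ.
type TaskResult = { status: "success" | "failure"; output?: unknown };

// In-memory cache with lazy expiry on get() plus a periodic background sweep,
// addressing the review's concern about entries that are set but never read.
class MemoryCacheProvider {
  private cache = new Map<string, { value: TaskResult; expiry: number | null }>();
  private readonly timer: ReturnType<typeof setInterval>;

  constructor(sweepIntervalMs = 60_000) {
    this.timer = setInterval(() => this.evictExpired(), sweepIntervalMs);
    (this.timer as any).unref?.(); // no-op outside Node.js
  }

  async get(key: string): Promise<TaskResult | undefined> {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (entry.expiry !== null && entry.expiry <= Date.now()) {
      this.cache.delete(key); // lazy eviction on access
      return undefined;
    }
    return entry.value;
  }

  async set(key: string, value: TaskResult, ttl?: number): Promise<void> {
    const expiry = ttl != null ? Date.now() + ttl : null;
    this.cache.set(key, { value, expiry });
  }

  // Background sweep: removes expired entries even if never read again.
  private evictExpired(): void {
    const now = Date.now();
    for (const [key, entry] of this.cache) {
      if (entry.expiry !== null && entry.expiry <= now) this.cache.delete(key);
    }
  }

  // Stops the sweep timer, e.g. in tests or on shutdown.
  dispose(): void {
    clearInterval(this.timer);
  }
}
```

Alternatively, a maintained utility such as `lru-cache` provides built-in TTL handling and bounded memory without a hand-rolled sweep.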

