feat: implement task output caching strategy #245
Conversation
Implemented the Task Output Caching specification. Added the `ICacheProvider` interface and a `MemoryCacheProvider` implementation with TTL expiration logic. Added the `CachingExecutionStrategy`, which decorates task execution by conditionally serving results from cache and avoiding redundant executions, while properly resolving context-mutating restore functions. Updated `TaskStep` with `cache` configuration and integrated the strategy dynamically into the `TaskRunner` lifecycle via `setCacheProvider()`. Added comprehensive tests for memory cache operations and integration workflows.

Co-authored-by: thalesraymond <32554150+thalesraymond@users.noreply.github.com>
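The `ICacheProvider` contract itself is not quoted in this review, but from the calls that appear later (`get(key)` and `set(key, result, ttl)`), a hedged sketch of the interface and a minimal TTL-aware memory provider might look like the following; the `TaskResult` shape and exact signatures are assumptions, not the PR's real code:

```typescript
// Sketch only: TaskResult and the ICacheProvider signatures are
// assumptions inferred from the review snippets below.
interface TaskResult {
  status: "success" | "failure";
  output?: unknown;
}

interface ICacheProvider {
  get(key: string): Promise<TaskResult | undefined>;
  set(key: string, value: TaskResult, ttl?: number): Promise<void>;
}

// Minimal in-memory provider: TTL is checked lazily on read.
class MemoryCacheProviderSketch implements ICacheProvider {
  private cache = new Map<string, { value: TaskResult; expiry: number | null }>();

  async get(key: string): Promise<TaskResult | undefined> {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (entry.expiry !== null && Date.now() > entry.expiry) {
      this.cache.delete(key); // evict the expired entry on access
      return undefined;
    }
    return entry.value;
  }

  async set(key: string, value: TaskResult, ttl?: number): Promise<void> {
    this.cache.set(key, { value, expiry: ttl ? Date.now() + ttl : null });
  }
}
```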
Codecov Report: ✅ All modified and coverable lines are covered by tests.
Code Review
This pull request introduces a task caching system by adding a CachingExecutionStrategy, an ICacheProvider interface, and a MemoryCacheProvider implementation. Review feedback highlights the need to prevent dry runs from polluting the cache by implementing a read-only mode. Additionally, it is recommended to wrap cache operations in try-catch blocks to ensure workflow reliability and to implement a background cleanup process in the memory provider to prevent memory leaks from expired entries.
```typescript
if (this.cacheProvider) {
  strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
}
```
The CachingExecutionStrategy is applied even during a dry run. This causes the results of the DryRunExecutionStrategy (which are typically placeholders or mock data) to be persisted in the cache, potentially corrupting it for subsequent real executions. Consider passing a flag to the strategy to disable cache writes during dry runs.
Suggested change:

```diff
 if (this.cacheProvider) {
-  strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
+  strategy = new CachingExecutionStrategy(strategy, this.cacheProvider, !!config?.dryRun);
 }
```
```typescript
constructor(
  private readonly innerStrategy: IExecutionStrategy<TContext>,
  private readonly cacheProvider: ICacheProvider
) {}
```
To prevent dry run results from polluting the cache, consider adding a readOnly flag to the constructor. This allows the strategy to still benefit from cache hits during a dry run without writing mock results back to the cache.
Suggested change:

```diff
 constructor(
   private readonly innerStrategy: IExecutionStrategy<TContext>,
-  private readonly cacheProvider: ICacheProvider
+  private readonly cacheProvider: ICacheProvider,
+  private readonly readOnly: boolean = false
 ) {}
```
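Combining this `readOnly` flag with the error-handling concerns raised elsewhere in this review, the full decorator could be sketched as follows. This is a hedged sketch, not the PR's actual implementation: the `IExecutionStrategy`, `TaskStep`, and `TaskResult` shapes are assumptions based only on the quoted snippets.

```typescript
// All type shapes here are assumptions inferred from the review snippets.
interface TaskResult { status: "success" | "failure"; output?: unknown; }
interface ICacheProvider {
  get(key: string): Promise<TaskResult | undefined>;
  set(key: string, value: TaskResult, ttl?: number): Promise<void>;
}
interface TaskStep<TContext> {
  cache?: { key: (ctx: TContext) => string | Promise<string>; ttl?: number };
}
interface IExecutionStrategy<TContext> {
  execute(step: TaskStep<TContext>, context: TContext): Promise<TaskResult>;
}

class CachingExecutionStrategySketch<TContext> implements IExecutionStrategy<TContext> {
  constructor(
    private readonly innerStrategy: IExecutionStrategy<TContext>,
    private readonly cacheProvider: ICacheProvider,
    private readonly readOnly: boolean = false
  ) {}

  async execute(step: TaskStep<TContext>, context: TContext): Promise<TaskResult> {
    if (!step.cache) return this.innerStrategy.execute(step, context);

    const key = await step.cache.key(context);
    let cached: TaskResult | undefined;
    try {
      // Cache hits still apply in readOnly mode; only writes are skipped.
      cached = await this.cacheProvider.get(key);
    } catch {
      // Treat cache failures as misses so the workflow keeps running.
    }
    if (cached) return cached;

    const result = await this.innerStrategy.execute(step, context);
    if (result.status === "success" && !this.readOnly) {
      try {
        await this.cacheProvider.set(key, result, step.cache.ttl);
      } catch {
        // Ignore cache storage failures.
      }
    }
    return result;
  }
}
```

The key design point is that the cache never sits on the critical path: any provider error degrades to a plain execution rather than failing the task.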
```typescript
const key = await step.cache.key(context);
const cachedResult = await this.cacheProvider.get(key);
```
Cache retrieval operations are currently not protected against failures. If the cacheProvider throws an error (e.g., due to a connection issue with an external cache), the entire task execution will fail. It is recommended to wrap cache interactions in a try-catch block to ensure that caching remains an optional optimization and does not compromise the reliability of the workflow execution.
Suggested change:

```typescript
const key = await step.cache.key(context);
let cachedResult: TaskResult | undefined;
try {
  cachedResult = await this.cacheProvider.get(key);
} catch {
  // Fallback to execution if cache retrieval fails
}
```

```typescript
if (result.status === "success") {
  await this.cacheProvider.set(key, result, step.cache.ttl);
}
```
Ensure that cache writes are skipped when the strategy is in readOnly mode (e.g., during a dry run). Additionally, wrapping the set operation in a try-catch block prevents cache storage failures from affecting the overall task result.
Suggested change:

```typescript
if (result.status === "success" && !this.readOnly) {
  try {
    await this.cacheProvider.set(key, result, step.cache.ttl);
  } catch {
    // Ignore cache storage failures
  }
}
```

```typescript
 * A default in-memory implementation of ICacheProvider.
 */
export class MemoryCacheProvider implements ICacheProvider {
  private cache = new Map<string, { value: TaskResult; expiry: number | null }>();
```
The MemoryCacheProvider lacks a background cleanup process for expired entries. Entries are only removed when they are explicitly accessed via get(). This can lead to a memory leak if many keys with a TTL are set but never retrieved. Consider adding a periodic cleanup task (e.g., using setInterval) or using a more robust caching utility with built-in expiration support.
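A periodic sweep along these lines could look like the following sketch; the class and field names are illustrative, not the PR's actual code:

```typescript
// Illustrative background cleanup for an in-memory TTL cache; the real
// MemoryCacheProvider's internals may differ from this sketch.
class SweepingMemoryCache {
  private cache = new Map<string, { value: unknown; expiry: number | null }>();
  private timer: ReturnType<typeof setInterval>;

  constructor(sweepIntervalMs = 60_000) {
    // Periodically evict expired entries so keys that are never read
    // again do not accumulate.
    this.timer = setInterval(() => this.sweep(), sweepIntervalMs);
    // In Node.js, let the process exit even while the sweeper is pending.
    (this.timer as { unref?: () => void }).unref?.();
  }

  set(key: string, value: unknown, ttlMs?: number): void {
    this.cache.set(key, { value, expiry: ttlMs ? Date.now() + ttlMs : null });
  }

  sweep(): void {
    const now = Date.now();
    for (const [key, entry] of this.cache) {
      if (entry.expiry !== null && now > entry.expiry) this.cache.delete(key);
    }
  }

  dispose(): void {
    clearInterval(this.timer); // stop the sweeper when the cache is retired
  }

  get size(): number {
    return this.cache.size;
  }
}
```

An alternative to hand-rolling this is a maintained utility such as `lru-cache`, which supports TTL-based expiration and bounded size out of the box.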



Implemented the Task Output Caching capability as detailed in the OpenSpec `/feat-task-caching` proposal.

Included changes:

- `ICacheProvider` contract
- `MemoryCacheProvider` implementation
- `TaskCacheConfig` within `TaskStep`
- `CachingExecutionStrategy` core functionality
- `TaskRunner.ts` execution pipeline updated to wrap standard execution when cache is configured.

PR created automatically by Jules for task 10490132154278659963 started by @thalesraymond
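Based on the `step.cache.key(context)` and `step.cache.ttl` calls quoted in the review, a step's cache configuration presumably looks roughly like this sketch; everything beyond the `key` and `ttl` fields (including the context shape) is an assumption for illustration:

```typescript
// Hypothetical shape of TaskCacheConfig; the real definition in this PR
// may differ beyond the key/ttl fields seen in the review snippets.
interface TaskCacheConfig<TContext> {
  key: (context: TContext) => string | Promise<string>;
  ttl?: number; // how long (ms) to keep a successful result
}

// Illustrative context type, not part of the PR.
interface BuildContext {
  branch: string;
  commit: string;
}

const cacheConfig: TaskCacheConfig<BuildContext> = {
  // Derive the cache key from the context fields that affect the output,
  // so distinct inputs never collide on the same cached result.
  key: (ctx) => `build:${ctx.branch}:${ctx.commit}`,
  ttl: 15 * 60 * 1000, // 15 minutes
};
```

A key function like this keeps cache entries correct by construction: if any input that influences the task's output is part of the key, stale hits can only occur within the TTL window.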