Changes from all commits
@@ -1,12 +1,12 @@
## 1. Implementation

- [ ] 1.1 Define `ICacheProvider` interface in `src/contracts/ICacheProvider.ts` with `get(key)`, `set(key, result, ttl)`, and `delete(key)` methods.
- [ ] 1.2 Implement `MemoryCacheProvider` in `src/utils/MemoryCacheProvider.ts` as the default in-memory cache implementation.
- [ ] 1.3 Update `TaskStep` interface in `src/TaskStep.ts` to include optional `cache` configuration:
- [x] 1.1 Define `ICacheProvider` interface in `src/contracts/ICacheProvider.ts` with `get(key)`, `set(key, result, ttl)`, and `delete(key)` methods.
- [x] 1.2 Implement `MemoryCacheProvider` in `src/utils/MemoryCacheProvider.ts` as the default in-memory cache implementation.
- [x] 1.3 Update `TaskStep` interface in `src/TaskStep.ts` to include optional `cache` configuration:
- `key`: `(context: TContext) => string | Promise<string>`
- `ttl`: `number` (optional, default to infinite)
- `restore`: `(context: TContext, cachedResult: TaskResult) => void | Promise<void>` (optional, to re-apply context side effects)
- [ ] 1.4 Create `CachingExecutionStrategy` in `src/strategies/CachingExecutionStrategy.ts`.
- [x] 1.4 Create `CachingExecutionStrategy` in `src/strategies/CachingExecutionStrategy.ts`.
- It should implement `IExecutionStrategy`.
- It should accept an inner `IExecutionStrategy` and an `ICacheProvider`.
- In `execute`:
@@ -18,7 +18,7 @@
- Execute inner strategy.
- If successful, store result in cache provider using `ttl`.
- Return result.
- [ ] 1.5 Update `TaskRunner.ts` to support configuring the cache provider and wrapping the execution strategy with `CachingExecutionStrategy` if caching is enabled.
- [ ] 1.6 Add unit tests for `MemoryCacheProvider`.
- [ ] 1.7 Add unit tests for `CachingExecutionStrategy`, verifying cache hits, misses, and restoration of context.
- [ ] 1.8 Add integration tests in `tests/TaskRunnerCaching.test.ts` to verify end-to-end caching behavior with context updates.
- [x] 1.5 Update `TaskRunner.ts` to support configuring the cache provider and wrapping the execution strategy with `CachingExecutionStrategy` if caching is enabled.
- [x] 1.6 Add unit tests for `MemoryCacheProvider`.
- [x] 1.7 Add unit tests for `CachingExecutionStrategy`, verifying cache hits, misses, and restoration of context.
- [x] 1.8 Add integration tests in `tests/TaskRunnerCaching.test.ts` to verify end-to-end caching behavior with context updates.
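The execute flow described in task 1.4 can be sketched as a self-contained function. The types below are simplified stand-ins for the project's real `TaskResult` and `ICacheProvider`, not the actual implementations:

```typescript
// Simplified stand-ins for the interfaces named in tasks 1.1-1.4.
type TaskResult = { status: "success" | "failed"; data?: unknown };

interface ICacheProvider {
  get(key: string): Promise<TaskResult | undefined>;
  set(key: string, result: TaskResult, ttl?: number): Promise<void>;
  delete(key: string): Promise<void>;
}

// Cache-or-execute: return a cached result when present, otherwise run
// the task and store the result on success (per the checklist bullets).
async function runCached(
  cache: ICacheProvider,
  key: string,
  run: () => Promise<TaskResult>,
  ttl?: number,
): Promise<TaskResult> {
  const hit = await cache.get(key);
  if (hit !== undefined) {
    return hit; // cache hit: skip execution entirely
  }
  const result = await run();
  if (result.status === "success") {
    await cache.set(key, result, ttl);
  }
  return result;
}
```

A real `CachingExecutionStrategy` additionally has to invoke the step's `restore` hook on a hit, which this sketch omits.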
15 changes: 15 additions & 0 deletions src/TaskRunner.ts
@@ -17,6 +17,8 @@ import { RetryingExecutionStrategy } from "./strategies/RetryingExecutionStrateg
import { Plugin } from "./contracts/Plugin.js";
import { PluginManager } from "./PluginManager.js";
import { DryRunExecutionStrategy } from "./strategies/DryRunExecutionStrategy.js";
import { ICacheProvider } from "./contracts/ICacheProvider.js";
import { CachingExecutionStrategy } from "./strategies/CachingExecutionStrategy.js";

const MERMAID_ID_REGEX = /[^a-zA-Z0-9_-]/g;

@@ -32,6 +34,7 @@ export class TaskRunner<TContext> {
new RetryingExecutionStrategy(new StandardExecutionStrategy());

private readonly pluginManager: PluginManager<TContext>;
private cacheProvider?: ICacheProvider;

/**
* @param context The shared context object to be passed to each task.
@@ -40,6 +43,16 @@
this.pluginManager = new PluginManager({ events: this.eventBus });
}

/**
* Sets the cache provider to be used for task caching.
* @param provider The cache provider implementation.
* @returns The TaskRunner instance for chaining.
*/
public setCacheProvider(provider: ICacheProvider): this {
this.cacheProvider = provider;
return this;
}

/**
* Subscribe to an event.
* @param event The event name.
@@ -197,6 +210,8 @@
let strategy = this.executionStrategy;
if (config?.dryRun) {
strategy = new DryRunExecutionStrategy<TContext>();
} else if (this.cacheProvider) {
strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
}
Comment on lines 211 to 215 (Contributor, severity: medium):

To ensure better type safety and avoid potential issues with property access narrowing in TypeScript, it's recommended to capture this.cacheProvider in a local constant before checking and using it. This avoids the need for non-null assertions and improves robustness against future changes, aligning with repository standards for handling potentially null or undefined values.

Suggested change
if (config?.dryRun) {
strategy = new DryRunExecutionStrategy<TContext>();
} else if (this.cacheProvider) {
strategy = new CachingExecutionStrategy(strategy, this.cacheProvider);
}
const cacheProvider = this.cacheProvider;
if (config?.dryRun) {
strategy = new DryRunExecutionStrategy<TContext>();
} else if (cacheProvider) {
strategy = new CachingExecutionStrategy(strategy, cacheProvider);
}
References
  1. In TypeScript, avoid using the non-null assertion operator (!). Instead, use explicit checks or safe fallbacks like the nullish coalescing operator (??) to handle potentially null or undefined values. This improves robustness against future changes that might break assumptions about the value's existence.


const executor = new WorkflowExecutor(
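The reviewer's point about narrowing is general TypeScript behavior: a check on `this.cacheProvider` does not stay narrowed inside a closure, while a captured local constant does. A minimal illustration (the class and property names here are invented for the sketch, not taken from the PR):

```typescript
interface Provider { get(key: string): string | undefined }

class Runner {
  constructor(private cacheProvider?: Provider) {}

  // Capturing the optional property in a const lets the narrowing survive
  // into the returned closure; calling this.cacheProvider.get there
  // directly would require a non-null assertion.
  makeReader(): (() => string | undefined) | undefined {
    const provider = this.cacheProvider;
    if (!provider) {
      return undefined;
    }
    return () => provider.get("k");
  }
}
```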
23 changes: 23 additions & 0 deletions src/TaskStep.ts
@@ -2,6 +2,27 @@ import { TaskResult } from "./TaskResult.js";
import { TaskRetryConfig } from "./contracts/TaskRetryConfig.js";
import { TaskLoopConfig } from "./contracts/TaskLoopConfig.js";

/**
* Configuration for task output caching.
* @template TContext The shape of the shared context object.
*/
export interface TaskCacheConfig<TContext> {
/**
* Function to determine the cache key based on the context.
*/
key: (context: TContext) => string | Promise<string>;

/**
* Optional time-to-live for the cached result in milliseconds.
*/
ttl?: number;

/**
* Optional function to re-apply context side effects from a cached result.
*/
restore?: (context: TContext, cachedResult: TaskResult) => void | Promise<void>;
}

/**
* Represents a single, executable step within a workflow.
* @template TContext The shape of the shared context object.
@@ -15,6 +36,8 @@ export interface TaskStep<TContext> {
retry?: TaskRetryConfig;
/** Optional loop configuration for the task. */
loop?: TaskLoopConfig<TContext>;
/** Optional cache configuration for the task. */
cache?: TaskCacheConfig<TContext>;
/**
* Optional function to determine if the task should run.
* If it returns false (synchronously or asynchronously), the task is skipped.
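As a usage sketch of the new `TaskCacheConfig`, keyed per user as an invented example (the context shape and data below are illustrative, and the types are trimmed local copies of the shapes in the diff):

```typescript
// Local copies of the shapes from the diff, trimmed for the sketch.
type TaskResult = { status: string; data?: unknown };

interface TaskCacheConfig<TContext> {
  key: (context: TContext) => string | Promise<string>;
  ttl?: number;
  restore?: (context: TContext, cachedResult: TaskResult) => void | Promise<void>;
}

type Ctx = { userId: string; profile?: { name: string } };

// Key on the user id so each user gets its own cache slot; `restore`
// re-applies the context mutation the original run performed.
const profileCache: TaskCacheConfig<Ctx> = {
  key: (ctx) => `profile:${ctx.userId}`,
  ttl: 5 * 60_000, // five minutes in milliseconds
  restore: (ctx, cached) => {
    ctx.profile = cached.data as Ctx["profile"];
  },
};
```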
29 changes: 29 additions & 0 deletions src/contracts/ICacheProvider.ts
@@ -0,0 +1,29 @@
import { TaskResult } from "../TaskResult.js";

/**
* Interface for a cache provider used to store and retrieve task execution results.
*/
export interface ICacheProvider {
/**
* Retrieves a cached result by its key.
* @param key The unique cache key.
* @returns A promise resolving to the cached TaskResult, or undefined if not found or expired.
*/
get(key: string): Promise<TaskResult | undefined>;

/**
* Stores a task result in the cache.
* @param key The unique cache key.
* @param result The task result to store.
* @param ttl Optional time-to-live in milliseconds. If omitted, the cache may be indefinite.
* @returns A promise that resolves when the store operation completes.
*/
set(key: string, result: TaskResult, ttl?: number): Promise<void>;

/**
* Deletes a cached result by its key.
* @param key The unique cache key.
* @returns A promise that resolves when the delete operation completes.
*/
delete(key: string): Promise<void>;
}
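The interface is small enough that providers compose. For instance, a hedged sketch of a layered provider that checks a fast cache before a slower one (the `LayeredCacheProvider` name and the two-tier design are hypothetical, not part of this PR):

```typescript
type TaskResult = { status: string; data?: unknown };

interface ICacheProvider {
  get(key: string): Promise<TaskResult | undefined>;
  set(key: string, result: TaskResult, ttl?: number): Promise<void>;
  delete(key: string): Promise<void>;
}

// Checks `fast` first, falls back to `slow`, and backfills `fast` on a
// fallback hit (without a TTL, since the remaining TTL isn't stored).
class LayeredCacheProvider implements ICacheProvider {
  constructor(
    private readonly fast: ICacheProvider,
    private readonly slow: ICacheProvider,
  ) {}

  async get(key: string): Promise<TaskResult | undefined> {
    const hit = await this.fast.get(key);
    if (hit !== undefined) return hit;
    const fallback = await this.slow.get(key);
    if (fallback !== undefined) await this.fast.set(key, fallback);
    return fallback;
  }

  async set(key: string, result: TaskResult, ttl?: number): Promise<void> {
    await Promise.all([this.fast.set(key, result, ttl), this.slow.set(key, result, ttl)]);
  }

  async delete(key: string): Promise<void> {
    await Promise.all([this.fast.delete(key), this.slow.delete(key)]);
  }
}
```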
46 changes: 46 additions & 0 deletions src/strategies/CachingExecutionStrategy.ts
@@ -0,0 +1,46 @@
import { IExecutionStrategy } from "./IExecutionStrategy.js";
import { TaskStep } from "../TaskStep.js";
import { TaskResult } from "../TaskResult.js";
import { ICacheProvider } from "../contracts/ICacheProvider.js";

/**
* An execution strategy that wraps another strategy to provide caching capabilities.
* @template TContext The shape of the shared context object.
*/
export class CachingExecutionStrategy<TContext> implements IExecutionStrategy<TContext> {
constructor(
private readonly innerStrategy: IExecutionStrategy<TContext>,
private readonly cacheProvider: ICacheProvider
) {}

async execute(
step: TaskStep<TContext>,
context: TContext,
signal?: AbortSignal
): Promise<TaskResult> {
if (!step.cache) {
return this.innerStrategy.execute(step, context, signal);
}

const cacheKey = await step.cache.key(context);
const cachedResult = await this.cacheProvider.get(cacheKey);

if (cachedResult !== undefined) {
if (step.cache.restore) {
await step.cache.restore(context, cachedResult);
}
return {
...cachedResult,
status: "skipped", // or "cached" if we add it to TaskStatus
};
Comment on lines +32 to +35 (Contributor, severity: high):

Changing the status to skipped on a cache hit is likely incorrect. If the original task execution was successful, the cached result should retain its success status so that dependent tasks can proceed normally. Marking it as skipped might cause the workflow to stop or skip downstream tasks that depend on this task's success.

If you want to distinguish cache hits, consider adding a cached: true property to the TaskResult instead of overwriting the status.

      return {
        ...cachedResult,
      };

}

const result = await this.innerStrategy.execute(step, context, signal);

if (result.status === "success") {
await this.cacheProvider.set(cacheKey, result, step.cache.ttl);
}

return result;
}
}
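If the reviewer's alternative is adopted, the hit path could look like the sketch below. The `cached` field is a hypothetical extension to `TaskResult` suggested in the review, not something this PR defines:

```typescript
// Hypothetical: TaskResult extended with an optional `cached` flag.
type TaskResult = { status: "success" | "failed" | "skipped"; data?: unknown; cached?: boolean };

// Preserve the original status so dependent tasks still see "success",
// but mark the result as served from cache.
function markCacheHit(cachedResult: TaskResult): TaskResult {
  return { ...cachedResult, cached: true };
}
```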
32 changes: 32 additions & 0 deletions src/utils/MemoryCacheProvider.ts
@@ -0,0 +1,32 @@
import { ICacheProvider } from "../contracts/ICacheProvider.js";
import { TaskResult } from "../TaskResult.js";

/**
* An in-memory implementation of ICacheProvider.
*/
export class MemoryCacheProvider implements ICacheProvider {
private readonly cache = new Map<string, { value: TaskResult; expiresAt?: number }>();
Contributor comment (severity: medium):

The MemoryCacheProvider lacks a mechanism to clean up expired entries automatically. If many tasks use caching with TTLs but are never re-accessed, the cache Map will grow indefinitely, leading to a memory leak.

Consider implementing a periodic cleanup (e.g., using setInterval) or using a data structure that supports automatic eviction (like an LRU cache).


async get(key: string): Promise<TaskResult | undefined> {
const entry = this.cache.get(key);
if (!entry) {
return undefined;
}

if (entry.expiresAt !== undefined && Date.now() > entry.expiresAt) {
this.cache.delete(key);
return undefined;
}

return entry.value;
}

async set(key: string, result: TaskResult, ttl?: number): Promise<void> {
const expiresAt = ttl !== undefined ? Date.now() + ttl : undefined;

Check warning on line 25 in src/utils/MemoryCacheProvider.ts (SonarCloud Code Analysis): Unexpected negated condition.
See more on https://sonarcloud.io/project/issues?id=thalesraymond_task-runner&issues=AZ1Q2HB3v1liJNXeSKf8&open=AZ1Q2HB3v1liJNXeSKf8&pullRequest=249
this.cache.set(key, { value: result, expiresAt });
}

async delete(key: string): Promise<void> {
this.cache.delete(key);
}
}
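The reviewer's cleanup suggestion can be sketched as a timer-driven sweep. The class name, the default interval, and the `unref` call are illustrative choices under that suggestion, not part of the PR:

```typescript
// Sketch of a memory cache that periodically evicts expired entries,
// so unread TTL'd entries don't accumulate forever.
class SweepingMemoryCache {
  private readonly cache = new Map<string, { value: unknown; expiresAt?: number }>();
  private readonly timer: ReturnType<typeof setInterval>;

  constructor(sweepIntervalMs = 60_000) {
    this.timer = setInterval(() => this.sweep(), sweepIntervalMs);
    this.timer.unref?.(); // Node: don't keep the process alive just for the sweeper
  }

  set(key: string, value: unknown, ttl?: number): void {
    const expiresAt = ttl === undefined ? undefined : Date.now() + ttl;
    this.cache.set(key, { value, expiresAt });
  }

  get(key: string): unknown {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (entry.expiresAt !== undefined && Date.now() > entry.expiresAt) {
      this.cache.delete(key); // lazy eviction on access, as in the PR
      return undefined;
    }
    return entry.value;
  }

  private sweep(): void {
    const now = Date.now();
    for (const [key, entry] of this.cache) {
      if (entry.expiresAt !== undefined && now > entry.expiresAt) this.cache.delete(key);
    }
  }

  dispose(): void {
    clearInterval(this.timer);
  }
}
```

An LRU with a bounded entry count, as the comment also mentions, would cap memory even for entries without a TTL; the sweep above only handles expired ones.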
77 changes: 77 additions & 0 deletions tests/TaskRunnerCaching.test.ts
@@ -0,0 +1,77 @@
import { describe, it, expect, vi } from "vitest";
import { TaskRunner } from "../src/TaskRunner.js";
import { MemoryCacheProvider } from "../src/utils/MemoryCacheProvider.js";
import { TaskStep } from "../src/TaskStep.js";

describe("TaskRunner Caching Integration", () => {
it("should cache successful task execution and restore context", async () => {
type TestContext = {
counter: number;
restored: boolean;
};

const context: TestContext = { counter: 0, restored: false };
const runner = new TaskRunner(context);
const cacheProvider = new MemoryCacheProvider();
runner.setCacheProvider(cacheProvider);

const runMock = vi.fn().mockImplementation(async (ctx: TestContext) => {
ctx.counter++;
return { status: "success", data: "computed" };
});

const step: TaskStep<TestContext> = {
name: "cached-task",
run: runMock,
cache: {
key: () => "static-key",
restore: (ctx) => {
ctx.restored = true;
},
},
};

// First execution: cache miss
const results1 = await runner.execute([step]);
expect(results1.get("cached-task")?.status).toBe("success");
expect(results1.get("cached-task")?.data).toBe("computed");
expect(runMock).toHaveBeenCalledTimes(1);
expect(context.counter).toBe(1);
expect(context.restored).toBe(false);

// Reset restored flag just in case
context.restored = false;

// Second execution: cache hit
const results2 = await runner.execute([step]);
expect(results2.get("cached-task")?.status).toBe("skipped"); // Or cached
expect(results2.get("cached-task")?.data).toBe("computed");
expect(runMock).toHaveBeenCalledTimes(1); // Not called again
expect(context.counter).toBe(1); // Not incremented
expect(context.restored).toBe(true); // Restored was called
});

it("should not cache if step has no cache configuration", async () => {
type TestContext = { counter: number };
const context: TestContext = { counter: 0 };
const runner = new TaskRunner(context);
const cacheProvider = new MemoryCacheProvider();
runner.setCacheProvider(cacheProvider);

const runMock = vi.fn().mockImplementation(async (ctx: TestContext) => {
ctx.counter++;
return { status: "success", data: "computed" };
});

const step: TaskStep<TestContext> = {
name: "uncached-task",
run: runMock,
};

await runner.execute([step]);
await runner.execute([step]);

expect(runMock).toHaveBeenCalledTimes(2);
expect(context.counter).toBe(2);
});
});