
feat: TTL cache — interval-aware in-memory response caching#7

Merged
FaustoS88 merged 1 commit into main from feat/ttl-cache
Mar 14, 2026

Conversation

@FaustoS88
Owner

Adds ohlcv_router.cache — a lightweight in-memory TTL cache for provider responses.

What's new:

  • cache.py: get(), set(), clear(), size(), ttl_for(), is_enabled()
  • TTLs: 30s (1m) → 120s (5m) → 1800s (1h) → 7200s (4h) → 14400s (1d) → 86400s (1w)
  • Expired entries are evicted on read (lazy eviction, no background thread)
  • registry.fetch() checks cache before hitting providers; populates on first success
  • OHLCV_CACHE_ENABLED=false disables cache process-wide — useful for scripts that always need fresh data
  • 26 tests: TTL mapping, get/set/clear, expiry + eviction, env toggle, registry integration (hit avoids provider call, disabled skips cache entirely)
  • README updated with Caching section and TTL reference table
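The public surface listed above could be sketched roughly as follows. Only the function names (`get`, `set`, `clear`, `size`, `ttl_for`, `is_enabled`), the interval-to-TTL table, the lazy-eviction behavior, and the `OHLCV_CACHE_ENABLED` toggle come from this PR; the internals (storage layout, key type, fallback TTL) are assumptions, not the actual cache.py.

```python
import os
import time

# Interval -> TTL in seconds, per the table in this PR.
_TTLS = {"1m": 30, "5m": 120, "1h": 1800, "4h": 7200, "1d": 14400, "1w": 86400}

# key -> (expires_at, value); a plain dict, storage layout is assumed.
_store: dict = {}


def is_enabled() -> bool:
    # OHLCV_CACHE_ENABLED=false disables the cache process-wide.
    return os.environ.get("OHLCV_CACHE_ENABLED", "true").lower() != "false"


def ttl_for(interval: str) -> int:
    # Fallback of 0 for unknown intervals is an assumption.
    return _TTLS.get(interval, 0)


def set(key, value, interval: str) -> None:
    if is_enabled():
        _store[key] = (time.monotonic() + ttl_for(interval), value)


def get(key):
    if not is_enabled():
        return None
    entry = _store.get(key)
    if entry is None:
        return None
    expires_at, value = entry
    if time.monotonic() >= expires_at:
        del _store[key]  # lazy eviction on read, no background thread
        return None
    return value


def clear() -> None:
    _store.clear()


def size() -> int:
    return len(_store)
```

`time.monotonic()` is used instead of `time.time()` so expiry isn't affected by wall-clock adjustments; whether the real cache does the same isn't shown here.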

238 total tests passing. ruff clean.
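The `registry.fetch()` behavior described above (cache hit avoids the provider call; populate on first success) is the standard cache-aside pattern. A self-contained sketch, using a minimal stand-in cache class since the real `ohlcv_router.cache` internals aren't shown in this PR; the provider-iteration and error-handling details are assumptions:

```python
import time


class _TTLCache:
    """Minimal stand-in for ohlcv_router.cache (names assumed from the PR)."""

    def __init__(self):
        self._store = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction on read
            return None
        return value

    def set(self, key, value, ttl: float) -> None:
        self._store[key] = (time.monotonic() + ttl, value)


cache = _TTLCache()


def fetch(symbol: str, interval: str, providers, ttl: float = 30):
    """Cache-aside fetch: hit returns immediately, miss tries providers in order."""
    key = (symbol, interval)
    hit = cache.get(key)
    if hit is not None:
        return hit  # cache hit: no provider call
    for provider in providers:
        try:
            data = provider(symbol, interval)
        except Exception:
            continue  # assumed failover: fall through to the next provider
        cache.set(key, data, ttl)  # populate on first success
        return data
    raise RuntimeError("all providers failed")
```

Failed provider responses are deliberately not cached, so a transient outage doesn't pin a bad result for the full TTL.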

FaustoS88 merged commit 6675d30 into main Mar 14, 2026
4 checks passed
FaustoS88 deleted the feat/ttl-cache branch March 14, 2026 at 10:37