Merged
10 changes: 5 additions & 5 deletions src/aignostics_foundry_core/AGENTS.md
@@ -17,7 +17,7 @@ This file provides an overview of all modules in `aignostics_foundry_core`, thei
| **log** | Configurable loguru logging initialisation | `logging_initialize(filter_func=None, *, context=None)`, `LogSettings` (env-prefix configurable), `InterceptHandler` for stdlib-to-loguru bridging |
| **sentry** | Configurable Sentry integration | `sentry_initialize(integrations, *, context=None)`, `SentrySettings` (env-prefix configurable), `set_sentry_user(user, role_claim)` for Auth0 user context |
| **service** | FastAPI-injectable base service | `BaseService` ABC with `get_service()` (cached per-class FastAPI `Depends` factory), `key()`, and abstract `health()` / `info()` methods; concrete subclasses implement health checks and module info |
-| **database** | Async SQLAlchemy session management + DB settings | `DatabaseSettings` (`OpaqueSettings` subclass; env prefix defaults to `{ctx.env_prefix}DB_`; `get_url()` with optional `db_name` substitution); `init_engine(db_url=None, pool_size=None, max_overflow=None, pool_timeout=None)` — all params optional, fall back to active context when `None`; `dispose_engine()`, `get_db_session()` (FastAPI dependency), `execute_with_session(func, …)`, `cli_run_with_db(func, …, db_url=None)`, `cli_run_with_engine(func, …, db_url=None)`, `with_engine` dual-mode decorator (supports `@with_engine`, `@with_engine()`, `@with_engine(db_url=…)`); auto-resets engine after `fork()` |
+| **database** | Async SQLAlchemy session management + DB settings | `DatabaseSettings` (`OpaqueSettings` subclass; env prefix defaults to `{ctx.env_prefix}DB_`; `get_url()` with optional `db_name` substitution); `init_engine(db_url=None, pool_size=None, pool_max_overflow=None, pool_timeout=None)` — all params optional, fall back to active context when `None`; `dispose_engine()`, `get_db_session()` (FastAPI dependency), `execute_with_session(func, …)`, `cli_run_with_db(func, …, db_url=None)`, `cli_run_with_engine(func, …, db_url=None)`, `with_engine` dual-mode decorator (supports `@with_engine`, `@with_engine()`, `@with_engine(db_url=…)`); auto-resets engine after `fork()` |
| **cli** | Typer CLI preparation utilities | `prepare_cli(cli, epilog, *, context=None)` — discovers and registers subcommands via `locate_implementations`, sets epilog recursively, installs `no_args_is_help` workaround; `no_args_is_help_workaround(ctx)` — raises `typer.Exit` when no subcommand is invoked |
| **boot** | Application / library boot sequence | `boot(context, sentry_integrations, log_filter, show_cmdline)` — runs once per process: parses `--env` CLI args, initialises logging and Sentry, amends the SSL trust chain via *truststore* and *certifi*, and logs boot/shutdown messages |
| **user_agent** | Parameterised HTTP user-agent string builder | `user_agent(project_name, version, repository_url)` — builds `{project_name}-python-sdk/{version} (…)` string including platform info, current test, and GitHub Actions run URL |
@@ -220,7 +220,7 @@ This file provides an overview of all modules in `aignostics_foundry_core`, thei

- **Purpose**: Provides a self-contained `OpaqueSettings` subclass that reads database connection parameters from env vars. The env prefix defaults to `{FoundryContext.env_prefix}DB_` when not supplied, enabling zero-boilerplate DB configuration once a `FoundryContext` is installed.
- **Key Features**:
-- `DatabaseSettings(OpaqueSettings)` — fields: `url: SecretStr` (required), `pool_size: int = 10`, `max_overflow: int = 10`, `pool_timeout: float = 30.0`, `db_name: str | None = None`
+- `DatabaseSettings(OpaqueSettings)` — fields: `url: SecretStr` (required), `pool_size: int = 10`, `pool_max_overflow: int = 10`, `pool_timeout: float = 30.0`, `db_name: str | None = None`
- `__init__(_env_prefix=None, **kwargs)` — when `_env_prefix` is `None`, lazy-imports `get_context` and uses `f"{ctx.env_prefix}DB_"` as the prefix (avoids a circular import at module load time)
- `get_url() -> str` — returns the raw URL from the secret; if `db_name` is set, replaces the path component in the URL (e.g. `…/postgres` → `…/mydb`) while preserving scheme, host, port, query, and fragment
- `model_config = SettingsConfigDict(extra="ignore")` — extra env vars are silently ignored
@@ -323,12 +323,12 @@ This file provides an overview of all modules in `aignostics_foundry_core`, thei

- **Purpose**: Manages a process-level async database engine singleton, providing session injection for FastAPI routes, background jobs, and CLI commands. All public functions accept optional DB-config params and fall back to the active `FoundryContext.database` when they are `None`.
- **Key Features**:
-- `init_engine(db_url=None, pool_size=None, max_overflow=None, pool_timeout=None)` — initialises the global `AsyncEngine` and `async_sessionmaker`; subsequent calls are silent no-ops. When `db_url` is `None`, the URL and pool settings are resolved from `get_context().database`; raises `RuntimeError` if no context is installed or `ctx.database` is `None`. Pool parameters are omitted automatically for SQLite (which does not use `QueuePool`).
+- `init_engine(db_url=None, pool_size=None, pool_max_overflow=None, pool_timeout=None)` — initialises the global `AsyncEngine` and `async_sessionmaker`; subsequent calls are silent no-ops. When `db_url` is `None`, the URL and pool settings are resolved from `get_context().database`; raises `RuntimeError` if no context is installed or `ctx.database` is `None`. Pool parameters are omitted automatically for SQLite (which does not use `QueuePool`).
- `dispose_engine()` — async; disposes the engine; called during application shutdown.
- `get_db_session()` — async generator; yields an `AsyncSession`; raises `RuntimeError` if engine not initialised. Use as a FastAPI `Depends` target.
- `execute_with_session(async_func, *args, **kwargs)` — async; runs `async_func` with a session injected as the `session` keyword argument. For background jobs and CLI helpers.
-- `cli_run_with_db(async_func, *args, db_url=None, pool_size=None, max_overflow=None, pool_timeout=None, **kwargs)` — synchronous wrapper: initialises engine, runs the coroutine, then disposes. All DB-config params optional; fall back to context when `None`. For CLI commands.
-- `cli_run_with_engine(async_func, *args, db_url=None, pool_size=None, max_overflow=None, pool_timeout=None, **kwargs)` — like `cli_run_with_db` but does not inject a session; for jobs that manage sessions themselves.
+- `cli_run_with_db(async_func, *args, db_url=None, pool_size=None, pool_max_overflow=None, pool_timeout=None, **kwargs)` — synchronous wrapper: initialises engine, runs the coroutine, then disposes. All DB-config params optional; fall back to context when `None`. For CLI commands.
+- `cli_run_with_engine(async_func, *args, db_url=None, pool_size=None, pool_max_overflow=None, pool_timeout=None, **kwargs)` — like `cli_run_with_db` but does not inject a session; for jobs that manage sessions themselves.
- `with_engine` — dual-mode decorator; supports `@with_engine` (no-parens), `@with_engine()` (empty parens), and `@with_engine(db_url=…, …)` (explicit params). All params optional; fall back to context when absent. For long-lived workers; does **not** dispose after running.
- Fork safety: `multiprocessing.util.register_after_fork` resets the engine in child processes automatically.
- **Location**: `aignostics_foundry_core/database.py`
48 changes: 26 additions & 22 deletions src/aignostics_foundry_core/database.py
@@ -37,7 +37,7 @@ class DatabaseSettings(OpaqueSettings):

* ``{PREFIX}URL`` — required; the full database connection URL
* ``{PREFIX}POOL_SIZE`` — optional; connection pool size (default ``10``)
-* ``{PREFIX}MAX_OVERFLOW`` — optional; maximum pool overflow (default ``10``)
+* ``{PREFIX}POOL_MAX_OVERFLOW`` — optional; maximum pool overflow (default ``10``)
* ``{PREFIX}POOL_TIMEOUT`` — optional; pool checkout timeout in seconds (default ``30.0``)
* ``{PREFIX}NAME`` — optional; override the database name in the URL path component
"""
@@ -46,7 +46,7 @@ class DatabaseSettings(OpaqueSettings):

url: SecretStr
pool_size: int = 10
-max_overflow: int = 10
+pool_max_overflow: int = 10
pool_timeout: float = 30.0
db_name: str | None = None

@@ -121,14 +121,14 @@ class _DatabaseModuleSentinel:


_DEFAULT_POOL_SIZE = 10
-_DEFAULT_MAX_OVERFLOW = 10
+_DEFAULT_POOL_MAX_OVERFLOW = 10
_DEFAULT_POOL_TIMEOUT = 30.0


def _resolve_db_params(
db_url: str | None,
pool_size: int | None,
-max_overflow: int | None,
+pool_max_overflow: int | None,
pool_timeout: float | None,
) -> tuple[str, int, int, float]:
"""Resolve database connection parameters, falling back to the active context.
@@ -138,7 +138,7 @@ def _resolve_db_params(
params are replaced by their module-level defaults.

Returns:
-A tuple of ``(db_url, pool_size, max_overflow, pool_timeout)``.
+A tuple of ``(db_url, pool_size, pool_max_overflow, pool_timeout)``.

Raises:
RuntimeError: If ``db_url`` is ``None`` and no context is installed, or
@@ -154,21 +154,21 @@ def _resolve_db_params(
return (
ctx.database.get_url(),
pool_size if pool_size is not None else ctx.database.pool_size,
-max_overflow if max_overflow is not None else ctx.database.max_overflow,
+pool_max_overflow if pool_max_overflow is not None else ctx.database.pool_max_overflow,
pool_timeout if pool_timeout is not None else ctx.database.pool_timeout,
)
return (
db_url,
pool_size if pool_size is not None else _DEFAULT_POOL_SIZE,
-max_overflow if max_overflow is not None else _DEFAULT_MAX_OVERFLOW,
+pool_max_overflow if pool_max_overflow is not None else _DEFAULT_POOL_MAX_OVERFLOW,
pool_timeout if pool_timeout is not None else _DEFAULT_POOL_TIMEOUT,
)


def init_engine(
db_url: str | None = None,
pool_size: int | None = None,
-max_overflow: int | None = None,
+pool_max_overflow: int | None = None,
pool_timeout: float | None = None,
) -> None:
"""Initialize the database engine singleton.
@@ -190,7 +190,7 @@ def init_engine(
When ``None``, resolved from the active context's ``database`` settings.
pool_size: Number of connections to keep in the pool. Ignored for dialects that
do not support QueuePool (e.g. SQLite). Defaults to the context value or 10.
-max_overflow: Number of additional connections above pool_size. Ignored for
+pool_max_overflow: Number of additional connections above pool_size. Ignored for
dialects that do not support QueuePool. Defaults to the context value or 10.
pool_timeout: Seconds to wait for a connection from the pool. Ignored for
dialects that do not support QueuePool. Defaults to the context value or 30.
@@ -205,12 +205,14 @@ def init_engine(
logger.trace("Database engine already initialized, reusing existing engine and connection pool.")
return # Already initialized

-db_url, pool_size, max_overflow, pool_timeout = _resolve_db_params(db_url, pool_size, max_overflow, pool_timeout)
+db_url, pool_size, pool_max_overflow, pool_timeout = _resolve_db_params(
+db_url, pool_size, pool_max_overflow, pool_timeout
+)

logger.trace(
-"Initializing global database engine with pool_size={}, max_overflow={}, pool_timeout={}",
+"Initializing global database engine with pool_size={}, pool_max_overflow={}, pool_timeout={}",
pool_size,
-max_overflow,
+pool_max_overflow,
pool_timeout,
)

@@ -222,7 +224,7 @@ }
}
if not db_url.startswith("sqlite"):
engine_kwargs["pool_size"] = pool_size
-engine_kwargs["max_overflow"] = max_overflow
+engine_kwargs["max_overflow"] = pool_max_overflow
engine_kwargs["pool_timeout"] = pool_timeout

_engine = create_async_engine(**engine_kwargs)
@@ -307,7 +309,7 @@ def cli_run_with_db(
*args: Any, # noqa: ANN401
db_url: str | None = None,
pool_size: int | None = None,
-max_overflow: int | None = None,
+pool_max_overflow: int | None = None,
pool_timeout: float | None = None,
**kwargs: Any, # noqa: ANN401
) -> Any: # noqa: ANN401
@@ -327,7 +329,7 @@ def cli_run_with_db(
*args: Positional arguments forwarded to ``async_func``.
db_url: Database connection URL. When ``None``, resolved from the active context.
pool_size: Connection pool size (ignored for SQLite).
-max_overflow: Max overflow connections (ignored for SQLite).
+pool_max_overflow: Max overflow connections (ignored for SQLite).
pool_timeout: Pool wait timeout in seconds (ignored for SQLite).
**kwargs: Keyword arguments forwarded to ``async_func``.

@@ -337,7 +339,7 @@ def cli_run_with_db(
import asyncio # noqa: PLC0415

logger.trace("Initializing database engine for cli_run_with_db")
-init_engine(db_url=db_url, pool_size=pool_size, max_overflow=max_overflow, pool_timeout=pool_timeout)
+init_engine(db_url=db_url, pool_size=pool_size, pool_max_overflow=pool_max_overflow, pool_timeout=pool_timeout)
logger.debug("Database engine initialized for cli_run_with_db")

try:
@@ -354,7 +356,7 @@ def cli_run_with_engine(
*args: Any, # noqa: ANN401
db_url: str | None = None,
pool_size: int | None = None,
-max_overflow: int | None = None,
+pool_max_overflow: int | None = None,
pool_timeout: float | None = None,
**kwargs: Any, # noqa: ANN401
) -> Any: # noqa: ANN401
@@ -374,7 +376,7 @@ def cli_run_with_engine(
*args: Positional arguments forwarded to ``async_func``.
db_url: Database connection URL. When ``None``, resolved from the active context.
pool_size: Connection pool size (ignored for SQLite).
-max_overflow: Max overflow connections (ignored for SQLite).
+pool_max_overflow: Max overflow connections (ignored for SQLite).
pool_timeout: Pool wait timeout in seconds (ignored for SQLite).
**kwargs: Keyword arguments forwarded to ``async_func``.

@@ -384,7 +386,7 @@ def cli_run_with_engine(
import asyncio # noqa: PLC0415

logger.trace("Initializing database engine for cli_run_with_engine")
-init_engine(db_url=db_url, pool_size=pool_size, max_overflow=max_overflow, pool_timeout=pool_timeout)
+init_engine(db_url=db_url, pool_size=pool_size, pool_max_overflow=pool_max_overflow, pool_timeout=pool_timeout)
logger.debug("Database engine initialized for cli_run_with_engine")

try:
@@ -400,7 +402,7 @@ def with_engine(
*,
db_url: str | None = None,
pool_size: int | None = None,
-max_overflow: int | None = None,
+pool_max_overflow: int | None = None,
pool_timeout: float | None = None,
) -> Any: # noqa: ANN401
"""Decorator (or decorator factory) to ensure database engine is initialized for async functions.
@@ -423,7 +425,7 @@ def with_engine(
without parentheses). Do not pass explicitly.
db_url: Database connection URL. When ``None``, resolved from the active context.
pool_size: Connection pool size (ignored for SQLite).
-max_overflow: Max overflow connections (ignored for SQLite).
+pool_max_overflow: Max overflow connections (ignored for SQLite).
pool_timeout: Pool wait timeout in seconds (ignored for SQLite).

Returns:
@@ -450,7 +452,9 @@ def decorator(f: Any) -> Any:  # noqa: ANN401
@functools.wraps(f)
async def wrapper(*args: Any, **kwargs: Any) -> Any: # noqa: ANN401
logger.trace("Initializing database engine in with_engine wrapper for function {}", func_name)
-init_engine(db_url=db_url, pool_size=pool_size, max_overflow=max_overflow, pool_timeout=pool_timeout)
+init_engine(
+db_url=db_url, pool_size=pool_size, pool_max_overflow=pool_max_overflow, pool_timeout=pool_timeout
+)
logger.debug("Database engine initialized in with_engine wrapper for function {}", func_name)

try:
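The dual-mode dispatch that `with_engine` performs — accepting bare `@with_engine`, empty `@with_engine()`, and parameterised `@with_engine(db_url=…)` — follows a standard Python pattern: if the decorator receives a callable positionally, it was applied without parentheses. A minimal synchronous sketch of that pattern only (the real decorator wraps async functions and calls `init_engine` before each invocation):

```python
import functools


def with_engine(func=None, *, db_url=None, pool_size=None, pool_max_overflow=None, pool_timeout=None):
    """Dual-mode decorator sketch: usable with and without parentheses."""

    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            # The real implementation calls init_engine(db_url=db_url, ...) here,
            # which is a silent no-op when the engine singleton already exists.
            return f(*args, **kwargs)

        return wrapper

    if func is not None:  # bare @with_engine: func is the decorated function itself
        return decorator(func)
    return decorator  # @with_engine() or @with_engine(db_url=...): factory mode


@with_engine
def job_a():
    return "a"


@with_engine(pool_max_overflow=20)
def job_b():
    return "b"
```

Because the keyword-only `*` separates `func` from the config parameters, a parameterised call can never bind a function into `func` by accident.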
14 changes: 7 additions & 7 deletions tests/aignostics_foundry_core/database_settings_test.py
@@ -14,10 +14,10 @@
CUSTOM_PREFIX = "CUSTOM_DB_"
CUSTOM_PREFIX_URL_ENV = "CUSTOM_DB_URL"
DEFAULT_POOL_SIZE = 10
-DEFAULT_MAX_OVERFLOW = 10
+DEFAULT_POOL_MAX_OVERFLOW = 10
DEFAULT_POOL_TIMEOUT = 30
OVERRIDE_POOL_SIZE = 5
-OVERRIDE_MAX_OVERFLOW = 20
+OVERRIDE_POOL_MAX_OVERFLOW = 20
OVERRIDE_POOL_TIMEOUT = 60


@@ -94,24 +94,24 @@ def test_explicit_env_prefix_overrides_context(monkeypatch: pytest.MonkeyPatch)

@pytest.mark.unit
def test_pool_defaults_are_applied() -> None:
-"""pool_size, max_overflow, pool_timeout take their defaults when not set in env."""
+"""pool_size, pool_max_overflow, pool_timeout take their defaults when not set in env."""
settings = DatabaseSettings(_env_prefix="TEST_DB_", url=SQLITE_URL)
assert settings.pool_size == DEFAULT_POOL_SIZE
-assert settings.max_overflow == DEFAULT_MAX_OVERFLOW
+assert settings.pool_max_overflow == DEFAULT_POOL_MAX_OVERFLOW
assert int(settings.pool_timeout) == DEFAULT_POOL_TIMEOUT


@pytest.mark.unit
def test_pool_overrides_from_env(monkeypatch: pytest.MonkeyPatch) -> None:
-"""Pool params read from {PREFIX}POOL_SIZE, {PREFIX}MAX_OVERFLOW, {PREFIX}POOL_TIMEOUT."""
+"""Pool params read from {PREFIX}POOL_SIZE, {PREFIX}POOL_MAX_OVERFLOW, {PREFIX}POOL_TIMEOUT."""
monkeypatch.setenv("TEST_DB_URL", SQLITE_URL)
monkeypatch.setenv("TEST_DB_POOL_SIZE", str(OVERRIDE_POOL_SIZE))
-monkeypatch.setenv("TEST_DB_MAX_OVERFLOW", str(OVERRIDE_MAX_OVERFLOW))
+monkeypatch.setenv("TEST_DB_POOL_MAX_OVERFLOW", str(OVERRIDE_POOL_MAX_OVERFLOW))
monkeypatch.setenv("TEST_DB_POOL_TIMEOUT", str(OVERRIDE_POOL_TIMEOUT))

settings = DatabaseSettings(_env_prefix="TEST_DB_")
assert settings.pool_size == OVERRIDE_POOL_SIZE
-assert settings.max_overflow == OVERRIDE_MAX_OVERFLOW
+assert settings.pool_max_overflow == OVERRIDE_POOL_MAX_OVERFLOW
assert int(settings.pool_timeout) == OVERRIDE_POOL_TIMEOUT


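The env-var mapping these tests exercise (`{PREFIX}POOL_MAX_OVERFLOW` → `pool_max_overflow`, with typed defaults when unset) is handled automatically by pydantic-settings in `DatabaseSettings`; a dependency-free sketch of the same prefix-based lookup (the helper name `load_pool_settings` is invented for illustration):

```python
import os


def load_pool_settings(prefix: str) -> dict:
    """Read pool settings from environment variables with the given prefix,
    coercing values to the type of each default and falling back when unset."""
    defaults = {"POOL_SIZE": 10, "POOL_MAX_OVERFLOW": 10, "POOL_TIMEOUT": 30.0}
    return {
        key.lower(): type(default)(os.environ.get(prefix + key, default))
        for key, default in defaults.items()
    }


os.environ["TEST_DB_POOL_MAX_OVERFLOW"] = "20"
print(load_pool_settings("TEST_DB_"))  # pool_max_overflow overridden, others default
```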