
API Reference

Auto-generated API documentation from source code using mkdocstrings.

Testing

MigrationTestBase

Bases: MigrationSchemaMixin, MigrationConsistencyMixin, MigrationNamingMixin

Base class for database migration tests.

Inherit from this class and provide the following fixtures to get all five migration tests for free:

Required fixtures
  • migration_db_url: str — async DSN to the test database.
  • orm_metadata: MetaData — SQLAlchemy metadata of your ORM models.

Auto-provided fixtures (override if needed)
  • alembic_config — reads alembic.ini from the current directory.
  • migration_engine — async engine with NullPool.
  • isolated_migration_schema — unique schema per test, auto-dropped.

Optional class attributes
  • migration_diff_ignore_tables: list[str] — table names to exclude from schema diff and naming checks (e.g. auto-generated partition tables).
  • allowed_index_prefixes, allowed_index_suffixes, allowed_fk_suffixes — override naming convention rules.

Example::

class TestMyMigrations(MigrationTestBase):
    migration_diff_ignore_tables = ["events_partitioned_default"]

    @pytest.fixture
    def orm_metadata(self) -> MetaData:
        return Base.metadata

isolated_migration_schema(migration_db_url) async

Create a unique PostgreSQL schema for this test run and drop it afterwards.

test_downgrade_all_the_way(alembic_config, migration_engine, isolated_migration_schema) async

Verify all migrations can be downgraded to base one by one.

test_migrations_up_to_date(alembic_config, migration_engine, isolated_migration_schema, orm_metadata) async

Verify the database schema after a full upgrade matches the SQLAlchemy ORM metadata.

test_naming_conventions(alembic_config, migration_engine, isolated_migration_schema) async

Verify indexes and foreign keys follow naming conventions after a full upgrade.

test_single_head_revision(alembic_config) async

Verify there is exactly one head revision (no unmerged branches).

test_stairway_upgrade_downgrade(alembic_config, migration_engine, isolated_migration_schema) async

Verify every migration can be applied and rolled back individually (stairway test).

For each revision in chronological order:
  1. Upgrade to this revision.
  2. Assert the current revision matches.
  3. Downgrade to the previous revision (or base).
  4. Assert the downgrade succeeded.
  5. Upgrade back before moving to the next revision.
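The loop above can be sketched without a database. In this minimal model, `upgrade` and `downgrade` are stand-ins for `run_alembic_upgrade`/`run_alembic_downgrade`, and a dict holds the "current revision" that Alembic would store in its version table:

```python
# Stairway walk over a fake revision history (base → head order,
# as get_all_revisions would return it).
revisions = ["a1", "b2", "c3"]

state = {"current": None}  # None means base


def upgrade(rev):
    state["current"] = rev


def downgrade(rev):
    state["current"] = None if rev == "base" else rev


for i, rev in enumerate(revisions):
    upgrade(rev)
    assert state["current"] == rev          # step 2: revision applied
    previous = revisions[i - 1] if i > 0 else "base"
    downgrade(previous)
    expected = None if previous == "base" else previous
    assert state["current"] == expected     # step 4: downgrade succeeded
    upgrade(rev)                            # step 5: re-apply before moving on

print(state["current"])  # → c3 after walking the whole stairway
```

The real test performs the same walk against PostgreSQL, so a migration whose `downgrade()` is broken fails at the exact revision that introduced it.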

MigrationSchemaMixin

Provides the isolated_migration_schema fixture.

isolated_migration_schema(migration_db_url) async

Create a unique PostgreSQL schema for this test run and drop it afterwards.

MigrationConsistencyMixin

Core migration correctness tests.

test_downgrade_all_the_way(alembic_config, migration_engine, isolated_migration_schema) async

Verify all migrations can be downgraded to base one by one.

test_migrations_up_to_date(alembic_config, migration_engine, isolated_migration_schema, orm_metadata) async

Verify the database schema after a full upgrade matches the SQLAlchemy ORM metadata.

test_single_head_revision(alembic_config) async

Verify there is exactly one head revision (no unmerged branches).

test_stairway_upgrade_downgrade(alembic_config, migration_engine, isolated_migration_schema) async

Verify every migration can be applied and rolled back individually (stairway test).

For each revision in chronological order:
  1. Upgrade to this revision.
  2. Assert the current revision matches.
  3. Downgrade to the previous revision (or base).
  4. Assert the downgrade succeeded.
  5. Upgrade back before moving to the next revision.

MigrationNamingMixin

Naming convention tests for database objects.

test_naming_conventions(alembic_config, migration_engine, isolated_migration_schema) async

Verify indexes and foreign keys follow naming conventions after a full upgrade.

Utilities

Migration utilities

Migration runner and schema isolation utilities.

run_alembic_upgrade(engine, alembic_config, target_schema='public', revision='head') async

Run alembic upgrade programmatically without spawning a subprocess.

Parameters:
  • engine: AsyncEngine (required) — Async SQLAlchemy engine.
  • alembic_config: Config (required) — Alembic config object (typically from alembic.ini).
  • target_schema: str (default 'public') — PostgreSQL schema to run migrations in.
  • revision: str (default 'head') — Target revision identifier.
Source code in alembic_gauntlet/utils/migrations.py
async def run_alembic_upgrade(
    engine: AsyncEngine,
    alembic_config: Config,
    target_schema: str = "public",
    revision: str = "head",
) -> None:
    """Run ``alembic upgrade`` programmatically without spawning a subprocess.

    Args:
        engine: Async SQLAlchemy engine.
        alembic_config: Alembic config object (typically from ``alembic.ini``).
        target_schema: PostgreSQL schema to run migrations in. Defaults to ``"public"``.
        revision: Target revision identifier. Defaults to ``"head"``.
    """

    def _upgrade(sync_conn: Connection) -> None:
        validate_schema_name(target_schema, sync_conn)
        alembic_config.attributes["target_schema"] = target_schema
        alembic_config.attributes["connection"] = sync_conn
        try:
            command.upgrade(alembic_config, revision)
        finally:
            alembic_config.attributes.pop("connection", None)

    async with engine.begin() as conn:
        await conn.run_sync(_upgrade)

run_alembic_downgrade(engine, alembic_config, target_schema='public', revision='base') async

Run alembic downgrade programmatically without spawning a subprocess.

Parameters:
  • engine: AsyncEngine (required) — Async SQLAlchemy engine.
  • alembic_config: Config (required) — Alembic config object.
  • target_schema: str (default 'public') — PostgreSQL schema to run migrations in.
  • revision: str (default 'base') — Target revision identifier.
Source code in alembic_gauntlet/utils/migrations.py
async def run_alembic_downgrade(
    engine: AsyncEngine,
    alembic_config: Config,
    target_schema: str = "public",
    revision: str = "base",
) -> None:
    """Run ``alembic downgrade`` programmatically without spawning a subprocess.

    Args:
        engine: Async SQLAlchemy engine.
        alembic_config: Alembic config object.
        target_schema: PostgreSQL schema to run migrations in. Defaults to ``"public"``.
        revision: Target revision identifier. Defaults to ``"base"``.
    """

    def _downgrade(sync_conn: Connection) -> None:
        validate_schema_name(target_schema, sync_conn)
        alembic_config.attributes["target_schema"] = target_schema
        alembic_config.attributes["connection"] = sync_conn
        try:
            command.downgrade(alembic_config, revision)
        finally:
            alembic_config.attributes.pop("connection", None)

    async with engine.begin() as conn:
        await conn.run_sync(_downgrade)

get_current_revision(engine, target_schema='public') async

Return the current Alembic revision for the given schema, or None if at base.

Parameters:
  • engine: AsyncEngine (required) — Async SQLAlchemy engine.
  • target_schema: str (default 'public') — PostgreSQL schema to inspect.
Source code in alembic_gauntlet/utils/migrations.py
async def get_current_revision(
    engine: AsyncEngine,
    target_schema: str = "public",
) -> str | None:
    """Return the current Alembic revision for the given schema, or ``None`` if at base.

    Args:
        engine: Async SQLAlchemy engine.
        target_schema: PostgreSQL schema to inspect. Defaults to ``"public"``.
    """

    def _get_rev(sync_conn: Connection) -> str | None:
        validate_schema_name(target_schema, sync_conn)
        ctx = MigrationContext.configure(
            sync_conn,
            opts={"version_table_schema": target_schema},
        )
        return ctx.get_current_revision()  # type: ignore[no-any-return]

    async with engine.connect() as conn:
        return await conn.run_sync(_get_rev)

get_all_revisions(alembic_config)

Return all migration revision IDs in chronological order (base → head).

Parameters:
  • alembic_config: Config (required) — Alembic config object.
Source code in alembic_gauntlet/utils/migrations.py
def get_all_revisions(alembic_config: Config) -> list[str]:
    """Return all migration revision IDs in chronological order (base → head).

    Args:
        alembic_config: Alembic config object.
    """
    script = ScriptDirectory.from_config(alembic_config)
    # walk_revisions() yields head → base; reverse for base → head order.
    revisions = [rev.revision for rev in script.walk_revisions() if rev.revision]
    return list(reversed(revisions))
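The reversal in the last two lines can be shown in isolation. Here `SimpleNamespace` objects stand in for Alembic's `Script` objects (an illustration, not the real class):

```python
from types import SimpleNamespace

# Stand-ins for Alembic Script objects; walk_revisions() yields head → base.
walked = [SimpleNamespace(revision=r) for r in ("c3", "b2", "a1")]

# Same reversal as get_all_revisions: drop empty IDs, then flip to base → head.
ordered = list(reversed([rev.revision for rev in walked if rev.revision]))
print(ordered)  # → ['a1', 'b2', 'c3']
```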

create_isolated_migration_schema(migration_db_url) async

Create a unique PostgreSQL schema for one test run, yield it, then drop it.

Uses a dedicated engine so that schema creation and deletion do not interfere with the connection pool used by the migration engine.

Schema name format: test_mig_{8-char hex}.

Parameters:
  • migration_db_url: str (required) — Async DSN for the test database.

Yields:
  • AsyncGenerator[str, None] — The name of the freshly created schema.

Source code in alembic_gauntlet/utils/migrations.py
async def create_isolated_migration_schema(
    migration_db_url: str,
) -> AsyncGenerator[str, None]:
    """Create a unique PostgreSQL schema for one test run, yield it, then drop it.

    Uses a dedicated engine so that schema creation and deletion do not interfere
    with the connection pool used by the migration engine.

    Schema name format: ``test_mig_{8-char hex}``.

    Args:
        migration_db_url: Async DSN for the test database.

    Yields:
        The name of the freshly created schema.
    """
    schema = f"test_mig_{uuid.uuid4().hex[:8]}"
    validate_schema_name(schema)

    engine = create_async_engine(migration_db_url, echo=False, poolclass=NullPool)
    try:
        async with engine.connect() as conn:
            await conn.execute(CreateSchema(schema))
            await conn.commit()

        yield schema

    finally:
        async with engine.connect() as conn:
            await conn.execute(DropSchema(schema, cascade=True, if_exists=True))
            await conn.commit()
        await engine.dispose()
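The schema-naming scheme is simple enough to demonstrate on its own. A short random hex suffix keeps parallel test runs from colliding, while staying far below PostgreSQL's 63-character identifier limit:

```python
import uuid

# Same naming scheme as create_isolated_migration_schema:
# "test_mig_" plus the first 8 hex chars of a UUID4.
schema = f"test_mig_{uuid.uuid4().hex[:8]}"

assert schema.startswith("test_mig_")
assert len(schema) == len("test_mig_") + 8  # 17 chars, well under the 63-char limit
print(schema)
```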

Schema validation

Schema name validation utilities.

validate_schema_name(name, connection=None)

Validate a PostgreSQL schema name to prevent SQL injection and ensure sanity.

Checks format, length, and optionally reserved words (requires a live connection).

Source code in alembic_gauntlet/utils/validation.py
def validate_schema_name(name: str, connection: Connection | None = None) -> None:
    """Validate a PostgreSQL schema name to prevent SQL injection and ensure sanity.

    Checks format, length, and optionally reserved words (requires a live connection).
    """
    if not name:
        raise EmptySchemaNameError("Schema name cannot be empty.")
    if not _PG_IDENTIFIER_RE.match(name):
        raise InvalidSchemaNameError(
            f"Invalid schema name: {name!r}. "
            "Must start with a letter or underscore and contain only alphanumeric characters and underscores."
        )
    if len(name) > _PG_MAX_IDENTIFIER_LEN:
        raise SchemaNameTooLongError(
            f"Schema name too long: {name!r}. Maximum length is {_PG_MAX_IDENTIFIER_LEN} characters."
        )
    if connection is not None:
        reserved = get_pg_reserved_words(connection)
        if name.lower() in reserved:
            raise ReservedWordSchemaNameError(f"Schema name {name!r} is a PostgreSQL reserved word.")
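The source above does not show `_PG_IDENTIFIER_RE` itself. The sketch below assumes the conventional pattern for PostgreSQL unquoted identifiers, which matches the error message's wording; the real regex may differ in detail:

```python
import re

# Assumed shape of _PG_IDENTIFIER_RE: a letter or underscore followed by
# letters, digits, or underscores (PostgreSQL unquoted identifier rules).
PG_IDENTIFIER_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")
PG_MAX_IDENTIFIER_LEN = 63


def looks_valid(name: str) -> bool:
    """Mirror the format and length checks (reserved words need a live connection)."""
    return (
        bool(name)
        and bool(PG_IDENTIFIER_RE.match(name))
        and len(name) <= PG_MAX_IDENTIFIER_LEN
    )


print(looks_valid("test_mig_a1b2c3d4"))   # → True
print(looks_valid("bad-name; DROP"))      # → False (injection-shaped input rejected)
print(looks_valid("x" * 64))              # → False (exceeds 63-char limit)
```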

Diff utilities

Schema diff filtering for migration consistency tests.

DEFAULT_IGNORE_TABLES = frozenset({'alembic_version'}) module-attribute

is_ignored_diff_item(diff_item, ignore_tables)

Return True if this diff item should be excluded from schema diff checks.

Filters out tables and their indexes that exist in the database but are absent from ORM metadata — for example, partition tables auto-created by PostgreSQL.

Parameters:
  • diff_item: tuple (required) — A single item from Alembic's compare_metadata() result.
  • ignore_tables: frozenset[str] (required) — Set of table names to skip.
Source code in alembic_gauntlet/utils/diff.py
def is_ignored_diff_item(diff_item: tuple, ignore_tables: frozenset[str]) -> bool:
    """Return ``True`` if this diff item should be excluded from schema diff checks.

    Filters out tables and their indexes that exist in the database but are absent
    from ORM metadata — for example, partition tables auto-created by PostgreSQL.

    Args:
        diff_item: A single item from Alembic's ``compare_metadata()`` result.
        ignore_tables: Set of table names to skip.
    """
    if len(diff_item) < 2:
        return False
    op, first = diff_item[0], diff_item[1]
    if op == "remove_table":
        name = getattr(first, "name", None)
        return name in ignore_tables if name else False
    if op == "remove_index":
        table = getattr(first, "table", None)
        name = getattr(table, "name", None) if table is not None else None
        return name in ignore_tables if name else False
    return False
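The predicate only reads `.name` and `.table` attributes, so it can be exercised with lightweight stand-ins. The function body below is reproduced from the source above (minus type-only annotations) to keep the demo self-contained; `SimpleNamespace` objects play the role of SQLAlchemy Table/Index objects:

```python
from types import SimpleNamespace


def is_ignored_diff_item(diff_item, ignore_tables):
    # Same logic as the source above.
    if len(diff_item) < 2:
        return False
    op, first = diff_item[0], diff_item[1]
    if op == "remove_table":
        name = getattr(first, "name", None)
        return name in ignore_tables if name else False
    if op == "remove_index":
        table = getattr(first, "table", None)
        name = getattr(table, "name", None) if table is not None else None
        return name in ignore_tables if name else False
    return False


ignore = frozenset({"alembic_version", "events_partitioned_default"})

# A partition table present in the DB but absent from ORM metadata is filtered:
partition = SimpleNamespace(name="events_partitioned_default")
print(is_ignored_diff_item(("remove_table", partition), ignore))  # → True

# So is an index that lives on an ignored table:
idx = SimpleNamespace(table=SimpleNamespace(name="events_partitioned_default"))
print(is_ignored_diff_item(("remove_index", idx), ignore))        # → True

# Genuine schema drift is kept and will fail the consistency test:
real = SimpleNamespace(name="users")
print(is_ignored_diff_item(("remove_table", real), ignore))       # → False
```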

Naming utilities

Database naming convention validation.

fetch_table_naming_results(sync_conn, schema)

Fetch index and foreign key names for every table in the given schema.

The alembic_version table is always excluded.

Parameters:
  • sync_conn: Connection (required) — Synchronous SQLAlchemy connection.
  • schema: str (required) — PostgreSQL schema to inspect.

Returns:
  • dict[str, TableNamingResults] — Mapping of table_name → TableNamingResults.

Source code in alembic_gauntlet/utils/naming.py
def fetch_table_naming_results(
    sync_conn: Connection,
    schema: str,
) -> dict[str, TableNamingResults]:
    """Fetch index and foreign key names for every table in the given schema.

    The ``alembic_version`` table is always excluded.

    Args:
        sync_conn: Synchronous SQLAlchemy connection.
        schema: PostgreSQL schema to inspect.

    Returns:
        Mapping of ``table_name`` → :class:`TableNamingResults`.
    """
    inspector = inspect(sync_conn)
    tables = inspector.get_table_names(schema=schema)

    results: dict[str, TableNamingResults] = {}
    for table in tables:
        if table == "alembic_version":
            continue
        indexes = inspector.get_indexes(table, schema=schema)
        fks = inspector.get_foreign_keys(table, schema=schema)
        results[table] = {
            "indexes": {idx["name"] for idx in indexes if idx["name"]},
            "fks": [cast(ForeignKeyInfo, fk) for fk in fks],
        }

    return results

validate_naming_results(results, allowed_index_prefixes, allowed_index_suffixes, allowed_fk_suffixes)

Assert that every index and foreign key follows naming conventions.

Index names must match at least one allowed prefix or one allowed suffix. An optional trailing digit is accepted on suffixes to accommodate PostgreSQL partition auto-naming (e.g. users_pkey1).

Parameters:
  • results: dict[str, TableNamingResults] (required) — Output of fetch_table_naming_results.
  • allowed_index_prefixes: list[str] (required) — Prefixes an index name may start with (e.g. ["idx_", "uq_"]).
  • allowed_index_suffixes: list[str] (required) — Suffixes an index name may end with (e.g. ["_pkey", "_idx"]).
  • allowed_fk_suffixes: list[str] (required) — Suffixes a foreign key name must end with (e.g. ["_fkey"]).
Source code in alembic_gauntlet/utils/naming.py
def validate_naming_results(
    results: dict[str, TableNamingResults],
    allowed_index_prefixes: list[str],
    allowed_index_suffixes: list[str],
    allowed_fk_suffixes: list[str],
) -> None:
    """Assert that every index and foreign key follows naming conventions.

    Index names must match at least one allowed prefix **or** one allowed suffix.
    An optional trailing digit is accepted on suffixes to accommodate PostgreSQL
    partition auto-naming (e.g. ``users_pkey1``).

    Args:
        results: Output of :func:`fetch_table_naming_results`.
        allowed_index_prefixes: Prefixes an index name may start with (e.g. ``["idx_", "uq_"]``).
        allowed_index_suffixes: Suffixes an index name may end with (e.g. ``["_pkey", "_idx"]``).
        allowed_fk_suffixes: Suffixes a foreign key name must end with (e.g. ``["_fkey"]``).
    """
    prefix_pats = [re.compile(f"^{re.escape(p)}.*") for p in allowed_index_prefixes]
    suffix_pats = [re.compile(f".*{re.escape(s)}\\d*$") for s in allowed_index_suffixes]
    fk_pats = [re.compile(f".*{re.escape(s)}$") for s in allowed_fk_suffixes]

    for table, data in results.items():
        for idx_name in data["indexes"]:
            valid = any(p.match(idx_name) for p in prefix_pats) or any(p.match(idx_name) for p in suffix_pats)
            assert valid, (
                f"Index '{idx_name}' on table '{table}' does not follow naming conventions. "
                f"Allowed prefixes: {allowed_index_prefixes}, suffixes: {allowed_index_suffixes} "
                "(optional trailing digit permitted)."
            )

        for fk in data["fks"]:
            fk_name = fk.get("name")
            if fk_name:
                valid = any(p.match(fk_name) for p in fk_pats)
                assert valid, (
                    f"Foreign key '{fk_name}' on table '{table}' does not follow naming conventions. "
                    f"Allowed suffixes: {allowed_fk_suffixes}."
                )

Fixtures

Core fixtures

Standard pytest fixtures for Alembic migration testing.

alembic_config()

Create an Alembic Config from alembic.ini.

Override this fixture in your test class to point to a different alembic.ini::

@pytest.fixture
def alembic_config(self) -> Config:
    from pathlib import Path
    from alembic.config import Config
    ini = Path(__file__).parent.parent / "alembic.ini"
    return Config(str(ini))
Source code in alembic_gauntlet/fixtures.py
@pytest.fixture
def alembic_config() -> Config:
    """Create an Alembic :class:`~alembic.config.Config` from ``alembic.ini``.

    Override this fixture in your test class to point to a different ``alembic.ini``::

        @pytest.fixture
        def alembic_config(self) -> Config:
            from pathlib import Path
            from alembic.config import Config
            ini = Path(__file__).parent.parent / "alembic.ini"
            return Config(str(ini))
    """
    return _create_alembic_config()

migration_engine(migration_db_url) async

Async engine with NullPool for migration tests.

NullPool prevents connection reuse between tests, which is important for schema isolation when running tests in parallel.

Requires the migration_db_url fixture to be provided.

Source code in alembic_gauntlet/fixtures.py
@pytest.fixture
async def migration_engine(migration_db_url: str) -> AsyncGenerator[AsyncEngine, None]:
    """Async engine with :class:`~sqlalchemy.pool.NullPool` for migration tests.

    NullPool prevents connection reuse between tests, which is important for
    schema isolation when running tests in parallel.

    Requires the ``migration_db_url`` fixture to be provided.
    """
    engine = create_async_engine(migration_db_url, echo=False, poolclass=NullPool)
    try:
        yield engine
    finally:
        await engine.dispose()

Exceptions

Exception classes for alembic-gauntlet.

EmptySchemaNameError

Bases: SchemaValidationError

Schema name is empty.

InvalidSchemaNameError

Bases: SchemaValidationError

Schema name has invalid format.

ReservedWordSchemaNameError

Bases: SchemaValidationError

Schema name is a PostgreSQL reserved word.

SchemaNameTooLongError

Bases: SchemaValidationError

Schema name exceeds the PostgreSQL maximum identifier length of 63 characters.

SchemaValidationError

Bases: Exception

Base for schema name validation errors.

Contrib

Testcontainers

Optional migration_db_url fixture powered by testcontainers.

Import this fixture in your conftest.py to get a fully managed PostgreSQL container without any external setup::

# tests/conftest.py
from alembic_gauntlet.contrib.testcontainers import migration_db_url  # noqa: F401

The fixture has session scope, so the container starts once per test session. Override migration_db_url in your own conftest to use a different database.

Requires the optional extra::

pip install "alembic-gauntlet[testcontainers]"

migration_db_url()

Start a PostgreSQL 17 container and yield its async DSN.

The container is stopped automatically at the end of the test session.

Raises:
  • ImportError — If testcontainers is not installed.

Type aliases

MigrationDiff