# Database Migrations
WebMACS uses Alembic for database schema migrations with full async support.
## Overview
| Component | Details |
|---|---|
| Migration tool | Alembic |
| ORM | SQLAlchemy 2 (async) |
| Driver | asyncpg (runtime) / psycopg2 (migrations) |
| Config | `backend/alembic.ini` + `backend/alembic/env.py` |
| Migrations | `backend/alembic/versions/` |
!!! note "Driver Swap"
    Alembic runs migrations synchronously. The `env.py` automatically swaps `+asyncpg` → `+psycopg2` in the database URL. Make sure both drivers are installed (`asyncpg` for the app, `psycopg2-binary` for migrations).
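The swap that `env.py` performs amounts to a URL rewrite. A minimal sketch (the helper name is hypothetical, and the real `env.py` may use SQLAlchemy's URL API rather than string replacement):

```python
def to_sync_url(async_url: str) -> str:
    """Rewrite an async SQLAlchemy URL to its sync (psycopg2) form.

    Hypothetical helper illustrating what env.py does before handing
    the URL to Alembic's synchronous migration runner.
    """
    return async_url.replace("+asyncpg", "+psycopg2", 1)


url = "postgresql+asyncpg://webmacs:secret@localhost:5432/webmacs"
print(to_sync_url(url))
# postgresql+psycopg2://webmacs:secret@localhost:5432/webmacs
```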
## Current Migrations

| Migration | Description |
|---|---|
| `001_plugins.py` | Plugin instances, channel mappings tables |
| `002_plugin_packages.py` | Plugin packages table (OTA uploads) |
| `003_fk_ondelete.py` | Add `ON DELETE CASCADE`/`SET NULL` to all foreign keys |
## Running Migrations
### Apply All Pending Migrations
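From the `backend/` directory (where `alembic.ini` lives):

```bash
alembic upgrade head
```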
### Check Current Revision
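From the `backend/` directory:

```bash
alembic current
```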
### View Migration History
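From the `backend/` directory:

```bash
alembic history --verbose
```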
### Downgrade (Rollback Last Migration)
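From the `backend/` directory:

```bash
alembic downgrade -1
```

`-1` steps back exactly one revision; pass a specific revision id instead to roll back further.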
## Creating a New Migration
### Step 1 — Modify the Model
Edit `backend/src/webmacs_backend/models.py`:
```python
class MyNewTable(Base):
    __tablename__ = "my_new_table"

    id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
    public_id: Mapped[str] = mapped_column(
        String(100), unique=True, default=lambda: str(uuid.uuid4())
    )
    name: Mapped[str] = mapped_column(String(200), nullable=False)
    created_on: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now()
    )
```
### Step 2 — Auto-generate the Migration
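From `backend/`, let Alembic diff the models against the database (the message below is an example name):

```bash
alembic revision --autogenerate -m "add_my_new_table"
```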
This creates a file in `alembic/versions/` with `upgrade()` and `downgrade()` functions.
### Step 3 — Review the Generated Migration
Always review the generated file. Alembic autogenerate is not perfect — check for:

- [ ] Correct column types and constraints
- [ ] Missing indexes
- [ ] Proper `ondelete` on foreign keys
- [ ] Data migrations (Alembic can't auto-detect these)
Example migration:
```python
"""add_my_new_table

Revision ID: 004_my_new_table
Revises: 003_fk_ondelete
"""
from alembic import op
import sqlalchemy as sa

revision = "004_my_new_table"
down_revision = "003_fk_ondelete"


def upgrade() -> None:
    op.create_table(
        "my_new_table",
        sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
        sa.Column("public_id", sa.String(100), nullable=False),
        sa.Column("name", sa.String(200), nullable=False),
        sa.Column("created_on", sa.DateTime(timezone=True), server_default=sa.func.now()),
        sa.PrimaryKeyConstraint("id"),
        sa.UniqueConstraint("public_id"),
    )


def downgrade() -> None:
    op.drop_table("my_new_table")
```
### Step 4 — Apply
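Apply the new migration from `backend/`:

```bash
alembic upgrade head
```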
### Step 5 — Verify
```bash
# Check the migration was applied
alembic current

# Or connect to the database
just db-shell
\dt  -- list tables
```
## Data Migrations
For data migrations (transforming existing data, not just schema), use `op.execute()`:
```python
def upgrade() -> None:
    # Add the column as nullable first
    op.add_column("events", sa.Column("category", sa.String(50), nullable=True))

    # Backfill existing rows
    op.execute("UPDATE events SET category = 'sensor' WHERE event_type = 'numerical'")

    # Now make it non-nullable
    op.alter_column("events", "category", nullable=False)
```
## Async Considerations
The `env.py` handles async engines transparently:
```python
async def run_async_migrations() -> None:
    connectable = async_engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    async with connectable.connect() as connection:
        await connection.run_sync(do_run_migrations)

    await connectable.dispose()
```
You don't need to do anything special — just run `alembic upgrade head` as normal.
## Docker / Production
In production, migrations run automatically on startup via the Docker entrypoint:
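The entrypoint script itself is not reproduced on this page; a minimal sketch of the pattern (illustrative, not the actual WebMACS script) looks like:

```bash
#!/bin/sh
# Run pending migrations, then hand off to the main process.
set -e
alembic upgrade head
exec "$@"
```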
For manual production migrations:
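Assuming the stack runs under Docker Compose and the API container's service is named `backend` (the service name here is an assumption), you can exec into it:

```bash
docker compose exec backend alembic upgrade head
```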
## Troubleshooting
### "Can't locate revision"
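This usually means the revision recorded in the database's `alembic_version` table has no matching file under `alembic/versions/` (often after switching branches). Compare what Alembic knows against what the database claims:

```bash
alembic history   # list revisions known from the versions/ directory
alembic current   # show the revision recorded in the database
```

If the schema already matches head, `alembic stamp head` will update the bookkeeping table without running any migration; use it with care, since it does not change the schema.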
### "Target database is not up to date"
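This error appears when you run `alembic revision --autogenerate` against a database that is behind the latest revision. Apply pending migrations first, then retry:

```bash
alembic upgrade head
```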
### psycopg2 Not Installed
Install it:
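Within the backend's virtual environment:

```bash
pip install psycopg2-binary
```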
## Next Steps
- Database Architecture — table schemas and relationships
- Testing — writing tests with database fixtures
- Contributing — development workflow