Migrations#

Your schema usually changes over time. PostgreSQL-Audit is flexible in this regard, since it stores the audited data in JSONB columns: your schema can change without you having to touch the version history JSONB data columns.

However, if you want to show the version history on the application side, you may want to reflect the changes you make to your schema in the old_data and changed_data columns of the activity table. The alternative is to make your application code aware of every schema change that has happened over time, which quickly gets tedious if your schema evolves rapidly.
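
If you go the first route, the activity rows are typically read back with plain SQLAlchemy queries, and the keys inside old_data and changed_data are the column names as they were when each row was recorded. A rough sketch of such application-side reading; the activity table object and its table_name and issued_at columns are assumptions here, so adjust to your actual activity table:

import sqlalchemy as sa


def table_history(session, activity, table_name):
    # Fetch the recorded changes for one table, oldest first. The
    # issued_at / table_name / changed_data columns are assumed here.
    query = (
        sa.select(activity.c.issued_at, activity.c.changed_data)
        .where(activity.c.table_name == table_name)
        .order_by(activity.c.issued_at)
    )
    return session.execute(query).all()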

Changing column name#

postgresql_audit.migrations.change_column_name(conn, table, old_column_name, new_column_name, schema=None)[source]#

Changes the given key in the activity table's JSONB data columns. This function is useful when you want to reflect column name changes in the activity table.

from alembic import op
from postgresql_audit import change_column_name


def upgrade():
    # Rename the column in the actual table.
    op.alter_column(
        'my_table',
        'my_column',
        new_column_name='some_column'
    )

    # Rename the corresponding key in the activity JSONB data columns.
    change_column_name(op, 'my_table', 'my_column', 'some_column')
Parameters:
  • conn – An object that can execute SQL (a SQLAlchemy Connection or Engine, or an Alembic Operations object)

  • table – The table to run the column name changes against

  • old_column_name – Name of the column to change

  • new_column_name – New column name

  • schema – Optional name of schema to use.
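
A matching downgrade can undo both renames. A minimal sketch, continuing the migration above:

def downgrade():
    # Rename the column back in the actual table.
    op.alter_column(
        'my_table',
        'some_column',
        new_column_name='my_column'
    )

    # Rename the key back in the activity JSONB data columns.
    change_column_name(op, 'my_table', 'some_column', 'my_column')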

Alter column type#

postgresql_audit.migrations.alter_column(conn, table, column_name, func, schema=None)[source]#

Runs the given callable against the given table and column in the activity table's JSONB data columns. This function is useful when you want to reflect type changes in your schema in the activity table.

In the following example we change the data type of User’s age column from string to integer.

import sqlalchemy as sa
from alembic import op
from postgresql_audit import alter_column


def upgrade():
    # Change the column type in the actual table.
    op.alter_column(
        'user',
        'age',
        type_=sa.Integer
    )

    # Cast the existing values in the activity JSONB data columns as well.
    alter_column(
        op,
        'user',
        'age',
        lambda value, activity_table: sa.cast(value, sa.Integer)
    )
Parameters:
  • conn – An object that can execute SQL (a SQLAlchemy Connection or Engine, or an Alembic Operations object)

  • table – The table to run the changes against

  • column_name – Name of the column to run callable against

  • func – A callable to run against the given column in the activity table's JSONB data columns. The callable should take two parameters: the JSONB value corresponding to the given column_name, and the activity table object.

  • schema – Optional name of schema to use.
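
The downgrade can use the same helper to cast the values back. A minimal sketch, continuing the migration above and assuming the column was originally sa.Text:

def downgrade():
    # Restore the original column type in the actual table.
    op.alter_column('user', 'age', type_=sa.Text)

    # Cast the activity JSONB values back to text.
    alter_column(
        op,
        'user',
        'age',
        lambda value, activity_table: sa.cast(value, sa.Text)
    )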

Removing columns#

postgresql_audit.migrations.remove_column(conn, table, column_name, schema=None)[source]#

Removes the given key from the activity table's JSONB data columns. This function is useful when you are making schema changes that require removing a column.

Let’s say you’ve been using PostgreSQL-Audit for a while for a table called article. Now you want to remove one audited column called ‘created_at’ from this table.

from alembic import op
from postgresql_audit import remove_column


def upgrade():
    # Drop the column from the actual table.
    op.drop_column('article', 'created_at')
    # Remove the corresponding key from the activity JSONB data columns.
    remove_column(op, 'article', 'created_at')
Parameters:
  • conn – An object that can execute SQL (a SQLAlchemy Connection or Engine, or an Alembic Operations object)

  • table – The table to remove the column from

  • column_name – Name of the column to remove

  • schema – Optional name of schema to use.
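
A downgrade can re-create the column and its activity key with add_column (documented below), though the JSONB values removed here are gone for good. A minimal sketch, assuming the column was a sa.DateTime:

import sqlalchemy as sa
from postgresql_audit import add_column


def downgrade():
    # Re-create the column in the actual table.
    op.add_column('article', sa.Column('created_at', sa.DateTime()))
    # Re-add the key to the activity JSONB data columns
    # (the previously removed values cannot be recovered).
    add_column(op, 'article', 'created_at')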

Adding columns#

postgresql_audit.migrations.add_column(conn, table, column_name, default_value=None, schema=None)[source]#

Adds the given column key to the activity table's JSONB data columns.

In the following example we reflect the changes made to our schema in the activity table.

import sqlalchemy as sa
from alembic import op
from postgresql_audit import add_column


def upgrade():
    # Add the column to the actual table.
    op.add_column('article', sa.Column('created_at', sa.DateTime()))
    # Add the corresponding key to the activity JSONB data columns.
    add_column(op, 'article', 'created_at')
Parameters:
  • conn – An object that can execute SQL (a SQLAlchemy Connection or Engine, or an Alembic Operations object)

  • table – The table to add the column to

  • column_name – Name of the column to add

  • default_value – The default value of the column

  • schema – Optional name of schema to use.
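
The matching downgrade mirrors the remove_column example above. A minimal sketch:

from postgresql_audit import remove_column


def downgrade():
    # Drop the column from the actual table.
    op.drop_column('article', 'created_at')
    # Remove the key from the activity JSONB data columns again.
    remove_column(op, 'article', 'created_at')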

class postgresql_audit.migrations.jsonb_change_key_name(*clauses: Any)[source]#

Provides jsonb_change_key_name as a SQLAlchemy FunctionElement.

import sqlalchemy as sa
from postgresql_audit import jsonb_change_key_name


# Rename the key 'key1' to 'key2' inside a JSONB value.
data = {'key1': 1, 'key3': 4}
query = sa.select(jsonb_change_key_name(data, 'key1', 'key2'))
session.execute(query).scalar()  # {'key2': 1, 'key3': 4}
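
The function element can also be used directly in an UPDATE against a JSONB column. A hedged sketch; the activity table object, its changed_data column, and the session are assumptions here:

import sqlalchemy as sa
from postgresql_audit import jsonb_change_key_name


# Rewrite a JSONB key in place for every row of an assumed activity table
# that has a changed_data JSONB column.
stmt = sa.update(activity).values(
    changed_data=jsonb_change_key_name(
        activity.c.changed_data, 'old_key', 'new_key'
    )
)
session.execute(stmt)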

Rename table#

postgresql_audit.migrations.rename_table(conn, old_table_name, new_table_name, schema=None)[source]#

Renames the given table in the activity table records. Remember to call this function whenever you rename a versioned table.

from alembic import op
from postgresql_audit import rename_table


def upgrade():
    # Rename the actual table.
    op.rename_table('article', 'article_v2')
    # Update the table name recorded in the activity table.
    rename_table(op, 'article', 'article_v2')
Parameters:
  • conn – An object that can execute SQL (a SQLAlchemy Connection or Engine, or an Alembic Operations object)

  • old_table_name – The name of the table to rename

  • new_table_name – The new name of the table

  • schema – Optional name of schema to use.
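
And the corresponding downgrade. A minimal sketch, continuing the migration above:

def downgrade():
    # Rename the table back.
    op.rename_table('article_v2', 'article')
    # Point the activity records back at the original table name.
    rename_table(op, 'article_v2', 'article')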