Release: 0.8.7 | Release Date: July 22, 2014

SQLAlchemy 0.8 Documentation

Core Events

This section describes the event interfaces provided in SQLAlchemy Core. For an introduction to the event listening API, see Events. ORM events are described in ORM Events.

New in version 0.7: The event system supersedes the previous system of “extension”, “listener”, and “proxy” classes.

Connection Pool Events

class sqlalchemy.events.PoolEvents

Bases: sqlalchemy.event.Events

Available events for Pool.

The methods here define the name of an event as well as the names of members that are passed to listener functions.

e.g.:

from sqlalchemy import event
from sqlalchemy.pool import Pool

def my_on_checkout(dbapi_conn, connection_rec, connection_proxy):
    "handle an on checkout event"

event.listen(Pool, 'checkout', my_on_checkout)

In addition to accepting the Pool class and Pool instances, PoolEvents also accepts Engine objects and the Engine class as targets, which will be resolved to the .pool attribute of the given engine or the Pool class:

engine = create_engine("postgresql://scott:tiger@localhost/test")

# will associate with engine.pool
event.listen(engine, 'checkout', my_on_checkout)

checkin(dbapi_connection, connection_record)

Called when a connection returns to the pool.

Note that the connection may be closed, and may be None if the connection has been invalidated. checkin will not be called for detached connections. (They do not return to the pool.)

Parameters:
  • dbapi_connection – A raw DB-API connection
  • connection_record – The _ConnectionRecord that persistently manages the connection
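
For example, a minimal sketch of a checkin listener that records each return to the pool, guarding for the case noted above where the DB-API connection may be None; the logger name here is an assumption for illustration:

import logging

from sqlalchemy import event
from sqlalchemy.pool import Pool

log = logging.getLogger(__name__)

@event.listens_for(Pool, "checkin")
def receive_checkin(dbapi_connection, connection_record):
    # dbapi_connection is None if the connection was invalidated
    if dbapi_connection is None:
        log.info("invalidated connection returned to the pool")
    else:
        log.info("connection %r returned to the pool", dbapi_connection)
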
checkout(dbapi_connection, connection_record, connection_proxy)

Called when a connection is retrieved from the Pool.

Parameters:
  • dbapi_connection – A raw DB-API connection
  • connection_record – The _ConnectionRecord that persistently manages the connection
  • connection_proxy – The _ConnectionFairy which manages the connection for the span of the current checkout.

If you raise a DisconnectionError, the current connection will be disposed and a fresh connection retrieved. Processing of all checkout listeners will abort and restart using the new connection.
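
A minimal sketch of this pattern: the listener below pings the connection with a trivial statement on each checkout and raises DisconnectionError when the ping fails, so that the pool discards the stale connection and retries with a fresh one (the bare except and the "SELECT 1" ping are simplifications for illustration):

from sqlalchemy import event, exc
from sqlalchemy.pool import Pool

@event.listens_for(Pool, "checkout")
def ping_connection(dbapi_connection, connection_record, connection_proxy):
    cursor = dbapi_connection.cursor()
    try:
        cursor.execute("SELECT 1")
    except:
        # the connection appears dead; DisconnectionError tells the
        # pool to dispose of it and retry with a fresh connection
        raise exc.DisconnectionError()
    finally:
        cursor.close()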

connect(dbapi_connection, connection_record)

Called once for each new DB-API connection, that is, for each invocation of the Pool’s creator() function.

Parameters:
  • dbapi_connection – A newly connected raw DB-API connection (not a SQLAlchemy Connection wrapper).
  • connection_record – The _ConnectionRecord that persistently manages the connection
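
For instance, the connect event is commonly used to configure each new DB-API connection as soon as it is created; the sketch below assumes a SQLite backend and enables foreign key enforcement on every new connection:

from sqlalchemy import event
from sqlalchemy.pool import Pool

@event.listens_for(Pool, "connect")
def set_sqlite_pragma(dbapi_connection, connection_record):
    # runs once per newly created DB-API connection
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()
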
first_connect(dbapi_connection, connection_record)

Called exactly once for the first DB-API connection.

Parameters:
  • dbapi_connection – A newly connected raw DB-API connection (not a SQLAlchemy Connection wrapper).
  • connection_record – The _ConnectionRecord that persistently manages the connection
reset(dbapi_con, con_record)

Called before the “reset” action occurs for a pooled connection.

This event represents when the rollback() method is called on the DBAPI connection before it is returned to the pool. The behavior of “reset”, including disabling it entirely, can be controlled using the reset_on_return pool argument.

The PoolEvents.reset() event is usually followed by the PoolEvents.checkin() event, except in those cases where the connection is discarded immediately after the reset.

Parameters:
  • dbapi_con – A raw DB-API connection
  • con_record – The _ConnectionRecord that persistently manages the connection

New in version 0.8.
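
As a brief illustration, a reset listener can be used to observe connections being rolled back on their way into the pool; this is a minimal sketch and the logger name is an assumption:

import logging

from sqlalchemy import event
from sqlalchemy.pool import Pool

log = logging.getLogger(__name__)

@event.listens_for(Pool, "reset")
def receive_reset(dbapi_con, con_record):
    # fires just before the pool issues rollback() on the connection
    log.debug("resetting connection %r", dbapi_con)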

SQL Execution and Connection Events

class sqlalchemy.events.ConnectionEvents

Bases: sqlalchemy.event.Events

Available events for Connectable, which includes Connection and Engine.

The methods here define the name of an event as well as the names of members that are passed to listener functions.

An event listener can be associated with any Connectable class or instance, such as an Engine, e.g.:

import logging

from sqlalchemy import event, create_engine

log = logging.getLogger(__name__)

def before_cursor_execute(conn, cursor, statement, parameters,
                          context, executemany):
    log.info("Received statement: %s" % statement)

engine = create_engine('postgresql://scott:tiger@localhost/test')
event.listen(engine, "before_cursor_execute", before_cursor_execute)

or with a specific Connection:

with engine.begin() as conn:
    @event.listens_for(conn, 'before_cursor_execute')
    def before_cursor_execute(conn, cursor, statement, parameters,
                                    context, executemany):
        log.info("Received statement: %s" % statement)

The before_execute() and before_cursor_execute() events can also be established with the retval=True flag, which allows modification of the statement and parameters to be sent to the database. The before_cursor_execute() event is particularly useful here to add ad-hoc string transformations, such as comments, to all executions:

from sqlalchemy.engine import Engine
from sqlalchemy import event

@event.listens_for(Engine, "before_cursor_execute", retval=True)
def comment_sql_calls(conn, cursor, statement, parameters,
                                    context, executemany):
    statement = statement + " -- some comment"
    return statement, parameters

Note

ConnectionEvents can be established on the Engine class, the Connection class, and on instances of each of those classes. Events across all four scopes will fire for a given Connection instance. However, for performance reasons, the Connection object determines at instantiation time whether or not its parent Engine has event listeners established. Event listeners added to the Engine class, or to an Engine instance, after a dependent Connection instance has already been instantiated will usually not be available on that Connection instance. Newly added listeners will instead take effect for Connection instances created after those listeners are established on the parent Engine class or instance.

Parameters: retval=False – Applies to the before_execute() and before_cursor_execute() events only. When True, the user-defined event function must have a return value, which is a tuple of parameters that replace the given statement and parameters. See those methods for a description of specific return arguments.

Changed in version 0.8: ConnectionEvents can now be associated with any Connectable including Connection, in addition to the existing support for Engine.

after_cursor_execute(conn, cursor, statement, parameters, context, executemany)

Intercept low-level cursor execute() events after execution.

Parameters:
  • conn – Connection object
  • cursor – DBAPI cursor object. Will have results pending if the statement was a SELECT, but these should not be consumed as they will be needed by the ResultProxy.
  • statement – string SQL statement
  • parameters – Dictionary, tuple, or list of parameters being passed to the execute() or executemany() method of the DBAPI cursor. In some cases may be None.
  • context – ExecutionContext object in use. May be None.
  • executemany – boolean, if True, this is an executemany() call, if False, this is an execute() call.
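
For example, before_cursor_execute() and after_cursor_execute() can be paired to time statement execution. The sketch below stashes the start time as an ad-hoc attribute on the ExecutionContext, which is an illustrative convention for this example rather than a documented API:

import logging
import time

from sqlalchemy import event
from sqlalchemy.engine import Engine

log = logging.getLogger(__name__)

@event.listens_for(Engine, "before_cursor_execute")
def _time_before_cursor_execute(conn, cursor, statement, parameters,
                                context, executemany):
    if context is not None:
        context._query_start_time = time.time()

@event.listens_for(Engine, "after_cursor_execute")
def _time_after_cursor_execute(conn, cursor, statement, parameters,
                               context, executemany):
    if context is not None and hasattr(context, "_query_start_time"):
        total = time.time() - context._query_start_time
        log.debug("statement took %.4f seconds: %s", total, statement)
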
after_execute(conn, clauseelement, multiparams, params, result)

Intercept high-level execute() events after execution.

Parameters:
  • conn – Connection object
  • clauseelement – SQL expression construct, Compiled instance, or string statement passed to Connection.execute().
  • multiparams – Multiple parameter sets, a list of dictionaries.
  • params – Single parameter set, a single dictionary.
  • result – ResultProxy generated by the execution.
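
As a small sketch, an after_execute() listener might record what kind of construct was executed along with the rowcount of the result, where the DBAPI provides one; the logger name is an assumption:

import logging

from sqlalchemy import event
from sqlalchemy.engine import Engine

log = logging.getLogger(__name__)

@event.listens_for(Engine, "after_execute")
def log_execute(conn, clauseelement, multiparams, params, result):
    # result.rowcount is meaningful for UPDATE/DELETE; it may be -1 otherwise
    log.info("executed %s, rowcount=%s",
             type(clauseelement).__name__, result.rowcount)
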
before_cursor_execute(conn, cursor, statement, parameters, context, executemany)

Intercept low-level cursor execute() events before execution, receiving the string SQL statement and DBAPI-specific parameter list to be invoked against a cursor.

This event is a good choice for logging as well as late modifications to the SQL string. It’s less ideal for parameter modifications except for those which are specific to a target backend.

This event can be optionally established with the retval=True flag. The statement and parameters arguments should be returned as a two-tuple in this case:

@event.listens_for(Engine, "before_cursor_execute", retval=True)
def before_cursor_execute(conn, cursor, statement,
                parameters, context, executemany):
    # do something with statement, parameters
    return statement, parameters

See the example at ConnectionEvents.

Parameters:
  • conn – Connection object
  • cursor – DBAPI cursor object
  • statement – string SQL statement
  • parameters – Dictionary, tuple, or list of parameters being passed to the execute() or executemany() method of the DBAPI cursor. In some cases may be None.
  • context – ExecutionContext object in use. May be None.
  • executemany – boolean, if True, this is an executemany() call, if False, this is an execute() call.

See also:

before_execute()

after_cursor_execute()

before_execute(conn, clauseelement, multiparams, params)

Intercept high-level execute() events, receiving uncompiled SQL constructs and other objects prior to rendering into SQL.

This event is good for debugging SQL compilation issues as well as early manipulation of the parameters being sent to the database, as the parameter lists will be in a consistent format here.

This event can be optionally established with the retval=True flag. The clauseelement, multiparams, and params arguments should be returned as a three-tuple in this case:

@event.listens_for(Engine, "before_execute", retval=True)
def before_execute(conn, clauseelement, multiparams, params):
    # do something with clauseelement, multiparams, params
    return clauseelement, multiparams, params

Parameters:
  • conn – Connection object
  • clauseelement – SQL expression construct, Compiled instance, or string statement passed to Connection.execute().
  • multiparams – Multiple parameter sets, a list of dictionaries.
  • params – Single parameter set, a single dictionary.

See also:

before_cursor_execute()

begin(conn)

Intercept begin() events.

Parameters: conn – Connection object
begin_twophase(conn, xid)

Intercept begin_twophase() events.

Parameters:
  • conn – Connection object
  • xid – two-phase XID identifier
commit(conn)

Intercept commit() events, as initiated by a Transaction.

Note that the Pool may also “auto-commit” a DBAPI connection upon checkin, if the reset_on_return flag is set to the value 'commit'. To intercept this commit, use the PoolEvents.reset() hook.

Parameters: conn – Connection object
commit_twophase(conn, xid, is_prepared)

Intercept commit_twophase() events.

Parameters:
  • conn – Connection object
  • xid – two-phase XID identifier
  • is_prepared – boolean, indicates if TwoPhaseTransaction.prepare() was called.
dbapi_error(conn, cursor, statement, parameters, context, exception)

Intercept a raw DBAPI error.

This event is called with the DBAPI exception instance received from the DBAPI itself, before SQLAlchemy wraps the exception with its own exception wrappers, and before any other operations are performed on the DBAPI cursor; the existing transaction remains in effect as well as any state on the cursor.

The use case here is to inject low-level exception handling into an Engine, typically for logging and debugging purposes. In general, user code should not modify any state or throw any exceptions here as this will interfere with SQLAlchemy’s cleanup and error handling routines.

Subsequent to this hook, SQLAlchemy may attempt any number of operations on the connection/cursor, including closing the cursor, rolling back of the transaction in the case of connectionless execution, and disposing of the entire connection pool if a “disconnect” was detected. The exception is then wrapped in a SQLAlchemy DBAPI exception wrapper and re-thrown.

Parameters:
  • conn – Connection object
  • cursor – DBAPI cursor object
  • statement – string SQL statement
  • parameters – Dictionary, tuple, or list of parameters being passed to the execute() or executemany() method of the DBAPI cursor. In some cases may be None.
  • context – ExecutionContext object in use. May be None.
  • exception – The unwrapped exception emitted directly from the DBAPI. The class here is specific to the DBAPI module in use.

New in version 0.7.7.
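
A sketch of the logging use case: the listener below records the failing statement and the raw DBAPI exception without modifying any state or raising, per the guidance above; the logger name is an assumption:

import logging

from sqlalchemy import event
from sqlalchemy.engine import Engine

log = logging.getLogger(__name__)

@event.listens_for(Engine, "dbapi_error")
def log_dbapi_error(conn, cursor, statement, parameters, context, exception):
    # observe only; do not roll back, close, or raise from here
    log.error("DBAPI error %r while executing: %s", exception, statement)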

prepare_twophase(conn, xid)

Intercept prepare_twophase() events.

Parameters:
  • conn – Connection object
  • xid – two-phase XID identifier
release_savepoint(conn, name, context)

Intercept release_savepoint() events.

Parameters:
  • conn – Connection object
  • name – specified name used for the savepoint.
  • context – ExecutionContext in use, if any.
rollback(conn)

Intercept rollback() events, as initiated by a Transaction.

Note that the Pool also “auto-rolls back” a DBAPI connection upon checkin, if the reset_on_return flag is set to its default value of 'rollback'. To intercept this rollback, use the PoolEvents.reset() hook.

Parameters: conn – Connection object
rollback_savepoint(conn, name, context)

Intercept rollback_savepoint() events.

Parameters:
  • conn – Connection object
  • name – specified name used for the savepoint.
  • context – ExecutionContext in use, if any.
rollback_twophase(conn, xid, is_prepared)

Intercept rollback_twophase() events.

Parameters:
  • conn – Connection object
  • xid – two-phase XID identifier
  • is_prepared – boolean, indicates if TwoPhaseTransaction.prepare() was called.
savepoint(conn, name=None)

Intercept savepoint() events.

Parameters:
  • conn – Connection object
  • name – specified name used for the savepoint.

Schema Events

class sqlalchemy.events.DDLEvents

Bases: sqlalchemy.event.Events

Define event listeners for schema objects, that is, SchemaItem and other SchemaEventTarget subclasses, including MetaData, Table, Column.

MetaData and Table support events specifically regarding when CREATE and DROP DDL is emitted to the database.

Attachment events are also provided to customize behavior whenever a child schema element is associated with a parent, such as, when a Column is associated with its Table, when a ForeignKeyConstraint is associated with a Table, etc.

Example using the after_create event:

from sqlalchemy import event
from sqlalchemy import Table, Column, MetaData, Integer

m = MetaData()
some_table = Table('some_table', m, Column('data', Integer))

def after_create(target, connection, **kw):
    connection.execute("ALTER TABLE %s SET name=foo_%s" %
                            (target.name, target.name))

event.listen(some_table, "after_create", after_create)

DDL events integrate closely with the DDL class and the DDLElement hierarchy of DDL clause constructs, which are themselves appropriate as listener callables:

from sqlalchemy import DDL
event.listen(
    some_table,
    "after_create",
    DDL("ALTER TABLE %(table)s SET name=foo_%(table)s")
)

The methods here define the name of an event as well as the names of members that are passed to listener functions.

after_create(target, connection, **kw)

Called after CREATE statements are emitted.

Parameters:
  • target – the MetaData or Table object which is the target of the event.
  • connection – the Connection where the CREATE statement or statements have been emitted.
  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
after_drop(target, connection, **kw)

Called after DROP statements are emitted.

Parameters:
  • target – the MetaData or Table object which is the target of the event.
  • connection – the Connection where the DROP statement or statements have been emitted.
  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
after_parent_attach(target, parent)

Called after a SchemaItem is associated with a parent SchemaItem.

Parameters:
  • target – the target object
  • parent – the parent to which the target is being attached.

event.listen() also accepts a modifier for this event:

Parameters: propagate=False – When True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.tometadata() is used.
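
As an illustration of the attachment events, the following minimal sketch uses the Column class as the event target, so that it fires whenever any Column is attached to its parent Table:

from sqlalchemy import event
from sqlalchemy.schema import Column

@event.listens_for(Column, "after_parent_attach")
def receive_after_parent_attach(target, parent):
    # target is the Column just attached; parent is its Table
    print("column %s attached to table %s" % (target.name, parent.name))
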
before_create(target, connection, **kw)

Called before CREATE statements are emitted.

Parameters:
  • target – the MetaData or Table object which is the target of the event.
  • connection – the Connection where the CREATE statement or statements will be emitted.
  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
before_drop(target, connection, **kw)

Called before DROP statements are emitted.

Parameters:
  • target – the MetaData or Table object which is the target of the event.
  • connection – the Connection where the DROP statement or statements will be emitted.
  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
before_parent_attach(target, parent)

Called before a SchemaItem is associated with a parent SchemaItem.

Parameters:
  • target – the target object
  • parent – the parent to which the target is being attached.

event.listen() also accepts a modifier for this event:

Parameters: propagate=False – When True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.tometadata() is used.
column_reflect(inspector, table, column_info)

Called for each unit of ‘column info’ retrieved when a Table is being reflected.

The dictionary of column information as returned by the dialect is passed, and can be modified. The dictionary is that returned in each element of the list returned by reflection.Inspector.get_columns().

The event is called before any action is taken against this dictionary, and the contents can be modified. The Column specific arguments info, key, and quote can also be added to the dictionary and will be passed to the constructor of Column.

Note that this event is only meaningful if it is either associated with the Table class across the board, e.g.:

from sqlalchemy.schema import Table
from sqlalchemy import event

def listen_for_reflect(inspector, table, column_info):
    "receive a column_reflect event"
    # ...

event.listen(
        Table,
        'column_reflect',
        listen_for_reflect)

...or with a specific Table instance using the listeners argument:

def listen_for_reflect(inspector, table, column_info):
    "receive a column_reflect event"
    # ...

# assumes "m" is a MetaData object bound to an engine so that
# reflection via autoload=True can proceed
t = Table(
    'sometable',
    m,
    autoload=True,
    listeners=[
        ('column_reflect', listen_for_reflect)
    ])

This is because the reflection process initiated by autoload=True completes within the scope of the constructor for Table, so the listener must be in place before that constructor runs.
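
As a sketch of modifying the reflected column information described above, the listener below assigns each reflected Column a lower-cased key, relying on the ability noted above to add the key entry to the dictionary:

from sqlalchemy import event
from sqlalchemy.schema import Table

@event.listens_for(Table, "column_reflect")
def lowercase_column_keys(inspector, table, column_info):
    # give each reflected Column a lower-cased .key, leaving the
    # database-side name untouched
    column_info['key'] = column_info['name'].lower()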

class sqlalchemy.events.SchemaEventTarget

Base class for elements that are the targets of DDLEvents events.

This includes SchemaItem as well as SchemaType.