Core Events

This section describes the event interfaces provided in SQLAlchemy Core. For an introduction to the event listening API, see Events. ORM events are described in ORM Events.

Object Name — Description

  • Events — Define event listening functions for a particular target type.

class sqlalchemy.event.base.Events

Define event listening functions for a particular target type.

Connection Pool Events

Object Name — Description

  • PoolEvents — Available events for Pool.

class sqlalchemy.events.PoolEvents

Available events for Pool.

The methods here define the name of an event as well as the names of members that are passed to listener functions.

e.g.:

from sqlalchemy import event

def my_on_checkout(dbapi_conn, connection_rec, connection_proxy):
    "handle an on checkout event"

event.listen(Pool, 'checkout', my_on_checkout)

In addition to accepting the Pool class and Pool instances, PoolEvents also accepts Engine objects and the Engine class as targets, which will be resolved to the .pool attribute of the given engine or the Pool class:

engine = create_engine("postgresql://scott:tiger@localhost/test")

# will associate with engine.pool
event.listen(engine, 'checkout', my_on_checkout)

Class signature

class sqlalchemy.events.PoolEvents (sqlalchemy.event.Events)

method sqlalchemy.events.PoolEvents.checkin(dbapi_connection, connection_record)

Called when a connection returns to the pool.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'checkin')
def receive_checkin(dbapi_connection, connection_record):
    "listen for the 'checkin' event"

    # ... (event handling logic) ...

Note that the connection may be closed, and may be None if the connection has been invalidated. checkin will not be called for detached connections. (They do not return to the pool.)

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

method sqlalchemy.events.PoolEvents.checkout(dbapi_connection, connection_record, connection_proxy)

Called when a connection is retrieved from the Pool.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'checkout')
def receive_checkout(dbapi_connection, connection_record, connection_proxy):
    "listen for the 'checkout' event"

    # ... (event handling logic) ...
Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

  • connection_proxy – the _ConnectionFairy object which will proxy the public interface of the DBAPI connection for the lifespan of the checkout.

If you raise a DisconnectionError, the current connection will be disposed and a fresh connection retrieved. Processing of all checkout listeners will abort and restart using the new connection.

See also

ConnectionEvents.engine_connect() - a similar event which occurs upon creation of a new Connection.
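
As a concrete sketch of the DisconnectionError pattern described above (a pessimistic "pre-ping" on checkout; the in-memory SQLite URL is used here only for illustration):

```python
from sqlalchemy import create_engine, event, exc

engine = create_engine("sqlite://")


@event.listens_for(engine, "checkout")
def ping_connection(dbapi_connection, connection_record, connection_proxy):
    # issue a trivial statement; if it fails, raise DisconnectionError
    # so that the pool discards this connection and retries the
    # checkout with a fresh one
    cursor = dbapi_connection.cursor()
    try:
        cursor.execute("SELECT 1")
    except Exception:
        raise exc.DisconnectionError()
    finally:
        cursor.close()
```

On backends where connections can be silently dropped, this trades a small per-checkout cost for early detection of stale connections.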

method sqlalchemy.events.PoolEvents.close(dbapi_connection, connection_record)

Called when a DBAPI connection is closed.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'close')
def receive_close(dbapi_connection, connection_record):
    "listen for the 'close' event"

    # ... (event handling logic) ...

The event is emitted before the close occurs.

The close of a connection can fail; typically this is because the connection is already closed. If the close operation fails, the connection is discarded.

The close() event corresponds to a connection that’s still associated with the pool. To intercept close events for detached connections use close_detached().

New in version 1.1.

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

method sqlalchemy.events.PoolEvents.close_detached(dbapi_connection)

Called when a detached DBAPI connection is closed.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'close_detached')
def receive_close_detached(dbapi_connection):
    "listen for the 'close_detached' event"

    # ... (event handling logic) ...

The event is emitted before the close occurs.

The close of a connection can fail; typically this is because the connection is already closed. If the close operation fails, the connection is discarded.

New in version 1.1.

Parameters:

dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

method sqlalchemy.events.PoolEvents.connect(dbapi_connection, connection_record)

Called at the moment a particular DBAPI connection is first created for a given Pool.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'connect')
def receive_connect(dbapi_connection, connection_record):
    "listen for the 'connect' event"

    # ... (event handling logic) ...

This event allows one to capture the point directly after the DBAPI module-level .connect() method has been used to produce a new DBAPI connection.
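
For example (a sketch using SQLite; the particular PRAGMA is illustrative), this hook is the conventional place to configure each new DBAPI connection as soon as the driver creates it:

```python
from sqlalchemy import create_engine, event

engine = create_engine("sqlite://")


@event.listens_for(engine, "connect")
def set_sqlite_pragma(dbapi_connection, connection_record):
    # runs once per new DBAPI connection, directly after the
    # driver-level connect() call returns
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()
```

Because the hook runs per DBAPI connection rather than per checkout, the setting persists for the lifetime of the pooled connection.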

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

method sqlalchemy.events.PoolEvents.detach(dbapi_connection, connection_record)

Called when a DBAPI connection is “detached” from a pool.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'detach')
def receive_detach(dbapi_connection, connection_record):
    "listen for the 'detach' event"

    # ... (event handling logic) ...

This event is emitted after the detach occurs. The connection is no longer associated with the given connection record.

New in version 1.1.

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

method sqlalchemy.events.PoolEvents.first_connect(dbapi_connection, connection_record)

Called exactly once for the first time a DBAPI connection is checked out from a particular Pool.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'first_connect')
def receive_first_connect(dbapi_connection, connection_record):
    "listen for the 'first_connect' event"

    # ... (event handling logic) ...

The rationale for PoolEvents.first_connect() is to determine information about a particular series of database connections based on the settings used for all connections. Since a particular Pool refers to a single “creator” function (which in terms of a Engine refers to the URL and connection options used), it is typically valid to make observations about a single connection that can be safely assumed to be valid about all subsequent connections, such as the database version, the server and client encoding settings, collation settings, and many others.
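
A minimal sketch of that rationale (the server_info dictionary and SQLite URL are illustrative): capture a fact about the first connection that is assumed to hold for all later ones from the same pool.

```python
from sqlalchemy import create_engine, event

engine = create_engine("sqlite://")
server_info = {}


@event.listens_for(engine, "first_connect")
def capture_version(dbapi_connection, connection_record):
    # fires exactly once for this pool; the observation is assumed
    # valid for every later connection made by the same creator
    cursor = dbapi_connection.cursor()
    cursor.execute("SELECT sqlite_version()")
    server_info["version"] = cursor.fetchone()[0]
    cursor.close()
```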

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

method sqlalchemy.events.PoolEvents.invalidate(dbapi_connection, connection_record, exception)

Called when a DBAPI connection is to be “invalidated”.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'invalidate')
def receive_invalidate(dbapi_connection, connection_record, exception):
    "listen for the 'invalidate' event"

    # ... (event handling logic) ...

This event is called any time the _ConnectionRecord.invalidate() method is invoked, either from API usage or via “auto-invalidation”, without the soft flag.

The event occurs before a final attempt to call .close() on the connection occurs.

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

  • exception – the exception object corresponding to the reason for this invalidation, if any. May be None.

New in version 0.9.2: Added support for connection invalidation listening.

method sqlalchemy.events.PoolEvents.reset(dbapi_connection, connection_record)

Called before the “reset” action occurs for a pooled connection.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'reset')
def receive_reset(dbapi_connection, connection_record):
    "listen for the 'reset' event"

    # ... (event handling logic) ...

This event represents when the rollback() method is called on the DBAPI connection before it is returned to the pool or discarded. A custom “reset” strategy may be implemented using this event hook, which may also be combined with disabling the default “reset” behavior using the Pool.reset_on_return parameter.

The primary difference between the PoolEvents.reset() and PoolEvents.checkin() events is that PoolEvents.reset() is called not just for pooled connections that are being returned to the pool, but also for connections that were detached using the Connection.detach() method.

Note that the event is not invoked for connections that were invalidated using Connection.invalidate(). These events may be intercepted using the PoolEvents.soft_invalidate() and PoolEvents.invalidate() event hooks, and all “connection close” events may be intercepted using PoolEvents.close(). The PoolEvents.reset() event is usually followed by the PoolEvents.checkin() event, except in those cases where the connection is discarded immediately after reset.

In the 1.4 series, the event is also not invoked for asyncio connections that are being garbage collected without being explicitly returned to the pool. This is due to the lack of an event loop, which prevents “reset” operations from taking place. Version 2.0 will feature an enhanced version of PoolEvents.reset() which is invoked in this scenario while passing additional contextual information indicating that an event loop is not guaranteed to be present.
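
A minimal sketch of the custom-reset combination mentioned above: the default reset behavior is disabled via the create_engine pool_reset_on_return parameter, and the hook performs the rollback itself (the resets list is only there to make the behavior observable).

```python
from sqlalchemy import create_engine, event

# disable the pool's built-in rollback-on-return; the 'reset'
# hook below takes over that responsibility
engine = create_engine("sqlite://", pool_reset_on_return=None)
resets = []


@event.listens_for(engine, "reset")
def custom_reset(dbapi_connection, connection_record):
    resets.append(True)
    dbapi_connection.rollback()
```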

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

method sqlalchemy.events.PoolEvents.soft_invalidate(dbapi_connection, connection_record, exception)

Called when a DBAPI connection is to be “soft invalidated”.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngineOrPool, 'soft_invalidate')
def receive_soft_invalidate(dbapi_connection, connection_record, exception):
    "listen for the 'soft_invalidate' event"

    # ... (event handling logic) ...

This event is called any time the _ConnectionRecord.invalidate() method is invoked with the soft flag.

Soft invalidation refers to when the connection record that tracks this connection will force a reconnect after the current connection is checked in. It does not actively close the dbapi_connection at the point at which it is called.

New in version 1.0.3.

Parameters:
  • dbapi_connection – a DBAPI connection. The _ConnectionRecord.dbapi_connection attribute.

  • connection_record – the _ConnectionRecord managing the DBAPI connection.

  • exception – the exception object corresponding to the reason for this invalidation, if any. May be None.

SQL Execution and Connection Events

Object Name — Description

  • ConnectionEvents — Available events for Connectable, which includes Connection and Engine.

  • DialectEvents — Event interface for execution-replacement functions.

class sqlalchemy.events.ConnectionEvents

Available events for Connectable, which includes Connection and Engine.

The methods here define the name of an event as well as the names of members that are passed to listener functions.

An event listener can be associated with any Connectable class or instance, such as an Engine, e.g.:

from sqlalchemy import event, create_engine

def before_cursor_execute(conn, cursor, statement, parameters, context,
                          executemany):
    log.info("Received statement: %s", statement)

engine = create_engine('postgresql://scott:tiger@localhost/test')
event.listen(engine, "before_cursor_execute", before_cursor_execute)

or with a specific Connection:

with engine.begin() as conn:
    @event.listens_for(conn, 'before_cursor_execute')
    def before_cursor_execute(conn, cursor, statement, parameters,
                                    context, executemany):
        log.info("Received statement: %s", statement)

When the methods are called with a statement parameter, such as in after_cursor_execute() or before_cursor_execute(), the statement is the exact SQL string that was prepared for transmission to the DBAPI cursor in the connection’s Dialect.

The before_execute() and before_cursor_execute() events can also be established with the retval=True flag, which allows modification of the statement and parameters to be sent to the database. The before_cursor_execute() event is particularly useful here to add ad-hoc string transformations, such as comments, to all executions:

from sqlalchemy.engine import Engine
from sqlalchemy import event

@event.listens_for(Engine, "before_cursor_execute", retval=True)
def comment_sql_calls(conn, cursor, statement, parameters,
                                    context, executemany):
    statement = statement + " -- some comment"
    return statement, parameters

Note

ConnectionEvents can be established on the Engine class or the Connection class, as well as on instances of each of those classes. Events across all four scopes will fire off for a given instance of Connection. However, for performance reasons, the Connection object determines at instantiation time whether or not its parent Engine has event listeners established. Event listeners added to the Engine class or to an instance of Engine after the instantiation of a dependent Connection instance will usually not be available on that Connection instance. The newly added listeners will instead take effect for Connection instances created subsequent to those event listeners being established on the parent Engine class or instance.

Parameters:

retval=False – Applies to the before_execute() and before_cursor_execute() events only. When True, the user-defined event function must have a return value, which is a tuple of parameters that replace the given statement and parameters. See those methods for a description of specific return arguments.

Class signature

class sqlalchemy.events.ConnectionEvents (sqlalchemy.event.Events)

method sqlalchemy.events.ConnectionEvents.after_cursor_execute(conn, cursor, statement, parameters, context, executemany)

Intercept low-level cursor execute() events after execution.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'after_cursor_execute')
def receive_after_cursor_execute(conn, cursor, statement, parameters, context, executemany):
    "listen for the 'after_cursor_execute' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • cursor – DBAPI cursor object. Will have results pending if the statement was a SELECT, but these should not be consumed as they will be needed by the CursorResult.

  • statement – string SQL statement, as passed to the DBAPI

  • parameters – Dictionary, tuple, or list of parameters being passed to the execute() or executemany() method of the DBAPI cursor. In some cases may be None.

  • context – ExecutionContext object in use. May be None.

  • executemany – boolean, if True, this is an executemany() call, if False, this is an execute() call.

method sqlalchemy.events.ConnectionEvents.after_execute(conn, clauseelement, multiparams, params, execution_options, result)

Intercept high level execute() events after execute.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'after_execute')
def receive_after_execute(conn, clauseelement, multiparams, params, execution_options, result):
    "listen for the 'after_execute' event"

    # ... (event handling logic) ...

# DEPRECATED calling style (pre-1.4, will be removed in a future release)
@event.listens_for(SomeEngine, 'after_execute')
def receive_after_execute(conn, clauseelement, multiparams, params, result):
    "listen for the 'after_execute' event"

    # ... (event handling logic) ...

Changed in version 1.4: The ConnectionEvents.after_execute() event now accepts the arguments ConnectionEvents.after_execute.conn, ConnectionEvents.after_execute.clauseelement, ConnectionEvents.after_execute.multiparams, ConnectionEvents.after_execute.params, ConnectionEvents.after_execute.execution_options, ConnectionEvents.after_execute.result. Support for listener functions which accept the previous argument signature(s) listed above as “deprecated” will be removed in a future release.

Parameters:
  • conn – Connection object

  • clauseelement – SQL expression construct, Compiled instance, or string statement passed to Connection.execute().

  • multiparams – Multiple parameter sets, a list of dictionaries.

  • params – Single parameter set, a single dictionary.

  • execution_options

    dictionary of execution options passed along with the statement, if any. This is a merge of all options that will be used, including those of the statement, the connection, and those passed in to the method itself for the 2.0 style of execution.

  • result – CursorResult generated by the execution.

method sqlalchemy.events.ConnectionEvents.before_cursor_execute(conn, cursor, statement, parameters, context, executemany)

Intercept low-level cursor execute() events before execution, receiving the string SQL statement and DBAPI-specific parameter list to be invoked against a cursor.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'before_cursor_execute')
def receive_before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
    "listen for the 'before_cursor_execute' event"

    # ... (event handling logic) ...

This event is a good choice for logging as well as late modifications to the SQL string. It’s less ideal for parameter modifications except for those which are specific to a target backend.

This event can be optionally established with the retval=True flag. The statement and parameters arguments should be returned as a two-tuple in this case:

@event.listens_for(Engine, "before_cursor_execute", retval=True)
def before_cursor_execute(conn, cursor, statement,
                parameters, context, executemany):
    # do something with statement, parameters
    return statement, parameters

See the example at ConnectionEvents.
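
Pairing this event with ConnectionEvents.after_cursor_execute() yields a simple statement timer, a sketch of the logging use case above; the Connection.info dictionary serves as per-connection scratch space (the key names here are arbitrary):

```python
import time

from sqlalchemy import create_engine, event

engine = create_engine("sqlite://")


@event.listens_for(engine, "before_cursor_execute")
def start_timer(conn, cursor, statement, parameters, context, executemany):
    # push the start time; a list handles nested executions
    conn.info.setdefault("query_start_time", []).append(time.time())


@event.listens_for(engine, "after_cursor_execute")
def stop_timer(conn, cursor, statement, parameters, context, executemany):
    elapsed = time.time() - conn.info["query_start_time"].pop(-1)
    conn.info.setdefault("query_timings", []).append((statement, elapsed))
```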

Parameters:
  • conn – Connection object

  • cursor – DBAPI cursor object

  • statement – string SQL statement, as to be passed to the DBAPI

  • parameters – Dictionary, tuple, or list of parameters being passed to the execute() or executemany() method of the DBAPI cursor. In some cases may be None.

  • context – ExecutionContext object in use. May be None.

  • executemany – boolean, if True, this is an executemany() call, if False, this is an execute() call.

method sqlalchemy.events.ConnectionEvents.before_execute(conn, clauseelement, multiparams, params, execution_options)

Intercept high level execute() events, receiving uncompiled SQL constructs and other objects prior to rendering into SQL.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'before_execute')
def receive_before_execute(conn, clauseelement, multiparams, params, execution_options):
    "listen for the 'before_execute' event"

    # ... (event handling logic) ...

# DEPRECATED calling style (pre-1.4, will be removed in a future release)
@event.listens_for(SomeEngine, 'before_execute')
def receive_before_execute(conn, clauseelement, multiparams, params):
    "listen for the 'before_execute' event"

    # ... (event handling logic) ...

Changed in version 1.4: The ConnectionEvents.before_execute() event now accepts the arguments ConnectionEvents.before_execute.conn, ConnectionEvents.before_execute.clauseelement, ConnectionEvents.before_execute.multiparams, ConnectionEvents.before_execute.params, ConnectionEvents.before_execute.execution_options. Support for listener functions which accept the previous argument signature(s) listed above as “deprecated” will be removed in a future release.

This event is good for debugging SQL compilation issues as well as early manipulation of the parameters being sent to the database, as the parameter lists will be in a consistent format here.

This event can be optionally established with the retval=True flag. The clauseelement, multiparams, and params arguments should be returned as a three-tuple in this case:

@event.listens_for(Engine, "before_execute", retval=True)
def before_execute(conn, clauseelement, multiparams, params):
    # do something with clauseelement, multiparams, params
    return clauseelement, multiparams, params
Parameters:
  • conn – Connection object

  • clauseelement – SQL expression construct, Compiled instance, or string statement passed to Connection.execute().

  • multiparams – Multiple parameter sets, a list of dictionaries.

  • params – Single parameter set, a single dictionary.

  • execution_options

    dictionary of execution options passed along with the statement, if any. This is a merge of all options that will be used, including those of the statement, the connection, and those passed in to the method itself for the 2.0 style of execution.

method sqlalchemy.events.ConnectionEvents.begin(conn)

Intercept begin() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'begin')
def receive_begin(conn):
    "listen for the 'begin' event"

    # ... (event handling logic) ...
Parameters:

conn – Connection object

method sqlalchemy.events.ConnectionEvents.begin_twophase(conn, xid)

Intercept begin_twophase() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'begin_twophase')
def receive_begin_twophase(conn, xid):
    "listen for the 'begin_twophase' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • xid – two-phase XID identifier

method sqlalchemy.events.ConnectionEvents.commit(conn)

Intercept commit() events, as initiated by a Transaction.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'commit')
def receive_commit(conn):
    "listen for the 'commit' event"

    # ... (event handling logic) ...

Note that the Pool may also “auto-commit” a DBAPI connection upon checkin, if the reset_on_return flag is set to the value 'commit'. To intercept this commit, use the PoolEvents.reset() hook.

Parameters:

conn – Connection object

method sqlalchemy.events.ConnectionEvents.commit_twophase(conn, xid, is_prepared)

Intercept commit_twophase() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'commit_twophase')
def receive_commit_twophase(conn, xid, is_prepared):
    "listen for the 'commit_twophase' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • xid – two-phase XID identifier

  • is_prepared – boolean, indicates if TwoPhaseTransaction.prepare() was called.

method sqlalchemy.events.ConnectionEvents.engine_connect(conn, branch)

Intercept the creation of a new Connection.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'engine_connect')
def receive_engine_connect(conn, branch):
    "listen for the 'engine_connect' event"

    # ... (event handling logic) ...

This event is called typically as the direct result of calling the Engine.connect() method.

It differs from the PoolEvents.connect() method, which refers to the actual connection to a database at the DBAPI level; a DBAPI connection may be pooled and reused for many operations. In contrast, this event refers only to the production of a higher level Connection wrapper around such a DBAPI connection.

It also differs from the PoolEvents.checkout() event in that it is specific to the Connection object, not the DBAPI connection that PoolEvents.checkout() deals with, although this DBAPI connection is available here via the Connection.connection attribute. But note there can in fact be multiple PoolEvents.checkout() events within the lifespan of a single Connection object, if that Connection is invalidated and re-established. There can also be multiple Connection objects generated for the same already-checked-out DBAPI connection, in the case that a “branch” of a Connection is produced.

Parameters:
  • conn – Connection object.

  • branch – if True, this is a “branch” of an existing Connection. A branch is generated within the course of a statement execution to invoke supplemental statements, most typically to pre-execute a SELECT of a default value for the purposes of an INSERT statement.

See also

PoolEvents.checkout() - the lower-level pool checkout event for an individual DBAPI connection
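
A common pattern, sketched here, is to skip branched connections so that per-connection work runs exactly once per top-level Connection (the seen list only makes the behavior observable):

```python
from sqlalchemy import create_engine, event

engine = create_engine("sqlite://")
seen = []


@event.listens_for(engine, "engine_connect")
def on_engine_connect(conn, branch):
    if branch:
        # a sub-Connection of an already-connected Connection; skip
        return
    seen.append(conn)
```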

method sqlalchemy.events.ConnectionEvents.engine_disposed(engine)

Intercept when the Engine.dispose() method is called.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'engine_disposed')
def receive_engine_disposed(engine):
    "listen for the 'engine_disposed' event"

    # ... (event handling logic) ...

The Engine.dispose() method instructs the engine to “dispose” of its connection pool (e.g. Pool), and replaces it with a new one. Disposing of the old pool has the effect that existing checked-in connections are closed. The new pool does not establish any new connections until it is first used.

This event can be used to indicate that resources related to the Engine should also be cleaned up, keeping in mind that the Engine can still be used for new requests in which case it re-acquires connection resources.

New in version 1.0.5.

method sqlalchemy.events.ConnectionEvents.handle_error(exception_context)

Intercept all exceptions processed by the Connection.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'handle_error')
def receive_handle_error(exception_context):
    "listen for the 'handle_error' event"

    # ... (event handling logic) ...

This includes all exceptions emitted by the DBAPI as well as within SQLAlchemy’s statement invocation process, including encoding errors and other statement validation errors. Other areas in which the event is invoked include transaction begin and end, result row fetching, and cursor creation.

Note that handle_error() may support new kinds of exceptions and new calling scenarios at any time. Code which uses this event must expect new calling patterns to be present in minor releases.

To support the wide variety of members that correspond to an exception, as well as to allow extensibility of the event without backwards incompatibility, the sole argument received is an instance of ExceptionContext. This object contains data members representing detail about the exception.

Use cases supported by this hook include:

  • read-only, low-level exception handling for logging and debugging purposes

  • exception re-writing

  • Establishing or disabling whether a connection or the owning connection pool is invalidated or expired in response to a specific exception [1].

The hook is called while the cursor from the failed operation (if any) is still open and accessible. Special cleanup operations can be called on this cursor; SQLAlchemy will attempt to close this cursor subsequent to this hook being invoked. If the connection is in “autocommit” mode, the transaction also remains open within the scope of this hook; the rollback of the per-statement transaction also occurs after the hook is called.

Note

A handler function has two options for replacing the SQLAlchemy-constructed exception with one that is user defined. It can either raise this new exception directly, in which case all further event listeners are bypassed and the exception will be raised, after appropriate cleanup has taken place:

@event.listens_for(Engine, "handle_error")
def handle_exception(context):
    if isinstance(context.original_exception,
        psycopg2.OperationalError) and \
        "failed" in str(context.original_exception):
        raise MySpecialException("failed operation")

Warning

Because the ConnectionEvents.handle_error() event specifically provides for exceptions to be re-thrown as the ultimate exception raised by the failed statement, stack traces will be misleading if the user-defined event handler itself fails and throws an unexpected exception; the stack trace may not illustrate the actual code line that failed! It is advised to code carefully here and use logging and/or inline debugging if unexpected exceptions are occurring.

Alternatively, a “chained” style of event handling can be used, by configuring the handler with the retval=True modifier and returning the new exception instance from the function. In this case, event handling will continue onto the next handler. The “chained” exception is available using ExceptionContext.chained_exception:

@event.listens_for(Engine, "handle_error", retval=True)
def handle_exception(context):
    if context.chained_exception is not None and \
        "special" in context.chained_exception.message:
        return MySpecialException("failed",
            cause=context.chained_exception)

Handlers that return None may be used within the chain; when a handler returns None, the previous exception instance, if any, is maintained as the current exception that is passed onto the next handler.

When a custom exception is raised or returned, SQLAlchemy raises this new exception as-is; it is not wrapped by any SQLAlchemy object. If the exception is not a subclass of sqlalchemy.exc.StatementError, certain features may not be available; currently this includes the ORM’s feature of adding a detail hint about “autoflush” to exceptions raised within the autoflush process.

Parameters:

context – an ExceptionContext object. See this class for details on all available members.

New in version 0.9.7: Added the ConnectionEvents.handle_error() hook.

Changed in version 1.1: The handle_error() event will now receive all exceptions that inherit from BaseException, including SystemExit and KeyboardInterrupt. The setting for ExceptionContext.is_disconnect is True in this case and the default for ExceptionContext.invalidate_pool_on_disconnect is False.

Changed in version 1.0.0: The handle_error() event is now invoked when an Engine fails during the initial call to Engine.connect(), as well as when a Connection object encounters an error during a reconnect operation.

Changed in version 1.0.0: The handle_error() event is not fired off when a dialect makes use of the skip_user_error_events execution option. This is used by dialects which intend to catch SQLAlchemy-specific exceptions within specific operations, such as when the MySQL dialect detects a table not present within the has_table() dialect method. Prior to 1.0.0, code which implements handle_error() needs to ensure that exceptions thrown in these scenarios are re-raised without modification.

method sqlalchemy.events.ConnectionEvents.prepare_twophase(conn, xid)

Intercept prepare_twophase() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'prepare_twophase')
def receive_prepare_twophase(conn, xid):
    "listen for the 'prepare_twophase' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • xid – two-phase XID identifier

method sqlalchemy.events.ConnectionEvents.release_savepoint(conn, name, context)

Intercept release_savepoint() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'release_savepoint')
def receive_release_savepoint(conn, name, context):
    "listen for the 'release_savepoint' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • name – specified name used for the savepoint.

  • context – not used

method sqlalchemy.events.ConnectionEvents.rollback(conn)

Intercept rollback() events, as initiated by a Transaction.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'rollback')
def receive_rollback(conn):
    "listen for the 'rollback' event"

    # ... (event handling logic) ...

Note that the Pool also “auto-rolls back” a DBAPI connection upon checkin, if the reset_on_return flag is set to its default value of 'rollback'. To intercept this rollback, use the PoolEvents.reset() hook.

Parameters:

conn – Connection object

method sqlalchemy.events.ConnectionEvents.rollback_savepoint(conn, name, context)

Intercept rollback_savepoint() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'rollback_savepoint')
def receive_rollback_savepoint(conn, name, context):
    "listen for the 'rollback_savepoint' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • name – specified name used for the savepoint.

  • context – not used

method sqlalchemy.events.ConnectionEvents.rollback_twophase(conn, xid, is_prepared)

Intercept rollback_twophase() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'rollback_twophase')
def receive_rollback_twophase(conn, xid, is_prepared):
    "listen for the 'rollback_twophase' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • xid – two-phase XID identifier

  • is_prepared – boolean, indicates if TwoPhaseTransaction.prepare() was called.

method sqlalchemy.events.ConnectionEvents.savepoint(conn, name)

Intercept savepoint() events.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'savepoint')
def receive_savepoint(conn, name):
    "listen for the 'savepoint' event"

    # ... (event handling logic) ...
Parameters:
  • conn – Connection object

  • name – specified name used for the savepoint.

method sqlalchemy.events.ConnectionEvents.set_connection_execution_options(conn, opts)

Intercept when the Connection.execution_options() method is called.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'set_connection_execution_options')
def receive_set_connection_execution_options(conn, opts):
    "listen for the 'set_connection_execution_options' event"

    # ... (event handling logic) ...

This method is called after the new Connection has been produced, with the newly updated execution options collection, but before the Dialect has acted upon any of those new options.

Note that this method is not called when a new Connection is produced which is inheriting execution options from its parent Engine; to intercept this condition, use the ConnectionEvents.engine_connect() event.

Parameters:
  • conn – the Connection object

  • opts – dictionary of options that were passed to the Connection.execution_options() method.

New in version 0.9.0.

See also

ConnectionEvents.set_engine_execution_options() - event which is called when Engine.execution_options() is called.
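A runnable sketch of this hook (the in-memory SQLite URL and the stream_results option are illustrative choices); note that the opts dictionary holds the options passed to that particular execution_options() call:

```python
from sqlalchemy import create_engine, event

engine = create_engine("sqlite://")
seen = []

@event.listens_for(engine, "set_connection_execution_options")
def receive_opts(conn, opts):
    # opts contains the options passed to this particular
    # execution_options() call
    seen.append(dict(opts))

with engine.connect() as conn:
    conn.execution_options(stream_results=True)
```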

method sqlalchemy.events.ConnectionEvents.set_engine_execution_options(engine, opts)

Intercept when the Engine.execution_options() method is called.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'set_engine_execution_options')
def receive_set_engine_execution_options(engine, opts):
    "listen for the 'set_engine_execution_options' event"

    # ... (event handling logic) ...

The Engine.execution_options() method produces a shallow copy of the Engine which stores the new options. That new Engine is passed here. A particular application of this method is to add a ConnectionEvents.engine_connect() event handler to the given Engine which will perform some per-Connection task specific to these execution options.

Parameters:
  • engine – the Engine object, which is the shallow copy of the original Engine produced by Engine.execution_options()

  • opts – dictionary of options that were passed to the Engine.execution_options() method.

New in version 0.9.0.
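The per-Connection pattern described above can be sketched as follows; "shard_id" is a hypothetical custom execution option, and the SQLite URL is illustrative (the `*args` on the inner listener absorbs the legacy "branch" argument that 1.4 passes to engine_connect):

```python
from sqlalchemy import create_engine, event

engine = create_engine("sqlite://")

@event.listens_for(engine, "set_engine_execution_options")
def receive_opts(shard_engine, opts):
    # shard_engine is the shallow-copied Engine carrying the new options;
    # attach a per-Connection hook that applies only to that copy
    if "shard_id" in opts:  # "shard_id" is a hypothetical custom option
        @event.listens_for(shard_engine, "engine_connect")
        def on_connect(conn, *args):
            conn.info["shard_id"] = opts["shard_id"]

e2 = engine.execution_options(shard_id="shard_1")
with e2.connect() as conn:
    shard = conn.info["shard_id"]
```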

class sqlalchemy.events.DialectEvents

Event interface for execution-replacement functions.

These events allow direct instrumentation and replacement of key dialect functions which interact with the DBAPI.

Note

DialectEvents hooks should be considered semi-public and experimental. These hooks are not for general use and are only for those situations where intricate re-statement of DBAPI mechanics must be injected onto an existing dialect. For general-use statement-interception events, please use the ConnectionEvents interface.

New in version 0.9.4.

Class signature

class sqlalchemy.events.DialectEvents (sqlalchemy.event.Events)

method sqlalchemy.events.DialectEvents.do_connect(dialect, conn_rec, cargs, cparams)

Receive connection arguments before a connection is made.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'do_connect')
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    "listen for the 'do_connect' event"

    # ... (event handling logic) ...

This event is useful in that it allows the handler to manipulate the cargs and/or cparams collections that control how the DBAPI connect() function will be called. cargs will always be a Python list that can be mutated in-place, and cparams a Python dictionary that may also be mutated:

e = create_engine("postgresql+psycopg2://user@host/dbname")

@event.listens_for(e, 'do_connect')
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    cparams["password"] = "some_password"

The event hook may also be used to override the call to connect() entirely, by returning a non-None DBAPI connection object:

e = create_engine("postgresql+psycopg2://user@host/dbname")

@event.listens_for(e, 'do_connect')
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    return psycopg2.connect(*cargs, **cparams)

New in version 1.0.3.

method sqlalchemy.events.DialectEvents.do_execute(cursor, statement, parameters, context)

Receive a cursor to have execute() called.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'do_execute')
def receive_do_execute(cursor, statement, parameters, context):
    "listen for the 'do_execute' event"

    # ... (event handling logic) ...

Return the value True to halt further events from invoking, and to indicate that the cursor execution has already taken place within the event handler.
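A minimal sketch of taking over the execution in this way, assuming an in-memory SQLite engine (the logging list is illustrative):

```python
from sqlalchemy import create_engine, event, text

engine = create_engine("sqlite://")
statements = []

@event.listens_for(engine, "do_execute")
def receive_do_execute(cursor, statement, parameters, context):
    # log the statement, run it ourselves, and return True so that
    # the dialect's own cursor.execute() call is skipped
    statements.append(statement)
    cursor.execute(statement, parameters)
    return True

with engine.connect() as conn:
    value = conn.execute(text("SELECT :x"), {"x": 1}).scalar()
```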

method sqlalchemy.events.DialectEvents.do_execute_no_params(cursor, statement, context)

Receive a cursor to have execute() with no parameters called.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'do_execute_no_params')
def receive_do_execute_no_params(cursor, statement, context):
    "listen for the 'do_execute_no_params' event"

    # ... (event handling logic) ...

Return the value True to halt further events from invoking, and to indicate that the cursor execution has already taken place within the event handler.

method sqlalchemy.events.DialectEvents.do_executemany(cursor, statement, parameters, context)

Receive a cursor to have executemany() called.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'do_executemany')
def receive_do_executemany(cursor, statement, parameters, context):
    "listen for the 'do_executemany' event"

    # ... (event handling logic) ...

Return the value True to halt further events from invoking, and to indicate that the cursor execution has already taken place within the event handler.

method sqlalchemy.events.DialectEvents.do_setinputsizes(inputsizes, cursor, statement, parameters, context)

Receive the setinputsizes dictionary for possible modification.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeEngine, 'do_setinputsizes')
def receive_do_setinputsizes(inputsizes, cursor, statement, parameters, context):
    "listen for the 'do_setinputsizes' event"

    # ... (event handling logic) ...

This event is emitted in the case where the dialect makes use of the DBAPI cursor.setinputsizes() method, which passes information about parameter binding for a particular statement. The given inputsizes dictionary will contain BindParameter objects as keys, linked to DBAPI-specific type objects as values; for parameters that are not bound, they are added to the dictionary with None as the value, which means the parameter will not be included in the ultimate setinputsizes call.

The event may be used to inspect and/or log the datatypes that are being bound, as well as to modify the dictionary in place; parameters can be added, modified, or removed. Callers will typically want to inspect the BindParameter.type attribute of the given bind objects in order to make decisions about the DBAPI object.

After the event, the inputsizes dictionary is converted into an appropriate datastructure to be passed to cursor.setinputsizes; either a list for a positional bound parameter execution style, or a dictionary of string parameter keys to DBAPI type objects for a named bound parameter execution style.

The setinputsizes hook overall is only used for dialects which include the flag use_setinputsizes=True; these include the cx_Oracle, pg8000, asyncpg, and pyodbc dialects.

Note

For use with pyodbc, the use_setinputsizes flag must be passed to the dialect, e.g.:

create_engine("mssql+pyodbc://...", use_setinputsizes=True)

New in version 1.2.9.
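The listener shape can be sketched as below. Note that SQLite does not use setinputsizes, so the hook will never actually fire against this engine; the engine and the "big_" naming rule are purely illustrative:

```python
from sqlalchemy import create_engine, event

# SQLite does not use setinputsizes; this engine is only a stand-in
# to show the listener shape
engine = create_engine("sqlite://")

@event.listens_for(engine, "do_setinputsizes")
def receive_do_setinputsizes(inputsizes, cursor, statement, parameters, context):
    # exclude any bind whose DBAPI type hint should not be forwarded;
    # setting the value to None omits it from the setinputsizes call
    for bindparam in list(inputsizes):
        if bindparam.key.startswith("big_"):  # hypothetical naming rule
            inputsizes[bindparam] = None
```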

Schema Events

Object Name Description

DDLEvents

Define event listeners for schema objects, that is, SchemaItem and other SchemaEventTarget subclasses, including MetaData, Table, Column.

SchemaEventTarget

Base class for elements that are the targets of DDLEvents events.

class sqlalchemy.events.DDLEvents

Define event listeners for schema objects, that is, SchemaItem and other SchemaEventTarget subclasses, including MetaData, Table, Column.

MetaData and Table support events specifically regarding when CREATE and DROP DDL is emitted to the database.

Attachment events are also provided to customize behavior whenever a child schema element is associated with a parent, such as, when a Column is associated with its Table, when a ForeignKeyConstraint is associated with a Table, etc.

Example using the after_create event:

from sqlalchemy import event
from sqlalchemy import Table, Column, MetaData, Integer, text

m = MetaData()
some_table = Table('some_table', m, Column('data', Integer))

def after_create(target, connection, **kw):
    connection.execute(text(
        "ALTER TABLE %s SET name=foo_%s" % (target.name, target.name)
    ))

event.listen(some_table, "after_create", after_create)

DDL events integrate closely with the DDL class and the DDLElement hierarchy of DDL clause constructs, which are themselves appropriate as listener callables:

from sqlalchemy import DDL
event.listen(
    some_table,
    "after_create",
    DDL("ALTER TABLE %(table)s SET name=foo_%(table)s")
)

The methods here define the name of an event as well as the names of members that are passed to listener functions.

For all DDLEvent events, the propagate=True keyword argument will ensure that a given event handler is propagated to copies of the object, which are made when using the Table.to_metadata() method:

from sqlalchemy import DDL
event.listen(
    some_table,
    "after_create",
    DDL("ALTER TABLE %(table)s SET name=foo_%(table)s"),
    propagate=True
)

new_table = some_table.to_metadata(new_metadata)

The above DDL object will also be associated with the Table object represented by new_table.

Class signature

class sqlalchemy.events.DDLEvents (sqlalchemy.event.Events)

method sqlalchemy.events.DDLEvents.after_create(target, connection, **kw)

Called after CREATE statements are emitted.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeSchemaClassOrObject, 'after_create')
def receive_after_create(target, connection, **kw):
    "listen for the 'after_create' event"

    # ... (event handling logic) ...
Parameters:
  • target – the MetaData or Table object which is the target of the event.

  • connection – the Connection where the CREATE statement or statements have been emitted.

  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.

listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.

method sqlalchemy.events.DDLEvents.after_drop(target, connection, **kw)

Called after DROP statements are emitted.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeSchemaClassOrObject, 'after_drop')
def receive_after_drop(target, connection, **kw):
    "listen for the 'after_drop' event"

    # ... (event handling logic) ...
Parameters:
  • target – the MetaData or Table object which is the target of the event.

  • connection – the Connection where the DROP statement or statements have been emitted.

  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.

listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.

method sqlalchemy.events.DDLEvents.after_parent_attach(target, parent)

Called after a SchemaItem is associated with a parent SchemaItem.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeSchemaClassOrObject, 'after_parent_attach')
def receive_after_parent_attach(target, parent):
    "listen for the 'after_parent_attach' event"

    # ... (event handling logic) ...
Parameters:
  • target – the target object

  • parent – the parent to which the target is being attached.

listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.

method sqlalchemy.events.DDLEvents.before_create(target, connection, **kw)

Called before CREATE statements are emitted.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeSchemaClassOrObject, 'before_create')
def receive_before_create(target, connection, **kw):
    "listen for the 'before_create' event"

    # ... (event handling logic) ...
Parameters:
  • target – the MetaData or Table object which is the target of the event.

  • connection – the Connection where the CREATE statement or statements will be emitted.

  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.

listen() accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.

listen() accepts the insert=True modifier for this event; when True, the listener function will be prepended to the internal list of events upon discovery, and execute before registered listener functions that do not pass this argument.

method sqlalchemy.events.DDLEvents.before_drop(target, connection, **kw)

Called before DROP statements are emitted.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeSchemaClassOrObject, 'before_drop')
def receive_before_drop(target, connection, **kw):
    "listen for the 'before_drop' event"

    # ... (event handling logic) ...
Parameters:
  • target – the MetaData or Table object which is the target of the event.

  • connection – the Connection where the DROP statement or statements will be emitted.

  • **kw – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.

listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.

method sqlalchemy.events.DDLEvents.before_parent_attach(target, parent)

Called before a SchemaItem is associated with a parent SchemaItem.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeSchemaClassOrObject, 'before_parent_attach')
def receive_before_parent_attach(target, parent):
    "listen for the 'before_parent_attach' event"

    # ... (event handling logic) ...
Parameters:
  • target – the target object

  • parent – the parent to which the target is being attached.

listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.

method sqlalchemy.events.DDLEvents.column_reflect(inspector, table, column_info)

Called for each unit of ‘column info’ retrieved when a Table is being reflected.

Example argument forms:

from sqlalchemy import event


@event.listens_for(SomeSchemaClassOrObject, 'column_reflect')
def receive_column_reflect(inspector, table, column_info):
    "listen for the 'column_reflect' event"

    # ... (event handling logic) ...

This event is most easily used by applying it to a specific MetaData instance, where it will take effect for all Table objects within that MetaData that undergo reflection:

metadata = MetaData()

@event.listens_for(metadata, 'column_reflect')
def receive_column_reflect(inspector, table, column_info):
    # receives for all Table objects that are reflected
    # under this MetaData
    ...


# will use the above event hook
my_table = Table("my_table", metadata, autoload_with=some_engine)

New in version 1.4.0b2: The DDLEvents.column_reflect() hook may now be applied to a MetaData object as well as the MetaData class itself where it will take place for all Table objects associated with the targeted MetaData.

It may also be applied to the Table class across the board:

from sqlalchemy import Table

@event.listens_for(Table, 'column_reflect')
def receive_column_reflect(inspector, table, column_info):
    # receives for all Table objects that are reflected
    ...

It can also be applied to a specific Table at the point that one is being reflected using the Table.listeners parameter:

t1 = Table(
    "my_table",
    autoload_with=some_engine,
    listeners=[
        ('column_reflect', receive_column_reflect)
    ]
)

The dictionary of column information as returned by the dialect is passed, and can be modified. The dictionary is that returned in each element of the list returned by Inspector.get_columns():

  • name - the column’s name; applied to the Column.name parameter

  • type - the type of this column, which should be an instance of TypeEngine; applied to the Column.type parameter

  • nullable - boolean flag indicating if the column is NULL or NOT NULL; applied to the Column.nullable parameter

  • default - the column’s server default value. This is normally specified as a plain string SQL expression, however the event can pass a FetchedValue, DefaultClause, or text() object as well; applied to the Column.server_default parameter

The event is called before any action is taken against this dictionary, and the contents can be modified; the following additional keys may be added to the dictionary to further modify how the Column is constructed:

  • key – the string key that will be used to access this Column in the .c collection; will be the same as that of name by default.

  • quote – force or un-force quoting on the column name.

listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.
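A runnable sketch of modifying the column dictionary during reflection, here setting the key entry so that mixed-case database columns are exposed under lower-cased attribute keys (the table and naming convention are illustrative):

```python
from sqlalchemy import create_engine, event, MetaData, Table, text

engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE users (UserID INTEGER, UserName TEXT)"))

metadata = MetaData()

@event.listens_for(metadata, "column_reflect")
def receive_column_reflect(inspector, table, column_info):
    # expose lower-cased attribute keys while leaving the actual
    # database column names untouched
    column_info["key"] = column_info["name"].lower()

users = Table("users", metadata, autoload_with=engine)
```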

class sqlalchemy.events.SchemaEventTarget

Base class for elements that are the targets of DDLEvents events.

This includes SchemaItem as well as SchemaType.