SQLAlchemy 2.1 Documentation
Core Events¶
This section describes the event interfaces provided in SQLAlchemy Core. For an introduction to the event listening API, see Events. ORM events are described in ORM Events.
Object Name | Description |
---|---|
Events | Define event listening functions for a particular target type. |
- class sqlalchemy.event.base.Events¶
Define event listening functions for a particular target type.
Members
dispatch
Class signature
class sqlalchemy.event.Events (sqlalchemy.event._HasEventsDispatch)
- attribute sqlalchemy.event.base.Events.dispatch: _Dispatch[_ET] = <sqlalchemy.event.base.EventsDispatch object>¶
reference back to the _Dispatch class.
Bidirectional against _Dispatch._events
Connection Pool Events¶
Object Name | Description |
---|---|
PoolEvents | Available events for Pool. |
PoolResetState | describes the state of a DBAPI connection as it is being passed to the PoolEvents.reset() connection pool event. |
- class sqlalchemy.events.PoolEvents¶
Available events for
Pool
.The methods here define the name of an event as well as the names of members that are passed to listener functions.
e.g.:
from sqlalchemy import event

def my_on_checkout(dbapi_conn, connection_rec, connection_proxy):
    "handle an on checkout event"

event.listen(Pool, 'checkout', my_on_checkout)
In addition to accepting the Pool class and Pool instances, PoolEvents also accepts Engine objects and the Engine class as targets, which will be resolved to the .pool attribute of the given engine or the Pool class:

engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")

# will associate with engine.pool
event.listen(engine, 'checkout', my_on_checkout)
Members
checkin(), checkout(), close(), close_detached(), connect(), detach(), dispatch, first_connect(), invalidate(), reset(), soft_invalidate()
Class signature
class sqlalchemy.events.PoolEvents (sqlalchemy.event.Events)
-
method
sqlalchemy.events.PoolEvents.
checkin(dbapi_connection: DBAPIConnection | None, connection_record: ConnectionPoolEntry) → None¶ Called when a connection returns to the pool.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'checkin')
def receive_checkin(dbapi_connection, connection_record):
    "listen for the 'checkin' event"

    # ... (event handling logic) ...
Note that the connection may be closed, and may be None if the connection has been invalidated.
checkin
will not be called for detached connections. (They do not return to the pool.)- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.
-
method
sqlalchemy.events.PoolEvents.
checkout(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry, connection_proxy: PoolProxiedConnection) → None¶ Called when a connection is retrieved from the Pool.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'checkout')
def receive_checkout(dbapi_connection, connection_record, connection_proxy):
    "listen for the 'checkout' event"

    # ... (event handling logic) ...
- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.connection_proxy¶ – the
PoolProxiedConnection
object which will proxy the public interface of the DBAPI connection for the lifespan of the checkout.
If you raise a
DisconnectionError
, the current connection will be disposed and a fresh connection retrieved. Processing of all checkout listeners will abort and restart using the new connection.See also
ConnectionEvents.engine_connect()
- a similar event which occurs upon creation of a newConnection
.
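For example, a "pessimistic" ping performed inside a checkout listener might look like the following sketch; the engine URL and the SELECT 1 ping statement are illustrative assumptions:

from sqlalchemy import create_engine, event, exc

engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")

@event.listens_for(engine, "checkout")
def ping_connection(dbapi_connection, connection_record, connection_proxy):
    cursor = dbapi_connection.cursor()
    try:
        cursor.execute("SELECT 1")
    except Exception:
        # DisconnectionError discards this connection; the pool then
        # repeats the checkout with a freshly created connection
        raise exc.DisconnectionError()
    finally:
        cursor.close()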
-
method
sqlalchemy.events.PoolEvents.
close(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry) → None¶ Called when a DBAPI connection is closed.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'close')
def receive_close(dbapi_connection, connection_record):
    "listen for the 'close' event"

    # ... (event handling logic) ...
The event is emitted before the close occurs.
The close of a connection can fail; typically this is because the connection is already closed. If the close operation fails, the connection is discarded.
The
close()
event corresponds to a connection that’s still associated with the pool. To intercept close events for detached connections useclose_detached()
.- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.
-
method
sqlalchemy.events.PoolEvents.
close_detached(dbapi_connection: DBAPIConnection) → None¶ Called when a detached DBAPI connection is closed.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'close_detached')
def receive_close_detached(dbapi_connection):
    "listen for the 'close_detached' event"

    # ... (event handling logic) ...
The event is emitted before the close occurs.
The close of a connection can fail; typically this is because the connection is already closed. If the close operation fails, the connection is discarded.
- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.
-
method
sqlalchemy.events.PoolEvents.
connect(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry) → None¶ Called at the moment a particular DBAPI connection is first created for a given
Pool
.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'connect')
def receive_connect(dbapi_connection, connection_record):
    "listen for the 'connect' event"

    # ... (event handling logic) ...
This event allows one to capture the point directly after which the DBAPI module-level
.connect()
method has been used in order to produce a new DBAPI connection.- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.
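As a sketch of a common use, the "connect" hook can apply per-connection settings at the moment the DBAPI connection is created; the SQLite URL and PRAGMA statement below are illustrative assumptions:

from sqlalchemy import create_engine, event

engine = create_engine("sqlite:///example.db")

@event.listens_for(engine, "connect")
def set_sqlite_pragma(dbapi_connection, connection_record):
    # enable foreign key enforcement on each newly created SQLite connection
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()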
-
method
sqlalchemy.events.PoolEvents.
detach(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry) → None¶ Called when a DBAPI connection is “detached” from a pool.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'detach')
def receive_detach(dbapi_connection, connection_record):
    "listen for the 'detach' event"

    # ... (event handling logic) ...
This event is emitted after the detach occurs. The connection is no longer associated with the given connection record.
- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.
-
attribute
sqlalchemy.events.PoolEvents.
dispatch: _Dispatch[_ET] = <sqlalchemy.event.base.PoolEventsDispatch object>¶ reference back to the _Dispatch class.
Bidirectional against _Dispatch._events
-
method
sqlalchemy.events.PoolEvents.
first_connect(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry) → None¶ Called exactly once for the first time a DBAPI connection is checked out from a particular
Pool
.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'first_connect')
def receive_first_connect(dbapi_connection, connection_record):
    "listen for the 'first_connect' event"

    # ... (event handling logic) ...
The rationale for
PoolEvents.first_connect()
is to determine information about a particular series of database connections based on the settings used for all connections. Since a particularPool
refers to a single “creator” function (which in terms of aEngine
refers to the URL and connection options used), it is typically valid to make observations about a single connection that can be safely assumed to be valid about all subsequent connections, such as the database version, the server and client encoding settings, collation settings, and many others.- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.
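A minimal sketch of that idea, recording the server version reported by the first connection so it can be consulted later; the PostgreSQL URL and the SELECT version() query are illustrative assumptions:

from sqlalchemy import create_engine, event

engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")

@event.listens_for(engine, "first_connect")
def record_server_version(dbapi_connection, connection_record):
    # runs once per pool; the observed value is assumed to hold for
    # all subsequent connections made with the same settings
    cursor = dbapi_connection.cursor()
    cursor.execute("SELECT version()")
    connection_record.info["server_version"] = cursor.fetchone()[0]
    cursor.close()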
-
method
sqlalchemy.events.PoolEvents.
invalidate(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry, exception: BaseException | None) → None¶ Called when a DBAPI connection is to be “invalidated”.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'invalidate')
def receive_invalidate(dbapi_connection, connection_record, exception):
    "listen for the 'invalidate' event"

    # ... (event handling logic) ...
This event is called any time the
ConnectionPoolEntry.invalidate()
method is invoked, either from API usage or via “auto-invalidation”, without thesoft
flag.The event occurs before a final attempt to call
.close()
on the connection occurs.- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.exception¶ – the exception object corresponding to the reason for this invalidation, if any. May be
None
.
-
method
sqlalchemy.events.PoolEvents.
reset(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry, reset_state: PoolResetState) → None¶ Called before the “reset” action occurs for a pooled connection.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'reset')
def receive_reset(dbapi_connection, connection_record, reset_state):
    "listen for the 'reset' event"

    # ... (event handling logic) ...

# DEPRECATED calling style (pre-2.0, will be removed in a future release)

@event.listens_for(SomeEngineOrPool, 'reset')
def receive_reset(dbapi_connection, connection_record):
    "listen for the 'reset' event"

    # ... (event handling logic) ...
Changed in version 2.0: The
PoolEvents.reset()
event now accepts the argumentsPoolEvents.reset.dbapi_connection
,PoolEvents.reset.connection_record
,PoolEvents.reset.reset_state
. Support for listener functions which accept the previous argument signature(s) listed above as “deprecated” will be removed in a future release.This event represents when the
rollback()
method is called on the DBAPI connection before it is returned to the pool or discarded. A custom “reset” strategy may be implemented using this event hook, which may also be combined with disabling the default “reset” behavior using thePool.reset_on_return
parameter.The primary difference between the
PoolEvents.reset()
andPoolEvents.checkin()
events are thatPoolEvents.reset()
is called not just for pooled connections that are being returned to the pool, but also for connections that were detached using theConnection.detach()
method as well as asyncio connections that are being discarded due to garbage collection taking place on connections before the connection was checked in.Note that the event is not invoked for connections that were invalidated using
Connection.invalidate()
. These events may be intercepted using thePoolEvents.soft_invalidate()
andPoolEvents.invalidate()
event hooks, and all “connection close” events may be intercepted usingPoolEvents.close()
.The
PoolEvents.reset()
event is usually followed by thePoolEvents.checkin()
event, except in those cases where the connection is discarded immediately after reset.- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.reset_state¶ –
PoolResetState
instance which provides information about the circumstances under which the connection is being reset.New in version 2.0.
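A minimal sketch of a custom reset strategy along the lines described above: the default reset-on-return behavior is disabled with the create_engine.pool_reset_on_return parameter and the listener performs the rollback itself, skipping connections that are only being terminated. The URL is an illustrative assumption:

from sqlalchemy import create_engine, event

engine = create_engine(
    "postgresql+psycopg2://scott:tiger@localhost/test",
    pool_reset_on_return=None,  # disable the default "reset on return"
)

@event.listens_for(engine, "reset")
def custom_reset(dbapi_connection, connection_record, reset_state):
    if not reset_state.terminate_only:
        # the connection is going back to the pool; roll back any
        # remaining transactional state ourselves
        dbapi_connection.rollback()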
-
method
sqlalchemy.events.PoolEvents.
soft_invalidate(dbapi_connection: DBAPIConnection, connection_record: ConnectionPoolEntry, exception: BaseException | None) → None¶ Called when a DBAPI connection is to be “soft invalidated”.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngineOrPool, 'soft_invalidate')
def receive_soft_invalidate(dbapi_connection, connection_record, exception):
    "listen for the 'soft_invalidate' event"

    # ... (event handling logic) ...
This event is called any time the
ConnectionPoolEntry.invalidate()
method is invoked with thesoft
flag.Soft invalidation refers to when the connection record that tracks this connection will force a reconnect after the current connection is checked in. It does not actively close the dbapi_connection at the point at which it is called.
- Parameters:
dbapi_connection¶ – a DBAPI connection. The
ConnectionPoolEntry.dbapi_connection
attribute.connection_record¶ – the
ConnectionPoolEntry
managing the DBAPI connection.exception¶ – the exception object corresponding to the reason for this invalidation, if any. May be
None
.
- class sqlalchemy.events.PoolResetState¶
describes the state of a DBAPI connection as it is being passed to the
PoolEvents.reset()
connection pool event.Members
New in version 2.0.0b3.
-
attribute
sqlalchemy.events.PoolResetState.
asyncio_safe: bool¶ Indicates if the reset operation is occurring within a scope where an enclosing event loop is expected to be present for asyncio applications.
Will be False in the case that the connection is being garbage collected.
-
attribute
sqlalchemy.events.PoolResetState.
terminate_only: bool¶ indicates if the connection is to be immediately terminated and not checked in to the pool.
This occurs for connections that were invalidated, as well as asyncio connections that were not cleanly handled by the calling code that are instead being garbage collected. In the latter case, operations can’t be safely run on asyncio connections within garbage collection as there is not necessarily an event loop present.
-
attribute
sqlalchemy.events.PoolResetState.
transaction_was_reset: bool¶ Indicates if the transaction on the DBAPI connection was already essentially “reset” back by the
Connection
object.This boolean is True if the
Connection
had transactional state present upon it, which was then not closed using theConnection.rollback()
orConnection.commit()
method; instead, the transaction was closed inline within theConnection.close()
method so is guaranteed to remain non-present when this event is reached.
SQL Execution and Connection Events¶
Object Name | Description |
---|---|
ConnectionEvents | Available events for Connection and Engine. |
DialectEvents | event interface for execution-replacement functions. |
- class sqlalchemy.events.ConnectionEvents¶
Available events for
Connection
andEngine
.The methods here define the name of an event as well as the names of members that are passed to listener functions.
An event listener can be associated with any
Connection
orEngine
class or instance, such as anEngine
, e.g.:

from sqlalchemy import event, create_engine

def before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
    log.info("Received statement: %s", statement)

engine = create_engine('postgresql+psycopg2://scott:tiger@localhost/test')
event.listen(engine, "before_cursor_execute", before_cursor_execute)
or with a specific
Connection
:

with engine.begin() as conn:
    @event.listens_for(conn, 'before_cursor_execute')
    def before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
        log.info("Received statement: %s", statement)
When the methods are called with a statement parameter, such as in
after_cursor_execute()
orbefore_cursor_execute()
, the statement is the exact SQL string that was prepared for transmission to the DBAPIcursor
in the connection’sDialect
.The
before_execute()
andbefore_cursor_execute()
events can also be established with theretval=True
flag, which allows modification of the statement and parameters to be sent to the database. Thebefore_cursor_execute()
event is particularly useful here to add ad-hoc string transformations, such as comments, to all executions:

from sqlalchemy.engine import Engine
from sqlalchemy import event

@event.listens_for(Engine, "before_cursor_execute", retval=True)
def comment_sql_calls(conn, cursor, statement, parameters, context, executemany):
    statement = statement + " -- some comment"
    return statement, parameters
Note
ConnectionEvents
can be established on any combination ofEngine
,Connection
, as well as instances of each of those classes. Events across all four scopes will fire off for a given instance ofConnection
. However, for performance reasons, theConnection
object determines at instantiation time whether or not its parentEngine
has event listeners established. Event listeners added to theEngine
class or to an instance ofEngine
after the instantiation of a dependentConnection
instance will usually not be available on thatConnection
instance. The newly added listeners will instead take effect forConnection
instances created subsequent to those event listeners being established on the parentEngine
class or instance.- Parameters:
retval=False¶ – Applies to the
before_execute()
andbefore_cursor_execute()
events only. When True, the user-defined event function must have a return value, which is a tuple of parameters that replace the given statement and parameters. See those methods for a description of specific return arguments.
Members
after_cursor_execute(), after_execute(), before_cursor_execute(), before_execute(), begin(), begin_twophase(), commit(), commit_twophase(), dispatch, engine_connect(), engine_disposed(), prepare_twophase(), release_savepoint(), rollback(), rollback_savepoint(), rollback_twophase(), savepoint(), set_connection_execution_options(), set_engine_execution_options()
Class signature
class sqlalchemy.events.ConnectionEvents (sqlalchemy.event.Events)
-
method
sqlalchemy.events.ConnectionEvents.
after_cursor_execute(conn: Connection, cursor: DBAPICursor, statement: str, parameters: _DBAPIAnyExecuteParams, context: ExecutionContext | None, executemany: bool) → None¶ Intercept low-level cursor execute() events after execution.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'after_cursor_execute')
def receive_after_cursor_execute(conn, cursor, statement, parameters, context, executemany):
    "listen for the 'after_cursor_execute' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectcursor¶ – DBAPI cursor object. Will have results pending if the statement was a SELECT, but these should not be consumed as they will be needed by the
CursorResult
.statement¶ – string SQL statement, as passed to the DBAPI
parameters¶ – Dictionary, tuple, or list of parameters being passed to the
execute()
orexecutemany()
method of the DBAPIcursor
. In some cases may beNone
.context¶ –
ExecutionContext
object in use. May beNone
.executemany¶ – boolean, if
True
, this is anexecutemany()
call, ifFalse
, this is anexecute()
call.
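A common pattern is to pair this hook with ConnectionEvents.before_cursor_execute() to measure statement execution time, stashing the start time in the Connection.info dictionary; the logger name is an illustrative assumption:

import logging
import time

from sqlalchemy import event
from sqlalchemy.engine import Engine

logger = logging.getLogger("myapp.sqltime")

@event.listens_for(Engine, "before_cursor_execute")
def _start_timer(conn, cursor, statement, parameters, context, executemany):
    conn.info.setdefault("query_start_time", []).append(time.time())

@event.listens_for(Engine, "after_cursor_execute")
def _log_elapsed(conn, cursor, statement, parameters, context, executemany):
    elapsed = time.time() - conn.info["query_start_time"].pop(-1)
    logger.debug("Query completed in %f seconds", elapsed)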
-
method
sqlalchemy.events.ConnectionEvents.
after_execute(conn: Connection, clauseelement: Executable, multiparams: _CoreMultiExecuteParams, params: _CoreSingleExecuteParams, execution_options: _ExecuteOptions, result: Result[Unpack[TupleAny]]) → None¶ Intercept high level execute() events after execute.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'after_execute')
def receive_after_execute(conn, clauseelement, multiparams, params, execution_options, result):
    "listen for the 'after_execute' event"

    # ... (event handling logic) ...

# DEPRECATED calling style (pre-1.4, will be removed in a future release)

@event.listens_for(SomeEngine, 'after_execute')
def receive_after_execute(conn, clauseelement, multiparams, params, result):
    "listen for the 'after_execute' event"

    # ... (event handling logic) ...
Changed in version 1.4: The
ConnectionEvents.after_execute()
event now accepts the argumentsConnectionEvents.after_execute.conn
,ConnectionEvents.after_execute.clauseelement
,ConnectionEvents.after_execute.multiparams
,ConnectionEvents.after_execute.params
,ConnectionEvents.after_execute.execution_options
,ConnectionEvents.after_execute.result
. Support for listener functions which accept the previous argument signature(s) listed above as “deprecated” will be removed in a future release.- Parameters:
conn¶ –
Connection
objectclauseelement¶ – SQL expression construct,
Compiled
instance, or string statement passed toConnection.execute()
.multiparams¶ – Multiple parameter sets, a list of dictionaries.
params¶ – Single parameter set, a single dictionary.
execution_options¶ –
dictionary of execution options passed along with the statement, if any. This is a merge of all options that will be used, including those of the statement, the connection, and those passed in to the method itself for the 2.0 style of execution.
result¶ –
CursorResult
generated by the execution.
-
method
sqlalchemy.events.ConnectionEvents.
before_cursor_execute(conn: Connection, cursor: DBAPICursor, statement: str, parameters: _DBAPIAnyExecuteParams, context: ExecutionContext | None, executemany: bool) → Tuple[str, _DBAPIAnyExecuteParams] | None¶ Intercept low-level cursor execute() events before execution, receiving the string SQL statement and DBAPI-specific parameter list to be invoked against a cursor.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'before_cursor_execute')
def receive_before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
    "listen for the 'before_cursor_execute' event"

    # ... (event handling logic) ...
This event is a good choice for logging as well as late modifications to the SQL string. It’s less ideal for parameter modifications except for those which are specific to a target backend.
This event can be optionally established with the
retval=True
flag. Thestatement
andparameters
arguments should be returned as a two-tuple in this case:

@event.listens_for(Engine, "before_cursor_execute", retval=True)
def before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
    # do something with statement, parameters
    return statement, parameters
See the example at
ConnectionEvents
.- Parameters:
conn¶ –
Connection
objectcursor¶ – DBAPI cursor object
statement¶ – string SQL statement, as to be passed to the DBAPI
parameters¶ – Dictionary, tuple, or list of parameters being passed to the
execute()
orexecutemany()
method of the DBAPIcursor
. In some cases may beNone
.context¶ –
ExecutionContext
object in use. May beNone
.executemany¶ – boolean, if
True
, this is anexecutemany()
call, ifFalse
, this is anexecute()
call.
-
method
sqlalchemy.events.ConnectionEvents.
before_execute(conn: Connection, clauseelement: Executable, multiparams: _CoreMultiExecuteParams, params: _CoreSingleExecuteParams, execution_options: _ExecuteOptions) → Tuple[Executable, _CoreMultiExecuteParams, _CoreSingleExecuteParams] | None¶ Intercept high level execute() events, receiving uncompiled SQL constructs and other objects prior to rendering into SQL.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'before_execute')
def receive_before_execute(conn, clauseelement, multiparams, params, execution_options):
    "listen for the 'before_execute' event"

    # ... (event handling logic) ...

# DEPRECATED calling style (pre-1.4, will be removed in a future release)

@event.listens_for(SomeEngine, 'before_execute')
def receive_before_execute(conn, clauseelement, multiparams, params):
    "listen for the 'before_execute' event"

    # ... (event handling logic) ...
Changed in version 1.4: The
ConnectionEvents.before_execute()
event now accepts the argumentsConnectionEvents.before_execute.conn
,ConnectionEvents.before_execute.clauseelement
,ConnectionEvents.before_execute.multiparams
,ConnectionEvents.before_execute.params
,ConnectionEvents.before_execute.execution_options
. Support for listener functions which accept the previous argument signature(s) listed above as “deprecated” will be removed in a future release.This event is good for debugging SQL compilation issues as well as early manipulation of the parameters being sent to the database, as the parameter lists will be in a consistent format here.
This event can be optionally established with the
retval=True
flag. Theclauseelement
,multiparams
, andparams
arguments should be returned as a three-tuple in this case:

@event.listens_for(Engine, "before_execute", retval=True)
def before_execute(conn, clauseelement, multiparams, params):
    # do something with clauseelement, multiparams, params
    return clauseelement, multiparams, params
- Parameters:
conn¶ –
Connection
objectclauseelement¶ – SQL expression construct,
Compiled
instance, or string statement passed toConnection.execute()
.multiparams¶ – Multiple parameter sets, a list of dictionaries.
params¶ – Single parameter set, a single dictionary.
execution_options¶ –
dictionary of execution options passed along with the statement, if any. This is a merge of all options that will be used, including those of the statement, the connection, and those passed in to the method itself for the 2.0 style of execution.
-
method
sqlalchemy.events.ConnectionEvents.
begin(conn: Connection) → None¶ Intercept begin() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'begin')
def receive_begin(conn):
    "listen for the 'begin' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
object
-
method
sqlalchemy.events.ConnectionEvents.
begin_twophase(conn: Connection, xid: Any) → None¶ Intercept begin_twophase() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'begin_twophase')
def receive_begin_twophase(conn, xid):
    "listen for the 'begin_twophase' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectxid¶ – two-phase XID identifier
-
method
sqlalchemy.events.ConnectionEvents.
commit(conn: Connection) → None¶ Intercept commit() events, as initiated by a
Transaction
.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'commit')
def receive_commit(conn):
    "listen for the 'commit' event"

    # ... (event handling logic) ...
Note that the
Pool
may also “auto-commit” a DBAPI connection upon checkin, if thereset_on_return
flag is set to the value'commit'
. To intercept this commit, use thePoolEvents.reset()
hook.- Parameters:
conn¶ –
Connection
object
-
method
sqlalchemy.events.ConnectionEvents.
commit_twophase(conn: Connection, xid: Any, is_prepared: bool) → None¶ Intercept commit_twophase() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'commit_twophase')
def receive_commit_twophase(conn, xid, is_prepared):
    "listen for the 'commit_twophase' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectxid¶ – two-phase XID identifier
is_prepared¶ – boolean, indicates if
TwoPhaseTransaction.prepare()
was called.
-
attribute
sqlalchemy.events.ConnectionEvents.
dispatch: _Dispatch[_ET] = <sqlalchemy.event.base.ConnectionEventsDispatch object>¶ reference back to the _Dispatch class.
Bidirectional against _Dispatch._events
-
method
sqlalchemy.events.ConnectionEvents.
engine_connect(conn: Connection) → None¶ Intercept the creation of a new
Connection
.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'engine_connect')
def receive_engine_connect(conn):
    "listen for the 'engine_connect' event"

    # ... (event handling logic) ...

# DEPRECATED calling style (pre-2.0, will be removed in a future release)

@event.listens_for(SomeEngine, 'engine_connect')
def receive_engine_connect(conn, branch):
    "listen for the 'engine_connect' event"

    # ... (event handling logic) ...
Changed in version 2.0: The
ConnectionEvents.engine_connect()
event now accepts the argumentsConnectionEvents.engine_connect.conn
. Support for listener functions which accept the previous argument signature(s) listed above as “deprecated” will be removed in a future release.This event is called typically as the direct result of calling the
Engine.connect()
method.It differs from the
PoolEvents.connect()
method, which refers to the actual connection to a database at the DBAPI level; a DBAPI connection may be pooled and reused for many operations. In contrast, this event refers only to the production of a higher levelConnection
wrapper around such a DBAPI connection.It also differs from the
PoolEvents.checkout()
event in that it is specific to theConnection
object, not the DBAPI connection thatPoolEvents.checkout()
deals with, although this DBAPI connection is available here via theConnection.connection
attribute. But note there can in fact be multiplePoolEvents.checkout()
events within the lifespan of a singleConnection
object, if thatConnection
is invalidated and re-established.- Parameters:
conn¶ –
Connection
object.
See also
PoolEvents.checkout()
the lower-level pool checkout event for an individual DBAPI connection
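A minimal sketch of observing each new Connection as it is produced by this hook; the logger name is an illustrative assumption:

import logging

from sqlalchemy import create_engine, event

logger = logging.getLogger("myapp.engine")
engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")

@event.listens_for(engine, "engine_connect")
def receive_engine_connect(conn):
    # fires for each new Connection wrapper, not for each DBAPI connection
    logger.debug(
        "New Connection produced; underlying DBAPI connection: %s",
        conn.connection.dbapi_connection,
    )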
-
method
sqlalchemy.events.ConnectionEvents.
engine_disposed(engine: Engine) → None¶ Intercept when the
Engine.dispose()
method is called.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'engine_disposed')
def receive_engine_disposed(engine):
    "listen for the 'engine_disposed' event"

    # ... (event handling logic) ...
The Engine.dispose() method instructs the engine to “dispose” of its connection pool (e.g. Pool), and replaces it with a new one. Disposing of the old pool has the effect that existing checked-in connections are closed. The new pool does not establish any new connections until it is first used.
This event can be used to indicate that resources related to the Engine should also be cleaned up, keeping in mind that the Engine can still be used for new requests in which case it re-acquires connection resources.
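As a sketch, an application might use this hook to drop its own references to the engine once disposal occurs; the registry dictionary here is a hypothetical application-level structure:

from sqlalchemy import create_engine, event

engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")
engine_registry = {"app_engine": engine}  # hypothetical application registry

@event.listens_for(engine, "engine_disposed")
def receive_engine_disposed(engine):
    # release application-level resources tied to this engine
    engine_registry.pop("app_engine", None)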
-
method
sqlalchemy.events.ConnectionEvents.
prepare_twophase(conn: Connection, xid: Any) → None¶ Intercept prepare_twophase() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'prepare_twophase')
def receive_prepare_twophase(conn, xid):
    "listen for the 'prepare_twophase' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectxid¶ – two-phase XID identifier
-
method
sqlalchemy.events.ConnectionEvents.
release_savepoint(conn: Connection, name: str, context: None) → None¶ Intercept release_savepoint() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'release_savepoint')
def receive_release_savepoint(conn, name, context):
    "listen for the 'release_savepoint' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectname¶ – specified name used for the savepoint.
context¶ – not used
-
method
sqlalchemy.events.ConnectionEvents.
rollback(conn: Connection) → None¶ Intercept rollback() events, as initiated by a
Transaction
.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'rollback')
def receive_rollback(conn):
    "listen for the 'rollback' event"

    # ... (event handling logic) ...
Note that the
Pool
also “auto-rolls back” a DBAPI connection upon checkin, if thereset_on_return
flag is set to its default value of'rollback'
. To intercept this rollback, use thePoolEvents.reset()
hook.- Parameters:
conn¶ –
Connection
object
-
method
sqlalchemy.events.ConnectionEvents.
rollback_savepoint(conn: Connection, name: str, context: None) → None¶ Intercept rollback_savepoint() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'rollback_savepoint')
def receive_rollback_savepoint(conn, name, context):
    "listen for the 'rollback_savepoint' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectname¶ – specified name used for the savepoint.
context¶ – not used
-
method
sqlalchemy.events.ConnectionEvents.
rollback_twophase(conn: Connection, xid: Any, is_prepared: bool) → None¶ Intercept rollback_twophase() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'rollback_twophase')
def receive_rollback_twophase(conn, xid, is_prepared):
    "listen for the 'rollback_twophase' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectxid¶ – two-phase XID identifier
is_prepared¶ – boolean, indicates if
TwoPhaseTransaction.prepare()
was called.
-
method
sqlalchemy.events.ConnectionEvents.
savepoint(conn: Connection, name: str) → None¶ Intercept savepoint() events.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'savepoint')
def receive_savepoint(conn, name):
    "listen for the 'savepoint' event"

    # ... (event handling logic) ...
- Parameters:
conn¶ –
Connection
objectname¶ – specified name used for the savepoint.
-
method
sqlalchemy.events.ConnectionEvents.
set_connection_execution_options(conn: Connection, opts: Dict[str, Any]) → None¶ Intercept when the
Connection.execution_options()
method is called.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'set_connection_execution_options')
def receive_set_connection_execution_options(conn, opts):
    "listen for the 'set_connection_execution_options' event"

    # ... (event handling logic) ...
This method is called after the new
Connection
has been produced, with the newly updated execution options collection, but before theDialect
has acted upon any of those new options.Note that this method is not called when a new
Connection
is produced which is inheriting execution options from its parentEngine
; to intercept this condition, use theConnectionEvents.engine_connect()
event.- Parameters:
conn¶ – The newly copied
Connection
objectopts¶ –
dictionary of options that were passed to the
Connection.execution_options()
method. This dictionary may be modified in place to affect the ultimate options which take effect.New in version 2.0: the
opts
dictionary may be modified in place.
See also
ConnectionEvents.set_engine_execution_options()
- event which is called whenEngine.execution_options()
is called.
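For example, a listener may inspect the options or adjust the opts dictionary in place before the dialect acts upon it; the isolation level chosen below is an illustrative assumption:

from sqlalchemy import create_engine, event

engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")

@event.listens_for(engine, "set_connection_execution_options")
def apply_default_isolation(conn, opts):
    # opts may be modified in place (SQLAlchemy 2.0+) to change the
    # options that ultimately take effect
    opts.setdefault("isolation_level", "READ COMMITTED")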
-
method
sqlalchemy.events.ConnectionEvents.
set_engine_execution_options(engine: Engine, opts: Dict[str, Any]) → None¶ Intercept when the
Engine.execution_options()
method is called.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'set_engine_execution_options')
def receive_set_engine_execution_options(engine, opts):
    "listen for the 'set_engine_execution_options' event"

    # ... (event handling logic) ...
The
Engine.execution_options()
method produces a shallow copy of theEngine
which stores the new options. That newEngine
is passed here. A particular application of this method is to add aConnectionEvents.engine_connect()
event handler to the givenEngine
which will perform some per-Connection
task specific to these execution options.- Parameters:
opts¶ –
dictionary of options that were passed to the
Engine.execution_options()
method. This dictionary may be modified in place to affect the ultimate options which take effect.New in version 2.0: the
opts
dictionary may be modified in place.
See also
ConnectionEvents.set_connection_execution_options()
- event which is called whenConnection.execution_options()
is called.
- class sqlalchemy.events.DialectEvents¶
event interface for execution-replacement functions.
These events allow direct instrumentation and replacement of key dialect functions which interact with the DBAPI.
Note
DialectEvents
hooks should be considered semi-public and experimental. These hooks are not for general use and are only for those situations where intricate re-statement of DBAPI mechanics must be injected onto an existing dialect. For general-use statement-interception events, please use theConnectionEvents
interface.See also
ConnectionEvents.before_cursor_execute()
ConnectionEvents.before_execute()
Members
dispatch, do_connect(), do_execute(), do_execute_no_params(), do_executemany(), do_setinputsizes(), handle_error()
Class signature
class sqlalchemy.events.DialectEvents (sqlalchemy.event.Events)
-
attribute
sqlalchemy.events.DialectEvents.
dispatch: _Dispatch[_ET] = <sqlalchemy.event.base.DialectEventsDispatch object>¶ reference back to the _Dispatch class.
Bidirectional against _Dispatch._events
-
method
sqlalchemy.events.DialectEvents.
do_connect(dialect: Dialect, conn_rec: ConnectionPoolEntry, cargs: Tuple[Any, ...], cparams: Dict[str, Any]) → DBAPIConnection | None¶ Receive connection arguments before a connection is made.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'do_connect')
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    "listen for the 'do_connect' event"

    # ... (event handling logic) ...
This event is useful in that it allows the handler to manipulate the cargs and/or cparams collections that control how the DBAPI
connect()
function will be called.cargs
will always be a Python list that can be mutated in-place, andcparams
a Python dictionary that may also be mutated:

e = create_engine("postgresql+psycopg2://user@host/dbname")

@event.listens_for(e, 'do_connect')
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    cparams["password"] = "some_password"
The event hook may also be used to override the call to
connect()
entirely, by returning a non-None
DBAPI connection object:

e = create_engine("postgresql+psycopg2://user@host/dbname")

@event.listens_for(e, 'do_connect')
def receive_do_connect(dialect, conn_rec, cargs, cparams):
    return psycopg2.connect(*cargs, **cparams)
-
method
sqlalchemy.events.DialectEvents.
do_execute(cursor: DBAPICursor, statement: str, parameters: _DBAPISingleExecuteParams, context: ExecutionContext) → Literal[True] | None¶ Receive a cursor to have execute() called.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'do_execute')
def receive_do_execute(cursor, statement, parameters, context):
    "listen for the 'do_execute' event"

    # ... (event handling logic) ...
Return the value True to halt further events from invoking, and to indicate that the cursor execution has already taken place within the event handler.
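A sketch of that short-circuit pattern: the handler performs the cursor execution itself and returns True so that the default execution, and any remaining listeners, are skipped. The engine URL is illustrative:

from sqlalchemy import create_engine, event

engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")

@event.listens_for(engine, "do_execute")
def receive_do_execute(cursor, statement, parameters, context):
    # perform the execution directly against the DBAPI cursor
    cursor.execute(statement, parameters)
    return True  # signal that execution has already taken place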
-
method
sqlalchemy.events.DialectEvents.
do_execute_no_params(cursor: DBAPICursor, statement: str, context: ExecutionContext) → Literal[True] | None¶ Receive a cursor to have execute() with no parameters called.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'do_execute_no_params')
def receive_do_execute_no_params(cursor, statement, context):
    "listen for the 'do_execute_no_params' event"

    # ... (event handling logic) ...
Return the value True to halt further events from invoking, and to indicate that the cursor execution has already taken place within the event handler.
-
method
sqlalchemy.events.DialectEvents.
do_executemany(cursor: DBAPICursor, statement: str, parameters: _DBAPIMultiExecuteParams, context: ExecutionContext) → Literal[True] | None¶ Receive a cursor to have executemany() called.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'do_executemany')
def receive_do_executemany(cursor, statement, parameters, context):
    "listen for the 'do_executemany' event"

    # ... (event handling logic) ...
Return the value True to halt further events from invoking, and to indicate that the cursor execution has already taken place within the event handler.
-
method
sqlalchemy.events.DialectEvents.
do_setinputsizes(inputsizes: Dict[BindParameter[Any], Any], cursor: DBAPICursor, statement: str, parameters: _DBAPIAnyExecuteParams, context: ExecutionContext) → None¶ Receive the setinputsizes dictionary for possible modification.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'do_setinputsizes')
def receive_do_setinputsizes(inputsizes, cursor, statement, parameters, context):
    "listen for the 'do_setinputsizes' event"

    # ... (event handling logic) ...
This event is emitted in the case where the dialect makes use of the DBAPI
cursor.setinputsizes()
method which passes information about parameter binding for a particular statement. The giveninputsizes
dictionary will containBindParameter
objects as keys, linked to DBAPI-specific type objects as values; for parameters that are not bound, they are added to the dictionary withNone
as the value, which means the parameter will not be included in the ultimate setinputsizes call. The event may be used to inspect and/or log the datatypes that are being bound, as well as to modify the dictionary in place. Parameters can be added, modified, or removed from this dictionary. Callers will typically want to inspect theBindParameter.type
attribute of the given bind objects in order to make decisions about the DBAPI object.After the event, the
inputsizes
dictionary is converted into an appropriate datastructure to be passed tocursor.setinputsizes
; either a list for a positional bound parameter execution style, or a dictionary of string parameter keys to DBAPI type objects for a named bound parameter execution style.The setinputsizes hook overall is only used for dialects which include the flag
use_setinputsizes=True
. Dialects which use this include cx_Oracle, pg8000, asyncpg, and pyodbc dialects.Note
For use with pyodbc, the
use_setinputsizes
flag must be passed to the dialect, e.g.:

create_engine("mssql+pyodbc://...", use_setinputsizes=True)
New in version 1.2.9.
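A sketch of pruning entries from the inputsizes dictionary in place, so that the affected parameters are excluded from the eventual setinputsizes call; the pyodbc URL is an illustrative assumption:

from sqlalchemy import String, create_engine, event

engine = create_engine("mssql+pyodbc://scott:tiger@mydsn", use_setinputsizes=True)

@event.listens_for(engine, "do_setinputsizes")
def skip_string_sizing(inputsizes, cursor, statement, parameters, context):
    # drop explicit sizing for plain String parameters
    for bindparam in list(inputsizes):
        if isinstance(bindparam.type, String):
            del inputsizes[bindparam]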
-
method
sqlalchemy.events.DialectEvents.
handle_error(exception_context: ExceptionContext) → BaseException | None¶ Intercept all exceptions processed by the
Dialect
, typically but not limited to those emitted within the scope of aConnection
.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeEngine, 'handle_error')
def receive_handle_error(exception_context):
    "listen for the 'handle_error' event"

    # ... (event handling logic) ...
Changed in version 2.0: the
DialectEvents.handle_error()
event is moved to theDialectEvents
class, moved from theConnectionEvents
class, so that it may also participate in the “pre ping” operation configured with thecreate_engine.pool_pre_ping
parameter. The event remains registered by using theEngine
as the event target, however note that using theConnection
as an event target forDialectEvents.handle_error()
is no longer supported.This includes all exceptions emitted by the DBAPI as well as within SQLAlchemy’s statement invocation process, including encoding errors and other statement validation errors. Other areas in which the event is invoked include transaction begin and end, result row fetching, cursor creation.
Note that
handle_error()
may support new kinds of exceptions and new calling scenarios at any time. Code which uses this event must expect new calling patterns to be present in minor releases.To support the wide variety of members that correspond to an exception, as well as to allow extensibility of the event without backwards incompatibility, the sole argument received is an instance of
ExceptionContext
. This object contains data members representing detail about the exception.Use cases supported by this hook include:
read-only, low-level exception handling for logging and debugging purposes
Establishing whether a DBAPI connection error message indicates that the database connection needs to be reconnected, including for the “pre_ping” handler used by some dialects
Establishing or disabling whether a connection or the owning connection pool is invalidated or expired in response to a specific exception
exception re-writing
The hook is called while the cursor from the failed operation (if any) is still open and accessible. Special cleanup operations can be called on this cursor; SQLAlchemy will attempt to close this cursor subsequent to this hook being invoked.
As of SQLAlchemy 2.0, the “pre_ping” handler enabled using the
create_engine.pool_pre_ping
parameter will also participate in thehandle_error()
process, for those dialects that rely upon disconnect codes to detect database liveness. Note that some dialects such as psycopg, psycopg2, and most MySQL dialects make use of a nativeping()
method supplied by the DBAPI which does not make use of disconnect codes.Changed in version 2.0.0: The
DialectEvents.handle_error()
event hook participates in connection pool “pre-ping” operations. Within this usage, theExceptionContext.engine
attribute will beNone
, however theDialect
in use is always available via theExceptionContext.dialect
attribute.Changed in version 2.0.5: Added
ExceptionContext.is_pre_ping
attribute which will be set toTrue
when theDialectEvents.handle_error()
event hook is triggered within a connection pool pre-ping operation.Changed in version 2.0.5: An issue was repaired that allows for the PostgreSQL
psycopg
andpsycopg2
drivers, as well as all MySQL drivers, to properly participate in theDialectEvents.handle_error()
event hook during connection pool “pre-ping” operations; previously, the implementation was non-working for these drivers.A handler function has two options for replacing the SQLAlchemy-constructed exception into one that is user defined. It can either raise this new exception directly, in which case all further event listeners are bypassed and the exception will be raised, after appropriate cleanup as taken place:
@event.listens_for(Engine, "handle_error")
def handle_exception(context):
    if isinstance(context.original_exception, psycopg2.OperationalError) and \
            "failed" in str(context.original_exception):
        raise MySpecialException("failed operation")
Warning
Because the
DialectEvents.handle_error()
event specifically provides for exceptions to be re-thrown as the ultimate exception raised by the failed statement, stack traces will be misleading if the user-defined event handler itself fails and throws an unexpected exception; the stack trace may not illustrate the actual code line that failed! It is advised to code carefully here and use logging and/or inline debugging if unexpected exceptions are occurring.Alternatively, a “chained” style of event handling can be used, by configuring the handler with the
retval=True
modifier and returning the new exception instance from the function. In this case, event handling will continue onto the next handler. The “chained” exception is available usingExceptionContext.chained_exception
:

@event.listens_for(Engine, "handle_error", retval=True)
def handle_exception(context):
    if context.chained_exception is not None and \
            "special" in context.chained_exception.message:
        return MySpecialException("failed", cause=context.chained_exception)
Handlers that return
None
may be used within the chain; when a handler returnsNone
, the previous exception instance, if any, is maintained as the current exception that is passed onto the next handler.When a custom exception is raised or returned, SQLAlchemy raises this new exception as-is, it is not wrapped by any SQLAlchemy object. If the exception is not a subclass of
sqlalchemy.exc.StatementError
, certain features may not be available; currently this includes the ORM’s feature of adding a detail hint about “autoflush” to exceptions raised within the autoflush process.- Parameters:
context¶ – an
ExceptionContext
object. See this class for details on all available members.
Schema Events¶
Object Name | Description |
---|---|
DDLEvents | Define event listeners for schema objects, that is, SchemaItem and other SchemaEventTarget subclasses, including MetaData, Table, Column, etc. |
SchemaEventTarget | Base class for elements that are the targets of DDLEvents events. |
- class sqlalchemy.events.DDLEvents¶
Define event listeners for schema objects, that is,
SchemaItem
and otherSchemaEventTarget
subclasses, includingMetaData
,Table
,Column
, etc.Create / Drop Events
Events emitted when CREATE and DROP commands are emitted to the database. The event hooks in this category include
DDLEvents.before_create()
,DDLEvents.after_create()
,DDLEvents.before_drop()
, andDDLEvents.after_drop()
.These events are emitted when using schema-level methods such as
MetaData.create_all()
andMetaData.drop_all()
. Per-object create/drop methods such asTable.create()
,Table.drop()
,Index.create()
are also included, as well as dialect-specific methods such asENUM.create()
.New in version 2.0:
DDLEvents
event hooks now take place for non-table objects including constraints, indexes, and dialect-specific schema types.Event hooks may be attached directly to a
Table
object or to aMetaData
collection, as well as to anySchemaItem
class or object that can be individually created and dropped using a distinct SQL command. Such classes includeIndex
,Sequence
, and dialect-specific classes such asENUM
.Example using the
DDLEvents.after_create()
event, where a custom event hook will emit anALTER TABLE
command on the current connection, afterCREATE TABLE
is emitted:

from sqlalchemy import create_engine
from sqlalchemy import event
from sqlalchemy import text
from sqlalchemy import Table, Column, MetaData, Integer

m = MetaData()
some_table = Table('some_table', m, Column('data', Integer))

@event.listens_for(some_table, "after_create")
def after_create(target, connection, **kw):
    connection.execute(text(
        "ALTER TABLE %s SET name=foo_%s" % (target.name, target.name)
    ))

some_engine = create_engine("postgresql://scott:tiger@host/test")

# will emit "CREATE TABLE some_table" as well as the above
# "ALTER TABLE" statement afterwards
m.create_all(some_engine)
Constraint objects such as
ForeignKeyConstraint
,UniqueConstraint
,CheckConstraint
may also be subscribed to these events, however they will not normally produce events as these objects are usually rendered inline within an enclosingCREATE TABLE
statement and implicitly dropped from aDROP TABLE
statement.For the
Index
construct, the event hook will be emitted forCREATE INDEX
, however SQLAlchemy does not normally emitDROP INDEX
when dropping tables as this is again implicit within theDROP TABLE
statement.New in version 2.0: Support for
SchemaItem
objects for create/drop events was expanded from its previous support forMetaData
andTable
to also includeConstraint
and all subclasses,Index
,Sequence
and some type-related constructs such asENUM
.Note
These event hooks are only emitted within the scope of SQLAlchemy’s create/drop methods; they are not necessarily supported by tools such as alembic.
Attachment Events
Attachment events are provided to customize behavior whenever a child schema element is associated with a parent, such as when a
Column
is associated with itsTable
, when aForeignKeyConstraint
is associated with aTable
, etc. These events includeDDLEvents.before_parent_attach()
andDDLEvents.after_parent_attach()
.Reflection Events
The
DDLEvents.column_reflect()
event is used to intercept and modify the in-Python definition of database columns when reflection of database tables proceeds.Use with Generic DDL
DDL events integrate closely with the
DDL
class and theExecutableDDLElement
hierarchy of DDL clause constructs, which are themselves appropriate as listener callables:

from sqlalchemy import DDL

event.listen(
    some_table,
    "after_create",
    DDL("ALTER TABLE %(table)s SET name=foo_%(table)s")
)
Event Propagation to MetaData Copies
For all DDLEvents events, the propagate=True keyword argument will ensure that a given event handler is propagated to copies of the object, which are made when using the Table.to_metadata() method:

from sqlalchemy import DDL

metadata = MetaData()
some_table = Table("some_table", metadata, Column("data", Integer))

event.listen(
    some_table,
    "after_create",
    DDL("ALTER TABLE %(table)s SET name=foo_%(table)s"),
    propagate=True
)

new_metadata = MetaData()
new_table = some_table.to_metadata(new_metadata)
The above
DDL
object will be associated with theDDLEvents.after_create()
event for both thesome_table
and thenew_table
Table
objects.Members
after_create(), after_drop(), after_parent_attach(), before_create(), before_drop(), before_parent_attach(), column_reflect(), dispatch
Class signature
class sqlalchemy.events.DDLEvents (sqlalchemy.event.Events)
-
method
sqlalchemy.events.DDLEvents.
after_create(target: SchemaEventTarget, connection: Connection, **kw: Any) → None¶ Called after CREATE statements are emitted.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeSchemaClassOrObject, 'after_create')
def receive_after_create(target, connection, **kw):
    "listen for the 'after_create' event"

    # ... (event handling logic) ...
- Parameters:
target¶ –
the
SchemaObject
, such as aMetaData
orTable
but also including all create/drop objects such asIndex
,Sequence
, etc., object which is the target of the event.New in version 2.0: Support for all
SchemaItem
objects was added.connection¶ – the
Connection
where the CREATE statement or statements have been emitted.**kw¶ – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
listen()
also accepts thepropagate=True
modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated whenTable.to_metadata()
is used.
-
method
sqlalchemy.events.DDLEvents.
after_drop(target: SchemaEventTarget, connection: Connection, **kw: Any) → None¶ Called after DROP statements are emitted.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeSchemaClassOrObject, 'after_drop')
def receive_after_drop(target, connection, **kw):
    "listen for the 'after_drop' event"

    # ... (event handling logic) ...
- Parameters:
target¶ –
the
SchemaObject
, such as aMetaData
orTable
but also including all create/drop objects such asIndex
,Sequence
, etc., object which is the target of the event.New in version 2.0: Support for all
SchemaItem
objects was added.connection¶ – the
Connection
where the DROP statement or statements have been emitted.**kw¶ – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
listen()
also accepts thepropagate=True
modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated whenTable.to_metadata()
is used.
-
method
sqlalchemy.events.DDLEvents.
after_parent_attach(target: SchemaEventTarget, parent: SchemaItem) → None¶ Called after a
SchemaItem
is associated with a parentSchemaItem
.Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeSchemaClassOrObject, 'after_parent_attach')
def receive_after_parent_attach(target, parent):
    "listen for the 'after_parent_attach' event"

    # ... (event handling logic) ...
listen()
also accepts thepropagate=True
modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated whenTable.to_metadata()
is used.
-
method
sqlalchemy.events.DDLEvents.
before_create(target: SchemaEventTarget, connection: Connection, **kw: Any) → None¶ Called before CREATE statements are emitted.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeSchemaClassOrObject, 'before_create')
def receive_before_create(target, connection, **kw):
    "listen for the 'before_create' event"

    # ... (event handling logic) ...
- Parameters:
  - target¶ – the SchemaObject which is the target of the event, such as a MetaData or Table, but also including all create/drop objects such as Index, Sequence, etc.
    New in version 2.0: Support for all SchemaItem objects was added.
  - connection¶ – the Connection where the CREATE statement or statements will be emitted.
  - **kw¶ – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.
listen() also accepts the insert=True modifier for this event; when True, the listener function will be prepended to the internal list of events upon discovery, and execute before registered listener functions that do not pass this argument.
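The following sketch illustrates the insert=True modifier just described; the MetaData target and listener names are illustrative, and the ordering comment reflects the prepend behavior noted above.

from sqlalchemy import Column, Integer, MetaData, Table, event

metadata = MetaData()
Table("accounts", metadata, Column("id", Integer, primary_key=True))

def standard_listener(target, connection, **kw):
    print("standard listener: CREATE about to be emitted for", target)

def early_listener(target, connection, **kw):
    print("early listener runs first")

event.listen(metadata, "before_create", standard_listener)

# insert=True prepends early_listener, so it executes before
# standard_listener even though it was registered later
event.listen(metadata, "before_create", early_listener, insert=True)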
- method sqlalchemy.events.DDLEvents.before_drop(target: SchemaEventTarget, connection: Connection, **kw: Any) → None¶
Called before DROP statements are emitted.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeSchemaClassOrObject, 'before_drop')
def receive_before_drop(target, connection, **kw):
    "listen for the 'before_drop' event"

    # ... (event handling logic) ...
- Parameters:
  - target¶ – the SchemaObject which is the target of the event, such as a MetaData or Table, but also including all create/drop objects such as Index, Sequence, etc.
    New in version 2.0: Support for all SchemaItem objects was added.
  - connection¶ – the Connection where the DROP statement or statements will be emitted.
  - **kw¶ – additional keyword arguments relevant to the event. The contents of this dictionary may vary across releases, and include the list of tables being generated for a metadata-level event, the checkfirst flag, and other elements used by internal events.
listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.
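As a hedged sketch of using the connection argument described above, the listener below runs an extra statement on the same Connection just before the DROP is emitted; the archive-table SQL and table names are purely illustrative and not part of the event API.

from sqlalchemy import Column, Integer, MetaData, Table, event, text

metadata = MetaData()
accounts = Table("accounts", metadata, Column("id", Integer, primary_key=True))

@event.listens_for(accounts, "before_drop")
def archive_before_drop(target, connection, **kw):
    # the connection argument is the Connection that will emit the DROP,
    # so related SQL can be issued in the same transaction
    connection.execute(
        text("CREATE TABLE accounts_archive AS SELECT * FROM accounts")
    )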
- method sqlalchemy.events.DDLEvents.before_parent_attach(target: SchemaEventTarget, parent: SchemaItem) → None¶
Called before a SchemaItem is associated with a parent SchemaItem.
Example argument forms:
from sqlalchemy import event

@event.listens_for(SomeSchemaClassOrObject, 'before_parent_attach')
def receive_before_parent_attach(target, parent):
    "listen for the 'before_parent_attach' event"

    # ... (event handling logic) ...
listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.
- method sqlalchemy.events.DDLEvents.column_reflect(inspector: Inspector, table: Table, column_info: ReflectedColumn) → None¶
Called for each unit of ‘column info’ retrieved when a Table is being reflected.
from sqlalchemy import event

@event.listens_for(SomeSchemaClassOrObject, 'column_reflect')
def receive_column_reflect(inspector, table, column_info):
    "listen for the 'column_reflect' event"

    # ... (event handling logic) ...
This event is most easily used by applying it to a specific MetaData instance, where it will take effect for all Table objects within that MetaData that undergo reflection:
metadata = MetaData()

@event.listens_for(metadata, 'column_reflect')
def receive_column_reflect(inspector, table, column_info):
    # receives for all Table objects that are reflected
    # under this MetaData
    ...

# will use the above event hook
my_table = Table("my_table", metadata, autoload_with=some_engine)
New in version 1.4.0b2: The DDLEvents.column_reflect() hook may now be applied to a MetaData object as well as the MetaData class itself, where it will take place for all Table objects associated with the targeted MetaData.
It may also be applied to the Table class across the board:
from sqlalchemy import Table

@event.listens_for(Table, 'column_reflect')
def receive_column_reflect(inspector, table, column_info):
    # receives for all Table objects that are reflected
    ...
It can also be applied to a specific Table at the point that one is being reflected using the Table.listeners parameter:
t1 = Table(
    "my_table",
    metadata,
    autoload_with=some_engine,
    listeners=[
        ('column_reflect', receive_column_reflect)
    ],
)
The dictionary of column information as returned by the dialect is passed, and can be modified. The dictionary is that returned in each element of the list returned by Inspector.get_columns():
- name - the column’s name, is applied to the Column.name parameter
- type - the type of this column, which should be an instance of TypeEngine, is applied to the Column.type parameter
- nullable - boolean flag if the column is NULL or NOT NULL, is applied to the Column.nullable parameter
- default - the column’s server default value. This is normally specified as a plain string SQL expression, however the event can pass a FetchedValue, DefaultClause, or text() object as well. Is applied to the Column.server_default parameter
The event is called before any action is taken against this dictionary, and the contents can be modified; the following additional keys may be added to the dictionary to further modify how the Column is constructed:
- key - the string key that will be used to access this Column in the .c collection; will be applied to the Column.key parameter. Is also used for ORM mapping. See the section Automating Column Naming Schemes from Reflected Tables for an example.
- quote - force or un-force quoting on the column name; is applied to the Column.quote parameter.
- info - a dictionary of arbitrary data to follow along with the Column, is applied to the Column.info parameter.
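As a sketch of modifying this dictionary (not part of the upstream reference text), the listener below rewrites the "key" and "info" entries described above for every reflected column; the MetaData target is assumed, and the reflection step itself is only indicated in a comment.

from sqlalchemy import MetaData, event

metadata = MetaData()

@event.listens_for(metadata, "column_reflect")
def normalize_reflected_column(inspector, table, column_info):
    # expose the column as table.c.<lowercase name>, via the "key" entry
    column_info["key"] = column_info["name"].lower()
    # attach arbitrary data to the eventual Column via the "info" entry
    column_info.setdefault("info", {})["reflected"] = True

# reflection itself is assumed, e.g.:
# my_table = Table("my_table", metadata, autoload_with=some_engine)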
listen() also accepts the propagate=True modifier for this event; when True, the listener function will be established for any copies made of the target object, i.e. those copies that are generated when Table.to_metadata() is used.
See also
Automating Column Naming Schemes from Reflected Tables - in the ORM mapping documentation
Intercepting Column Definitions - in the Automap documentation
Reflecting with Database-Agnostic Types - in the Reflecting Database Objects documentation
- attribute sqlalchemy.events.DDLEvents.dispatch: _Dispatch[_ET] = <sqlalchemy.event.base.DDLEventsDispatch object>¶
reference back to the _Dispatch class.
Bidirectional against _Dispatch._events
- class sqlalchemy.events.SchemaEventTarget¶
Base class for elements that are the targets of DDLEvents events.
This includes SchemaItem as well as SchemaType.
Class signature
class sqlalchemy.events.SchemaEventTarget (sqlalchemy.event.registry.EventTarget)