Release: 0.8.7 | Release Date: July 22, 2014

SQLAlchemy 0.8 Documentation

Column and Data Types

SQLAlchemy provides abstractions for most common database data types, and a mechanism for specifying your own custom data types.

The methods and attributes of type objects are rarely used directly. Type objects are supplied to Table definitions and can be supplied as type hints to functions for occasions where the database driver returns an incorrect type.

>>> users = Table('users', metadata,
...               Column('id', Integer, primary_key=True),
...               Column('login', String(32))
...              )

SQLAlchemy will use the Integer and String(32) type information when issuing a CREATE TABLE statement and will use it again when reading back rows SELECTed from the database. Functions that accept a type (such as Column()) will typically accept a type class or instance; Integer is equivalent to Integer() with no construction arguments in this case.

Generic Types

Generic types specify a column that can read, write and store a particular type of Python data. SQLAlchemy will choose the best database column type available on the target database when issuing a CREATE TABLE statement. For complete control over which column type is emitted in CREATE TABLE, such as VARCHAR, see SQL Standard Types and the other sections of this chapter.

class sqlalchemy.types.BigInteger(*args, **kwargs)

Bases: sqlalchemy.types.Integer

A type for bigger int integers.

Typically generates a BIGINT in DDL, and otherwise acts like a normal Integer on the Python side.

class sqlalchemy.types.Boolean(create_constraint=True, name=None)

Bases: sqlalchemy.types.TypeEngine, sqlalchemy.types.SchemaType

A bool datatype.

Boolean typically uses BOOLEAN or SMALLINT on the DDL side, and on the Python side deals in True or False.

__init__(create_constraint=True, name=None)

Construct a Boolean.

Parameters:
  • create_constraint – defaults to True. If the boolean is generated as an int/smallint, also create a CHECK constraint on the table that ensures 1 or 0 as a value.
  • name – if a CHECK constraint is generated, specify the name of the constraint.
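
For illustration only (the table name and constraint name below are hypothetical), a Boolean column that emits a named CHECK constraint on backends where it renders as an integer type might be declared as:

from sqlalchemy import Table, Column, Integer, Boolean, MetaData

metadata = MetaData()

accounts = Table('accounts', metadata,
    Column('id', Integer, primary_key=True),
    # on backends without a native BOOLEAN, a CHECK constraint named
    # 'ck_accounts_is_active' restricting the value to 0 or 1 is also created
    Column('is_active', Boolean(create_constraint=True,
                                name='ck_accounts_is_active'))
)
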
class sqlalchemy.types.Date(*args, **kwargs)

Bases: sqlalchemy.types._DateAffinity, sqlalchemy.types.TypeEngine

A type for datetime.date() objects.

class sqlalchemy.types.DateTime(timezone=False)

Bases: sqlalchemy.types._DateAffinity, sqlalchemy.types.TypeEngine

A type for datetime.datetime() objects.

Date and time types return objects from the Python datetime module. Most DBAPIs have built in support for the datetime module, with the noted exception of SQLite. In the case of SQLite, date and time types are stored as strings which are then converted back to datetime objects when rows are returned.

__init__(timezone=False)

Construct a new DateTime.

Parameters:timezone – boolean. If True, and supported by the backend, will produce ‘TIMESTAMP WITH TIMEZONE’. For backends that don’t support timezone aware timestamps, has no effect.
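
As a brief sketch (the table and column names here are illustrative), a table mixing a Date column with a timezone-aware DateTime column:

import datetime
from sqlalchemy import Table, Column, Integer, Date, DateTime, MetaData

metadata = MetaData()

events = Table('events', metadata,
    Column('id', Integer, primary_key=True),
    Column('event_date', Date),
    # renders TIMESTAMP WITH TIMEZONE on backends that support it
    Column('created_at', DateTime(timezone=True),
           default=datetime.datetime.utcnow)
)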

class sqlalchemy.types.Enum(*enums, **kw)

Bases: sqlalchemy.types.String, sqlalchemy.types.SchemaType

Generic Enum Type.

The Enum type provides a set of possible string values which the column is constrained towards.

By default, uses the backend’s native ENUM type if available, else uses VARCHAR + a CHECK constraint.

See also

ENUM - PostgreSQL-specific type, which has additional functionality.

__init__(*enums, **kw)

Construct an enum.

Keyword arguments which don’t apply to a specific backend are ignored by that backend.

Parameters:
  • *enums – string or unicode enumeration labels. If unicode labels are present, the convert_unicode flag is auto-enabled.
  • convert_unicode – Enable unicode-aware bind parameter and result-set processing for this Enum’s data. This is set automatically based on the presence of unicode label strings.
  • metadata – Associate this type directly with a MetaData object. For types that exist on the target database as an independent schema construct (Postgresql), this type will be created and dropped within create_all() and drop_all() operations. If the type is not associated with any MetaData object, it will associate itself with each Table in which it is used, and will be created when any of those individual tables are created, after a check is performed for its existence. The type is only dropped when drop_all() is called for that Table object’s metadata, however.
  • name – The name of this type. This is required for Postgresql and any future supported database which requires an explicitly named type, or an explicitly named constraint in order to generate the type and/or a table that uses it.
  • native_enum – Use the database’s native ENUM type when available. Defaults to True. When False, uses VARCHAR + check constraint for all backends.
  • schema

    Schema name of this type. For types that exist on the target database as an independent schema construct (Postgresql), this parameter specifies the named schema in which the type is present.

    Note

    The schema of the Enum type does not by default make use of the schema established on the owning Table. If this behavior is desired, set the inherit_schema flag to True.

  • quote – Force quoting to be on or off on the type’s name. If left as the default of None, the usual schema-level “case sensitive”/”reserved name” rules are used to determine if this type’s name should be quoted.
  • inherit_schema

    When True, the “schema” from the owning Table will be copied to the “schema” attribute of this Enum, replacing whatever value was passed for the schema attribute. This also takes effect when using the Table.tometadata() operation.

    New in version 0.8.
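
A usage sketch (the labels and type name are illustrative); on backends with a native ENUM type, such as Postgresql, that type is used, otherwise a VARCHAR plus CHECK constraint is generated:

from sqlalchemy import Table, Column, Integer, Enum, MetaData

metadata = MetaData()

posts = Table('posts', metadata,
    Column('id', Integer, primary_key=True),
    # 'post_status' names the ENUM type (or CHECK constraint) in DDL
    Column('status', Enum('draft', 'published', 'archived',
                          name='post_status'))
)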

create(bind=None, checkfirst=False)

Issue CREATE ddl for this type, if applicable.

drop(bind=None, checkfirst=False)

Issue DROP ddl for this type, if applicable.

class sqlalchemy.types.Float(precision=None, asdecimal=False, **kwargs)

Bases: sqlalchemy.types.Numeric

A type for float numbers.

Returns Python float objects by default, applying conversion as needed.

__init__(precision=None, asdecimal=False, **kwargs)

Construct a Float.

Parameters:
  • precision – the numeric precision for use in DDL CREATE TABLE.
  • asdecimal – the same flag as that of Numeric, but defaults to False. Note that setting this flag to True results in floating point conversion.
  • **kwargs – deprecated. Additional arguments here are ignored by the default Float type. For database specific floats that support additional arguments, see that dialect’s documentation for details, such as sqlalchemy.dialects.mysql.FLOAT.
class sqlalchemy.types.Integer(*args, **kwargs)

Bases: sqlalchemy.types._DateAffinity, sqlalchemy.types.TypeEngine

A type for int integers.

class sqlalchemy.types.Interval(native=True, second_precision=None, day_precision=None)

Bases: sqlalchemy.types._DateAffinity, sqlalchemy.types.TypeDecorator

A type for datetime.timedelta() objects.

The Interval type deals with datetime.timedelta objects. In PostgreSQL, the native INTERVAL type is used; for others, the value is stored as a date which is relative to the “epoch” (Jan. 1, 1970).

Note that the Interval type does not currently provide date arithmetic operations on platforms which do not support interval types natively. Such operations usually require transformation of both sides of the expression (such as, conversion of both sides into integer epoch values first) which currently is a manual procedure (such as via func).

__init__(native=True, second_precision=None, day_precision=None)

Construct an Interval object.

Parameters:
  • native – when True, use the actual INTERVAL type provided by the database, if supported (currently Postgresql, Oracle). Otherwise, represent the interval data as an epoch value regardless.
  • second_precision – For native interval types which support a “fractional seconds precision” parameter, i.e. Oracle and Postgresql
  • day_precision – for native interval types which support a “day precision” parameter, i.e. Oracle.
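
A minimal sketch (table and column names are hypothetical) that stores datetime.timedelta values, using the native INTERVAL type where available:

import datetime
from sqlalchemy import Table, Column, Integer, Interval, MetaData

metadata = MetaData()

jobs = Table('jobs', metadata,
    Column('id', Integer, primary_key=True),
    # native INTERVAL on Postgresql/Oracle, epoch-relative DateTime elsewhere
    Column('run_time', Interval)
)

# bound values and result values are datetime.timedelta objects
stmt = jobs.insert().values(run_time=datetime.timedelta(minutes=90))
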
coerce_compared_value(op, value)

See TypeEngine.coerce_compared_value() for a description.

impl

alias of DateTime

class sqlalchemy.types.LargeBinary(length=None)

Bases: sqlalchemy.types._Binary

A type for large binary byte data.

The Binary type generates BLOB or BYTEA when tables are created, and also converts incoming values using the Binary callable provided by each DB-API.

__init__(length=None)

Construct a LargeBinary type.

Parameters:length – optional, a length for the column for use in DDL statements, for those BLOB types that accept a length (i.e. MySQL). It does not produce a small BINARY/VARBINARY type - use the BINARY/VARBINARY types specifically for those. May be safely omitted if no CREATE TABLE will be issued. Certain databases may require a length for use in DDL, and will raise an exception when the CREATE TABLE DDL is issued.
class sqlalchemy.types.Numeric(precision=None, scale=None, asdecimal=True)

Bases: sqlalchemy.types._DateAffinity, sqlalchemy.types.TypeEngine

A type for fixed precision numbers.

Typically generates DECIMAL or NUMERIC. Returns decimal.Decimal objects by default, applying conversion as needed.

Note

The cdecimal library is a high-performing alternative to Python’s built-in decimal.Decimal type, which performs very poorly in high volume situations. SQLAlchemy 0.7 is tested against cdecimal and supports it fully. The type is not necessarily supported by DBAPI implementations, however, most of which contain an import for plain decimal in their source code, even though some, such as psycopg2, provide hooks for alternate adapters. SQLAlchemy imports decimal globally as well. The most straightforward and foolproof way to use “cdecimal” given current DBAPI and Python support is to patch it directly into sys.modules before anything else is imported:

import sys
import cdecimal
sys.modules["decimal"] = cdecimal

While the global patch is a little ugly, it’s particularly important to use just one decimal library at a time since Python Decimal and cdecimal Decimal objects are not currently compatible with each other:

>>> import cdecimal
>>> import decimal
>>> decimal.Decimal("10") == cdecimal.Decimal("10")
False

SQLAlchemy will provide more natural support of cdecimal if and when it becomes a standard part of Python installations and is supported by all DBAPIs.

__init__(precision=None, scale=None, asdecimal=True)

Construct a Numeric.

Parameters:
  • precision – the numeric precision for use in DDL CREATE TABLE.
  • scale – the numeric scale for use in DDL CREATE TABLE.
  • asdecimal – default True. Return whether or not values should be sent as Python Decimal objects, or as floats. Different DBAPIs send one or the other based on datatypes - the Numeric type will ensure that return values are one or the other across DBAPIs consistently.

When using the Numeric type, care should be taken to ensure that the asdecimal setting is appropriate for the DBAPI in use - when Numeric applies a conversion from Decimal->float or float->Decimal, this conversion incurs an additional performance overhead for all result columns received.

DBAPIs that return Decimal natively (e.g. psycopg2) will have better accuracy and higher performance with a setting of True, as the native translation to Decimal reduces the amount of floating-point issues at play, and the Numeric type itself doesn’t need to apply any further conversions. However, another DBAPI which returns floats natively will incur an additional conversion overhead, and is still subject to floating point data loss - in which case asdecimal=False will at least remove the extra conversion overhead.
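
For example (the column names are illustrative), the same kind of column declared with and without Decimal conversion:

from sqlalchemy import Column, Numeric

# result values are returned as decimal.Decimal (the default)
price = Column('price', Numeric(precision=10, scale=2))

# result values are coerced to float, skipping Decimal conversion overhead
ratio = Column('ratio', Numeric(precision=10, scale=4, asdecimal=False))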

class sqlalchemy.types.PickleType(protocol=2, pickler=None, comparator=None)

Bases: sqlalchemy.types.TypeDecorator

Holds Python objects, which are serialized using pickle.

PickleType builds upon the Binary type to apply Python’s pickle.dumps() to incoming objects, and pickle.loads() on the way out, allowing any pickleable Python object to be stored as a serialized binary field.

To allow ORM change events to propagate for elements associated with PickleType, see Mutation Tracking.

__init__(protocol=2, pickler=None, comparator=None)

Construct a PickleType.

Parameters:
  • protocol – defaults to pickle.HIGHEST_PROTOCOL.
  • pickler – defaults to cPickle.pickle or pickle.pickle if cPickle is not available. May be any object with pickle-compatible dumps and loads methods.
  • comparator – a 2-arg callable predicate used to compare values of this type. If left as None, the Python “equals” operator is used to compare values.
impl

alias of LargeBinary
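
A short sketch (the table and column names are hypothetical); any pickleable Python object can then be assigned to the column:

import pickle
from sqlalchemy import Table, Column, Integer, PickleType, MetaData

metadata = MetaData()

sessions = Table('sessions', metadata,
    Column('id', Integer, primary_key=True),
    # values are stored via pickle.dumps() into a LargeBinary column
    # and restored via pickle.loads() on the way out
    Column('preferences', PickleType(protocol=pickle.HIGHEST_PROTOCOL))
)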

class sqlalchemy.types.SchemaType(**kw)

Bases: sqlalchemy.events.SchemaEventTarget

Mark a type as possibly requiring schema-level DDL for usage.

Supports types that must be explicitly created/dropped (i.e. PG ENUM type) as well as types that are complemented by table or schema level constraints, triggers, and other rules.

SchemaType classes can also be targets for the DDLEvents.before_parent_attach() and DDLEvents.after_parent_attach() events, where the events fire off surrounding the association of the type object with a parent Column.

See also

Enum

Boolean

adapt(impltype, **kw)
bind
copy(**kw)
create(bind=None, checkfirst=False)

Issue CREATE ddl for this type, if applicable.

drop(bind=None, checkfirst=False)

Issue DROP ddl for this type, if applicable.

class sqlalchemy.types.SmallInteger(*args, **kwargs)

Bases: sqlalchemy.types.Integer

A type for smaller int integers.

Typically generates a SMALLINT in DDL, and otherwise acts like a normal Integer on the Python side.

class sqlalchemy.types.String(length=None, collation=None, convert_unicode=False, unicode_error=None, _warn_on_bytestring=False)

Bases: sqlalchemy.types.Concatenable, sqlalchemy.types.TypeEngine

The base for all string and character types.

In SQL, corresponds to VARCHAR. Can also take Python unicode objects and encode to the database’s encoding in bind params (and the reverse for result sets.)

The length field is usually required when the String type is used within a CREATE TABLE statement, as VARCHAR requires a length on most databases.

__init__(length=None, collation=None, convert_unicode=False, unicode_error=None, _warn_on_bytestring=False)

Create a string-holding type.

Parameters:
  • length – optional, a length for the column for use in DDL and CAST expressions. May be safely omitted if no CREATE TABLE will be issued. Certain databases may require a length for use in DDL, and will raise an exception when the CREATE TABLE DDL is issued if a VARCHAR with no length is included. Whether the value is interpreted as bytes or characters is database specific.
  • collation

    Optional, a column-level collation for use in DDL and CAST expressions. Renders using the COLLATE keyword supported by SQLite, MySQL, and Postgresql. E.g.:

    >>> from sqlalchemy import cast, select, String
    >>> print select([cast('some string', String(collation='utf8'))])
    SELECT CAST(:param_1 AS VARCHAR COLLATE utf8) AS anon_1

    New in version 0.8: Added support for COLLATE to all string types.

  • convert_unicode

    When set to True, the String type will assume that input is to be passed as Python unicode objects, and results returned as Python unicode objects. If the DBAPI in use does not support Python unicode (which is fewer and fewer these days), SQLAlchemy will encode/decode the value, using the value of the encoding parameter passed to create_engine() as the encoding.

    When using a DBAPI that natively supports Python unicode objects, this flag generally does not need to be set. For columns that are explicitly intended to store non-ASCII data, the Unicode or UnicodeText types should be used regardless, which feature the same behavior of convert_unicode but also indicate an underlying column type that directly supports unicode, such as NVARCHAR.

    For the extremely rare case that Python unicode is to be encoded/decoded by SQLAlchemy on a backend that does natively support Python unicode, the value force can be passed here which will cause SQLAlchemy’s encode/decode services to be used unconditionally.

  • unicode_error – Optional, a method to use to handle Unicode conversion errors. Behaves like the errors keyword argument to the standard library’s string.decode() functions. This flag requires that convert_unicode is set to force - otherwise, SQLAlchemy is not guaranteed to handle the task of unicode conversion. Note that this flag adds significant performance overhead to row-fetching operations for backends that already return unicode objects natively (which most DBAPIs do). This flag should only be used as a last resort for reading strings from a column with varied or corrupted encodings.
class sqlalchemy.types.Text(length=None, collation=None, convert_unicode=False, unicode_error=None, _warn_on_bytestring=False)

Bases: sqlalchemy.types.String

A variably sized string type.

In SQL, usually corresponds to CLOB or TEXT. Can also take Python unicode objects and encode to the database’s encoding in bind params (and the reverse for result sets.) In general, TEXT objects do not have a length; while some databases will accept a length argument here, it will be rejected by others.

class sqlalchemy.types.Time(timezone=False)

Bases: sqlalchemy.types._DateAffinity, sqlalchemy.types.TypeEngine

A type for datetime.time() objects.

class sqlalchemy.types.Unicode(length=None, **kwargs)

Bases: sqlalchemy.types.String

A variable length Unicode string type.

The Unicode type is a String subclass that assumes input and output as Python unicode data, and in that regard is equivalent to the usage of the convert_unicode flag with the String type. However, unlike plain String, it also implies an underlying column type that is explicitly supporting of non-ASCII data, such as NVARCHAR on Oracle and SQL Server. This can impact the output of CREATE TABLE statements and CAST functions at the dialect level, and can also affect the handling of bound parameters in some specific DBAPI scenarios.

The encoding used by the Unicode type is usually determined by the DBAPI itself; most modern DBAPIs feature support for Python unicode objects as bound values and result set values, and the encoding should be configured as detailed in the notes for the target DBAPI in the Dialects section.

For those DBAPIs which do not support, or are not configured to accommodate Python unicode objects directly, SQLAlchemy does the encoding and decoding outside of the DBAPI. The encoding in this scenario is determined by the encoding flag passed to create_engine().

When using the Unicode type, it is only appropriate to pass Python unicode objects, and not plain str. If a plain str is passed under Python 2, a warning is emitted. If you notice your application emitting these warnings but you’re not sure of the source of them, the Python warnings filter, documented at http://docs.python.org/library/warnings.html, can be used to turn these warnings into exceptions which will illustrate a stack trace:

import warnings
warnings.simplefilter('error')

For an application that wishes to pass plain bytestrings and Python unicode objects to the Unicode type equally, the bytestrings must first be decoded into unicode. The recipe at Coercing Encoded Strings to Unicode illustrates how this is done.

See also:

UnicodeText - unbounded-length textual counterpart to Unicode.
__init__(length=None, **kwargs)

Create a Unicode object.

Parameters are the same as that of String, with the exception that convert_unicode defaults to True.

class sqlalchemy.types.UnicodeText(length=None, **kwargs)

Bases: sqlalchemy.types.Text

An unbounded-length Unicode string type.

See Unicode for details on the unicode behavior of this object.

Like Unicode, usage of the UnicodeText type implies a unicode-capable type being used on the backend, such as NCLOB, NTEXT.

__init__(length=None, **kwargs)

Create a Unicode-converting Text type.

Parameters are the same as that of Text, with the exception that convert_unicode defaults to True.

SQL Standard Types

The SQL standard types always create database column types of the same name when CREATE TABLE is issued. Some types may not be supported on all databases.

class sqlalchemy.types.BIGINT(*args, **kwargs)

Bases: sqlalchemy.types.BigInteger

The SQL BIGINT type.

class sqlalchemy.types.BINARY(length=None)

Bases: sqlalchemy.types._Binary

The SQL BINARY type.

class sqlalchemy.types.BLOB(length=None)

Bases: sqlalchemy.types.LargeBinary

The SQL BLOB type.

class sqlalchemy.types.BOOLEAN(create_constraint=True, name=None)

Bases: sqlalchemy.types.Boolean

The SQL BOOLEAN type.

class sqlalchemy.types.CHAR(length=None, collation=None, convert_unicode=False, unicode_error=None, _warn_on_bytestring=False)

Bases: sqlalchemy.types.String

The SQL CHAR type.

class sqlalchemy.types.CLOB(length=None, collation=None, convert_unicode=False, unicode_error=None, _warn_on_bytestring=False)

Bases: sqlalchemy.types.Text

The CLOB type.

This type is found in Oracle and Informix.

class sqlalchemy.types.DATE(*args, **kwargs)

Bases: sqlalchemy.types.Date

The SQL DATE type.

class sqlalchemy.types.DATETIME(timezone=False)

Bases: sqlalchemy.types.DateTime

The SQL DATETIME type.

class sqlalchemy.types.DECIMAL(precision=None, scale=None, asdecimal=True)

Bases: sqlalchemy.types.Numeric

The SQL DECIMAL type.

class sqlalchemy.types.FLOAT(precision=None, asdecimal=False, **kwargs)

Bases: sqlalchemy.types.Float

The SQL FLOAT type.

sqlalchemy.types.INT

alias of INTEGER

class sqlalchemy.types.INTEGER(*args, **kwargs)

Bases: sqlalchemy.types.Integer

The SQL INT or INTEGER type.

class sqlalchemy.types.NCHAR(length=None, **kwargs)

Bases: sqlalchemy.types.Unicode

The SQL NCHAR type.

class sqlalchemy.types.NVARCHAR(length=None, **kwargs)

Bases: sqlalchemy.types.Unicode

The SQL NVARCHAR type.

class sqlalchemy.types.NUMERIC(precision=None, scale=None, asdecimal=True)

Bases: sqlalchemy.types.Numeric

The SQL NUMERIC type.

class sqlalchemy.types.REAL(precision=None, asdecimal=False, **kwargs)

Bases: sqlalchemy.types.Float

The SQL REAL type.

class sqlalchemy.types.SMALLINT(*args, **kwargs)

Bases: sqlalchemy.types.SmallInteger

The SQL SMALLINT type.

class sqlalchemy.types.TEXT(length=None, collation=None, convert_unicode=False, unicode_error=None, _warn_on_bytestring=False)

Bases: sqlalchemy.types.Text

The SQL TEXT type.

class sqlalchemy.types.TIME(timezone=False)

Bases: sqlalchemy.types.Time

The SQL TIME type.

class sqlalchemy.types.TIMESTAMP(timezone=False)

Bases: sqlalchemy.types.DateTime

The SQL TIMESTAMP type.

class sqlalchemy.types.VARBINARY(length=None)

Bases: sqlalchemy.types._Binary

The SQL VARBINARY type.

class sqlalchemy.types.VARCHAR(length=None, collation=None, convert_unicode=False, unicode_error=None, _warn_on_bytestring=False)

Bases: sqlalchemy.types.String

The SQL VARCHAR type.

Vendor-Specific Types

Database-specific types are also available for import from each database’s dialect module. See the Dialects reference for the database you’re interested in.

For example, MySQL has a BIGINT type and PostgreSQL has an INET type. To use these, import them from the module explicitly:

from sqlalchemy.dialects import mysql

table = Table('foo', metadata,
    Column('id', mysql.BIGINT),
    Column('enumerates', mysql.ENUM('a', 'b', 'c'))
)

Or some PostgreSQL types:

from sqlalchemy.dialects import postgresql

table = Table('foo', metadata,
    Column('ipaddress', postgresql.INET),
    Column('elements', postgresql.ARRAY(String))
)

Each dialect provides the full set of typenames supported by that backend within its __all__ collection, so that a simple import * or similar will import all supported types as implemented for that backend:

from sqlalchemy.dialects.postgresql import *

t = Table('mytable', metadata,
           Column('id', INTEGER, primary_key=True),
           Column('name', VARCHAR(300)),
           Column('inetaddr', INET)
)

Where above, the INTEGER and VARCHAR types are ultimately from sqlalchemy.types, and INET is specific to the Postgresql dialect.

Some dialect level types have the same name as the SQL standard type, but also provide additional arguments. For example, MySQL implements the full range of character and string types including additional arguments such as collation and charset:

from sqlalchemy.dialects.mysql import VARCHAR, TEXT

table = Table('foo', meta,
    Column('col1', VARCHAR(200, collation='binary')),
    Column('col2', TEXT(charset='latin1'))
)

Custom Types

A variety of methods exist to redefine the behavior of existing types as well as to provide new ones.

Overriding Type Compilation

A frequent need is to force the “string” version of a type, that is, the one rendered in a CREATE TABLE statement or other SQL function like CAST, to be changed. For example, an application may want to force the rendering of BINARY for all platforms except for one, in which it wants BLOB to be rendered. Usage of an existing generic type, in this case LargeBinary, is preferred for most use cases. But to control types more accurately, a compilation directive that is per-dialect can be associated with any type:

from sqlalchemy.ext.compiler import compiles
from sqlalchemy.types import BINARY

@compiles(BINARY, "sqlite")
def compile_binary_sqlite(type_, compiler, **kw):
    return "BLOB"

The above code allows the usage of types.BINARY, which will produce the string BINARY against all backends except SQLite, in which case it will produce BLOB.

See the section Changing Compilation of Types, a subsection of Custom SQL Constructs and Compilation Extension, for additional examples.

Augmenting Existing Types

The TypeDecorator allows the creation of custom types which add bind-parameter and result-processing behavior to an existing type object. It is used when additional in-Python marshaling of data to and from the database is required.

Note

The bind- and result-processing of TypeDecorator is in addition to the processing already performed by the hosted type, which is customized by SQLAlchemy on a per-DBAPI basis to perform processing specific to that DBAPI. To change the DBAPI-level processing for an existing type, see the section Replacing the Bind/Result Processing of Existing Types.

class sqlalchemy.types.TypeDecorator(*args, **kwargs)

Bases: sqlalchemy.types.TypeEngine

Allows the creation of types which add additional functionality to an existing type.

This method is preferred to direct subclassing of SQLAlchemy’s built-in types as it ensures that all required functionality of the underlying type is kept in place.

Typical usage:

import sqlalchemy.types as types

class MyType(types.TypeDecorator):
    '''Prefixes Unicode values with "PREFIX:" on the way in and
    strips it off on the way out.
    '''

    impl = types.Unicode

    def process_bind_param(self, value, dialect):
        return "PREFIX:" + value

    def process_result_value(self, value, dialect):
        return value[7:]

    def copy(self):
        return MyType(self.impl.length)

The class-level “impl” attribute is required, and can reference any TypeEngine class. Alternatively, the load_dialect_impl() method can be used to provide different type classes based on the dialect given; in this case, the “impl” variable can reference TypeEngine as a placeholder.

Types that receive a Python type that isn’t similar to the ultimate type used may want to define the TypeDecorator.coerce_compared_value() method. This is used to give the expression system a hint when coercing Python objects into bind parameters within expressions. Consider this expression:

mytable.c.somecol + datetime.date(2009, 5, 15)

Above, if “somecol” is an Integer variant, it makes sense that we’re doing date arithmetic, where above is usually interpreted by databases as adding a number of days to the given date. The expression system does the right thing by not attempting to coerce the “date()” value into an integer-oriented bind parameter.

However, in the case of TypeDecorator, we are usually changing an incoming Python type to something new - TypeDecorator by default will “coerce” the non-typed side to be the same type as itself. For example, below we define an “epoch” type that stores a date value as an integer:

class MyEpochType(types.TypeDecorator):
    impl = types.Integer

    epoch = datetime.date(1970, 1, 1)

    def process_bind_param(self, value, dialect):
        return (value - self.epoch).days

    def process_result_value(self, value, dialect):
        return self.epoch + timedelta(days=value)

Our expression of somecol + date with the above type will coerce the “date” on the right side to also be treated as MyEpochType.

This behavior can be overridden via the coerce_compared_value() method, which returns a type that should be used for the value of the expression. Below we set it such that an integer value will be treated as an Integer, and any other value is assumed to be a date and will be treated as a MyEpochType:

def coerce_compared_value(self, op, value):
    if isinstance(value, int):
        return Integer()
    else:
        return self
__init__(*args, **kwargs)

Construct a TypeDecorator.

Arguments sent here are passed to the constructor of the class assigned to the impl class level attribute, assuming the impl is a callable, and the resulting object is assigned to the self.impl instance attribute (thus overriding the class attribute of the same name).

If the class level impl is not a callable (the unusual case), it will be assigned to the same instance attribute ‘as-is’, ignoring those arguments passed to the constructor.

Subclasses can override this to customize the generation of self.impl entirely.

adapt(cls, **kw)
inherited from the adapt() method of TypeEngine

Produce an “adapted” form of this type, given an “impl” class to work with.

This method is used internally to associate generic types with “implementation” types that are specific to a particular dialect.

bind_expression(bindvalue)
inherited from the bind_expression() method of TypeEngine

Given a bind value (i.e. a BindParameter instance), return a SQL expression in its place.

This is typically a SQL function that wraps the existing bound parameter within the statement. It is used for special data types that require literals being wrapped in some special database function in order to coerce an application-level value into a database-specific format. It is the SQL analogue of the TypeEngine.bind_processor() method.

The method is evaluated at statement compile time, as opposed to statement construction time.

Note that this method, when implemented, should always return the exact same structure, without any conditional logic, as it may be used in an executemany() call against an arbitrary number of bound parameter sets.

See also:

Applying SQL-level Bind/Result Processing

bind_processor(dialect)

Provide a bound value processing function for the given Dialect.

This is the method that fulfills the TypeEngine contract for bound value conversion. TypeDecorator will wrap a user-defined implementation of process_bind_param() here.

User-defined code can override this method directly, though it’s likely best to use process_bind_param() so that the processing provided by self.impl is maintained.

Parameters:dialect – Dialect instance in use.

This method is the reverse counterpart to the result_processor() method of this class.

coerce_compared_value(op, value)

Suggest a type for a ‘coerced’ Python value in an expression.

By default, returns self. This method is called by the expression system when an object using this type is on the left or right side of an expression against a plain Python object which does not yet have a SQLAlchemy type assigned:

expr = table.c.somecolumn + 35

Where above, if somecolumn uses this type, this method will be called with the value operator.add and 35. The return value is whatever SQLAlchemy type should be used for 35 for this particular operation.

coerce_to_is_types = (<type 'NoneType'>,)

Specify those Python types which should be coerced at the expression level to “IS <constant>” when compared using == (and same for IS NOT in conjunction with !=).

For most SQLAlchemy types, this includes NoneType, as well as bool.

TypeDecorator modifies this list to only include NoneType, as typedecorator implementations that deal with boolean types are common.

Custom TypeDecorator classes can override this attribute to return an empty tuple, in which case no values will be coerced to constants.

New in version 0.8.2: Added TypeDecorator.coerce_to_is_types to allow for easier control of __eq__() and __ne__() operations.
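
As a brief sketch of overriding this attribute (the type below is hypothetical), a TypeDecorator for which comparisons to None render as a bound parameter rather than the IS construct:

from sqlalchemy.types import TypeDecorator, Integer

class NullComparableInt(TypeDecorator):
    """Hypothetical type; comparisons against None are emitted as
    '= :param' rather than 'IS NULL'."""

    impl = Integer

    # don't coerce any Python value (including None) to IS / IS NOT
    coerce_to_is_types = ()
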
column_expression(colexpr)
inherited from the column_expression() method of TypeEngine

Given a SELECT column expression, return a wrapping SQL expression.

This is typically a SQL function that wraps a column expression as rendered in the columns clause of a SELECT statement. It is used for special data types that require columns to be wrapped in some special database function in order to coerce the value before being sent back to the application. It is the SQL analogue of the TypeEngine.result_processor() method.

The method is evaluated at statement compile time, as opposed to statement construction time.

See also:

Applying SQL-level Bind/Result Processing

compare_values(x, y)

Given two values, compare them for equality.

By default this calls upon TypeEngine.compare_values() of the underlying “impl”, which in turn usually uses the Python equals operator ==.

This function is used by the ORM to compare an original-loaded value with an intercepted “changed” value, to determine if a net change has occurred.

compile(dialect=None)
inherited from the compile() method of TypeEngine

Produce a string-compiled form of this TypeEngine.

When called with no arguments, uses a “default” dialect to produce a string result.

Parameters:dialect – a Dialect instance.
copy()

Produce a copy of this TypeDecorator instance.

This is a shallow copy and is provided to fulfill part of the TypeEngine contract. It usually does not need to be overridden unless the user-defined TypeDecorator has local state that should be deep-copied.

dialect_impl(dialect)
inherited from the dialect_impl() method of TypeEngine

Return a dialect-specific implementation for this TypeEngine.

get_dbapi_type(dbapi)

Return the DBAPI type object represented by this TypeDecorator.

By default this calls upon TypeEngine.get_dbapi_type() of the underlying “impl”.

load_dialect_impl(dialect)

Return a TypeEngine object corresponding to a dialect.

This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

By default returns self.impl.

process_bind_param(value, dialect)

Receive a bound parameter value to be converted.

Subclasses override this method to return the value that should be passed along to the underlying TypeEngine object, and from there to the DBAPI execute() method.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

This operation should be designed with the reverse operation in mind, which would be the process_result_value method of this class.

Parameters:
  • value – Data to operate upon, of any type expected by this method in the subclass. Can be None.
  • dialect – the Dialect in use.
process_result_value(value, dialect)

Receive a result-row column value to be converted.

Subclasses should implement this method to operate on data fetched from the database.

Subclasses override this method to return the value that should be passed back to the application, given a value that is already processed by the underlying TypeEngine object, originally from the DBAPI cursor method fetchone() or similar.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

Parameters:
  • value – Data to operate upon, of any type expected by this method in the subclass. Can be None.
  • dialect – the Dialect in use.

This operation should be designed to be reversible by the “process_bind_param” method of this class.

python_type
inherited from the python_type attribute of TypeEngine

Return the Python type object expected to be returned by instances of this type, if known.

Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int for example), this will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL which means you can also get back None from any type in practice.

result_processor(dialect, coltype)

Provide a result value processing function for the given Dialect.

This is the method that fulfills the TypeEngine contract for result value conversion. TypeDecorator will wrap a user-defined implementation of process_result_value() here.

User-defined code can override this method directly, though it’s likely best to use process_result_value() so that the processing provided by self.impl is maintained.

Parameters:
  • dialect – Dialect instance in use.
  • coltype – An SQLAlchemy data type

This method is the reverse counterpart to the bind_processor() method of this class.

type_engine(dialect)

Return a dialect-specific TypeEngine instance for this TypeDecorator.

In most cases this returns a dialect-adapted form of the TypeEngine type represented by self.impl. Makes usage of dialect_impl() but also traverses into wrapped TypeDecorator instances. Behavior can be customized here by overriding load_dialect_impl().

with_variant(type_, dialect_name)
inherited from the with_variant() method of TypeEngine

Produce a new type object that will utilize the given type when applied to the dialect of the given name.

e.g.:

from sqlalchemy.types import String
from sqlalchemy.dialects import mysql

s = String()

s = s.with_variant(mysql.VARCHAR(collation='foo'), 'mysql')

The construction of TypeEngine.with_variant() is always from the “fallback” type to that which is dialect specific. The returned type is an instance of Variant, which itself provides a with_variant() that can be called repeatedly.

Parameters:
  • type_ – a TypeEngine that will be selected as a variant from the originating type, when a dialect of the given name is in use.
  • dialect_name – base name of the dialect which uses this type. (i.e. 'postgresql', 'mysql', etc.)

New in version 0.7.2.
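
As a further sketch (the table below is illustrative), a variant type used within a Table definition:

from sqlalchemy import Table, Column, Integer, String, MetaData
from sqlalchemy.dialects import mysql

metadata = MetaData()

documents = Table('documents', metadata,
    Column('id', Integer, primary_key=True),
    # VARCHAR(255) on most backends, LONGTEXT when the MySQL dialect is in use
    Column('body', String(255).with_variant(mysql.LONGTEXT(), 'mysql'))
)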

TypeDecorator Recipes

A few key TypeDecorator recipes follow.

Coercing Encoded Strings to Unicode

A common source of confusion regarding the Unicode type is that it is intended to deal only with Python unicode objects on the Python side, meaning values passed to it as bind parameters must be of the form u'some string' if using Python 2 and not 3. The encoding/decoding functions it performs are only to suit what the DBAPI in use requires, and are primarily a private implementation detail.

The use case of a type that can safely receive Python bytestrings, that is, strings that contain non-ASCII characters and are not u'' objects in Python 2, can be achieved using a TypeDecorator which coerces as needed:

from sqlalchemy.types import TypeDecorator, Unicode

class CoerceUTF8(TypeDecorator):
    """Safely coerce Python bytestrings to Unicode
    before passing off to the database."""

    impl = Unicode

    def process_bind_param(self, value, dialect):
        if isinstance(value, str):
            value = value.decode('utf-8')
        return value

Rounding Numerics

Some database connectors like those of SQL Server choke if a Decimal is passed with too many decimal places. Here’s a recipe that rounds them down:

from sqlalchemy.types import TypeDecorator, Numeric
from decimal import Decimal

class SafeNumeric(TypeDecorator):
    """Adds quantization to Numeric."""

    impl = Numeric

    def __init__(self, *arg, **kw):
        TypeDecorator.__init__(self, *arg, **kw)
        self.quantize_int = -(self.impl.precision - self.impl.scale)
        self.quantize = Decimal(10) ** self.quantize_int

    def process_bind_param(self, value, dialect):
        if isinstance(value, Decimal) and \
            value.as_tuple()[2] < self.quantize_int:
            value = value.quantize(self.quantize)
        return value
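
A brief usage sketch (the table and values are illustrative):

from decimal import Decimal
from sqlalchemy import Table, Column, Integer, MetaData

metadata = MetaData()

ledger = Table('ledger', metadata,
    Column('id', Integer, primary_key=True),
    Column('amount', SafeNumeric(10, 2))
)

# an overly precise value such as Decimal('1.000000000001') is
# quantized by process_bind_param() before being sent to the DBAPI
stmt = ledger.insert().values(amount=Decimal('1.000000000001'))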

Backend-agnostic GUID Type

Receives and returns Python uuid() objects. Uses the PG UUID type when using Postgresql, CHAR(32) on other backends, storing them in stringified hex format. Can be modified to store binary in CHAR(16) if desired:

from sqlalchemy.types import TypeDecorator, CHAR
from sqlalchemy.dialects.postgresql import UUID
import uuid

class GUID(TypeDecorator):
    """Platform-independent GUID type.

    Uses Postgresql's UUID type, otherwise uses
    CHAR(32), storing as stringified hex values.

    """
    impl = CHAR

    def load_dialect_impl(self, dialect):
        if dialect.name == 'postgresql':
            return dialect.type_descriptor(UUID())
        else:
            return dialect.type_descriptor(CHAR(32))

    def process_bind_param(self, value, dialect):
        if value is None:
            return value
        elif dialect.name == 'postgresql':
            return str(value)
        else:
            if not isinstance(value, uuid.UUID):
                return "%.32x" % uuid.UUID(value).int
            else:
                # value is already a uuid.UUID; format its 128-bit
                # integer representation as a 32-character hex string
                return "%.32x" % value.int

    def process_result_value(self, value, dialect):
        if value is None:
            return value
        else:
            return uuid.UUID(value)
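
Typical usage might look like the following (the table name and the client-side default are illustrative):

import uuid
from sqlalchemy import Table, Column, String, MetaData

metadata = MetaData()

users = Table('users', metadata,
    # the Python-side default generates a new uuid.UUID for each row
    Column('id', GUID(), primary_key=True, default=uuid.uuid4),
    Column('name', String(50))
)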

Marshal JSON Strings

This type uses simplejson to marshal Python data structures to/from JSON. Can be modified to use Python’s builtin json encoder:

from sqlalchemy.types import TypeDecorator, VARCHAR
import json

class JSONEncodedDict(TypeDecorator):
    """Represents an immutable structure as a json-encoded string.

    Usage::

        JSONEncodedDict(255)

    """

    impl = VARCHAR

    def process_bind_param(self, value, dialect):
        if value is not None:
            value = json.dumps(value)

        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = json.loads(value)
        return value

Note that the ORM by default will not detect “mutability” on such a type - meaning, in-place changes to values will not be detected and will not be flushed. Without further steps, you instead would need to replace the existing value with a new one on each parent object to detect changes. Note that there’s nothing wrong with this, as many applications may not require that the values are ever mutated once created. For those which do have this requirement, support for mutability is best applied using the sqlalchemy.ext.mutable extension - see the example in Mutation Tracking.
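
For example, a minimal sketch of combining the above type with the mutable extension, assuming the MutableDict helper described in Mutation Tracking is available (otherwise a custom Mutable subclass serves the same role); in-place changes to the dictionary are then detected and flushed by the ORM:

from sqlalchemy.ext.mutable import MutableDict
from sqlalchemy import Table, Column, Integer, MetaData

metadata = MetaData()

config = Table('config', metadata,
    Column('id', Integer, primary_key=True),
    # wraps JSONEncodedDict so that mutations on loaded values,
    # e.g. somedict['key'] = 'value', mark the parent as dirty
    Column('data', MutableDict.as_mutable(JSONEncodedDict(255)))
)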

Replacing the Bind/Result Processing of Existing Types

Most augmentation of type behavior at the bind/result level is achieved using TypeDecorator. For the rare scenario where the specific processing applied by SQLAlchemy at the DBAPI level needs to be replaced, the SQLAlchemy type can be subclassed directly, and the bind_processor() or result_processor() methods can be overridden. Doing so requires that the adapt() method also be overridden. This method is the mechanism by which SQLAlchemy produces DBAPI-specific type behavior during statement execution. Overriding it allows a copy of the custom type to be used in lieu of a DBAPI-specific type. Below we subclass the types.TIME type to have custom result processing behavior. The process() function will receive value from the DBAPI cursor directly:

class MySpecialTime(TIME):
    def __init__(self, special_argument):
        super(MySpecialTime, self).__init__()
        self.special_argument = special_argument

    def result_processor(self, dialect, coltype):
        import datetime
        time = datetime.time
        def process(value):
            if value is not None:
                microseconds = value.microseconds
                seconds = value.seconds
                minutes = seconds / 60
                return time(
                          minutes / 60,
                          minutes % 60,
                          seconds - minutes * 60,
                          microseconds)
            else:
                return None
        return process

    def adapt(self, impltype):
        return MySpecialTime(self.special_argument)

Applying SQL-level Bind/Result Processing

As seen in the sections Augmenting Existing Types and Replacing the Bind/Result Processing of Existing Types, SQLAlchemy allows Python functions to be invoked both when parameters are sent to a statement, as well as when result rows are loaded from the database, to apply transformations to the values as they are sent to or from the database. It is also possible to define transformations at the SQL level. The rationale applies when only the relational database contains the particular series of functions that are necessary to coerce incoming and outgoing data between an application and persistence format. Examples include using database-defined encryption/decryption functions, as well as stored procedures that handle geographic data. The Postgis extension to Postgresql includes an extensive array of SQL functions that are necessary for coercing data into particular formats.

Any TypeEngine, UserDefinedType or TypeDecorator subclass can include implementations of TypeEngine.bind_expression() and/or TypeEngine.column_expression(), which when defined to return a non-None value should return a ColumnElement expression to be injected into the SQL statement, either surrounding bound parameters or a column expression. For example, to build a Geometry type which will apply the Postgis function ST_GeomFromText to all outgoing values and the function ST_AsText to all incoming data, we can create our own subclass of UserDefinedType which provides these methods in conjunction with func:

from sqlalchemy import func
from sqlalchemy.types import UserDefinedType

class Geometry(UserDefinedType):
    def get_col_spec(self):
        return "GEOMETRY"

    def bind_expression(self, bindvalue):
        return func.ST_GeomFromText(bindvalue, type_=self)

    def column_expression(self, col):
        return func.ST_AsText(col, type_=self)

We can apply the Geometry type into Table metadata and use it in a select() construct:

geometry = Table('geometry', metadata,
              Column('geom_id', Integer, primary_key=True),
              Column('geom_data', Geometry)
            )

print select([geometry]).where(
  geometry.c.geom_data == 'LINESTRING(189412 252431,189631 259122)')

The resulting SQL embeds both functions as appropriate. ST_AsText is applied to the columns clause so that the return value is run through the function before passing into a result set, and ST_GeomFromText is run on the bound parameter so that the passed-in value is converted:

SELECT geometry.geom_id, ST_AsText(geometry.geom_data) AS geom_data_1
FROM geometry
WHERE geometry.geom_data = ST_GeomFromText(:geom_data_2)

The TypeEngine.column_expression() method interacts with the mechanics of the compiler such that the SQL expression does not interfere with the labeling of the wrapped expression. For example, if we rendered a select() against a label() of our expression, the string label is moved to the outside of the wrapped expression:

print select([geometry.c.geom_data.label('my_data')])

Output:

SELECT ST_AsText(geometry.geom_data) AS my_data
FROM geometry

For an example of subclassing a built in type directly, we subclass postgresql.BYTEA to provide a PGPString, which will make use of the Postgresql pgcrypto extension to encrypt/decrypt values transparently:

from sqlalchemy import create_engine, String, select, func, \
        MetaData, Table, Column, type_coerce

from sqlalchemy.dialects.postgresql import BYTEA

class PGPString(BYTEA):
    def __init__(self, passphrase, length=None):
        super(PGPString, self).__init__(length)
        self.passphrase = passphrase

    def bind_expression(self, bindvalue):
        # convert the bind's type from PGPString to
        # String, so that it's passed to psycopg2 as is without
        # a dbapi.Binary wrapper
        bindvalue = type_coerce(bindvalue, String)
        return func.pgp_sym_encrypt(bindvalue, self.passphrase)

    def column_expression(self, col):
        return func.pgp_sym_decrypt(col, self.passphrase)

metadata = MetaData()
message = Table('message', metadata,
                Column('username', String(50)),
                Column('message',
                    PGPString("this is my passphrase", length=1000)),
            )

engine = create_engine("postgresql://scott:tiger@localhost/test", echo=True)
with engine.begin() as conn:
    metadata.create_all(conn)

    conn.execute(message.insert(), username="some user",
                                message="this is my message")

    print conn.scalar(
            select([message.c.message]).\
                where(message.c.username == "some user")
        )

The pgp_sym_encrypt and pgp_sym_decrypt functions are applied to the INSERT and SELECT statements:

INSERT INTO message (username, message)
  VALUES (%(username)s, pgp_sym_encrypt(%(message)s, %(pgp_sym_encrypt_1)s))
  {'username': 'some user', 'message': 'this is my message',
    'pgp_sym_encrypt_1': 'this is my passphrase'}

SELECT pgp_sym_decrypt(message.message, %(pgp_sym_decrypt_1)s) AS message_1
  FROM message
  WHERE message.username = %(username_1)s
  {'pgp_sym_decrypt_1': 'this is my passphrase', 'username_1': 'some user'}

New in version 0.8: Added the TypeEngine.bind_expression() and TypeEngine.column_expression() methods.

See also:

PostGIS Integration

Redefining and Creating New Operators

SQLAlchemy Core defines a fixed set of expression operators available to all column expressions. Some of these operations have the effect of overloading Python’s built in operators; examples of such operators include ColumnOperators.__eq__() (table.c.somecolumn == 'foo'), ColumnOperators.__invert__() (~table.c.flag), and ColumnOperators.__add__() (table.c.x + table.c.y). Other operators are exposed as explicit methods on column expressions, such as ColumnOperators.in_() (table.c.value.in_(['x', 'y'])) and ColumnOperators.like() (table.c.value.like('%ed%')).

The Core expression constructs in all cases consult the type of the expression in order to determine the behavior of existing operators, as well as to locate additional operators that aren’t part of the built in set. The TypeEngine base class defines a root “comparison” implementation TypeEngine.Comparator, and many specific types provide their own sub-implementations of this class. User-defined TypeEngine.Comparator implementations can be built directly into a simple subclass of a particular type in order to override or define new operations. Below, we create an Integer subclass which overrides the ColumnOperators.__add__() operator:

from sqlalchemy import Integer

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def __add__(self, other):
            return self.op("goofy")(other)

The above configuration creates a new class MyInt, which establishes the TypeEngine.comparator_factory attribute as referring to a new class, subclassing the TypeEngine.Comparator class associated with the Integer type.

Usage:

>>> sometable = Table("sometable", metadata, Column("data", MyInt))
>>> print sometable.c.data + 5
sometable.data goofy :data_1

The implementation for ColumnOperators.__add__() is consulted by an owning SQL expression, by instantiating the TypeEngine.Comparator with itself as the expr attribute. The mechanics of the expression system are such that operations continue recursively until an expression object produces a new SQL expression construct. Above, we could just as well have said self.expr.op("goofy")(other) instead of self.op("goofy")(other).

New methods added to a TypeEngine.Comparator are exposed on an owning SQL expression using a __getattr__ scheme, which exposes methods added to TypeEngine.Comparator onto the owning ColumnElement. For example, to add a log() function to integers:

from sqlalchemy import Integer, func

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def log(self, other):
            return func.log(self.expr, other)

Using the above type:

>>> print sometable.c.data.log(5)
log(:log_1, :log_2)

Unary operations are also possible. For example, to add an implementation of the Postgresql factorial operator, we combine the UnaryExpression construct along with a custom_op to produce the factorial expression:

from sqlalchemy import Integer
from sqlalchemy.sql.expression import UnaryExpression
from sqlalchemy.sql import operators

class MyInteger(Integer):
    class comparator_factory(Integer.Comparator):
        def factorial(self):
            return UnaryExpression(self.expr,
                        modifier=operators.custom_op("!"),
                        type_=MyInteger)

Using the above type:

>>> from sqlalchemy.sql import column
>>> print column('x', MyInteger).factorial()
x !

See also:

TypeEngine.comparator_factory

New in version 0.8: The expression system was enhanced to support customization of operators on a per-type level.

Creating New Types

The UserDefinedType class is provided as a simple base class for defining entirely new database types. Use this to represent native database types not known by SQLAlchemy. If only Python translation behavior is needed, use TypeDecorator instead.

class sqlalchemy.types.UserDefinedType(*args, **kwargs)

Bases: sqlalchemy.types.TypeEngine

Base for user defined types.

This should be the base of new types. Note that for most cases, TypeDecorator is probably more appropriate:

import sqlalchemy.types as types

class MyType(types.UserDefinedType):
    def __init__(self, precision = 8):
        self.precision = precision

    def get_col_spec(self):
        return "MYTYPE(%s)" % self.precision

    def bind_processor(self, dialect):
        def process(value):
            return value
        return process

    def result_processor(self, dialect, coltype):
        def process(value):
            return value
        return process

Once the type is made, it’s immediately usable:

table = Table('foo', meta,
    Column('id', Integer, primary_key=True),
    Column('data', MyType(16))
    )
coerce_compared_value(op, value)

Suggest a type for a ‘coerced’ Python value in an expression.

Default behavior for UserDefinedType is the same as that of TypeDecorator; by default it returns self, assuming the compared value should be coerced into the same type as this one. See TypeDecorator.coerce_compared_value() for more detail.

Changed in version 0.8: UserDefinedType.coerce_compared_value() now returns self by default, rather than falling onto the more fundamental behavior of TypeEngine.coerce_compared_value().

Base Type API

class sqlalchemy.types.AbstractType

Bases: sqlalchemy.sql.visitors.Visitable

Base for all types - not needed except for backwards compatibility.

class sqlalchemy.types.TypeEngine(*args, **kwargs)

Bases: sqlalchemy.types.AbstractType

The ultimate base class for all SQL datatypes.

Common subclasses of TypeEngine include String, Integer, and Boolean.

For an overview of the SQLAlchemy typing system, see Column and Data Types.

class Comparator(expr)

Bases: sqlalchemy.sql.expression._DefaultColumnComparator

Base class for custom comparison operations defined at the type level. See TypeEngine.comparator_factory.

The public base class for TypeEngine.Comparator is ColumnOperators.

TypeEngine.__init__(*args, **kwargs)

Support implementations that were passing arguments

TypeEngine.adapt(cls, **kw)

Produce an “adapted” form of this type, given an “impl” class to work with.

This method is used internally to associate generic types with “implementation” types that are specific to a particular dialect.

TypeEngine.bind_expression(bindvalue)

Given a bind value (i.e. a BindParameter instance), return a SQL expression in its place.

This is typically a SQL function that wraps the existing bound parameter within the statement. It is used for special data types that require literals being wrapped in some special database function in order to coerce an application-level value into a database-specific format. It is the SQL analogue of the TypeEngine.bind_processor() method.

The method is evaluated at statement compile time, as opposed to statement construction time.

Note that this method, when implemented, should always return the exact same structure, without any conditional logic, as it may be used in an executemany() call against an arbitrary number of bound parameter sets.

See also:

Applying SQL-level Bind/Result Processing

TypeEngine.bind_processor(dialect)

Return a conversion function for processing bind values.

Returns a callable which will receive a bind parameter value as the sole positional argument and will return a value to send to the DB-API.

If processing is not necessary, the method should return None.

Parameters:dialect – Dialect instance in use.
TypeEngine.coerce_compared_value(op, value)

Suggest a type for a ‘coerced’ Python value in an expression.

Given an operator and value, gives the type a chance to return a type which the value should be coerced into.

The default behavior here is conservative; if the right-hand side is already coerced into a SQL type based on its Python type, it is usually left alone.

End-user functionality extension here should generally be via TypeDecorator, which provides more liberal behavior in that it defaults to coercing the other side of the expression into this type, thus applying special Python conversions above and beyond those needed by the DBAPI to both sides. It also provides the public method TypeDecorator.coerce_compared_value() which is intended for end-user customization of this behavior.

TypeEngine.column_expression(colexpr)

Given a SELECT column expression, return a wrapping SQL expression.

This is typically a SQL function that wraps a column expression as rendered in the columns clause of a SELECT statement. It is used for special data types that require columns to be wrapped in some special database function in order to coerce the value before being sent back to the application. It is the SQL analogue of the TypeEngine.result_processor() method.

The method is evaluated at statement compile time, as opposed to statement construction time.

See also:

Applying SQL-level Bind/Result Processing

TypeEngine.comparator_factory

Bases: sqlalchemy.sql.expression._DefaultColumnComparator

A TypeEngine.Comparator class which will apply to operations performed by owning ColumnElement objects.

The comparator_factory attribute is a hook consulted by the core expression system when column and SQL expression operations are performed. When a TypeEngine.Comparator class is associated with this attribute, it allows custom re-definition of all existing operators, as well as definition of new operators. Existing operators include those provided by Python operator overloading such as operators.ColumnOperators.__add__() and operators.ColumnOperators.__eq__(), those provided as standard attributes of operators.ColumnOperators such as operators.ColumnOperators.like() and operators.ColumnOperators.in_().

Rudimentary usage of this hook is allowed through simple subclassing of existing types, or alternatively by using TypeDecorator. See the documentation section Redefining and Creating New Operators for examples.

New in version 0.8: The expression system was enhanced to support customization of operators on a per-type level.

alias of Comparator

TypeEngine.compare_values(x, y)

Compare two values for equality.

TypeEngine.compile(dialect=None)

Produce a string-compiled form of this TypeEngine.

When called with no arguments, uses a “default” dialect to produce a string result.

Parameters:dialect – a Dialect instance.
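
For example (the strings shown reflect the default and Postgresql dialects):

>>> from sqlalchemy import String, LargeBinary
>>> from sqlalchemy.dialects import postgresql
>>> print String(50).compile()
VARCHAR(50)
>>> print LargeBinary().compile(dialect=postgresql.dialect())
BYTEA
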
TypeEngine.dialect_impl(dialect)

Return a dialect-specific implementation for this TypeEngine.

TypeEngine.get_dbapi_type(dbapi)

Return the corresponding type object from the underlying DB-API, if any.

This can be useful for calling setinputsizes(), for example.
TypeEngine.hashable = True

Flag, if False, means values from this type aren’t hashable.

Used by the ORM when uniquing result lists.

TypeEngine.python_type

Return the Python type object expected to be returned by instances of this type, if known.

Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int for example), this will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL which means you can also get back None from any type in practice.

TypeEngine.result_processor(dialect, coltype)

Return a conversion function for processing result row values.

Returns a callable which will receive a result row column value as the sole positional argument and will return a value to return to the user.

If processing is not necessary, the method should return None.

Parameters:
  • dialect – Dialect instance in use.
  • coltype – DBAPI coltype argument received in cursor.description.
TypeEngine.with_variant(type_, dialect_name)

Produce a new type object that will utilize the given type when applied to the dialect of the given name.

e.g.:

from sqlalchemy.types import String
from sqlalchemy.dialects import mysql

s = String()

s = s.with_variant(mysql.VARCHAR(collation='foo'), 'mysql')

The construction of TypeEngine.with_variant() is always from the “fallback” type to that which is dialect specific. The returned type is an instance of Variant, which itself provides a with_variant() that can be called repeatedly.

Parameters:
  • type_ – a TypeEngine that will be selected as a variant from the originating type, when a dialect of the given name is in use.
  • dialect_name – base name of the dialect which uses this type. (i.e. 'postgresql', 'mysql', etc.)

New in version 0.7.2.

class sqlalchemy.types.Concatenable

A mixin that marks a type as supporting ‘concatenation’, typically strings.

__init__
inherited from the __init__ attribute of object

x.__init__(...) initializes x; see help(type(x)) for signature

class sqlalchemy.types.NullType(*args, **kwargs)

Bases: sqlalchemy.types.TypeEngine

An unknown type.

NullTypes will stand in if Table reflection encounters a column data type unknown to SQLAlchemy. The resulting columns are nearly fully usable: the DB-API adapter will handle all translation to and from the database data type.

NullType does not have sufficient information to participate in a CREATE TABLE statement and will raise an exception if encountered during a create() operation.

class sqlalchemy.types.Variant(base, mapping)

Bases: sqlalchemy.types.TypeDecorator

A wrapping type that selects among a variety of implementations based on dialect in use.

The Variant type is typically constructed using the TypeEngine.with_variant() method.

New in version 0.7.2.

Members:with_variant, __init__