Database Tooling¶
This section covers the playhouse modules for managing connections, database URLs, schema migrations, introspection, code generation, and testing.
Database URLs¶
The playhouse.db_url module lets you configure Peewee from a connection
string, which is common in twelve-factor applications where database
credentials live in environment variables.
import os
from playhouse.db_url import connect
db = connect(os.environ.get('DATABASE_URL', 'sqlite:///default.db'))
Pass additional keyword arguments in the query string:
db = connect('postgres://user:pass@host/db?max_connections=20')
URL format: scheme://user:password@host:port/dbname?option=value
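To make the mapping concrete, here is a minimal sketch, using only the standard library (not the playhouse implementation), of how such a URL decomposes into connection parameters:

```python
from urllib.parse import urlparse, parse_qsl

url = 'postgres://user:pass@host:5432/mydb?max_connections=20'
parsed = urlparse(url)

params = {
    'database': parsed.path.lstrip('/'),  # 'mydb'
    'host': parsed.hostname,              # 'host'
    'port': parsed.port,                  # 5432
    'user': parsed.username,              # 'user'
    'password': parsed.password,          # 'pass'
}
# Query-string options become extra connect parameters.
# (The real db_url module does more work here, e.g. coercing values.)
params.update(dict(parse_qsl(parsed.query)))
```

The scheme ('postgres' above) selects the Database class, per the tables below.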
Common schemes:
| Scheme | Database class |
|---|---|
| sqlite:// | SqliteDatabase |
| mysql:// | MySQLDatabase |
| postgres:// or postgresql:// | PostgresqlDatabase |
| cockroachdb:// or crdb:// | CockroachDatabase |
Connection pool implementations:
| Scheme | Database class |
|---|---|
| sqlite+pool:// | PooledSqliteDatabase |
| mysql+pool:// | PooledMySQLDatabase |
| postgres+pool:// or postgresql+pool:// | PooledPostgresqlDatabase |
| cockroachdb+pool:// or crdb+pool:// | PooledCockroachDatabase |
Alternate drivers:
| Scheme | Database class |
|---|---|
| apsw:// | APSWDatabase |
| sqliteext:// | SqliteExtDatabase |
| sqliteext+pool:// | PooledSqliteExtDatabase |
| postgresext:// or postgresqlext:// | PostgresqlExtDatabase |
| postgresext+pool:// or postgresqlext+pool:// | PooledPostgresqlExtDatabase |
- connect(url, unquote_password=False, unquote_user=False, **connect_params)¶
- Parameters:
url – the URL for the database, see examples.
unquote_password (bool) – unquote special characters in the password.
unquote_user (bool) – unquote special characters in the user.
connect_params – additional parameters to pass to the Database.
Parse url and return an appropriate Database instance.
Examples:
sqlite:///my_app.db - SQLite file in the current directory.
sqlite:///:memory: - in-memory SQLite.
sqlite:////absolute/path/to/app.db - SQLite file with an absolute path.
postgresql://user:password@host:5432/dbname - Postgresql database.
mysql://user:password@host:3306/dbname - MySQL database.
- parse(url, unquote_password=False, unquote_user=False)¶
- Parameters:
url – the URL for the database; see connect() above for examples.
unquote_password (bool) – unquote special characters in the password.
unquote_user (bool) – unquote special characters in the user.
Parse a URL and return a dictionary with database, host, port, user, and password keys, plus any extra connect parameters from the query string. Useful if you need to construct a database class manually:
params = parse('postgres://user:pass@host:5432/mydb')
db = MyCustomDatabase(**params)
- register_database(db_class, *names)¶
- Parameters:
db_class – A subclass of Database.
names – A list of names to use as the scheme in the URL.
Register a custom database class under one or more URL scheme names so that
connect() can instantiate it:
register_database(FirebirdDatabase, 'firebird')
db = connect('firebird://my-firebird-db')
Connection Pooling¶
The playhouse.pool module contains a number of Database classes
that provide connection pooling for Postgresql, MySQL and SQLite databases. The
pool works by overriding the methods on the Database class that open
and close connections to the backend.
In multi-threaded applications, each thread gets its own connection; the
pool maintains up to max_connections open connections at any time.
In single-threaded applications, a single connection is recycled.
The application only needs to ensure that connections are closed when work is done - typically at the end of an HTTP request. Closing a pooled connection returns it to the pool rather than actually disconnecting.
from playhouse.pool import PooledPostgresqlDatabase
db = PooledPostgresqlDatabase(
'my_app',
user='postgres',
max_connections=32,
stale_timeout=300)
Tip
Pooled database implementations may be safely used as drop-in replacements for their non-pooled counterparts.
Commonly-used pool implementations:
playhouse.pool - PooledSqliteDatabase
playhouse.pool - PooledPostgresqlDatabase
playhouse.pool - PooledMySQLDatabase
Additional implementations:
playhouse.cysqlite_ext - PooledCySqliteDatabase
playhouse.mysql_ext - PooledMariaDBConnectorDatabase
playhouse.mysql_ext - PooledMySQLConnectorDatabase
playhouse.postgres_ext - PooledPostgresqlExtDatabase
playhouse.postgres_ext - PooledPsycopg3Database
playhouse.cockroachdb - PooledCockroachDatabase
Note
Applications using Peewee’s asyncio integration do not need to use a special pooled database - the Async databases use a connection pool by default.
- class PooledDatabase(database, max_connections=20, stale_timeout=None, timeout=None, **kwargs)¶
Mixin class mixed into the specific backend subclasses below.
- Parameters:
database (str) – The name of the database or database file.
max_connections (int) – Maximum number of concurrent connections. Pass None for no limit.
stale_timeout (int) – Seconds after which an idle connection is considered stale and will be discarded the next time it would be reused.
timeout (int) – Seconds to block when all connections are in use. 0 blocks indefinitely; None (default) raises immediately.
Note
Connections will not be closed exactly when they exceed their stale_timeout. Instead, stale connections are only closed when a new connection is requested.
Note
If the pool is exhausted and no timeout is configured, a ValueError is raised.
- manual_close()¶
Close the current connection permanently without returning it to the pool. Use this when a connection has entered a bad state.
- close_idle()¶
Close all pooled connections that are not currently in use.
- close_stale(age=600)¶
- Parameters:
age (int) – Age at which a connection should be considered stale.
- Returns:
Number of connections closed.
Close in-use connections that have exceeded age seconds. Use with caution.
- close_all()¶
Close all connections including those currently in use. Use with caution.
- class PooledSqliteDatabase(database, max_connections=20, stale_timeout=None, timeout=None, **kwargs)¶
Pool implementation for SQLite databases. Extends SqliteDatabase.
- class PooledPostgresqlDatabase(database, max_connections=20, stale_timeout=None, timeout=None, **kwargs)¶
Pool implementation for Postgresql databases. Extends PostgresqlDatabase.
- class PooledMySQLDatabase(database, max_connections=20, stale_timeout=None, timeout=None, **kwargs)¶
Pool implementation for MySQL / MariaDB databases. Extends MySQLDatabase.
Schema Migrations¶
The playhouse.migrate module provides a lightweight API for making
incremental schema changes to an existing database without writing raw SQL.
The peewee migration philosophy is that tools relying on database
introspection, versioning, and auto-detection are often fragile and
unnecessarily complex. Migrations can be written as simple Python scripts and
executed from the command-line. Since migrations depend only on your
application’s Database object, migration scripts do not introduce new
dependencies.
Supported operations:
Add, rename, or drop columns.
Make columns nullable or not nullable.
Change a column’s type.
Rename a table.
Add or drop indexes and constraints.
Add or drop column default values.
from playhouse.migrate import SchemaMigrator, migrate
migrator = SchemaMigrator.from_database(db)
with db.atomic():
migrate(
migrator.add_column('tweet', 'is_published', BooleanField(default=True)),
migrator.add_column('user', 'email', CharField(null=True)),
migrator.drop_column('user', 'old_bio'),
)
Tip
Wrap migrations in db.atomic() to ensure changes are not partially
applied.
Operations¶
Add columns:
# Non-null fields must supply a default value.
migrate(
migrator.add_column('comment', 'pub_date', DateTimeField(null=True)),
migrator.add_column('comment', 'body', TextField(default='')),
)
Add a foreign key (the column name must include the _id suffix that
Peewee appends by default):
user_fk = ForeignKeyField(User, field=User.id, null=True)
migrate(
migrator.add_column('tweet', 'user_id', user_fk),
)
Rename a column:
migrate(
migrator.rename_column('story', 'pub_date', 'publish_date'),
migrator.rename_column('story', 'mod_date', 'modified_date'),
)
Drop a column:
migrate(migrator.drop_column('story', 'old_field'))
Nullable / not nullable:
migrate(
migrator.drop_not_null('story', 'pub_date'), # Allow NULLs.
migrator.add_not_null('story', 'modified_date'), # Disallow NULLs.
)
Change type:
# Change a VARCHAR(...) to a TEXT field.
migrate(migrator.alter_column_type('person', 'email', TextField()))
Rename table:
migrate(migrator.rename_table('story', 'stories'))
Add / drop indexes:
# Specify table, column(s), and unique/non-unique.
migrate(
# Create an index on the `pub_date` column.
migrator.add_index('story', ('pub_date',), False), # Normal index.
# Create a unique index on the category and title fields.
migrator.add_index('story', ('category_id', 'title'), True), # Unique.
# Drop the pub-date + status index.
migrator.drop_index('story', 'story_pub_date_status'),
)
Add / drop constraints:
from peewee import Check
# Add a CHECK() constraint to enforce the price cannot be negative.
migrate(migrator.add_constraint(
'products',
'price_check',
Check('price >= 0')))
# Remove the price check constraint.
migrate(migrator.drop_constraint('products', 'price_check'))
# Add a UNIQUE constraint on the first and last names.
migrate(migrator.add_unique('person', 'first_name', 'last_name'))
Column defaults:
# Add a default value:
migrate(migrator.add_column_default('entry', 'status', 'draft'))
# Use a function (not supported in SQLite):
migrate(migrator.add_column_default('entry', 'created_at', fn.NOW()))
# SQLite-compatible function syntax:
migrate(migrator.add_column_default('entry', 'created_at', 'now()'))
# Remove a default:
migrate(migrator.drop_column_default('entry', 'status'))
Note
Postgres users may need to set the search-path when using a non-standard schema. This can be done as follows:
migrator = PostgresqlMigrator(db)
migrate(
migrator.set_search_path('my_schema'),
migrator.add_column('table', 'field', TextField(default='')),
)
Migration API¶
- migrate(*operations)¶
Execute one or more schema-altering operations.
Usage:
migrate(
    migrator.add_column('t', 'col', CharField(default='')),
    migrator.add_index('t', ('col',), False),
)
- class SchemaMigrator(database)¶
- Parameters:
database – a Database instance.
The SchemaMigrator is responsible for generating schema-altering statements.
- classmethod from_database(database)¶
- Parameters:
database (Database) – database instance to generate migrations for.
- Returns:
SchemaMigrator instance appropriate to the provided database.
Factory method that returns the appropriate SchemaMigrator subclass for the given database.
- add_column(table, column_name, field)¶
- Parameters:
table (str) – Name of the table to add the column to.
column_name (str) – Name of the new column.
field (Field) – Field instance used to generate the column definition.
Add a new column to the provided table. The field provided will be used to generate the appropriate column definition. If the field is not nullable it must specify a default value.
Note
For non-null columns, the following occurs:
The column is first added as allowing NULLs.
An UPDATE query is executed to populate the default value.
The column is then changed to NOT NULL.
- drop_column(table, column_name, cascade=True)¶
- Parameters:
table (str) – Name of the table to drop column from.
column_name (str) – Name of the column to drop.
cascade (bool) – Whether the column should be dropped with CASCADE.
- rename_column(table, old_name, new_name)¶
- Parameters:
table (str) – Name of the table containing column to rename.
old_name (str) – Current name of the column.
new_name (str) – New name for the column.
- add_not_null(table, column)¶
- Parameters:
table (str) – Name of table containing column.
column (str) – Name of the column to make not nullable.
- drop_not_null(table, column)¶
- Parameters:
table (str) – Name of table containing column.
column (str) – Name of the column to make nullable.
- add_column_default(table, column, default)¶
- Parameters:
table (str) – Name of table containing column.
column (str) – Name of the column to add default to.
default – New default value for column. See notes below.
Peewee attempts to properly quote the default if it appears to be a string literal; otherwise the default is treated literally. Postgres and MySQL support specifying the default as a peewee expression, e.g. fn.NOW(), but SQLite users will need to use default='now()' instead.
- drop_column_default(table, column)¶
- Parameters:
table (str) – Name of table containing column.
column (str) – Name of the column to remove default from.
- alter_column_type(table, column, field, cast=None)¶
- Parameters:
table (str) – Name of the table.
column (str) – Name of the column to modify.
field (Field) – Field instance specifying the new column type.
cast – (postgres-only) specify a cast expression if the data-types are incompatible, e.g. column_name::int. Can be provided as either a string or a Cast instance.
Alter the data-type of a column. This method should be used with care, as using incompatible types may not be well-supported by your database.
- rename_table(old_name, new_name)¶
- Parameters:
old_name (str) – Current name of the table.
new_name (str) – New name for the table.
- add_index(table, columns, unique=False, using=None)¶
- Parameters:
table (str) – Name of table on which to create the index.
columns (list) – List of columns which should be indexed.
unique (bool) – Whether the new index should specify a unique constraint.
using (str) – Index type (where supported), e.g. GiST or GIN.
- drop_index(table, index_name)¶
- Parameters:
table (str) – Name of the table containing the index to be dropped.
index_name (str) – Name of the index to be dropped.
- add_constraint(table, name, constraint)¶
- Parameters:
table (str) – Table to add constraint to.
name (str) – Name of the constraint.
constraint – The constraint to add, e.g. Check('price >= 0').
- drop_constraint(table, name)¶
- Parameters:
table (str) – Table to drop constraint from.
name (str) – Name of constraint to drop.
- add_unique(table, *column_names)¶
- Parameters:
table (str) – Table to add constraint to.
column_names (str) – One or more columns for UNIQUE constraint.
- class PostgresqlMigrator(database)¶
- set_search_path(schema_name)¶
Set the Postgres search path for subsequent operations.
- class SqliteMigrator(database)¶
SQLite has limited support for ALTER TABLE queries, so the following operations are currently not supported for SQLite:
add_constraint
drop_constraint
add_unique
- class MySQLMigrator(database)¶
MySQL-specific subclass.
Reflection¶
The playhouse.reflection module introspects an existing database and
generates Peewee model classes from its schema. It is used internally by
pwiz - Model Generator and DataSet.
from playhouse.reflection import generate_models
db = PostgresqlDatabase('my_app')
models = generate_models(db) # Returns {table_name: ModelClass}
# list(models.keys())
# ['account', 'customer', 'order', 'orderitem', 'product']
# Get a reference to a generated model.
Customer = models['customer']
# Or inject into the current namespace:
# globals().update(models)
# Query generated models:
for customer in Customer.select():
print(customer.name, customer.email)
- generate_models(database, schema=None, **options)¶
- Parameters:
database (Database) – database instance to introspect.
schema (str) – optional schema to introspect.
options – arbitrary options; see Introspector.generate_models() for details.
- Returns:
a dict mapping table names to model classes.
- print_model(model)¶
Print a human-readable summary of a model’s fields and indexes to stdout. Useful for interactive exploration:
>>> print_model(Tweet)
tweet
  id AUTO PK
  user INT FK: User.id
  content TEXT
  timestamp DATETIME

index(es)
  user_id
  timestamp
- print_table_sql(model)¶
Print the CREATE TABLE SQL for a model class (without indexes or constraints):
>>> print_table_sql(Tweet)
CREATE TABLE IF NOT EXISTS "tweet" (
  "id" INTEGER NOT NULL PRIMARY KEY,
  "user_id" INTEGER NOT NULL,
  "content" TEXT NOT NULL,
  "timestamp" DATETIME NOT NULL,
  FOREIGN KEY ("user_id") REFERENCES "user" ("id")
)
- class Introspector(metadata, schema=None)¶
Metadata can be extracted from a database by instantiating an Introspector. Rather than instantiating this class directly, it is recommended to use the factory method from_database().
- classmethod from_database(database, schema=None)¶
- Parameters:
database – a Database instance.
schema (str) – an optional schema (supported by some databases).
Creates an Introspector instance suitable for use with the given database:
db = SqliteDatabase('my_app.db')
introspector = Introspector.from_database(db)
models = introspector.generate_models()

# User and Tweet (assumed to exist in the database) are
# peewee Model classes generated from the database schema.
User = models['user']
Tweet = models['tweet']
- generate_models(skip_invalid=False, table_names=None, literal_column_names=False, bare_fields=False, include_views=False)¶
- Parameters:
skip_invalid (bool) – Skip tables whose names are not valid Python identifiers.
table_names (list) – Only generate models for the given tables.
literal_column_names (bool) – Use the exact database column names as field names (rather than converting to Python naming conventions).
bare_fields (bool) – Do not attempt to detect field types; use
BareFieldfor all columns (SQLite only).include_views (bool) – Also generate models for views.
- Returns:
A dictionary mapping table-names to model classes.
Introspect the database, reading in the tables, columns, and foreign key constraints, then generate a dictionary mapping each database table to a dynamically-generated Model class.
pwiz - Model Generator¶
pwiz is a command-line tool that introspects a database and prints
ready-to-use Peewee model code. If you have an existing database, running
pwiz saves significant time generating the initial model definitions.
# Introspect a Postgresql database and write models to a file:
python -m pwiz -e postgresql -u postgres my_db > models.py
# Introspect a SQLite database:
python -m pwiz -e sqlite path/to/my.db
# Introspect a MySQL database (prompts for password):
python -m pwiz -e mysql -u root -P my_db
# Introspect only specific tables:
python -m pwiz -e postgresql my_db -t user,tweet,follow
Command-line options:
| Option | Meaning | Example |
|---|---|---|
| -e | Database backend | -e postgresql |
| -H | Host | -H remote.db.server |
| -p | Port | -p 9001 |
| -u | Username | -u postgres |
| -P | Password (prompts interactively) | -P |
| -s | Schema | -s public |
| -t | Comma-separated list of tables to include | -t tweet,users |
| -v | Include views | -v |
| -i | Embed database info as a comment | -i |
| -o | Preserve original column order | -o |
| -I | Ignore fields whose type is unknown | -I |
| -L | Use legacy table and column naming | -L |
Valid -e values: sqlite, mysql, postgresql.
Warning
If a password is required to access your database, you will be prompted to
enter it using a secure prompt; however, the password will be included in the
generated output. Specifically, at the top of the file a Database will be
defined along with any required parameters - including the password.
Example output for a SQLite database with user and tweet tables:
from peewee import *
database = SqliteDatabase('example.db', **{})
class UnknownField(object):
def __init__(self, *_, **__): pass
class BaseModel(Model):
class Meta:
database = database
class User(BaseModel):
username = TextField(unique=True)
class Meta:
table_name = 'user'
class Tweet(BaseModel):
content = TextField()
timestamp = DateTimeField()
user = ForeignKeyField(column_name='user_id', field='id', model=User)
class Meta:
table_name = 'tweet'
Note that pwiz detects foreign keys, unique constraints, and preserves
explicit table names.
Note
The UnknownField is a placeholder that is used in the event your schema
contains a column declaration that Peewee doesn’t know how to map to a
field class.
Test Utilities¶
playhouse.test_utils provides helpers for testing peewee projects.
- class count_queries(only_select=False)¶
Context manager that counts the number of SQL queries executed within its block.
- Parameters:
only_select (bool) – If True, count only SELECT queries.
with count_queries() as counter:
    user = User.get(User.username == 'alice')
    tweets = list(user.tweets)  # Triggers a second query.

assert counter.count == 2
- count¶
Number of queries executed.
- get_queries()¶
Return a list of (sql, params) 2-tuples for each query executed.
- assert_query_count(expected, only_select=False)¶
Decorator or context manager that raises
AssertionErrorif the number of queries executed does not matchexpected.As a decorator:
class TestAPI(unittest.TestCase): @assert_query_count(1) def test_get_user(self): user = User.get_by_id(1)
As a context manager:
with assert_query_count(3):
    result = my_function_that_should_make_exactly_three_queries()