Database Access Classes

class aisynphys.database.SynphysDatabase(ro_host, rw_host, db_name, check_schema=True)[source]

Augments the Database class with convenience methods for querying the synphys database.

initialize_database()[source]

Optionally called after create_tables.

classmethod list_current_versions()[source]

Return a dict of the most recent DB versions for each size.

If no published DB file is available for a particular size, then the value will be set to None in the returned dictionary.

classmethod list_versions(only_supported=False)[source]

Return a list of all available database versions.

Each item in the list is a dictionary with keys db_file, release_version, db_size, and schema_version.

classmethod load_current(db_size)[source]

Load the most recent version of the database that is supported by this version of aisynphys.

The database file will be downloaded and cached if an existing cache file is not found.

Parameters
db_size : str

Must be one of ‘small’, ‘medium’, or ‘full’.

classmethod load_sqlite(sqlite_file, readonly=True)[source]

Return a SynphysDatabase instance connected to an existing sqlite file.

classmethod load_version(db_version)[source]

Load a named database version.

Available database names can be listed using list_versions(). The database file will be downloaded and cached if an existing cache file is not found.

Example:

>>> from aisynphys.database import SynphysDatabase
>>> SynphysDatabase.list_versions()
[
    {'db_file': 'synphys_r1.0_small.sqlite'},
    {'db_file': 'synphys_r1.0_medium.sqlite'},
    {'db_file': 'synphys_r1.0_full.sqlite'},
    ...
]
>>> db = SynphysDatabase.load_version('synphys_r1.0_small.sqlite')
Downloading http://api.brain-map.org/api/v2/well_known_file_download/937779595 =>
  /home/luke/docs/aisynphys/doc/cache/database/synphys_r1.0_2019-08-29_small.sqlite
  [####################]  100.00% (73.13 MB / 73.1 MB)  4.040 MB/s  0:00:00 remaining
  done.
matrix_pair_query(pre_classes, post_classes, columns=None, pair_query_args=None)[source]

Returns the concatenated result of running pair_query over every combination of presynaptic and postsynaptic cell class.
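
The combination behavior can be illustrated with a small self-contained sketch. The class labels below are hypothetical stand-ins for CellClass objects; the real method runs pair_query against the database for each (pre, post) combination and concatenates the results:

```python
from itertools import product

# Hypothetical pre/post cell-class labels standing in for CellClass objects.
pre_classes = {"pvalb": None, "sst": None}
post_classes = {"pvalb": None, "vip": None, "sst": None}

# matrix_pair_query runs pair_query once per (pre, post) combination;
# here we just enumerate those combinations.
combos = list(product(pre_classes, post_classes))
print(len(combos))  # 2 pre-classes x 3 post-classes = 6 combinations
```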

pair_query(pre_class=None, post_class=None, synapse=None, synapse_type=None, synapse_probed=None, electrical=None, experiment_type=None, project_name=None, acsf=None, age=None, species=None, distance=None, internal=None, preload=(), session=None, filter_exprs=None)[source]

Generate a query for selecting pairs from the database.

Parameters
pre_class : aisynphys.cell_class.CellClass | None

Filter for pairs where the presynaptic cell belongs to this class

post_class : aisynphys.cell_class.CellClass | None

Filter for pairs where the postsynaptic cell belongs to this class

synapse : bool | None

Include only pairs that are (or are not) connected by a chemical synapse

synapse_type : str | None

Include only synapses of a particular type (‘ex’ or ‘in’)

synapse_probed : bool | None

If True, include only pairs that were probed for a synaptic connection (regardless of whether a connection was found)

electrical : bool | None

Include only pairs that are (or are not) connected by an electrical synapse (gap junction)

experiment_type : str | None

Include only data from specific types of experiments

project_name : str | list | None

Value(s) to match from experiment.project_name (e.g. “mouse V1 coarse matrix” or “human coarse matrix”)

acsf : str | list | None

Filter for ACSF recipe name(s)

age : tuple | None

(min, max) age range to filter for. Either limit may be None to disable that check.

species : str | None

Species (‘mouse’ or ‘human’) to filter for

distance : tuple | None

(min, max) intersomatic distance in meters

internal : str | list | None

Electrode internal solution recipe name(s)

preload : list

List of strings specifying resources to preload along with the queried pairs. This can speed up performance in cases where these would otherwise be individually queried later on. Options are:

  • “experiment” (includes experiment and slice)

  • “cell” (includes cell, morphology, cortical_location, and patch_seq)

  • “synapse” (includes synapse, resting_state, dynamics, and synapse_model)

  • “synapse_prediction” (includes only synapse_prediction)

filter_exprs : list | None

List of sqlalchemy expressions, each of which will restrict the query via a call to query.filter(expr)
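
The (min, max) tuple parameters (age and distance) treat a None limit as "no bound". A minimal sketch of that semantics — this helper is illustrative, not part of the library:

```python
def in_range(value, bounds):
    """Return True if value falls within a (min, max) tuple.

    Either limit may be None to disable that check, matching the
    semantics described for the age and distance parameters.
    """
    lo, hi = bounds
    if lo is not None and value < lo:
        return False
    if hi is not None and value > hi:
        return False
    return True
```

For example, in_range(45, (40, None)) is True, while in_range(45, (None, 40)) is False.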

property version_name

The version name of this database, as accepted by SynphysDatabase.load_version()

class aisynphys.database.Database(ro_host, rw_host, db_name, ormbase)[source]

Methods for doing relational database maintenance via sqlalchemy.

Supported backends: postgres, sqlite.

Features:

  • Automatically build/dispose ro and rw engines (especially after fork)

  • Generate ro/rw sessions on demand

  • Methods for creating / dropping databases

  • Clone databases across backends

property backend

Return the backend used by this database (sqlite, postgres, etc.)

bake_sqlite(sqlite_file, **kwds)[source]

Dump a copy of this database to an sqlite file.

clone_database(dest_db_name=None, dest_db=None, overwrite=False, **kwds)[source]

Copy this database to a new one.

create_tables(tables=None, initialize=True)[source]

Create tables in the database from the ORM base specification.

A list of table names may optionally be specified to create only a subset of the known tables.

classmethod db_address(host, db_name=None, app_name=None)[source]

Return a complete address for DB access given a host (like postgres://user:pw@host) and database name.

Appends an app name to postgres addresses.
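
A sketch of the address construction this describes. The exact query-string form used for the app name is an assumption for illustration, not the library's verified behavior:

```python
def db_address(host, db_name=None, app_name=None):
    # Join host and database name, e.g. postgres://user:pw@host/my_db.
    addr = host.rstrip("/")
    if db_name is not None:
        addr += "/" + db_name
    # ASSUMPTION: the app name is shown here as the common
    # application_name connection parameter; the library's exact
    # format may differ.
    if app_name is not None and addr.startswith("postgres"):
        addr += "?application_name=" + app_name
    return addr
```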

classmethod dispose_all_engines()[source]

Dispose engines on all Database instances.

dispose_engines()[source]

Dispose any existing DB engines. This is necessary when forking to avoid accessing the same DB connection simultaneously from two processes.

drop_tables(tables=None)[source]

Drop a list of tables (or all ORM-defined tables, if no list is given) from this database.

property exists

Bool indicating whether this DB exists yet.

get_database(db_name)[source]

Return a new Database object with the same hosts and ORM base, but a different db name.

initialize_database()[source]

Optionally called after create_tables.

Initialize is _not_ called when cloning databases.

Default does nothing; subclasses may override.

static iter_copy_tables(source_db, dest_db, tables=None, skip_tables=(), skip_columns={}, skip_errors=False, vacuum=True)[source]

Iterator that copies all tables from one database to another.

Yields each table name as it is completed.

This function does not create tables in dest_db; use db.create_tables if needed.
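
Because this is an iterator, callers can report progress as each table completes. A schematic stand-in — the table names and "copying" below are faked for illustration; the real method reads rows from source_db and writes them to dest_db:

```python
def iter_copy_tables_sketch(source_tables, skip_tables=()):
    # Yield each table name as its (simulated) copy completes,
    # mirroring the progress-reporting pattern of iter_copy_tables.
    for name in source_tables:
        if name in skip_tables:
            continue
        yield name

# Progress reporting driven by the iterator:
copied = []
for name in iter_copy_tables_sketch(["experiment", "cell", "pair"], skip_tables=("cell",)):
    copied.append(name)
```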

property maint_engine

The maintenance engine.

For postgres DBs, this connects to the “postgres” database.

metadata_tables()[source]

Return an ordered dictionary (dependency-sorted) of {name:Table} pairs, one for each table in the sqlalchemy metadata for this database.

orm_tables()[source]

Return a dependency-sorted list of ORM mapping objects (tables) that are described by the ORM base for this database.

reset_db()[source]

Drop the existing database and initialize a new one.

property ro_engine

The read-only database engine.

property rw_engine

The read-write database engine.

session(readonly=True)[source]

Create and return a new database Session instance.

If readonly is True, then the session is created using read-only credentials and has autocommit enabled. This prevents idle-in-transaction timeouts that occur when GUI analysis tools would otherwise leave transactions open after each request.

table_names()[source]

Return a list of the names of tables in this database.

May contain names that are not present in metadata_tables or orm_tables.

vacuum(tables=None)[source]

Cleans up the database and analyzes table statistics in order to improve query planning. Should be run after any significant changes to the database.