infi.clickhouse_orm/docs/class_reference.md

Class Reference
===============
infi.clickhouse_orm.database
----------------------------
### Database
Database instances connect to a specific ClickHouse database for running queries,
inserting data and other operations.
#### Database(db_name, db_url="http://localhost:8123/", username=None, password=None, readonly=False, autocreate=True, timeout=60, verify_ssl_cert=True, log_statements=False)
Initializes a database instance. Unless it's readonly, the database will be
created on the ClickHouse server if it does not already exist.
- `db_name`: name of the database to connect to.
- `db_url`: URL of the ClickHouse server.
- `username`: optional connection credentials.
- `password`: optional connection credentials.
- `readonly`: use a read-only connection.
- `autocreate`: automatically create the database if it does not exist (unless in readonly mode).
- `timeout`: the connection timeout in seconds.
- `verify_ssl_cert`: whether to verify the server's certificate when connecting via HTTPS.
- `log_statements`: when True, all database statements are logged.
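As a minimal sketch (the database name `demo_db` is a placeholder, and a ClickHouse server is assumed to listen at the default local address):

```python
def connect(db_name="demo_db", url="http://localhost:8123/"):
    # Deferred import so the sketch can be read without a running server.
    from infi.clickhouse_orm.database import Database
    # autocreate=True (the default) creates the database if it is missing.
    return Database(db_name, db_url=url, log_statements=True)
```

Note that calling `connect()` performs an HTTP request, so it requires a reachable ClickHouse server.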
#### add_setting(name, value)
Adds a database setting that will be sent with every request.
For example, `db.add_setting("max_execution_time", 10)` will
limit query execution time to 10 seconds.
The name must be a string, and the value is converted to a string if
it isn't one already. To remove a setting, pass `None` as the value.
#### count(model_class, conditions=None)
Counts the number of records in the model's table.
- `model_class`: the model to count.
- `conditions`: optional SQL conditions (contents of the WHERE clause).
#### create_database()
Creates the database on the ClickHouse server if it does not already exist.
#### create_table(model_class)
Creates a table for the given model class, if it does not exist already.
#### does_table_exist(model_class)
Checks whether a table for the given model class already exists.
Note that this only checks for existence of a table with the expected name.
#### drop_database()
Deletes the database on the ClickHouse server.
#### drop_table(model_class)
Drops the database table of the given model class, if it exists.
#### get_model_for_table(table_name, system_table=False)
Generates a model class from an existing table in the database.
This can be used for querying tables which don't have a corresponding model class,
for example system tables.
- `table_name`: the table to create a model for.
- `system_table`: whether the table is a system table, or belongs to the current database.
#### insert(model_instances, batch_size=1000)
Insert records into the database.
- `model_instances`: any iterable containing instances of a single model class.
- `batch_size`: number of records to send per chunk (use a lower number if your records are very large).
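Since any iterable is accepted, a generator lets you stream a large dataset without materializing it. A hypothetical helper (`make_row` is assumed to build one model instance per call):

```python
def insert_stream(db, make_row, n, batch_size=2000):
    # make_row(i) must return an instance of a single model class.
    rows = (make_row(i) for i in range(n))
    # Use a smaller batch_size if individual records are very large.
    db.insert(rows, batch_size=batch_size)
```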
#### migrate(migrations_package_name, up_to=9999)
Executes schema migrations.
- `migrations_package_name`: fully qualified name of the Python package
containing the migrations.
- `up_to`: number of the last migration to apply.
#### paginate(model_class, order_by, page_num=1, page_size=100, conditions=None, settings=None)
Selects records and returns a single page of model instances.
- `model_class`: the model class matching the query's table,
or `None` for getting back instances of an ad-hoc model.
- `order_by`: columns to use for sorting the query (contents of the ORDER BY clause).
- `page_num`: the page number (1-based), or -1 to get the last page.
- `page_size`: number of records to return per page.
- `conditions`: optional SQL conditions (contents of the WHERE clause).
- `settings`: query settings to send as HTTP GET parameters.
The result is a namedtuple containing `objects` (list), `number_of_objects`,
`pages_total`, `number` (of the current page), and `page_size`.
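Using the namedtuple fields above, walking all pages can be sketched like this (a hypothetical helper, not part of the library):

```python
def iter_pages(db, model_cls, order_by, page_size=100):
    # Yields every matching object, fetching one page at a time.
    page_num = 1
    while True:
        page = db.paginate(model_cls, order_by,
                           page_num=page_num, page_size=page_size)
        for obj in page.objects:
            yield obj
        if page.number >= page.pages_total:
            break
        page_num = page.number + 1
```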
#### raw(query, settings=None, stream=False)
Performs a query and returns its output as text.
- `query`: the SQL query to execute.
- `settings`: query settings to send as HTTP GET parameters.
- `stream`: if true, the HTTP response from ClickHouse will be streamed.
#### select(query, model_class=None, settings=None)
Performs a query and returns a generator of model instances.
- `query`: the SQL query to execute.
- `model_class`: the model class matching the query's table,
or `None` for getting back instances of an ad-hoc model.
- `settings`: query settings to send as HTTP GET parameters.
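A sketch of an ad-hoc query: `$db` is expanded by the ORM to the current database name, and with `model_class=None` each result row exposes the selected columns as attributes. The table and column names here are hypothetical:

```python
def top_cpus(db, limit=5):
    # Aggregate query over a hypothetical "cpustats" table.
    query = ("SELECT cpu_id, avg(cpu_percent) AS avg_pct "
             "FROM $db.cpustats GROUP BY cpu_id "
             "ORDER BY avg_pct DESC LIMIT %d" % limit)
    return [(row.cpu_id, row.avg_pct) for row in db.select(query)]
```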
### DatabaseException
Extends Exception
Raised when a database operation fails.
infi.clickhouse_orm.models
--------------------------
### Model
A base class for ORM models. Each model class represents a ClickHouse table. For example:

    class CPUStats(Model):
        timestamp = DateTimeField()
        cpu_id = UInt16Field()
        cpu_percent = Float32Field()
        engine = Memory()

#### Model(**kwargs)
Creates a model instance, using keyword arguments as field values.
Since values are immediately converted to their Pythonic type,
invalid values will cause a `ValueError` to be raised.
Unrecognized field names will cause an `AttributeError`.
#### Model.create_table_sql(db)
Returns the SQL statement for creating a table for this model.
#### Model.drop_table_sql(db)
Returns the SQL command for deleting this model's table.
#### Model.fields(writable=False)
Returns an `OrderedDict` of the model's fields (from name to `Field` instance).
If `writable` is true, only writable fields are included.
Callers should not modify the dictionary.
#### Model.from_tsv(line, field_names, timezone_in_use=UTC, database=None)
Create a model instance from a tab-separated line. The line may or may not include a newline.
The `field_names` list must match the fields defined in the model, but does not have to include all of them.
- `line`: the TSV-formatted data.
- `field_names`: names of the model fields in the data.
- `timezone_in_use`: the timezone to use when parsing dates and datetimes. Some fields use their own timezones.
- `database`: if given, sets the database that this instance belongs to.
#### get_database()
Gets the `Database` that this model instance belongs to.
Returns `None` unless the instance was read from the database or written to it.
#### get_field(name)
Gets a `Field` instance given its name, or `None` if not found.
#### Model.has_funcs_as_defaults()
Returns True if any of the model's fields use a function expression
as a default value. This requires special handling when inserting instances.
#### Model.is_read_only()
Returns true if the model is marked as read only.
#### Model.is_system_model()
Returns true if the model represents a system table.
#### Model.objects_in(database)
Returns a `QuerySet` for selecting instances of this model class.
#### set_database(db)
Sets the `Database` that this model instance belongs to.
This is done automatically when the instance is read from the database or written to it.
#### Model.table_name()
Returns the model's database table name. By default this is the
class name converted to lowercase. Override this if you want to use
a different table name.
#### to_db_string()
Returns the instance as a bytestring ready to be inserted into the database.
#### to_dict(include_readonly=True, field_names=None)
Returns the instance's column values as a dict.
- `include_readonly`: if false, returns only fields that can be inserted into database.
- `field_names`: an iterable of field names to return (optional)
#### to_tskv(include_readonly=True)
Returns the instance's column keys and values as a tab-separated line. A newline is not included.
Fields that were not assigned a value are omitted.
- `include_readonly`: if false, returns only fields that can be inserted into database.
#### to_tsv(include_readonly=True)
Returns the instance's column values as a tab-separated line. A newline is not included.
- `include_readonly`: if false, returns only fields that can be inserted into database.
### BufferModel
Extends Model
#### BufferModel(**kwargs)
Creates a model instance, using keyword arguments as field values.
Since values are immediately converted to their Pythonic type,
invalid values will cause a `ValueError` to be raised.
Unrecognized field names will cause an `AttributeError`.
#### BufferModel.create_table_sql(db)
Returns the SQL statement for creating a table for this model.
#### BufferModel.drop_table_sql(db)
Returns the SQL command for deleting this model's table.
#### BufferModel.fields(writable=False)
Returns an `OrderedDict` of the model's fields (from name to `Field` instance).
If `writable` is true, only writable fields are included.
Callers should not modify the dictionary.
#### BufferModel.from_tsv(line, field_names, timezone_in_use=UTC, database=None)
Create a model instance from a tab-separated line. The line may or may not include a newline.
The `field_names` list must match the fields defined in the model, but does not have to include all of them.
- `line`: the TSV-formatted data.
- `field_names`: names of the model fields in the data.
- `timezone_in_use`: the timezone to use when parsing dates and datetimes. Some fields use their own timezones.
- `database`: if given, sets the database that this instance belongs to.
#### get_database()
Gets the `Database` that this model instance belongs to.
Returns `None` unless the instance was read from the database or written to it.
#### get_field(name)
Gets a `Field` instance given its name, or `None` if not found.
#### BufferModel.has_funcs_as_defaults()
Returns True if any of the model's fields use a function expression
as a default value. This requires special handling when inserting instances.
#### BufferModel.is_read_only()
Returns true if the model is marked as read only.
#### BufferModel.is_system_model()
Returns true if the model represents a system table.
#### BufferModel.objects_in(database)
Returns a `QuerySet` for selecting instances of this model class.
#### set_database(db)
Sets the `Database` that this model instance belongs to.
This is done automatically when the instance is read from the database or written to it.
#### BufferModel.table_name()
Returns the model's database table name. By default this is the
class name converted to lowercase. Override this if you want to use
a different table name.
#### to_db_string()
Returns the instance as a bytestring ready to be inserted into the database.
#### to_dict(include_readonly=True, field_names=None)
Returns the instance's column values as a dict.
- `include_readonly`: if false, returns only fields that can be inserted into database.
- `field_names`: an iterable of field names to return (optional)
#### to_tskv(include_readonly=True)
Returns the instance's column keys and values as a tab-separated line. A newline is not included.
Fields that were not assigned a value are omitted.
- `include_readonly`: if false, returns only fields that can be inserted into database.
#### to_tsv(include_readonly=True)
Returns the instance's column values as a tab-separated line. A newline is not included.
- `include_readonly`: if false, returns only fields that can be inserted into database.
### MergeModel
Extends Model
Model for the Merge engine.
Predefines a virtual `_table` column, and ensures that rows cannot be inserted into this table type.
https://clickhouse.tech/docs/en/single/index.html#document-table_engines/merge
#### MergeModel(**kwargs)
Creates a model instance, using keyword arguments as field values.
Since values are immediately converted to their Pythonic type,
invalid values will cause a `ValueError` to be raised.
Unrecognized field names will cause an `AttributeError`.
#### MergeModel.create_table_sql(db)
Returns the SQL statement for creating a table for this model.
#### MergeModel.drop_table_sql(db)
Returns the SQL command for deleting this model's table.
#### MergeModel.fields(writable=False)
Returns an `OrderedDict` of the model's fields (from name to `Field` instance).
If `writable` is true, only writable fields are included.
Callers should not modify the dictionary.
#### MergeModel.from_tsv(line, field_names, timezone_in_use=UTC, database=None)
Create a model instance from a tab-separated line. The line may or may not include a newline.
The `field_names` list must match the fields defined in the model, but does not have to include all of them.
- `line`: the TSV-formatted data.
- `field_names`: names of the model fields in the data.
- `timezone_in_use`: the timezone to use when parsing dates and datetimes. Some fields use their own timezones.
- `database`: if given, sets the database that this instance belongs to.
#### get_database()
Gets the `Database` that this model instance belongs to.
Returns `None` unless the instance was read from the database or written to it.
#### get_field(name)
Gets a `Field` instance given its name, or `None` if not found.
#### MergeModel.has_funcs_as_defaults()
Returns True if any of the model's fields use a function expression
as a default value. This requires special handling when inserting instances.
#### MergeModel.is_read_only()
Returns true if the model is marked as read only.
#### MergeModel.is_system_model()
Returns true if the model represents a system table.
#### MergeModel.objects_in(database)
Returns a `QuerySet` for selecting instances of this model class.
#### set_database(db)
Sets the `Database` that this model instance belongs to.
This is done automatically when the instance is read from the database or written to it.
#### MergeModel.table_name()
Returns the model's database table name. By default this is the
class name converted to lowercase. Override this if you want to use
a different table name.
#### to_db_string()
Returns the instance as a bytestring ready to be inserted into the database.
#### to_dict(include_readonly=True, field_names=None)
Returns the instance's column values as a dict.
- `include_readonly`: if false, returns only fields that can be inserted into database.
- `field_names`: an iterable of field names to return (optional)
#### to_tskv(include_readonly=True)
Returns the instance's column keys and values as a tab-separated line. A newline is not included.
Fields that were not assigned a value are omitted.
- `include_readonly`: if false, returns only fields that can be inserted into database.
#### to_tsv(include_readonly=True)
Returns the instance's column values as a tab-separated line. A newline is not included.
- `include_readonly`: if false, returns only fields that can be inserted into database.
### DistributedModel
Extends Model
Model class for use with a `Distributed` engine.
#### DistributedModel(**kwargs)
Creates a model instance, using keyword arguments as field values.
Since values are immediately converted to their Pythonic type,
invalid values will cause a `ValueError` to be raised.
Unrecognized field names will cause an `AttributeError`.
#### DistributedModel.create_table_sql(db)
Returns the SQL statement for creating a table for this model.
#### DistributedModel.drop_table_sql(db)
Returns the SQL command for deleting this model's table.
#### DistributedModel.fields(writable=False)
Returns an `OrderedDict` of the model's fields (from name to `Field` instance).
If `writable` is true, only writable fields are included.
Callers should not modify the dictionary.
#### DistributedModel.fix_engine_table()
Remember: a Distributed table does not store any data itself; it only provides distributed access to it.
So if we define a model whose engine has no underlying table for data storage
(see FooDistributed below), that table cannot be successfully created.
This routine automatically fixes the engine's storage table by finding the first
non-distributed model among your model's superclasses:

    >>> class Foo(Model):
    ...     id = UInt8Field(1)
    ...
    >>> class FooDistributed(Foo, DistributedModel):
    ...     engine = Distributed('my_cluster')
    ...
    >>> FooDistributed.engine.table
    None
    >>> FooDistributed.fix_engine_table()
    >>> FooDistributed.engine.table
    <class '__main__.Foo'>

If you prefer a more explicit way of doing things, you can always
mention the Foo model twice without bothering with any fixes:

    >>> class FooDistributedVerbose(Foo, DistributedModel):
    ...     engine = Distributed('my_cluster', Foo)
    >>> FooDistributedVerbose.engine.table
    <class '__main__.Foo'>

See tests.test_engines:DistributedTestCase for more examples.
#### DistributedModel.from_tsv(line, field_names, timezone_in_use=UTC, database=None)
Create a model instance from a tab-separated line. The line may or may not include a newline.
The `field_names` list must match the fields defined in the model, but does not have to include all of them.
- `line`: the TSV-formatted data.
- `field_names`: names of the model fields in the data.
- `timezone_in_use`: the timezone to use when parsing dates and datetimes. Some fields use their own timezones.
- `database`: if given, sets the database that this instance belongs to.
#### get_database()
Gets the `Database` that this model instance belongs to.
Returns `None` unless the instance was read from the database or written to it.
#### get_field(name)
Gets a `Field` instance given its name, or `None` if not found.
#### DistributedModel.has_funcs_as_defaults()
Returns True if any of the model's fields use a function expression
as a default value. This requires special handling when inserting instances.
#### DistributedModel.is_read_only()
Returns true if the model is marked as read only.
#### DistributedModel.is_system_model()
Returns true if the model represents a system table.
#### DistributedModel.objects_in(database)
Returns a `QuerySet` for selecting instances of this model class.
#### set_database(db)
Sets the `Database` that this model instance belongs to.
This is done automatically when the instance is read from the database or written to it.
#### DistributedModel.table_name()
Returns the model's database table name. By default this is the
class name converted to lowercase. Override this if you want to use
a different table name.
#### to_db_string()
Returns the instance as a bytestring ready to be inserted into the database.
#### to_dict(include_readonly=True, field_names=None)
Returns the instance's column values as a dict.
- `include_readonly`: if false, returns only fields that can be inserted into database.
- `field_names`: an iterable of field names to return (optional)
#### to_tskv(include_readonly=True)
Returns the instance's column keys and values as a tab-separated line. A newline is not included.
Fields that were not assigned a value are omitted.
- `include_readonly`: if false, returns only fields that can be inserted into database.
#### to_tsv(include_readonly=True)
Returns the instance's column values as a tab-separated line. A newline is not included.
- `include_readonly`: if false, returns only fields that can be inserted into database.
### Constraint
Defines a model constraint.
#### Constraint(expr)
Initializer. Expects an expression that ClickHouse will verify when inserting data.
#### create_table_sql()
Returns the SQL statement for defining this constraint during table creation.
### Index
Defines a data-skipping index.
#### Index(expr, type, granularity)
Initializer.
- `expr` - a column, expression, or tuple of columns and expressions to index.
- `type` - the index type. Use one of the following methods to specify the type:
`Index.minmax`, `Index.set`, `Index.ngrambf_v1`, `Index.tokenbf_v1` or `Index.bloom_filter`.
- `granularity` - index block size (number of multiples of the `index_granularity` defined by the engine).
#### bloom_filter(false_positive=0.025)
An index that stores a Bloom filter containing values of the index expression.
- `false_positive` — the probability (between 0 and 1) of receiving a false positive
response from the filter.
#### create_table_sql()
Returns the SQL statement for defining this index during table creation.
#### minmax()
An index that stores extremes of the specified expression (if the expression is a tuple, it stores
extremes for each element of the tuple). The stored info is used for skipping blocks of data like the primary key.
#### ngrambf_v1(n, size_of_bloom_filter_in_bytes, number_of_hash_functions, random_seed)
An index that stores a Bloom filter containing all ngrams from a block of data.
Works only with strings. Can be used to optimize `equals`, `LIKE` and `IN` expressions.
- `n` — ngram size
- `size_of_bloom_filter_in_bytes` — Bloom filter size in bytes (you can use large values here,
for example 256 or 512, because it can be compressed well).
- `number_of_hash_functions` — The number of hash functions used in the Bloom filter.
- `random_seed` — The seed for Bloom filter hash functions.
#### set(max_rows)
An index that stores unique values of the specified expression (no more than `max_rows` rows,
or unlimited if `max_rows` is 0). Uses the values to check if the WHERE expression is not satisfiable
on a block of data.
#### tokenbf_v1(size_of_bloom_filter_in_bytes, number_of_hash_functions, random_seed)
An index that stores a Bloom filter containing string tokens. Tokens are sequences
separated by non-alphanumeric characters.
- `size_of_bloom_filter_in_bytes` — Bloom filter size in bytes (you can use large values here,
for example 256 or 512, because it can be compressed well).
- `number_of_hash_functions` — The number of hash functions used in the Bloom filter.
- `random_seed` — The seed for Bloom filter hash functions.
infi.clickhouse_orm.fields
--------------------------
### ArrayField
Extends Field
#### ArrayField(inner_field, default=None, alias=None, materialized=None, readonly=None, codec=None)
### BaseEnumField
Extends Field
Abstract base class for all enum-type fields.
#### BaseEnumField(enum_cls, default=None, alias=None, materialized=None, readonly=None, codec=None)
### BaseFloatField
Extends Field
Abstract base class for all float-type fields.
#### BaseFloatField(default=None, alias=None, materialized=None, readonly=None, codec=None)
### BaseIntField
Extends Field
Abstract base class for all integer-type fields.
#### BaseIntField(default=None, alias=None, materialized=None, readonly=None, codec=None)
### DateField
Extends Field
#### DateField(default=None, alias=None, materialized=None, readonly=None, codec=None)
### DateTime64Field
Extends DateTimeField
#### DateTime64Field(default=None, alias=None, materialized=None, readonly=None, codec=None, timezone=None, precision=6)
### DateTimeField
Extends Field
#### DateTimeField(default=None, alias=None, materialized=None, readonly=None, codec=None, timezone=None)
### Decimal128Field
Extends DecimalField
#### Decimal128Field(scale, default=None, alias=None, materialized=None, readonly=None)
### Decimal32Field
Extends DecimalField
#### Decimal32Field(scale, default=None, alias=None, materialized=None, readonly=None)
### Decimal64Field
Extends DecimalField
#### Decimal64Field(scale, default=None, alias=None, materialized=None, readonly=None)
### DecimalField
Extends Field
Base class for all decimal fields. Can also be used directly.
#### DecimalField(precision, scale, default=None, alias=None, materialized=None, readonly=None)
### Enum16Field
Extends BaseEnumField
#### Enum16Field(enum_cls, default=None, alias=None, materialized=None, readonly=None, codec=None)
### Enum8Field
Extends BaseEnumField
#### Enum8Field(enum_cls, default=None, alias=None, materialized=None, readonly=None, codec=None)
### Field
Extends FunctionOperatorsMixin
Abstract base class for all field types.
#### Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### FixedStringField
Extends StringField
#### FixedStringField(length, default=None, alias=None, materialized=None, readonly=None)
### Float32Field
Extends BaseFloatField
#### Float32Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### Float64Field
Extends BaseFloatField
#### Float64Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### IPv4Field
Extends Field
#### IPv4Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### IPv6Field
Extends Field
#### IPv6Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### Int16Field
Extends BaseIntField
#### Int16Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### Int32Field
Extends BaseIntField
#### Int32Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### Int64Field
Extends BaseIntField
#### Int64Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### Int8Field
Extends BaseIntField
#### Int8Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### LowCardinalityField
Extends Field
#### LowCardinalityField(inner_field, default=None, alias=None, materialized=None, readonly=None, codec=None)
### NullableField
Extends Field
#### NullableField(inner_field, default=None, alias=None, materialized=None, extra_null_values=None, codec=None)
### StringField
Extends Field
#### StringField(default=None, alias=None, materialized=None, readonly=None, codec=None)
### UInt16Field
Extends BaseIntField
#### UInt16Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### UInt32Field
Extends BaseIntField
#### UInt32Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### UInt64Field
Extends BaseIntField
#### UInt64Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### UInt8Field
Extends BaseIntField
#### UInt8Field(default=None, alias=None, materialized=None, readonly=None, codec=None)
### UUIDField
Extends Field
#### UUIDField(default=None, alias=None, materialized=None, readonly=None, codec=None)
infi.clickhouse_orm.engines
---------------------------
### Engine
### TinyLog
Extends Engine
### Log
Extends Engine
### Memory
Extends Engine
### MergeTree
Extends Engine
#### MergeTree(date_col=None, order_by=(), sampling_expr=None, index_granularity=8192, replica_table_path=None, replica_name=None, partition_key=None, primary_key=None)
2017-05-05 15:31:08 +03:00
### Buffer
Extends Engine
Buffers the data to write in RAM, periodically flushing it to another table.
Must be used in conjunction with a `BufferModel`.
Read more [here](https://clickhouse.tech/docs/en/engines/table-engines/special/buffer/).
#### Buffer(main_model, num_layers=16, min_time=10, max_time=100, min_rows=10000, max_rows=1000000, min_bytes=10000000, max_bytes=100000000)
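The flushing behavior follows the rule described in the ClickHouse documentation: the buffer is flushed once all of the `min` thresholds are met, or as soon as any single `max` threshold is exceeded. A minimal sketch of that rule (the function and argument names here are illustrative, not part of the ORM API):

```python
# Illustrative sketch of the Buffer flush rule, using the engine's
# default thresholds. Not part of the ORM API.
def should_flush(seconds, rows, bytes_,
                 min_time=10, max_time=100,
                 min_rows=10000, max_rows=1000000,
                 min_bytes=10000000, max_bytes=100000000):
    # Flush when ALL min thresholds are met...
    all_min = seconds >= min_time and rows >= min_rows and bytes_ >= min_bytes
    # ...or when ANY max threshold is exceeded.
    any_max = seconds >= max_time or rows >= max_rows or bytes_ >= max_bytes
    return all_min or any_max

print(should_flush(seconds=120, rows=5, bytes_=100))  # True (max_time reached)
print(should_flush(seconds=30, rows=5, bytes_=100))   # False
```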
### Merge
Extends Engine
The Merge engine (not to be confused with MergeTree) does not store data itself,
but allows reading from any number of other tables simultaneously.
Writing to a table is not supported.
https://clickhouse.tech/docs/en/engines/table-engines/special/merge/
#### Merge(table_regex)
### Distributed
Extends Engine
The Distributed engine by itself does not store data,
but allows distributed query processing on multiple servers.
Reading is automatically parallelized.
During a read, the table indexes on remote servers are used, if there are any.
See the full documentation here:
https://clickhouse.tech/docs/en/engines/table-engines/special/distributed/
#### Distributed(cluster, table=None, sharding_key=None)
- `cluster`: the cluster to access data from.
- `table`: the underlying table that actually stores the data.
If you do not specify a table here, ensure that it can be inferred
from your model's superclass (see models.DistributedModel.fix_engine_table).
- `sharding_key`: how to distribute data among shards when inserting
directly into the Distributed table; optional.
### CollapsingMergeTree
Extends MergeTree
#### CollapsingMergeTree(date_col=None, order_by=(), sign_col="sign", sampling_expr=None, index_granularity=8192, replica_table_path=None, replica_name=None, partition_key=None, primary_key=None)
### SummingMergeTree
Extends MergeTree
#### SummingMergeTree(date_col=None, order_by=(), summing_cols=None, sampling_expr=None, index_granularity=8192, replica_table_path=None, replica_name=None, partition_key=None, primary_key=None)
### ReplacingMergeTree
Extends MergeTree
#### ReplacingMergeTree(date_col=None, order_by=(), ver_col=None, sampling_expr=None, index_granularity=8192, replica_table_path=None, replica_name=None, partition_key=None, primary_key=None)
infi.clickhouse_orm.query
-------------------------
### QuerySet
A queryset is an object that represents a database query using a specific `Model`.
It is lazy, meaning that it does not hit the database until you iterate over its
matching rows (model instances).
#### QuerySet(model_cls, database)
Initializer. It is possible to create a queryset like this, but the standard
way is to use `MyModel.objects_in(database)`.
#### aggregate(*args, **kwargs)
Returns an `AggregateQuerySet` over this query, with `args` serving as
grouping fields and `kwargs` serving as calculated fields. At least one
calculated field is required. For example:
```
Event.objects_in(database).filter(date__gt='2017-08-01').aggregate('event_type', count='count()')
```
is equivalent to:
```
SELECT event_type, count() AS count FROM event
WHERE date > '2017-08-01'
GROUP BY event_type
```
#### as_sql()
Returns the whole query as a SQL string.
#### conditions_as_sql(prewhere=False)
Returns the contents of the query's `WHERE` or `PREWHERE` clause as a string.
#### count()
Returns the number of matching model instances.
#### delete()
Deletes all records matched by this queryset's conditions.
Note that ClickHouse performs deletions in the background, so they are not immediate.
#### distinct()
Adds a DISTINCT clause to the query, meaning that any duplicate rows
in the results will be omitted.
#### exclude(*q, **kwargs)
Returns a copy of this queryset that excludes all rows matching the conditions.
Pass `prewhere=True` to apply the conditions as PREWHERE instead of WHERE.
#### filter(*q, **kwargs)
Returns a copy of this queryset that includes only rows matching the conditions.
Pass `prewhere=True` to apply the conditions as PREWHERE instead of WHERE.
#### final()
Adds a FINAL modifier to the query, meaning data will be collapsed to its final version.
Can be used with the `CollapsingMergeTree` and `ReplacingMergeTree` engines only.
#### limit_by(offset_limit, *fields_or_expr)
Adds a LIMIT BY clause to the query.
- `offset_limit`: either an integer specifying the limit, or a tuple of integers (offset, limit).
- `fields_or_expr`: the field names or expressions to use in the clause.
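As a rough sketch of the SQL this clause maps to, an integer `offset_limit` renders as `LIMIT n BY ...`, while an `(offset, limit)` tuple renders as `LIMIT offset, n BY ...`. A hypothetical helper (not the library's implementation) illustrating that mapping:

```python
# Hypothetical sketch of rendering a LIMIT BY clause from the
# offset_limit argument; not the library's actual code.
def limit_by_sql(offset_limit, *fields):
    if isinstance(offset_limit, int):
        # A bare integer means no offset.
        offset_limit = (0, offset_limit)
    offset, limit = offset_limit
    prefix = '%d, %d' % (offset, limit) if offset else str(limit)
    return 'LIMIT %s BY %s' % (prefix, ', '.join(fields))

print(limit_by_sql(2, 'domain'))       # LIMIT 2 BY domain
print(limit_by_sql((1, 2), 'domain'))  # LIMIT 1, 2 BY domain
```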
#### only(*field_names)
Returns a copy of this queryset limited to the specified field names.
Useful when there are large fields that are not needed,
or for creating a subquery to use with an IN operator.
#### order_by(*field_names)
Returns a copy of this queryset with the ordering changed.
#### order_by_as_sql()
Returns the contents of the query's `ORDER BY` clause as a string.
#### paginate(page_num=1, page_size=100)
Returns a single page of model instances that match the queryset.
Note that `order_by` should be used first, to ensure a correct
partitioning of records into pages.
- `page_num`: the page number (1-based), or -1 to get the last page.
- `page_size`: number of records to return per page.
The result is a namedtuple containing `objects` (list), `number_of_objects`,
`pages_total`, `number` (of the current page), and `page_size`.
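The page arithmetic can be sketched as follows; this is a hypothetical helper mirroring the computation described above, not the library's actual code:

```python
from math import ceil

def page_bounds(page_num, page_size, number_of_objects):
    # Map a 1-based page number (or -1 for the last page) to a
    # (start, stop) slice over the ordered result set.
    pages_total = max(1, ceil(number_of_objects / page_size))
    if page_num == -1:
        page_num = pages_total
    if page_num < 1:
        raise ValueError('Invalid page number: %d' % page_num)
    start = (page_num - 1) * page_size
    stop = min(start + page_size, number_of_objects)
    return page_num, pages_total, start, stop

print(page_bounds(-1, 100, 250))  # (3, 3, 200, 250) -- the last page
```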
#### select_fields_as_sql()
Returns the selected fields or expressions as a SQL string.
#### update(**kwargs)
Updates all records matched by this queryset's conditions.
Keyword arguments specify the field names and expressions to use for the update.
Note that ClickHouse performs updates in the background, so they are not immediate.
### AggregateQuerySet
Extends QuerySet
A queryset used for aggregation.
#### AggregateQuerySet(base_qs, grouping_fields, calculated_fields)
Initializer. Normally you should not call this but rather use `QuerySet.aggregate()`.
The grouping fields should be a list/tuple of field names from the model. For example:
```
('event_type', 'event_subtype')
```
The calculated fields should be a mapping from name to a ClickHouse aggregation function. For example:
```
{'weekday': 'toDayOfWeek(event_date)', 'number_of_events': 'count()'}
```
At least one calculated field is required.
#### aggregate(*args, **kwargs)
This method is not supported on `AggregateQuerySet`.
#### as_sql()
Returns the whole query as a SQL string.
#### conditions_as_sql(prewhere=False)
Returns the contents of the query's `WHERE` or `PREWHERE` clause as a string.
#### count()
Returns the number of rows after aggregation.
#### delete()
Deletes all records matched by this queryset's conditions.
Note that ClickHouse performs deletions in the background, so they are not immediate.
#### distinct()
Adds a DISTINCT clause to the query, meaning that any duplicate rows
in the results will be omitted.
#### exclude(*q, **kwargs)
Returns a copy of this queryset that excludes all rows matching the conditions.
Pass `prewhere=True` to apply the conditions as PREWHERE instead of WHERE.
#### filter(*q, **kwargs)
Returns a copy of this queryset that includes only rows matching the conditions.
Pass `prewhere=True` to apply the conditions as PREWHERE instead of WHERE.
#### final()
Adds a FINAL modifier to the query, meaning data will be collapsed to its final version.
Can be used with the `CollapsingMergeTree` and `ReplacingMergeTree` engines only.
#### group_by(*args)
This method lets you specify the grouping fields explicitly. The `args` must
be names of grouping fields or calculated fields that this queryset was
created with.
#### limit_by(offset_limit, *fields_or_expr)
Adds a LIMIT BY clause to the query.
- `offset_limit`: either an integer specifying the limit, or a tuple of integers (offset, limit).
- `fields_or_expr`: the field names or expressions to use in the clause.
#### only(*field_names)
This method is not supported on `AggregateQuerySet`.
#### order_by(*field_names)
Returns a copy of this queryset with the ordering changed.
#### order_by_as_sql()
Returns the contents of the query's `ORDER BY` clause as a string.
#### paginate(page_num=1, page_size=100)
Returns a single page of model instances that match the queryset.
Note that `order_by` should be used first, to ensure a correct
partitioning of records into pages.
- `page_num`: the page number (1-based), or -1 to get the last page.
- `page_size`: number of records to return per page.
The result is a namedtuple containing `objects` (list), `number_of_objects`,
`pages_total`, `number` (of the current page), and `page_size`.
#### select_fields_as_sql()
Returns the selected fields or expressions as a SQL string.
#### update(**kwargs)
Updates all records matched by this queryset's conditions.
Keyword arguments specify the field names and expressions to use for the update.
Note that ClickHouse performs updates in the background, so they are not immediate.
#### with_totals()
Adds a WITH TOTALS modifier to GROUP BY, making the query return an extra row
with the aggregate functions calculated across all the rows. More information:
https://clickhouse.tech/docs/en/query_language/select/#with-totals-modifier
### Q
Represents a query condition. `Q` objects can be combined using the `&` (AND) and `|` (OR) operators, and negated using `~`.
#### Q(*filter_funcs, **filter_fields)
#### to_sql(model_cls)
infi.clickhouse_orm.funcs
-------------------------
### F
Extends Cond, FunctionOperatorsMixin
Represents a database function call and its arguments.
It doubles as a query condition when the function returns a boolean result.
#### CAST(type)
#### CRC32()
#### IPv4CIDRToRange(cidr)
#### IPv4NumToString()
#### IPv4NumToStringClassC()
#### IPv4StringToNum()
#### IPv4ToIPv6()
#### IPv6CIDRToRange(cidr)
#### IPv6NumToString()
#### IPv6StringToNum()
#### MD5()
#### SHA1()
#### SHA224()
#### SHA256()
#### URLHash(n=None)
#### UUIDNumToString()
#### UUIDStringToNum()
#### F(name, *args)
Initializer.
#### abs()
#### acos()
#### addDays(n, timezone=NO_VALUE)
#### addHours(n, timezone=NO_VALUE)
#### addMinutes(n, timezone=NO_VALUE)
#### addMonths(n, timezone=NO_VALUE)
#### addQuarters(n, timezone=NO_VALUE)
#### addSeconds(n, timezone=NO_VALUE)
#### addWeeks(n, timezone=NO_VALUE)
#### addYears(n, timezone=NO_VALUE)
#### alphaTokens()
#### any(**kwargs)
#### anyHeavy(**kwargs)
#### anyHeavyIf(cond)
#### anyHeavyOrDefault()
#### anyHeavyOrDefaultIf(cond)
#### anyHeavyOrNull()
#### anyHeavyOrNullIf(cond)
#### anyIf(cond)
#### anyLast(**kwargs)
#### anyLastIf(cond)
#### anyLastOrDefault()
#### anyLastOrDefaultIf(cond)
#### anyLastOrNull()
#### anyLastOrNullIf(cond)
#### anyOrDefault()
#### anyOrDefaultIf(cond)
#### anyOrNull()
#### anyOrNullIf(cond)
#### appendTrailingCharIfAbsent(c)
#### argMax(**kwargs)
#### argMaxIf(y, cond)
#### argMaxOrDefault(y)
#### argMaxOrDefaultIf(y, cond)
#### argMaxOrNull(y)
#### argMaxOrNullIf(y, cond)
#### argMin(**kwargs)
#### argMinIf(y, cond)
#### argMinOrDefault(y)
#### argMinOrDefaultIf(y, cond)
#### argMinOrNull(y)
#### argMinOrNullIf(y, cond)
#### array()
#### arrayConcat()
#### arrayDifference()
#### arrayDistinct()
#### arrayElement(n)
#### arrayEnumerate()
#### arrayEnumerateDense()
#### arrayEnumerateDenseRanked()
#### arrayEnumerateUniq()
#### arrayEnumerateUniqRanked()
#### arrayIntersect()
#### arrayJoin()
#### arrayPopBack()
#### arrayPopFront()
#### arrayPushBack(x)
#### arrayPushFront(x)
#### arrayReduce(*args)
#### arrayResize(size, extender=None)
#### arrayReverse()
#### arraySlice(offset, length=None)
#### arrayStringConcat(sep=None)
#### arrayUniq()
#### asin()
#### atan()
#### avg(**kwargs)
#### avgIf(cond)
#### avgOrDefault()
#### avgOrDefaultIf(cond)
#### avgOrNull()
#### avgOrNullIf(cond)
#### base64Decode()
#### base64Encode()
#### bitAnd(y)
#### bitNot()
#### bitOr(y)
#### bitRotateLeft(y)
#### bitRotateRight(y)
#### bitShiftLeft(y)
#### bitShiftRight(y)
#### bitTest(y)
#### bitTestAll(*args)
#### bitTestAny(*args)
#### bitXor(y)
#### bitmapAnd(y)
#### bitmapAndCardinality(y)
#### bitmapAndnot(y)
#### bitmapAndnotCardinality(y)
#### bitmapBuild()
#### bitmapCardinality()
#### bitmapContains(needle)
#### bitmapHasAll(y)
#### bitmapHasAny(y)
#### bitmapOr(y)
#### bitmapOrCardinality(y)
#### bitmapToArray()
#### bitmapXor(y)
#### bitmapXorCardinality(y)
#### bitmaskToArray()
#### bitmaskToList()
#### cbrt()
#### ceiling(n=None)
#### cityHash64()
#### coalesce()
#### concat()
#### convertCharset(from_charset, to_charset)
#### corr(**kwargs)
#### corrIf(y, cond)
#### corrOrDefault(y)
#### corrOrDefaultIf(y, cond)
#### corrOrNull(y)
#### corrOrNullIf(y, cond)
#### cos()
#### count(**kwargs)
#### countEqual(x)
#### countIf()
#### countOrDefault()
#### countOrDefaultIf()
#### countOrNull()
#### countOrNullIf()
#### covarPop(**kwargs)
#### covarPopIf(y, cond)
#### covarPopOrDefault(y)
#### covarPopOrDefaultIf(y, cond)
#### covarPopOrNull(y)
#### covarPopOrNullIf(y, cond)
#### covarSamp(**kwargs)
#### covarSampIf(y, cond)
#### covarSampOrDefault(y)
#### covarSampOrDefaultIf(y, cond)
#### covarSampOrNull(y)
#### covarSampOrNullIf(y, cond)
#### dictGet(attr_name, id_expr)
#### dictGetHierarchy(id_expr)
#### dictGetOrDefault(attr_name, id_expr, default)
#### dictHas(id_expr)
#### dictIsIn(child_id_expr, ancestor_id_expr)
#### divide(**kwargs)
#### e()
#### empty()
#### emptyArrayDate()
#### emptyArrayDateTime()
#### emptyArrayFloat32()
#### emptyArrayFloat64()
#### emptyArrayInt16()
#### emptyArrayInt32()
#### emptyArrayInt64()
#### emptyArrayInt8()
#### emptyArrayString()
#### emptyArrayToSingle()
#### emptyArrayUInt16()
#### emptyArrayUInt32()
#### emptyArrayUInt64()
#### emptyArrayUInt8()
#### endsWith(suffix)
#### equals(**kwargs)
#### erf()
#### erfc()
#### exp()
#### exp10()
#### exp2()
#### extract(pattern)
#### extractAll(pattern)
#### farmHash64()
#### floor(n=None)
#### formatDateTime(format, timezone="")
#### gcd(b)
#### generateUUIDv4()
#### greater(**kwargs)
#### greaterOrEquals(**kwargs)
#### greatest(y)
#### halfMD5()
#### has(x)
#### hasAll(x)
#### hasAny(x)
#### hex()
#### hiveHash()
#### ifNotFinite(y)
#### ifNull(y)
#### indexOf(x)
#### intDiv(b)
#### intDivOrZero(b)
#### intExp10()
#### intExp2()
#### intHash32()
#### intHash64()
#### isFinite()
#### isIn(others)
#### isInfinite()
#### isNaN()
#### isNotIn(others)
#### isNotNull()
#### isNull()
#### javaHash()
#### jumpConsistentHash(buckets)
#### kurtPop(**kwargs)
#### kurtPopIf(cond)
#### kurtPopOrDefault()
#### kurtPopOrDefaultIf(cond)
#### kurtPopOrNull()
#### kurtPopOrNullIf(cond)
#### kurtSamp(**kwargs)
#### kurtSampIf(cond)
#### kurtSampOrDefault()
#### kurtSampOrDefaultIf(cond)
#### kurtSampOrNull()
#### kurtSampOrNullIf(cond)
#### lcm(b)
#### least(y)
#### length(**kwargs)
#### lengthUTF8()
#### less(**kwargs)
#### lessOrEquals(**kwargs)
#### lgamma()
#### like(pattern)
#### log()
#### log10()
#### log2()
#### lower(**kwargs)
#### lowerUTF8()
#### match(pattern)
#### max(**kwargs)
#### maxIf(cond)
#### maxOrDefault()
#### maxOrDefaultIf(cond)
#### maxOrNull()
#### maxOrNullIf(cond)
#### metroHash64()
#### min(**kwargs)
#### minIf(cond)
#### minOrDefault()
#### minOrDefaultIf(cond)
#### minOrNull()
#### minOrNullIf(cond)
#### minus(**kwargs)
#### modulo(**kwargs)
#### multiply(**kwargs)
#### murmurHash2_32()
#### murmurHash2_64()
#### murmurHash3_128()
#### murmurHash3_32()
#### murmurHash3_64()
#### negate()
#### ngramDistance(**kwargs)
#### ngramDistanceCaseInsensitive(**kwargs)
#### ngramDistanceCaseInsensitiveUTF8(needle)
#### ngramDistanceUTF8(needle)
#### ngramSearch(**kwargs)
#### ngramSearchCaseInsensitive(**kwargs)
#### ngramSearchCaseInsensitiveUTF8(needle)
#### ngramSearchUTF8(needle)
#### notEmpty()
#### notEquals(**kwargs)
#### notLike(pattern)
#### now()
#### nullIf(y)
#### parseDateTimeBestEffort(**kwargs)
#### parseDateTimeBestEffortOrNull(timezone=NO_VALUE)
#### parseDateTimeBestEffortOrZero(timezone=NO_VALUE)
#### pi()
#### plus(**kwargs)
#### position(**kwargs)
#### positionCaseInsensitive(**kwargs)
#### positionCaseInsensitiveUTF8(needle)
#### positionUTF8(needle)
#### power(y)
#### quantile(**kwargs)
#### quantileDeterministic(**kwargs)
#### quantileDeterministicIf()
#### quantileDeterministicOrDefault()
#### quantileDeterministicOrDefaultIf()
#### quantileDeterministicOrNull()
#### quantileDeterministicOrNullIf()
#### quantileExact(**kwargs)
#### quantileExactIf()
#### quantileExactOrDefault()
#### quantileExactOrDefaultIf()
#### quantileExactOrNull()
#### quantileExactOrNullIf()
#### quantileExactWeighted(**kwargs)
#### quantileExactWeightedIf()
#### quantileExactWeightedOrDefault()
#### quantileExactWeightedOrDefaultIf()
#### quantileExactWeightedOrNull()
#### quantileExactWeightedOrNullIf()
#### quantileIf()
#### quantileOrDefault()
#### quantileOrDefaultIf()
#### quantileOrNull()
#### quantileOrNullIf()
#### quantileTDigest(**kwargs)
#### quantileTDigestIf()
#### quantileTDigestOrDefault()
#### quantileTDigestOrDefaultIf()
#### quantileTDigestOrNull()
#### quantileTDigestOrNullIf()
#### quantileTDigestWeighted(**kwargs)
#### quantileTDigestWeightedIf()
#### quantileTDigestWeightedOrDefault()
#### quantileTDigestWeightedOrDefaultIf()
#### quantileTDigestWeightedOrNull()
#### quantileTDigestWeightedOrNullIf()
#### quantileTiming(**kwargs)
#### quantileTimingIf()
#### quantileTimingOrDefault()
#### quantileTimingOrDefaultIf()
#### quantileTimingOrNull()
#### quantileTimingOrNullIf()
#### quantileTimingWeighted(**kwargs)
#### quantileTimingWeightedIf()
#### quantileTimingWeightedOrDefault()
#### quantileTimingWeightedOrDefaultIf()
#### quantileTimingWeightedOrNull()
#### quantileTimingWeightedOrNullIf()
#### quantiles(**kwargs)
#### quantilesDeterministic(**kwargs)
#### quantilesDeterministicIf()
#### quantilesDeterministicOrDefault()
#### quantilesDeterministicOrDefaultIf()
#### quantilesDeterministicOrNull()
#### quantilesDeterministicOrNullIf()
#### quantilesExact(**kwargs)
#### quantilesExactIf()
#### quantilesExactOrDefault()
#### quantilesExactOrDefaultIf()
#### quantilesExactOrNull()
#### quantilesExactOrNullIf()
#### quantilesExactWeighted(**kwargs)
#### quantilesExactWeightedIf()
#### quantilesExactWeightedOrDefault()
#### quantilesExactWeightedOrDefaultIf()
#### quantilesExactWeightedOrNull()
#### quantilesExactWeightedOrNullIf()
#### quantilesIf()
#### quantilesOrDefault()
#### quantilesOrDefaultIf()
#### quantilesOrNull()
#### quantilesOrNullIf()
#### quantilesTDigest(**kwargs)
#### quantilesTDigestIf()
#### quantilesTDigestOrDefault()
#### quantilesTDigestOrDefaultIf()
#### quantilesTDigestOrNull()
#### quantilesTDigestOrNullIf()
#### quantilesTDigestWeighted(**kwargs)
#### quantilesTDigestWeightedIf()
#### quantilesTDigestWeightedOrDefault()
#### quantilesTDigestWeightedOrDefaultIf()
#### quantilesTDigestWeightedOrNull()
#### quantilesTDigestWeightedOrNullIf()
#### quantilesTiming(**kwargs)
#### quantilesTimingIf()
#### quantilesTimingOrDefault()
#### quantilesTimingOrDefaultIf()
#### quantilesTimingOrNull()
#### quantilesTimingOrNullIf()
#### quantilesTimingWeighted(**kwargs)
#### quantilesTimingWeightedIf()
#### quantilesTimingWeightedOrDefault()
#### quantilesTimingWeightedOrDefaultIf()
#### quantilesTimingWeightedOrNull()
#### quantilesTimingWeightedOrNullIf()
#### rand()
#### rand64()
#### randConstant()
#### range()
#### regexpQuoteMeta()
#### replace(pattern, replacement)
#### replaceAll(pattern, replacement)
#### replaceOne(pattern, replacement)
#### replaceRegexpAll(pattern, replacement)
#### replaceRegexpOne(pattern, replacement)
#### reverse(**kwargs)
#### reverseUTF8()
#### round(n=None)
#### roundAge()
#### roundDown(y)
#### roundDuration()
#### roundToExp2()
#### sin()
#### sipHash128()
#### sipHash64()
#### skewPop(**kwargs)
#### skewPopIf(cond)
#### skewPopOrDefault()
#### skewPopOrDefaultIf(cond)
#### skewPopOrNull()
#### skewPopOrNullIf(cond)
#### skewSamp(**kwargs)
#### skewSampIf(cond)
#### skewSampOrDefault()
#### skewSampOrDefaultIf(cond)
#### skewSampOrNull()
#### skewSampOrNullIf(cond)
#### splitByChar(s)
#### splitByString(s)
#### sqrt()
#### startsWith(prefix)
#### substring(**kwargs)
#### substringUTF8(offset, length)
#### subtractDays(n, timezone=NO_VALUE)
#### subtractHours(n, timezone=NO_VALUE)
#### subtractMinutes(n, timezone=NO_VALUE)
#### subtractMonths(n, timezone=NO_VALUE)
#### subtractQuarters(n, timezone=NO_VALUE)
#### subtractSeconds(n, timezone=NO_VALUE)
#### subtractWeeks(n, timezone=NO_VALUE)
#### subtractYears(n, timezone=NO_VALUE)
#### sum(**kwargs)
#### sumIf(cond)
#### sumOrDefault()
#### sumOrDefaultIf(cond)
#### sumOrNull()
#### sumOrNullIf(cond)
#### tan()
#### tgamma()
#### timeSlot()
#### timeSlots(duration)
#### toDate(**kwargs)
#### toDateOrNull()
#### toDateOrZero()
#### toDateTime(**kwargs)
#### toDateTime64(**kwargs)
#### toDateTime64OrNull(precision, timezone=NO_VALUE)
#### toDateTime64OrZero(precision, timezone=NO_VALUE)
#### toDateTimeOrNull()
#### toDateTimeOrZero()
#### toDayOfMonth()
#### toDayOfWeek()
#### toDayOfYear()
#### toDecimal128(**kwargs)
#### toDecimal128OrNull(scale)
#### toDecimal128OrZero(scale)
#### toDecimal32(**kwargs)
#### toDecimal32OrNull(scale)
#### toDecimal32OrZero(scale)
#### toDecimal64(**kwargs)
#### toDecimal64OrNull(scale)
#### toDecimal64OrZero(scale)
#### toFixedString(length)
#### toFloat32(**kwargs)
#### toFloat32OrNull()
#### toFloat32OrZero()
#### toFloat64(**kwargs)
#### toFloat64OrNull()
#### toFloat64OrZero()
#### toHour()
#### toIPv4()
#### toIPv6()
#### toISOWeek(timezone="")
#### toISOYear(timezone="")
#### toInt16(**kwargs)
#### toInt16OrNull()
#### toInt16OrZero()
#### toInt32(**kwargs)
#### toInt32OrNull()
#### toInt32OrZero()
#### toInt64(**kwargs)
#### toInt64OrNull()
#### toInt64OrZero()
#### toInt8(**kwargs)
#### toInt8OrNull()
#### toInt8OrZero()
#### toIntervalDay()
#### toIntervalHour()
#### toIntervalMinute()
#### toIntervalMonth()
#### toIntervalQuarter()
#### toIntervalSecond()
#### toIntervalWeek()
#### toIntervalYear()
#### toMinute()
#### toMonday()
#### toMonth()
#### toQuarter(timezone="")
#### toRelativeDayNum(timezone="")
#### toRelativeHourNum(timezone="")
#### toRelativeMinuteNum(timezone="")
#### toRelativeMonthNum(timezone="")
#### toRelativeSecondNum(timezone="")
#### toRelativeWeekNum(timezone="")
#### toRelativeYearNum(timezone="")
#### toSecond()
#### toStartOfDay()
#### toStartOfFifteenMinutes()
#### toStartOfFiveMinute()
#### toStartOfHour()
#### toStartOfISOYear()
#### toStartOfMinute()
#### toStartOfMonth()
#### toStartOfQuarter()
#### toStartOfTenMinutes()
#### toStartOfWeek(mode=0)
#### toStartOfYear()
#### toString()
#### toStringCutToZero()
#### toTime(timezone="")
#### toTimeZone(timezone)
#### toUInt16(**kwargs)
#### toUInt16OrNull()
#### toUInt16OrZero()
#### toUInt32(**kwargs)
#### toUInt32OrNull()
#### toUInt32OrZero()
#### toUInt64(**kwargs)
#### toUInt64OrNull()
#### toUInt64OrZero()
#### toUInt8(**kwargs)
#### toUInt8OrNull()
#### toUInt8OrZero()
#### toUUID()
#### toUnixTimestamp(timezone="")
#### toWeek(mode=0, timezone="")
#### toYYYYMM(timezone="")
#### toYYYYMMDD(timezone="")
#### toYYYYMMDDhhmmss(timezone="")
#### toYear()
#### to_sql(*args)
Generates an SQL string for this function and its arguments.
For example, if the function name is the symbol of a binary operator:
(2.54 * `height`)
For other functions:
gcd(12, 300)
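A simplified sketch of that dispatch, illustrative only and not the library's implementation (the operator table here is a hypothetical subset):

```python
# Hypothetical subset of functions rendered as infix binary operators.
BINARY_OPERATORS = {'plus': '+', 'minus': '-', 'multiply': '*', 'divide': '/'}

def render(name, *args):
    # Binary operators render in parenthesized infix notation...
    if name in BINARY_OPERATORS and len(args) == 2:
        return '(%s %s %s)' % (args[0], BINARY_OPERATORS[name], args[1])
    # ...everything else renders as a regular function call.
    return '%s(%s)' % (name, ', '.join(str(a) for a in args))

print(render('multiply', 2.54, 'height'))  # (2.54 * height)
print(render('gcd', 12, 300))              # gcd(12, 300)
```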
#### today()
#### topK(**kwargs)
#### topKIf()
#### topKOrDefault()
#### topKOrDefaultIf()
#### topKOrNull()
#### topKOrNullIf()
#### topKWeighted(**kwargs)
#### topKWeightedIf()
#### topKWeightedOrDefault()
#### topKWeightedOrDefaultIf()
#### topKWeightedOrNull()
#### topKWeightedOrNullIf()
#### trimBoth()
#### trimLeft()
#### trimRight()
#### tryBase64Decode()
#### unhex()
#### uniq(**kwargs)
#### uniqExact(**kwargs)