snowshu.adapters.source_adapters.snowflake_adapter
class snowshu.adapters.source_adapters.snowflake_adapter.SnowflakeAdapter(preserve_case: bool = False)

    Bases: snowshu.adapters.source_adapters.base_source_adapter.BaseSourceAdapter

    The Snowflake Data Warehouse source adapter.

    Parameters:
        preserve_case – By default the adapter folds case-insensitive strings to uppercase. If preserve_case is True, SnowShu will not alter cases (dangerous!).
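A minimal sketch of the case-folding behavior described above (the `fold_identifier` helper and its examples are illustrative, not SnowShu's actual implementation):

```python
def fold_identifier(identifier: str, preserve_case: bool = False) -> str:
    """Fold an unquoted identifier to Snowflake's default upper case,
    unless preserve_case is set (illustrative helper, not SnowShu API)."""
    return identifier if preserve_case else identifier.upper()

# Default behavior: identifiers are folded to upper case.
fold_identifier("my_table")                      # "MY_TABLE"

# With preserve_case=True the identifier passes through untouched,
# which can break matching against Snowflake's upper-cased metadata.
fold_identifier("My_Table", preserve_case=True)  # "My_Table"
```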
ALLOWED_CREDENTIALS = ('schema', 'warehouse', 'role')
DATA_TYPE_MAPPINGS = {'array': json, 'bigint': bigint, 'binary': binary, 'boolean': boolean, 'char': char, 'character': char, 'date': date, 'datetime': datetime, 'decimal': decimal, 'double': float, 'double precision': float, 'float': float, 'float4': float, 'float8': float, 'int': bigint, 'integer': bigint, 'number': bigint, 'numeric': numeric, 'object': json, 'real': float, 'smallint': bigint, 'string': varchar, 'text': varchar, 'time': time, 'timestamp': timestamp_ntz, 'timestamp_ltz': timestamp_tz, 'timestamp_ntz': timestamp_ntz, 'timestamp_tz': timestamp_tz, 'varbinary': binary, 'varchar': varchar, 'variant': json}
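The mapping reads as a plain lookup from lower-cased Snowflake type names to SnowShu target types. A sketch using a small subset of the table above (the dict and helper are illustrative; the string values stand in for SnowShu's type objects):

```python
# Illustrative subset of DATA_TYPE_MAPPINGS; string values stand in
# for SnowShu's data-type objects.
TYPE_MAPPINGS = {
    "number": "bigint",
    "double precision": "float",
    "text": "varchar",
    "timestamp_ltz": "timestamp_tz",
    "variant": "json",
}

def map_type(snowflake_type: str) -> str:
    """Look up the target type for a Snowflake source type (illustrative)."""
    return TYPE_MAPPINGS[snowflake_type.lower()]

map_type("NUMBER")   # "bigint"
map_type("VARIANT")  # "json"
```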
DEFAULT_CASE = 'upper'
MATERIALIZATION_MAPPINGS = {'BASE TABLE': TABLE, 'VIEW': TABLE}
REQUIRED_CREDENTIALS = ('user', 'password', 'account', 'database')
SNOWFLAKE_MAX_NUMBER_EXPR = 16384
SUPPORTED_FUNCTIONS = {'ANY_VALUE', 'RLIKE', 'UUID_STRING'}
SUPPORTED_SAMPLE_METHODS = (snowshu.samplings.sample_methods.bernoulli_sample_method.BernoulliSampleMethod,)
SUPPORTS_CROSS_DATABASE = True
-
static
analyze_wrap_statement
(sql: str, relation: snowshu.core.models.relation.Relation) → str¶
check_count_and_query(query: str, max_count: int, unsampled: bool) → pandas.core.frame.DataFrame

    Checks the relation's row count; if the count is within max_count, returns the query results as a DataFrame.
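The check-then-query pattern can be sketched as follows (the `run_query` executor and the wrapping count query are hypothetical stand-ins; the real method executes against Snowflake and returns a pandas DataFrame):

```python
def check_count_and_query(query: str, max_count: int,
                          unsampled: bool, run_query) -> list:
    """Run the query only if its row count is within max_count
    (illustrative; the real adapter returns a pandas DataFrame)."""
    count_sql = f"SELECT COUNT(*) FROM ({query})"
    count = run_query(count_sql)  # hypothetical query executor
    if count > max_count:
        raise ValueError(
            f"Relation has {count} rows, exceeding the max of {max_count}.")
    return run_query(query)

# Usage with a fake executor standing in for a live connection:
def fake_run(sql):
    return 5 if sql.startswith("SELECT COUNT") else ["row1", "row2"]

check_count_and_query("SELECT * FROM t", max_count=10,
                      unsampled=False, run_query=fake_run)  # ["row1", "row2"]
```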
directionally_wrap_statement(sql: str, relation: snowshu.core.models.relation.Relation, sample_type: Optional[BaseSampleMethod]) → str
get_connection(database_override: Optional[str] = None, schema_override: Optional[str] = None) → sqlalchemy.engine.base.Engine

    Creates a connection engine without transactions.

    By default, uses the instance credentials unless a database or schema override is provided.
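The engine is built from the adapter credentials (see REQUIRED_CREDENTIALS and ALLOWED_CREDENTIALS above). A sketch of assembling the underlying snowflake-sqlalchemy connection URL; the helper itself is illustrative, not SnowShu's implementation:

```python
def build_snowflake_url(user, password, account, database,
                        schema=None, warehouse=None, role=None) -> str:
    """Assemble a snowflake:// SQLAlchemy-style URL (illustrative)."""
    url = f"snowflake://{user}:{password}@{account}/{database}"
    if schema:
        url += f"/{schema}"
    # Optional credentials become query-string parameters.
    params = {k: v for k, v in
              (("warehouse", warehouse), ("role", role)) if v}
    if params:
        url += "?" + "&".join(f"{k}={v}" for k, v in params.items())
    return url

build_snowflake_url("u", "p", "acct", "db", schema="public", warehouse="wh")
# "snowflake://u:p@acct/db/public?warehouse=wh"
```

A database or schema override would simply substitute into this URL in place of the instance credentials.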
name = 'snowflake'
static polymorphic_constraint_statement(relation: snowshu.core.models.relation.Relation, analyze: bool, local_key: str, remote_key: str, local_type: str, local_type_match_val: str = None) → str
static population_count_statement(relation: snowshu.core.models.relation.Relation) → str

    Creates the COUNT(*) statement for a relation.

    Parameters:
        relation – the Relation to create the statement for.

    Returns:
        a query that yields a single row, single column: the integer population size of the unsampled relation.
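A sketch of the kind of statement this produces for a hypothetical fully qualified relation (the helper and the exact quoting are illustrative, not SnowShu's implementation):

```python
def population_count_statement(database: str, schema: str, name: str) -> str:
    """Build a COUNT(*) query for a fully qualified relation (illustrative)."""
    return f'SELECT COUNT(*) FROM "{database}"."{schema}"."{name}"'

population_count_statement("SNOWSHU", "SOURCE", "ORDERS")
# 'SELECT COUNT(*) FROM "SNOWSHU"."SOURCE"."ORDERS"'
```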
static predicate_constraint_statement(relation: snowshu.core.models.relation.Relation, analyze: bool, local_key: str, remote_key: str) → str

    Builds WHERE-clause predicate strings.
static quoted(val: str) → str
sample_statement_from_relation(relation: snowshu.core.models.relation.Relation, sample_type: Optional[BaseSampleMethod]) → str

    Builds the base sample statement for a given relation.
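BernoulliSampleMethod (the only entry in SUPPORTED_SAMPLE_METHODS above) corresponds to Snowflake's SAMPLE BERNOULLI clause, which keeps each row independently with a given percentage probability. A sketch of the kind of SQL such a sample statement produces (the helper is illustrative, not SnowShu's implementation):

```python
def sample_statement(qualified_name: str, probability: float = None) -> str:
    """Build a SELECT with an optional Bernoulli row sample (illustrative).
    Snowflake's SAMPLE BERNOULLI (p) keeps each row with probability p%."""
    base = f"SELECT * FROM {qualified_name}"
    if probability is not None:
        base += f" SAMPLE BERNOULLI ({probability})"
    return base

sample_statement('"DB"."SCHEMA"."EVENTS"', 10)
# 'SELECT * FROM "DB"."SCHEMA"."EVENTS" SAMPLE BERNOULLI (10)'
```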
static union_constraint_statement(subject: snowshu.core.models.relation.Relation, constraint: snowshu.core.models.relation.Relation, subject_key: str, constraint_key: str, max_number_of_outliers: int) → str

    Builds union statements to select outliers. This does not pull in NULL values.
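The outlier selection can be pictured as two directional anti-joins, each capped and then unioned (a hypothetical SQL sketch; the real column handling may differ). Note how NOT IN comparison semantics naturally drop NULL keys, matching the NULL caveat above:

```python
def union_constraint_statement(subject: str, constraint: str,
                               subject_key: str, constraint_key: str,
                               max_outliers: int) -> str:
    """Select rows whose keys appear on only one side of the relationship
    (illustrative; NOT IN comparisons never match NULL keys)."""
    return (
        f"(SELECT * FROM {subject} WHERE {subject_key} NOT IN "
        f"(SELECT {constraint_key} FROM {constraint}) LIMIT {max_outliers}) "
        f"UNION "
        f"(SELECT * FROM {constraint} WHERE {constraint_key} NOT IN "
        f"(SELECT {subject_key} FROM {subject}) LIMIT {max_outliers})"
    )
```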
static unsampled_statement(relation: snowshu.core.models.relation.Relation) → str
static upstream_constraint_statement(relation: snowshu.core.models.relation.Relation, local_key: str, remote_key: str) → str

    Builds upstream WHERE constraints against the downstream full population.
static view_creation_statement(relation: snowshu.core.models.relation.Relation) → str