Snowflake (datamill-co variant)
Table of Contents
- Alternative variants
- Getting Started
- Capabilities
- Settings
  - Account (snowflake_account)
  - Username (snowflake_username)
  - Password (snowflake_password)
  - Role (snowflake_role)
  - Snowflake Database (snowflake_database)
  - Snowflake Authenticator (snowflake_authenticator)
  - Warehouse (snowflake_warehouse)
  - Invalid Records Detect (invalid_records_detect)
  - Invalid Records Threshold (invalid_records_threshold)
  - Disable Collection (disable_collection)
  - Logging Level (logging_level)
  - Persist Empty Tables (persist_empty_tables)
  - Snowflake Schema (snowflake_schema)
  - State Support (state_support)
  - Target S3 Bucket (target_s3.bucket)
  - Target S3 Key Prefix (target_s3.key_prefix)
  - Target S3 AWS Access Key ID (target_s3.aws_access_key_id)
  - Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key)
- Troubleshooting
- Looking for help?
The target-snowflake Meltano loader sends data into Snowflake after it was pulled from a source using an extractor.
Alternative variants #
Multiple variants of target-snowflake are available. This document describes the datamill-co variant.
Alternative variants are:
- meltano
- transferwise (default)
Getting Started #
Prerequisites #
If you haven't already, follow the initial steps of the Getting Started guide:
Installation and configuration #
Using the Command Line Interface #
- Add the target-snowflake loader to your project using meltano add:
  meltano add loader target-snowflake --variant datamill-co
- Configure the settings below using meltano config.
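Once added, the settings described below can also be declared together in your project file. A minimal sketch, assuming a standard meltano.yml layout and using placeholder values throughout:

```yaml
# meltano.yml (sketch with placeholder values)
plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        snowflake_account: xy12345.east-us-2.azure
        snowflake_username: LOADER
        snowflake_database: ANALYTICS
        snowflake_warehouse: LOADING
# Secrets such as snowflake_password are better supplied via the
# environment, e.g. TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD.
```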
Next steps #
Follow the remaining steps of the Getting Started guide:
If you run into any issues, learn how to get help.
Capabilities #
Settings #
target-snowflake requires the configuration of the following settings:
These and other supported settings are documented below. To quickly find the setting you're looking for, use the Table of Contents at the top of the page.
Account (snowflake_account) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT

ACCOUNT might require the region and cloud platform where your account is located, in the form <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure).
Refer to Snowflake’s documentation about Accounts.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_account <snowflake_account>
export TARGET_SNOWFLAKE_SNOWFLAKE_ACCOUNT=<snowflake_account>
Username (snowflake_username) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_USERNAME

How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_username <snowflake_username>
export TARGET_SNOWFLAKE_SNOWFLAKE_USERNAME=<snowflake_username>
Password (snowflake_password) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD

How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_password <snowflake_password>
export TARGET_SNOWFLAKE_SNOWFLAKE_PASSWORD=<snowflake_password>
Role (snowflake_role) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_ROLE

If not specified, Snowflake will use the user’s default role.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_role <snowflake_role>
export TARGET_SNOWFLAKE_SNOWFLAKE_ROLE=<snowflake_role>
Snowflake Database (snowflake_database) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_DATABASE

How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_database <snowflake_database>
export TARGET_SNOWFLAKE_SNOWFLAKE_DATABASE=<snowflake_database>
Snowflake Authenticator (snowflake_authenticator) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_AUTHENTICATOR
- Default: snowflake

Specifies the authentication provider for Snowflake to use. Valid options are the internal one (“snowflake”), a browser session (“externalbrowser”), or an Okta endpoint URL (beginning with “https://”).
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_authenticator <snowflake_authenticator>
export TARGET_SNOWFLAKE_SNOWFLAKE_AUTHENTICATOR=<snowflake_authenticator>
Warehouse (snowflake_warehouse) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_WAREHOUSE

How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_warehouse <snowflake_warehouse>
export TARGET_SNOWFLAKE_SNOWFLAKE_WAREHOUSE=<snowflake_warehouse>
Invalid Records Detect (invalid_records_detect) #
- Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_DETECT
- Default: true

Include false in your config to disable crashing on invalid records.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set invalid_records_detect false
export TARGET_SNOWFLAKE_INVALID_RECORDS_DETECT=false
Invalid Records Threshold (invalid_records_threshold) #
- Environment variable: TARGET_SNOWFLAKE_INVALID_RECORDS_THRESHOLD
- Default: 0

Include a positive value n in your config to allow at most n invalid records per stream before giving up.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set invalid_records_threshold 0
export TARGET_SNOWFLAKE_INVALID_RECORDS_THRESHOLD=0
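The interaction between detection and the threshold can be sketched in Python. This is illustrative only; the validate check and the loop below are assumptions, not the target's actual code:

```python
# Illustrative sketch: a threshold of n tolerates at most n invalid
# records per stream before the load aborts.
def validate(record):
    # Hypothetical validity check: a record must carry an "id" field.
    return "id" in record

def load_stream(records, threshold):
    invalid = 0
    loaded = []
    for record in records:
        if not validate(record):
            invalid += 1
            if invalid > threshold:
                raise RuntimeError(f"exceeded {threshold} invalid records")
            continue  # skip the bad record, keep loading
        loaded.append(record)
    return loaded

# One invalid record with threshold 2: tolerated, load succeeds.
print(load_stream([{"id": 1}, {}, {"id": 2}], 2))
```

With the default threshold of 0 (and invalid_records_detect left at true), the very first invalid record aborts the load.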
Disable Collection (disable_collection) #
- Environment variable: TARGET_SNOWFLAKE_DISABLE_COLLECTION
- Default: false

Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set disable_collection true
export TARGET_SNOWFLAKE_DISABLE_COLLECTION=true
Logging Level (logging_level) #
- Environment variable: TARGET_SNOWFLAKE_LOGGING_LEVEL
- Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
- Default: INFO

The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc. See Python’s Logger Levels for information about valid values.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set logging_level DEBUG
export TARGET_SNOWFLAKE_LOGGING_LEVEL=DEBUG
Persist Empty Tables (persist_empty_tables) #
- Environment variable: TARGET_SNOWFLAKE_PERSIST_EMPTY_TABLES
- Default: false

Whether the target should create tables even when no records are present for them in the remote source.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set persist_empty_tables true
export TARGET_SNOWFLAKE_PERSIST_EMPTY_TABLES=true
Snowflake Schema (snowflake_schema) #
- Environment variable: TARGET_SNOWFLAKE_SNOWFLAKE_SCHEMA
- Default: $MELTANO_EXTRACT__LOAD_SCHEMA

Note: $MELTANO_EXTRACT__LOAD_SCHEMA will expand to the value of the load_schema extra for the extractor used in the pipeline, which defaults to the extractor’s namespace, e.g. tap_gitlab for tap-gitlab. Values are automatically converted to uppercase before they’re passed on to the plugin, so tap_gitlab becomes TAP_GITLAB.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set snowflake_schema <snowflake_schema>
export TARGET_SNOWFLAKE_SNOWFLAKE_SCHEMA=<snowflake_schema>
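How the default interacts with the extractor's load_schema extra can be sketched in meltano.yml; the plugin names and schema value here are illustrative, not prescribed:

```yaml
# meltano.yml (sketch): the extractor's load_schema extra feeds
# $MELTANO_EXTRACT__LOAD_SCHEMA, which this setting defaults to.
plugins:
  extractors:
    - name: tap-gitlab
      load_schema: gitlab   # uppercased to GITLAB when passed to the plugin
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        snowflake_schema: $MELTANO_EXTRACT__LOAD_SCHEMA  # the default
```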
State Support (state_support) #
- Environment variable: TARGET_SNOWFLAKE_STATE_SUPPORT
- Default: true

Whether the target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed according to the batch flushing schedule the target is configured with.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set state_support false
export TARGET_SNOWFLAKE_STATE_SUPPORT=false
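The buffering behavior described above can be sketched in Python. This is illustrative only; StateBuffer and the fixed batch size are assumptions, not the target's actual implementation:

```python
import json

class StateBuffer:
    """Sketch of a Singer target holding STATE until prior records flush."""

    def __init__(self, batch_size=2):
        self.batch_size = batch_size
        self.records = []
        self.pending_state = None
        self.emitted_states = []

    def handle(self, line):
        msg = json.loads(line)
        if msg["type"] == "RECORD":
            self.records.append(msg["record"])
            if len(self.records) >= self.batch_size:
                self.flush()
        elif msg["type"] == "STATE":
            # Buffer the STATE until the records before it are safely loaded.
            self.pending_state = msg["value"]

    def flush(self):
        # Pretend the batch was loaded into Snowflake, then release STATE.
        self.records.clear()
        if self.pending_state is not None:
            self.emitted_states.append(self.pending_state)
            self.pending_state = None

buf = StateBuffer(batch_size=2)
buf.handle('{"type": "RECORD", "record": {"id": 1}}')
buf.handle('{"type": "STATE", "value": {"bookmark": 1}}')
# STATE is still buffered here: only one record seen, batch not yet flushed.
buf.handle('{"type": "RECORD", "record": {"id": 2}}')
# The batch of 2 flushed, so the buffered STATE is now emitted.
print(buf.emitted_states)
```

Emitting STATE only after the preceding records are flushed is what makes the bookmark safe to resume from: a crash before the flush means the STATE was never emitted, so the pipeline replays those records.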
Target S3 Bucket (target_s3.bucket) #
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3.BUCKET

When included, S3 is used to stage files. This is the bucket that staging files should be uploaded to.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set target_s3.bucket <target_s3.bucket>
export TARGET_SNOWFLAKE_TARGET_S3.BUCKET=<target_s3.bucket>
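Since this is a nested setting, it can also be grouped with the other target_s3.* keys under a single mapping in meltano.yml. A sketch with placeholder values (the AWS credentials are better supplied via environment variables than committed to the file):

```yaml
# meltano.yml (sketch): nested target_s3 settings grouped as one mapping.
plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        target_s3:
          bucket: my-staging-bucket      # placeholder
          key_prefix: target-snowflake/  # placeholder
```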
Target S3 Key Prefix (target_s3.key_prefix) #
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3.KEY_PREFIX

Prefix for staging file uploads, to allow for better delineation of tmp files.
How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set target_s3.key_prefix <target_s3.key_prefix>
export TARGET_SNOWFLAKE_TARGET_S3.KEY_PREFIX=<target_s3.key_prefix>
Target S3 AWS Access Key ID (target_s3.aws_access_key_id) #
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3.AWS_ACCESS_KEY_ID

How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set target_s3.aws_access_key_id <target_s3.aws_access_key_id>
export TARGET_SNOWFLAKE_TARGET_S3.AWS_ACCESS_KEY_ID=<target_s3.aws_access_key_id>
Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key) #
- Environment variable: TARGET_SNOWFLAKE_TARGET_S3.AWS_SECRET_ACCESS_KEY

How to use #
Manage this setting using meltano config or an environment variable:
meltano config target-snowflake set target_s3.aws_secret_access_key <target_s3.aws_secret_access_key>
export TARGET_SNOWFLAKE_TARGET_S3.AWS_SECRET_ACCESS_KEY=<target_s3.aws_secret_access_key>
Troubleshooting #
Error: pg_config executable not found or libpq-fe.h: No such file or directory #
This error message indicates that the libpq dependency is missing.
To resolve this, refer to the “Dependencies” section above.
Looking for help? #
If you're having trouble getting the target-snowflake loader to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.