Snowflake (datamill-co variant)
Table of Contents
- Alternative variants
- Standalone usage
- Usage with Meltano
- Capabilities
- Settings
  - Account (snowflake_account)
  - Username (snowflake_username)
  - Password (snowflake_password)
  - Role (snowflake_role)
  - Snowflake Database (snowflake_database)
  - Snowflake Authenticator (snowflake_authenticator)
  - Warehouse (snowflake_warehouse)
  - Invalid Records Detect (invalid_records_detect)
  - Invalid Records Threshold (invalid_records_threshold)
  - Disable Collection (disable_collection)
  - Logging Level (logging_level)
  - Persist Empty Tables (persist_empty_tables)
  - Snowflake Schema (snowflake_schema)
  - State Support (state_support)
  - Target S3 Bucket (target_s3.bucket)
  - Target S3 Key Prefix (target_s3.key_prefix)
  - Target S3 AWS Access Key ID (target_s3.aws_access_key_id)
  - Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key)
- Troubleshooting
- Looking for help?
The target-snowflake Singer target sends data into Snowflake after it was pulled from a source using a Singer tap.
Alternative variants #
Multiple variants of target-snowflake are available. This document describes the datamill-co variant.
Alternative variants are:
- meltano
- transferwise (default)
Standalone usage #
Install the package using pip:
pip install target-snowflake
For additional instructions, refer to the README in the repository.
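As a sketch of standalone usage, a minimal config file could look like the following. All values are placeholders, and the README remains the authoritative reference for supported options:

```json
{
  "snowflake_account": "xy12345.east-us-2.azure",
  "snowflake_username": "LOADER",
  "snowflake_password": "********",
  "snowflake_database": "ANALYTICS",
  "snowflake_warehouse": "LOADING",
  "snowflake_schema": "TAP_GITLAB"
}
```

As with Singer targets generally, the target would then typically be fed by piping a tap's output into it, e.g. `tap-something | target-snowflake --config config.json`.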
Prerequisites #
Dependencies #
target-snowflake requires the libpq library to be available on your system. If you’ve installed PostgreSQL, you should already have it, but you can also install it by itself using the libpq-dev package on Ubuntu/Debian or the libpq Homebrew formula on macOS.
Usage with Meltano #
Meltano helps you manage your configuration, incremental replication, and scheduled pipelines.
View the Meltano-specific target-snowflake instructions to learn more.
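When managed by Meltano, settings like these conventionally live under the plugin's `config` in `meltano.yml`. A hedged sketch, with placeholder values:

```yaml
plugins:
  loaders:
    - name: target-snowflake
      variant: datamill-co
      config:
        snowflake_account: xy12345.east-us-2.azure
        snowflake_username: LOADER
        snowflake_database: ANALYTICS
        snowflake_warehouse: LOADING
```

Secrets such as snowflake_password are usually supplied via environment variables rather than committed to the file.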
Capabilities #
Settings #
target-snowflake requires the configuration of the settings listed below. To quickly find the setting you're looking for, use the Table of Contents at the top of the page.
Account (snowflake_account) #
ACCOUNT might require the region and cloud platform where your account is located, in the form of: <your_account_name>.<region_id>.<cloud> (e.g. xy12345.east-us-2.azure)
Refer to Snowflake’s documentation about Accounts.
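To illustrate the account identifier format, here is a small hypothetical helper (not part of the target) that composes the ACCOUNT value from its parts:

```python
def compose_account(account_name, region=None, cloud=None):
    """Build the ACCOUNT value: <your_account_name>[.<region_id>[.<cloud>]]."""
    parts = [account_name]
    if region:
        parts.append(region)
        if cloud:
            parts.append(cloud)
    return ".".join(parts)

print(compose_account("xy12345", "east-us-2", "azure"))  # xy12345.east-us-2.azure
```

Whether the region and cloud segments are needed depends on where your Snowflake account is hosted.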
Username (snowflake_username) #
Password (snowflake_password) #
Role (snowflake_role) #
If not specified, Snowflake will use the user’s default role.
Snowflake Database (snowflake_database) #
Snowflake Authenticator (snowflake_authenticator) #
- Default: snowflake
Specifies the authentication provider for Snowflake to use. Valid options are the internal authenticator (“snowflake”), a browser session (“externalbrowser”), or Okta (a URL beginning with “https://”).
Warehouse (snowflake_warehouse) #
Invalid Records Detect (invalid_records_detect) #
- Default: true
Include false in your config to disable crashing on invalid records.
Invalid Records Threshold (invalid_records_threshold) #
- Default: 0
Include a positive value n in your config to allow at most n invalid records per stream before giving up.
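The interplay between the two invalid-record settings can be sketched as follows. This is illustrative logic only, not the target's actual implementation:

```python
class InvalidRecordLimitExceeded(Exception):
    pass

def record_invalid(counts, stream, detect=True, threshold=0):
    """Count one invalid record for `stream`; raise once the per-stream
    count exceeds the threshold (mirrors invalid_records_detect/_threshold)."""
    counts[stream] = counts.get(stream, 0) + 1
    if detect and counts[stream] > threshold:
        raise InvalidRecordLimitExceeded(
            f"{counts[stream]} invalid records on stream {stream!r}")

counts = {}
record_invalid(counts, "users", threshold=2)  # 1st invalid record: tolerated
record_invalid(counts, "users", threshold=2)  # 2nd: still tolerated
# a 3rd call would raise InvalidRecordLimitExceeded;
# with detect=False, invalid records are counted but never fatal
```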
Disable Collection (disable_collection) #
- Default: false
Include true in your config to disable Singer Usage Logging: https://github.com/datamill-co/target-snowflake#usage-logging
Logging Level (logging_level) #
- Default: INFO
The level for logging. Set to DEBUG to get things like queries executed, timing of those queries, etc. See Python’s Logger Levels for information about valid values.
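Since the target is a Python package, this setting maps onto Python's standard logging levels; a minimal illustration of what DEBUG means in those terms:

```python
import logging

# logging_level accepts standard Python level names such as INFO or DEBUG.
level = logging.getLevelName("DEBUG")  # numeric DEBUG level (10)
logging.basicConfig(level=level)
logging.getLogger("target-snowflake").debug("shown only at DEBUG verbosity")
```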
Persist Empty Tables (persist_empty_tables) #
- Default: false
Whether the target should create tables in the remote database even when they have no records.
Snowflake Schema (snowflake_schema) #
- Default: $MELTANO_EXTRACT__LOAD_SCHEMA
Note $MELTANO_EXTRACT__LOAD_SCHEMA will expand to the value of the load_schema extra for the extractor used in the pipeline, which defaults to the extractor’s namespace, e.g. tap_gitlab for tap-gitlab. Values are automatically converted to uppercase before they’re passed on to the plugin, so tap_gitlab becomes TAP_GITLAB.
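The uppercase conversion described above is a plain string transformation; for example:

```python
load_schema = "tap_gitlab"            # namespace of tap-gitlab
snowflake_schema = load_schema.upper()
print(snowflake_schema)               # TAP_GITLAB
```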
State Support (state_support) #
- Default: true
Whether the target should emit STATE messages to stdout for further consumption. In this mode, which is on by default, STATE messages are buffered in memory until all the records that occurred before them are flushed according to the batch flushing schedule the target is configured with.
Target S3 Bucket (target_s3.bucket) #
When included, S3 is used to stage files. This is the bucket to which staging files should be uploaded.
Target S3 Key Prefix (target_s3.key_prefix) #
Prefix for staging file uploads, allowing for better delineation of temporary files.
Target S3 AWS Access Key ID (target_s3.aws_access_key_id) #
Target S3 AWS Secret Access Key (target_s3.aws_secret_access_key) #
Troubleshooting #
Error: pg_config executable not found or libpq-fe.h: No such file or directory #
This error message indicates that the libpq dependency is missing.
To resolve this, refer to the “Dependencies” section above.
Looking for help? #
If you're having trouble getting the target-snowflake target to work, look for an existing issue in its repository, file a new issue, or join the Meltano Slack community and ask for help in the #plugins-general channel.