Linking Snowflake as a source


The data warehouse can link to data in Snowflake.

Start by going to the sources tab of the Data pipeline page and clicking New source. Choose Snowflake and enter the following details:

  • Account identifier: Likely a combination of your organization and the name of the account (e.g. myorg-account123). You can find this in the sidebar account selector or by executing the CURRENT_ACCOUNT_NAME and CURRENT_ORGANIZATION_NAME functions in SQL.
  • Database: Like tasty_bytes_sample_data
  • Warehouse: Like compute_wh
  • User: Your username like IANVPH
  • Password: The password for your user
  • Role (optional): The role with the necessary privileges to access your database and schema, like accountadmin.
  • Schema: The schema for your database, like RAW_POS. If it isn't working, try using all caps.
  • Table Prefix: The optional prefix for your tables in PostHog. For example, if your table name ended up being menu, a prefix of snow_prod_ would create a table in PostHog called snow_prod_menu.
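
Most of the values above can be read back from an active Snowflake session. As a sketch, running the following in a Snowflake worksheet echoes the context functions mentioned earlier; the `||` concatenation into an `org-account` style identifier is an assumption about your account's identifier format, so double-check it against the sidebar account selector:

```sql
-- Run in a Snowflake worksheet to read back the connection
-- details for the session you are currently using.
SELECT
    CURRENT_ORGANIZATION_NAME() || '-' || CURRENT_ACCOUNT_NAME() AS account_identifier,
    CURRENT_DATABASE()  AS database_name,
    CURRENT_WAREHOUSE() AS warehouse_name,
    CURRENT_ROLE()      AS role_name,
    CURRENT_SCHEMA()    AS schema_name;
```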

Once added, click Next, select the tables you want to sync, as well as the sync method, and then press Import.

Once the sync is done, you can query your new table using its table name.
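
For example, assuming the hypothetical `snow_prod_` prefix and `menu` table from the example above, a SQL insight in PostHog could query the synced table like this:

```sql
-- Query the synced Snowflake table from PostHog's SQL editor.
SELECT * FROM snow_prod_menu LIMIT 10
```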

