Linking Cloudflare R2 as a source

The data warehouse can link to data in Cloudflare R2. To start, you'll need to:

  1. Create a bucket in R2
  2. Set up an access key and secret
  3. Add data to the bucket
  4. Create the table in PostHog

Step 1: Creating a bucket in R2

  1. Log in to the Cloudflare dashboard.
  2. Go to the R2 overview and create a new bucket. We suggest using Eastern North America as a location hint if you're on PostHog Cloud US, or European Union as a specific jurisdiction if you're on PostHog Cloud EU.

created bucket

  3. In your bucket, upload the data you want to query, such as CSV or Parquet files. It can be as simple as a .csv file like this:
```csv
id,name,email
1,John Doe,john@example.com
2,Jane Doe,jane@example.com
```

Adding data to R2 automatically: See our S3 docs for how to set up Airbyte to automatically add data to the bucket. You can also use other ETL tools like Fivetran or Stitch.
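If you'd rather add files to the bucket programmatically, R2 exposes an S3-compatible API, so an S3 client like boto3 works. Below is a minimal sketch, assuming a bucket named posthog-warehouse, a placeholder account endpoint, and an API token with Object Read & Write permission (the read-only token created in step 2 is enough for PostHog to query the data, but not to upload it).

```python
# Minimal sketch: upload a CSV to R2 via its S3-compatible API using boto3.
# The endpoint, bucket name, and credentials are placeholders; use the values
# from your own Cloudflare dashboard. Uploading requires a token with
# Object Read & Write permission, not the read-only token from step 2.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",  # jurisdiction-specific endpoint
    aws_access_key_id="<ACCESS_KEY_ID>",
    aws_secret_access_key="<SECRET_ACCESS_KEY>",
    region_name="auto",  # R2 uses "auto" as the region
)

# Upload the local cool_data.csv into the posthog-warehouse bucket
s3.upload_file("cool_data.csv", "posthog-warehouse", "cool_data.csv")
```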

Step 2: Create API tokens

  1. Head back to the R2 overview and, under account details, click Manage R2 API Tokens
  2. Click Create API token, give your token a name, choose Object Read only as the permission type, apply it to your bucket, and click Create API Token

Create token

  3. Copy the credentials for S3 clients, including the Access Key ID, Secret Access Key, and jurisdiction-specific endpoint URL. These will not be shown again, so make sure to copy them to a safe place.
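To sanity-check the credentials before linking them in PostHog, you can list the bucket with any S3 client. Here's a minimal sketch using boto3, assuming the same placeholder endpoint and bucket name as above; the Object Read only token is sufficient for this.

```python
# Minimal sketch: verify the Object Read only token by listing the bucket
# through R2's S3-compatible API. Endpoint, bucket, and keys are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<ACCESS_KEY_ID>",
    aws_secret_access_key="<SECRET_ACCESS_KEY>",
    region_name="auto",
)

# List every object in the bucket; the files you uploaded should appear here
response = s3.list_objects_v2(Bucket="posthog-warehouse")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```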

Step 3: Create the table in PostHog

  1. Go to the Data pipeline page and the Sources tab in PostHog
  2. Click New source and, under Self managed, look for Cloudflare R2 and click Link
  3. Fill in the table name, then use the data from Cloudflare:
    • For files URL pattern, use the jurisdiction-specific endpoint URL with your bucket name and file name like https://b27247d543c.r2.cloudflarestorage.com/posthog-warehouse/cool_data.csv. You can also use * to query multiple files.
    • Choose the correct file format
    • For access key, use your Access Key ID
    • For secret key, use your Secret Access Key
  4. Click Next

Linking source

Step 4: Query the table

Once it's done syncing, you can query your new table using the table name.
