Snowflake

Loading data into Snowflake via external stages.

This guide will walk you through the process of exporting data from your DCN into Snowflake using Amazon S3, Google Cloud Storage (GCS), or Azure Blob Storage as an intermediary storage bucket. It leverages Snowflake's external stage feature, which lets you load data stored in an external stage into a Snowflake table; that table can then be re-loaded into your DCN via the Snowflake source.

Prerequisites

Before proceeding, ensure that you have the following:

  • An active Snowflake account with the necessary permissions.

  • An active Amazon S3, Google Cloud Storage (GCS), or Azure Blob Storage bucket where data from your DCN will be exported to.

Please note that this guide assumes you have the necessary permissions to create and manage resources in Snowflake and your chosen cloud storage service. Always ensure data is handled in a secure and compliant manner.

Step 1 - Export Data from your DCN

The first step is to export the data from your DCN into the previously created Amazon S3, Google Cloud Storage (GCS), or Azure Blob Storage bucket in Parquet format. Once your audience is exported, you can then proceed to step 2.

Step 2 - Configure your External Stage & Load your Data

Amazon S3:

  • Allow Access: Configure permissions and allow access to S3.
  • Configure S3: Follow Snowflake's guide for configuring an S3 integration.
  • Create S3 Stage: Create an S3 stage in Snowflake.
  • Copy Data: Execute the COPY command for S3.

Example command:
COPY INTO my_table FROM @my_s3_stage FILE_FORMAT = (TYPE = 'PARQUET');
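
If you have not yet created the stage, the setup typically looks like the minimal sketch below. The integration, stage, and bucket names (my_s3_integration, my_s3_stage, s3://my-dcn-exports/) and the IAM role ARN are placeholders; substitute the values from Snowflake's S3 configuration guide.

-- Storage integration that delegates authentication to an IAM role (placeholder ARN)
CREATE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-dcn-exports/');

-- External stage pointing at the export bucket, defaulting to Parquet
CREATE STAGE my_s3_stage
  URL = 's3://my-dcn-exports/'
  STORAGE_INTEGRATION = my_s3_integration
  FILE_FORMAT = (TYPE = 'PARQUET');

Note that when loading Parquet into a table with named columns (rather than a single VARIANT column), you may also need to add MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE to the COPY command so file columns are mapped to table columns by name.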

Google Cloud Storage (GCS):

  • Configure GCS: Follow Snowflake's guide for configuring a GCS integration.
  • Copy Data: Execute the COPY command for GCS.

Example command:
COPY INTO my_table FROM @my_gcs_stage FILE_FORMAT = (TYPE = 'PARQUET');
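
Setting up a GCS stage follows the same pattern, except that Snowflake generates a service account which you then grant access to the bucket on the GCP side. As before, all names in this sketch are placeholders:

-- GCS integration; Snowflake creates a service account for bucket access
CREATE STORAGE INTEGRATION my_gcs_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://my-dcn-exports/');

-- Look up STORAGE_GCP_SERVICE_ACCOUNT, then grant it read access to the bucket in GCP
DESC STORAGE INTEGRATION my_gcs_integration;

CREATE STAGE my_gcs_stage
  URL = 'gcs://my-dcn-exports/'
  STORAGE_INTEGRATION = my_gcs_integration
  FILE_FORMAT = (TYPE = 'PARQUET');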

Azure Blob Storage:

  • Allow Access: Configure permissions and allow access to Azure.
  • Configure Azure: Follow Snowflake's guide for configuring an Azure integration.
  • Create Azure Stage: Create an Azure stage in Snowflake.
  • Copy Data: Execute the COPY command for Azure.

Example command:
COPY INTO my_table FROM @my_azure_stage FILE_FORMAT = (TYPE = 'PARQUET');
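
For Azure, the integration additionally requires your Azure tenant ID, and access is granted by approving Snowflake's consent URL in your Azure account. The tenant ID, storage account, and container names in this sketch are placeholders:

-- Azure integration; requires your Azure AD tenant ID (placeholder below)
CREATE STORAGE INTEGRATION my_azure_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<your-tenant-id>'
  STORAGE_ALLOWED_LOCATIONS = ('azure://myaccount.blob.core.windows.net/my-dcn-exports/');

-- Look up AZURE_CONSENT_URL and open it to grant Snowflake access
DESC STORAGE INTEGRATION my_azure_integration;

CREATE STAGE my_azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/my-dcn-exports/'
  STORAGE_INTEGRATION = my_azure_integration
  FILE_FORMAT = (TYPE = 'PARQUET');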

Step 3 - Query Data in Snowflake

With the data successfully copied into Snowflake tables (as detailed in Step 2), you are now ready to execute queries on these tables just like any other Snowflake table. To improve performance, especially when dealing with large datasets, consider Snowflake features such as clustering keys or other optimization techniques.

From there you can perform further data analysis, create data visualizations, or develop machine learning models using Snowflake's comprehensive set of tools.
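
For example, assuming your audience landed in my_table via the COPY commands above (the table and column names here are illustrative):

-- Sanity-check the load and inspect a few records
SELECT COUNT(*) FROM my_table;
SELECT * FROM my_table LIMIT 10;

-- Optionally cluster large tables on a commonly filtered column
-- (cluster_column is a placeholder for a column in your exported data)
ALTER TABLE my_table CLUSTER BY (cluster_column);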

This guide provided a brief overview of the process. For more detailed steps and additional information, please refer to the official Snowflake documentation.
