Databricks

Loading data from your Databricks Delta Lake.


The Databricks source enables you to seamlessly share tables from your Databricks Delta Lake with your DCN via the Delta Sharing protocol.

This allows for efficient mapping of identifiers and traits from tables based on the tabular data schema, without the need for data replication or complex data migrations and transformations.

Prerequisites:

  • Have a Databricks account with sufficient permissions to create Delta Shares, add Recipients and add assets to Shares.

  • Create a table in Databricks, based on the tabular data format, containing the data you want to load into your DCN (see the sketch after this list).
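
If you are preparing this table in a Databricks notebook, a minimal sketch might look like the following. All names used here (catalog, schema, table, and columns) are hypothetical; consult the tabular data schema reference for the exact columns your DCN expects.

```python
# Minimal sketch for a Databricks notebook, where a `spark` session is
# predefined. All names below (catalog, schema, table, columns) are
# hypothetical; the tabular data schema reference defines the real layout.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.dcn_sharing.customer_profiles (
        id           STRING,  -- identifier value, e.g. a hashed email
        id_type      STRING,  -- identifier type
        loyalty_tier STRING   -- example trait column
    )
    USING DELTA
""")

# Populate the table from a hypothetical staging table.
spark.sql("""
    INSERT INTO main.dcn_sharing.customer_profiles
    SELECT id, id_type, loyalty_tier
    FROM staging.customer_export
""")
```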

Setting up a Delta Share in Databricks:

  1. In Databricks, head to Data and then select Delta Sharing from the accordion menu that appears.

  2. Click Share Data and create a Delta Share specific to your DCN. This share should contain tables that you want to load into your DCN.

  3. Click Manage Assets, add the previously created Delta table that is based on the tabular data format, and click "Add".

  4. Once you've added the assets, you can then click Add Recipients.

  5. In the modal that opens, you can either select a previously created recipient, or create a new one that is specific to your DCN.

  6. Click Create & Add and copy the URL for the credentials file.

  7. Download the credentials file to your local machine and head to your DCN to create the source (an optional verification sketch follows this list).
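
Before creating the source in your DCN, you can optionally sanity-check the share and credentials with the open-source delta-sharing Python client (`pip install delta-sharing`). The share, schema, and table coordinates below are placeholders for the names you created above.

```python
import delta_sharing

# Path to the credentials file downloaded from Databricks.
profile = "config.share"

# Listing the shares visible to this recipient confirms the credentials work.
client = delta_sharing.SharingClient(profile)
print(client.list_shares())

# Load the shared table as a pandas DataFrame. The
# "<share>.<schema>.<table>" coordinates are placeholders.
table_url = f"{profile}#my_dcn_share.default.customer_profiles"
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```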

Creating a Databricks source:

  1. Head to Data > Sources and click "Create".

  2. Select the Databricks source.

  3. Enter the name of the source; this is how your source will be identified when you want to use it across the platform.

  4. Enter the name of the Share, Schema and Table that you created in Databricks.

  5. Upload the previously downloaded credentials file.

  6. Click "Create".

Databricks Delta Sharing credentials are usually granted for two (2) weeks, after which you will have to upload a new credentials file.

Check with your Databricks administrator for the exact credentials lifetime.
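
The credentials file itself is a small JSON profile that may carry an expirationTime field, so one quick way to check when the current credentials expire (assuming the file is saved as config.share) is:

```python
import json

# The Delta Sharing profile is plain JSON; the expirationTime field is
# optional, so fall back gracefully when it is absent.
with open("config.share") as f:
    profile = json.load(f)

print(profile.get("expirationTime", "no expirationTime recorded in profile"))
```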

With these steps completed, your DCN service account should now have access to the Databricks table you shared.

[Image: Example of the Databricks Source Creation Form]