Amazon S3

Setting up your Amazon S3 bucket destination.


Creating an Amazon S3 Destination

You can create a new Amazon S3 destination from your DCN by going to the Audiences -> Destinations section in your navigation menu, or by using the Optable CLI as described below.

To add your Amazon S3 bucket, you will need to enter the name of this destination (this is how your destination will appear in the listing), the Amazon S3 bucket URL, region, access ID and secret key.

The <access-id> and <secret-key> are required and should specify the AWS access ID and secret associated with a service account having at least write permissions to the specified bucket (s3:PutObject).
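
As a point of reference, a minimal IAM policy granting that write permission might look like the following sketch. The bucket name and statement ID are placeholders, and your organization may require additional or more restrictive statements:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowOptableExportWrites",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-s3-bucket-name/*"
    }
  ]
}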

From the CLI, enter the following:

optable-cli destination create s3 <bucket> <access-id> <secret-key>

Where:

  <bucket>        Bucket URL in which objects should be stored.
  <access-id>     AWS Access ID
  <secret-key>    AWS Secret Key

For example:

$ optable-cli destination create s3 my-s3-bucket-name \
              ACCESS_ID SECRET_KEY

Optional Arguments

By default, a created destination has the same name as the specified bucket. Because destination names must be unique within the DCN, if you wish to create multiple destinations with the same bucket name, you can specify a custom name for the destination itself with the --name=<name> option.

For an AWS S3 destination, you can optionally supply --region=<region> to specify the AWS region to use; automatic region selection is used by default. You can also specify a custom AWS S3 endpoint with --endpoint=<endpoint>.
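
For example, a hypothetical invocation combining these options (with placeholder values for the destination name and region) might look like:

$ optable-cli destination create s3 my-s3-bucket-name \
              ACCESS_ID SECRET_KEY \
              --name=my-custom-destination \
              --region=us-east-1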

Here is an example of the expected CLI output:

{
  "id": 7,
  "kind": "DESTINATION_KIND_S3",
  "created_at": "2022-09-28T14:27:46.412692412Z",
  "updated_at": "2022-09-28T14:27:46.412692412Z",
  "name": "destinationfromCLI",
  "state": "DESTINATION_STATE_ACTIVE",
  "gcs": {
    "bucket": "S3://destinationfromCLI"
  }
}

Exporting Audiences to S3 buckets

To export to your S3 bucket, navigate to the audience you want to export and click "Export" in the top right. From there, you will be asked to select your preferred Amazon S3 destination from a list of all available destinations.

Here you can select the format to export your audience as; both CSV and TSV formats are supported. In addition to the format, you can choose to export either IDs or ID clusters, as well as which IDs to export and various other export options. At the bottom of the export page, you can set the name of the file that will be exported to your Amazon S3 bucket.

From the CLI, enter the following:

optable-cli export create file <audience-id> <destination-id> <filename> <format>

Where:

  <audience-id>       Audience to export.
  <destination-id>    Destination of the export.
  <filename>          Filename of the Export on the destination.
  <format>            Format to use for export (id_csv, id_tsv, cluster_json).

You can choose which IDs to send with:

--identifiers-filter=IDENTIFIERS-FILTER,...
                             Filter identifiers by type. Options are: idfa,
                             gaid. Defaults to all types if omitted.

Here is an example of what your command should look like:

$ optable-cli export create file 114 7 "myfile.csv" "id_csv"
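
If you only want to send specific identifier types, you could append the --identifiers-filter option described above; the types chosen here are purely illustrative:

$ optable-cli export create file 114 7 "myfile.csv" "id_csv" \
              --identifiers-filter=idfa,gaid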

Here is an example of the expected CLI output:

{
  "id": 95,
  "created_at": "2022-09-28T14:40:22.878863747Z",
  "updated_at": "2022-09-28T14:40:22.878863747Z",
  "destination": {
    "id": 7,
    "kind": "DESTINATION_KIND_S3",
    "created_at": "2022-09-28T14:27:46.412692Z",
    "updated_at": "2022-09-28T14:27:46.412692Z",
    "name": "destinationfromCLI",
    "state": "DESTINATION_STATE_ACTIVE"
  },
  "audience": {
    "id": 114,
    "name": "myAudience",
    "kind": "AUDIENCE_KIND_QUERY",
    "created_at": "2022-07-15T17:53:29.255148Z",
    "updated_at": "2022-07-15T17:56:27.565118Z",
    "insights_computed_at": "2022-09-07T14:03:06.979720Z",
    "state": "AUDIENCE_STATE_ACTIVE",
    "activation": {
      "keyword": "48dcdse8ff"
    }
  },
  "state": "EXPORT_STATE_PENDING",
  "identifiers_filter": [
    "ID_KIND_EMAIL_HASH",
    "ID_KIND_APPLE_IDFA",
    "ID_KIND_GOOGLE_GAID",
    "ID_KIND_CUSTOM_ID",
    "ID_KIND_VISITOR_ID",
  ],
  "file": {
    "filename": "myfile.csv",
    "override_existing": true,
    "export_metadata": true,
    "id_format": {
      "kind": "ID_FORMAT_KIND_CSV",
      "type_column": true,
      "include_header": true
    }
  },
  "filename": "myfile.csv",
  "override_existing": true,
  "export_metadata": true,
  "id_format": {
    "kind": "ID_FORMAT_KIND_CSV",
    "type_column": true,
    "include_header": true
  }
}

What Can I Export to Amazon S3?

You can export IDs only or ID clusters. All ID types are supported by this destination type. Exports to this destination also have the following optional settings:

  • Include the header row.

  • Include values prefixed with type indicator.

  • ID type in second column.

  • Export an additional METADATA file.
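
Purely as a hypothetical illustration (the exact column names and type labels depend on the options you select), an ID export with the header row and ID type column enabled could look similar to:

id,type
1b4f0e9851971998e732078544c96b36c3d01cedf7caa332359d6f1d83567014,ID_KIND_EMAIL_HASH
38400000-8cf0-11bd-b23e-10b96e40000d,ID_KIND_APPLE_IDFA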

How will my Export Appear in Amazon S3?

If you choose to re-export an audience with the same name, it will overwrite the existing file in your storage bucket. However, if you opt for a filename that does not currently exist, a new file will be created upon export.

Your export will appear in your Amazon S3 bucket with the file name you set during export creation.

Finally, if you created the destination using your DCN UI, you will be redirected to the destinations listing. If you used the CLI, you can export directly from the command line.