Custom Data Upload

Updated by Andreea Mahu

Custom Upload is available to Enterprise customers.
Before getting started, please get in touch with your Dreamdata Customer Success Manager. A custom upload of data almost always requires technical resources on your end, so before you invest time and resources, let us help you ensure a custom upload is the best option.

For most Dreamdata customers, the native data sources within Dreamdata cover all their data integration needs. However, in some cases, data lives in other standard or custom-built systems. This article describes how data from these other sources can be brought in, covering the use cases we usually see.

This overview covers the following:

  1. Use Cases
    1. Upload a custom CRM Source
    2. Upload custom Stage Objects
    3. Upload custom Event and Web-tracking Data
  2. Implementation Process

Use Cases

Below are the three use cases, each requiring different file types and schemas. All fields within these schemas must be present, but you may leave a field empty (null) if you don't have any data for it or if it doesn't apply to your context.

You might share multiple files or just one, depending on the use case.

Upload a custom CRM Source

It is possible to use Dreamdata without a CRM; read more here.

Uploading a custom CRM Source is intended for customers in one of the following situations:

  • You have a CRM not yet supported by Dreamdata
  • You have a self-hosted CRM such as Oracle, SAP, or MS Dynamics
  • You want to avoid hitting CRM API limits
    • Consider whether upgrading your CRM plan might be the cheaper and better option.
  • You have sensitive data in your CRM that you can't share with Dreamdata for privacy or contractual reasons, for example if you work in the public sector, healthcare, or another regulated industry.
    • CRMs often have complex permission models that can prevent Dreamdata from accessing sensitive data via the native CRM integration. Here, you can read more about the Salesforce and HubSpot permission models.

Here, you can read more about the schema format when loading a custom CRM data source.

Upload custom Stage Objects

Before uploading custom stage objects, consider whether you can instead add a custom or calculated field to your CRM object and use the stage model builder. That is typically easier and faster, and it is available in all packages.

Uploading Custom Stage Objects is typically needed when your ERP system holds information not otherwise available in your CRM, or when you have defined your measurements as joins of multiple objects.

Here, you can read more about the schema format when loading custom Stage Objects.

Upload custom Event and Web-tracking Data

Before uploading custom event data, it is recommended to check the different options within the Data Hub. These are often easier and faster to implement.

Uploading custom events is typically used for events coming from a third-party service that Dreamdata does not yet support, or from a service you have built yourself, such as your own product.

Uploading custom web-tracking data is typically used for data from tracking solutions that Dreamdata does not yet support out of the box, such as Snowplow, a home-built setup, or another tracking solution.

Here, you can read more about the schema format when loading events and web-tracking data.

Implementation Process

  1. Prepare the data to upload
    1. Transform the data into one of the supported file formats (JSON format is preferred).
      1. JSON Lines (.jsonl)
      2. Newline Delimited JSON (.ndjson)
      3. Parquet (.parquet)
      4. CSV (.csv)
    2. Compress the data using gzip.
  2. Upload the data
    1. For custom CRM data and Stage objects
      1. Go to Data Platform -> Sources -> Custom CRM source. After enabling the source, credentials will be made available. The data has to be uploaded via SFTP, through one of two options:
        1. Upload through web client UI https://ftp.dreamdata.io/web/client/login
        2. Or upload using command line
          sftp username@ftp.dreamdata.io
          put /path/to/local/file
      2. For incremental uploads, the folder structure must be: /{data-type}/dt={YYYY-MM-DD}/{data-type}_{number}.jsonl
        Example: /accounts/dt=2024-01-23/accounts_01.jsonl, where 2024-01-23 is the upload date
      3. For uploads representing the whole period, overwrite files at each upload. The folder structure must be: /{data-type}/{data-type}_{number}.jsonl
        Example: /contacts/contacts_01.jsonl
    2. For custom Event and Web-tracking
      1. The data has to be uploaded to Google Cloud Storage or SFTP. Alternatively, Dreamdata can pull the data directly from your BigQuery (EU) instance.
        1. You gain access to upload through your Customer Success Manager.
        2. Here are code samples of how to upload data to Google Cloud Storage.
      2. For incremental uploads, the folder structure must be: /{data-type}/dt={YYYY-MM-DD}/{data-type}_{number}.jsonl
        Example: /tracking/dt=2024-01-23/tracking_01.jsonl, where 2024-01-23 is the upload date
      3. For uploads representing the whole period, overwrite files at each upload. The folder structure must be: /{data-type}/{data-type}_{number}.jsonl
        Example: /events/events_01.jsonl
  3. Dreamdata will pick up the latest data based on your Data Model Schedule.
    1. By uploading the data before the Data Model is scheduled to run, you ensure the data is as fresh as possible.
    2. Dreamdata recommends uploading the data daily unless the data changes infrequently.
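As an illustration of step 1 above, the following sketch prepares one day's worth of account records as a gzipped JSON Lines file in the incremental folder layout. The accounts data type and the field names are illustrative assumptions; the required fields are defined in the schema documentation linked earlier in this article.

```shell
# Build the dated incremental folder: /accounts/dt=YYYY-MM-DD/
DT=$(date +%F)
OUTDIR="accounts/dt=${DT}"
mkdir -p "$OUTDIR"
OUTFILE="${OUTDIR}/accounts_01.jsonl"

# One JSON object per line (JSON Lines). Field names here are
# hypothetical examples, not the actual required schema.
printf '%s\n' \
  '{"id":"acc-1","name":"Acme Inc"}' \
  '{"id":"acc-2","name":"Globex"}' > "$OUTFILE"

# Compress with gzip, as required; produces accounts_01.jsonl.gz
gzip -f "$OUTFILE"
```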
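The command-line upload in step 2 can also be scripted non-interactively with an sftp batch file, which is convenient for a daily cron job. The username and file names below are placeholders; use the credentials issued on the Custom CRM source page.

```shell
# Sketch: build an sftp batch file for today's incremental upload.
DT=$(date +%F)
REMOTE_DIR="accounts/dt=${DT}"

# mkdir fails if the folder already exists; drop that line on re-runs.
cat > sftp_batch.txt <<EOF
mkdir ${REMOTE_DIR}
put accounts/dt=${DT}/accounts_01.jsonl.gz ${REMOTE_DIR}/
EOF

# Run it with your Dreamdata SFTP credentials (placeholder username):
# sftp -b sftp_batch.txt username@ftp.dreamdata.io
```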
