Custom Data Upload
For most Dreamdata customers, the native data sources within Dreamdata solve all their data integration needs. However, in some cases data exists in other standard or custom-built systems. This article describes how data from these other sources can be brought in, and covers the use cases we typically see.
This overview covers the following:
- Use Cases
- Upload a custom CRM Source
- Upload custom Stage Objects
- Upload custom Events and Web-tracking Data
- Implementation Process
Use Cases
Below are the three use cases, each requiring different file types and schemas. All fields within these schemas must be present, but you may leave a field empty (null) if you have no data for it or if it doesn't apply to your context.
You might share multiple files or just one, depending on the use case.
Upload a custom CRM Source
Uploading a custom CRM source is relevant for customers in one of the following situations:
- You have a CRM not yet supported by Dreamdata
- Dreamdata natively supports Salesforce, HubSpot, Pipedrive, and MS Dynamics
- You have a self-hosted CRM such as Oracle, SAP or MS Dynamics
- You want to avoid hitting CRM API limits
- Consider whether upgrading your CRM plan would be cheaper and simpler.
- You have sensitive data in your CRM that you cannot share with Dreamdata for privacy or contractual reasons, for example if you work with the public sector, healthcare, or other regulated industries.
- CRMs often have complex permission models that can prevent Dreamdata from accessing sensitive data via the native CRM integration. Here, you can read more about the Salesforce and HubSpot permission models.
Here, you can read more about the schema format when loading a custom CRM data source.
Upload custom Stage Objects
Uploading custom Stage Objects is typically needed when your ERP system holds information not otherwise available in your CRM, or when your measurements are defined as joins of multiple objects.
Here, you can read more about the schema format when loading custom Stage Objects.
Upload custom Events and Web-tracking Data
Uploading custom events is typically used for events coming from a service you use that Dreamdata does not yet support, or from a service you built yourself, such as your own product.
Uploading custom web-tracking data is typically used when your tracking solution is not yet supported by Dreamdata out of the box, such as Snowplow, a home-built solution, or another tracking tool.
Here, you can read more about the schema format when loading events and web-tracking data.
Implementation Process
- Prepare the data to upload
  - Transform the data into one of the supported file formats (JSON format is preferred):
    - JSON Lines (.jsonl)
    - Newline Delimited JSON (.ndjson)
    - Parquet (.parquet)
    - CSV (.csv)
  - Compress the data using gzip.
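As a sketch of this preparation step, the snippet below writes a few hypothetical account records as gzipped JSON Lines. The field names here are illustrative only; the actual fields come from the schema articles linked in each use case.

```python
import gzip
import json

# Hypothetical account records; the real field names come from the
# Dreamdata schema for the data type you are uploading.
records = [
    {"id": "acc-1", "name": "Acme Inc", "domain": "acme.com"},
    {"id": "acc-2", "name": "Globex", "domain": None},  # fields you lack stay null
]

# Write one JSON object per line (JSON Lines), gzip-compressed,
# producing accounts_01.jsonl.gz ready for upload.
with gzip.open("accounts_01.jsonl.gz", "wt", encoding="utf-8") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")
```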
- Upload the data
  - For custom CRM data and Stage Objects
    - Go to Data Platform -> Sources -> Custom CRM source. After enabling the source, credentials will be made available. The data has to be uploaded via SFTP through one of two options:
      - Upload through the web client UI: https://ftp.dreamdata.io/web/client/login
      - Or upload using the command line:
        sftp username@ftp.dreamdata.io
        put /path/to/local/file
    - For incremental uploads, the folder structure must be: /{data-type}/dt={YYYY-MM-DD}/{data-type}_{number}.jsonl
      Example: /accounts/dt=2024-01-23/accounts_01.jsonl, where 2024-01-23 is the upload date
    - For uploads representing the whole period, overwrite the files at each upload. The folder structure must be: /{data-type}/{data-type}_{number}.jsonl
      Example: /contacts/contacts_01.jsonl
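The folder conventions above can be generated programmatically. A minimal sketch follows; the helper names are our own for illustration, not part of any Dreamdata tooling.

```python
from datetime import date

def incremental_path(data_type: str, upload_date: date, number: int) -> str:
    """Build the upload path for an incremental upload:
    /{data-type}/dt={YYYY-MM-DD}/{data-type}_{number}.jsonl"""
    return f"/{data_type}/dt={upload_date.isoformat()}/{data_type}_{number:02d}.jsonl"

def full_period_path(data_type: str, number: int) -> str:
    """Build the upload path for a whole-period upload (overwritten each time):
    /{data-type}/{data-type}_{number}.jsonl"""
    return f"/{data_type}/{data_type}_{number:02d}.jsonl"

print(incremental_path("accounts", date(2024, 1, 23), 1))
# /accounts/dt=2024-01-23/accounts_01.jsonl
print(full_period_path("contacts", 1))
# /contacts/contacts_01.jsonl
```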
  - For custom Events and Web-tracking
    - The data has to be uploaded to Google Cloud Storage or SFTP. Alternatively, Dreamdata can pull the data directly from your BigQuery (EU) instance.
    - You gain access to upload through your Customer Success Manager.
    - Here are code samples of how to upload data to Google Cloud Storage.
    - For incremental uploads, the folder structure must be: /{data-type}/dt={YYYY-MM-DD}/{data-type}_{number}.jsonl
      Example: /tracking/dt=2024-01-23/tracking_01.jsonl, where 2024-01-23 is the upload date
    - For uploads representing the whole period, overwrite the files at each upload. The folder structure must be: /{data-type}/{data-type}_{number}.jsonl
      Example: /events/events_01.jsonl
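Before uploading, it can help to sanity-check that your files follow the expected path convention. A hedged sketch using a regular expression; the function is illustrative only, not part of Dreamdata.

```python
import re

# Matches the incremental layout: /{data-type}/dt={YYYY-MM-DD}/{data-type}_{number}.jsonl
INCREMENTAL = re.compile(
    r"^/(?P<type>[a-z-]+)/dt=\d{4}-\d{2}-\d{2}/(?P=type)_\d+\.jsonl$"
)

def is_valid_incremental_path(path: str) -> bool:
    """Return True if the path follows the incremental upload convention."""
    return INCREMENTAL.match(path) is not None

print(is_valid_incremental_path("/tracking/dt=2024-01-23/tracking_01.jsonl"))  # True
print(is_valid_incremental_path("/events/events_01.jsonl"))  # False: whole-period layout
```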
- Dreamdata will pick up the latest data based on your Data Model Schedule.
- Uploading the data before the Data Model is scheduled to run ensures the data is as fresh as possible.
- Dreamdata recommends uploading the data daily unless it changes infrequently.