Google Cloud Storage
Updated by Steen Voersaa
By default, Dreamdata builds your database on Google BigQuery. Through our Google Cloud Storage integration, however, you can export the raw dataset of your Dreamdata database to another cloud storage container and from there load it into Snowflake.
Google Cloud Platform
To use Google Cloud Storage you first need a Google Cloud Platform account.
Dreamdata is a Google Cloud Partner, and if you have not already signed up for Google Cloud Platform you can do so here and get $500 of free credits. If you are not familiar with Google Cloud Storage and want to learn more, take a look at these how-to guides.
Set up necessary integration details
After you have acquired a Google Cloud Platform account, go here to set up the necessary access and integration details. Once the integration is configured, we will begin to generate your raw datasets on a 6-hour schedule.
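Once the first export cycle has run, you can sanity-check that files are landing with the gsutil CLI. The bucket name and dataset folder below are placeholders; use the names shown in your integration settings:

```shell
# List the exported Dreamdata datasets (bucket name is hypothetical)
gsutil ls gs://your-dreamdata-export-bucket/

# Inspect the files of one data model, e.g. an events folder
gsutil ls gs://your-dreamdata-export-bucket/events/
```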
Some key points:
- This data will be hosted in the region of your choosing
- Dreamdata will pay the data storage costs
- You will pay for exporting costs
We will need the following:
1. The region/zone of your Google Cloud Platform project.
2. Your Google Cloud Platform service account.
3. A list of emails of users that should have access to the exported dataset.
4. A list of Google group email addresses that should have access to the dataset.
Note: Either (3) or (4) is required.
Exporting your data
After you have configured the Google Cloud Storage integration and gained access to your files, any one of the following methods can be used to transfer data from Google Cloud Storage to either AWS S3 or another datacenter. From there it can be loaded into Snowflake as normal.
If you are using Snowflake on AWS
- You can connect data from Google BigQuery to AWS S3 using AWS Glue. This article describes how to set this up and run the task on a schedule.
- This article describes how to transfer the data using the CLI. This approach works between any two cloud storage providers.
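As a sketch of the CLI route, the transfer can be done with gsutil's rsync command, which also understands s3:// URLs once your AWS credentials are configured in the boto config file. Both bucket names below are placeholders:

```shell
# One-shot, parallel sync from the GCS export bucket to an S3 landing bucket.
# -m runs operations in parallel; -r recurses into the dataset folders.
# Requires AWS credentials in ~/.boto (aws_access_key_id / aws_secret_access_key).
gsutil -m rsync -r gs://your-dreamdata-export-bucket s3://your-s3-landing-bucket
```

Run the same command on a schedule (for example via cron) to keep the S3 copy in step with the 6-hour export cadence.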
If you are using Snowflake on any other data platform
- You can use Snowflake Scheduled Tasks and COPY INTO to load data directly from Google Cloud Storage.
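A minimal sketch of that Snowflake-side setup, assuming an account admin has already created a storage integration for your GCS bucket; all object names here are hypothetical, and the file format should match what your export actually produces:

```sql
-- External stage over the Dreamdata export bucket (names are placeholders).
CREATE STAGE IF NOT EXISTS dreamdata_stage
  URL = 'gcs://your-dreamdata-export-bucket/'
  STORAGE_INTEGRATION = dreamdata_gcs_int;

-- Scheduled task that reloads the data every 6 hours, matching the export cadence.
CREATE TASK IF NOT EXISTS load_dreamdata_events
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 */6 * * * UTC'
AS
  COPY INTO dreamdata_events
    FROM @dreamdata_stage/events/
    FILE_FORMAT = (TYPE = PARQUET)          -- adjust to your export's file format
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_dreamdata_events RESUME;
```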
Once connected, you can run your own queries on our data models, as well as copy, manipulate, and join the data, and use it within other tools connected to Snowflake.