AudienceData
Adserver
Best Practices
Do's and don'ts when using AudienceData
How to document the accuracy?
Why introduce segments with different affinities?
Using targeting with the right conditioning
DSP
How to access AudienceData segments in Adform
How to find AudienceData in MediaMath DMP
How to find segments in BidTheatre
How to find segments in Display & Video 360
Data Introduction
Available Segments
Existing integrations
Methodology and precision
The distinction between campaign impression profile and impressions in target group
What is deterministic data?
What is probabilistic data?
Publisher
Accessing targeted data with DFP Audience sync
AdForm publisher integration instructions
How to find data usage information in Google 360 for Publishers (former DFP)
How to report on AudienceProject segment usage in DFP
Inventory available for realtime targeting in DFP
Lotame integration
Sending targeting key values to AdManager
Troubleshooting
AudienceHub
How to create your first audience
How to create your first seed
Case 1: Selecting a customer file
Case 2: Selecting an Amazon S3 file
Case 3: Selecting survey data from UserReport
Creating a seed
Insights reports
What is AudienceHub?
The new generation of AudienceReport
API Documentation
How do I use the AudienceReport Next API?
Understanding the API documentation
What is an API?
Where do I find the API key?
Account Management
Account Types
Agencies: managing user access for connected accounts
FAQ: Disconnecting accounts
How to add new clients
How to connect my account to my client's or agency's account
How to disconnect accounts
How to manage access to my accounts
User roles
What is the 2-step verification and how does it work?
Integrations
Adform
Amazon Ads (Beta)
Beta expectations and limitations
Possibilities of Amazon Ads (Beta) integration
Use of Amazon Ads (Beta) integration
CTV and Addressable TV
Activate the CTV module
How is CTV and addressable TV measured?
What are the available CTV and addressable TV device types?
What is CTV and addressable TV?
What is the CTV and addressable TV measurement availability?
Campaign Manager
DV360
Facebook/Meta
Semi-Automated integrations
How do I check that the semi-automated integration is set up correctly?
What can be integrated using a semi-automated integration?
What is a semi-automated integration?
TechEdge Integration
YouTube
Measurement Methodology
Pixel Implementation
Getting Started with Pixels
How do URL-Parameters work?
How to add parameters to AudienceReport pixel
How to check if your pixel is firing?
How to create a pixel?
What is a cache-buster and why do we need it?
What is a tracking pixel?
What is the purpose of a t-code?
Setting up Pixels
How to set up measurement in Adform buy-side (Adform flow)
Implementing pixels in Campaign Manager
Implementing pixels in Display & Video 360
The GDPR parameters
SSL - Compliance
Pricing
Reports
Creating and Sharing reports
How to add and export tracking pixels to your reporting items
How to add custom report items
How to duplicate a report
How to export your report
How to share your report with your client
How to understand your report
How to understand your report - Dashboard
How to understand your report - Profile
How to understand your report - Reach
How to use an exported pixel
Getting Started with Reports
The original AudienceReport
Addressable TV
Activating Addressable TV measurement
Available Addressable TV device types
How Addressable TV is measured
How to get the Addressable TV measurement tool in AudienceReport
Impact on sample size and frequency
Sharing Addressable TV measurement numbers with TechEdge
What is Addressable TV?
Adserver Implementation
Ad Tech
Adserver - Adform
Adserver - VideoPlaza
Atlas
Double Click DCM Adserver
Emediate
Extended Sizmek Asia guide
How to implement creative wrapper in Ad Manager
Programmatic Publisher Ad Server - Adform PPAS
Setting up video measurement in Google Ad Manager
Sizmek Ad Suite Tracking
Sizmek/MediaMind guide
Tracking using JavaScript
Implementing AudienceReport tracking pixels in Webspectators
Brand Lift Studies
Cache-buster
Is my cache-buster working?
What is a cache-buster?
Which cache buster shall I use for my ad server?
Why do we need a cache-buster?
Creating Projects
Adding tracking points / pixels to your project
Applying filters to your data
Change your target group or report period
Creating your first project
Duplicating campaigns
How to merge projects
How your data will change when applying filters
Custom Segments
Activating your Customer Segments 3.0
Available Custom Segments
Custom Segments 3.0
Custom Segments and Sample Size
Reach, Coverage and Segments Availability
What are Custom Segments?
Event Tracking
Adding tracking points / pixels with event tracking to your project
Event tracking in various adservers
Implementing click trackers
In-app tracking
In-view tracking of inlined content
Understanding Event Tracking
What is Event Tracking?
Integrated Report
Connect your Facebook Business Manager account to AudienceReport
Connect your Google Ads account to AudienceReport
Connect your Google Display & Video 360 account to AudienceReport
How are data integrated?
How to create an Integrated Report
To-do list before creating an Integrated Report
Understanding your Integrated Report
What is an Integrated Report?
Integrations
Adform integration set-up
Automatic tracking of DFP campaigns
Google Campaign Manager Integration
Integrate AudienceReport and Techedge AxM (AdvantEdge Cross Media)
Intercept
Pixel implementation
Quality
How Transparency is measured
How Viewability is measured
How the Overall Quality Score is calculated
Viewability tracking using AudienceReport measurement script
What is Quality?
What is a good Quality score?
What is a hidden referrer or a generic referrer?
What is the difference between no referrer and other referrers (in the tracking point violations table)?
When is a tracking point considered to be violating Geo Compliance/Transparency/Viewability?
Why can’t I drill down on some countries to see in which regions my impressions are loaded?
Why is my overall score not that bad when almost all my impressions are of low quality?
Why is there a discrepancy between the impression count in the Quality tab and the rest of the report while my campaign is running?
Will my viewability score of 0.0 affect the overall Quality score if I didn’t implement in-view tracking?
Reports
Customized PDF reports
Deeper Insights with Campaign Audience Measurement
Exporting your report
How to search for your project
Introducing the common affinity profile
Managing your projects with labels
Sample sizes
Tired of clicks and imps?
Understanding your project
Technical Questions
Account Administration
Ad blocking
Can I change the phone number I chose for the two-step verification process?
Checking SSL-Compliance
General Troubleshooting
Getting started with AudienceReport API
How do URL-parameters work?
How often will I be asked to log in through the two-step verification process?
How to track traffic by device type
If you accidentally delete pixels from your project
The procedure to enable the two-step verification
What if I lose my phone and cannot access my account?
Tracking Pixels
Upgrade to the new generation of AudienceReport
AudienceReport Upgrade FAQ
Comparing the original and the new generation of AudienceReport
How to import data from the original AudienceReport
UserReport
Installing UserReport and setting up your media sections
Defining your website in the media section
General Account Information
Installing UserReport on your website or app
Kits
General Information
Reach and Coverage of Custom Segments
Target Audience verified by Kits
The technology behind Kits
What are Kits?
Getting started
Troubleshooting
Working with Kits
The feedback widget
Activate the Feedback widget
Adding context information to ideas and bugs
Customize Feedback widget buttons and links
Customize color, text and position of the Feedback widget
Disabling the Feedback widget on specific devices
Get direct link to the Feedback widget or manually display it
How to activate your Feedback widget
How to change the status of an idea or add a comment
How to disable the "report a bug" feature
Is the Feedback Forum available in my language?
Pre-populating username and email
What is the Feedback widget?
The feedback report
The survey widget
Activate Email Collect
Activate Net Promoter Score ®
Activate the Survey widget and create your questions
Chained questions and how they work
Controlling invitation frequency when using UserReport script and Google Tag Manager
How many questions can be added?
How many survey answers do I need?
How to add questions to your survey
How to customise your survey widget
How to deactivate and delete your survey questions
How to show or hide results from users
Is UserReport 100% free?
Is the UserReport survey widget available in my language?
Managing invitation rules through Ad-server
Preview your survey
Respecting the user
User invitation to UserReport survey and the quarantine system
Who pays for the prizes in the survey?
Will UserReport slow down my website? Is it affected by popup blockers?
The Google Analytics Integration
The survey reports
Accessing Newsletter signups using API
How many questions can be added?
You can add as many questions to the survey as you like!
Please note that the questions in the survey are randomized. This means that no matter how many questions you add, each participant will be asked a maximum of 20 questions.
Why are questions randomized?
A problem we have been facing is that webmasters have a hard time redefining surveys when they need to gather new information. After a certain number of questions, the survey gets too long, and every time you want to add something, you need to remove something else to keep the length down. Removing a question also breaks its historical data series, which must be considered as well.
In other words, making changes to an already running survey requires thought and attention. That is why we have introduced a random rotation of questions in the survey widget.
Surveys still suffer from the constraints imposed back when a survey was something printed on a piece of paper. Instead, try to imagine a survey as a long list of questions you would like to know the answers to - a sort of 'Questions and Answers'. With the new UserReport survey widget, you don't really build a survey as a traditional serialized sequence of questions.
You just define those questions you would like answered - Nothing else!
The UserReport survey widget dynamically builds a custom survey for every visitor who takes the survey. We use a random selection of questions to make sure you get enough answers to each question. This means that no single survey will be longer than 20 questions.
When you edit your survey widget in the survey editor, you may preview your survey - either in full length or randomized like a respondent will experience it.
But how does it affect my data?
Let's say you have 20 questions in your survey and you collect 2000 interviews/responses. In this case you are within the limit of 20 questions per survey, so every participant sees every question and you will get 2000 answers to each and every question.
If you increase the number of questions in your survey to 40 - and conduct another 2000 interviews - you will get 1000 answers to each of the 40 questions. But each participant will still experience the survey as being only 20 questions long.
Clever, right? No matter how many questions you add, the survey experience will be the same. And all 40 questions can be cross-tabulated with each other, because the selection of questions is randomized for each survey.
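The arithmetic above can be sketched as a small helper. This is just an illustration of the math in this article, not part of the UserReport product; the function name and signature are made up for the example.

```python
def answers_per_question(total_questions: int, interviews: int,
                         max_per_survey: int = 20) -> float:
    """Expected number of answers each question receives when every
    respondent is shown at most `max_per_survey` randomly chosen questions."""
    # Each respondent sees either all questions or the per-survey maximum.
    shown = min(total_questions, max_per_survey)
    # Random rotation spreads those slots evenly across all questions.
    return interviews * shown / total_questions

# 20 questions, 2000 interviews: every question is shown to everyone.
print(answers_per_question(20, 2000))  # 2000.0
# 40 questions, 2000 interviews: each question lands in half the surveys.
print(answers_per_question(40, 2000))  # 1000.0
```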
The randomization is a bit more advanced than just picking questions at random.
The randomization algorithm will always try to split the 20 questions evenly: 50% personal questions and 50% opinion questions. Why? Because if you ask too many personal questions, participants get bored and leave the survey. So we need to strike a nice balance between personal and opinion questions.
Another important detail is that personal questions should always be asked at the end of the survey; that ensures higher completion rates. Opinions first - then personal questions. But don't worry - we have automated that behaviour as well.
Finally - questions about the overall satisfaction score, gender and age will always be asked in your survey. It would not be a user satisfaction survey without those questions. So they are (still) mandatory.
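Putting the rules above together - random selection, a roughly 50/50 personal/opinion split, mandatory questions always included, and personal questions last - could look something like this. It is a minimal sketch of the behaviour described in this article, not UserReport's actual implementation; all names and the exact slot-splitting logic are assumptions.

```python
import random

MAX_QUESTIONS = 20  # per-survey cap described in the article

def build_survey(personal: list, opinion: list, mandatory: list) -> list:
    """Assemble one respondent's survey of at most MAX_QUESTIONS questions.

    `mandatory` (satisfaction, gender, age) is always included; the
    remaining slots are split roughly 50/50 between personal and opinion
    questions, and personal questions come last."""
    slots = MAX_QUESTIONS - len(mandatory)
    # Aim for an even split of the remaining slots; opinion questions
    # absorb any slots the personal pool cannot fill.
    n_personal = min(len(personal), slots // 2)
    n_opinion = min(len(opinion), slots - n_personal)
    chosen_personal = random.sample(personal, n_personal)
    chosen_opinion = random.sample(opinion, n_opinion)
    # Opinions first, then personal questions (including the mandatory
    # ones) to keep completion rates high.
    return chosen_opinion + chosen_personal + mandatory

# Example: 25 opinion and 15 personal questions still yield a 20-question survey.
survey = build_survey([f"p{i}" for i in range(15)],
                      [f"o{i}" for i in range(25)],
                      ["satisfaction", "gender", "age"])
```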