The default dashboards provide real-time figures about your site’s visitors (returning versus new, traffic, traffic sources) and conversions, allowing you to create more efficient segments and optimize results very quickly.
The period of analysis reflected in the dashboards is the last 24 hours. All indicators are updated every five seconds.
Combine Your Data Sources to Create Actionable Insights and Switch to Segment-Based Marketing Campaigns.
To increase relevancy, many marketers are building a single view of their prospects and customers that goes beyond the devices they’re using to consume digital content or the channels they’ve previously engaged with. This is made possible with DataCommander as it allows enterprises to compile, standardize, and store all previous prospect and customer interactions in one place that’s accessible for future campaigns.
This solution seamlessly integrates with many CRM platforms enabling both sales and marketing teams to enhance their existing prospect and customer profiles with online and offline purchasing data, browsing histories, previous marketing campaign exposures, and even call center interactions.
The DataCommander platform automates the process of combining data across disparate solutions, enabling marketers to do what they do best: understanding their audiences, designing relevant campaigns to target them, analyzing campaign performance, and leveraging the insights gained to optimize future campaigns.
Commanders Act’s DataCommander meets two main objectives: to help you better understand your data and to help you make the most of it. In a nutshell, it allows you to:
*Optimize traffic acquisition.
*Increase conversion rates.
*Collect and match data obtained from different sources (web, CRM, CMS …).
*Prepare data for optimized use by segmenting it.
*Share data among players within your partner ecosystem (retargeting, testing partners).
*Activate data directly from our TagCommander product.
In order to enrich your DataCommander module with varied data, you need to connect it to as many data collection tools and databases as possible. The more of them you connect, the larger your data universe will be, and the more deeply you will get to know your audience.
The data layer, which is placed in your site’s source code during the setup of our TagCommander product, is the first level of data exploited by DataCommander.
In the following articles, you will find essential concepts explained to better understand the product and its full potential.
Within Commanders Act’s DataCommander, a segment is a cluster of information obtained from different types of data categories that are continuously included in the module:
-Page views
-Visitors
-Conversions (coming up)
-Customer journeys (coming up)
This information can be obtained from your data layer or all other sources of data (Commanders Act MixCommander’s customer journey, call center-obtained information, CRM, etc).
The information comprising a segment can be narrowed down using the logical operators AND and OR: a segment is created/defined if it meets a number of conditions expressed with these operators.
These conditions can also be placed in groups in order to make a segment more precise. The logical operator AND can be used to create those groups. Please refer to the examples below for clarification.
Examples of segments:
Hot leads: all visitors who visited your site at least once and added a product to the shopping cart in the last 24 hours.
Loyal clients: all visitors having purchased something on the site in the last six months.
Wealthy clients: visitors having purchased something on the site in the last six months and with an average basket of more than EUR 1,000.
Import users
You can send us your user data in 2 ways:
Files importer (Get in touch with your Account Manager)
One line per user
A header row is needed in the file for the matching
Encoding: UTF-8
The variable cannot be calculated. It must be one line, one variable
PGP or GPG key encryption is not supported. Files must be unencrypted before import.
Variables:
The header must use the same name as the data variable (variables are in uppercase; they will automatically be transcribed in lowercase)
You can add new variables. Don't forget to define them in the DataCommander variable interface
CRM variables should be prefixed with "person." and custom variables with "person.custom."
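For illustration, a minimal import file could look like the sketch below (the column names person.email, person.gender and person.custom.loyalty_tier are hypothetical examples; use the variable names you have declared in your DataCommander variable interface, with the prefixes described above):
person.email,person.gender,person.custom.loyalty_tier
jane.doe@example.com,F,gold
john.smith@example.com,M,silver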
The first thing to do is to add the Commanders Act DataCommander tag into each and every container present on your site’s pages.
The tag must be added from the TagCommander interface.
1. Go to TagCommander and open the “EDITION” tab.
2. Click “SELECT”.
3. Click “ADD TAGS”.
4. In the tag library, look for the “DataCommander V.1.0” tag and add it to your container by clicking “ADD TAG(S)”:
Once the tag is added, you will need to generate a new version of each container you placed the tag into and proceed to deployment.
Now that you have added the tag and deployed the containers, you can create segments. Follow the steps below to do so.
Log in to Commanders Act from our website if you are not already logged in.
Click the DataCommander menu, then the “SEGMENT” button and then “ADD SEGMENT” to the right.
In the window that pops up, name your segment (1), choose whether you want to use it also in the TagCommander product or not (2) and add it (3).
If you choose “no”, tag firing rules based on this segment will not be made available in the “RULES” step from TagCommander.
In the configuration window that appears next:
1.Your recently created segment will be selected by default. You can display the list of all your available segments (should you wish to edit a different one), by clicking the arrow next to the segment’s name.
2.The cog wheel allows you to change your segment’s name and select whether you want to use it in the TagCommander product or not.
3. This menu allows you to include (HAVE) or exclude (DO NOT HAVE) visitors from your segment.
4. This menu allows you to select which data cluster you want to consider information from. Note: if you want to enable “customer journey” and “conversion”, you must have subscribed to our “MixCommander” product and requested the activation of the MixCommander data unit in the DataCommander module. Please contact your account manager or write to support@commandersact.com in this case.
5.Select your data universe (visitor “page views”, “properties”, “customer journey” or visitor “conversions”) and the corresponding variable from the dropdown menu (in the screenshot below, the data universe is “page views” and the variables suggested are those from your data layer).
6.The variable is selected
7.Select from the dropdown menu the type of matching you need (equals, contains, is bigger or smaller than, begins with, etc).
8.In this field, select the variable’s values you wish to define your segment with.
9. An auto-completion feature will suggest all the values that have been collected so far and are available in your global database.
Note: the field is auto-completed based on the letters a value starts with.
For example, if you are looking for “Insurance” and type “In”, “Insurance” will be proposed in the list (provided it is present in the database). But if you type “surance”, nothing will be proposed, even though “Insurance” is present in the database.
/!\ This field is also case sensitive: typing “Insurance” instead of “insurance” will have an impact on the listed values displayed.
10. Choose the frequency range – exact, minimum or maximum occurrences – of the chosen values.
11. Enter the number of occurrences.
12.Select the period of time.
13.“SAVE” your segment configuration.
14.A counter indicates the size of your segment in real time: the number of individuals belonging to that segment and the percentage it represents compared to the global population of your visitors.
You can build more complex segments by adding more conditions: each row is called a “condition”. When it is met, a visitor will be associated with your segment. Segments can be made wider or more precise by adding more conditions, based on the logical operators AND/OR.
To do so, click “ADD CONDITION” (1): it will add a row that needs to be configured just like the first one.
Once configured it will look like this:
The example above can be read as:
Segment name: Hot leads
Description: Hot leads are visitors whose page view information contains a page type equal to “Produit/assurance-auto” or “Actu-mobile” AND who are between 26 and 49 years old. In addition, to be included in this segment, the variables Page name and Age must have been populated with the aforementioned values at least four times in the past seven days.
In this example, no visitors within a pool of 10.9 million people matched those criteria. This is how segments are created within DataCommander: you can try any combination you like, see how many individuals your segments contain, and modify their configuration according to your needs.
Nevertheless, when the segment was made of only one type of visitor (those who had looked for the mentioned products at least once in the past 67 days), the population contained in that segment totaled 46,439 visitors. These examples are meant to illustrate the possibilities offered by DataCommander in terms of segment configuration.
When you are satisfied with your segment, click save and proceed to creating more segments or adding connectors in the “SHARE” tab.
When you have finished creating your segments and wish to add connectors, go to the “SHARE” tab.
Note: Prior to doing this, please make sure your Commanders Act consultant has configured a data transfer protocol to connect with your partners. The most commonly used is FTP. You can contact your Commanders Act consultant or write to support@commandersact.com to do so.
1. Go to “SHARE”
2.Click “ADD STREAM”
3.In the window that pops up, enter the name of your stream.
4.Click “ADD”.
5.The configuration window will appear and will allow you to select the data category you wish to share with partners (“Page Views” or “Visitors” in this case [A]), and the segment among those you have created (in this case, “Hot Leads” [B]).
6.Select a partner.
7.STREAM LIST displays all your streams.
8. Once you select your partner, a new configuration window will appear for you to enter the required details.
Note: you can have the stream transferred through an FTP or by Email, all you need to do is select the corresponding icon.
In this “new” (enlarged) window (CONFIGURATION) you will have to:
A. Click “EDIT” to select the information available and related to the segment that you wish to share with your partners.
B. Select the information to be transferred (variables): one by one, all at once, or all but a few. Selected variables turn purple; unselected variables remain gray.
C. Select the export frequency (once or at a later date).
D. Name the file/export.
E. Enter the time range (period during which the data transfer will happen).
F. Define the transfer frequency.
G. Define the period of time for which you wish to send the information (last 20 days for example).
H. Enter the email addresses of the people who should receive a notification when the stream is shared.
In the Advanced settings, you will need to select:
I. The CSV separator you wish to use.
J.The Encoding type.
9. When you are done configuring the segment and sharing options, click “SAVE” and wait a little while to allow the first file to be transferred.
Once your segments are created, you can use them immediately in the TagCommander interface.
Note: You must activate the “Use in TagCommander” option in the segment creation window in order to create segment-based rules.
To do so,
1.Go to the TagCommander interface.
2.Once there, select the container where the tags you want to fire based on a segment are.
3. When it opens, go to the RULES section.
4. Click “Constraints”.
Note: segment-based rules can only be of the “constraint” type.
5.Click “ADD TAG CONSTRAINT”
6.Select the “DataCommander” trigger type
7.Select the type of rule you wish to activate:
-“DataCommander OR Condition (Up to Six Possible Audiences)”: It allows you to fire one or more tags depending on the segment a visitor belongs to (one segment or another).
-“DataCommander AND Condition (Up to Six Possible Audiences)”: It allows you to trigger one or more tags if a visitor belongs to several segments.
-“Engage Event Activation”: It allows you to define which events you wish to trigger based on the segment a visitor belongs to. After defining the events, all you need to do is create one or more event constraints based on the segment(s) a visitor belongs to. You can do this with the DataCommander OR/AND conditions.
8.Name the rule.
9.Check the boxes corresponding to the tags you wish to trigger based on that rule.
10.Select your audience(s) from the dropdown menu.
11.Add your rule.
When the rule(s) are added, you will need to generate and deploy a new container*.
Here are some examples of how these rules can be used:
To trigger a popin if the user belongs to the “hot leads” segment.
To launch a virtual assistant if the user belongs to the “lost visitor” segment.
To display an “encouraging”/prompting message on your site if a user belongs to the “cold user” segment.
'User ID'
'Hashed email address'
'First visit date'
'Last visit date'
'Total page views'
'Total conversion amount'
'First conversion date'
'Last conversion date'
'Continent'
'Country'
'Region'
'City'
'Postal code'
'User gender'
'User age in years'
'User postal code'
'User address'
'User firstname'
'User lastname'
'Timestamp'
'Stream segment public IDs'
'Stream segment labels'
'Audience public IDs'
'Audience labels'
'User alias'
'User category'
'User agency'
'User loyalty card id'
'Last Device Name'
'Last Device Type'
'Total sessions'
'User Id List'
'User Email Hash List'
'Is master user'
'Original Products Count'
'Cancelled Products Count'
'Returned Products Count'
'Exchanged Products Count'
'Final Products Count'
'Total Original Conversion Amount'
'Total Cancelled Conversion Amount'
'Total Returned Conversion Amount'
'Total Exchanged Conversion Amount'
'Total Final Conversion Amount'
'Number of conversions'
'Date of First Conversion'
'Date of Last Conversion'
'Flags'
'New session'
'Env template'
'Env work'
'Country'
'Language'
'Channel'
'User ID'
'User logged'
'Gender'
'Age'
'Postal code'
'Address'
'Firstname'
'Lastname'
'Email (hashed)'
'New customer'
'Page Category 1 name'
'Page Category 2 name'
'Page Category 3 name'
'Page Category 1 ID'
'Page Category 2 ID'
'Page Category 3 ID'
'Key Words'
'Product ID'
'Product ID variation'
'Product name'
'Product name variation'
'Product quantity'
'Product unit price'
'Product unit price discount'
'Product URL'
'Product Image URL'
'Product Category 1 ID'
'Product Category 2 ID'
'Product Category 3 ID'
'Product Category 1 name'
'Product Category 2 name'
'Product Category 3 name'
'Product custom field 1'
'Product custom field 2'
'Product custom field 3'
'Product custom field 4'
'Product custom field 5'
'Basket ID'
'Basket custom field 1'
'Basket custom field 2'
'Basket custom field 3'
'Basket custom field 4'
'Basket custom field 5'
'Order ID'
'Order amount'
'Order products number'
'Conversion Product Name'
'Products list'
'Ordered products list'
If you need to map your segments:
Stream segment labels => Labels of segments selected for the stream
Stream segment public IDs => IDs of segments selected for the stream
Audience labels => Labels of all segments to which the user belongs
Audience public IDs => IDs of all segments to which the user belongs
"Default variables" means ...
Are they declared on each account by default? --> Yes, by our servers
Are they populated automatically by the DataCommander tag? --> No
'Custom variables' are variables that can be created especially for you, to suit your business. Limits:
Variable name: varchar(128)
Variable shortname: varchar(55)
Consents from our Trust product (our Consent Management Platform) are stored on users, and you can segment on or export these values with these variables:
Allows you to handle all kinds of data, incl. "" (empty)
Can come from collect, API, import
Can be used with text operator such as 'contains', 'begins with'...
All variables coming from MIX must be declared as String
Limit: 32766 bytes
Is either negative {-1, -2,-3, -4, -5, ... }, positive {1, 2, 3, 4, 5, ... }, or zero {0}
Can come from collect, API, import
Can be used with numeric operators for segmentation
Limit: 2147483647 (maximum value)
Cannot be empty
Please be careful when you try to use this very specific type
Format is very specific: YYYY-MM-DDTHH:mm:ss
Cannot be empty
Can come from collect, API, import
Can be used for timing segmentation, in the past "ago" AND in the future "from now"
Is used to retrieve true/false for a specific piece of information
Can come from collect, API, import
Can be used in segmentation with 1/0, true/false, or T/F: these 3 options return the same results; any other usage will be interpreted as true
Cannot be empty
Be careful, this is case sensitive!
Is whatever figure you want
Cannot be empty
Can come from collect, API, import
Can be used with numeric operator
Used to store several pieces of information in one variable
Example: { age : 12, gender : M }
Cannot be updated from the collected data layer (because the tag converts objects to strings); can only be updated via API or FTP import
Cannot be used for segmentation, only for exports
Cannot be empty
This is the list of proper variables you can use
First session date / Last session date = the first / last session date that is associated to a TCID
date format
can be used for segmentation
available operators: between / not between + date range
Count session = the number of sessions associated to a TCID for a specific date range
integer
can be used for segmentation
available operators: exactly / at least / at most + date range
Total session = the total number of sessions associated to a TCID
integer
can be used for segmentation
available operators: all operators that can be used for integers
as a classic user property there is no date range
New session = the first session variable populated on pageviews
boolean
can be used for segmentation (but honestly quite complicated to really use it...)
available operators: equals T/F
Be careful: it never matches 100%, as there are many TCIDs older than 30 days where the 'new session' variable in page views is not populated
Some information about how data is encrypted in DataCommander (AES-256)
If you need encrypted data in DataCommander, don't forget to pick 'Yes' for the encryption option in the 'Create variable' step.
This is the encryption scheme:
It means that we store encrypted data, and we are able to export clear data for our partners.
This is very important, especially for emails: email is sensitive data that many clients want to over-secure, but we need to export it in clear text for emailing partners to be able to use it.
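As a rough illustration of this principle (store encrypted, export in clear), here is a minimal Python sketch using AES-256-GCM via the cryptography library. It is not the exact scheme used by our servers, only a way to picture "encrypted in base, decrypted for export":
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key kept server-side
aesgcm = AESGCM(key)

def store_encrypted(value: str) -> bytes:
    # Encrypt a sensitive value (e.g. an email) before storing it
    nonce = os.urandom(12)                  # unique nonce per value
    return nonce + aesgcm.encrypt(nonce, value.encode("utf-8"), None)

def export_clear(stored: bytes) -> str:
    # Decrypt a stored value so it can be exported in clear to a partner
    nonce, ciphertext = stored[:12], stored[12:]
    return aesgcm.decrypt(nonce, ciphertext, None).decode("utf-8")

record = store_encrypted("jane.doe@example.com")
print(export_clear(record))                 # prints the original email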
CSV import through FTP
If you wish to share your data via flat file feed integration, please use a CSV feed format, following the structure below.
The CSV format is comma-separated. The header must be declared in the first row of the file; it is recommended that headers are lowercase and do not contain spaces.
Files can be uploaded to the Commanders Act FTP or downloaded from your own FTP by Commanders Act. Get in touch with your Account Manager to agree on the integration method that is best for you.
Use our CSV conversion importer connector to configure the FTP import.
Each line in the file should be a distinct product (conversion item) with its unit price and quantity. Transactions that include several products should be split into multiple lines sharing the same conversion_id.
Example:
conversion_ID_1, conversion_item_1
conversion_ID_1, conversion_item_2
conversion_ID_1, conversion_item_3
Be sure you are using exactly the same headers as in this example CSV file:
Some columns are required; please check which fields are required and which are not.
Please sort your file and group it by conversion ID (all conversion items related to conversion_ID_1, then all conversion items related to conversion_ID_2, etc.).
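As an illustration only, a file following these principles could look like the sketch below; the column names here are hypothetical placeholders, so be sure to use exactly the headers from the example CSV file referenced above.
conversion_id,product_id,product_quantity,product_unit_price,currency
conversion_ID_1,SKU-001,1,19.90,EUR
conversion_ID_1,SKU-002,2,5.00,EUR
conversion_ID_2,SKU-003,1,49.00,EUR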
This is where you create, edit and/or delete segments.
The first column displays the segments’ names.
The second column displays the segments’ technical IDs (present in the “TC AUDIENCE” cookie we use to store information about a segment).
The third column shows the percentage of visitors that are included in a segment.
The fourth column specifies the associated streams using this segment.
The fifth column shows when a segment was last modified.
The sixth column is where you edit or delete a segment, by clicking the pencil or the trash can.
Let’s say, for example, you want to match all users whose ZIP code equals ‘75009’, ‘92120’ or ‘94230’… and you realize you have more than 30 different values!
You can type the zip codes one by one, but it takes a long time.
Or you can copy and paste your values directly. However, you must use a separator between these values: ; or |
Don’t forget to add the separator after the last value as well, so that all your values are pasted.
You can paste up to 1,024 values in the segment.
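For example, pasting the following line (note the trailing separator) adds the three zip codes at once:
75009;92120;94230;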
You can segment on conversion items, meaning products bought by users.
First, you have to select the 'conversion' universe and define a condition for conversions, like 'conversion ID exists' (= users have at least 1 conversion).
Then the 'ADD CONDITION' button will appear, and you can define more conditions, including conditions for conversion items; select 'product':
Then, you can easily segment on conversion items:
The segment overlap feature allows you to compare your segments and visualize directly the audience shared between 2 segments.
This feature is very helpful in order to understand how your audience is divided and grouped in your segments.
It allows you to reduce marketing pressure, as it helps you avoid activating similar audiences. For example, you can send an email to users in one segment, then decide to send an SMS to users in another segment. However, it is recommended to check the overlap between the two segments to be sure you are not activating the same audience, which can raise the number of opt-outs.
It also allows you to refine your segments by dividing a big segment into smaller, more relevant pieces.
First, select one of your segments; the overlap between the segment you chose and all your other segments will be calculated automatically.
Then you can visualize and analyze the overlap, with the number of users in both segments and the percentage.
With the visual representation of each of your segments, you have an overview of the size of the segments compared, which is helpful to understand if your segment is part of a bigger one or is totally apart.
You can send us your conversion data in 2 ways:
DATA | Start | SSL | FTP server | In base | In export
Datalayer | Clear | Encrypted | – | Encrypted | Clear
CSV file | Clear | Encrypted | Clear | Encrypted | Clear
Criteo is an advertising company that provides online display advertisements, especially for retargeting campaigns.
Create a segment on DataCommander to build your audience. Then, you can push this audience to Criteo through our connector Criteo (audiences).
You first need to accept a consent link sent by our consultants. This link is needed to authorize our DataCommander Criteo App to access your Criteo account and push audiences.
Then you can configure the connector: select the segment you just created and simply enter your Criteo Advertiser ID (you can find it in your Criteo account).
Click on SAVE and a new audience (user list) will automatically be created on the Criteo side.
Ensure that you already have a Google account (Google Ads, DV360,...)
Contact your Google Account Manager to whitelist our DMP + get your Client ID (Client Customer ID: Your DoubleClick Client Customer ID)
We are listed as "TagCommander" and not as "Commanders Act".
You have to ask for the whitelisting of the advertiser ID, as we don't currently use the Partner ID on our connector.
This connector is only for Google DV360 accounts (including Google Ads for DV360) but not for Google Ads accounts (without a DV360 account).
If you have a Google Ads account, you can use the Customer Match connector or RLSA (Remarketing Lists for Search Ads).
Example of answer from dv360-support@google.com:
“You just have to tell your Google account manager to raise a DMP whitelisting request and once we receive that request it would be a new ticket altogether. So, one of our agents would be able to look into that and process your request.”
References:
Google Ads : https://developers.google.com/third-party-ads/googleads-vendors (look for "Fjord Technologies SAS")
Ad Manager : https://developers.google.com/third-party-ads/adx-vendors (look for "Fjord Technologies SAS")
Ad Manager & Ad Exchange: https://support.google.com/admanager/answer/9012903 ("look for Commanders Act")
Youtube : the partners' program has been closed
Wait for Google to whitelist our DMP.
If you are lost about all the namings, go to : https://marketingplatform.google.com/about/
Display & Video 360 was previously DoubleClick Bid Manager.
If you had a DoubleClick Bid Manager account or if you have a Display & Video 360 account, please use our connector 'Google Display & Video 360 Bid Manager'
Search Ads 360 was previously DoubleClick Search.
If you had a DoubleClick Search account or if you have a Search Ads 360 account, please use our connector 'Google Search Ads 360'
Display & Video 360 can also be used with an Adexchange account
If you had an Adexchange account, please use our connector 'Google Display & Video 360 Adex'
[Google DDP API][2473] Could not fetch lists: [AuthorizationError.USER_PERMISSION_DENIED @ clientCustomerId]
→ Trouble with the whitelisting: be sure you are using the right name (TagCommander) and the right ID (advertiser ID).
[Google DDP API][3206] Could not fetch lists: [DmpUserListServiceError.INVALID_CLIENT_CUSTOMER_ID @ clientCustomerId]
→ You don't have a Google DV360 account. Please use the Customer Match connector or RLSA (Remarketing Lists for Search Ads)
Pick your segment and create your stream
+ Enter your Google Client ID; if you do not have it, the setup fails
Segments will be directly created by Commanders Act, and will appear in the Audience list in Google Ads.
Interface of DV360 where you are supposed to find the segments:
The sharing is based on a cookie synchronization between Commanders Act and Google. If you see less data than expected in the data stream, check your cookie-sync ratio.
This feature allows you to create enriched properties on your user: flags, ratio, rolling count/average, conditional value, boolean...
Enrich your data with your business rules in a user-friendly UI.
An enrichment transforms a property from a static value to a dynamic/enriched one.
By introducing business rules to your attributes, you give them context and more meaning. These enriched attributes then become the building blocks for dynamic segmentation and analytics, or can be retrieved in the data layer via TagCommander to enrich your tags.
Flag allows you to tag customers who meet all the conditions you defined that are relevant to your business. It is helpful for creating advanced segmentation based on criteria. You can tag your best customers or identify visitors who have an interest in your products but did not buy. Then you can create new segments based on these specific flags and mix them with more granular conditions (e.g. have not seen a specific page within the last 2 hours).
What are the differences between segment and flag?
→ Segments can help you to identify and activate your customers based on their live behaviour: page views, clicks, sign-up, orders… Your customers are interacting with you, your websites, your applications, your customer service… you have to identify these live actions in order to prepare the next best action that will enhance your customer experience: for example onsite personalization after a click on a remarketing ad and a specific product view, personalized email after a contact with the customer service…
Segments are built for instant reaction: customer actions 〉identification 〉marketing activation. Because of this, segments are dynamic. Your users, according to their actions, will dynamically enter and, most importantly, automatically exit your segments the second they no longer meet the segment's conditions. You are sure to work on fresh data.
→ In a different way, flags can help you categorize your customers based on what really matters for your business: revenue.
You have to define which dimensions are key for your business and define a matrix that can help you to categorize your customers. For example, you want to flag your customers based on their order frequency and average order amount:
However, you can go further with flags because you can integrate other dimensions such as period, online/offline purchase, or whatever you need. This is possible because there is no data retention limit for a flag: data is stored for a long time, and the flag will not be removed from the user unless you define some exit conditions.
As a best practice, you should define around 10 different flags to categorize your customers, and as many segments as you need.
Important:
The new flag defined is NOT retroactive (yet):
Your new flag will take into consideration all the future actions/events/hits from the date you have created it.
It will soon be possible to define retroactive flags. Stay tuned!
Flag VIP customers: flag all your best customers
Flag window shoppers: customers interested but not converted
Review our Business case
You would like to launch a loyalty program with a different status for each customer: Platinum, Gold, Silver, Bronze… You have to define all the characteristics of each status and then create flags in order to categorize your customers according to the criteria you decided on.
For example:
Platinum
Total amount orders over the last 6 months >800€
Number of visits over the last 6 months >6
…
Gold
Total amount of orders over the last 6 months between 600€ and 800€
Number of visits over the last 6 months between 4 and 6
…
Silver
Total amount of orders over the last 6 months between 400€ and 600€
Number of visits over the last 6 months between 2 and 4
…
Bronze
Total amount orders over the last 6 months <400€
Number of visits over the last 6 months <2
…
Now it is your turn ;)
1. Name your new flag attribute and describe it
2. Define your conditions to ADD the flag
Add conditions among these universes:
For visitor
Filter among all the variables available through your web / mobile tracking
For page-view
Specify the number of page views over a period
Specify other variables such as page name, product category…
For view
Specify the number of views of your advertising over a period (all the advertising data comes from Mix Commander)
Specify other variables such as campaign, channel…
For click
Specify the number of clicks on your advertising over a period (all the advertising data comes from Mix Commander)
Specify other variables such as campaign, channel…
For conversion
Specify the number of conversions (orders, sign-up…) over a period
Specify other variables such as product name, billing information…
You can add multiple variables with the AND/OR function.
3. Define your conditions to REMOVE the flag
Never remove the flag
Remove the flag IF all conditions defined to add the flag are no longer met
Remove the flag IF specific conditions are met:
You can use the same conditions and filters as defined for adding the flag.
Please keep in mind that segments are more powerful for managing entry and exit criteria because they are dynamic (no need to set exit conditions).
Calculate a cumulative sum, average, minimum or maximum for one variable over a specific date range. This allows you to create new business rules based on a variable aggregation.
You can use these new indicators to create new segments or to flag your customers.
Total order: cumulative total order amount for last 6 months
Page views: total of pages views for last week
Orders: Average orders amount for last 6 months
Products: Average products in last month
Orders: Minimum order amount for last 6 months
Orders: Maximum order amount for last 6 months
You would like to identify customers who have ordered more than 300€ in the product category ‘clothes’ over the last 2 months. You set up the new rolling sum attribute:
You can now create a new segment based on the new attribute ‘[total orders amount clothes <2 months] >300€’, which, in this example, concerns only customer B.
With our example, we can also calculate the average order amount or the minimum/maximum amount.
1. Name your new attribute
2. Specify the universe (page view, view, click, conversion) and the variable you want
Variable type: numeric only
3. Define the date range you want
Forever: no period defined, you consider all the dates
Relative: you can set a duration on rolling hours or days (last 3 days for example)
Absolute: data is aggregated from a specific date (from 01/09/2019 for example)
Minimum period: 1 hour Maximum period: No limit
4. [Optional] If needed, define conditions in order to calculate the aggregation only if the conditions are met
Create a rolling count attribute for one variable, based on a specific date range. This feature is helpful to define your best customers based on KPIs like order frequency or number of visits to your website.
Frequency: conversions number in last 6 months
Discount efficiency: number of conversions with discount codes
Cross-selling: number of conversions with more than 2 product categories.
You would like to segment users who have 2 or more conversions with a discount code. You set up the new rolling count attribute: Count [Conversions] where [Conversion discount code] = 'True'
Then you can create a segment with the rolling count attribute 'Users with total conversions with a discount code = 2 or more', which returns, in our example, only customer B. However, you can go further by adding a period or other conditions.
1. Name your new attribute
2. Specify the universe (page view, view, click, conversion) you want
3. Define the date range you want
Forever: no period defined, you consider all the dates
Relative: you can set a duration on rolling hours or days
Absolute: data is aggregated from a specific date
Minimum period: 1 hour Maximum period: No limit
4. Define conditions in order to count only if the conditions are met
Filter on a specific page, product, ad…
Create a new calculated attribute. You can add, subtract, divide or multiply 2 or more variables. These new calculated attributes can be part of a segmentation. This feature allows you to define new KPIs based on mathematical formulas.
Ratio: distribution of online/offline conversions
CLV: Customer Lifetime Value
Ratio: You would like to identify cross-channel customers, meaning the share of customers who are buying both online and offline. You need a ratio between online and offline conversions:
Channel ratio: number of online conversions / number of offline conversions
The higher the rate, the more the preferred channel is online; the lower the rate, the more the preferred channel is offline.
Customer Lifetime Value (CLTV): This metric helps you estimate the income per customer over their entire relationship with your company (present and future). It allows you to establish profit forecasts based on future cash flows.
CLTV = Customer Value ✕ Customer Average Lifespan
Customer Value = Average purchase value ✕ Average purchase frequency (on 1 year)
Customer Average Lifespan depends on your business model (subscription service, freemium, retailer...). In general, we consider the customer average lifespan to be between 1 and 3 years.
To sum up, in order to calculate the customer lifetime value with the augmented user attributes feature, you have to:
Use a Rolling Average attribute to determine the average purchase value and the average purchase frequency
Use a Calculus attribute to define the Customer Value by multiplying the average purchase value and the average purchase frequency (determined previously on step 1)
Finally, use a Calculus attribute again to estimate the Customer Lifetime Value by multiplying the Customer Value (as calculated in step 2) and the Customer Average Lifespan (determined by your business model).
You can calculate the CLTV step by step or you can also create a formula that combines all the dimensions.
Ex: Average purchase value (on 1 year) = 50€; Average purchase frequency (on 1 year) = 5 orders/year; Customer Value = 250€ (50×5); Customer Average Lifespan = 2 years; Customer Lifetime Value = 500€ (250×2)
Or: CLTV = (50x5)x2 = 500€
1. Name your new calculated attribute
2. Specify the mathematical formula (rule) to calculate with the variables
Variable type: numeric only
Operators supported: ➕ ➖ ➗✖️
You have to type the variable name and add the operator. When a variable name is found, the auto-completion feature will suggest the exact name.
Ex: type ‘Lab’ and the platform will suggest ‘Label’.
This feature allows you to copy values stored at the event level (pages, views, clicks, conversions) and paste them at the user level. You can aggregate data at the user level in order to consolidate all the data around one unique user.
It is useful to have a global view of dimensions such as product categories viewed or last order date.
Last checkout date
Product categories viewed
You are looking for a trip for your next holidays. You visit many websites, blogs, travel agency websites... It takes time to choose the best offer that will suit you perfectly.
As a travel agency, you have many visitors on your website who leave a lot of information such as trip dates, destinations... This online data is stored for 30 days; however, your sales cycle could be longer than 30 days. In order not to lose this precious information, it can be useful to keep this data, and this is what Copy allows you to do.
You can create a new attribute called 'Trip dates' and store the dates considered. You can do the same for 'Destination' or whatever else could be useful for you. Then you can launch a dedicated campaign 3 months later with a segment based on this data (if there is no conversion for these customers).
1. Name your new attribute
2. Specify the universe (page view, view, click, conversion) and the variable you want
If a copy is from an encrypted variable, the attribute will be encrypted too
3. Define the date range you want
Forever: no period defined, you consider all the dates
Relative: you can set a duration on rolling hours or days
Absolute: data is aggregated from a specific date
Minimum period: 1 hour Maximum period: No limit
4. If needed, define conditions in order to copy the aggregation only if the conditions are met
Boolean allows you to create True/False conditions. Ask yourself a question, set conditions, and the value will be set to True on the user if the conditions are met; otherwise it will be set to False.
Has the user seen the campaign '10% discount'? True / False
Has the user added an item in the cart over the last 2 days? True / False
Has the user opened the last email sent? True / False
1. Name your new attribute
2. Define the conditions you want to use.
A control group is very helpful to measure the performance of a campaign. It is a small group of users (a percentage of users from the associated segments) which is deliberately not exposed to an ad or an email, in order to measure the difference in engagement or conversion between the population not exposed (control group) and the population exposed (users in the associated segments).
In our case, we take into consideration only conversions for now (online and offline purchases).
A control group is set up on a stream (Share menu). Check the box ‘Set control group’ to activate the possibility of configuring a control group.
Specify the percentage of users you want to set aside for your control group: these users will not be pushed to the stream (and therefore not exposed to the campaign).
10% is generally enough to have statistically significant metrics; however, for very small segments it is recommended to double this percentage.
As soon as you start the stream, the control group will also be launched, and results will be recorded.
Be careful with the modification of streams with an active control group, as it impacts the results on the campaign performance dashboard. The same applies to segments used in a stream with an active control group: if you modify a condition, it impacts all the results.
When a stream is over, you cannot relaunch it; you should duplicate it.
When a stream with a control group is activated, an icon will appear on the stream list page, to the right of the stream’s name.
Click on this icon to access the analytics part.
First, select the period of analysis: you can choose for example to read the results on the last 3 days or last week.
All the figures will be related to this analysis period.
Definition of each metric:
Total population: number of users in the segments selected on the stream. It is equal to Audience activated + Control group.
Audience activated: number of users that were pushed to the stream (and exposed to the campaign). It is equal to Associated segments – Control group.
Control group: number of users in the control group; it represents the percentage of users selected to be in the control group and not exposed to the campaign. It is equal to Associated segments – Audience activated.
Cards
You have here some metrics related to conversions.
We compare the performance between the users exposed and not exposed in terms of conversions (online and offline purchases).
In detail:
Incremental revenue represents the campaign impact on your revenue.
It all starts with a supposition: if we consider the full population as a control group, what would the performance be, and what is the incremental revenue generated by the campaign?
This incremental revenue could be positive or negative, and it represents the campaign performance and impact.
In detail, we consider the full population of users (= Audience activated + users in the control group) and we apply the control group's results for Conversion Rate and Average Basket Amount to all these users:
Then we can easily compare the revenue generated by the Audience activated with the simulated revenue generated by the total population as a control group, and see whether the incremental revenue is negative or positive.
Formula:
Incremental revenue = (Revenue generated by the Audience activated + Revenue generated by the Control group) - (simulated Revenue generated by the total population as control group)
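A worked example with purely hypothetical figures: suppose the audience activated (9,000 users) generated 50,000€ and the control group (1,000 users) generated 4,000€, while applying the control group's conversion rate and average basket amount to the full 10,000 users would simulate 40,000€. Incremental revenue = (50,000€ + 4,000€) - 40,000€ = 14,000€, i.e. a positive impact of the campaign.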
For each population, we calculate here the conversion rate over the period selected.
Formula:
Conversion rate = number of conversions ÷ number of users
(we compare the conversion rate calculated for the population ‘audience activated’ and for the population ‘control group’)
Represents the impact of the campaign. As we compare users in the activated audience and users in the control group, we have to determine the impact of the campaign in terms of conversions.
If the uplift is low or negative, it means users in the control group achieve good results without being exposed to the campaign, and so the campaign is not as efficient as it should be.
On the contrary, if the uplift is high, it means the campaign is over-performing, because users exposed to the campaign convert more than users in the control group; the impact on purchases is very positive.
Formula:
Uplift = ((conversion rate ‘Audience activated’ - conversion rate ‘Control group’) ÷ conversion rate ‘Audience activated’) x 100
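For example, with hypothetical figures: if the conversion rate of the audience activated is 2.5% and the conversion rate of the control group is 2.0%, Uplift = ((2.5 - 2.0) ÷ 2.5) × 100 = 20%, meaning the exposed population converts noticeably better than the non-exposed one.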
We compare here the revenue per user for each population (audience activated and control group).
Formula:
Average Revenue Per User (ARPU) = Total revenue ÷ number of users
(we compare the ARPU calculated for the population ‘audience activated’ and for the population ‘control group’)
We compare here the average basket amount for each population.
Formula:
Average basket amount = Total revenue ÷ number of conversions
(we compare the average basket amount calculated for the population ‘audience activated’ and for the population ‘control group’)
Charts
You can see here the conversion rate evolution day after day for the two populations: the audience activated and the control group.
This chart is cumulative in order to better reflect the impact in terms of users, exposure and conversions; that means conversions are added up day after day.
You can visualize directly here the comparison between the two populations (audience activated and control group) for the following metrics: ARPU (Average Revenue Per User); Average Basket Amount; Conversion Rate. It allows you to analyze quickly which metrics are very impactful and relevant to compare the populations.
You have here a summary of all metrics needed to compare the populations, such as the number of users and number of conversions (needed to calculate the conversion rate for example). The turnover is based on total_final_conversion_amount, meaning it could evolve depending on products returns, for example.
Yes, you can select an export (email, S3 or FTP for example), and select the universe 'campaign':
You can select 1 segment or choose 'all users' and you can export the following variables:
Control Group: does the user belong to the control group? True / False
Exposure count: number of exposures to the campaign for a user = number of times a user entered a segment and was pushed through the connector
First exposure: date of first exposure
Last exposure: date of last exposure
Stream ID: ID of the stream for which the control group was activated
This is where you create data streams and establish a connection between Commanders Act DataCommander and the partners you wish to share segments with.
The first column displays the name of the data stream.
The second column displays the type of transfer set up for the data stream: a one-time transfer or a scheduled transfer, with variable frequency.
The third column displays the connector name (due to be removed).
The fourth column corresponds to the segments associated with this stream.
The fifth column shows the last time the data stream was sent.
The sixth column shows the last time the data stream was modified.
In the seventh column you can edit, duplicate or delete a data stream by clicking the pencil, the sheets or the trash can icons, respectively.
This is what the partner selection window looks like; it will be described in further detail later on.
Ask your account manager to activate this option.
The color red represents a high probability of purchase (the more intense the red, the higher the probability). The color blue represents a high probability of no purchase (the darker the blue, the less likely users are to buy).
The first point on the left represents 100% of the population; a first split is made on the variable total_order_amount.
On the one hand, those who have bought in total (over their entire purchase history) for less than 25€ have very little chance of buying (very dark blue); on the other hand, those who have bought for more than 25€ have a slightly higher chance of buying (light blue). Among the latter, those who have recently been in the funnel have a good chance of buying (light red circle), and among them, those whose first visit date is less than 28 days ago have an even higher probability of buying (very dark red circle).
For those whose first visit is more than 28 days ago, we see that those who have seen fewer than 22 pages recently are unlikely to buy, except for those who have a total page view history of less than 55 and who have recently seen more than 4 pages.
Unlike those who have a total page view history of more than 55 and who have recently seen fewer than 8 pages, who will not buy.
And so on, following all the nodes.
The most predictive variable is the total amount of purchases (total_order_amount); I have to take it into account when I create my segments.
Recent presence in the purchase funnel significantly changes the deal (not surprisingly, hence the interest in relaunching abandoned baskets).
The volume of pages viewed is an indicator of the probability of purchase and depends on the date of first visit (we deduce that there is a kind of ratio that allows us to identify an intentionist from their date of first visit, their total number of pages ever viewed, and their number of pages viewed recently). So I have to create as many segments as there are red nodes on the far right of the screen to be able to find all the intentionist visitors (or dig into this ratio idea to create a new score variable to simplify the rest).
Among the other predictive variables to keep in mind are recent_view_product and recent_view_category: I see that the higher they are, the lower the chance of buying, probably because these are users who shop around without really knowing what they want, unlike those who view few products and categories and who are more likely to buy quickly.
The connector Google Store Sales Direct allows you to send conversions to Google Ads.
Store sales measurement allows you to use your sales data in a privacy-safe way to understand how much value your ads are truly driving for in-store purchases. By uploading and matching transaction data from your business, you can see how your ads translate into offline purchases.
More information: https://support.google.com/google-ads/topic/9941533?hl=en&ref_topic=7280668
From your Data Commander account, you can send your conversions through our connector. https://support.google.com/google-ads/answer/10018944?hl=en&ref_topic=9941533#zippy=%2Cbenefits-of-uploading-store-sales-data-in-google-ads
We send conversions to your FTP, and then you have to push these conversions from the FTP to Google Ads: https://support.google.com/google-ads/answer/10018944?hl=en&ref_topic=9941533#zippy=%2Cbenefits-of-uploading-store-sales-data-in-google-ads%2Cschedule-an-upload
DataCommander ⇒ Customer FTP (Connector Google Store Sales Direct)
Customer FTP ⇒ Google Ads (you should configure the file exchange between your FTP and your Google Ads account as described in the documentation above)
On the connector setup interface, select all users or select a segment you want to use.
For example, you can send only the conversions that belong to specific users you selected in your segment (users living in Germany only, for example).
You should select all the variables requested by Google: Conversion Time, Conversion Value, Conversion Currency, and at least one Email variable
Full list and format here: https://support.google.com/google-ads/answer/10018336?hl=en&ref_topic=9941533
All personal data (email, first name, phone number...) is hashed with SHA-256.
There are also 3 variables specific to that connector:
Loyalty Rate:
The percent of overall sales in your data file that you can associate with a customer
The Loyalty Rate needs to be between “0” and “1” (excluding “0”)
For example:
0.5
0.2
0.8
Transaction Upload Rate:
The ratio of sales you’re uploading to the overall sales that you can associate with a customer
While the Transaction Upload Rate can be between “0” and “1,” we recommend uploading all customer associated sales and setting the Transaction Upload Rate to “1”.
For example:
0.5
0.2
0.8
Conversion name:
The name of the conversion action that you’d like to import
Must match the same spelling and capitalization of the conversion action in your Google Ads account
For example:
SS_customer_signups
SS_customer_purchases
More details: https://support.google.com/google-ads/answer/10018336?hl=en&ref_topic=9941533#
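Purely as an illustration, assuming the exact header spelling from the Google documentation linked above, one upload row could look like the sketch below (the email is a SHA-256 hash of the normalized address, and the time format must follow Google's specification):
Email,Conversion Name,Conversion Time,Conversion Value,Conversion Currency
<sha256 of normalized email>,SS_customer_purchases,2024-03-01 14:30:00,89.90,EUR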
Google Customer Match allows you to push users to Google Ads. The matching key is the email address (SHA-256 hashed or in clear only).
You can create an audience segment, push it to Google Ads and activate it. Matched users can then be targeted on Google services (Search, Display Network, Gmail, YouTube, Google Shopping...).
The sync key can only be a SHA-256 email hash in Europe (phone number, mailing address and mobile device ID are for the US)
Audience size: minimum 1,000 matched users (otherwise the list will be rejected for privacy reasons)
Data collection: it is only allowed to import first-party data (collected via your website/app/point of sale)
Available for accounts with a good history of policy compliance, a good payment history, at least 90 days of history, and more than USD 50,000 total lifetime spend
Google indicates that it takes 6 to 12 hours for a list to be populated with members
A list may appear to be smaller than expected when viewed in the Audience Manager in the Google Ads UI. This view shows the number of active users in the list. A user is considered active if they have recently logged into their Google account.
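Since the sync key in Europe is the SHA-256 email hash, here is a minimal Python sketch of how an email is typically normalized and hashed before matching (the normalization shown, trimming and lowercasing, is an assumption; check Google's Customer Match formatting guidelines for the exact rules):
import hashlib

def hash_email(email: str) -> str:
    # Normalize (trim whitespace, lowercase) then hash with SHA-256
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

print(hash_email("  Jane.Doe@example.com "))  # 64-character hex digest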
First, you need to enter your Google Ads customer ID:
Then, there are 2 ways to configure it:
It is the fastest and easiest way to configure the connector. Simply tell our consultants or your customer representative that you want to use this option and provide them with your Google Ads Customer ID. They will then link your account with our Google Ads account in order to be able to push audiences.
We recommend using the first option, as the second option is much more complex and requires access to the Google Developer Console. You have to enter the Developer Token, Client ID and Client Secret, and generate a refresh token.
All details here:
Last step, create a user list on Google side.
On your Google Ads account, click on ‘Tools and settings’ and ‘Audience manager’.
Then in ‘Audience lists’, click on the + and select ‘Customer list’ (or user list).
Create the list, save it, open it and find the List ID.
Paste the List ID into the segment mappings on the DataCommander Google Customer Match connector interface (this allows you to map a segment in DataCommander to a user list in Google Ads).
Your connector is ready; click save.
After 12 to 24 hours, go back to Google Ads, click on the customer list you created, and it will be filled with the selected users.
Send your offline conversions (like purchases made in stores) directly to Facebook.
Create a new stream, select Facebook Offline conversion connector and enter the information needed:
Then select offline:
Finish the data source configuration and retrieve the Event set ID:
Create a system user (if you don't already have one) and generate a token:
Select ads_management authorization:
Copy and paste the token on our connector configuration page.
An API-based Facebook connector can be set up with the following procedure. The connector will send the user data of all the users belonging to a given segment. The audience sent can contain both Facebook subscribers and non-subscribers.
The more information you send to Facebook, the better the matching. You can choose to send email, phone number or any other information to increase the match rate with Facebook.
If users don't have enough information, we will reject them (example: if a user only has a zip code, Facebook will not be able to match this user with that information alone).
It takes up to 24 hours for Facebook to match users.
You need 2 pieces of information for the Facebook connector:
Ad Account ID
Access Token
Then you have to go to ‘Ad Account settings’:
And you will find the Ad Account ID:
Copy and paste it on our Facebook connector
Create a new app
Add the product ‘Marketing API’.
Go to the ‘Settings’ and then the ‘Advanced’ section
Link your app to your Business Manager account and add your Ad Account ID in the ‘Authorized Ad Account ID’ section:
Then click on 'App Review' and 'Permissions and features'. Find the 'Ads Management Standard Access' component and, if you don't already have it, submit a request:
Wait for the validation of your request.
You can choose here to generate a token without any expiration date (1st method, recommended) or with an expiration date (60 days - 2nd method, not recommended).
This system user should have admin rights for the ad account and for the app you created.
When it is done, you can Generate New Token:
Select the App you have created before and select the right permission:
Then copy and paste the generated token on our connector:
On the Facebook developers, on your App you created, click on ‘Marketing API’ and ‘Tools’.
On ‘Get Access Token’, tick ‘ads_management’ and click on ‘Get Token’.
You can now copy and paste this token on our Facebook connector.
Be careful: each time you click on ‘Get Token’, a new token is generated and the previous one is no longer valid; always use the latest one.
Before saving the connector, make sure you have accepted Facebook’s general conditions for custom audiences.
Go to this link and replace AD_ACCOUNT_ID with your own Ad Account ID:
Be sure you have the admin rights for your Facebook Business account.
Save the connector; users who enter the segment will be pushed to the new Facebook custom audience.
Then on Facebook, click on Audiences.
A new custom audience will be created.
The name of the new audience will start with CA_{name}
Serverside v2 and DataCommander connector
This connector allows you to push every kind of event directly to Facebook through the API: online conversions, offline conversions and more. Sending them to Facebook helps increase the reach and accuracy of your campaigns.
You can, for example, avoid sending campaigns for a specific product to users who already bought it, or target users who bought a specific product with cross-sell campaigns.
Facebook developed an API for this purpose called the ‘Facebook Conversions API’.
You need a Facebook Business Manager account
Then on the menu, click on 'Events Manager':
Here you have to create a new Web Pixel:
Select Conversions API and give a name to your connection:
Now your pixel is created and you will have access to the IDs needed on our connector.
You need to fill in the pixel ID on our connector; it is the ID of the pixel you just created in the steps above.
You can find this ID by clicking on the pixel’s name, to the right of the activity graph. You can also find it on the Settings tab.
You can now copy and paste this ID on our connector.
Then you need the Access Token
Click on the settings tab.
Scroll down to the ‘Conversions API’ section.
Click on the button 'Create Access Token':
If you are not able to click on the ‘Create Access Token’ button, it means you don’t have sufficient rights. You must be an administrator of your Facebook Business account to create it.
Then you can copy and paste the Access Token on our connector.
Only events with consent will be sent to Facebook.
Only purchases with personal information (email and/or phone number, etc.) will be sent to Facebook.
On the connector, consent is managed with the ‘User Consent Category’ field. You should enter the category ID corresponding to Facebook (advertising) in your TRUST consent categories.
We should distinguish 3 cases:
1) Your online events are collected through our Commanders Act event tag: you have to provide, in the event tags, the list of category IDs consented to by the user through the consent_categories property.
2) You push your events to us through the API or a CSV file: a consent_categories field must be added to the JSON or CSV to specify the consent category IDs of the user (see the sketch after this list). Then, inside the connector settings, use the ‘User Consent Category’ field to enter the category ID corresponding to Facebook (advertising).
3) You already manage consent on your side and only send us, from your server, events that obtained consent for the advertising category. In this case, do not fill in the ‘User Consent Category’ field in the connector.
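As an illustration, here is a minimal sketch of an event pushed to us through the API carrying the consent_categories field. The category IDs, property names and values shown are placeholders, not values from your setup.

```javascript
// Hypothetical event payload pushed to Commanders Act via API (one ndjson line).
// consent_categories lists the TRUST category IDs consented to by the user;
// the connector's 'User Consent Category' must match one of them.
const event = {
  event_name: 'purchase',                  // placeholder event
  user: { email: 'jane.doe@example.com' }, // placeholder user data
  consent_categories: [1, 4],              // placeholder category IDs (e.g. 4 = advertising)
  value: 120.5,
  currency: 'EUR'
};

// One JSON object per line (ndjson) when events are sent in bulk:
const ndjsonLine = JSON.stringify(event);
```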
Using both the pixel and the server is recommended by Facebook, as it helps avoid losing data.
To make it work, you should have the same configuration for both the pixel and the server, using the same Facebook parameters:
- event_id should be the same. On the pixel, event_id is automatically generated by our Commanders Act tag, and we retrieve the same value for the server in integrations.facebook.event_id. As a result, these two values should be identical.
- event_name should also be the same.
- The fbp parameter is automatically retrieved to keep the same value between pixel and server.
Deduplication works when the same event is sent first from the browser and then from the server, otherwise it creates a duplicate. Events are pushed in real-time.
On the pixel: eventID: tc.uniqueEventId is automatically generated.
On the server: integrations.facebook.event_id automatically retrieves the eventID value coming from the pixel (eventID: tc.uniqueEventId) for standard events.
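To illustrate the deduplication logic, here is a sketch of how the same event ID can be shared between the browser pixel and the server call. fbq is the standard Facebook pixel function; the server-side payload shown is only indicative and its property values are placeholders.

```javascript
// Browser side: the Facebook pixel call carries an explicit eventID.
// tc.uniqueEventId is the value generated by the Commanders Act tag.
fbq('track', 'Purchase', { value: 120.5, currency: 'EUR' }, { eventID: tc.uniqueEventId });

// Server side (indicative): the same value is exposed to the connector
// through integrations.facebook.event_id so Facebook can deduplicate.
const serverEvent = {
  event_name: 'purchase',
  integrations: { facebook: { event_id: tc.uniqueEventId } }
};
```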
To send any of your Commanders Act events that are not listed in the table above to Facebook as custom events, you don't have anything to do: by default, unmapped events are automatically sent as Facebook custom events with the name of your Commanders Act event.
If you want to change the name of the custom event that Facebook will receive, you can overwrite the event_name property using integrations.facebook.event_name:'yourCustomEventName'
Every property can be overridden using integrations.facebook.user_data.<property> (for standard data) or integrations.facebook.custom_data.<property> (for custom data).
Events can only be used if there is enough information to match a user. Facebook expects at least one user_data property, but strongly advises sending as many properties as possible.
Here are our conditions to send the events:
- at least 1 of these fields: em, ph, external_id, fbp, fbc
- at least 3 of the other fields
Note: external_id, fbp and fbc allow matching an event with other events, but to match a user, at least one of those events must contain additional information (em and ph are best suited for matching).
Custom Facebook parameters can be added.
Facebook allows you to send any data you want in custom data parameters. By default, we fill the generic fields when possible (value, currency, contents...). You can specify your own parameters in the tag in integrations.facebook.custom_data.
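For illustration, here is a hedged sketch of an event that overrides the custom event name and declares extra custom_data parameters; the property names (loyalty_tier, store_region) and all values are placeholders, not fields from your plan.

```javascript
// Illustrative overrides on an event sent to the Facebook connector.
const event = {
  event_name: 'my_internal_event',       // unmapped => sent as a Facebook custom event by default
  value: 89.9,                           // generic fields (value, currency...) are mapped when possible
  currency: 'EUR',
  integrations: {
    facebook: {
      event_name: 'yourCustomEventName',        // overrides the custom event name Facebook receives
      user_data: { external_id: 'user-123' },   // overrides a standard user_data property
      custom_data: {                            // extra custom_data parameters (placeholders)
        loyalty_tier: 'gold',
        store_region: 'ile-de-france'
      }
    }
  }
};
```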
Push your offline conversions, like purchases made in physical stores, directly to Criteo.
You must specify the Store ID variable, i.e. the variable corresponding to the store identifier on your conversions.
By default, we apply a filter on conversion_type = 'offline' to push only offline conversions. However, if you have a specific setup, you can change this value if needed.
Enter your Criteo Account ID; you can find it directly in the Criteo interface.
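As an illustration, an offline conversion collected by Commanders Act could look like the sketch below: its type matches the connector's default offline filter and it carries a store identifier that can be mapped as the Store ID variable. The store_id property name and all values are placeholders; use whatever variables exist in your own setup.

```javascript
// Indicative offline conversion (placeholder values).
const offlineConversion = {
  id: 'order-2041',
  type: 'offline',                               // matches the connector's default offline filter
  created: '2019-04-29T13:47:47+02:00',
  currency: 'EUR',
  final_amount: 64.0,
  custom: { store_id: 'paris-montparnasse-12' }, // placeholder: map this as the Store ID variable
  user: { email: 'jane.doe@example.com' }
};
```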
GET
https://api.commander1.com/v1.0/engage/visitors/
This endpoint allows you to get the properties of one specific visitor. When you create the token, you can define which properties to return. This API is primarily designed to be called from a tag in each user's browser.
GET
https://api.commander1.com/engage/user/
This endpoint allows you to get properties for one specific user based on a user_id. When you create the token, you can define which properties to return.
PUT
https://api.commander1.com/engage/user/
Insert or update a user
PUT
https://api.commander1.com/engage/user/?site=1234&user_id=1234&tc_id=1234&token=WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA
Delete a user
https://api.commander1.com/engage/user/
DELETE
https://api.commander1.com/engage/user/?site=1234&user_id=1234&tc_id=1234&token=WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA
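For reference, here is a minimal sketch of calling these endpoints from JavaScript with the query parameters shown above. The site, user_id, tc_id and token values are the sample ones from this page, and the response shape depends on the properties configured for your token.

```javascript
// Read a user's properties (GET) and delete a user (DELETE) on the Engage user API.
const qs = 'site=1234&user_id=1234&tc_id=1234&token=WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA';
const base = 'https://api.commander1.com/engage/user/';

// GET: returns the properties defined when the token was created.
fetch(`${base}?${qs}`)
  .then(res => res.json())
  .then(user => console.log(user));

// DELETE: removes the user.
fetch(`${base}?${qs}`, { method: 'DELETE' })
  .then(res => console.log(res.status));
```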
This article has been created for vendors that would like to synchronize their cookies (COOKIE_ID) with Commanders Act CDP.
Each vendor solution identifies browsers via a proprietary COOKIE_ID. This prevents them from sharing user data with other solutions directly. The cookie synchronization allows vendors to share their Unique Identifiers with the Commanders Act CDP.
Commanders Act sends a request to the vendor server to get their COOKIE_ID for a user and then stores the mapping on the Commanders Act servers. Cookie synchronization maps different IDs together and thus enables user data to be shared.
Web service URL: The vendor solution provides Commanders Act with the URL to be called to trigger the webservice and get its COOKIE_ID for the current browser
Commanders Act Token: Commanders Act provides the vendor with a 32-character unique ID (ex: KD04a85DjH1015yzuAu7pnaoTh5P53iL). This token will be used to send requests to the Commanders Act web service.
1) Commanders Act sends a request to the vendor's web service through its JavaScript tag (browser side).
2) The vendor API sends its COOKIE_ID to http://sync.commander1.com/#PARTNER_TOKEN#/#UID#, replacing #UID# in the destination URL on the fly with its COOKIE_ID. The #PARTNER_TOKEN# is provided by Commanders Act and is unique per vendor.
3) Commanders Act stores the vendor's COOKIE_ID in its cookie-sync mapping table. This database keeps a history of the Commanders Act unique ID and every synchronized vendor COOKIE_ID.
4) End users can create segments from the DataCommander UI. The COOKIE_IDs will be translated into the vendor's own when extracting the data, so the vendor will be able to re-use them.
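A vendor-side sketch of step 2 could look like the following. The cookie name and the readCookie helper are hypothetical, and the partner token shown is the sample one from this page.

```javascript
// Hypothetical vendor-side snippet firing the cookie-sync call (step 2).
// The vendor replaces #UID# in the destination URL with its own COOKIE_ID.
function readCookie(name) {
  const match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

const PARTNER_TOKEN = 'KD04a85DjH1015yzuAu7pnaoTh5P53iL'; // provided by Commanders Act
const cookieId = readCookie('vendor_uid');                // vendor COOKIE_ID (cookie name is hypothetical)

if (cookieId) {
  // Fire-and-forget pixel call to the sync endpoint.
  new Image().src = 'http://sync.commander1.com/' + PARTNER_TOKEN + '/' + encodeURIComponent(cookieId);
}
```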
Returns the list of segments created in Engage UIX
https://api.commander1.com/api/dms/segmentation/segments/list
GET
https://api.commander1.com/api/dms/segmentation/segments/list?site=1234&token=WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA
Returns the list of segments for the user
https://api.commander1.com/api/dms/segmentation/segments
GET
https://api.commander1.com/api/dms/segmentation/segments?site=1234&callback=tC_funcEngage
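For instance, the segment list could be fetched as in the sketch below. The site and token values are the sample ones from this page, and the exact response payload is not shown here. The per-user endpoint is designed for JSONP, so it is loaded as a script with the callback parameter.

```javascript
// List the segments created in the Engage UI (sample site/token values).
const listUrl = 'https://api.commander1.com/api/dms/segmentation/segments/list'
  + '?site=1234&token=WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA';

fetch(listUrl)
  .then(res => res.json())
  .then(segments => console.log(segments));

// Per-user segments via JSONP: define the callback, then load the script.
window.tC_funcEngage = data => console.log(data);
const script = document.createElement('script');
script.src = 'https://api.commander1.com/api/dms/segmentation/segments?site=1234&callback=tC_funcEngage';
document.head.appendChild(script);
```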
New functionality: copy the data you need from all universes (pages, views, clicks and conversions) to the user level. As a result, the data will be stored for life (in compliance with GDPR) instead of being deleted after the usual retention period (typically 30 days, depending on your contract).
On Facebook Business Manager, click on Events Manager and create a new data source:
Still on Facebook Business Manager, click on Business settings and select system user:
You have to go on Facebook Business and login to your account.
For this part you have to go to Facebook Developers and create or login to your account (you need to have a dedicated account for Facebook Developer).
Now you have an app linked to your Facebook Business Manager account. Please check this page to be sure that the app is correctly associated with the Business account:
You have to create a system user on your Facebook Business account.
TRUST Commander is our Consent Management Platform.
The following mappings are fully automated and do not require any additional configuration. Any of the Commanders Act standard events in the table below will be sent as the corresponding Facebook standard event. The official Facebook pixel documentation has more information on these events.
Official documentation
COMMANDERS ACT EVENTS | FACEBOOK STANDARD EVENT
COMMANDERS ACT STANDARD PROPERTIES | FACEBOOK STANDARD PARAMETERS
| Name | Type | Description |
|---|---|---|
| token | string | Security token |
| user_id | string | ID of the user |
| site | integer | ID of the site |

| Name | Type | Description |
|---|---|---|
| site | string | ID of the site (account) |
| user_id | string | ID of the user. Required if the tc_id parameter is not set |
| tc_id | string | Optional. Cookie ID of the user |
| token | string | Security token |
| Response formats | JSON |
|---|---|
| Requires authentication? | Yes (token) |

| NAME | REQUIREMENT | EXAMPLE VALUES | DESCRIPTION |
|---|---|---|---|
| site | \d+ | 1234 | ID of the site |
| user_id | \d+ | 1234 | ID of the user |
| tc_id (optional) | \d+ | 1234 | ID of the visitor |
| token | [a-zA-Z0-9]* | WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA | Security token |
| Response formats | JSON |
|---|---|
| Requires authentication? | Yes (token) |

| NAME | REQUIREMENT | EXAMPLE VALUES | DESCRIPTION |
|---|---|---|---|
| site | \d+ | 1234 | ID of the site to query detail for |
| token | [a-zA-Z0-9]* | WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA | Security token |
| Response formats | JavaScript (JSONP) |
|---|---|
| Requires authentication? | No if tcid is not set, token if tcid is set |

| NAME | REQUIREMENT | EXAMPLE VALUES | DESCRIPTION |
|---|---|---|---|
| site | \d+ | 1234 | ID of the site to query detail for |
| tcid (optional) | \d+ | 1234 | ID of the tcid (if the cookie is disabled) |
| token (optional) | [a-zA-Z0-9]* | WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA | Security token (if tcid is set) |
| callback | \w+ | do_something | JavaScript callback method for JSONP |
| Name | Type | Description |
|---|---|---|
| callback | string | (optional) Callback for JSONP request |
| token | string | Security token |
| site | integer | ID of the site |
| tcid | string | Cookie ID. If empty (recommended), it will read the tcid from the user's cookie. |
| | Customer A | Customer B |
|---|---|---|
| Order #1 | Product category: Clothes, Price: 80€, Date: 3 months ago | Product category: Bag, Price: 80€, Date: 3 months ago |
| Order #2 | Product category: Shoes, Price: 100€, Date: 2 months ago | Product category: Clothes, Price: 150€, Date: 1 month ago |
| Order #3 | Product category: Clothes, Price: 150€, Date: 1 month ago | Product category: Clothes, Price: 200€, Date: 2 weeks ago |
| Rolling SUM ‘Total orders amount clothes <2 months’ (Filters: Product category = Clothes, Period = last 2 months) | 150€ (only order #3 meets all the conditions) | 350€ (orders #2 and #3 meet all the conditions) |
| Segment based on the new attribute ‘Users with a total orders amount clothes <2 months’ (Filter: Total > 300€) | ❌ No entry | ✅ Entry |
| | Customer A | Customer B |
|---|---|---|
| Order #1 | Product category: Clothes, Price: 80€, Date: 3 months ago, Discount code: False | Product category: Bag, Price: 80€, Date: 3 months ago, Discount code: True |
| Order #2 | Product category: Shoes, Price: 100€, Date: 2 months ago, Discount code: False | Product category: Clothes, Price: 150€, Date: 1 month ago, Discount code: False |
| Order #3 | Product category: Clothes, Price: 150€, Date: 1 month ago, Discount code: True | Product category: Clothes, Price: 200€, Date: 2 weeks ago, Discount code: True |
| Rolling Count ‘Total conversions with a discount code’ (Filter: Discount code = 'True') | 1 (only order #3 meets the condition) | 2 (orders #1 and #3 meet the condition) |
| Segment based on the new attribute ‘Users who have 2 or more conversions with a discount code’ (Filter: Total >= 2) | ❌ No entry | ✅ Entry |
It is highly recommended to send multiple objects in one HTTP request. This API supports streaming using newline-delimited JSON (ndjson, http://ndjson.org/).
You may send up to 30 requests per second
You may have up to 30 concurrent connections
If you send many conversions/products/etc. in bulk, the upload speed will be limited to 30 conversions/products/etc. per second
If you send 1 conversion per request you will be limited to 30 requests per second
If you send 90 conversions in one request your upload will be completed in about 3 seconds
If you send 40 requests, each with one conversion in the same second, 30 of them will be processed and 10 of them will be rejected
If you send 3 requests, each with 100 conversions they will be completed in 10 seconds
You can send up to 150 conversion items
Use the long format with timezone for passing ISO-8601 dates. The following formats are accepted:
"2019-04-29T13:47:47.315Z"
"2019-04-29T13:47:47Z"
"2019-04-29T13:47:47.315+02:00"
"2019-04-29T13:47:47+02:00"
Errors are always returned as an array of objects in the top-level "errors" property.
For bulk operations you may have "errors" and "data" properties at the same time since some objects may have errors while others may not. Bulk errors are aggregated which means there won't be an error for each instance of an error but one error for each type of error with the number of occurrences and some examples of line numbers or ids.
Error objects have the following properties
Base URLs:
HTTP authentication, scheme: bearer. The token will be provided by our support/consulting team.
Code samples
POST /conversions/bulk
This endpoint creates and updates conversions. Your request will be processed asynchronously. It can take up to 24 hours until the request is processed and updates are made in the database.
Body parameter
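As a hedged sketch, the body is a set of newline-delimited JSON objects sent with a bearer token. All property values below are placeholders and the base URL is the one provided for your account; see the conversion schema further down for the full list of fields.

```javascript
// Two conversions as ndjson: one JSON object per line, joined with "\n".
const conversions = [
  {
    id: 'order-1001',
    user: { email: 'jane.doe@example.com' },
    type: 'offline',
    status: 'delivered',
    created: '2019-04-29T13:47:47+02:00',
    currency: 'EUR',
    final_amount: 150.0,
    conversion_items: [
      { id: 'item-1', original_quantity: 1, price: 150.0, product: { id: 'SKU-42' } }
    ]
  },
  {
    id: 'order-1002',
    user: { email: 'john.smith@example.com' },
    type: 'online',
    status: 'pending_shipment',
    created: '2019-04-29T14:02:10+02:00',
    currency: 'EUR',
    final_amount: 80.0,
    conversion_items: [
      { id: 'item-1', original_quantity: 2, price: 40.0, product: { id: 'SKU-7' } }
    ]
  }
];
const body = conversions.map(c => JSON.stringify(c)).join('\n');

fetch('https://<base-url>/conversions/bulk', {    // <base-url>: see "Base URLs" above
  method: 'POST',
  headers: { 'Authorization': 'Bearer <token>' }, // token provided by our support/consulting team
  body
}).then(res => console.log(res.status));          // 202 when accepted for processing
```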
Example responses
202 Response
400 Cannot parse nd-json line
400 Missing required property
400 Invalid property type
400 Invalid property format
401 Authorization header is missing
401 Token type is missing
401 The token type is invalid
401 The provided token is unknown
403 Response
429 Too Many Requests
500 Response
Code samples
POST /products/bulk
This endpoint creates and updates products. Your request will be processed asynchronously. It can take up to 24 hours until the request is processed and updates are made in the database.
Body parameter
It is recommended to use as many fields as you can in order to be able to build good segments with advanced conditions
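Here is a similar hedged sketch of a products body; the product values are placeholders and the base URL is the one provided for your account (see the product schema further down for the full list of fields).

```javascript
// Products as ndjson for POST /products/bulk (placeholder values).
const products = [
  { id: 'SKU-42', name: 'Trail running shoes', brand: 'Acme', category_1: 'Shoes',
    price: 150.0, currency: 'EUR', availability: 'in_stock', condition: 'new' },
  { id: 'SKU-7', name: 'Cotton T-shirt', brand: 'Acme', category_1: 'Clothes',
    price: 40.0, currency: 'EUR', availability: 'out_of_stock', condition: 'new' }
];
const body = products.map(p => JSON.stringify(p)).join('\n');

fetch('https://<base-url>/products/bulk', {       // <base-url>: see "Base URLs" above
  method: 'POST',
  headers: { 'Authorization': 'Bearer <token>' }, // token provided by our support/consulting team
  body
}).then(res => console.log(res.status));
```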
There are three ways to include product information in your conversion items:
1) Put the product properties inline for each conversion item.
2) Synchronize your product catalog with our database using the "POST /products/bulk" endpoint and only send product IDs in conversion items (our server will copy the product properties from the catalog).
3) Combine the previous two: maintain a product catalog and also send product information inline. If a property is present in both the catalog product and the inline product, the inline value overwrites the catalog value. This is useful when product information is incomplete or complementary in inline products.
It is recommended to send products inline, except when you do not have all the product information; in most cases you don't need the catalog. It is recommended to use as many fields as you can in order to build good segments with advanced conditions. When you only send the ID of the product in a conversion item, make sure your catalog already contains the product, otherwise the product properties will not be added to your conversion item.
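To make the difference concrete, here are two indicative conversion items: one referencing a catalog product by ID only, the other carrying the product inline. All identifiers and values are placeholders.

```javascript
// Catalog reference: only the product id is sent; properties are copied from the catalog.
const itemFromCatalog = {
  id: 'item-1',
  original_quantity: 1,
  price: 150.0,
  product: { id: 'SKU-42' }
};

// Inline product: properties travel with the item and overwrite catalog values if both exist.
const itemInline = {
  id: 'item-2',
  original_quantity: 2,
  price: 40.0,
  product: { id: 'SKU-7', name: 'Cotton T-shirt', category_1: 'Clothes', brand: 'Acme' }
};
```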
The following methods have been deprecated since January 2019. Please use version 2.0.
Insert a new conversion
https://api.commander1.com/engage/conversion/
PUT
https://api.commander1.com/engage/conversion/?site=1234&user_id=1234&tc_id=1234&token=WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA
The concept of this connector is to send new events to Tableau every X minutes in append mode (e.g. pageviews, clicks, impressions...).
We are able to push Data to Tableau every 10 minutes (full export or delta).
Append mode adds new lines to the existing table in Tableau, so the stream has to be configured as differential (export only new events), like this:
For the users universe it's different: you can't choose append, because users are updated continuously and you don't want to add the same users each time. You therefore have to choose overwrite mode for this universe, with an export each night (it can take hours to send all users to Tableau).
Tableau Online doesn't support updates; it only offers append or overwrite.
- Server URL: the URL of the Tableau server on which the data is published. For Tableau Online, specify https://online.tableau.com.
- Site name: the site ID is independent of the site name and is indicated in the URL when you view the site in a browser. For example, if the URL you see after signing in to Tableau Online is https://online.tableau.com/t/vernazza/views, the site ID is vernazza.
- Data source name: the name of the data source, as published to Tableau Online.
- Username: a valid Tableau Online user.
- Password: the password for the specified Tableau Online user.
The Trade Desk is one of the largest independent demand-side platforms (DSP), providing real-time ad pricing and placement for advertisers, ad agencies and brands.
Create audience segments on DataCommander and simply push these audiences (users) to The Trade Desk through our connector. As a result, you can then create personalized ads for these users on The Trade Desk platform. This connector uses the cookie-sync process to match users.
The setup is quite simple: you just have to enter your The Trade Desk Advertiser ID and The Trade Desk Secret Key:
You can find this information directly in your The Trade Desk account, in the Preferences section.
| Property | Type | Required | Description |
|---|---|---|---|
| code | string | true | Always present and contains error code that can be checked programmatically |
| detail | string | true | Human readable message that explains the problem. You should not check the value of this property programmatically because it may change |
| meta | object | false | Error specific object that contains details about what generated the error |
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| Authorization | header | string | true | Authorization token |
| body | body | | true | Conversions as newline delimited JSON strings |
| Status | Meaning | Description | Schema |
|---|---|---|---|
| 202 | Accepted | All objects are accepted for processing | None |
| 400 | Bad Request | Cannot process request or part of the request due to client error | None |
| 401 | Unauthorized | Cannot identify the API caller | None |
| 403 | Forbidden | API caller does not have access to this resource | None |
| 429 | Too Many Requests | | None |
| 500 | Internal Server Error | Internal server error | None |
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| Authorization | header | string | true | Authorization token |
| body | body | | true | Products as newline delimited JSON strings |
| Status | Meaning | Description | Schema |
|---|---|---|---|
| 202 | Accepted | | None |
| 207 | Multi-Status | | None |
| 401 | Unauthorized | | None |
| 405 | Invalid input | | None |
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| id | string(1-50) | true | none | Conversion id. Used as key for updates |
| user | object | true | none | All properties that you add here will be used as conditions for matching users in our database. You must ensure that values used inside these properties are unique. Use same property names as those defined in variables interface for the user. |
| » user.email | string(1-250) | false | none | Email of the user |
| » user.consent_categories | string | false | none | Consent categories of the user, to be allowed to share conversions with partners |
| type | string(1-250) | true | none | Type of conversion (online, offline, call etc.) |
| status | string | true | none | Status of your conversion (see list of possible values below). Conversions with status "pending" are not included in default sum and counts aggregated on a user. |
| created | string(ISO-8601) | true | none | Time when conversion occurred. See "Date formats" section above for a list of allowed formats. |
| updated | string(ISO-8601) | false | none | Time when conversion was updated. See "Date formats" section above for a list of allowed formats. |
| acknowledged | boolean | false | none | Set to true if conversion was acknowledged |
| currency | string(ISO-4217) | true | none | Currency |
| comment | string(1-250) | false | none | Comment of the buyer |
| billing_address | | false | none | It is recommended to use as many fields as you can in order to be able to build good segments with advanced conditions |
| contact_address | | false | none | It is recommended to use as many fields as you can in order to be able to build good segments with advanced conditions |
| shipping_address | | false | none | It is recommended to use as many fields as you can in order to be able to build good segments with advanced conditions |
| shipping_provider | string(1-250) | false | none | Shipping provider |
| shipping_tracking_code | string(1-250) | false | none | Shipping tracking code |
| payment_method | string | false | none | Payment method type (see list of possible values below) |
| payment_provider | string | false | none | Payment provider used for this transaction |
| original_quantity | float | false | read-only | Sum of all articles in the original conversion (CALCULATED) |
| cancelled_quantity | float | false | read-only | Quantity of cancelled articles in the conversion (CALCULATED) |
| returned_quantity | float | false | read-only | Quantity of returned articles in the conversion (CALCULATED) |
| exchanged_quantity | float | false | read-only | Quantity of exchanged articles in the conversion (CALCULATED) |
| final_quantity | float | false | read-only | Quantity of articles in final transaction for this conversion (original_quantity - cancelled_quantity - returned_quantity - exchanged_quantity) (CALCULATED) |
| original_amount | float | false | write-once | Original amount for this conversion (shipping price and taxes included) |
| cancelled_amount | float | false | none | Cancelled amount for this conversion |
| returned_amount | float | false | none | Returned amount for this conversion |
| exchanged_amount | float | false | none | Exchanged amount for this conversion |
| shipping_amount | float | false | none | Shipping amount for this conversion |
| discount_amount | float | false | none | Discount amount for this conversion |
| tax_amount | float | false | none | Tax amount for this conversion |
| final_amount | float | false | none | Final amount for this conversion after returns, exchanges, cancellations etc. (shipping price and taxes included). This represents the overall transaction amount between the buyer and the seller |
| custom | object | false | none | Object containing custom properties |
| conversion_items | | true | none | List of products in the conversion + their own attributes. You cannot have the same product twice inside a conversion unless you provide a conversion item id |
| Property | Value |
|---|---|
| status | canceled |
| status | delivered |
| status | in_progress |
| status | partially_delivered |
| status | partially_returned |
| status | partially_shipped |
| status | pending_shipment |
| status | returned |
| status | shipped |
| status | pending |
| payment_method | by_bank_transfer_in_advance |
| payment_method | by_invoice |
| payment_method | card |
| payment_method | check_in_advance |
| payment_method | cod |
| payment_method | coupon |
| payment_method | direct_debit |
| payment_method | online_payment_system |
| payment_method | other |
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| id | string | true | none | Id of this item in the conversion. This id is required. If you don't have an item id in your database and the same product id cannot repeat within a conversion, you can use the product id as value. This field is used for identifying the item in updates. |
| original_quantity | float | true | none | Quantity of items in the original conversion |
| cancelled_quantity | float | false | none | Quantity of cancelled items |
| returned_quantity | float | false | none | Quantity of returned items |
| exchanged_quantity | float | false | none | Quantity of exchanged items |
| final_quantity | float | false | none | Quantity of items in final transaction (original_quantity - cancelled_quantity - returned_quantity - exchanged_quantity) |
| original_amount | float | false | none | Original amount for this item |
| cancelled_amount | float | false | none | Cancelled amount for this item |
| returned_amount | float | false | none | Returned amount for this item |
| exchanged_amount | float | false | none | Exchanged amount for this item |
| final_amount | float | false | none | Final amount for this item (original_amount - cancelled_amount - returned_amount - exchanged_amount) |
| price | float | false | none | Price of item (using same currency as for conversion) |
| original_item | boolean | false | none | Whether this item was present in the original conversion. This is automatically set to false for all items added in conversion updates |
| custom | object | false | none | Object containing custom properties |
| product | | true | none | Product information for this item. See "There are three ways to include product information in your conversion items" above for how inline products and the product catalog interact. |
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| country | string(1-250) | false | none | Readable country name |
| iso_country_code | string(ISO-3166) | false | none | ISO-3166 country code |
| country_code | string | false | none | Use this field in case you use country codes other than ISO-3166 |
| region | string(1-250) | false | none | Administrative region |
| locality | string(1-250) | false | none | Name of city/town/village etc. |
| postal_code | string(1-250) | false | none | Postal code |
| recipient | string(1-250) | false | none | Recipient name |
| street_address | string(1-250) | false | none | Street name, street number, building number etc. |
| full_address | string(1-250) | false | none | Full address as a string that can contain newlines. Not usable in segmentation but available for exports |
| label | string(1-250) | false | none | Label for this address (home, work etc.) |
| coordinates | object | false | none | Coordinates for this address |
| » latitude | float | false | none | Latitude |
| » longitude | float | false | none | Longitude |
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| id | string(1-50) | true | none | Unique identifier for the article (try using the most specific identifier or SKU), such as a reference. If there are several occurrences for the same identifier, only the last one will be recorded |
| name | string(1-500) | false | none | Name of the article |
| description | string(max 5000 chars) | false | none | Description of the article |
| category_1 | string(1-250) | false | none | Main category of the article |
| category_2 | string(1-250) | false | none | Second sub-category of the article |
| category_3 | string(1-250) | false | none | Third sub-category of the article |
| category_4 | string(1-250) | false | none | Fourth sub-category of the article |
| category_5 | string(1-250) | false | none | Fifth sub-category of the article. If you have more than five levels of category you may choose to concatenate the remaining ones like 'Bikes/Parts/Wheels/Front' or simply ignore the remaining ones like 'Bikes', depending on your segmentation needs. |
| tags | [string] | false | none | Array of tags for the product. Tags can be anything that labels the product: hand-made, eco-friendly, heat-resistant etc. |
| condition | string | false | none | Current status of the material in your store (see list of possible values below) |
| availability | string | false | none | Current availability of the item in your store. Make sure to indicate the availability of the item on your store page and keep it up to date (see list of possible values below) |
| availability_date | string(ISO-8601) | false | none | Date when product became or will become available. See "Date formats" section above for a list of allowed formats. |
| expiration_date | string(ISO-8601) | false | none | Date when product became or will become unavailable. See "Date formats" section above for a list of allowed formats. |
| price | float | false | none | Default price for the article. In a conversion you can specify the real price at which the item was sold in case of sales, discounts etc. |
| sale_price | float | false | none | Default price for the article during sales periods. In a conversion you can specify the real price at which the item was sold in case of discounts |
| currency | string(ISO-4217) | false | none | Currency used for given prices. Note that you have to use the same currency for products and conversions |
| image_link | string(url) | false | none | URL of product image |
| link | string(url) | false | none | URL to the website where you can buy the item |
| brand | string(1-250) | false | none | Brand of the article |
| width | float | false | none | Width of the article in centimeters (cm) |
| length | float | false | none | Length of the article in centimeters (cm) |
| height | float | false | none | Height of the article in centimeters (cm) |
| weight | float | false | none | Weight of the article in grams (g) |
| size | string(1-250) | false | none | Size of the article when width, height and length are not applicable. You can use any value that describes the size. Examples: S, XL, large |
| colors | [string] | false | none | Colors of product |
| gender | string(1-250) | false | none | Gender for gender-specific products (male, female, unisex) |
| gtin | string(1-250) | false | none | International trade identification number of the article. Supported numbers: UPC (North America, 12 digits), EAN (Europe, 13 digits), JAN (Japan, 8 to 13 digits), ISBN (books, 13 digits) |
| mpn | string(1-250) | false | none | Manufacturer part number of the material |
| custom | object | false | none | Object containing custom properties |
| Property | Value |
|---|---|
| condition | new |
| condition | refurbished |
| condition | used |
| availability | in_stock |
| availability | available |
| availability | pre_order |
| availability | out_of_stock |
| gender | male |
| gender | female |
| gender | unisex |
| Response formats | JSON |
|---|---|
| Requires authentication? | Yes (token) |

| NAME | REQUIREMENT | EXAMPLE VALUES | DESCRIPTION |
|---|---|---|---|
| site | \d+ | 1234 | ID of the site |
| user_id (optional) | \d+ | 1234 | ID of the user |
| tc_id (optional) | \d+ | 1234 | ID of the visitor |
| token | [a-zA-Z0-9]* | WvNIX8955cnZ7WF0f632s0Wb99Ql3rtA | Security token |