The steps you need to complete to implement our data collection scripts are shown on the Data Sources page, which you can find in the Admin menu in the Emarsys Marketing Cloud.
This page describes these steps.
- Before you start
- Step 1 – Pick your base currency
- Step 2 – Upload your product catalog
- Step 3 – Add the commands required on all pages
- Step 4 – Add the page-specific commands
- Step 5 – Validating the JS API implementation
- Step 6 – Bringing data collection live and validating the implementation
- Step 7 – Checking site traffic statistics
- Step 8 – Checking the output in your contact database
Before you start
- We provide a number of tools to help you validate your integration as you implement it, including the Live Validator, the Live Events view, and our very own Inspector Gadget.
- All data that is collected and stored by these scripts is anonymous; only after it has been imported into the Emarsys database is the web behavior associated with actual contacts.
- If you are using these scripts in conjunction with Emarsys Predict, you should discuss your data collection policy with Emarsys Support, as the type of data collected depends greatly on the profile of your online business.
- Most importantly: You don’t just want to collect data on all your current web pages, you also want to collect it on all the new pages you create in the future. This includes new product pages and the landing pages you create for marketing campaigns. So you should also add the relevant commands to the templates you use for creating different page types.
Step 1 – Pick your base currency
If you have not configured your base currency yet, you can do so in the Account Settings box on the right-hand side. The base currency is the currency in which you want to track your revenue statistics, regardless of the number of currencies actually used on your website.
One important point: your revenue (and therefore your data collection) is recorded in a single base currency, which is used to display both website revenues and campaign revenues. Your site may offer products in several currencies and at different prices, but this is not relevant for data collection, which must always use your base currency.
If your currency is not shown in the list, please contact Emarsys Support.
Step 2 – Upload your product catalog
The product catalog identifies your products for analysis. The catalog is a simple table, updated daily, with each row containing all the information relating to a specific item (product ID, category, page URL, thumbnail image link, etc.). This step should already have been performed as part of your data onboarding.
If you have an existing Google Tag Manager integration, then you may want to see our Guidelines for implementing GTM.
Step 3 – Add the commands required on all pages
Some of our API commands should be included on ALL website pages:
- The setEmail command, to identify your logged-in visitors.
- The cart command, to report the cart content of your visitors.
- And the go command, to execute the Scarab Queue.
Use this JSFiddle example to see these commands in action. Use the LIVE EVENTS viewer in the Web Behavior box on the Data Sources page to check that your implementation works.
JSFiddle working code example for ALL website pages:
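As an illustrative sketch only: assuming the standard ScarabQueue loader snippet is already on the page, the all-pages commands are pushed onto the queue as shown below. The email address and cart items are placeholder values.

```javascript
// Re-use the queue if the loader snippet already created it.
var ScarabQueue = ScarabQueue || [];

// Identify the logged-in visitor (only push this when an email is known).
ScarabQueue.push(['setEmail', 'visitor@example.com']);

// Report the current cart content; pass an empty array for an empty cart.
ScarabQueue.push(['cart', [
  { item: 'ITEM-0001', price: 19.99, quantity: 2 },
  { item: 'ITEM-0042', price: 5.5, quantity: 1 }
]]);

// Always push 'go' last: it executes the queued commands.
ScarabQueue.push(['go']);
```

Note that the order matters: identification and cart data are queued first, and the go command is pushed last to flush the queue.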
Step 4 – Add the page-specific commands
Some data-collection commands should only be included on specific pages of your website:
- The view command reports the ID of the product being viewed on your product page.
- The category command reports the category path being visited.
- The purchase command reports the orders placed. This is the basis for the revenue metrics.
Use these JSFiddle examples to see these commands in action. Use the LIVE EVENTS viewer in the Web Behavior box on the Data Sources page to check that your implementation works.
JSFiddle working code example for Product Pages:
JSFiddle working code example for Category Pages:
JSFiddle working code example for the Order Confirmation Page:
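As a sketch of the page-specific commands (again assuming the standard ScarabQueue snippet; the product ID, category path, and order data are placeholder values), each page type pushes its own command before the final go:

```javascript
var ScarabQueue = ScarabQueue || [];

// Product page: report the ID of the product being viewed.
ScarabQueue.push(['view', 'ITEM-0001']);

// Category page: report the category path being visited.
ScarabQueue.push(['category', 'Bikes > Road Bikes']);

// Order confirmation page: report the order just placed.
// Prices must be in your base currency.
ScarabQueue.push(['purchase', {
  orderId: 'ORDER-2024-0001',
  items: [
    { item: 'ITEM-0001', price: 19.99, quantity: 2 }
  ]
}]);

// As on every page, 'go' executes the queued commands.
ScarabQueue.push(['go']);
```

In practice a single page pushes only the one command matching its type (the three are shown together here for brevity), alongside the all-pages commands from Step 3.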
Step 5 – Validating the JS API Implementation
Live Events is an API validation tool for checking the correct functioning of our API commands. It is a real-time listener that monitors the API commands received by our servers. Every successful API call is listed in the live event feed, so you can verify the implementation of each command.
Click the link on the right-hand side of the box to get to the detailed Live Events screen.
Inspector Gadget is the second API validation tool we provide. Inspector Gadget runs in the browser, and monitors API commands as they are sent from the browser.
Step 6 – Bringing data collection live and validating the implementation
Once you have made sure that all API commands function properly, it is time to bring your implementation of the JS API live.
Live Validator is our dedicated tool to analyze and validate data collection once it is exposed to real traffic. It monitors the health of your integration by sampling real web traffic and running over 100 validation checks, making sure that the data received from the API commands makes sense when applied to your product catalog.
Live Validator also posts its alerts to the relevant boxes of the Data Sources page, offering error examples and fix suggestions for each alert it displays.
Step 7 – Checking site traffic statistics
After data collection has been running for a few days, you will also see Page views and Revenue figures in the Web Behavior box of the Data Sources page.
Click Site Traffic reports on the right-hand side of the box to get to the detailed Site Traffic screen. Here you can see additional site statistics, and this is also where you find a function to export the collected purchase data as a single .csv table (useful for a final confirmation that Emarsys is collecting all the transactions on your site).
Step 8 – Checking the output in your contact database
Now, the Contact Matching box of the Data Sources page is also active, showing you the number of visitors identified through the data-collection scripts, as well as any issues with updating the contacts in the Emarsys contact database based on this data.
And you're done
Once you have completed the steps above and the validations on your integration run successfully, data collection and the enrichment of contact data through the behavior fields continue automatically.
Please remember to check back on your integration’s health by visiting the Data Sources page every once in a while, and at least after each change of the website frontend (to ensure other updates to the pages have not corrupted the data collection scripts).