Data Quality
Have confidence in your data
Most companies detect data issues only after their teams have used bad data to make decisions or trigger campaigns. Quickly take action on every invalid event with in-app reporting and daily email digests.
Align all the teams in your company around a single data dictionary
In the Normalized datalayer interface, you can define the schema of your data and the validation rules that feed your data quality workflow.
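The actual schema and rules are configured in the Normalized datalayer interface itself. As a minimal sketch of the idea, the field names and rule types below are hypothetical, not the product's configuration format:

```python
# Sketch of an event schema with validation rules.
# Field names and rule keys (required/type/pattern/min) are hypothetical,
# for illustration only; the real schema lives in the product interface.
import re

PURCHASE_SCHEMA = {
    "event_name": {"required": True, "type": str},
    "currency": {"required": True, "type": str, "pattern": r"^[A-Z]{3}$"},
    "revenue": {"required": True, "type": (int, float), "min": 0},
}

def validate(event: dict, schema: dict) -> list[str]:
    """Return a list of validation errors for an event (empty if valid)."""
    errors = []
    for field, rules in schema.items():
        if field not in event:
            if rules.get("required"):
                errors.append(f"missing required field: {field}")
            continue
        value = event[field]
        if "type" in rules and not isinstance(value, rules["type"]):
            errors.append(f"{field}: wrong type {type(value).__name__}")
            continue
        if "pattern" in rules and not re.match(rules["pattern"], value):
            errors.append(f"{field}: does not match {rules['pattern']}")
        if "min" in rules and value < rules["min"]:
            errors.append(f"{field}: below minimum {rules['min']}")
    return errors
```

A valid event returns an empty error list; an event with a lowercase currency code or a negative revenue returns one error per broken rule, which is the signal the rest of the data quality workflow consumes.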
Automate the QA process
Writing event specifications lets you automate the QA process: they feed the Source data quality dashboard, and they let you define realtime alerts so you can react quickly when errors occur in your data.
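Alerts are configured in the product, not in code; the mechanism behind a realtime alert, though, can be sketched as a sliding-window error rate compared against a threshold (class and parameter names here are hypothetical):

```python
from collections import deque

class ErrorRateAlert:
    """Sketch of a realtime alert: fire when the share of invalid
    events in a sliding window exceeds a threshold. Illustrative
    only; the product defines alerts in its own interface."""

    def __init__(self, threshold: float, window: int = 100):
        self.threshold = threshold
        self.results = deque(maxlen=window)  # True = valid, False = invalid

    def record(self, is_valid: bool) -> bool:
        """Record one event; return True if an alert should fire."""
        self.results.append(is_valid)
        error_rate = self.results.count(False) / len(self.results)
        return error_rate > self.threshold
```

With a 20% threshold over a 10-event window, a short burst of invalid events stays quiet until the error rate actually crosses the line, which keeps the alert from firing on a single stray event.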
Fix your data in realtime
To react quickly to data errors while your IT team corrects the problem at the source, you can rely on the live data transformation feature, also known as Data cleansing.
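In spirit, a cleansing rule is a small transformation applied to each event in flight. A sketch, with hypothetical field names and two made-up source issues (lowercase currency codes, revenue sent as a string):

```python
def cleanse(event: dict) -> dict:
    """Sketch of a live transformation ("Data cleansing") rule:
    patch known issues on the fly while the source is being fixed.
    Field names and the issues handled are hypothetical."""
    fixed = dict(event)  # never mutate the incoming event
    # Issue 1 (assumed): a source temporarily sends lowercase currency codes.
    if isinstance(fixed.get("currency"), str):
        fixed["currency"] = fixed["currency"].upper()
    # Issue 2 (assumed): a source temporarily sends revenue as a string.
    if isinstance(fixed.get("revenue"), str):
        try:
            fixed["revenue"] = float(fixed["revenue"])
        except ValueError:
            pass  # leave unparseable values for the QA workflow to flag
    return fixed
```

Because the rule runs on live traffic, destinations keep receiving clean data even while the root cause is still being fixed at the source.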
Control the data delivery
Good data quality on each source is essential, but checking the quality of the data transmission is just as important. For each destination, you can view the event delivery history, quickly identify errors, and define realtime alerts with a personalized threshold.
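The idea of a per-destination threshold can be sketched as follows; the delivery-history row shape and the destination names are assumptions for illustration:

```python
def delivery_alerts(history: list[dict], thresholds: dict[str, float]) -> list[str]:
    """Sketch: return the destinations whose delivery error rate
    exceeds their personalized threshold. Each `history` row is a
    hypothetical {"destination": str, "delivered": bool} record."""
    stats: dict[str, list[int]] = {}
    for row in history:
        ok, total = stats.setdefault(row["destination"], [0, 0])
        stats[row["destination"]] = [ok + row["delivered"], total + 1]
    alerts = []
    for dest, (ok, total) in stats.items():
        error_rate = 1 - ok / total
        # 5% is an assumed default when no personalized threshold is set.
        if error_rate > thresholds.get(dest, 0.05):
            alerts.append(dest)
    return alerts
```

A destination with a strict threshold alerts on a small failure rate, while a noisier one can be given more slack, which is the point of making the threshold personalized.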
Inspect specific events
In case of doubt, or to investigate a data problem further, you can access the logs of the events sent. Both the Event source inspector and the Destination event inspector let you search for a specific event, analyze its data, and better understand the issue.
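The inspectors are UI tools, but what they do under the hood amounts to filtering the event log on field values. A sketch, with hypothetical event fields:

```python
def search_events(logs: list[dict], **criteria) -> list[dict]:
    """Sketch of an event-inspector search: keep only the logged
    events whose fields match every given criterion. The real
    inspectors are UI tools; field names here are hypothetical."""
    return [
        event for event in logs
        if all(event.get(key) == value for key, value in criteria.items())
    ]
```

For example, filtering on an event name plus a user identifier narrows thousands of log lines down to the one event you need to analyze.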