Data Quality
Most companies detect data issues only after their teams have already used bad data to make decisions or trigger campaigns. In-app reporting and daily email digests let you act quickly on every invalid event.
In the Normalized datalayer interface, you can define the schema of your data and the validation rules that feed your data quality workflow.
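As an illustration, a schema with validation rules could be applied to incoming events as sketched below. The schema structure, property names, and rules here are hypothetical, not the platform's actual configuration format:

```python
# Hypothetical sketch of schema-based event validation.
# The schema format and event names are illustrative only.
EVENT_SCHEMA = {
    "purchase": {
        "required": ["id", "revenue", "currency"],
        "types": {"id": str, "revenue": float, "currency": str},
    }
}

def validate_event(event):
    """Return a list of validation errors (an empty list means the event is valid)."""
    spec = EVENT_SCHEMA.get(event.get("event_name"))
    if spec is None:
        return ["unknown event name"]
    errors = []
    for prop in spec["required"]:
        if prop not in event:
            errors.append(f"missing required property: {prop}")
    for prop, expected in spec["types"].items():
        if prop in event and not isinstance(event[prop], expected):
            errors.append(f"wrong type for {prop}: expected {expected.__name__}")
    return errors

valid = {"event_name": "purchase", "id": "t-1", "revenue": 19.99, "currency": "EUR"}
invalid = {"event_name": "purchase", "id": "t-2", "revenue": "19.99"}
print(validate_event(valid))    # []
print(validate_event(invalid))  # missing currency, wrong type for revenue
```

Each invalid event yields an explicit list of violations, which is the kind of signal a data quality dashboard or digest can aggregate.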
Writing event specifications allows you to automate the QA process, feed the Source data quality dashboard, and define realtime alerts so you can react quickly when errors occur in your data.
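Conceptually, wiring a specification check into a realtime alert could look like the following sketch. The validator and the notification channel are illustrative stand-ins, not the platform's API:

```python
# Hypothetical sketch: trigger a realtime alert as soon as an event
# violates its specification. validate() and notify() are stand-ins.
def validate(event):
    return [] if "id" in event else ["missing required property: id"]

alerts = []
def notify(event, errors):
    # Stand-in for an email or webhook notification.
    alerts.append((event.get("event_name"), errors))

def ingest(events):
    for event in events:
        errors = validate(event)
        if errors:
            notify(event, errors)

ingest([{"event_name": "purchase", "id": "t-1"}, {"event_name": "purchase"}])
print(alerts)  # only the event missing "id" raised an alert
```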
To react quickly to data errors while your IT team fixes the problem at the source, you can rely on the live data transformation feature, also known as Data cleansing.
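The idea behind a live transformation is to patch a known defect on the fly until the source is fixed. A minimal sketch, with made-up example rules (the actual cleansing rules are configured in the platform, not written as code like this):

```python
# Hypothetical sketch of a live "data cleansing" transformation.
# Both rules below are illustrative examples of known source defects.
def cleanse(event):
    fixed = dict(event)
    # Example defect: the source sends revenue as a string ("19.99") instead of a number.
    if isinstance(fixed.get("revenue"), str):
        fixed["revenue"] = float(fixed["revenue"])
    # Example defect: currency codes arrive in lower case.
    if "currency" in fixed:
        fixed["currency"] = fixed["currency"].upper()
    return fixed

print(cleanse({"revenue": "19.99", "currency": "eur"}))
# {'revenue': 19.99, 'currency': 'EUR'}
```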
Good data quality on each source is essential, but checking the quality of data transmission matters just as much. For each destination, you can view the event delivery history, quickly identify errors, and define realtime alerts with a personalized threshold.
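A threshold-based delivery alert amounts to comparing the failure rate over recent deliveries against the limit you chose. A sketch, with assumed status values and an arbitrary example threshold:

```python
# Hypothetical sketch: alert when a destination's delivery error rate
# exceeds a personalized threshold. Status values are illustrative.
def error_rate(deliveries):
    failed = sum(1 for d in deliveries if d["status"] != "delivered")
    return failed / len(deliveries)

def should_alert(deliveries, threshold):
    return error_rate(deliveries) > threshold

history = [{"status": "delivered"}] * 95 + [{"status": "failed"}] * 5
print(should_alert(history, threshold=0.05))  # False: rate is exactly 5%
print(should_alert(history, threshold=0.04))  # True: 5% exceeds the 4% limit
```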
If in doubt, or to investigate a data problem further, you can access the logs of sent events. Both the Event source inspector and the Destination event inspector let you search for a specific event, analyse its data, and better understand the issue.
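Searching event logs for a specific event boils down to filtering on its properties, much like an inspector's search form does. A sketch with invented log fields:

```python
# Hypothetical sketch: filter event logs by arbitrary field values,
# the way an inspector search narrows down a specific event.
def search_logs(logs, **filters):
    return [log for log in logs if all(log.get(k) == v for k, v in filters.items())]

logs = [
    {"event_name": "purchase", "source": "web", "status": "rejected"},
    {"event_name": "page_view", "source": "web", "status": "accepted"},
    {"event_name": "purchase", "source": "app", "status": "accepted"},
]
print(search_logs(logs, event_name="purchase", status="rejected"))
# the single rejected purchase event from the web source
```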