The database back-end part of this simple framework will be made of these parts:
schema dql – containing all the Data quality objects mentioned below
table dql.ValidationRule – containing a list of Data quality rules with their meta information
table dql.ValidationRun – containing information about the validation runs
table dql.ValidationResult – containing aggregated information with the results needed for your reporting
table dql.ValidationResultItem – containing the lowest-level info related to the business-object-specific errors, e.g. a list of the Contacts having more than one invoicing address, stored in SQL_VARIANT datatype columns
stored procedure dql.ValidationRunner – responsible for the Data quality rules evaluation, maintenance of data and aggregations for reporting
stored procedure dql.ValidationRuleAPI – responsible for the Data quality rules management
SQL Agent job DataQuality – responsible for scheduled execution of the stored procedure dql.ValidationRunner
Here is a relational schema to give you a better idea:
And how do we bring the Data Quality evaluation to life? First we need to fill the Data quality rules and their meta information into the dql.ValidationRule table.
INSERT INTO dql.ValidationRule
    ( ValidationRuleName, ValidationRuleDescription, ValidationRuleQuery, IsActive )
    -- the first two column names above are illustrative; ValidationRuleQuery and IsActive are the ones used by the framework
SELECT
    'Contacts with multiple invoicing Address',
    'List of all contacts with their owners having multiple invoicing addresses in the CRM',
    'SELECT Contact_ID, ContactName, ContactOwner_ID, ContactOwnerName, AddressType, COUNT(*) AS [InvoicingAddressCount]
     FROM dbo.Contact WITH (NOLOCK)
     WHERE AddressType = ''Invoicing''
     GROUP BY Contact_ID, ContactName, ContactOwner_ID, ContactOwnerName, AddressType
     HAVING COUNT(*) > 1',
    1
UNION ...
At this stage you can build your simple stored procedure API. Just prepare a stored procedure called dql.ValidationRuleAPI that can either insert, update or delete a Data quality rule based on the provided parameters.
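A minimal sketch of such an API procedure could look like the following. The parameter and column names (other than ValidationRuleQuery and IsActive, which are used throughout this article) are assumptions; adapt them to your actual dql.ValidationRule definition.

```sql
-- Sketch only: column and parameter names are illustrative assumptions.
CREATE PROCEDURE dql.ValidationRuleAPI
    @Action                    VARCHAR(10),           -- 'INSERT' | 'UPDATE' | 'DELETE'
    @ValidationRule_ID         INT            = NULL,
    @ValidationRuleName        NVARCHAR(200)  = NULL,
    @ValidationRuleDescription NVARCHAR(1000) = NULL,
    @ValidationRuleQuery       NVARCHAR(MAX)  = NULL,
    @IsActive                  BIT            = NULL
AS
BEGIN
    SET NOCOUNT ON;

    IF @Action = 'INSERT'
        INSERT INTO dql.ValidationRule
            ( ValidationRuleName, ValidationRuleDescription, ValidationRuleQuery, IsActive )
        VALUES
            ( @ValidationRuleName, @ValidationRuleDescription, @ValidationRuleQuery, @IsActive );
    ELSE IF @Action = 'UPDATE'
        UPDATE dql.ValidationRule
           SET ValidationRuleName        = COALESCE(@ValidationRuleName, ValidationRuleName),
               ValidationRuleDescription = COALESCE(@ValidationRuleDescription, ValidationRuleDescription),
               ValidationRuleQuery       = COALESCE(@ValidationRuleQuery, ValidationRuleQuery),
               IsActive                  = COALESCE(@IsActive, IsActive)
         WHERE ValidationRule_ID = @ValidationRule_ID;
    ELSE IF @Action = 'DELETE'
        DELETE FROM dql.ValidationRule
         WHERE ValidationRule_ID = @ValidationRule_ID;
END
```

The COALESCE pattern in the UPDATE branch lets callers change only the parameters they pass, leaving the rest untouched.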
Next you need to create the stored procedure dql.ValidationRunner, which will take care of the main part. It has to create a row in the dql.ValidationRun table to mark that the validation run has started. Then it needs to iterate through the dql.ValidationRule table, take the value from the ValidationRuleQuery column, build a dynamic SQL statement and fill the result table dql.ValidationResultItem with each Data quality query's results. (No need for cursors here; you can use a much more lightweight solution based on a temp table.)
DECLARE @statement NVARCHAR(MAX);
SET @statement = N'YOUR DATA QUALITY QUERY SELECT.. GOES HERE';

INSERT INTO dql.ValidationResultItem ( /* result columns */ )
EXEC sp_executesql @statement;
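The cursor-free iteration over the rules can be sketched with a temp table like this. The column names are assumptions based on the schema described above.

```sql
-- Sketch of a cursor-free loop over the active rules using a temp table.
-- Column names are assumptions; adjust to your actual schema.
SELECT ValidationRule_ID, ValidationRuleQuery
INTO   #rules
FROM   dql.ValidationRule
WHERE  IsActive = 1;

DECLARE @ruleId INT, @statement NVARCHAR(MAX);

WHILE EXISTS (SELECT 1 FROM #rules)
BEGIN
    -- pick the next rule to process
    SELECT TOP (1) @ruleId = ValidationRule_ID, @statement = ValidationRuleQuery
    FROM #rules
    ORDER BY ValidationRule_ID;

    -- execute the rule's query and capture its result set
    INSERT INTO dql.ValidationResultItem ( /* result columns */ )
    EXEC sp_executesql @statement;

    -- remove the processed rule from the work list
    DELETE FROM #rules WHERE ValidationRule_ID = @ruleId;
END

DROP TABLE #rules;
```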
Note: in case each query returns a different result column count, you need to generate the INSERT INTO… statement in a dynamic SQL manner as well. You can use some simple logic based on the column-count metadata for the specific rule. The result would again be EXEC sp_executesql @statement, where @statement is set to something like 'INSERT INTO dql.ValidationResultItem (…) SELECT DATA QUALITY QUERY SELECT GOES HERE'.
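A sketch of that fully dynamic variant follows; the column list here is a placeholder you would derive from the rule's column-count metadata, and all names are illustrative.

```sql
-- Sketch: wrap the rule's query in a dynamically built INSERT whose
-- column list is derived per rule. All names here are assumptions.
DECLARE @ruleQuery  NVARCHAR(MAX) = N'YOUR DATA QUALITY QUERY SELECT.. GOES HERE';
DECLARE @columnList NVARCHAR(MAX) = N'Column1_Variant, Column2_Variant';  -- built from the rule's column-count metadata
DECLARE @statement  NVARCHAR(MAX);

SET @statement =
      N'INSERT INTO dql.ValidationResultItem (' + @columnList + N') '
    + @ruleQuery;

EXEC sp_executesql @statement;
```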
After each rule finishes, this SP also needs to write the number of found results for the rule into the dql.ValidationResult table. Once all the rules are finished and all the results are inserted into the result table, this SP needs to mark the end of the validation run back in the dql.ValidationRun table and do some maintenance so the result table does not grow out of control. This table could easily become really huge, considering the job is scheduled once an hour, evaluating on average 50 rules with roughly 100 results per rule. Avoid using a DELETE statement here at all costs; use TRUNCATE. I have spilled the results for the last 2 runs into a temp db table, truncated the result item table and inserted the spilled results back. I am pretty sure you can come up with a more elegant solution.
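The spill-truncate-reinsert maintenance I described can be sketched like this; ValidationRun_ID is an assumed column name.

```sql
-- Sketch: keep only the result items belonging to the last two runs.
-- ValidationRun_ID is an assumed column name.
SELECT *
INTO   #keep
FROM   dql.ValidationResultItem
WHERE  ValidationRun_ID IN
       ( SELECT TOP (2) ValidationRun_ID
         FROM dql.ValidationRun
         ORDER BY ValidationRun_ID DESC );

TRUNCATE TABLE dql.ValidationResultItem;  -- minimally logged, unlike DELETE

INSERT INTO dql.ValidationResultItem
SELECT * FROM #keep;

DROP TABLE #keep;
```

TRUNCATE is preferred here because it deallocates pages instead of logging every deleted row, which matters once the table holds millions of result items.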
Here is a simple UML diagram explaining the dql.ValidationRunner stored procedure workflow.
One important note. DO make sure all the Data quality queries have NOLOCK hints. You don't want to block your production CRM application with your queries. I also strongly advise running the framework on a different server than your CRM production database.