Once your information governance group has established (and documented) what fit-for-use data looks like, you can use those rules to enforce high-quality data at the most efficient place in your business process: point-of-entry. See also Data quality firewall.
Even after initial data-heavy projects, like a data migration, your data quality degrades over time. As users follow the Create, Read, Update, Delete (CRUD) orchestration policies established by your information governance team, new errors enter the system (like duplicate records). If your policy allows this bad data to enter your system, counting on a routine batch process to identify and fix errors later, then you are raising the cost of fixing those data errors. Without sufficient data quality checks at the point-of-entry, your business will experience these problems:
- Increased time to manually work through data problems, as more data quality problems enter your system every day.
- On-going lost time to detect data quality errors and route these errors to the appropriate data stewards.
- Increased time to research data quality errors and fix them later, perhaps including tracking down the originator of the data quality problem. All of this takes time, especially if your business is global and spans time zones.
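The cost difference between the two policies can be sketched in a few lines. This is a hypothetical illustration, not an SAP API: all class and field names below are invented for the example. A create is rejected the moment it would introduce a duplicate, so the originator corrects the data immediately instead of a data steward chasing it down in a later batch run.

```python
class DuplicateRecordError(Exception):
    """Raised when a create would introduce a duplicate record."""


class CustomerStore:
    """Illustrative in-memory store that enforces quality at point-of-entry."""

    def __init__(self):
        self._records = {}  # keyed on a normalized business key

    @staticmethod
    def _business_key(record):
        # Normalize the matching fields so trivial variations
        # ("ACME Corp." vs "acme corp.") don't slip in as duplicates.
        return (record["name"].strip().lower(), record["postal_code"].strip())

    def create(self, record):
        key = self._business_key(record)
        if key in self._records:
            # Reject at point-of-entry: the originator fixes the data now,
            # rather than a batch process flagging it for remediation later.
            raise DuplicateRecordError(f"duplicate of existing record {key}")
        self._records[key] = record
        return record
```

With this policy, the second attempt to create `"acme corp."` in the same postal code fails immediately, at the cheapest point in the process.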
Several SAP products let you enforce quality rules at point-of-entry:
- With SAP Information Steward, you can establish fit-for-use validation rules that can be applied at point-of-entry.
- With SAP Data Services, you can create complex validation rules (including duplicate checks and address validation) that can be executed at point-of-entry.
- With Data Quality Management for SAP, you can execute complex quality rules directly at the point-of-entry in SAP ERP and SAP CRM.
- With SAP NetWeaver Master Data Management, you can not only call Data Services validation rules, but also create master-data-centric quality rules that can run at point-of-entry in MDM as well as through the portal.
- With SAP Master Data Governance, you can execute complex quality rules directly at point-of-entry in SAP ERP.
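As a rough illustration of the idea behind these products — and only that: the rule syntax below is generic Python, not Information Steward or Data Services rule syntax — a set of fit-for-use rules can be declared once and evaluated against every record at create time:

```python
import re

# Each rule is (name, check function, message shown to the data originator).
# The rules themselves are invented examples, not product-defined rules.
RULES = [
    ("postal_code_format",
     lambda r: bool(re.fullmatch(r"\d{5}", r.get("postal_code", ""))),
     "postal code must be 5 digits"),
    ("name_required",
     lambda r: bool(r.get("name", "").strip()),
     "name must not be blank"),
]


def validate_at_entry(record):
    """Return the list of failed rules; an empty list means fit-for-use."""
    return [(name, message) for name, check, message in RULES
            if not check(record)]
```

A clean record returns no failures and proceeds; a record with a blank name or malformed postal code is returned to its originator with the specific rules it violated.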
Enforcing data quality at point-of-entry delivers these benefits:
- Decreased error remediation time, as originators of quality errors are asked to correct the data at the most efficient time in the process.
- Consistent application of data quality standards, whether during a data migration or at point-of-entry. This consistency helps ensure that data is fit-for-use for both strategic analytics and business process optimization.
- Lowered TCO for IT, as business rules can be established in one system and executed in multiple systems.
- Decreased training time, as IT developers can use their understanding of a single set of data quality rule syntax and execute those rules at multiple points-of-entry.
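A minimal sketch of that "define once, execute in multiple places" idea, with hypothetical function names: the same fit-for-use rule guards both an online create and a bulk migration load, so both entry points apply one consistent standard.

```python
def is_fit_for_use(record):
    # Single, shared definition of "fit-for-use":
    # name present and a 5-digit numeric postal code.
    postal = record.get("postal_code", "")
    return bool(record.get("name", "").strip()) and postal.isdigit() and len(postal) == 5


def create_online(record):
    # Entry point 1: interactive create, rejected immediately if unfit.
    if not is_fit_for_use(record):
        raise ValueError("rejected at point-of-entry")
    return record


def migrate_batch(records):
    # Entry point 2: migration load, reusing the same rule and
    # routing failures back to the source system for correction.
    accepted = [r for r in records if is_fit_for_use(r)]
    rejected = [r for r in records if not is_fit_for_use(r)]
    return accepted, rejected
```

Because both paths call the same rule, there is one definition to maintain, and developers who learn it once can apply it at every point-of-entry.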
- Information governance: commonly asked questions
- Data quality firewall: Ensuring data quality at its source
- Information governance blog
- Information governance solutions, overview
- Information governance products: Information Steward, Data Services, Master Data Management, Business Process Management, Enterprise Content Management, Information Lifecycle Management, Business Rules Management, and Business Intelligence
- eLearning: Create complex validation rules
- eLearning: Create rule bindings
- eLearning: Create a cleansing package
- eLearning: Perform address profiling
- eLearning: Defining rules tasks and viewing results