Previously, I analyzed the types of data in a governance program for financial services companies. In this post, I discuss the components needed to design a data governance strategy.
A comprehensive program to manage data consists of a number of components:
Data Strategy & Architecture
- A firm-wide repository (data dictionary) should be constructed to catalog each data element along with its definition, point of origin (“golden source”), format, and owner.
- Data quality constraints that define the range of values and validation rules to be enforced should be noted.
- A data warehousing strategy (data mart, data lake) should be defined to centralize associated data elements, minimizing redundancy and improving analytics and reporting efficiency.
- Rules for data usage should be created, monitored, and enforced, so that applications and reporting only access data from approved sources (either the golden source, its proxy, or an approved and reconciled data warehouse).
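As a minimal sketch of how the dictionary and usage rules above might be modeled, the snippet below defines a data dictionary entry carrying a definition, golden source, format, and owner, plus a check that data is consumed only from approved sources. All names (`DataElement`, `order_management_system`, etc.) are hypothetical illustrations, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DataElement:
    """One entry in the firm-wide data dictionary."""
    name: str
    definition: str
    golden_source: str          # point of origin ("golden source")
    fmt: str                    # expected format, e.g. "YYYY-MM-DD"
    owner: str
    approved_sources: frozenset = field(default_factory=frozenset)

def is_approved_source(element: DataElement, source: str) -> bool:
    """Usage rule: only the golden source or an approved proxy may serve this element."""
    return source == element.golden_source or source in element.approved_sources

trade_date = DataElement(
    name="trade_date",
    definition="Date on which the trade was executed",
    golden_source="order_management_system",
    fmt="YYYY-MM-DD",
    owner="Trading Operations",
    approved_sources=frozenset({"enterprise_data_warehouse"}),
)

print(is_approved_source(trade_date, "order_management_system"))  # True
print(is_approved_source(trade_date, "spreadsheet_extract"))      # False
```

In practice these entries would live in a catalog tool or metadata store; the point is that the approved-source check is data-driven from the dictionary rather than hardcoded per application.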
Data Validation & Quality
- Data edit and validation rules should be enforced at each data element’s point of origin, thus ensuring quality standards are met.
- Validation rules should be dynamic (e.g., sourced from database or rules engine, implemented via API, SOAP interface, message bus), rather than hardcoded in applications, for consistency and timeliness.
- Periodic audit of data repositories should be conducted to identify any non-conforming data and its source.
- Management and regulatory report preparation should be periodically reviewed to confirm all required data elements are being created by the source systems, communicated to all necessary downstream systems, and captured in the data repositories.
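To illustrate the dynamic-rules point, the sketch below loads validation rules from a central definition (a JSON string standing in for a database or rules-engine response) instead of hardcoding them in the application. Field names and rule keys are invented for the example.

```python
import json
import re

# Rules sourced centrally, not hardcoded; updating this definition updates
# every consuming application without a code change.
RULES_JSON = """
{
  "account_id": {"pattern": "^[A-Z]{2}\\\\d{8}$"},
  "notional":   {"min": 0, "max": 1000000000}
}
"""

def validate(record: dict, rules: dict) -> list:
    """Return a list of (field, reason) for every rule violation."""
    errors = []
    for field_name, rule in rules.items():
        value = record.get(field_name)
        if value is None:
            errors.append((field_name, "missing"))
            continue
        if "pattern" in rule and not re.fullmatch(rule["pattern"], str(value)):
            errors.append((field_name, "bad format"))
        if "min" in rule and value < rule["min"]:
            errors.append((field_name, "below minimum"))
        if "max" in rule and value > rule["max"]:
            errors.append((field_name, "above maximum"))
    return errors

rules = json.loads(RULES_JSON)
print(validate({"account_id": "US12345678", "notional": 5_000_000}, rules))  # []
print(validate({"account_id": "bad-id", "notional": -1}, rules))
```

Enforcing the same rule set at the point of origin and in periodic audits means the two checks can never drift apart.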
Data Lineage & Journey
- Applications that require data created elsewhere should receive that data from the point of origin (“golden source”) or a designated proxy.
- As data is passed from one application to another, the receiving application should verify that the information has not been modified in transit and remains current. Where data must necessarily be modified, the resulting new data elements should be named and documented in the aforementioned dictionary.
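One common way to verify that data arrives unmodified is a deterministic fingerprint published alongside the record, recomputed on receipt. The sketch below uses a SHA-256 hash over a canonical JSON serialization; the record fields are hypothetical.

```python
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Deterministic hash of a record as published by the golden source."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# The golden source publishes the record with its fingerprint...
published = {"trade_id": "T-1001", "price": "101.25", "currency": "USD"}
sent_hash = fingerprint(published)

# ...and the downstream application re-computes it on receipt.
received = dict(published)
print(fingerprint(received) == sent_hash)   # True: unmodified in transit

received["price"] = "101.30"                # value altered somewhere en route
print(fingerprint(received) == sent_hash)   # False: lineage break detected
```

A mismatch tells the consumer that the element it received is no longer the golden-source value and should be treated as a distinct, documented element or rejected.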
Data Visualization & Forensics
- Visualization tools allow rapid detection of flaws in data sets. Records containing erroneous values point to breaks in referential integrity and to insufficient front-end controls, both of which should be addressed.
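Underneath the charts, this kind of forensics often reduces to profiling value frequencies against reference data and flagging anything unknown. A toy example, with invented records and a hypothetical currency reference table:

```python
from collections import Counter

# Reference data the records should conform to.
VALID_CURRENCIES = {"USD", "EUR", "GBP"}

records = [
    {"trade_id": "T-1", "currency": "USD"},
    {"trade_id": "T-2", "currency": "EUR"},
    {"trade_id": "T-3", "currency": "XX"},   # breaks referential integrity
]

# The summary a visualization tool would chart: value frequencies, with
# values absent from the reference table flagged for follow-up.
freq = Counter(r["currency"] for r in records)
suspect = {value: count for value, count in freq.items()
           if value not in VALID_CURRENCIES}
print(suspect)  # {'XX': 1}
```

Each flagged value is a lead: trace it back to its source system to find the missing front-end control.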
Data Security & Privacy
- Protecting a firm’s data assets is critical. Access to information should be granted on a need-to-know basis, with access controls applied to sets of data along several dimensions.
- Requests for access to data should be automated through workflows to speed servicing so as not to hinder productivity. Access rights granted to individuals, rather than to production systems, should be tethered to HR records, so that access can be automatically removed upon transfer to a new department or at the end of an employment or contractor engagement.
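The HR-tethered model above can be sketched by keying entitlements to the department on an employee's current HR record rather than to the individual, so a transfer or termination drops access without a separate revocation step. All names and entitlements here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class HRRecord:
    employee_id: str
    department: str
    active: bool

# Entitlements keyed to (department, data set), not to individuals.
ENTITLEMENTS = {
    ("Risk", "positions"): True,
    ("Risk", "counterparty_pii"): True,
    ("Marketing", "positions"): False,
}

def has_access(hr: HRRecord, data_set: str) -> bool:
    """Need-to-know check driven entirely by the current HR record."""
    if not hr.active:
        return False                          # separation ends all access
    return ENTITLEMENTS.get((hr.department, data_set), False)

alice = HRRecord("E123", "Risk", active=True)
print(has_access(alice, "positions"))         # True

alice.department = "Marketing"                # internal transfer recorded by HR
print(has_access(alice, "positions"))         # False, with no manual revocation
```

Production systems would hold their own service credentials, reviewed separately, so that human access always expires with the HR record.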
To learn more about the components of a data governance program and the steps to remediate weaknesses that can compromise the quality and security of a firm’s data, download our guide here or click the link below.