Regulatory Compliance Articles / Blogs / Perficient
https://blogs.perficient.com/category/industries/financial-services/regulatory-compliance/

Seven Federal Regulatory Reports Banks and BHCs with $10 to $100 Billion in Assets Must Master
https://blogs.perficient.com/2026/02/05/seven-federal-regulatory-reports-banks-and-bhcs-with-10-to-100-billion-in-assets-must-master/
Thu, 05 Feb 2026 13:22:29 +0000

Introduction

Insured domestic financial institutions operating in the United States with total consolidated assets between $10 billion and $100 billion face a complex and multi-layered regulatory reporting landscape. These mid-sized banking organizations occupy a critical position in the financial system—large enough to pose potential systemic risks yet distinct from the very largest global systemically important banks. As a result, federal regulators have established a comprehensive framework of periodic reporting requirements designed to monitor capital adequacy, liquidity positions, credit concentrations, operational risks, and overall financial condition.

This article provides an in-depth examination of the major federal regulatory reports that banks in this asset category must file with the Federal Reserve System, the Federal Deposit Insurance Corporation (FDIC), and the Office of the Comptroller of the Currency (OCC). Understanding these reporting obligations is essential for Chief Compliance Officers, Chief Financial Officers, and regulatory reporting teams responsible for producing timely and accurate submissions to federal banking agencies.

The Regulatory Framework

Banks with assets exceeding $10 billion but remaining below $100 billion are subject to enhanced prudential standards under the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. Section 165 of the Act requires the Federal Reserve Board to establish risk-based capital requirements, leverage limits, liquidity requirements, and stress testing protocols for bank holding companies and savings and loan holding companies with total consolidated assets of $10 billion or more. These enhanced standards are implemented through a series of regular reporting requirements that provide regulators with detailed, timely information about each institution’s financial condition, risk exposures, and capital planning processes.

The regulatory reporting regime serves multiple supervisory purposes. First, it enables regulators to monitor individual institutions’ safety and soundness on an ongoing basis, identifying emerging risks before they threaten financial stability. Second, aggregate data from these reports inform broader systemic risk assessments and macroeconomic policy decisions. Third, the information collected supports the Federal Reserve’s supervisory stress testing framework, including the Dodd-Frank Act Stress Test (DFAST) and Comprehensive Capital Analysis and Review (CCAR) processes. Finally, certain reporting data are used to calculate regulatory capital ratios, liquidity coverage ratios, and other key prudential metrics that determine whether institutions meet minimum regulatory standards. The following reports provide regulators with each of those measures:

Major Reporting Requirements

1. Consolidated Reports of Condition and Income (Call Report)

The cornerstone of bank regulatory reporting is the quarterly Call Report, formally known as the Consolidated Reports of Condition and Income. Every national bank, state member bank, insured state nonmember bank, and savings association must file a consolidated Call Report as of the close of business on the last calendar day of each calendar quarter.

Purpose and Scope

The Call Report collects comprehensive financial data in the form of a balance sheet, income statement, and supporting schedules that detail a bank’s condition and performance. The information is used by the FDIC, OCC, and Federal Reserve for bank supervision and examination, deposit insurance assessment, monetary policy analysis, and public disclosure. Supervisory agencies use Call Report data to monitor individual bank risk profiles, identify troubled institutions, assess the impact of economic and policy changes on the banking system, and prepare reports to Congress and the public.

Banks between $10 and $100 billion in total consolidated assets file one of two primary Call Report forms depending on their office structure. Banks with any foreign offices—including International Banking Facilities (“IBFs”), foreign branches or subsidiaries, or majority-owned Edge or Agreement subsidiaries—must file the FFIEC 031 form quarterly. Banks with only domestic offices file the FFIEC 041 form.
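The form-selection rule above reduces to a single test on office structure. The sketch below is an illustrative helper (the function name and parameter are our own, not part of any official tooling):

```python
def call_report_form(has_foreign_offices: bool) -> str:
    """Return the Call Report form a $10B-$100B bank files.

    "Foreign offices" here means IBFs, foreign branches or
    subsidiaries, or majority-owned Edge or Agreement subsidiaries,
    per the office-structure rule described above.
    """
    return "FFIEC 031" if has_foreign_offices else "FFIEC 041"
```

For example, a bank operating an IBF would file the FFIEC 031, while a purely domestic bank of the same size files the FFIEC 041.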

Reporting Frequency and Timing

The Call Report is due quarterly as of March 31, June 30, September 30, and December 31. While the core report is required quarterly, specific schedules have varying frequencies. Schedule RC-T (Fiduciary and Related Services) is filed quarterly only by banks with more than $250 million in fiduciary assets or with fiduciary income exceeding 10% of total revenue; otherwise it is filed annually on December 31. Several memorandum items are reported semiannually on June 30 and December 31, including data on held-to-maturity securities transfers and purchased credit-impaired loans. Other items such as preferred deposits, reverse mortgage data, internet transaction capability, and captive insurance/reinsurance assets are reported only annually on December 31.

2. Complex Institution Liquidity Monitoring Report (FR 2052a)

The FR 2052a is one of the newer but also one of the most detailed and data-intensive regulatory reports in the banking system. It collects granular, transaction-level information on assets, liabilities, funding activities, and contingent liabilities to enable the Federal Reserve to monitor liquidity risks at large, complex banking organizations. Perficient has previously published a free guide to the FR 2052a report: Breaking Down the FR 2052a Complex Institution Liquidity Monitoring Report, a Guide / Perficient

Purpose and Scope

The Federal Reserve uses the FR 2052a to monitor the overall liquidity profile of supervised institutions, including detailed information on liquidity risks within different business lines such as securities financing, prime brokerage activities, and derivative exposures. These data points support the Board’s supervisory surveillance program for liquidity risk management and provide timely information on firm-specific liquidity risks during periods of stress. Analyses of systemic and idiosyncratic liquidity risk issues inform supervisory processes and the preparation of analytical reports detailing funding vulnerabilities.

The report is used to monitor compliance with the Liquidity Coverage Ratio (LCR) and Net Stable Funding Ratio (NSFR) requirements established under Basel III and implemented by U.S. banking agencies. The FR 2052a collects data across ten distinct tables covering 115 product types, 14 counterparty types, 72 asset classes, and 75 maturity buckets extending out to five-plus years.

Reporting Frequency and Timing

U.S. banking organizations that are subject to Category III standards with average weighted short-term wholesale funding of $75 billion or more must submit the FR 2052a on each business day. Daily filers must submit reports by 3:00 p.m. ET each business day. U.S. banking organizations subject to Category III standards with average weighted short-term wholesale funding of less than $75 billion, or subject to Category IV standards, must submit the report monthly.

When a banking organization’s required reporting frequency increases from monthly to daily, it may continue to report monthly until the first day of the second calendar quarter after the change in category becomes effective. Conversely, when frequency decreases from daily to monthly, the reduction takes effect immediately on the first day of the first quarter in which the change is effective.
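The frequency rules above can be expressed as a simple decision function. This is an illustrative sketch only (the function and parameter names are our own, and Categories I and II are outside its scope), not an official determination of filing obligations:

```python
def fr2052a_frequency(category: str, avg_stwf_billions: float) -> str:
    """Map a banking organization's regulatory category and average
    weighted short-term wholesale funding (STWF, in $ billions) to
    its FR 2052a reporting frequency, per the thresholds above."""
    if category == "III" and avg_stwf_billions >= 75:
        # Daily filers must submit by 3:00 p.m. ET each business day.
        return "daily"
    if category in ("III", "IV"):
        return "monthly"
    raise ValueError("Categories I and II are outside this sketch's scope")
```

A Category III firm with $80 billion in average weighted STWF would file daily, while the same firm with $50 billion, or any Category IV firm, would file monthly.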

3. Country Exposure Report (FFIEC 009)

The FFIEC 009 Country Exposure Report provides regulators with detailed information on the geographic distribution of U.S. banks’ claims on foreign residents, enabling assessment of country-specific and transfer risks in bank portfolios.

Purpose and Scope

The report is used to monitor country exposures of banks to determine the degree of country risk and transfer risk in their portfolios and assess the potential impact on U.S. banks of adverse developments in particular countries. The International Lending Supervision Act mandates quarterly reporting to obtain more frequent and timely data on changes in the composition and maturity of banks’ loan portfolios subject to transfer risk.

Data collected includes detailed information on claims by country, sector, and maturity, as well as risk transfers through guarantees and other credit enhancements. The Interagency Country Exposure Review Committee (ICERC) uses this information to conduct periodic reviews of country exposures and assign transfer risk ratings to specific countries.

Reporting Frequency and Timing

The FFIEC 009 must be filed quarterly as of the last business day of March, June, September, and December, with submissions due within 45 calendar days after the reporting date (50 calendar days after the December 31 reporting date).
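The deadline rule above is plain calendar arithmetic: 45 days after quarter-end, extended to 50 days for the December 31 report. A minimal sketch (the helper name is our own):

```python
from datetime import date, timedelta

def ffiec009_due_date(report_date: date) -> date:
    """Illustrative FFIEC 009 submission deadline: 45 calendar days
    after the quarterly report date, or 50 days after December 31."""
    days = 50 if (report_date.month, report_date.day) == (12, 31) else 45
    return report_date + timedelta(days=days)
```

For instance, the March 31, 2026 report would be due May 15, 2026, while the December 31, 2025 report would be due February 19, 2026.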

The report is required of every U.S.-chartered commercial bank that holds aggregate foreign claims of $30 million or more and maintains a foreign branch, international banking facility, majority-owned foreign subsidiary, or similar foreign office. Bank holding companies must also file under certain conditions, and Edge and agreement corporations with foreign claims exceeding $30 million must file unless consolidated under a reporting bank.

4. Weekly Report of Selected Assets and Liabilities (FR 2644)

The FR 2644 provides the Federal Reserve with high-frequency data on selected balance sheet items from a sample of commercial banks, serving as the primary source for weekly banking statistics.

Purpose and Scope

The FR 2644 collects sample data that are used to estimate universe levels for the entire commercial banking sector when combined with quarterly Call Report data. Data from the FR 2644, together with other sources, are used to construct weekly estimates of bank credit, balance sheet data for the U.S. banking industry, sources and uses of banks’ funds, and current banking developments.

These weekly statistics are published in the Federal Reserve’s H.8 statistical release “Assets and Liabilities of Commercial Banks in the United States” and are routinely monitored by Federal Reserve staff, included in materials prepared for the Board of Governors and the Federal Open Market Committee, and incorporated into the semiannual Monetary Policy Report to Congress.

Reporting Frequency and Timing

The FR 2644 is submitted weekly as of the close of business each Wednesday by an authorized stratified sample of approximately 850-875 domestically chartered commercial banks and U.S. branches and agencies of foreign banks. This sample accounts for approximately 88% of domestic assets of commercial banks and U.S. branches and agencies of foreign banks. Small banks (those with assets less than $5 billion) have an option to report monthly rather than weekly, helping to reduce burden for community banks while maintaining adequate sample coverage.

5. Single-Counterparty Credit Limits (FR 2590)

The FR 2590 report enables the Federal Reserve to monitor compliance with the Single-Counterparty Credit Limits (SCCL) rule, which prohibits covered companies from having aggregate net credit exposure to any unaffiliated counterparty exceeding 25% of Tier 1 Capital.

Purpose and Scope

The SCCL rule, adopted pursuant to Section 165(e) of the Dodd-Frank Act, is designed to limit the exposure of large banking organizations to single counterparties, thereby reducing the risk that the failure of a counterparty could cause significant losses to a covered bank and threaten financial stability. The FR 2590 reporting form collects comprehensive information on a respondent organization’s credit exposures to its counterparties, including detailed data on gross exposures, securities financing transactions, derivative exposures, risk-shifting arrangements, eligible collateral and mitigants, and the presence of relationships requiring aggregation under economic interdependence or control tests.

Reporting Frequency and Timing

Respondents must file the FR 2590 quarterly as of the close of business on March 31, June 30, September 30, and December 31. Submissions are due 40 calendar days after the first three quarters and 45 calendar days after the December 31 reporting date.

All U.S. bank holding companies, savings and loan holding companies, and foreign banking organizations that are subject to Category I, II, or III standards must file the report. For foreign banking organizations, the requirement applies to those subject to Category II or III standards or those with total global consolidated assets of $250 billion or more. The estimated average hours per response for the FR 2590 is approximately 254 hours per quarterly submission, reflecting the detailed counterparty-level information and complex risk calculations required. The report requires respondents to identify and report data for their top 50 counterparties. Respondents must retain one exact copy of each completed FR 2590 in electronic form for at least three years.
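The core SCCL test described above is a single ratio check: aggregate net credit exposure to any unaffiliated counterparty may not exceed 25% of Tier 1 capital. The sketch below illustrates only that headline limit (the real rule involves detailed exposure measurement, collateral recognition, and aggregation tests); the function name is our own:

```python
def sccl_compliant(net_exposure: float, tier1_capital: float) -> bool:
    """Check the headline SCCL limit: aggregate net credit exposure
    to an unaffiliated counterparty must not exceed 25% of Tier 1
    capital. Both amounts must be in the same currency units."""
    return net_exposure <= 0.25 * tier1_capital
```

A firm with $100 billion of Tier 1 capital could therefore hold at most $25 billion of aggregate net credit exposure to any single unaffiliated counterparty.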

6. Capital Assessments and Stress Testing Report – Annual (FR Y-14A)

The FR Y-14A is the annual component of the Capital Assessments and Stress Testing information collection that supports the Federal Reserve’s supervisory stress testing and capital planning framework.

Purpose and Scope

The FR Y-14A collects detailed quantitative projections of balance sheet assets and liabilities, income, losses, and capital across a range of macroeconomic scenarios, as well as qualitative information on methodologies used to develop internal projections of capital across scenarios. The report comprises Summary, Scenario, Regulatory Capital Instruments, Operational Risk, and Business Plan Changes schedules.

Respondents report projections across supervisory scenarios provided by the Federal Reserve as well as firm-defined scenarios where applicable. The data are used to assess capital adequacy of large firms using forward-looking projections of revenue and losses, to support supervisory stress test models, for continuous monitoring efforts, and to inform the Federal Reserve’s operational decision-making under the Dodd-Frank Act.

Reporting Frequency and Timing

The FR Y-14A is filed annually with an as-of date of December 31. Submissions are due 52 calendar days after the calendar quarter-end (typically late February). The annual submission must be accompanied by an attestation signed by the CFO or equivalent senior officer.

Bank Holding Companies, Intermediate Holding Companies, and Savings and Loan Holding Companies with $100 billion or more in total consolidated assets are required to file. The specific schedules required depend on whether the institution is subject to Category I-III standards or Category IV standards.

The FR Y-14A reporting burden is substantial, reflecting the comprehensive forward-looking projections and detailed scenario analysis required. The current estimated average burden is approximately 1,330 hours per annual response. This includes both the preparation of quantitative projections across multiple scenarios and the development of supporting qualitative documentation describing methodologies and assumptions.

7. Capital Assessments and Stress Testing Report – Quarterly (FR Y-14Q)

The FR Y-14Q collects detailed quarterly data on Bank Holding Companies’, Intermediate Holding Companies’, and Savings and Loan Holding Companies’ various asset classes, capital components, and categories of pre-provision net revenue.

Purpose and Scope

The FR Y-14Q schedules collect firm-specific granular data on positions and exposures used as inputs to supervisory stress test models, to monitor actual versus forecast information on a quarterly basis, and for ongoing supervision. The report comprises Retail, Securities, Regulatory Capital Instruments, Regulatory Capital, Operational Risk, Trading, PPNR, Wholesale, Retail Fair Value Option/Held for Sale, Counterparty, Balances, and Supplemental schedules.

All schedules must be submitted for each reporting period unless materiality thresholds apply. For example, only firms subject to Category I, II, or III standards with aggregate trading assets and liabilities of $50 billion or more, or trading assets and liabilities equal to 10% or more of total consolidated assets, must submit the Trading and Counterparty schedules.

Reporting Frequency and Timing

The FR Y-14Q is filed quarterly as of March 31, June 30, September 30, and December 31. Submissions are due 45 calendar days after the end of the first three quarters and 52 calendar days after the December 31 quarter-end. For the fourth quarter Trading and Counterparty schedules, submissions may be due as early as March 15 if the Board selects an earlier as-of date for the global market shock component.

New reporters receive implementation relief, with the filing deadline extended to 90 days after quarter-end for the first two quarterly submissions. This allows institutions crossing the significant $100 billion asset threshold extra time to build necessary reporting infrastructure and processes.

Table: Federal Reporting Requirements Summary

[Summary table of the seven federal bank reports]

Additional Considerations

Data Governance and Quality Control

The volume, granularity, and frequency of these reporting requirements demand robust data governance frameworks and quality control processes. Financial institutions must establish clear data lineage documentation, implement automated validation checks, maintain comprehensive data dictionaries, and conduct regular reconciliation across reports.

Many of these reports require data at transaction or contract levels (FR 2052a, FR 2590, FR Y-14Q), necessitating direct integration with core banking systems, loan origination platforms, treasury management systems, and risk management applications. Manual data gathering and spreadsheet-based processes for Insured Depository Institutions with greater than $10 billion of assets are insufficient for sustained compliance with these requirements, particularly for daily or weekly filings. At Perficient, we have seen and helped clients implement AI-enhanced reporting capabilities.

Systems and Technology Infrastructure

Implementing and maintaining compliance with these reporting requirements typically requires significant technology investments. Institutions may need to deploy specialized regulatory reporting platforms, develop custom data extraction and transformation tools, implement automated validation and reconciliation systems, and establish secure data transmission capabilities.

The FR 2052a, in particular, has driven substantial technology modernization at many institutions due to its granular cash flow reporting requirements and daily submission frequency for the largest banks. Similarly, the FR Y-14A and Q reports require sophisticated data aggregation capabilities to assemble loan-level detail from disparate systems across the enterprise.

Staffing and Expertise Requirements

Compliance with these reporting requirements necessitates dedicated teams with specialized expertise spanning regulatory reporting, financial accounting, risk management, data management, and systems analysis. Larger institutions typically maintain separate teams for different reporting families, with subject matter experts for capital, liquidity, credit risk, market risk, and operational risk reporting.

The attestation requirements for several reports, including the FR Y-14Q and FR Y-14A, place direct accountability on senior financial officers, underscoring the importance of robust internal controls, documentation, and review processes.

Coordination with Business Lines

Successful regulatory reporting requires close coordination between centralized reporting functions and business lines across the organization. Trading desks must provide transaction-level derivatives data, retail lending units must supply detailed loan-level information, treasury teams must furnish liquidity and funding details, and international operations must contribute country exposure data.

Establishing clear roles, responsibilities, and service level agreements between reporting teams and data providers is essential to ensure timely, accurate submissions.

Conclusion

Banks with total consolidated assets between $10 billion and $100 billion face a demanding federal regulatory reporting regime that reflects their significance to the financial system and the potential risks they pose. The seven major reporting requirements discussed in this article—Call Reports, FR 2052a, FFIEC 009, FR 2644, FR 2590, FR Y-14A, and FR Y-14Q—collectively require thousands of hours of effort annually and generate vast amounts of detailed financial, risk, and operational data.

Effective management of these reporting obligations requires substantial investments in data infrastructure, technology systems, specialized expertise, and governance processes. Institutions must balance the compliance imperative with considerations of cost, efficiency, and the need to leverage reporting data for internal management purposes. Those institutions that view regulatory reporting not merely as a compliance burden but as an opportunity to enhance data quality, strengthen risk management, and improve decision-making are best positioned to meet these obligations efficiently while deriving maximum value from their reporting investments.

As the regulatory landscape continues to evolve in response to emerging risks and changing market conditions, banking organizations in this asset range must maintain flexibility, invest in scalable reporting infrastructure, and cultivate deep regulatory expertise to navigate future reporting requirements successfully. The complexity and significance of federal bank reporting requirements underscore the critical role of compliance and regulatory reporting functions in maintaining the safety, soundness, and stability of individual institutions and the broader financial system.

Our financial services experts continuously monitor the regulatory landscape and deliver pragmatic, scalable solutions that meet the mandate and more. Reach out to Perficient’s BFSI team here – Contact Us / Perficient – and discover why we’ve been trusted by 18 of the top 20 banks and 16 of the 20 largest wealth and asset management firms, and are regularly recognized by leading analyst firms.

2/3 of the World is Covered by Water – the Other Third is Covered by the Gramm-Leach-Bliley Act
https://blogs.perficient.com/2026/01/29/2-3-of-the-world-is-covered-by-water-the-other-third-is-covered-by-the-gramm-leach-bliley-act/
Thu, 29 Jan 2026 13:40:51 +0000

With the possible exception of medical providers, financial institutions handle some of the most sensitive information consumers possess—Social Security numbers, income and employment details, credit histories, account balances, and more. Protecting this data is not only essential to maintaining consumer trust but is also a legal requirement under the Gramm‑Leach‑Bliley Act (“GLBA”) and the Federal Trade Commission (“FTC”) Safeguards Rule. Together, these regulations establish a comprehensive framework for how financial institutions must secure, manage, and protect consumer information throughout its lifecycle.

Although the GLBA has been in effect for more than two decades, and the FTC Safeguards Rule took effect in 2003 and was substantially updated in 2021 with key provisions taking effect in 2023, we thought a review would be helpful for executives of financial institutions as well as fintechs. Below, we break down the core requirements of GLBA and the Safeguards Rule, along with practical considerations for financial institutions striving to meet and exceed compliance expectations. While the regulatory language can feel intricate, the intent is clear: organizations must take proactive, documented, and continually improving measures to safeguard customer data from unauthorized access, misuse, and breaches.

The GLBA: Overview and Purpose

Enacted in 1999, GLBA repealed key provisions of the Glass-Steagall Act, modernizing the financial services industry by allowing greater integration across banking, securities, and insurance markets. But along with this expanded capability came heightened responsibility. Title V of GLBA—the Privacy Rule and the Safeguards Rule—requires financial institutions to:

  1. Explain their information‑sharing practices to consumers
  2. Protect the security and confidentiality of nonpublic personal information (NPI)
  3. Limit data sharing with non‑affiliated third parties unless certain conditions are met

The law defines “financial institution” broadly, extending beyond banks to include mortgage brokers, lenders, payday loan companies, tax preparation firms, investment advisers, fintechs, and various other service providers engaged in financial activities.

The FTC Safeguards Rule: Framework for a Modern Security Program

The FTC Safeguards Rule—originally issued under GLBA and updated significantly in 2021 and 2023—provides the detailed blueprint for how financial institutions must secure customer information. The rule outlines administrative, technical, and physical safeguards that organizations must implement as part of a comprehensive information security program.

Here are the foundational elements required under the rule:

  1. Designation of a Qualified Individual

Every financial institution must appoint a Qualified Individual (“QI”) responsible for implementing and overseeing the company’s information security program. This person may be an internal employee or an external service provider, but accountability ultimately remains with the institution’s leadership.

  2. Risk Assessment

A written, formal risk assessment must identify reasonably foreseeable internal and external threats to customer information. This includes evaluating:

  • Data storage and transmission methods
  • Employee access
  • Third‑party risks
  • System vulnerabilities
  • Potential impact of data compromise

The risk assessment must guide the selection and implementation of safeguards and guardrails, ensuring they are appropriate to the institution’s size, complexity, and the sensitivity of the data it handles.

  3. Implementation of Safeguards Aligned to Identified Risks

The Safeguards Rule specifies several required protections:

  • Access Controls: Ensure only authorized personnel can access sensitive data; the rule requires role‑based permissions and least‑privilege principles.
  • Encryption: Encrypt customer data both in transit and at rest.
  • Multi‑Factor Authentication (“MFA”): Require MFA for any access to systems containing customer information. This requirement is why your financial website or app prompts you to confirm on your phone each time you sign in.
  • Secure Development Practices: Implement secure coding practices and change‑management procedures.
  • Data Inventory and Mapping: Maintain a clear understanding of where data resides, how it flows, and who has access. Data lineage is generally considered a next natural step once data inventory and mapping is completed.
  • Monitoring and Logging: Continuously monitor systems for unauthorized activity and maintain detailed event logs.
  • Vulnerability Management: Conduct routine scans, penetration testing, and timely patch management.

These safeguards ensure that institutions take a proactive rather than reactive approach to data protection.
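The access-controls item above can be sketched as a deny-by-default role map. This is a minimal illustration of least privilege with hypothetical role and action names, not a prescribed implementation:

```python
# Hypothetical role-to-permission mapping: each role carries only the
# data actions it needs, and any action not explicitly granted is denied.
ROLE_PERMISSIONS = {
    "teller": {"read_account_balance"},
    "loan_officer": {"read_account_balance", "read_credit_history"},
    "auditor": {"read_audit_log"},
}

def is_authorized(role: str, action: str) -> bool:
    """Deny by default; allow only actions explicitly granted to the role."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Under this model a teller can read account balances but not credit histories, and an unknown role is denied everything, which is the least-privilege posture the rule calls for.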

  4. Employee Training

Human error is among the most common causes of data breaches. The rule mandates that institutions provide regular security awareness training designed to equip employees with the knowledge to identify threats such as phishing, social engineering, or unauthorized data access attempts.

  5. Oversight of Service Providers

Many financial institutions rely on third‑party vendors for critical operations, from cloud hosting to data analytics. Under the Safeguards Rule, institutions must:

  • Conduct due diligence before engaging vendors
  • Ensure contracts contain specific data‑security obligations
  • Monitor vendor compliance

This requirement reflects the increasingly interconnected ecosystem of financial technology and the shared responsibility model.

  6. Incident Response Planning

The rule requires a written incident response plan that outlines:

  • Roles and responsibilities
  • Internal and external communication procedures
  • Criteria for defining events
  • Steps for containment, remediation, and recovery
  • Documentation and post‑incident analysis

A well‑designed plan ensures organizations can respond to security events quickly and effectively.

  7. Annual Reporting to the Board of Directors

At least once a year, the QI (remember #1 above) must deliver a written report to the board or governing body detailing:

  • Program status
  • Risk assessment findings
  • Security events and responses
  • Recommendations for improvement

This ensures executive oversight and board accountability.

Conclusion

As financial data becomes increasingly valuable and cyber threats more advanced, GLBA and the FTC Safeguards Rule provide a structured, strategic framework for protecting consumer information. Institutions that embrace these requirements not as a checkbox exercise but as a guide to building a mature, adaptive security program position themselves for stability, trust, and competitive advantage.

Failure to comply can lead to substantial financial penalties; reputational damage; a significant and perhaps permanent loss of consumer trust; and increased scrutiny from federal regulators.

If your firm would like assistance designing or adopting robust cybersecurity strategies aligned with GLBA and the Safeguards Rule, whether as part of a cloud migration or otherwise, from a consulting partner with deep industry expertise, reach out to us here.

Part 504 Compliance Deadline Fast Approaching for BFSI Firms in New York
https://blogs.perficient.com/2026/01/28/part-504-compliance-deadline-fast-approaching-for-bfsi-firms-in-new-york/
Wed, 28 Jan 2026 13:35:32 +0000

This blog was co-authored by Perficient Project Manager: Alicia Lawrence

As a global organization headquartered in St. Louis, Perficient is committed to supporting current and future clients by monitoring federal and state regulations and alerting them to changes that may impact them. In 2024, Perficient published a blog highlighting insights gathered through continuous monitoring of the New York State regulations impacting financial services firms:

NYDFS Part 500 Cybersecurity Amendments – What You Need to Know  

This blog highlights key observations and implications of the latest changes to the NYDFS 500 regulations and builds on the previously published blog to inform financial services executives that the NYDFS Part 504 Transaction Monitoring and Filtering Certification is a significant annual regulatory requirement for any institution regulated under New York’s Banking, Insurance, or Financial Services Law. The regulation imposes an annual certification on senior officers and board members that their organization’s transaction monitoring and sanctions filtering programs are designed, maintained, and tested to effectively detect money laundering, terrorist financing, and sanctioned-party transactions.  

What is Part 504 Certification? 

Under 3 NYCRR Part 504, regulated institutions are legally obligated to: 

  • Operate an Anti-Money Laundering (“AML”)-compliant Transaction Monitoring Program, tailored to their risk profile. 
  • Run a Watchlist/Sanctions Filtering (i.e., Office of Foreign Assets Control “OFAC” compliance) Program. 
  • Annually certify, by April 15th, that these programs meet the Part 504 control standards, even if an institution finds and is actively remediating deficiencies.  

The certification itself covers the prior calendar year and is a standalone submission via DFS’ portal. The certification doesn’t require and actually prohibits the submission of supporting documentation. However, institutions must maintain records supporting their certification for potential DFS review. Such documentation includes internal/external audit results, scenario logic, testing strategy and results, and if necessary, documentation of remediation efforts and remediation plans. 

A link to the page is available here: 

Transaction Monitoring Certification (3 NYCRR 504) | Department of Financial Services 

 Who Must Certify? 

Part 504 applies to any institution regulated by NYDFS under its financial services law, including: 

  • State-chartered banks 
  • Non-bank entities (e.g., money transmitters, Money Services Businesses “MSBs”) 
  • Insurance firms offering financial products 
  • Other licensed financial service providers 

Why Part 504 Matters 

Part 504 enhances financial integrity by ensuring senior-level accountability, mirroring Sarbanes-Oxley-style executive attestations. Even if an executive or Board member leaves a regulated financial institution, they could still be liable for false certifications made on behalf of the institution, should fraud be found after the fact. The NYDFS enacted this regulation after uncovering weaknesses in AML controls across state-supervised banks and nonbanks, underscoring the need for robust governance.  

The regulation aims to: 

  • Elevate governance and oversight of AML/OFAC programs. 
  • Standardize program controls, including testing, validation, vendor oversight, and qualified staffing.  
  • Improve defenses against financial crime and regulatory infractions. 

Key Transaction Monitoring Requirements 

Getting further into the weeds, as required by Section 504.3, an effective program must include the following core components:  

  • Risk-Based Design: Align thresholds and detection logic with your institution’s assessed AML and OFAC risks. 
  • Periodic Testing & Updates:  
    • Incorporate regular reviews (including model validation and data flows). 
    • Update parameters based on evolving regulatory guidance or business changes.
  • Comprehensive Detection Scenarios: Create alert rules targeting suspicious behaviors aligned with your AML risk appetite.
  • Full Testing Regimen:  
    • End-to-end testing (pre/post-implementation). 
    • Governance oversight, data quality checks, and scenario validation. 
  • Documentation:  
    • Maintain records of detection scenarios, assumptions, thresholds, testing outcomes, and remediation. 
  • Alert Handling Protocols:
    • Define investigative workflows, decision points (clear vs escalate), roles, and documentation processes. 
  • Ongoing Monitoring:  
    • Continuously review scenario relevance, threshold efficacy, and real-world performance. 

These requirements also extend to sanctions filtering – ensuring timely name screening, alerts, and case management controls are in place. 
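To make the risk-based design requirement concrete, here is a minimal Python sketch. All names, thresholds, and fields are hypothetical illustrations, not drawn from the regulation: alert thresholds vary with the customer’s assessed AML risk tier, and each alert records the scenario, threshold, and amount in line with the documentation requirement.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical risk-tiered thresholds; a real program would calibrate
# these against the institution's AML risk assessment.
THRESHOLDS = {"low": 50_000, "medium": 25_000, "high": 10_000}

@dataclass
class Transaction:
    customer_id: str
    risk_tier: str  # "low", "medium", or "high"
    amount: float

def evaluate(txn: Transaction) -> Optional[dict]:
    """Return an alert record if the transaction breaches its tiered threshold."""
    limit = THRESHOLDS[txn.risk_tier]
    if txn.amount >= limit:
        return {
            "customer_id": txn.customer_id,
            "scenario": "large-value-by-risk-tier",  # documented detection scenario
            "threshold": limit,
            "amount": txn.amount,
            "status": "open",  # investigator later records clear vs. escalate
        }
    return None

# A high-risk customer trips the alert at a lower amount than a low-risk one.
print(evaluate(Transaction("C-1001", "high", 12_500)))
print(evaluate(Transaction("C-1002", "low", 12_500)))  # None
```

Real transaction monitoring platforms implement far richer scenario logic, but the same principle applies: thresholds and detection rules should be explicit, documented, and tied back to the institution’s risk assessment.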

Risks of Non-Compliance 

Non-compliance with Part 504 can lead to: 

  • DFS enforcement actions, including fines or directives, under Banking Law §37 or Financial Services Law §302.  
  • Reputational damage, aka “Headline Risk” if AML or sanctions failures become public. 
  • Operational vulnerabilities, including weakened AML controls and potential for financial crime. 

Best Practices for Compliance 

Perficient consultants and compliance SMEs have helped firms build and maintain a rock-solid Part 504 posture by designing and implementing the following best practices: 

  • Governance Oversight: Including AML leadership and internal/external audit in program reviews. 
  • Periodic Program Testing: Conducting fresh scenario validations, testing the design and operation of existing controls, performing data assembly testing, and model verification no less than annually. 
  • Issue Remediation: Prioritizing issues for remediation using a risk-based approach and performing issue validation testing.
  • Risk Assessment: Execute risk assessments of key business processes and determine inherent and residual risks.
  • Staff Training: Ensuring business line staff and compliance leads understand Part504 requirements and manage alerts effectively. 
  • Comprehensive Documentation: Keeping complete audit trails including logs of monitoring system updates, testing reports, governance minutes, and remediation plans. 
  • Vendor Oversight: If using third-party monitoring systems, conducting due diligence and regularly reviewing vendor performance. 
  • Senior Executive and Board Engagement: Encouraging frequent executive-level reviews throughout the year, not just during certification preparation in the days before the April 15 deadline. 

Conclusion 

Navigating Part 504 certification isn’t just an annual checkbox. It’s a significant piece of an institution’s AML and OFAC defense. By embedding risk-based monitoring, rigorous testing, and senior-level accountability, regulated institutions in New York not only fulfill their regulatory obligations but also strengthen their ability to deter and detect financial crimes. 

Through consistent governance, meticulous documentation, and leadership engagement, Part 504 becomes more than compliance—it becomes a strategic shield for safeguarding financial integrity. For institutions governed by DFS, this certification confirms that all necessary steps have been taken to comply with Part 504, strengthening the institution’s compliance posture, reputation, and resiliency, all by April 15 each year. 

If you would like to have Perficient SMEs work with you on your Part 504 preparation work – or just have a conversation – reach out to us here. 

]]>
https://blogs.perficient.com/2026/01/28/part-504-compliance-deadline-fast-approaching-for-bfsi-firms-in-new-york/feed/ 0 389980
Bulgaria’s 2026 Euro Adoption: What the End of the Lev Means for Markets https://blogs.perficient.com/2025/12/22/bulgarias-2026-euro-adoption-what-the-end-of-the-lev-means-for-markets/ https://blogs.perficient.com/2025/12/22/bulgarias-2026-euro-adoption-what-the-end-of-the-lev-means-for-markets/#comments Mon, 22 Dec 2025 17:03:29 +0000 https://blogs.perficient.com/?p=389245

Moments of currency change are where fortunes are made and lost. In January 2026, Bulgaria will enter one of those moments. The country will adopt the euro and officially retire the Bulgarian lev, marking a major euro adoption milestone and reshaping how investors, banks, and global firms manage currency risk in the region. The shift represents one of the most significant macroeconomic transitions in Bulgaria’s modern history and is already drawing attention across FX markets.

To understand how dramatically foreign exchange movements can shift value, consider one of the most famous examples in modern financial history. In September 1992, investor George Soros bet against the British pound, anticipating that the UK’s exchange rate policy would collapse. The resulting exchange rate crisis, now known as Black Wednesday, became a defining moment in forex trading and demonstrated how quickly policy decisions can trigger massive market dislocations.

By selling roughly $10 billion worth of pounds, his Quantum Fund earned ~$1 billion in profit when the currency was forced to devalue. The trade earned Soros the nickname “the man who broke the Bank of England” and remains a lasting example of how quickly confidence and capital flows can move entire currency systems.


GBP/USD exchange rate from May 1992 to April 1993, highlighting the dramatic plunge during Black Wednesday, when George Soros famously shorted the pound, forcing the UK out of the ERM and triggering one of the most significant currency crises in modern history.

To be clear, Bulgaria is not in crisis. The Soros example simply underscores how consequential currency decisions can be. Even when they unfold calmly and by design, currency transitions reshape the texture of daily life. The significance of Bulgaria’s transition becomes clearer when you consider what the lev has long represented: safety. Families relied on it through political uncertainty and economic swings, saved it for holidays, passed it down during milestones, and trusted it in moments when little else felt predictable. Over time, the lev became a source of stability as Bulgaria navigated decades of change and gradually aligned itself with the European Union.

Its retirement feels both symbolic and historic. But for global markets, currency traders, banks, and companies engaged in cross border business, the transition is not just symbolic. It introduces real operational changes that require early attention. This article explains what is happening, why it matters, and how organizations can prepare.

Some quick facts help frame the scale of this shift.


Map of Bulgaria

Bulgaria has a population of roughly 6.5 million.

The country’s GDP is about 90 billion U.S. dollars (World Bank, 2024)

Its largest trade partners are EU member states, Turkey, and China.

Why Bulgaria Is Adopting the Euro

Although the move from the Lev to the Euro is monumental, many Bulgarians also see it as a natural progression. When Bulgaria joined the European Union in 2007, Euro adoption was always part of the long-term plan. Adopting the Euro gives Bulgaria a stronger foundation for investment, more predictable trade relationships, and smoother participation in Europe’s financial systems. It is the natural next step in a journey the country has been moving toward slowly, intentionally, and with growing confidence. That measured approach fostered public and institutional trust, leading European authorities to approve Bulgaria’s entry into the Eurozone on January 1, 2026 (European Commission, 2023; European Central Bank, 2023).

How Euro Adoption Affects Currency Markets

Bulgaria’s economy includes manufacturing, agriculture, energy, and service sectors. Its exports include refined petroleum, machinery, copper products, and apparel. It imports machinery, fuels, vehicles, and pharmaceuticals (OECD, 2024). The Euro supports smoother trade relationships within these sectors and reduces barriers for European partners.

Once Bulgaria switches to the Euro, the Lev will quietly disappear from global currency screens. Traders will no longer see familiar pairs like USD to BGN or GBP to BGN. Anything involving Bulgaria will now flow through euro-based pairs instead. In practical terms, the Lev simply stops being part of the conversation.

For people working on trading desks or in treasury teams, this creates a shift in how risk is measured day to day. Hedging strategies built around the Lev will transition to euro-based approaches. Models that once accounted for Lev-specific volatility will have to be rewritten. Automated trading programs that reference BGN pricing will need to be updated or retired. Even the market data providers that feed information into these systems will phase out Lev pricing entirely.

And while Bulgaria may be a smaller player in the global economy, the retirement of a national currency is never insignificant. It ripples through the internal workings of trading floors, risk management teams, and the systems that support them. It is a reminder that even quiet changes in one part of the world can require thoughtful adjustments across the financial landscape.

Given industry-standard year-end code freezes, Perficient has helped clients wind down their Lev trading weeks before year-end.

The Infrastructure Work Behind Adopting the Euro

Adopting the Euro is not just a change people feel sentimental about. Behind the scenes, it touches almost every system that moves money. Every financial institution uses internal currency tables to keep track of existing currencies, conversion rules, and payment routing. When a currency is retired, every system that touches money must be updated to reflect the change.

This includes:

  • Core banking and treasury platforms
  • Trading systems
  • Accounting and ERP software
  • Payment networks, including SWIFT and ISO 20022
  • Internal data warehouses and regulatory reporting systems
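To make the currency-table work concrete, here is a minimal Python sketch. The table layout and field names are hypothetical, not any vendor’s schema. It flags BGN as retired and restates legacy lev amounts in euro at the lev’s fixed peg of 1.95583 BGN per euro:

```python
from datetime import date
from decimal import Decimal

# Fixed conversion rate: the lev's long-standing peg to the euro.
BGN_PER_EUR = Decimal("1.95583")

# Hypothetical internal currency table keyed by ISO 4217 code.
currency_table = {
    "EUR": {"active": True, "retired_on": None},
    "BGN": {"active": True, "retired_on": None},
}

def retire_currency(code: str, retired_on: date) -> None:
    """Mark a currency inactive so new transactions can no longer reference it."""
    currency_table[code]["active"] = False
    currency_table[code]["retired_on"] = retired_on

def bgn_to_eur(amount_bgn: Decimal) -> Decimal:
    """Restate a legacy lev amount in euro at the fixed conversion rate."""
    return (amount_bgn / BGN_PER_EUR).quantize(Decimal("0.01"))

retire_currency("BGN", date(2026, 1, 1))
print(bgn_to_eur(Decimal("195.583")))  # 100.00
```

Using `Decimal` rather than floats matters here: euro conversion rules require exact arithmetic at the official rate, and binary floating point would introduce rounding noise into ledgers and reconciliations.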

Why Global Firms Should Pay Attention

If the Lev remains active anywhere after the transition, payments can fail, transactions can be misrouted, and reconciliation issues can occur. The Bank for International Settlements notes that currency changes require “significant operational coordination,” because risk moves across systems faster than many institutions expect. 

Beyond the technical updates, the disappearance of the Lev also carries strategic implications for multinational firms. Any organization that operates across borders, whether through supply chains, treasury centers, or shared service hubs, relies on consistent currency identifiers to keep financial data aligned. If even one system, vendor, or regional partner continues using the old code, firms can face cascading issues such as misaligned ledgers, failed hedging positions, delayed settlements, and compliance flags triggered by mismatched reporting. In a world where financial operations are deeply interconnected, a seemingly local currency change can ripple outward and affect global liquidity management and operational continuity.

Many firms have already started their transition work well in advance of the official date in order to minimize risk. In practice, this means reviewing currency tables, updating payment logic, testing cross-border workflows, and making sure SWIFT and ISO 20022 messages recognize the new structure. 

Trade Finance Will Feel the Change

For people working in finance, this shift will change the work they do every day. Tools like Letters of Credit and Banker’s Acceptances are the mechanisms that keep international trade moving, and they depend on accurate currency terms. If any of these agreements are written to settle in Lev, they will need to be updated before January 2026.

That means revising contracts, invoices, shipping documents, and long-term payment schedules. Preparing early gives exporters, importers, and the teams supporting them the chance to keep business running smoothly through the transition.

What Euro Adoption Means for Businesses

Switching to the Euro unlocks several practical benefits that go beyond finance departments.

  • Lower currency conversion costs
  • More consistent pricing for long-term agreements
  • Faster cross-border payments within the European Union
  • Improved financial reporting and reduced foreign exchange risk
  • Increased investor confidence in a more stable currency environment

Because so much of Bulgaria’s trade already occurs with Eurozone countries, using the Euro simplifies business operations and strengthens economic integration.

How Organizations Can Prepare

The most important steps for institutions include:

  1. Auditing systems and documents for references to BGN
  2. Updating currency tables and payment rules
  3. Revising Letters of Credit and other agreements that list the Lev
  4. Communicating the transition timeline to partners and clients
  5. Testing updated systems well before January 1, 2026

Early preparation ensures a smooth transition when Bulgaria officially adopts the Euro. Operationally, be prepared to accept Lev payments through December 31, 2025, and, given settlement timeframes, to reconcile and settle Lev transactions into 2026.
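Step 1 of the checklist, auditing systems and documents for references to BGN, can be sketched in a few lines of Python. The file suffixes and directory layout here are illustrative assumptions:

```python
import re
from pathlib import Path

# Match the ISO currency code "BGN" as a whole word, not as a substring.
BGN_PATTERN = re.compile(r"\bBGN\b")

def find_bgn_references(root: Path, suffixes=(".cfg", ".sql", ".csv", ".txt")) -> list:
    """Return (file, line number) pairs where 'BGN' still appears."""
    hits = []
    for path in root.rglob("*"):
        if path.is_file() and path.suffix in suffixes:
            for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                if BGN_PATTERN.search(line):
                    hits.append((str(path), lineno))
    return hits
```

A real audit would extend this to database schemas, message templates, and vendor configurations, but even a simple scan like this surfaces the lingering references that cause misrouted payments after cutover.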

Final Thoughts

The Bulgarian Lev has accompanied the country through a century of profound change. Its retirement marks the end of an era and the beginning of a new chapter in Bulgaria’s economic story. For the global financial community, Bulgaria’s adoption of the Euro is not only symbolic but operationally significant.

Handled thoughtfully, the transition strengthens financial infrastructure, reduces friction in global business, and supports a more unified European economy.

References 

Bank for International Settlements. (2024). Foreign exchange market developments and global liquidity trends. https://www.bis.org

Eichengreen, B. (1993). European monetary unification. Journal of Economic Literature, 31(3), 1321–1357.

European Central Bank. (2023). Convergence report. https://www.ecb.europa.eu

European Commission. (2023). Economic and monetary union: Euro adoption process. https://ec.europa.eu

Henriques, D. B. (2011). The billionaire was not always so bold. The New York Times.

Organisation for Economic Co-operation and Development. (2024). Economic surveys: Bulgaria. https://www.oecd.org

World Bank. (2024). Bulgaria: Country data and economic indicators. https://data.worldbank.org/country/bulgaria

 

]]>
https://blogs.perficient.com/2025/12/22/bulgarias-2026-euro-adoption-what-the-end-of-the-lev-means-for-markets/feed/ 1 389245
Regulatory Landscape Becomes More Stable as FDIC Approves Proposal for IDIs to Issue Stablecoins https://blogs.perficient.com/2025/12/17/regulatory-landscape-becomes-more-stable-as-fdic-approves-proposal-for-idis-to-issue-stablecoins/ https://blogs.perficient.com/2025/12/17/regulatory-landscape-becomes-more-stable-as-fdic-approves-proposal-for-idis-to-issue-stablecoins/#respond Wed, 17 Dec 2025 16:24:20 +0000 https://blogs.perficient.com/?p=389164

On December 16th, the Federal Deposit Insurance Corporation (FDIC) became the first US regulatory body to utilize the GENIUS Act and create procedures for institutions to issue payment stablecoins. The GENIUS Act was enacted on July 18, 2025, and will become effective on January 18, 2027, so there is still time to determine how your institution will navigate the new regulatory landscape. The FDIC approval marks a significant milestone in the evolving landscape of digital currencies and their integration into the traditional financial system. As the financial sector continues to embrace innovation, the FDIC’s decision provides much-needed clarity and regulatory guidance for institutions looking to venture into the stablecoin market.

For those unfamiliar, stablecoins are digital currencies pegged to the value of a traditional currency and have been gaining traction as a means of facilitating fast and secure transactions. By establishing a clear framework for their issuance, the FDIC is paving the way for FDIC-supervised institutions to explore this emerging market with greater confidence.

At Perficient, we believe that the new procedures will drive innovation in payment systems, enhance financial inclusion, and provide consumers with more choices for conducting transactions. We also believe that the act will allow significant innovation in the Treasury Services space and allow new Treasury entrants to embrace state-of-the-art technology and leap to the head of the industry, just as the adoption of smartphones and the Internet allowed new leaders to emerge in those industries.

Key Highlights in the New Regulation

Readers should note that the FDIC-approved proposal refers to the subsidiary of an Insured Depository Institution (“IDI”) that has been approved to issue payment stablecoins under the GENIUS Act as a Permitted Payment Stablecoin Issuer, or “PPSI” – an acronym that will soon become widely used in the industry.

The proposal limits PPSI’s activities to:

  • issuing and redeeming payment stablecoins,
  • managing related reserves,
  • providing payment stablecoin and reserve custodial and safekeeping services,
  • and engaging in digital asset service provider activities.

The proposal strictly prohibits pledging, rehypothecating, or reusing a PPSI’s reserve assets. The PPSI’s reserves are the digital equivalent of cash in the vault.

What the FDIC Will Require

The FDIC’s proposal outlines specific requirements for stablecoin issuance by PPSIs. To be eligible, the subsidiary must:

  • maintain identifiable reserves backing the outstanding payment stablecoins on at least a 1-to-1 basis,
  • maintain reserves comprised of specified categories of high-quality assets, and
  • document the ability to reliably meet the monthly reserve disclosure requirements applicable to a PPSI.
    • The reserve disclosure requirements include disclosing the composition of the PPSI’s reserves on its website and
    • submitting to the FDIC certified reports examined by a public accounting firm regarding the prior month’s reserve composition disclosure.
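As an illustration of the 1-to-1 requirement, the following Python sketch computes a reserve coverage ratio of the kind a PPSI would verify before its monthly disclosure. The asset categories and figures are hypothetical examples, not the proposal’s defined categories:

```python
from decimal import Decimal

# Hypothetical reserve composition (high-quality assets) and issuance.
reserves = {
    "cash": Decimal("400000000"),
    "short_term_treasuries": Decimal("650000000"),
}
stablecoins_outstanding = Decimal("1000000000")

def coverage_ratio(reserves: dict, outstanding: Decimal) -> Decimal:
    """Total identifiable reserves divided by stablecoins outstanding."""
    return sum(reserves.values(), Decimal(0)) / outstanding

ratio = coverage_ratio(reserves, stablecoins_outstanding)
# The 1:1 floor: reserves must at least equal coins outstanding.
assert ratio >= 1, "reserve shortfall - issuance must stop"
print(f"coverage: {ratio:.4f}")  # coverage: 1.0500
```

In practice this check would run against attested reserve balances, and the monthly composition disclosure on the PPSI’s website would break the numerator down by asset category.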

Additionally, the FDIC is required to consider the ability of the PPSI, based on financial condition and resources, to comply with forthcoming regulations to be issued by the FDIC regarding:

  1. capital requirements
  2. liquidity requirements
  3. reserve asset diversification
  4. operational, compliance, and information technology risk management principles-based requirements and standards, including but not limited to:
    1. Bank Secrecy Act
    2. Know Your Customer and
    3. Sanctions Standards

Therefore, while a significant landmark regulation, there are still more regulations to come before January 2027.

If your IDI is ready to start down this road as an applicant to create a state-of-the-art Treasury Payment subsidiary to issue and redeem stablecoins, you will need a partner with decades of background in the financial services industry. Perhaps one that has been trusted by 18 of the top 20 banks and 16 of the 20 largest wealth and asset management firms, and is regularly recognized by leading analyst firms. If this sounds like the type of trusted partner you need to help build your Treasury Payment abilities, reach out to Perficient’s Financial Services Managing Director David Weisel to start a conversation.

]]>
https://blogs.perficient.com/2025/12/17/regulatory-landscape-becomes-more-stable-as-fdic-approves-proposal-for-idis-to-issue-stablecoins/feed/ 0 389164
Navigating the AI Frontier: Data Governance Controls at SIFIs in 2025 https://blogs.perficient.com/2025/10/13/navigating-the-ai-frontier-data-governance-controls-at-sifis-in-2025/ https://blogs.perficient.com/2025/10/13/navigating-the-ai-frontier-data-governance-controls-at-sifis-in-2025/#comments Mon, 13 Oct 2025 10:57:25 +0000 https://blogs.perficient.com/?p=387652

The Rise of AI in Banking

AI adoption in banking has accelerated dramatically. Predictive analytics, generative AI, and autonomous agentic systems are now embedded in core banking functions such as loan underwriting, compliance including fraud detection and AML, and customer engagement. 

A recent White Paper by Perficient affiliate Virtusa Agentic Architecture in Banking – White Paper | Virtusa documented that when designed with modularity, composability, Human-in-the-Loop (HITL), and governance, agentic AI agents empower a more responsive, data-driven, and human-aligned approach in financial services.

However, the rollout of agentic and generative AI tools without proper controls poses significant risks. Without a unified strategy and governance structure, Systemically Important Financial Institutions (“SIFIs”) risk deploying AI in ways that are opaque, biased, or non-compliant. As AI becomes the engine of next-generation banking, institutions must move beyond experimentation and establish enterprise-wide controls.

Key Components of AI Data Governance

Modern AI data governance in banking encompasses several critical components:

1. Data Quality and Lineage: Banks must ensure that the data feeding AI models is accurate, complete, and traceable.

Please refer to Perficient’s recent blog on this topic here:

AI-Driven Data Lineage for Financial Services Firms: A Practical Roadmap for CDOs / Blogs / Perficient

2. Model Risk Management: AI models must be rigorously tested for fairness, accuracy, and robustness. It has been documented many times in lending decision-making software that the bias of coders can result in biased lending decisions.

3. Third-Party Risk Oversight: Governance frameworks now include vendor assessments and continuous monitoring. Large financial institutions do not have to develop AI technology solutions themselves (Buy vs Build) but they do need to monitor the risks of having key technology infrastructure owned and/or controlled by third parties.

4. Explainability and Accountability: Banks are investing in explainable AI (XAI) techniques. Not everyone is a tech expert, and models need to be easily explainable to auditors, regulators, and when required, customers.

5. Privacy and Security Controls: Encryption, access controls, and anomaly detection are essential. These controls are already standard in legacy systems, and it is natural to extend them to the AI environment, whether that involves narrow AI, machine learning, or more advanced agentic and/or generative AI, so that these proven safeguards carry over to the new platforms. 

Industry Collaboration and Standards

The FINOS Common Controls for AI Services initiative is a collaborative, cross-industry effort led by the Fintech Open Source Foundation (FINOS) to develop open-source, technology-neutral baseline controls for safe, compliant, and trustworthy AI adoption in financial services. By pooling resources from major banks, cloud providers, and technology vendors, the initiative creates standardized controls, peer-reviewed governance frameworks, and real-time validation mechanisms to help financial institutions meet complex regulatory requirements for AI. 

Key participants of FINOS include financial institutions such as BMO, Citibank, Morgan Stanley, and RBC, and key Technology & Cloud Providers include Perficient’s technology partners including Microsoft, Google Cloud, and Amazon Web Services (AWS). The FINOS Common Controls for AI Services initiative aims to create vendor-neutral standards for secure AI adoption in financial services.

At Perficient, we have seen leading financial institutions, including some of the largest SIFIs, establishing formal governance structures to oversee AI initiatives. Broadly, these governance structures typically include:

– Executive Steering Committees at the legal entity level
– Working Groups, at the legal entity as well as the divisional, regional and product levels
– Real-Time Dashboards that allow customizable reporting for boards, executives, and auditors

This multi-tiered governance model promotes transparency, agility, and accountability across the organization.

Regulatory Landscape in 2025

Regulators worldwide are intensifying scrutiny of Artificial Intelligence in banking. The EU AI Act, the U.S. SEC’s cybersecurity disclosure rules, and the National Institute of Standards and Technology (“NIST”) AI Risk Management Framework are shaping how financial institutions must govern AI systems.

Key regulatory expectations include:

– Risk-Based Classification
– Human Oversight
– Auditability
– Bias Mitigation

Some of these, and other regulatory regimes have been documented and summarized by Perficient at the following links:

AI Regulations for Financial Services: Federal Reserve / Blogs / Perficient

AI Regulations for Financial Services: European Union / Blogs / Perficient 

EU AI Act Risk-Based Approach

The Road Ahead

As AI becomes integral to banking operations, data governance will be the linchpin of responsible innovation. Banks must evolve from reactive compliance to proactive risk management, embedding governance into every stage of the AI lifecycle.

The journey begins with data—clean, secure, and well-managed. From there, institutions must build scalable frameworks that support ethical AI development, align with regulatory mandates, and deliver tangible business value.

Readers are urged to read the links contained in this blog and then contact Perficient, a global AI-first digital consultancy to discuss how partnering with Perficient can help run a tailored assessment and pilot design that maps directly to your audit and governance priorities and ensure all new tools are rolled out in a well-designed data governance environment.

]]>
https://blogs.perficient.com/2025/10/13/navigating-the-ai-frontier-data-governance-controls-at-sifis-in-2025/feed/ 1 387652
AI-Driven Data Lineage for Financial Services Firms: A Practical Roadmap for CDOs https://blogs.perficient.com/2025/10/06/ai-driven-data-lineage-for-financial-services-firms-a-practical-roadmap-for-cdos/ https://blogs.perficient.com/2025/10/06/ai-driven-data-lineage-for-financial-services-firms-a-practical-roadmap-for-cdos/#respond Mon, 06 Oct 2025 11:17:05 +0000 https://blogs.perficient.com/?p=387626

Introduction

Imagine this: just as you’re sipping your Monday morning coffee and looking forward to a hopefully quiet week in the office, your Outlook dings and you see that your bank’s primary federal regulator is demanding full input-to-report lineage for dozens of numbers on both sides of the balance sheet and the income statement in your latest financial report filed with the regulator. The full first day letter responses are due next Monday, and as your headache starts you remember that the spreadsheet owner is on leave, the ETL developer is debugging a separate pipeline, and your overworked and understaffed reporting team has three different ad hoc diagrams that neither match nor reconcile.

If you can relate to that scenario, or your back starts to tighten in empathy, you’re not alone. Artificial Intelligence (“AI”)-driven data lineage for banks is no longer a nice-to-have. Working with our clients in banking, insurance, credit unions, and asset management, we at Perficient find that it’s the practical answer to audit pressure, model risk (remember Lehman Brothers and Bear Stearns), and the brittle manual processes that create blind spots. This blog post explains what AI-driven lineage actually delivers, why it matters for banks today, and lays out a phased roadmap Chief Data Officers (“CDOs”) can use to get from pilot to production.

Why AI-driven data lineage for banks matters today

Regulatory pressure and real-world consequences

Regulators and supervisors emphasize demonstrable lineage, timely reconciliation, and governance evidence. In practice, financial services firms must show not just who touched data, but what data enrichment and/or transformations happened, why decisions used specific fields, and how controls were applied—especially under BCBS 239 guidance and evolving supervisory expectations.

In addition, as a former Risk Manager, the author knows firsthand, and has heard from a plethora of financial services executives, the need to know that the decisions they’re making on liquidity funding, investments, recording P&L, and hedging trades are based on the correct numbers. This is especially challenging at global firms that operate in a transaction-heavy environment with constantly changing political, interest rate, foreign exchange, and credit risk conditions.

Operational risks that keep CDOs up at night

Manual lineage—spreadsheets, tribal knowledge, and siloed code—creates slow audits, delayed incident response, and fragile model governance. AI-driven lineage automates discovery and keeps lineage living and queryable, turning reactive fire drills into documented, repeatable processes that greatly shorten the time to close QA tickets and reduce compensation costs for misdirected funds. It also provides a scalable foundation for governed data practices without sacrificing traceability.

What AI-driven lineage and controls actually do (written by and for non-tech staff)

At its core, AI-driven data lineage combines automated scanning of code, SQL, ETL jobs, APIs, and metadata with semantic analysis that links technical fields to business concepts. Instead of a static map, executives using AI-driven data lineage get a living graph that shows data provenance at the field level: where a value originated, which transformations touched it, and which reports, models, or downstream services consume it.

AI adds value by surfacing hidden links. Natural language processing reads table descriptions, SQL comments, and even README files (yes, they still exist out there) to suggest business-term mappings that close the business-IT gap. That semantic layer is what turns a technical lineage graph into audit-ready evidence that regulators or auditors can understand.
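To make the “living graph” idea concrete, here is a minimal sketch of field-level lineage as a directed graph, where tracing a report figure back to its sources is a reverse graph walk. This is an illustration only; the table and field names (and the `LineageGraph` class itself) are hypothetical, not any vendor’s API.

```python
# Minimal sketch of field-level lineage as a directed graph.
# All dataset and field names are hypothetical illustrations.
from collections import defaultdict

class LineageGraph:
    def __init__(self):
        # edges[target] -> list of (source, transformation) pairs
        self.edges = defaultdict(list)

    def add_edge(self, source, target, transformation):
        self.edges[target].append((source, transformation))

    def trace(self, field):
        """Walk upstream from `field`, returning every source field
        and the transformation that produced each hop."""
        seen, stack, path = set(), [field], []
        while stack:
            current = stack.pop()
            for source, transform in self.edges.get(current, []):
                if source not in seen:
                    seen.add(source)
                    path.append((source, transform, current))
                    stack.append(source)
        return path

g = LineageGraph()
g.add_edge("core_banking.txn_amount", "staging.txn_amount_usd", "FX conversion")
g.add_edge("staging.txn_amount_usd", "report.total_deposits", "SUM by account type")

for source, transform, target in g.trace("report.total_deposits"):
    print(f"{source} --[{transform}]--> {target}")
```

The same reverse walk is what answers an examiner’s “where did this number come from?” in one query instead of a week of spreadsheet archaeology.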

How AI fixes the pain points keeping CDOs up at night

Faster audits: As a consultant at Perficient, I have seen implementations of AI-driven lineage that let executives answer traceability questions in hours rather than weeks. Automated evidence packages—exportable lineage views and transformation logs—give auditors a reproducible trail.
Root-cause and incident response: When a report or model spikes, impact analysis shows which datasets and pipelines are involved, clarifying responsibility and accountability, speeding remediation, and limiting downstream impact.
Model safety and feature provenance: Lineage that includes training datasets and feature transformations enables validation of model inputs, reproducibility of training data, and enforcement of data controls—supporting explainability and governance requirements. That makes your P&L more “R&S,” a slogan one client used to mean rock-solid profit and loss.

Tooling, architecture, and vendor considerations

When evaluating vendors, demand field-level lineage, semantic parsing (NLP across SQL, code, and docs), auditable diagram exports, and policy enforcement hooks that integrate with data protection tools. Deployment choices matter in regulated banking environments; hybrid architectures that keep sensitive metadata on-prem while leveraging cloud analytics often strike a pragmatic balance.

A practical, phased roadmap for CDOs

Phase 0 — Align leadership and define success: Engage the CRO, COO, and Head of Model Risk. Define 3–5 KPIs (e.g., lineage coverage, evidence time, mean time to root cause) and what “good” will look like. This is often done during an evidence-gathering phase that Perficient runs with clients who are just starting their Artificial Intelligence journey.
Phase 1 — Inventory and quick wins: Target a high-risk area such as regulatory reporting, a few production models, or a critical data domain. Validate inventory manually to establish baseline credibility.
Phase 2 — Pilot AI lineage and controls: Run automated discovery, measure accuracy and false positives, and quantify time savings. Expect iterations as the model improves with curated mappings.
Phases 1 and 2 are usually run by Perficient with clients as a proof-of-concept to show that the key feeds into and out of existing technology platforms can be built.
Phase 3 — Operationalize and scale: Integrate lineage into release workflows, assign lineage stewards, set SLAs, and connect with ticketing and monitoring systems to embed lineage into day-to-day operations.
Phase 4 — Measure, refine, expand: Track KPIs, adjust models and rules, and broaden scope to additional reports, pipelines, and models as confidence grows.
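The Phase 0 KPIs reduce to very simple arithmetic. A sketch, assuming hypothetical figures; the numbers and function names are illustrative, not recommended targets:

```python
# Sketch of computing two Phase 0 KPIs: lineage coverage and
# mean time to root cause. All inputs are hypothetical examples.
from datetime import timedelta

def lineage_coverage(documented_fields, total_fields):
    """Share of critical report fields with documented lineage."""
    return documented_fields / total_fields

def mean_time_to_root_cause(incident_durations):
    """Average time from incident detection to identified root cause."""
    total = sum(incident_durations, timedelta())
    return total / len(incident_durations)

coverage = lineage_coverage(documented_fields=410, total_fields=500)
mttrc = mean_time_to_root_cause([timedelta(hours=6), timedelta(hours=10)])

print(f"Lineage coverage: {coverage:.0%}")   # Lineage coverage: 82%
print(f"Mean time to root cause: {mttrc}")   # Mean time to root cause: 8:00:00
```

Tracking these per quarter makes Phase 4’s “measure, refine, expand” step a trend line rather than an argument.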

Risks, human oversight, and governance guardrails

AI reduces toil but does not remove accountability. Executives, auditors, and regulators do (or should) require deterministic evidence and human-reviewed lineage. Treat AI outputs as recommendations subject to curator approval. This will help you avoid what many financial services executives are now grappling with: AI hallucinations.

Guardrails include establishing exception-processing workflows for disputed outputs and tollgates that ensure security and privacy are baked into the design—DSPM, masking, and appropriate IAM controls should be integral, not afterthoughts.

Conclusion and next steps

AI data lineage for banks is a pragmatic control that directly addresses regulatory expectations, speeds audits, and reduces model and reporting risk. Start small, prove value with a focused pilot, and embed lineage into standard data stewardship processes. If you’re a CDO looking to move quickly with minimal risk, contact Perficient to run a tailored assessment and pilot design that maps directly to your audit and governance priorities. We’ll help translate proof into firm-wide control and confidence.

The Silent Architect: How Data Governance Will Decide the Winners and Losers in the AI World (Mon, 28 Apr 2025) https://blogs.perficient.com/2025/04/28/the-silent-architect-how-data-governance-will-decide-the-winners-and-losers-in-the-ai-world/

 “The strength of a nation derives from the integrity of the home.” – Confucius.

A room full of smart people, eyes glinting with the thrill of the future. Words like predictive models, AI-driven insights, and automated decisioning fly across the table like volleys in a Wimbledon final. Budgets, approved. Deadlines, drawn. Headlines, dreamed about.

But no one talks to or even notices the quiet, slightly awkward one in the room: Data Governance. The one who isn’t flashy … The one who shows up early with spreadsheets. The one who asks annoying questions, such as, “Where did this data come from?” and “Can we really trust this source?”

And yet, in almost every great technology story and every technology failure, Data Governance is the silent architect, whether you call it that or not. It was present, building the foundation… or sometimes silently watching as the castle falls.

The Illusion of Data-Driven Greatness

Some time back, I was working on a project where a major trading platform launched a new engine to automate trade surveillance and compliance monitoring.

Dollars were invested. The system promised to detect insider trading, front-running, and wash trade patterns too subtle for human eyes to catch. At first, everyone celebrated… until the false positives began to roll in. Legitimate trades were flagged as suspicious!!! Compliance officers were drowning in noise!!! Clients grew agitated, and regulatory auditors began asking uncomfortable questions.

When traced back, the root cause wasn’t the model itself. It was the data feeding it!

  • Trade timestamps were off by milliseconds across systems.
  • Reference data on instrument types was incomplete.
  • Entity mappings between clients and brokers were outdated by over 9%.
  • Historical compliance notes were inconsistently formatted and misclassified.
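Root causes like these are often catchable with simple automated checks before data ever reaches a model. A hedged sketch of two such checks; all field names and tolerances are hypothetical assumptions, not the project’s actual rules:

```python
# Illustrative pre-model data quality checks; all field names and
# tolerances are hypothetical assumptions.
from datetime import datetime, timedelta

def check_timestamp_skew(ts_system_a, ts_system_b, tolerance_ms=5):
    """Flag trade timestamps that disagree across systems."""
    skew = abs(ts_system_a - ts_system_b)
    return skew <= timedelta(milliseconds=tolerance_ms)

def check_completeness(records, required_fields):
    """Return records missing any required reference-data field."""
    return [r for r in records if any(r.get(f) in (None, "") for f in required_fields)]

trades = [
    {"trade_id": "T1", "instrument_type": "equity", "broker_id": "B9"},
    {"trade_id": "T2", "instrument_type": None, "broker_id": "B4"},
]

incomplete = check_completeness(trades, ["instrument_type", "broker_id"])
print([t["trade_id"] for t in incomplete])  # ['T2']

t0 = datetime(2024, 1, 2, 9, 30)
print(check_timestamp_skew(t0, t0 + timedelta(milliseconds=3)))  # True
```

Gating the pipeline on checks like these costs little and would have surfaced the incomplete reference data long before the surveillance engine learned from it.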

The model learned from incorrect data… and produced inaccurate results at scale. The organization had to suspend the AI engine and return to manual reviews in parallel, a massive operational setback.

The real problem wasn’t a technology failure. It was a data governance failure.

Why Data Governance is the New Competitive Edge

In the coming decade, success won’t be determined by who has the flashiest algorithms. Algorithms are cheap, open-source, and increasingly commoditized. Success will hinge on who has better data, the companies that:

  • Know where their data comes from.
  • Know how it has been transformed.
  • Know its limitations, its biases, and its gaps.
  • Know how to course-correct in real time when something goes wrong.

Data governance used to be framed as a compliance tax… a necessary evil. But in the AI economy? It has become the operating system. Companies that treat governance like a strategic weapon, like a competitive differentiator, will build systems that are faster, smarter, safer, and more trusted. Everyone else will just be building very expensive sandcastles at low tide and praying tides don’t change.

The Risks Few Are Talking About

People love to talk about risks in the AI-driven world in sci-fi terms: rogue robots, existential threats, AI Models running for president 😊

The real risk, one that is already unfolding in boardrooms and regulatory filings today, is much simpler: bad data feeding powerful systems.

  • False alerts triggering unnecessary audits.
  • Missed detection of real financial crimes.
  • Market surveillance breakdowns causing regulatory breaches.
  • Systemic compliance failures due to unseen data quality gaps.

All because governance was an afterthought.

The New Playbook for the AI Economy

If you’re a business leader, here’s the shift you need to make:

Old Thinking → New Thinking

  • Data Governance is a compliance overhead → Data Governance is strategic infrastructure
  • Data is static, fixed once loaded → Data is dynamic, living, and needs continuous validation
  • Governance slows innovation → Governance “enables” trustworthy, scalable innovation
  • We can fix data later → Data quality debt is like technical debt… it compounds and destroys

Smart organizations are now embedding governance into the very DNA of how they build, deploy, and manage AI systems. They’re asking:

  • Who owns this dataset?
  • How do we know it’s complete?
  • What biases are hiding here?
  • How do we certify and monitor trustworthiness over time?

And they’re investing accordingly — not reactively, but proactively.

Respect the Architect

Here’s the thing about architects. If they do their jobs right, no one notices them. The building just stands tall, sturdy, unshakable against storms. But what about when the foundation is weak? When the beams are poorly set? When the wiring is rushed? Well, then everyone notices. Usually, it’s too late. Data Governance is the silent architect of the AI structures. It’s time we gave it the respect and the investment it deserves. Because in the end, it’s not the flashiest ideas that win.
It’s the ones built on unshakable foundations.

Remember: “It is not the beauty of a building you should look at; it is the construction of the foundation that will stand the test of time.” – David Allan Coe.

7 Steps to Define a Data Governance Structure for a Mid-Sized Bank (Without Losing Your Mind) (Tue, 25 Mar 2025) https://blogs.perficient.com/2025/03/25/7-steps-to-define-a-data-governance-for-a-mid-sized-bank-without-losing-your-mind/

A mid-sized bank I was consulting with for their data warehouse modernization project finally realized that data isn’t just some necessary but boring stuff the IT department hoards in their digital cave. It’s the new gold, the ticking time bomb of risk, and the bane of every regulatory report that’s ever come back with more red flags than a beach during a shark sighting.

Welcome to the wild world of data governance, where dreams of order collide with the chaos of reality. Before you start mainlining espresso and squeezing that stress ball shaped suspiciously like your last audit report, let’s break this down into 7 steps that might just keep you sane.

  1. Wrangle Some Executive Buy-In

Let’s not pretend. Without exec sponsorship, your data governance initiative is just a Trello board with high hopes. You need someone in a suit (preferably with a C in their title) to not just bless your mission but be convinced of it, and preferably to add it to their KPIs this year.

Pro tip to get that signature: Skip the jargon about “metadata catalogs” and go straight for the jugular with words like “penalties” and “reputational risk.” Nothing gets an exec’s attention quite like the threat of their club memberships being revoked.

  2. Tame the Scope Before It Turns Into a Stampede

Organizations have a knack for letting projects balloon faster than a tech startup’s valuation. Be ruthless. You don’t need to govern every scrap of data from the CEO’s coffee order to the janitor’s mop schedule.

Focus on the critical stuff:

  • Customer data (because knowing who owes you money is kind of important)
  • Transaction history (aka “where did all the money go?”)
  • Regulatory reporting (because nobody likes surprise visits from auditors)

Start small, prove it works, then expand. Rome wasn’t built in a day, and neither was a decent data governance structure.

  3. Pick a Framework (But Don’t Treat It Like Holy Scripture)

Sure, you could go full nerd and dive into DAMA-DMBOK, but unless you’re gunning for a PhD in bureaucracy, keep it simple. Aim for a model that’s more “I get it” and less “I need an interpreter”.

Focus on:

  • Who’s responsible for what (RACI, if you must use an acronym)
  • What data belongs where
  • Rules that sound smart but won’t make everyone quit in protest

Remember, frameworks are like diets – the best one is the one you’ll actually stick to.

  4. Recruit Your Data Stewards (and Convince Them It’s Not a Punishment)

Your data stewards are the poor souls standing between order and chaos, armed with nothing but spreadsheets and a dwindling supply of patience. Look for folks who:

  • Actually understand the data (a rare breed, cherish them)
  • Can handle details without going cross-eyed
  • Won’t melt down when stuck between the rock of compliance and the hard place of IT

Bonus: Give them a fancy title like “Data Integrity Czar.” It won’t pay more, but it might make them feel better about their life choices.

  5. Define Your Terms (Or Prepare for the “What Even Is a ‘Customer’?” Wars)

Get ready for some fun conversations about what words mean. You’d think “customer” would be straightforward, but you’d be wrong. So very, very wrong.

  • Establish a single source of truth
  • Create a glossary that doesn’t read like a legal document
  • Accept that these definitions will change more often than a teenager’s social media profile

It’s not perfect, but it’s governance, not a philosophical treatise on the nature of reality.

  6. Build Your Tech Stack (But Don’t Start with the Shiny Toys)

For the love of all that is holy and GDPR-compliant, don’t buy a fancy governance tool before you know what you’re doing. Your tech should support your process, not be a $250,000 band-aid for a broken system.

Figure out:

  • Who gets to see what (and who definitely shouldn’t)
  • How you’re classifying data (beyond “important” and “meh”)
  • Where your golden records live
  • What to do when it all inevitably goes sideways

Metadata management and data lineage tracking are great, but they’re the icing, not the cake.
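As one illustration of “who gets to see what,” access rules can start as a plain classification-versus-clearance comparison long before any tool purchase. The roles and labels below are hypothetical:

```python
# Hypothetical sketch of a minimal classification-based access check.
# Roles and classification labels are illustrative assumptions.
CLASSIFICATION_RANK = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

ROLE_CLEARANCE = {
    "analyst": "internal",
    "data_steward": "confidential",
    "compliance_officer": "restricted",
}

def can_access(role, data_classification):
    """A role may read data at or below its clearance level."""
    clearance = ROLE_CLEARANCE.get(role, "public")
    return CLASSIFICATION_RANK[clearance] >= CLASSIFICATION_RANK[data_classification]

print(can_access("analyst", "confidential"))           # False
print(can_access("compliance_officer", "restricted"))  # True
```

Once rules this simple are written down and agreed, mapping them into a real IAM or catalog tool is configuration, not design.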

  7. Make It Boring (In a Good Way)

The true test of your governance structure isn’t the PowerPoint that put the board to sleep. It’s whether it holds up when someone decides to get creative with data entry at 4:59 PM on Fridays.

So:

  • Schedule regular data quality check-ups
  • Treat data issues like actual problems, not minor inconveniences
  • Set up alerts (but not so many that everyone ignores them)
  • Reward the good, don’t just punish the bad

Bonus: Document Everything (Then Document Your Documentation)

If it’s not written down, it doesn’t exist. If it’s written down but buried in a SharePoint site that time forgot, it still doesn’t exist.

Think of governance like flossing – it’s not exciting, but it beats the alternative.

Several mid-sized banks have successfully implemented data governance structures, demonstrating the real-world benefits of these strategies. Here are a few notable examples:

Case Study of a Large American Bank

This bank’s approach to data governance offers valuable lessons for mid-sized banks. The bank implemented robust data governance practices to enhance data quality, security, and compliance. Their focus on:

  • Aligning data management with regulatory requirements
  • Ensuring accurate financial reporting
  • Improving decision-making processes

resulted in better risk management, increased regulatory compliance, and enhanced customer trust through secure and reliable financial services.

Regional Bank Case Study

A regional bank successfully tackled data quality issues impacting compliance, credit, and liquidity risk assessment. Their approach included:

  1. Establishing roles and responsibilities for data governance
  2. Creating domains with assigned data custodians and stewards
  3. Collecting and simplifying knowledge about critical data elements (CDEs)

For example, in liquidity risk assessment, they identified core CDEs such as liquidity coverage ratio and net stable funding ratio.

Mid-Sized Bank Acquisition

In another case, a major bank acquired a regional financial services company and faced the challenge of integrating disparate data systems. Their data governance implementation involved:

  • Launching a data consolidation initiative
  • Centralizing data from multiple systems into a unified data warehouse
  • Establishing a cross-functional data governance team
  • Defining clear data definitions, ownership rules, and access permissions

This approach eliminated data silos, created a single source of truth, and significantly improved data quality and reliability. It also facilitated more accurate reporting and analysis, leading to more effective risk management and smoother banking services for customers.

Parting Thought

In the end, defining a data governance structure for your bank isn’t about creating a bureaucratic nightmare. It’s about keeping your data in check, your regulators off your back, and your systems speaking the same language.

When it all comes together, and your data actually starts making sense, you’ll feel like a criminal mastermind watching their perfect plan unfold. Only, you know, legal and with fewer car chases.

Now go forth and govern. May your data be clean, your audits be boring, and your governance meetings be mercifully short.

AI Regulations for Financial Services: Hong Kong (Thu, 21 Nov 2024) https://blogs.perficient.com/2024/11/21/ai-regulations-for-financial-services-hong-kong/

Artificial intelligence (AI) is poised to affect every aspect of the world economy and play a significant role in the global financial system, leading financial regulators around the world to take various steps to address the impact of AI on their areas of responsibility. The economic risks of AI to the financial systems include everything from the potential for consumer and institutional fraud to algorithmic discrimination and AI-enabled cybersecurity risks. The impacts of AI on consumers, banks, nonbank financial institutions, and the financial system’s stability are all concerns to be investigated and potentially addressed by regulators.

It is the goal of Perficient’s Financial Services consultants to give financial services executives, whether they lead banks, bank branches, bank holding companies, broker-dealers, financial advisors, insurance companies, or investment management firms, the knowledge they need to understand the status of AI regulation and the risk and regulatory trends of AI regulation, not only in the US but around the world, where their firms are likely to have investment and trading operations.

In the summer of 2024, the Hong Kong Monetary Authority (“HKMA”) issued multiple guidance documents to financial services firms covering their use of artificial intelligence in both customer-facing applications as well as anti-money laundering and detecting and countering terrorist financing (“AML/CTF”). Specifically, the HKMA issued:

  1. Guiding principles issued by the HKMA on August 19, 2024, on the use of generative artificial intelligence (“GenAI”) in customer-facing applications (the “GenAI Guidelines”). The GenAI Guidelines build on a previous HKMA circular, “Consumer Protection in respect of Use of Big Data Analytics and Artificial Intelligence by Authorized Institutions,” dated November 5, 2019 (the “2019 BDAI Guiding Principles”), and provide specific guidelines to financial services firms on the use of GenAI; and
  2. An AML/CTF circular issued by the HKMA on September 9, 2024, that requires financial services firms with operations in Hong Kong to:
    1. undertake a study to consider the feasibility of using artificial intelligence in tackling AML/CTF, and
    2. submit the feasibility study and an implementation plan to the HKMA by the end of March 2025.

Leveraging the 2019 BDAI Guiding Principles as a foundation, the GenAI Guidelines adopt the same core principles of governance and accountability, fairness, transparency and disclosure, and data privacy and protection, but introduce additional requirements to address the specific challenges presented by GenAI.

Core Principles and Requirements under the GenAI Guidelines
Governance and Accountability: The board and senior management of financial services firms should remain accountable for all GenAI-driven decisions and processes and should thoroughly consider the potential impact of GenAI applications on customers through an appropriate committee that sits within the firm’s governance framework. The board and senior management should ensure the following:

  • Clearly defined scope of customer-facing GenAI applications to avoid GenAI usage in unintended areas;
  • Proper policies and procedures and related control measures for responsible GenAI use in customer-facing applications; and
  • Proper validation of GenAI models, including a “human-in-the-loop” approach in early stages, i.e. having a human retain control in the decision-making process, to ensure the model-generated outputs are accurate and not misleading.
Fairness: Financial services firms are responsible for ensuring that GenAI models produce objective, consistent, ethical, and fair outcomes for customers. This includes:

  • Ensuring that model-generated outputs do not lead to unfair outcomes for customers. As part of this, firms are expected to consider different approaches that may be deployed in GenAI models, such as:
    1. anonymizing certain data categories;
    2. using comprehensive and fair datasets; and
    3. making adjustments to remove bias during validation and review; and
  • During the early deployment stage, providing customers with an option to opt out of GenAI use and to request human intervention on GenAI-generated decisions as far as practicable. If an “opt-out” option is unavailable, firms should provide channels for customers to request review of GenAI-generated decisions.
Transparency and Disclosure: Financial services firms should:

  • Provide appropriate transparency to customers regarding GenAI applications; and
  • Disclose the use of GenAI to customers; and
  • Communicate the use, purpose, and limitations of GenAI models to enhance customer understanding.
Data Privacy and Protection: Financial services firms should:

  • Implement effective protection measures for customer data; and
  • Where personal data are collected and processed by GenAI applications, comply with the Personal Data (Privacy) Ordinance, including the relevant recommendations and good practices issued by the Office of the Privacy Commissioner for Personal Data, such as the:
  1. “Guidance on the Ethical Development and Use of Artificial Intelligence” issued on August 18, 2021, and
  2. “Artificial Intelligence: Model Personal Data Protection Framework” issued on June 11, 2024.

Consistent with the HKMA’s recognition of the potential use of GenAI in consumer protection in the GenAI Guidelines, the HKMA Circular also indicates that the HKMA recognizes the considerable benefits that may come from the deployment of AI in improving AML/CTF. In particular, the HKMA Circular notes that the use of AI powered systems “take into account a broad range of contextual information focusing not only on individual transactions, but also the active risk profile and past transaction patterns of customers…These systems have proved to be more effective and efficient than conventional rules-based transaction monitoring systems commonly used by covered firms.”

Given this, the HKMA has indicated that financial services firms with operations in Hong Kong should:

  • give due consideration to adopting AI in their AML/CTF monitoring systems to enable them to stay effective and efficient; and
  • undertake a feasibility study in relation to the adoption of AI in their AML/CTF monitoring systems and based on the outcome of that review, should formulate an implementation plan.

The feasibility study and implementation plan should be signed off at the board level and submitted to the HKMA by March 31, 2025.

1033 Open Banking Mandate Blueprint for Success (Thu, 21 Nov 2024) https://blogs.perficient.com/2024/11/21/open-banking-1033/

The Consumer Financial Protection Bureau (CFPB) recently issued a final rule § 1033.121(c) supporting open banking and personal financial data rights. Under this ruling, banks, credit unions, credit card issuers, and other financial service providers must enhance consumer access to personal financial data.

The first compliance deadline of April 1, 2026, impacts the largest organizations.

  • The ruling demands action from all non-depository firms (e.g., institutions that issue credit cards, hold transaction accounts, issue devices to access an account, or provide other types of payment facilitation products or services). The compliance deadline, however, depends on the firm’s total receipts from calendar years 2023 and 2024.
    • April 1, 2026: $10B+ total receipts in either calendar year
    • April 1, 2027: <$10B total receipts in both calendar years
  • The ruling also impacts depository institutions that hold at least $850 million in total assets. Compliance deadlines follow a staggered rollout based on total assets.
    • April 1, 2026: $250B+ total assets
    • April 1, 2027: $10B to <$250B total assets 
    • April 1, 2028: $3B to <$10B total assets
    • April 1, 2029: $1.5B to <$3B total assets
    • April 1, 2030: $850M to <$1.5B total assets
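The staggered depository-institution deadlines above reduce to a simple tier lookup. The sketch below encodes the thresholds as summarized in this post; it is illustrative only, not compliance advice:

```python
# Tier lookup for depository-institution compliance deadlines under
# the CFPB 1033 rule, using the asset thresholds summarized above.
def depository_deadline(total_assets):
    """Map total assets (USD) to the applicable compliance deadline."""
    tiers = [
        (250_000_000_000, "April 1, 2026"),
        (10_000_000_000,  "April 1, 2027"),
        (3_000_000_000,   "April 1, 2028"),
        (1_500_000_000,   "April 1, 2029"),
        (850_000_000,     "April 1, 2030"),
    ]
    for threshold, deadline in tiers:
        if total_assets >= threshold:
            return deadline
    return None  # below $850M: outside the staggered schedule

print(depository_deadline(40_000_000_000))  # April 1, 2027
print(depository_deadline(900_000_000))     # April 1, 2030
```

A similar two-tier lookup on total receipts would cover the non-depository schedule.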

Accelerating the shift to open banking with 1033 

Open banking changes how financial data is shared and accessed, giving customers more control of their information. The 1033 Personal Financial Data Rights rule ensures that:

  • Personal financial data is made available to consumers and agents at no charge
  • Data is exchanged through a safe, secure, and reliable digital interface
  • Consumers aren’t surprised with hidden or unexpected charges when accessing their personal financial data
  • Consumers can walk away from bad financial services and products
  • Safeguards protect consumers and financial firms from surveillance, data misuse, and risky data practices

Open banking is going to do for the banking industry what the introduction of the Apple smartphone did for cell phones.

CFPB 1033 open banking requires financial firms to ease personal financial data access for consumers 

The CFPB first proposed the rule in the Federal Register on October 31, 2023, accepted public comments on the regulation through December 29, 2023, then issued its final rule on November 18, 2024. This effort carries out the personal financial data rights established by the Consumer Financial Protection Act of 2010 (CFPA).

The final rule § 1033.121(c) “requires banks, credit unions, and other financial service providers to make consumers’ data available upon request to consumers and authorized third parties in a secure and reliable manner; defines obligations for third parties accessing consumers’ data, including important privacy protections; and promotes fair, open, and inclusive industry standards.”  

The implications of the CFPB’s regulation on open banking will be enormous for consumers, banks, and data providers.

Impact on consumers 

Without open banking, consumers struggle to switch between bank deposit and lending offerings. For example, switching checking accounts to one with a better interest rate involves resetting direct deposits and recurring bill-paying, printing new checks, and obtaining a new ATM card. Mistakes resulting in overdrafts are costly, both financially and to one’s credit score and reputation.   

As a result, larger banks have a much smaller net interest margin, as shown in the chart below:

[Chart: net interest margin comparison across bank sizes]

In addition, the stickiness of deposits causes a considerable lag between when a bank raises deposit rates and when deposit balances increase proportionately. 

As open banking, mandated by Rule 1033, takes effect, consumers will be able to:

  • Switch credit cards within seconds while retaining terms and rewards of their current account
  • Transfer deposits and multiple years of transaction history into a new checking account  

Impact on data providers 

Data providers, including digital wallet providers, will be able to move on from “screen scraping” and instead provide API-driven real-time balances, transaction history, and reward balances to their retail customers. Of course, providing this “new and improved” service will require re-writing front ends and processing engines to provide the necessary data in a timely manner. 

Impact on banks 

Banks and their affiliates must look toward building an open, larger ecosystem as part of continued digital transformation efforts.

While challenging, this work is necessary for banks that aim to grow revenue through collaboration and cooperation. Ultimately, banks that don’t satisfy their borrowers or lenders will be hard-pressed to compete in the ever-challenging financial landscape.

Navigate 1033 open banking compliance deadlines with confidence 

We encourage leaders to identify mandates’ silver lining opportunities. After all, to remain competitive and compliant, financial services firms must innovate in ways that add business value, meet consumers’ evolving expectations, and build trust. Achieving transformative outcomes and experiences requires a digital strategy that not only satisfies mandates but also aligns the enterprise around a shared vision and actionable KPIs, ultimately keeping customers at the heart of progress.

A holistic approach could include:

  • Strategy + Transformation: current-state assessment, future-state roadmap, change management
  • Platforms + Technology: pragmatically scalable, composable architecture and automations to accelerate progress
  • Data + Intelligence: well-governed “golden source of truth” data and secure integrations/orchestration
  • Innovation + Product Development: engineering and design for what’s now, new, and next
  • Customer Experience + Digital Marketing: human-centered, journey-based engagement
  • Optimized Delivery: Agile methodologies, deep domain expertise, and scalable global teams

Our financial services experts continuously monitor the regulatory landscape and deliver pragmatic, scalable solutions that meet the mandate and more. Discover why we’ve been trusted by 18 of the top 20 banks, 16 of the 20 largest wealth and asset management firms, and are regularly recognized by leading analyst firms.

Ready to explore your firm’s compliance with Rule 1033? Contact us to discuss your specific risk and regulatory challenges.  

AI Regulations for Financial Services: Japan
https://blogs.perficient.com/2024/11/19/ai-regulations-for-financial-services-japan/
Tue, 19 Nov 2024

Artificial intelligence (AI) is poised to affect every aspect of the world economy and play a significant role in the global financial system, leading financial regulators around the world to take various steps to address the impact of AI on their areas of responsibility. The economic risks of AI to the financial systems include everything from the potential for consumer and institutional fraud to algorithmic discrimination and AI-enabled cybersecurity risks. The impacts of AI on consumers, banks, nonbank financial institutions, and the financial system’s stability are all concerns to be investigated and potentially addressed by regulators.

The goal of Perficient's Financial Services consultants is to give financial services executives, whether they lead banks, bank branches, bank holding companies, broker-dealers, financial advisors, insurance companies, or investment management firms, the knowledge they need to understand the status of AI regulation and the risk and regulatory trends surrounding it, not only in the US but around the world, wherever their firms are likely to have investment and trading operations.

Japan has yet to pass a law or regulation specifically directed at the use of AI by financial services firms. Instead, the Japanese government and regulators are taking an indirect approach, supporting a policy goal of prioritizing innovation while minimizing foreseeable harms.

On April 19, 2024, the Japanese government published new “AI Guidelines for Business Version 1.0” (the “Guidelines”). While not legally binding, the Guidelines are expected to support and induce voluntary efforts by developers, providers, and business users of AI systems through compliance with generally recognized AI principles and are similar to the EU regulations discussed previously in that they propose a risk-based approach.

As noted on page 26 of the English version of the Guidelines, the Guidelines promote “agile governance” where “multiple stakeholders continuously and rapidly run a cycle consisting of environment and risk analysis, goal setting, system design, operation and then evaluation in various governance systems in companies, regulations, infrastructure, markets, social codes and the like”.

In addition to the Guidelines, an AI Strategy Council, a government advisory body, was established to consider approaches for maximizing the potential of AI while minimizing the potential risks to the financial system. On May 22, 2024, the Council submitted draft discussion points concerning the advisability and potential scope of any future regulation.

Finally, a working group in the Japanese Parliament has proposed the first specific Japanese regulation of AI, "the Basic Act on the Advancement of Responsible AI," which takes a hard-law approach to regulating certain generative AI foundation models. If passed as-is, the Japanese government would designate the AI systems and developers subject to regulation; impose obligations on them with respect to the vetting, operation, and output of those systems; and require periodic reports concerning the AI systems.

The proposed obligations would provide a general framework, while industry groups for financial services firms would work with the Japanese Financial Services Agency (“JFSA”) to establish the specific standards by which firms would comply. It is further thought that the government would have the authority to monitor AI developers and impose fines and penalties for violations of the reporting obligations and/or compliance with the substance of the law.
