Banking CIO Outlook

Meeting the Rigors of Stress Testing with a Stronger Data Strategy

The US Federal Reserve recently released the results of its annual stress test of dozens of major banks. As analysts mine the results for capital management insights, it’s a good time for technology leaders to revisit the stress testing process itself, which is among the most valuable and rigorous activities from an IT management perspective. Let’s examine how a modern data and workflow management approach helps banks perform well both in the stress testing scenarios set by the Fed and other regulators and in the face of real-world operational challenges.

The unique value of stress tests

Known formally as the US Federal Reserve Comprehensive Capital Analysis and Review (CCAR), the Fed’s stress test is an annual rite for dozens of major banks. It is designed to gauge how well firms are prepared to absorb losses while maintaining adequate capital levels to continue operating. Stress tests by the Fed and other regulators are meant to clarify weaknesses and point the way toward a stronger and more resilient banking enterprise.

The banks that took part in this year’s testing by the Fed succeeded in demonstrating the capacity to withstand a mock severe-recession scenario. Depending on where they operate, firms may also have to conduct stress tests with the Bank of England’s Prudential Regulation Authority (PRA), the European Banking Authority (EBA), the Hong Kong Monetary Authority (HKMA), and other entities.

Stress tests are a uniquely collaborative exercise for banks and regulators. They’re less about oversight and more about gaining a collective understanding of how to test and refine best practices for keeping banks responsive and resilient to financial shocks. By uncovering vulnerabilities in a bank’s balance sheet, capital adequacy, and management practices, stress testing informs risk management and mitigation strategies while uncovering where and how processes and operations can be improved.

Modernizing stress testing methodologies

The stress testing process itself is rigorous. For a large bank, a typical stress test covers dozens of complex processes involving thousands of tasks and data elements, all under tight time constraints. The scope is such that banks must prioritize several key modernization goals to put in place the infrastructure these tests demand.

Let’s start by isolating some essential methodologies that banks’ data architectures should support for effective stress testing. To begin with, stress testing requires banks to manage complex test procedures and gather data from many disparate sources. This puts a premium on data integration capabilities that connect information streams from across the enterprise and accelerate data gathering. Banks also need to streamline workflows to make stress test management more efficient, accurate, and collaborative. It’s important to have a solution that can model and reliably automate the unique sequence of tasks and approvals required for testing to take place.
Process monitoring is another essential capability, providing visibility into stress test activities to improve quality assurance and repeatability. The right solution supports real-time monitoring and optimization of both application and infrastructure performance so that issues are detected and resolved before they adversely affect end users. Throughout, organizations should use collaboration and reporting tools that enable teams to work together efficiently, pre-screen test results, and share them with key stakeholders and regulators.
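To make the workflow idea concrete, here is a minimal sketch in Python of how a stress-test task sequence with dependency ordering and approval gates might be automated and audited. The task names and the `run_workflow` helper are illustrative assumptions, not any particular vendor’s API.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    depends_on: list = field(default_factory=list)
    requires_approval: bool = False
    status: str = "pending"

def run_workflow(tasks, approvals):
    """Run tasks in dependency order. Approval-gated tasks run only
    when an approval is on record; otherwise they are marked blocked.
    Returns an audit log of (task, outcome) pairs for monitoring."""
    done = set()
    log = []
    progressed = True
    while progressed:
        progressed = False
        for t in tasks:
            if t.status != "pending":
                continue
            if not all(dep in done for dep in t.depends_on):
                continue  # upstream tasks not finished yet
            if t.requires_approval and t.name not in approvals:
                t.status = "blocked"
                log.append((t.name, "blocked: awaiting approval"))
                continue
            t.status = "complete"
            done.add(t.name)
            log.append((t.name, "complete"))
            progressed = True
    return log

# Hypothetical three-step sequence: gather data, run the scenario,
# then submit results only once sign-off has been recorded.
tasks = [
    Task("gather_exposures"),
    Task("run_scenario", depends_on=["gather_exposures"]),
    Task("submit_results", depends_on=["run_scenario"], requires_approval=True),
]
log = run_workflow(tasks, approvals=set())  # no sign-off yet: final step blocks
```

The audit log doubles as a simple monitoring feed: a blocked or stalled task surfaces immediately rather than being discovered at the submission deadline.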

Evolving data architectures for stronger stress testing and operational resilience

The above methodologies are essential to support modern stress testing, but they must be supported by an underlying data infrastructure that is highly evolved and responsive to the agile needs of a modern banking enterprise.

Organizations have several data frameworks to choose from, including data lakes, data meshes, and data fabric architectures. All of these represent an improvement on the traditional data warehouse, but when it comes to breaking down silos and better connecting data, both data lake and data mesh architectures still require coding-intensive, sophisticated data engineering to run properly.

By contrast, a data fabric architecture lets you access all data where it resides while applying an “abstraction layer” that eliminates the high-code burden and connects all data in a centralized platform. In essence, a data fabric gives teams a unified view of data across the organization: highly contextualized data that is ready for consumption and management by the business users who need it. Low-code platforms can further support this collaborative data fabric architecture. Such interfaces allow business analyst-level users to create and manage even complex functions like robotic process automation (RPA) and digital process automation (DPA), connecting data and accelerating automation with AI. This helps speed development and quickly scale processes.
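As an illustration of the abstraction-layer idea, here is a minimal sketch in Python: data stays in each source system, and a registry assembles a unified, contextualized view on demand. The `DataFabric` class and the connector names are hypothetical, chosen for the example rather than drawn from any product.

```python
class DataFabric:
    """Abstraction-layer sketch: data remains in its source systems;
    the fabric registers lightweight connectors and assembles a
    unified, contextualized view on demand rather than copying data."""

    def __init__(self):
        self._sources = {}  # name -> (fetch function, context metadata)

    def register(self, name, fetch_fn, context=None):
        # fetch_fn queries the source in place; nothing is replicated here.
        self._sources[name] = (fetch_fn, context or {})

    def unified_view(self, entity_id):
        view = {}
        for name, (fetch, context) in self._sources.items():
            record = fetch(entity_id)
            if record is not None:
                view[name] = {**context, "data": record}
        return view

# Hypothetical connectors standing in for a risk database and a ledger.
fabric = DataFabric()
fabric.register(
    "risk_db",
    lambda cid: {"exposure": 1_200_000} if cid == "cpty-9" else None,
    context={"owner": "market-risk"},
)
fabric.register(
    "ledger",
    lambda cid: {"balance": 500_000} if cid == "cpty-9" else None,
)
view = fabric.unified_view("cpty-9")  # one contextualized view, two sources
```

The point of the sketch is the shape, not the scale: consumers query one interface, while each source keeps its own storage, ownership, and context metadata.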

Combining a data fabric with modern process orchestration drastically improves agility and control over the data, process, and workflow management that proper stress testing requires. Consider the use case of one global banking enterprise with more than 250,000 staff operating in 80 countries. This organization used a data fabric and low-code solution to quickly coordinate 30 processes with over 12,500 tasks and 3,000 data elements globally. The bank ultimately delivered successful CCAR stress tests in 13 weeks from the start through to live production use.

Outcomes like this not only lead to better performance on stress tests, they also help banks evolve and strengthen their resolution plans. Commonly known as living wills, resolution plans are documents that detail how a bank would unwind its assets in the event of failure, and they are legally required by the Dodd-Frank Act. Strengthening living wills is top of mind for many banks in light of recent regulatory findings showing weaknesses in such plans. The good news is that better technologies and processes can at once fortify living wills and reduce the chance that banks will ever need to use them in the first place.

Conclusion

Stress testing is a valuable and rigorous exercise. Financial institutions must implement solutions that enable more agile and effective stress test management and reporting to regulatory authorities. To better manage data, workflows, and process automation at scale, a unified platform that includes data fabric and low-code tooling can help banks optimize both their stress testing capabilities and overall operations.