
Productionizing Allowance: An Engineering Approach for an Analytical Process

FEATURED CONSULTANTS
Adam Gradzki
Abdul Mallick

CAPABILITIES COVERED
Data Engineering
Business Process Automation
Model Implementation
Sarbanes-Oxley (SOX) Act Compliance

THE SITUATION
Each quarter, three analyst teams involving 20 people at a Top 10 bank spent four weeks estimating the company’s consumer loan impairment allowance, which represents capital to be set aside for expected losses on the bank’s $100 billion loan portfolio.

At the beginning of the manual process, the analysts would collect and input source data into spreadsheets, which often led to missing data fields and mistakes that caused inaccurate calculations. Every cycle, they spent weeks pulling the most recent portfolio performance data, running allowance models and generating reports – leaving limited time and bandwidth for higher-value analysis and insight generation. The lead time for execution resulted in a lag between the availability of new data and an updated model, which meant leadership used outdated assumptions to determine their customer credit policy.

Recognizing that its cumbersome manual processes, lack of documentation and control failures were driving audit findings and potentially resulting in holding tens of millions of dollars more allowance than necessary due to uncertainty in the best forecast, bank leadership needed help improving their process.


CHALLENGE
Reimagine execution of loss forecasting models to produce an accurate impairment allowance that remains well-governed while conserving capital for reinvestment elsewhere.

OUR APPROACH
Leveraging open-source programming languages like Python, we began by transitioning the set of Excel worksheets used to generate the impairment allowance into a code-based framework. We translated all calculations into code and organized them into a set of interconnected yet loosely coupled modules that can each run independently, enabling interoperability, experimentation and quick modifications at every stage of the impairment process.

Next, we implemented a simplified configuration file that sets the parameters of each run in plain English. This gave a single analyst total control over the inputs, assumptions and other parameters used for execution, without requiring any coding skill to update them and produce new results. Each module’s plug-and-play nature allowed analysts to execute quickly, while retaining the flexibility to run a single module to investigate anomalies and to explore scenarios and insights within each process phase – from data, to modeling, to allowance calculation.
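The case study does not publish the framework’s code, so the following is only a minimal sketch of the configuration-driven, plug-and-play pattern described above; the module names, configuration keys and parameter values are hypothetical, and Python’s standard-library configparser stands in for whatever configuration format the team actually used.

    # Illustrative sketch only: module names, config keys and values are invented.
    import configparser

    # A plain-English run configuration an analyst can edit without writing code.
    RUN_CONFIG = """
    [run]
    as_of_date = 2021-03-31
    portfolios = credit_card, auto
    modules = data_prep, modeling, allowance_calc

    [assumptions]
    unemployment_rate = 0.061
    recovery_lag_months = 9
    """

    def data_prep(params):
        """Pull and validate portfolio data (placeholder logic)."""
        return {"as_of": params["as_of_date"], "rows_loaded": 1_000_000}

    def modeling(params):
        """Run the loss forecasting models (placeholder logic)."""
        return {"expected_loss_rate": 0.02 + 0.1 * float(params["unemployment_rate"])}

    def allowance_calc(params):
        """Turn model output into an allowance estimate (placeholder logic)."""
        return {"allowance_usd": 100e9 * (0.02 + 0.1 * float(params["unemployment_rate"]))}

    # Loosely coupled modules share one calling convention, so analysts can run
    # the full chain or any single module to investigate anomalies or scenarios.
    MODULES = {"data_prep": data_prep, "modeling": modeling, "allowance_calc": allowance_calc}

    def run(config_text):
        cfg = configparser.ConfigParser()
        cfg.read_string(config_text)
        params = {**dict(cfg["run"]), **dict(cfg["assumptions"])}
        return {name.strip(): MODULES[name.strip()](params)
                for name in cfg["run"]["modules"].split(",")}

    if __name__ == "__main__":
        print(run(RUN_CONFIG))

In this kind of setup an analyst could, for example, change unemployment_rate or trim the modules list and rerun only the allowance calculation, which is the flexibility the modular design is meant to provide.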

In parallel with the Excel-to-code transition, we partnered with the modeling teams to rationalize and optimize their business logic while also addressing the quality of incoming data. We repointed data sourcing away from manual inputs to well-controlled SQL-based databases, which greatly reduced quality issues and eliminated manual errors. We also instituted a new data quality monitoring system to identify upstream data anomalies and alert analysts to potential issues before the data is used in the process.
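As a rough illustration of that pattern rather than the bank’s actual implementation, the sketch below pulls a performance snapshot with SQL and applies simple anomaly checks before the data enters the allowance process; Python’s built-in sqlite3 module stands in for the governed production databases, and the table, columns and thresholds are invented.

    # Hypothetical sketch: sqlite3 stands in for the bank's controlled SQL
    # databases; the schema and thresholds below are invented for illustration.
    import sqlite3

    EXPECTED_MIN_ROWS = 100        # assumed thresholds an analyst would tune
    MAX_NULL_FRACTION = 0.001

    def load_portfolio_performance(conn, as_of_date):
        """Pull the latest performance snapshot straight from the database
        instead of relying on hand-keyed spreadsheet inputs."""
        cur = conn.execute(
            "SELECT loan_id, balance, days_past_due, charge_off_flag "
            "FROM portfolio_performance WHERE as_of_date = ?",
            (as_of_date,),
        )
        return cur.fetchall()

    def quality_checks(rows):
        """Return warnings to surface to analysts before the data is used."""
        issues = []
        if len(rows) < EXPECTED_MIN_ROWS:
            issues.append("Row count %d is below the expected minimum." % len(rows))
        null_balances = sum(1 for row in rows if row[1] is None)
        if rows and null_balances / len(rows) > MAX_NULL_FRACTION:
            issues.append("%d records are missing balances." % null_balances)
        return issues

Checks along these lines would feed the alerting step described above, so analysts see upstream anomalies before running the models.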


Figure 1: Simplification of spreadsheet-based process with modular code-based execution


To address governance concerns and ensure the forecast’s accuracy, we partnered with Internal Audit to implement and automate SOX controls to reflect the more streamlined process. We created logging for all inputs, intermediate calculations and final calculations for each model component. To strengthen access and change management controls, we moved all code into GitHub to maintain a centralized, approved version used for production. This enabled code updates to be more closely controlled and highlighted differences from month to month, which could be reproduced and compared for enhanced transparency and auditability.
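A minimal sketch of that kind of audit logging, using Python’s standard logging module, might look like the following; the component name, inputs and calculation are hypothetical, and real controls would log to a governed location rather than a local file.

    # Illustrative only: the logged control points (inputs, intermediate values,
    # final output) mirror the description above, but all names are invented.
    import json
    import logging

    logging.basicConfig(
        filename="allowance_run.log",
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
    )
    log = logging.getLogger("allowance")

    def run_component(name, inputs, calculate):
        """Run one model component and leave an audit trail at every step."""
        log.info("component=%s inputs=%s", name, json.dumps(inputs, sort_keys=True))
        intermediate = calculate(inputs)
        log.info("component=%s intermediate=%s", name, json.dumps(intermediate))
        final = intermediate["expected_loss_rate"] * inputs["balance"]
        log.info("component=%s final_allowance=%.2f", name, final)
        return final

    if __name__ == "__main__":
        run_component(
            "credit_card",
            {"balance": 25_000_000_000, "unemployment_rate": 0.061},
            lambda x: {"expected_loss_rate": 0.02 + 0.1 * x["unemployment_rate"]},
        )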

To speed up the deployment of enhancements and the onboarding of new loan portfolios – and to confirm that changes to the code worked as intended – we built an automated testing pipeline to run a test suite against any change request and ensure no unexpected errors were introduced during deployment. We also automatically exported model results to a web-based Tableau dashboard for visualization and further analysis by downstream analysts. Raw data extracts were then stored in the cloud for ad hoc analysis and insight generation, as needed.

Figure 2: Migration of legacy manual process to engineering-based solution
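The case study does not show the test suite itself, but a regression test in the spirit described above might look like the sketch below; pytest is assumed as the runner, the baseline figures are invented, and in practice the suite would run automatically against every change request before code reached production.

    # Hypothetical regression test: the calculation, baseline values and
    # tolerance are invented; pytest is assumed as the test runner.
    import pytest

    def calculate_allowance(balance, expected_loss_rate):
        """Stand-in for a production allowance calculation under test."""
        return balance * expected_loss_rate

    APPROVED_BASELINE = {
        # portfolio: (balance, loss rate, previously approved allowance)
        "credit_card": (25_000_000_000, 0.021, 525_000_000),
        "auto": (15_000_000_000, 0.008, 120_000_000),
    }

    @pytest.mark.parametrize("portfolio", sorted(APPROVED_BASELINE))
    def test_allowance_matches_approved_baseline(portfolio):
        balance, loss_rate, approved = APPROVED_BASELINE[portfolio]
        result = calculate_allowance(balance, loss_rate)
        # A code change that moves results beyond tolerance fails the pipeline
        # before it can be deployed.
        assert result == pytest.approx(approved, rel=1e-6)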

MEASURABLE RESULTS

• Created a robust, easy-to-use and audit-ready loss forecasting and allowance system that eliminated manual errors and increased forecast accuracy

• Deployed well-documented open source code and integrated logging to satisfy all compliance and regulatory requirements for a critical financial process

• Implemented a consistent software development and change management process that reduced time needed to onboard new portfolios and models by more than 80%

• Leveraged configuration files and parallel execution to reduce cycle time for execution and analysis from one month to one week

• Automated execution to reduce required resources from 20 business, data and quantitative analysts to four business analysts