
Top Considerations in Wholesale Credit Loss with Comprehensive Capital Analysis and Review Model Development


Introduction

Each year, the US Federal Reserve System undertakes the annual Comprehensive Capital Analysis and Review (CCAR) to assess whether the largest bank holding companies (BHCs) operating in the United States have sufficient capital to continue operations in times of financial stress and have the capability to meet their fiduciary duties.

The process started in 2011 and has completed five annual cycles as of 2016. As a guiding principle, CCAR has focused not just on the amount of capital that a BHC holds, but also on the internal practices and policies that a firm uses to determine the adequacy of the amount and composition of capital.1

Banks participating in CCAR include retail-centric firms, global investment banks, and regional banks with a mix of traditional banking and capital markets activities. In all cases, traditional wholesale or retail banking activities figure prominently in firms’ businesses. Reflecting this, CCAR stress testing emphasizes the modeling of wholesale and retail lending, as well as trading activities. In this paper, we discuss preferred practices and emerging trends for CCAR participants and present guidance with a focus on wholesale lending.

CCAR Model Development Framework

Consistent with their stated guidance, supervisors expect institutions to implement a well-defined CCAR model development framework, including both qualitative and quantitative components.2 Supervisors also expect banks to have clear and defensible processes for the estimation of models for stress testing.3 The estimation methodologies and processes should be well supported, transparent and replicable. Institutions should establish a quantitative basis for enterprise-wide scenario analysis, supplemented by qualitative projections, specialist judgment and adjustments for purposes of stressed capital estimation under CCAR scenarios.

Shown in Figure 1 is a generalized modeling framework of loss forecasting and Pre-Provision Net Revenue (PPNR), which focuses on quantitative components. While risk segments such as operational and market risk—in addition to credit risk—are included, certain framework components are common to all. These common components, such as data selection and preparation, portfolio segmentation and the champion vs. challenger model construct (i.e., the leading model in production vs. an alternative model not in production built for testing purposes, respectively), present unique challenges to CCAR participants.

Model Data

Supervisors and CCAR participants generally concur that the use of internal data for estimation of CCAR models is a preferred practice. However, it should be noted that there are several cases in which the use of external data is favored.

Banks using external data should be familiar with the standards for using such data, as these can help satisfy supervisory expectations and deliver the appropriate modeling outcome. For example, there should be a robust mapping methodology which links the external data history to the bank’s current portfolio. This can be done through mechanisms such as matching characteristics by industry, geography or risk rating. If there are material differences in portfolio characteristics or risk profile between internal and external datasets, there should be a transparent and well-documented process for adjusting model output to compensate for these divergences.
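To make the mapping idea concrete, the short sketch below joins a hypothetical external default-history table to an internal wholesale portfolio on industry and risk rating; the table names, columns and values are illustrative assumptions rather than a prescribed standard.

```python
# Minimal sketch (assumed columns and values): mapping external reference data to
# the bank's current wholesale portfolio by matching industry and risk rating.
import pandas as pd

internal = pd.DataFrame({
    "obligor_id": [101, 102, 103],
    "industry": ["manufacturing", "retail", "energy"],
    "risk_rating": ["BB", "B", "BBB"],
})

external = pd.DataFrame({
    "industry": ["manufacturing", "retail", "energy"],
    "risk_rating": ["BB", "B", "BBB"],
    "hist_default_rate": [0.012, 0.035, 0.008],  # hypothetical external history
})

# A left join keeps every internal obligor; unmatched rows surface as NaN and flag
# where the mapping (or a documented adjustment) needs attention.
mapped = internal.merge(external, on=["industry", "risk_rating"], how="left")
print(mapped)
```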

Segmentation Approach

Segmentation according to homogeneity of risk—or grouping exposures according to a common set of risk factors—is generally considered to be a leading industry practice. However, it is still rather common for segmentation to be driven by non-risk factors, such as business unit or product type. Reasons include the way bank business lines are organized and the aggregation of reporting, among other factors. The choice of a segmentation approach can often be driven by data limitations. The complexity of the bank’s portfolio often drives the degree of granularity in the segmentation.

The choice of segmentation plays a key role in CCAR modeling efforts and can inform the choice of modeling methodology itself. As CCAR participants evolve their modeling through successive submissions, we frequently see a migration from portfolio to loan-level models, corresponding with an increase in modeling sophistication, as highlighted in Figure 2. However, note that loan-level models are not always more advanced, as the rationalization of the portfolio should be taken into account.


Figure 1. Quantitative CCAR Framework
[Figure: The framework splits into Loss Forecasting and PPNR. Loss Forecasting covers retail and wholesale credit risk; assets held for sale, held to maturity and securities; operational risk; and market and counterparty risk. PPNR covers Net Interest Income (NII), Non-Interest Revenue (NIR) and Non-Interest Expense (NIE). Common components span data selection, segmentation, and assumptions/sensitivity testing/challenger models, with corporate planning areas providing independent projections that are compared to the aggregated business line results, or vice versa. Methodologies shown include expected loss approaches (estimation methods that capture both security-specific and country-specific performance data for relevant portfolios), rating transition models, roll-rate models, vintage loss models, charge-off models, scalar adjustments, regression models, the Loss-Distribution Approach (LDA), scenario analysis, historical averages, legal exposures, stress scenarios, translating scenarios to risk factor shocks, revaluation methodologies and Profit & Loss (P&L) estimates, and risk mitigants and other assumptions.]
Source: Accenture, October 2016

Figure 2. Model Segmentation Evolution
[Figure: Portfolio-Level Models (initial phase of the modeling lifecycle; limited heterogeneity; the portfolio shows significantly different historical characteristics) evolve to Segment-Level Models and then to Loan-Level Models (regulatory requirement for loan-level analysis; data quality is high and allows for a loan-level model; loan-level attributes are captured for granular forecasts).]
Source: Accenture, October 2016

Modeling Methodology

Under the current guidance, regulators are not taking a prescriptive approach regarding the choice of specific methodologies. This means there are no “standard” modeling techniques for banks to project PPNR or losses for CCAR purposes, due to different business models and degrees of systemic importance. Thus, a bank’s choice of modeling approaches and stress scenarios should reflect BHC-specific factors, such as business strategy, portfolio size, scope of operations, activities, approaches to capital planning, and other considerations. However, regulators do provide supervisory guidance addressing expectations and preferred practices for model development, which we will discuss in more detail.


Modeling Framework

In CCAR, BHCs should focus on risk and finance integration to the fullest extent possible. With this focus, banks can realize the economies of scale that accrue from leveraging CCAR models for purposes such as business-as-usual (BAU) forecasting, and such models also carry a greater degree of credibility in the eyes of supervisors. The early Basel Advanced Internal Ratings-Based Approach (AIRB) guidance introduced the concept of the “Use Test,” with integration aspects designed to include the following:4

• Integration of risk and revenue models.

• Frequent and parallel communication between the model development function and the lines of business.

• Consistency and transparency across processes, methodologies and reporting platforms.

As mentioned above, many models used for stress testing rely on a significant number of assumptions for implementation. The following methods are used to test the robustness of models and to help the modelers, BHC management, the Board of Directors and supervisors identify the assumptions and parameters that can materially affect outcomes:

• Sensitivity Analysis. This is one tool BHCs might want to use to help identify the assumptions and parameters that materially affect outcomes. Sensitivity analysis can also help link core assumptions to outcomes.

• Challenger Models. Using results from different estimation approaches or different sets of macroeconomic drivers as a benchmark is another way BHCs can gain greater comfort around their primary model estimates. Through the comparison of results, the strengths of one approach can help highlight weaknesses in another.

As seen in Figure 3 below, the Stress Testing Architecture is a generalized stress testing modeling framework that illustrates the relationship between individual stress testing models and stress testing execution. This framework highlights various CCAR components and is illustrative of CCAR execution, focusing on a traditional banking business line, such as wholesale or retail lending, along with minimal capital markets activities.

Figure 3. Stress Testing Architecture
[Figure: Inputs (wholesale, retail, commercial and securitized exposures; risk information; asset correlation information; mitigant information; customer information and customer hierarchy; market and trade data; Probability of Default (PD), Loss Given Default (LGD) and rating migration matrix; historical data such as delinquency, charge-offs and recoveries; common hierarchies by Line of Business (LOB) and asset class; GL data) feed a regulatory mapping of all categories. Baseline calculations cover the asset-side and liability-side forecasts, P&L forecast, provisions and reserves, trading and counterparty losses, bad debts plus unrecoverables plus credit losses, retained earnings and distributions, and capital computation/Capital Adequacy Requirement (CAR) determination, producing the baseline P&L and balance sheet and the baseline capital adequacy and CAR computation. Stress scenario calculations cover stress asset and liability balance forecasts, stress provisions and reserves, stress trading and counterparty losses, stress scenario bad debts plus unrecoverables plus credit losses, the stress scenario P&L forecast, the stress pro forma P&L statement and balance sheet, and stress capital computation/CAR determination. Reporting includes the capital plan and policy (for regulatory review), risk appetite monitoring and tracking, and model assumption validation and risk analysis. Data quality checks, validation, gap analysis and redressal apply across credit, market, operational and liquidity risk. Two model types are shown: Model 1, a top-down plus provision-based model, and Model 2, a cash flow-based model.]
Source: Accenture, October 2016


Emerging Trends for Loss Forecast Modeling of Wholesale Portfolios

Through multiple CCAR reporting cycles, BHCs have faced a number of challenges in stress testing their lending portfolios. Among these challenges are data issues, portfolio segmentation, loss estimation methodologies and treatment of sub-portfolios considered immaterial. In this section, we discuss a number of approaches that CCAR participants have taken to address these concerns.

Data Considerations

One recurring challenge for many BHCs is data availability. CCAR stress testing data requirements run the gamut from developmental datasets for individual models to full-cycle production datasets with eventual general ledger (G/L) reconciliation. Contributing factors for data limitations vary based on specific BHCs and their individual business models, but it is safe to say that BHCs will experience limitations for which they need to compensate.

Based on our experience with CCAR banks across a number of business lines, we see some distinct tendencies and challenges arising in wholesale portfolios. These include:

Table 1. Data Considerations for Loss Forecasting in Wholesale Portfolios

Wholesale Portfolio

For wholesale portfolios, due to a low number of defaults or a short time series (5–10 years) with only a single downturn cycle, the following practices are common:

• Internal data is supplemented with external data in building the model.

• External data is used to calibrate models.

• The extent of this mix depends upon the risk profile of the portfolio.

Experience of credit specialists in relation to particular products or customer groups may be leveraged to augment datasets.

Source: Accenture analysis, October 2016

In general, regulators encourage the temporary use of mapping from roughly comparable internal or external sources, or from projections based on econometric models or judgment from specialists. It is the expectation that at some point banks will construct robust internal datasets to support loss forecasting modeling. In addition to internal data, banks have also used external data to build models. Some types of external data providers are listed in Table 2.


Table 2. Vendor Data Sources for Loss Forecasting in Wholesale Portfolios

Third-Party Data Sources
• These vendors offer financial data that spans industries and countries, and includes both the private and public sectors.
• Historical data that is available includes trading and capital markets related fields, as well as company or sector specific financials.
• Vendors generally standardize data fields across trade and regulatory bodies, including accounting (Generally Accepted Accounting Principles-GAAP), International Organization for Standardization (ISO) and Global Legal Entity Identifier (GLEI) standards.

Government Agencies
• The Federal Financial Institutions Examination Council (FFIEC) Central Data Repository’s Public Data Distribution provides financial and structural information for every Federal Deposit Insurance Corporation (FDIC) insured institution and is updated to reflect the most recent data for both prior and current periods. This includes accounting charge-offs or recoveries and revenue/expense line items.

Nationally Recognized Statistical Rating Organizations
• A number of Nationally Recognized Statistical Rating Organizations (NRSROs) offer historical default and recovery data for borrowers, including ratings. Borrowers include corporates, sovereign entities, non-profits and others. The available history varies by NRSRO.

Industry Data Consortia
• Several industry-sponsored organizations, such as the Risk Management Association (RMA), Moody’s Analytics Credit Research Database (CRD) and the Global Credit Data (GCD) consortium, pool various modeling reference data fields across institutions and provide anonymized datasets to their members for modeling and benchmarking purposes.

Source: Accenture analysis of publicly available data, October 2016

Segmentation Approaches

CCAR modeling requires that banks segment portfolios at an appropriate level of granularity. Dimensions of segmentation can include portfolio, product, geography, obligor type, asset class and other factors. Additional considerations include data homogeneity with respect to risk, and whether a chosen level of segmentation aligns with the bank’s business structure. In general, BHCs face a tradeoff between the value of increased model discrimination gained from greater granularity and the cost of achieving that granularity in terms of the increased statistical variation around estimates. However, we see some general themes in banks regarding portfolio segmentation in CCAR modeling, including:

• Banks segment portfolios into homogenous groups by choice of risk drivers (see Table 3) based on their discriminatory power. Granular segments provide more valid estimates as the composition of a loan portfolio changes. However, finer segmentation also results in smaller groups of obligors, which below a certain threshold presents challenges for statistical validation.

• Two driving considerations for segmentation are the different factors relevant to creditworthiness or other dimensions of risk, as well as the varying availability of data in individual segments.

Portfolio segmentation along multiple dimensions can sometimes result in an unwieldy number of combinations of variables, split points and split sequencing. In certain cases, alternative segmentation techniques may be employed. As an example, in wholesale modeling, techniques such as hierarchical clustering can be used to develop a segmentation scheme that does not impose undue complexity. When testing individual combinations for discriminatory value, some CCAR participants use advanced techniques, such as machine learning coupled with specialist judgment. Using these techniques can potentially make the segmentation process more efficient.
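As a purely illustrative sketch of the clustering idea mentioned above (not a prescribed method), the snippet below groups synthetic wholesale obligors into candidate segments with agglomerative clustering on a few assumed risk drivers; the driver names, cluster count and data are hypothetical.

```python
# Illustrative sketch: candidate segmentation via hierarchical (agglomerative)
# clustering on hypothetical risk drivers. Drivers and cluster count are assumptions.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
obligors = pd.DataFrame({
    "leverage": rng.uniform(0.1, 0.9, 500),           # debt / assets
    "interest_coverage": rng.uniform(0.5, 10.0, 500),
    "collateral_coverage": rng.uniform(0.0, 2.0, 500),
})

# Standardize drivers so no single scale dominates the distance metric.
z = (obligors - obligors.mean()) / obligors.std()

# Ward linkage builds a hierarchy of candidate segmentations.
tree = linkage(z.values, method="ward")

# Cut the tree into a manageable number of segments for review by LOB specialists.
obligors["segment"] = fcluster(tree, t=5, criterion="maxclust")
print(obligors.groupby("segment").agg(["mean", "count"]).round(2))
```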

Table 3. Segmentation Schema for Loss Forecasting in Wholesale Portfolios

Risk Drivers

Risk drivers that commonly explain variation in loss rates for wholesale portfolios may include:

• Industry segment.

• Obligor type.

• Obligor financial condition.

• Collateral types and values.

• Lien position or seniority of claim.

• Guarantees or implied support.

Source: Accenture, October 2016


Methodologies for Estimating Losses

BHCs face a number of considerations in modeling losses for wholesale lending portfolios, and particular challenges arise in estimating losses based on scenarios and their associated risk drivers. The selection of modeling methodology should satisfy a number of criteria, such as suitability for portfolio type, materiality and data availability, as well as alignment with chosen risk drivers.

Many of the risk assessment models needed for enterprise-wide stress tests have been in place for some time. In the case of wholesale credit risk, credit-rating systems assign each borrower a risk grade that rank-orders default risk. Historical data are then used to estimate the Probability of Default (PD) for each risk rating, using modeling approaches such as transition matrices or Markov chains. This contrasts with retail credit risk, where credit and behavioral scorecard models directly rank-order borrowers by estimated PD.
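The sketch below illustrates the general idea of a cohort-style transition matrix estimate and the PD per rating read from its default column; the rating scale and observations are hypothetical, and the approach is a simplified assumption rather than a description of any particular bank's system.

```python
# Illustrative sketch: estimating a one-year rating transition matrix by the
# cohort method from a hypothetical panel of year-end ratings, then reading off
# the PD for each rating as the probability of moving to the default state "D".
import pandas as pd

ratings = ["A", "B", "C", "D"]  # "D" = default (absorbing)

# Hypothetical observations: each row is (rating at start of year, rating at end of year).
obs = pd.DataFrame({
    "start": ["A", "A", "A", "B", "B", "B", "C", "C", "C", "C"],
    "end":   ["A", "A", "B", "B", "C", "D", "C", "C", "D", "D"],
})

counts = pd.crosstab(obs["start"], obs["end"]).reindex(index=ratings, columns=ratings, fill_value=0)
counts.loc["D", "D"] = 1                               # default is absorbing
transition = counts.div(counts.sum(axis=1), axis=0)    # row-normalize to probabilities

pd_by_rating = transition["D"]                         # one-year PD per rating
print(transition.round(3))
print(pd_by_rating.round(3))
```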

A major challenge in developing modern credit risk models is incorporating macroeconomic or financial variables. Econometric forecasting models have issues related to regression specification or parameter estimation. Some regression models can only detect a linear association between variables, and are likely to pick up spurious associations driven by common unobserved variables (i.e., “unobserved heterogeneity”) and by outlier observations. Outliers are frequently deleted from the datasets used for estimation, a practice that invites scrutiny from supervisors and independent validators. Finally, standard regression models are challenged to incorporate the phenomenon of regime change (i.e., changing relationships or dependency structures over time).

Loan-Level Loss Models

Loan-level loss models are bottom-up models used by banks to forecast the expected loss of each wholesale loan. The expected loss is calculated for each loan, and the sum of expected losses across all loans provides an estimate of portfolio loss.

PDs may be estimated using logistic regressions that may include both internal borrower-specific characteristics (such as financial ratios or other firm-specific attributes) and macroeconomic variables as predictors. Losses Given Default (LGDs) are sometimes estimated using a regression model (for example, fractional regression or Tobit) at the facility level that may include macroeconomic or financial risk drivers; when granular facility-level data is unavailable, the alternative is to use averages across segments such as collateral type or business unit. One nuance of LGD in a CCAR context is that, because the forecast spans nine quarters, a mechanism is needed to distribute losses across those quarters, so the treatment is not like the present value of cash flows under the AIRB approach to LGD. Exposures at Default (EADs) may also be estimated using regression techniques; in principle, EAD models may also include macroeconomic or borrower-specific variables, although in practice these approaches tend to be less developed than for PDs or LGDs.5 In the case of term loans, EAD equals the principal balance at default, including currently undrawn commitments. In the case of unfunded commitments, Loan Equivalent Quotients (LEQs) may be applied to the undrawn portion of loan commitments; LEQs may be modeled, or developed as averages over credit ratings in situations where granular facility-level data is not available to support a model.6
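A minimal sketch of the loan-level expected loss mechanics described above follows, assuming synthetic data, a constant LEQ and segment-average LGDs; the drivers, parameter values and model choices are illustrative only.

```python
# Illustrative sketch (assumptions throughout): a minimal loan-level expected-loss
# calculation. PD comes from a logistic regression on a hypothetical borrower ratio
# and a macro driver, LGD is a segment average, and EAD is drawn balance plus a
# Loan Equivalent Quotient (LEQ) applied to the undrawn commitment.
# Portfolio loss is the sum of loan-level EL = PD * LGD * EAD.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000

loans = pd.DataFrame({
    "debt_to_ebitda": rng.uniform(1, 8, n),        # borrower-specific driver
    "unemployment": rng.uniform(4, 10, n),         # scenario macro driver
    "drawn": rng.uniform(1e5, 5e6, n),
    "undrawn": rng.uniform(0, 2e6, n),
    "collateral_type": rng.choice(["secured", "unsecured"], n),
})
# Hypothetical default flags used only to fit the illustrative PD model.
true_score = 0.5 * loans["debt_to_ebitda"] + 0.3 * loans["unemployment"] - 6
loans["default"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-true_score))).astype(int)

# PD: logistic regression on borrower and macro drivers.
X = sm.add_constant(loans[["debt_to_ebitda", "unemployment"]])
pd_model = sm.Logit(loans["default"], X).fit(disp=0)
loans["pd"] = pd_model.predict(X)

# LGD: simple segment averages by collateral type (placeholder values).
lgd_by_segment = {"secured": 0.35, "unsecured": 0.60}
loans["lgd"] = loans["collateral_type"].map(lgd_by_segment)

# EAD: drawn balance plus LEQ * undrawn commitment (LEQ assumed constant here).
LEQ = 0.5
loans["ead"] = loans["drawn"] + LEQ * loans["undrawn"]

loans["expected_loss"] = loans["pd"] * loans["lgd"] * loans["ead"]
print(f"Portfolio expected loss: {loans['expected_loss'].sum():,.0f}")
```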

The primary advantage of loan-level models is the ease of modeling heterogeneity of underlying loans and the interaction of loan-level risk factors. The primary disadvantage of loan-level models is that, while there are a variety of loan-level methodologies that can be used, these models are much more complex to specify and estimate. They generally require more sophisticated econometric and simulation techniques, and model validation standards may be more stringent. Relative to simpler approaches, a greater number of assumptions may need to be defended, and the added complexity may not be justified by superior model performance. Furthermore, such models run the risk of over-fitting datasets, and there may be a limited length and breadth of data to support a hold-out sample and out-of-sample testing as ways to mitigate and control for over-fitting. Another dimension to consider is the cost of implementation, which may be higher in loan-level modeling, for example due to more extensive data requirements or greater computational overhead. Our view is therefore that an institution should not favor loan-level modeling on theoretical grounds alone, but should also consider practical challenges, including suitability to the portfolio under consideration. (For example, top-down modeling may be more acceptable in cases of rather simple products or segmentations, as compared to more complex situations such as structured instruments.)

Pool-Level Loss Models

Pool-level loss models are top-down models used by banks to forecast charge-off rates by wholesale loan type as a function of macroeconomic and financial variables. In most cases, banks use only one to four macroeconomic or financial risk drivers as explanatory variables for these models. These variables are usually determined by the interaction between model development teams and line of business specialists.

The primary advantage of pool-level models has been the ready availability of data and the simplicity of model estimation. The primary disadvantage of pool-level models is that borrower-specific characteristics are generally not used as explanatory variables, except at the aggregate level using pool averages. Modeling challenges include determination of an appropriate loss horizon (for example, for CCAR it is a nine-quarter duration), averaging methodology, data segmentation and loss aggregation, and the annualization of loss rates.
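The following is a minimal sketch, on synthetic data, of a pool-level net charge-off regression on a small set of macro drivers and its projection over a nine-quarter scenario path; the variable names, the scenario and the flat-balance assumption are all hypothetical.

```python
# Illustrative sketch (assumed variables and data): a top-down pool-level model
# regressing quarterly net charge-off (NCO) rates for a wholesale segment on a
# small set of macro drivers, then projecting losses over a nine-quarter scenario.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
quarters = 40  # 10 years of hypothetical history

hist = pd.DataFrame({
    "unemployment": rng.uniform(4, 10, quarters),
    "gdp_growth": rng.uniform(-3, 4, quarters),
})
hist["nco_rate"] = (0.002 + 0.0015 * hist["unemployment"]
                    - 0.001 * hist["gdp_growth"]
                    + rng.normal(0, 0.001, quarters))

X = sm.add_constant(hist[["unemployment", "gdp_growth"]])
model = sm.OLS(hist["nco_rate"], X).fit()

# Nine-quarter severely adverse scenario path (placeholder values).
scenario = pd.DataFrame({
    "unemployment": np.linspace(5.0, 10.0, 9),
    "gdp_growth": np.linspace(1.0, -4.0, 9),
})
scenario["nco_rate"] = model.predict(sm.add_constant(scenario))

pool_balance = 2.0e9  # assumed segment balance, held flat for simplicity
scenario["projected_loss"] = scenario["nco_rate"] * pool_balance
print(scenario.round(4))
print(f"Cumulative nine-quarter loss: {scenario['projected_loss'].sum():,.0f}")
```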


In principle, loan-level models would appear to be superior to pool-level models in terms of forecast accuracy, since they use both macroeconomic factors and loan-specific characteristics as predictors of expected losses. This has been verified by simulation studies, which have found that loan-level models performed better than pool-level models.7

In CCAR loss modeling, BHCs tend to follow a certain sequence of steps for wholesale portfolios. The choice of modeling methodology is informed by certain factors, the individual importance of which varies by bank. As seen in Table 4 below, steps that are commonly followed include:

Table 4. Typical Steps in the Credit Loss Modeling Process

Step 1: Data Availability
Data availability and consistency over time is evaluated to narrow down modeling options. Some examples are:
• Is data available through a credit cycle?
• Are business practices consistently applied? (For example, if there is a change from judgmental PD rating to PD rating scorecards, are ratings guidelines consistent?)
• Are there only static origination variables, or refreshed characteristics?
• Is the data sufficiently reconciled and controlled? (For example, is risk data checked against the G/L?)

Step 2: Modeling Techniques
Various modeling options are considered based on general industry practices and the bank’s portfolio credit risk management practices:
• Loan-level modeling.
• Rating-based transition modeling/Markov chain modeling.
• Top-down loss modeling.

Step 3: Rationalization Process
The appropriate modeling option and segmentation scheme is selected based upon additional business considerations:
• How rich is the default and loss experience?
• Will a less complex model provide a better result because of transparency and flexibility to qualitatively adjust assumptions?
• Are there logical segmentation schemes that clearly reflect differentiated performance?
• Is there evidence of the portfolio mix changing with the cycle such that a more granular segmentation should be considered, even if it introduces more noise because of thinning of the data?

Source: Accenture, October 2016

Page 10: Top Considerations in Wholesale Credit Loss...Portfolio-Level Models • Initial phase of modeling lifecycle. • Limited heterogeneity. • The portfolio shows significantly dierent

10

CCAR participants may choose from a number of different loss modeling methodologies for wholesale portfolios, based upon portfolio type, risk profile, preferences of the institution and other factors. Shown in Table 5 are a number of commonly used loss methodologies, along with a description of their characteristics.

Table 5. Loss Modeling Techniques

Rating Transition Approach
Description:
• Credit ratings are applied to individual obligors or loans, then ratings changes are projected over time for individual scenarios. Projection usually involves the following steps:
1. Convert the rating transition matrix into a single summary measure (e.g., a “z-factor”).
2. Estimate a time-series model linking the summary measure to macroeconomic scenario variables.
3. Project the summary measure over the nine-quarter planning horizon, using the parameter estimates from the estimated time-series model.
4. Convert the projected summary measure into a full set of quarterly transition matrices.
Advantages:
• Leverages readily available commercial scorecard rating systems.
• Can model the rating migration phenomenon that is central to wholesale credit and can be leveraged in economic capital calculations.
Disadvantages:
• A robust and well-calibrated dataset is required, either internal or third party.8
• Relies on ratings as a summary credit risk metric and treats all loans in a rating class uniformly, regardless of other factors influencing credit risk such as financial ratios.

Net Charge-off (NCO) Rate Models/Approaches
Description:
• NCO rate models, or top-down approaches, are often used when more granular loan, vintage or credit rating/score/collateral variables are not available.
• Portfolios are segmented by broad portfolio types, such as wholesale Large Corporate, Commercial Real Estate (CRE), Commercial and Industrial (C&I), Middle Market, and Small & Medium Enterprises (SME).
• Various regression techniques are applied to time series of losses (such as Autoregressive Integrated Moving Average with Exogenous Variables (ARIMAX), Ordinary Least Squares (OLS) and quantile regression) to model the relationship between losses and predictors.
• Sometimes external data is incorporated through panel regression techniques in order to include idiosyncratic factors (such as panel logistic regression using peer bank data).
Advantages:
• Simplicity of model structure and transparency of approach.
• Capable of exploiting powerful time series techniques that are widely used and well understood in the industry.
Disadvantages:
• In ARIMAX, the influence of autoregressive terms sometimes dominates the influence of macroeconomic factors.
• Statistical assumptions are rather more stringent than in other approaches (for example, the ARIMAX stationarity requirements on the variables are often violated).
• Prone to the criticism of not modeling specific risks and of potential overfitting.

Bayesian Model Approach9
Description:
• This can be applied as an extension of other approaches, such as ratings transition or NCO models.
• Key parameters of the model can be depicted as random (such as PD rates or sensitivities to macroeconomic factors), having distributions determined according to prior data information or even specialist-elicited information.
Advantages:
• This approach can be a means of avoiding overfitting, or alternatively a way to incorporate supervisory views.
• The approach lends itself well to sensitivity analysis around model inputs.
Disadvantages:
• The approach can be computationally expensive.
• The Bayesian approach is not well understood by many in the industry.
• Results can be driven by the structure of prior distributions, which are subjective.

Source: Accenture, October 2016
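As an illustration of the rating transition ("z-factor") approach summarized in Table 5, the sketch below shifts a hypothetical through-the-cycle transition matrix quarter by quarter with a one-factor (Vasicek-style) adjustment; the matrix, the correlation parameter and the link between the z-factor and the scenario are assumptions, and in practice the z-to-macro link would come from an estimated time-series model.

```python
# Illustrative sketch (simplified assumptions): shift a through-the-cycle rating
# transition matrix quarter by quarter using a systematic credit factor z that is
# itself tied to a macro scenario variable.
import numpy as np
from scipy.stats import norm

ratings = ["IG", "SG", "D"]   # investment grade, sub-investment grade, default
ttc = np.array([              # hypothetical through-the-cycle quarterly matrix
    [0.97, 0.02, 0.01],
    [0.05, 0.90, 0.05],
    [0.00, 0.00, 1.00],
])

def shift_matrix(ttc, z, rho=0.15):
    """One-factor shift of each row toward worse states when z < 0."""
    shifted = np.zeros_like(ttc)
    for i in range(ttc.shape[0]):
        # Cumulative probabilities ordered from worst state (default) to best.
        cum = np.clip(np.cumsum(ttc[i][::-1]), 1e-12, 1 - 1e-12)
        thresholds = norm.ppf(cum)
        stressed = norm.cdf((thresholds - np.sqrt(rho) * z) / np.sqrt(1 - rho))
        stressed[-1] = 1.0
        shifted[i] = np.diff(np.concatenate([[0.0], stressed]))[::-1]
    shifted[-1] = ttc[-1]  # keep default absorbing
    return shifted

# Hypothetical link between z and a scenario unemployment path over nine quarters
# (in practice this sensitivity would come from an estimated time-series model).
unemployment = np.linspace(5.0, 10.0, 9)
z_path = -0.4 * (unemployment - unemployment[0])   # credit conditions worsen as z falls

state = np.array([0.7, 0.3, 0.0])   # starting rating distribution of the portfolio
for z in z_path:
    state = state @ shift_matrix(ttc, z)
print(f"Cumulative nine-quarter default rate: {state[-1]:.3%}")
```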


Loss Modeling by Portfolio Materiality

From a credit loss perspective, CCAR participants may designate certain portfolios as immaterial. Criteria for immateriality include portfolio segment size relative to the bank’s total portfolio, the presence of certain credit risk mitigants, and other factors such as whether the segment is considered a run-off portfolio. As seen in Table 6, banks have a number of modeling options available for those portfolio segments considered immaterial.

Model Uncertainty

As many BHCs are discovering, the complexity of their stress testing models and the interaction of those models with risk parameters require CCAR participants to objectively quantify model uncertainty. That uncertainty can be measured both across portfolios and for risk parameters themselves. Estimation of uncertainty is important both for purposes of risk mitigation and as a component of performance monitoring. Therefore, the key consideration for banks should be how to quantify model uncertainty and adjust model performance through both qualitative and quantitative means. Table 7 presents some adjustment techniques for consideration.10

Table 6. Loss Modeling Approaches for Immaterial Segments

Scalar Adjustments
• Scalars can be used to adjust a portfolio loss estimate under a baseline scenario upward for stress scenarios.
• Scalars are calibrated based on a combination of historical performance, the ratio of modeled stressed losses to baseline losses estimated for other portfolios, or specialist judgment.
• Use of scalars implies limited transparency and no sensitivity to portfolio changes or risk drivers.

NCO Models
• NCO models can be used as either a primary loss-estimation model or as a benchmark model.
• Used to estimate a statistical relationship between NCO rates and macroeconomic variables at a portfolio level; they often include autoregressive terms (i.e., lagged NCO rates as explanatory variables).
• Typically, these models cannot capture variation in sensitivities to risk drivers across important portfolio segments or account for changes in portfolio risk characteristics over time.

Source: Accenture, October 2016

Table 7. CCAR Modeling Quantitative Adjustment Buffer Overlays vs. Expert Adjustment Buffers

Quantitative Adjustment Buffer Overlays
• Overlays are derived based on sensitivity analysis and judgment, and the adjustment of model output to reflect behaviors that cannot be captured by the model.
• In order to adjust for uncertainties, overlays are applied at the model level for loss estimates.
• Example: A bank takes model output from a range of challenger models in order to get an expected range of potential outcomes, which can be benchmarked against statistical confidence bounds of the champion model.

Judgmental Adjustment Buffers
• Some banks apply an additional buffer, or a catch-all buffer, to account for model uncertainties.
• As per regulatory guidance, a judgmental adjustment buffer should not be seen as an alternative to model overlays.
• Example: A bank looks at peer data in order to inform the judgment applied to make the qualitative adjustment.

Source: Accenture, October 2016
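A minimal numeric sketch of the scalar-adjustment option in Table 6 follows, with all dollar amounts hypothetical: the scalar is calibrated as the ratio of modeled stressed to baseline losses on a comparable modeled portfolio and applied to the immaterial segment's baseline loss.

```python
# Illustrative sketch (hypothetical numbers): a scalar adjustment for an immaterial
# segment, calibrated from a comparable modeled portfolio.
modeled_portfolio_baseline_loss = 40.0e6
modeled_portfolio_stressed_loss = 95.0e6
stress_scalar = modeled_portfolio_stressed_loss / modeled_portfolio_baseline_loss

immaterial_segment_baseline_loss = 1.5e6
immaterial_segment_stressed_loss = stress_scalar * immaterial_segment_baseline_loss
print(f"Scalar: {stress_scalar:.2f}, "
      f"stressed loss estimate: {immaterial_segment_stressed_loss:,.0f}")
```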


Model Assumptions and Limitations

In conducting CCAR stress testing exercises, banks inevitably make modeling assumptions that are informed by their business activities and overall strategy. For example, in CRE stressed loss modeling, a bank facing a dearth of historical data may assume Loan-to-Value (LTV) or Debt Service Coverage Ratio (DSCR) default threshold triggers based upon LOB specialist judgment. Such assumptions can significantly impact model effectiveness, to the extent that the resulting model output differs from what is realized in practice. As a result, banks find it useful to mitigate model risk by quantifying the effects of assumptions and limitations, as shown in Table 8.

Table 8. Key Considerations for Quantifying Assumptions and Limitations

• Consistency with economic scenarios.

• Conservatism and the consideration of adjustments to account for model weaknesses.

• Comprehensive documentation of specialist panel discussions.

• Reasonableness from a business perspective.

• Use of benchmark data to the extent possible.

• Testing of assumptions and quantifying the impact of these on model output.

Source: Accenture, October 2016

Sensitivity Analysis

CCAR participants are finding that sensitivity analysis is an important tool for testing the robustness and stability of stress testing models. In sensitivity analysis, a model’s output is evaluated by changing individual input factors to understand the model’s dependency on these individual factors. Sensitivity analysis can be used as part of a bank’s champion/challenger process, a key part of the model governance and management framework (see Table 9).

Table 9. Uses of Sensitivity Analysis

• Model validation testing.

• Quantifying a model risk buffer.

• Demonstrating the conservatism of model assumptions.

Source: Accenture, October 2016
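As a simple illustration of the idea described above, the sketch below runs a one-factor-at-a-time sensitivity analysis over a stylized nine-quarter loss calculation; the model form and parameter values are assumptions chosen only to show how input perturbations map to changes in the loss estimate.

```python
# Illustrative sketch (hypothetical model and parameters): one-factor-at-a-time
# sensitivity analysis, perturbing individual assumptions of a stylized loss model
# and recording the impact on the nine-quarter loss estimate.
def nine_quarter_loss(pd_stress_multiplier=1.5, lgd=0.45, balance=2.0e9,
                      baseline_quarterly_pd=0.004):
    """Stylized loss model: stressed PD applied to a flat balance for nine quarters."""
    stressed_pd = baseline_quarterly_pd * pd_stress_multiplier
    return stressed_pd * lgd * balance * 9

base = nine_quarter_loss()
shocks = {
    "pd_stress_multiplier": [1.2, 1.8],
    "lgd": [0.35, 0.55],
    "baseline_quarterly_pd": [0.003, 0.005],
}
for param, values in shocks.items():
    for v in values:
        delta = nine_quarter_loss(**{param: v}) - base
        print(f"{param} = {v}: loss changes by {delta:+,.0f}")
```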

Challenger Models

For BHCs, champion/challenger frameworks are important for model governance, delivering model robustness, usability and long-term performance. In this context, challenger models are a critical source of performance benchmarking and range from internal models to third-party offerings. As shown in Table 10, banks should observe some general preferred practices in creating and applying challenger models.

Table 10. Key Considerations for Challenger Models

• Appropriateness of chosen methodology.

• Alternative estimation techniques or different risk factors considered.

• Comparability of model output.

• Use of a common reference dataset.

• Rigorous model testing and meaningful metrics (such as the Akaike Information Criterion (AIC) or the Bayesian Information Criterion (BIC) measures).

Source: Accenture, October 2016
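To illustrate the benchmarking metrics mentioned in Table 10, the sketch below fits a hypothetical champion and challenger NCO-rate specification on a common synthetic dataset and compares their AIC and BIC; the variables and data are assumptions.

```python
# Illustrative sketch (hypothetical data): comparing champion and challenger
# specifications on a common reference dataset using AIC/BIC.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 40
data = pd.DataFrame({
    "unemployment": rng.uniform(4, 10, n),
    "gdp_growth": rng.uniform(-3, 4, n),
    "bbb_spread": rng.uniform(1, 6, n),
})
data["nco_rate"] = (0.002 + 0.0015 * data["unemployment"]
                    - 0.001 * data["gdp_growth"] + rng.normal(0, 0.001, n))

champion = sm.OLS(data["nco_rate"],
                  sm.add_constant(data[["unemployment", "gdp_growth"]])).fit()
challenger = sm.OLS(data["nco_rate"],
                    sm.add_constant(data[["unemployment", "bbb_spread"]])).fit()

for name, fit in [("champion", champion), ("challenger", challenger)]:
    print(f"{name}: AIC={fit.aic:.1f}, BIC={fit.bic:.1f}")
```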

Model Monitoring

As CCAR modeling methodologies evolve, banks find it increasingly important to have a robust and effective model monitoring process in place. In doing so, banks can more readily identify models in the stress testing performance chain that may need updating, or to which overlays need to be applied. Typical reasons for “model drift” include changes in a bank’s underlying portfolio, or industry-wide borrower trends that may result in calibration drift. Banks benefit from automating the monitoring process to the extent possible, as doing so tends to streamline governance and reduce errors from manual activities. Model monitoring considerations may also vary depending on bank size.


Industry Trends and Preferred Practices

Based upon Accenture’s experience with a wide range of clients, the following are preferred practices that pertain specifically to CCAR.

Table 11. CCAR Modeling Industry Trends and Preferred Practices

Regulatory Guidance
• It is a preferred practice to be in compliance with Basel Committee on Banking Supervision (BCBS) principles of prudential supervision and guidance.
• It is preferable to be proactive regarding the pace of regulatory change rather than just being compliant.

Governance and Organization
• Banks should establish a robust data governance and control framework.
• It is important to find the right balance between short-term, regulatory-driven tactical projects (such as remediation of Matters Requiring Attention) and mid- to long-term strategic initiatives (for example, risk technology architecture).
• Process workflow should be streamlined from data sourcing to submission of regulatory reports.
• Documentation of a wide range of business practices should be completed in detail to facilitate compliance.

Infrastructure and Technology
• It is key to use robust analytics tools to make sense of expansive data.
• Banks should integrate and centralize finance and risk data management and regulatory reporting.
• Automation of regulatory report generation is the preferred practice.
• It is advisable to use regulatory reporting and compliance data for making business decisions.

Source: Accenture, October 2016


Conclusion

In this paper, we have reviewed prevalent and preferred practices in CCAR wholesale loss modeling, emerging trends in this space and supervisory expectations, and we have presented guidance for CCAR participants. We have focused on business lines found in various types of CCAR banks, including global investment banks and regional banks with a variety of lending activities.


References

1 “Principles for sound stress testing practices and supervision,” Basel Committee on Banking Supervision, May 2009. Access at: http://www.bis.org/publ/bcbs155.pdf.

2 Ibid

3 Ibid

4 “International Convergence of Capital Measurement and Capital Standards, a Revised Framework, Comprehensive Version,” Basel Committee on Banking Supervision, June 2006. Access at: http://www.bis.org/publ/bcbs128.pdf.

5 “Measuring LGD on Commercial Loans: An 18-Year Internal Study,” The Journal of the Risk Management Association, May 2004. Access at: http://www.michaeljacobsjr.com/files/JPMC_LGD_Publication_May2004.pdf. “Modeling Ultimate Loss Given Default on Corporate Debt,” The Journal of Fixed Income, Summer 2011. Access at: http://www.michaeljacobsjr.com/files/JacobsKaragozoglu_LGDCorpDebt_JFI_5-11.PDF. “Empirical Implementation of a 2-Factor Structural Model for Loss-Given-Default,” The Capco Institute Journal of Financial Transformation, April 2011. Access at: http://www.michaeljacobsjr.com/files/Jacobs_LGDModel_JFT_3-11_31_31-43.pdf. “An option theoretic model for ultimate loss-given-default with systematic recovery risk and stochastic returns on defaulted debt,” BIS Papers No 58. Access at: http://www.michaeljacobsjr.com/files/Jacobs_LGDModel_BIS_58_257-285_10-11.pdf

6 “Loan Equivalents for Revolving Credits and Advised Lines,” The Journal of the Risk Management Association, May 2001. Access at: http://www.michaeljacobsjr.com/files/Jacobs_Araten_RMA_LEQ_Pub_April_2001.pdf. “An Empirical Study of Exposure at Default,” Journal of Advanced Studies in Finance, Summer 2010. Access at: http://www.michaeljacobsjr.com/files/Jacobs_EmpiricalStudyOfEAD_JASF_I-1_Summer_2010_31-59.pdf. “What do we know about Exposure at Default on Contingent Credit Lines? A Survey of the Literature, Empirical Analysis and Models,” Journal of Advanced Studies in Finance, Summer 2011. Access at: http://www.michaeljacobsjr.com/files/Bagjacobs_2011_ReviewEAD_JASF_Vol2Iss1_3__Summer_Pg26-46.pdf. “Parsimonious exposure-at-default modeling for unfunded loan commitments,” The Journal of Risk Finance, 2012. Access at: http://www.michaeljacobsjr.com/files/BagJacobs_EADModel_12-11_JRF_v13-no1_pp77-94_Jan2012.pdf.

7 “Comparing loan-level and pool-level mortgage portfolio analysis,” Moody’s Research Labs, November 14, 2010. Access at: https://www.moodys.com/sites/products/ProductAttachments/MRL%20WP%202010%2011%201.pdf.

8 “An Internal Ratings Migration Study,” The Journal of the Risk Management Association, April 2004. Access at: http://www.michaeljacobsjr.com/files/JPMC_IRM_Publication_Apr04.pdf.

9 “Stress testing and model validation: application of the Bayesian approach to a credit risk portfolio,” Journal of Risk Model Validation, September 2015. Access at: http://www.michaeljacobsjr.com/files/JacobsKaragozogluSensenbrenner_Bayes_StressTesting_JRMV_vol9no3_pp41-70_Cover.pdf.

10 “Emerging Trends in Model Risk Management,” Accenture, September 2015. Access at: https://www.accenture.com/_acnmedia/Accenture/Conversion-Assets/DotCom/Documents/Global/PDF/Industries_19/Accenture-Emerging-Trends-Model-Risk-Management.pdf.


Copyright © 2016 Accenture All rights reserved.

Accenture, its logo, and High Performance Delivered are trademarks of Accenture.

16-3584

About the Authors

Luther Klein

Luther is a Managing Director and leads the North America Finance and Risk Analytics organization. He brings sizable experience from industry and consulting, having partnered with over 30 financial services companies to deliver projects across analytic modeling (CCAR, credit/market/operational risk, Basel, RWA), governance, validation, operating model design, investment governance, M&A and risk management.

Michael Jacobs Jr., Ph.D., CFA

Mike is a Principal Director, Accenture Risk Analytics. He has over 25 years of financial risk modeling and analytics experience in industry, supervision and consultancy. He specializes in model development and validation for CCAR, PPNR and credit/market/operational risk; Basel and ICAAP; model risk management; financial regulation; and advanced statistical and optimization methodologies. Mike has presented at high-profile industry venues and published in several practitioner and academic journals.

Akber Merchant

Akber Merchant is a Senior Manager, Accenture Finance & Risk. Based in New York, Akber has solid consulting and industry experience in financial services and risk management. He has worked with global and regional financial institutions across North America, Asia and Europe. His work in enterprise risk management, regulatory compliance, liquidity risk management, capital management and analytics helps clients become high-performance businesses while meeting regulatory mandates.

Stay Connected

Accenture Finance and Risk www.accenture.com/financeandrisk

Connect with us www.linkedin.com/groups?gid=3753715

Join us www.facebook.com/accenture

Follow us www.twitter.com/accenture

Watch us www.youtube.com/accenture

About Accenture

Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions—underpinned by the world’s largest delivery network—Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With approximately 384,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Its home page is www.accenture.com.

This document is intended for general informational purposes only and does not take into account the reader’s specific circumstances, and may not reflect the most current developments. Accenture disclaims, to the fullest extent permitted by applicable law, any and all liability for the accuracy and completeness of the information in this document and for any acts or omissions made based on such information. Accenture does not provide legal, regulatory, audit, or tax advice. Readers are responsible for obtaining such advice from their own legal counsel or other licensed professionals.