Why CECL Makes 2019 NOW For Community Banks

03/22/2016

Peter Cherpack - CEO of Ardmore Fintellix, Executive VP & Senior Director of Ardmore Banking Advisors

2019 is Sooner Than You Think - The Time is NOW to Begin Gathering Your CECL Data

The recent announcement that the new CECL rules for the ALLL will be implemented for most banks in 2019 should be a wake-up call for community banks. But 2019 feels like a long way off - surely there is plenty of time to learn the new rules and put new procedures in place, so why worry? That is the reaction of many community bankers to the CECL announcement.

But this is precisely the wrong way to think, because in many ways 2019 is closer than it appears.

The new CECL rules are proactive and attempt to tie reserves more closely to actual loss cycles. If anything was learned from the Great Recession, it was that simply looking back at historical losses to project future losses is a futile exercise.

FASB’s new rules for creating the provision for loan losses are built around projecting expected losses over the life of each loan, based on loan risk characteristics and a “reasonably supportable” macro-economic projection of the future. The reasonably supportable projections rest on correlations between past loss performance and national and regional economic indicators.

These new rules are a major departure from the current process and require a new approach from community bankers. Today the vast majority of community bankers use spreadsheets with relatively simple formulas to perform their ALLL calculations. Under CECL that is unlikely to remain workable, as the complexity of the projections and the need for more detailed pooling and tracking make the job too hard for manual spreadsheets.

New Levels of Complexity Are a Problem for Community Banks

The new rules require a new level of detail in creating ALLL pools, taking into consideration the age of the loans (vintage), the terms of the loans (actual life of loan), and loss accumulation periods (relative position on the loss curve).

When forecasting over the future life of the loan (LOL), loans behave differently based on age and seasoning. Calculations are needed not only for each pool’s origination vintage; separate estimates must also address each vintage by loan age in each future year.

For example, in 2015, a portfolio with an expected life of four years would need ten different annual sub-calculations to project remaining losses across its vintages. All four years of losses for loans originated in 2015 have to be projected, three years of projections for the 2014 vintage (as you would only have actual losses for year one), two years for the 2013 vintage, and one for the 2012 vintage (year four).

Multiply this process by the number of the bank’s asset and risk rating pools, and you have a lot of work to do for each projection. Add the fact that each of these projections needs to include justifiable macro and local economic assumptions, and you have a very complicated process indeed.
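
To make the arithmetic concrete, here is a minimal sketch in Python that enumerates the vintage-by-year projections needed for a single pool with a four-year expected life as of 2015. The loop simply reproduces the counting above; it is an illustration, not a prescribed CECL method.

```python
# Minimal sketch: enumerating the annual sub-calculations needed for one
# pool under CECL. The life and year come from the example above; this
# illustrates the counting, not a prescribed CECL methodology.

LIFE_OF_LOAN = 4      # expected life of the loans, in years
CURRENT_YEAR = 2015   # the year the projection is being made

projections = []
for vintage in range(CURRENT_YEAR - LIFE_OF_LOAN + 1, CURRENT_YEAR + 1):
    age_today = CURRENT_YEAR - vintage  # completed years with actual losses
    for loan_year in range(age_today + 1, LIFE_OF_LOAN + 1):
        projections.append((vintage, loan_year))  # still needs a projection

for vintage, loan_year in projections:
    print(f"Project losses for the {vintage} vintage, loan year {loan_year}")

print(f"Total annual sub-calculations: {len(projections)}")  # 4 + 3 + 2 + 1 = 10
```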

There are other approved ways to calculate CECL, including estimating a probability of default and loss given default over the life of each loan at origination, or a life-of-loan discounted cash flow - but these more sophisticated projections are likely out of reach for most community banks at the time CECL is implemented.
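
As a rough illustration of the PD/LGD alternative just mentioned, the sketch below sums discounted expected losses (surviving balance × annual PD × LGD) over each remaining year of a loan. All of the rates are hypothetical; a real model would derive them from the bank’s own history and reasonably supportable forecasts.

```python
# Rough illustration of a life-of-loan PD/LGD expected loss estimate.
# All rates are hypothetical; a real model would derive them from the
# bank's historical data and supportable economic forecasts.

def life_of_loan_expected_loss(balance, annual_pd, lgd, years, discount_rate):
    """Sum discounted expected losses over each remaining year of the loan."""
    expected_loss = 0.0
    surviving = 1.0  # probability the loan has not yet defaulted
    for year in range(1, years + 1):
        loss_this_year = balance * surviving * annual_pd * lgd
        expected_loss += loss_this_year / (1 + discount_rate) ** year
        surviving *= (1 - annual_pd)
    return expected_loss

# Example: $500,000 loan, 1.5% annual PD, 40% LGD, 4-year life, 5% discount rate
print(round(life_of_loan_expected_loss(500_000, 0.015, 0.40, 4, 0.05), 2))
```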
 
The Bigger Problem - Credit Data Management Requirements
 
To perform the CECL calculations, a community banker needs data - lots of data. They will need data on their loans, data on their borrowers, and data on the performance of the regional and national economy. They will need more data than ever before, including historical data (5 to 7 years) and possibly transactional data (risk rating change dates, charge-offs, prepayments, etc.).

Traditionally, community banks have not dedicated the attention or resources to the collection, organization and archiving of their credit data that new regulatory rules like CECL and Dodd-Frank stress testing (DFAST) will demand. With scarce resources at their disposal, community bank credit departments have typically focused on gathering data on their borrowers and collateral to make the deals, and then booking just enough information in their core accounting systems to get the loan on-boarded.

This approach ensures that loans start accruing and customer statements are generated in a timely manner, but it does not ensure that many of the data elements that define the loan and borrower’s “risk characteristics” are saved for later use in credit and risk analysis.

This situation was well illustrated in 2007, when the agencies’ guidance on CRE concentration management required banks to track their real estate concentrations. Many banks asked the question: “What’s a concentration?” Most didn’t have the required data in their systems to generate reports by real estate property type, owner occupancy status or LTV percentage. In fact, many still struggle with this problem a full nine years later.

Some community banks turned to manual spreadsheets and internally created Access databases to track loan risk characteristics and demographic coding. Others used ancillary banking systems to track collateral types and other coding for their loan portfolios.

Even those banks that did make a concerted effort to collect more loan and borrower information at booking may be disappointed when they actually try to access and use that data for reporting and analysis. The lack of data integrity and governance rules becomes a significant obstacle.

The lack of clear governance over loan coding standards and the frequent use of “user defined” fields in the core accounting system can play havoc with loan categorization and meaningful reporting. Often it is left to the data entry clerk to “read the tea leaves” of the credit memo, trying to understand key loan characteristics and translate them into core system codes. In some cases loan codes are actually invented by bank operations, or data is keyed into system fields in ways the core vendor never intended. This can make it impossible to generate meaningful reports. Incentives for coding and booking are usually tied to speed of input, not the value of the credit and risk data.

Why CECL Needs Data

The basis for many of the key CECL calculations is creating “cohorts” or “specific pools of loans with like risk characteristics” at as detailed a level as necessary to group loans for expected loss projections.

The old “FAS 5” methodology of simply grouping loans by Fed call code and risk rating (or class of risk ratings) and tracking loss history patterns will not be sufficient for an effective CECL calculation. Instead, more detailed pooling that considers term, prepayments, loss accumulation periods, risk rating changes, vintage and other risk characteristics will be required.
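
As one hypothetical illustration of the difference, the Python/pandas sketch below pools a small invented loan tape the old way (call code and risk rating only) and the more detailed CECL way (adding vintage and term). The column names and pool keys are assumptions, not a prescribed standard.

```python
# Hypothetical illustration of CECL-style cohort pooling with pandas.
# Column names and pool keys are assumptions, not a prescribed standard.
import pandas as pd

loans = pd.DataFrame({
    "loan_id":     [101, 102, 103, 104],
    "call_code":   ["CRE", "CRE", "C&I", "C&I"],
    "risk_rating": [4, 4, 3, 5],
    "vintage":     [2013, 2015, 2014, 2015],
    "term_years":  [4, 4, 3, 3],
    "balance":     [500_000, 750_000, 250_000, 400_000],
})

# Old FAS 5-style pooling: call code + risk rating only
fas5_pools = loans.groupby(["call_code", "risk_rating"])["balance"].sum()

# CECL-style pooling: adds vintage and term so seasoned and newly
# originated loans are not lumped into one life-of-loan loss estimate
cecl_pools = loans.groupby(
    ["call_code", "risk_rating", "vintage", "term_years"]
)["balance"].sum()

print(fas5_pools, cecl_pools, sep="\n\n")
```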

Why can’t bankers continue to use the old call code/risk rating pooling method? Under CECL, each pool will generate expected losses for the life of the loans in it (LOL) - and characteristics like loan type, term and vintage will be key to creating an accurate LOL loss estimate, as illustrated earlier in this paper.

Preliminary CECL model testing has already shown that lumping seasoned loans of the same loan type and risk rating together with newly originated loans can materially change the estimated loss amounts. If a bank is not detailed enough in its pool definitions, it is likely to pay the price in higher reserve amounts.

Another key data need is consistent and thorough historical data by cohort or pool. A major component of the CECL loss projection is how the cohort or pool of loans has performed relative to local and national macro-economic factors like unemployment, GDP and housing starts. To effectively build correlations with these historical macro-economic factors, bankers will have to analyze how their loan pools have performed through more than one market cycle - commonly considered to be 5 to 7 years.

Creating meaningful correlations is very important, because without supporting data the loss projections are severely limited. A bank with limited historical data for correlations will be forced to revert to a straight historical loss pattern, which may prove less favorable when calculated.
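
Here is a minimal sketch of what building such a correlation might look like, using invented annual figures; a real analysis would use 5 to 7 or more years of the bank’s own loss history against published economic series.

```python
# Hypothetical sketch: correlating a pool's annual loss rates with a
# macro-economic series (here, unemployment). The figures are invented
# for illustration; a real analysis would use the bank's own loss
# history and published economic data across a full market cycle.
import numpy as np

years             = [2009, 2010, 2011, 2012, 2013, 2014, 2015]
pool_loss_rate    = [0.031, 0.027, 0.022, 0.016, 0.012, 0.009, 0.008]
unemployment_rate = [0.093, 0.096, 0.089, 0.081, 0.074, 0.062, 0.053]

corr = np.corrcoef(pool_loss_rate, unemployment_rate)[0, 1]
print(f"Loss rate vs. unemployment correlation: {corr:.2f}")
```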

External Auditors Driving Data Needs

Recently, many of the large external audit firms have demanded better justification and transparency around the ALLL calculations, specifically related to qualitative and environmental (“Q&E”) factor adjustments. Bankers are required to produce empirical data to support assumptions, along with auditable trails back to source data systems.

The Public Company Accounting Oversight Board (PCAOB) has already issued papers outlining new standards such as “Auditing Estimates and Fair Value Measurements” and “The Auditor’s Use of the Work of Specialists.” It has yet to outline details around expectations for a CECL audit, but it is reasonable to expect that more justification, support and data detail will be required.

Under CECL, loan originations create immediate loss expectations, so additional detail will be required to ensure that the factors behind those loss expectations are identified and tracked, including, but not limited to: appraisals, LTV ratios on collateral, and underwriting analysis factors. This is not an area that external auditors heavily scrutinize today.

It is anticipated that evolving audit standards will drive additional data needs for disclosures and justifications, and that the regulatory agencies will also be involved. With this uncertainty and these undefined requirements as a backdrop, bankers have to consider that the data they think they need now for CECL may not address all future needs as best practices develop between now and 2019. Limiting data collection now could cause a bank pain later, when these rules and requirements are fully refined.

Get Started With Your Data Now: 2019 is Around the Corner

Now is the time to get your credit data management program defined and the remediation process started. Even if only five years of data is needed to support CECL, you would still need detailed, thorough loan and borrower data dating back to 2014. Clearly, banks that start their efforts in 2017 or 2018 will be hamstrung compared to those that start earlier.

A bank with significantly higher loan loss reserves due to inadequate data to support a reasonable CECL estimate will clearly be at a competitive disadvantage. And without adequate data to support CECL projections, external audits will demand significant additional research and justification.

An effective and efficient credit data management program starts with the identification of what credit data is captured in the bank’s automated systems and databases, and how thorough, consistent and accurate that data is. A credit data assessment or audit project can be managed by experienced internal staff or a qualified vendor.

The assessment project should be focused on data items that are likely to influence CECL projections, as well as more standard coding practices like industry, loan type, collateral type and risk grade. (While data requirements are not clearly defined yet, many industry experts have published suggested lists of data likely to be required. Interested parties can contact Ardmore Fintellix for an example list if desired.)

Once the assessment is complete, a gap analysis should reveal where key data is missing, inconsistently coded or inaccurate. From the identified gaps, a road map can be put together to address short-term and longer-term needs, with an action plan for how to proceed. A simple completeness check along the lines sketched below can be a useful starting point.
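
A minimal sketch of such a completeness check, assuming a hypothetical loan tape extract; the file name, column names and the list of CECL-relevant fields are illustrative only.

```python
# Hypothetical sketch of a simple credit data gap analysis: for each field
# a CECL calculation is likely to need, measure how complete the bank's
# loan tape actually is. File and field names are illustrative assumptions.
import pandas as pd

loans = pd.read_csv("loan_tape.csv")  # extract pulled from the core system

cecl_fields = ["origination_date", "maturity_date", "collateral_type",
               "risk_rating", "ltv", "owner_occupied", "charge_off_date"]

for field in cecl_fields:
    if field not in loans.columns:
        print(f"{field}: MISSING from extract")
    else:
        pct = loans[field].notna().mean() * 100
        print(f"{field}: {pct:.0f}% populated")
```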

For example, the entire loan booking process should typically be reviewed to identify the criteria used for credit coding, the bank’s practices around which codes are used, and how they are reviewed for quality control.

The bank may need to obtain historical information that is no longer available to it online via the core. Banks should contact their core vendors about getting access to the historical data required to support CECL, along with the potential for future downloads to support the CECL process on an ongoing basis.

Long-term solutions should include building or buying some type of credit data warehouse or credit data mart where key data items can be organized and maintained outside of the core system. This should include data from financial spreads, charge-offs and collections, and other off-core bank systems. The data mart can be standalone or part of a holistic CECL/ALLL automation system solution.
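
As a minimal sketch of what a standalone credit data mart table might look like, the example below uses SQLite for illustration; the fields shown are assumptions, and a production data mart would be far richer and likely built on a commercial platform.

```python
# Minimal sketch of a standalone credit data mart table, using SQLite for
# illustration. Fields are examples of off-core data worth retaining;
# a production data mart would be far richer.
import sqlite3

conn = sqlite3.connect("credit_datamart.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS loan_history (
    loan_id          TEXT,
    as_of_date       TEXT,     -- monthly snapshot date
    balance          REAL,
    risk_rating      INTEGER,
    collateral_type  TEXT,
    ltv              REAL,
    charge_off_amt   REAL,     -- from collections, not the core
    PRIMARY KEY (loan_id, as_of_date)
)
""")
conn.commit()
conn.close()
```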

A Capital Idea

Even with help from internal projects and capable vendors, there is much work to do, and the longer a community bank waits, the more difficult and expensive it will be to address its inevitable data shortcomings. A banker does not want to be scrambling in 2018 for the portfolio data required for 2019’s CECL calculation.

Of key concern is that banks will need to be running CECL calculations in 2017 and 2018 to simulate the effect of the new models on the reserve and gauge the potential impact on their capital levels. Regulators consider good capital planning by community banks a key best practice in proactive risk management.

Capital planning is a long-term process, with a typical time horizon of at least nine quarters. FASB has determined that there will be a one-time capital hit resulting from the final transition to CECL for the ALLL. Amounts of 30 to 50% above the current ALLL have been discussed.
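
For a sense of scale, the back-of-the-envelope sketch below applies that 30 to 50% range to an invented balance sheet; the figures are purely illustrative.

```python
# Back-of-the-envelope sketch of the one-time transition impact described
# above: a 30-50% increase over the current ALLL, taken against capital.
# Balance sheet figures are invented for illustration.

current_alll  = 10_000_000   # hypothetical current reserve
tier1_capital = 100_000_000  # hypothetical capital base

for uplift in (0.30, 0.50):
    increase = current_alll * uplift
    print(f"{uplift:.0%} uplift: reserve +${increase:,.0f} "
          f"({increase / tier1_capital:.1%} of capital)")
```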

A bank will need to know its likely capital hit from the transition to CECL more than just nine quarters in advance. Without adequate data to simulate the likely capital impact of CECL, bankers will be “flying blind” in their future capital planning.

It is likely that banks will need to run parallel CECL based ALLL calculations for a period of time to validate their capital planning estimates.

And Now, Is There Good News from CECL?

While the task ahead for community bankers in dealing with CECL and its prodigious data requirements is daunting, it is not without its benefits. In fact, if the only value community bankers were to get out of this credit data management effort were meeting CECL requirements, it would be a great deal of effort for relatively minimal value to the institution.

Instead of treating this data gathering, organizing, cleansing and archiving project as a “tax” to pay for simply being a bank, consider as well the opportunities and value that an accurate, robust source of credit portfolio information creates.

Banking is an information business, and lending is the highest-risk, highest-reward activity a bank can perform. Having better credit and risk information, and using it, can not only increase revenue but also demonstrate advanced risk management practices to the regulators. By creating a validated source of truth for your credit data, you increase efficiency, reduce risk and help justify a business strategy of growth.

By building out a robust, accurate and timely credit data mart for your bank, you create a competitive advantage over riskier, less efficient competitors. And by proactively addressing regulatory concerns, a community bank can reduce its own regulatory burden.

The road to CECL is a long one, and the effort is significant, but it can also bring benefits along the way. However, to maximize the value and minimize the pain, now is indeed the time to start - as more than ever for community banks the future is now. And 2019 is right around the corner.

For more information about CECL, or how Ardmore Fintellix can help you start to prepare for the upcoming changes, please contact Peter Cherpack at pcherpack@ardmoreadvisors.com.