The purpose of measuring interest rate risk (IRR) for any credit union is to capture the potential net interest income exposure of the organization over a range of "severe-but-plausible" interest rate scenarios. To analyze IRR, credit unions typically move rates up and down 300 to 400 basis points in 100-basis-point increments. Directors must then set limits for how much exposure they will tolerate given the severe-but-plausible scenarios, and ALCO committees must manage the organization within the defined limits.
Both regulators and individual credit unions agree with this procedure for managing IRR, but does this "shock" analysis provide meaningful results, or has the time come to re-calibrate the standard? I believe it has. Let me explain.
The key to making IRR analysis meaningful is to measure severe-but-plausible interest rate scenarios, not scenarios that have never occurred and likely never will, particularly when those scenarios produce misleading results. Shocked interest rate scenarios fall into the latter category, and it is time to change.
Since the recession began, the credit union industry has experienced a significant rise in non-maturity deposits along with a transition from time deposits to non-maturity deposits. This change has resulted in a major concentration of non-maturity deposits. As reflected in the table below, non-maturity deposits now represent more than 70% of the industry’s funding. Given this concentration, credit unions must think carefully about how they model these liabilities in a rising rate environment so the modeling assumptions do not distort the results.
When measuring instantaneous shocks, models assume the rate change measured (300 basis points, for example) occurs immediately and persists through the measurement period (three years, for example). Using this methodology today, with funding concentrated in non-maturity deposits, the full increase in the institution’s interest expense hits in month one, while the increase in interest income accrues methodically over the measured period. As a result, many institutions — depending on their asset mix — will show a decline in income in year one (and possibly year two) and an increase in income in year three.
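The timing mismatch described above can be sketched in a few lines of code. Everything here — the balance-sheet size, starting yields, deposit beta, and the assumption that assets reprice evenly over the horizon — is a hypothetical illustration, not data from any actual credit union model:

```python
# Illustrative sketch of an instantaneous +300bp shock: non-maturity deposit
# costs reprice in month one, while asset yields reprice gradually as assets
# roll over. All figures below are assumed for illustration.

SHOCK = 0.03          # +300bp, applied immediately and held
MONTHS = 36           # three-year measurement period
ASSETS = 100.0        # earning assets (simplified: fully deposit-funded)
DEPOSITS = 100.0      # non-maturity deposit funding
BASE_YIELD = 0.045    # starting asset yield (annual)
BASE_COST = 0.010     # starting deposit cost (annual)
BETA = 0.75           # assumed non-maturity deposit repricing beta

def monthly_nii(month: int) -> float:
    """Net interest income for one month under the immediate shock."""
    repriced_share = min(month / MONTHS, 1.0)       # assets reprice methodically
    asset_yield = BASE_YIELD + SHOCK * repriced_share
    deposit_cost = BASE_COST + SHOCK * BETA          # expense jumps in month one
    return (ASSETS * asset_yield - DEPOSITS * deposit_cost) / 12

base_nii_year = ASSETS * BASE_YIELD - DEPOSITS * BASE_COST  # no-shock annual NII

for year in (1, 2, 3):
    nii = sum(monthly_nii(m) for m in range((year - 1) * 12 + 1, year * 12 + 1))
    print(f"Year {year} NII: {nii:.2f} (base {base_nii_year:.2f})")
```

Under these assumptions the simulation shows exactly the pattern the article describes: income falls below the base case in years one and two, then exceeds it in year three once the rate rise is fully priced into the asset side.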
For most institutions, once a rate rise is fully priced into the balance sheet, net interest margins will increase, not decrease, and earnings will improve, not decline. Yet the immediate shock analysis reports the opposite, and the declines in income reported in year one can be significant.
For example, using the immediate shock methodology today, it is not unusual for projected declines in net interest income in the 300-plus-basis-point shock analysis to exceed 10% of base net interest income, which can translate into projected declines in net income of 30% or more.
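The leverage between those two percentages follows from the fact that a dollar lost to NII flows straight to the bottom line, and net income is a much smaller number than NII. Using hypothetical figures (not drawn from the article) where net income is roughly a third of NII:

```python
# Hypothetical illustration of how a 10% NII decline becomes a ~30% net
# income decline. Both dollar figures below are assumed for the example.
nii = 10_000_000          # base net interest income
net_income = 3_300_000    # base net income after fees, expenses, provisions
nii_decline = 0.10 * nii  # a 10% NII drop flows straight to the bottom line
pct_hit_to_net_income = nii_decline / net_income
print(f"{pct_hit_to_net_income:.0%}")  # roughly 30%
```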
If an institution really thought net income could decline in the coming year by 30% or more given a rise in rates, it would definitely take action to reduce this exposure. But because everyone knows the immediate shock analysis is flawed, no one takes the potential outcome seriously. Instead of taking actions to reduce risk, they simply increase their policy limits to stay in compliance. Whose interest do such actions serve?
If credit unions cannot rely on measured results, then their confidence in the process declines precipitously. Actions taken by boards of directors and ALCO members become a regulatory exercise as opposed to a useful management tool, which is the primary purpose of IRR management. If management relied on calculated results and took action to reduce risk, institutions would need to either decrease the maturity of their assets or increase the maturity of their liabilities. Either case would negatively affect both current and future earnings, and for what purpose?
Just as financial institutions no longer rely on gap reports to measure their interest rate risk, the time has come to retire the immediate shock analysis as the primary interest rate measurement tool. This is not to suggest immediate shock analysis does not provide value in the IRR process, just as gap reports do, but to set board policy and trigger ALCO actions based on a shocked rate environment no longer makes sense. Not only is this rate environment not plausible but also — given the shift in the funding mix of credit unions’ balance sheets — the predicted outcomes provide misleading results.
So what analysis do I recommend boards use to set risk limits and ALCOs use to manage within them? Given the concentration of non-maturity deposits in the industry, a ramped rate rise over the simulation period analyzed — two to three years — is the most sensible way to assess risk.
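The difference between the two approaches can be seen by running year one of the same hypothetical balance sheet under both scenarios. As before, every number here (balance-sheet size, starting rates, deposit beta, even repricing schedule) is an assumed illustration, not an actual model:

```python
# Hypothetical comparison of year-one net interest income under a ramped
# +300bp rise versus an instantaneous +300bp shock. All inputs are assumed.

RISE = 0.03           # +300bp over the full horizon
MONTHS = 36
ASSETS = DEPOSITS = 100.0
BASE_YIELD, BASE_COST, BETA = 0.045, 0.010, 0.75

def nii_month(month: int, ramped: bool) -> float:
    """One month of NII; rates either ramp up evenly or jump in month one."""
    rate_move = RISE * (month / MONTHS) if ramped else RISE
    repriced_share = min(month / MONTHS, 1.0)     # assets roll over gradually
    asset_yield = BASE_YIELD + rate_move * repriced_share
    deposit_cost = BASE_COST + BETA * rate_move   # deposits reprice immediately
    return (ASSETS * asset_yield - DEPOSITS * deposit_cost) / 12

year1_ramp = sum(nii_month(m, ramped=True) for m in range(1, 13))
year1_shock = sum(nii_month(m, ramped=False) for m in range(1, 13))
print(f"Year-1 NII, ramped rise:     {year1_ramp:.2f}")
print(f"Year-1 NII, immediate shock: {year1_shock:.2f}")
```

Because the ramp lets asset yields catch up as deposit costs rise, the year-one earnings hit is far milder than under the immediate shock, which is what makes the ramped scenario a more credible basis for policy limits.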
As for the rate rise analyzed over the period, a 300- to 400-basis-point rise would not be unreasonable, but credit unions should not expect it to occur overnight. The Federal Reserve, which will ultimately dictate the rise in short-term rates, now periodically provides the market a projection of future rate levels. As of its most recent FOMC meeting, concluding June 17, 2015, the average and median estimates for Fed Funds by the 17 voting members for years ending 2015, 2016 and 2017 were as follows:
As you can see from the table above, future increases in rates are currently projected by the Fed to be gradual over the next 2.5 years.
For more information on CNBS, please visit www.cnbsnet.com or contact the author, Robert Colvin at firstname.lastname@example.org. CNBS is a credit union service organization (CUSO) dedicated exclusively to assisting credit unions in developing balance sheet strategies to manage their interest rate risk, investment portfolio, liquidity and capital utilization.