JOURNAL - VOLUME 2

Ghosts in the Machine: The Nemesis of Banking


2019/06/24 - Monocle Journal

What Is The Purpose Of Banking?

When one thinks of the impact of the 2008 financial crisis on banking, one thinks foremost of two things. First, of the enormous erosion in trust of the global financial institutions that once ruled the world. And second, of the plethora of new regulation that has been imposed on these same institutions as a result.

These regulations – imposed both nationally, within specific countries, and internationally, crossing trade and sovereign borders – are the by-product of a deep conviction, within a public baying for blood, that the greed exhibited by the banks in the lead-up to the crisis is no longer tolerable and needs to be reined in. The very idea of free-market capitalism – certainly in the guise it took in the 1990s-style, no-holds-barred investment banking model – is no longer acceptable to western governments, nor to their voters.

Since 2008, banks have lost their shine, proving poor investments for shareholders, frequently struck down by rogue trading, by enormous fines and by internal misconduct that has led to congressional hearings and high-level resignations. At the same time, these institutions have been plagued by a mountain of paperwork and audit requirements that hang like an albatross around their necks, hindering their flight. They are the whipping boy of international commerce – and each time one of the key protagonists makes an effort to once again stand up from the floor to which they have been beaten, they seem to be hit by yet another unexpected blow. In common parlance, names like Goldman Sachs, Morgan Stanley, and even the likes of Wells Fargo and Barclays, no longer evoke wonder and admiration, but rather engender derision.

The battle lines between the 1990s banking model and the state's policing of these institutions appear to have been drawn around the concept of the free market itself. The critical question goes to the integrity of these institutions: whom do they serve? Is their purpose ultimately to return profits to shareholders, or are they rather custodians of something far broader and more meaningful?

As listed entities that are privately owned, it seems logical that banks should serve to maximise shareholder return. Yet, at the same time, they serve a critical purpose within the broader economy, acting as intermediaries in the supply chain of money, and as providers of credit to private households and industrial companies across the world. In recent literature, an intellectual war is being waged on this very question: can the general good be served by a system that focuses almost entirely on shareholder return, especially when the purpose of banking within broader society is to provide growth and equality through the provision of credit and financial intermediation?


The problem lies with efficiency, not the free market

There is danger, however, in over-simplifying the problem of banking. The bind that the industry finds itself in today is not merely a stakeholder or an agency problem. The issues plaguing the world's largest banks will not be solved with the wave of a magic wand. Should the general public and western leaders suddenly and inexplicably return, philosophically, to the extreme version of the free market propagated by Milton Friedman, banks would not actually find themselves much better off. Framing the problems within banking as simply a question of who banks serve – while compelling in its simplicity – is ultimately reductive.

This danger lies specifically in the assumption that banks were running efficiently in the first place, even prior to the 2008 financial crisis. It is commonly understood that Lehman Brothers failed – as an example – because of a short-term liquidity squeeze owing to the toxic assets Lehman had accumulated. Its peer investment banks refused, at a critical moment, to roll over Lehman's short-term borrowings in what is known as the interbank market. These peer banks held back on providing Lehman liquidity because of their perception that Lehman held, at that moment in time, the most toxic of the Collateralised Debt Obligations (CDOs). These complex securities had been issued out of pass-through securities made up of cashflows emanating from the failing, oversold US mortgage market. This, at least, is the most common perception of the cause of Lehman's demise.


The problem of knowing aggregate exposure and risk

What is less well understood – and perhaps far more telling as a fundamental cause – is that Richard Fuld, the then-CEO of Lehman, did not have a proper handle on the aggregate risks faced by the bank as a whole. Specifically, he did not know – nor, for that matter, did anyone at Lehman know – the aggregate exposure or sensitivity of Lehman to a movement, for example, in the Mexican peso against the US dollar.

The executive management of Lehman did not even know – at an overall aggregated level – the sensitivity of the bank's assets and liabilities to potential changes in the Fed funds rate. This may seem a profane remark, and untrue. But consider that the auditors who were called in to wind up the balance sheet of Lehman after its failure estimated that it would take up to a decade to unravel the trades to which Lehman was counterparty.

These trades included not only the now-infamous CDOs and their associated derivatives – the credit default swaps (CDSs) that were meant to protect the CDO-holder from default risk – but also such basic products as the Lehman mini-bond. The mini-bond was a way for Lehman – prior to the repeal of the Glass-Steagall Act under Bill Clinton in 1999 – to raise retail deposit funds for use in investment banking activities and trading, at that time against the law.

By calling a deposit a bond, and by having smaller offshore banks on-sell these bonds in countries as far afield as China and Singapore, Lehman was able to raise significant funding, just as a retail bank would through deposit creation. The difference was that the ultimate holders of the mini-bonds – pensioners in Hong Kong, for example – did not realise the excessive risks being taken with their money. Lehman, after all, was a household name, a hallowed US brand, whose actual business activities were beyond the comprehension of even the financially literate.

On the asset side of its balance sheet Lehman held complex derivatives, bonds, and CDO securities. On the liability side it held substantial short-term money market funding from the interbank market, as well as rafts of cash raised from institutions around the world through activities such as its mini-bond programme.

The actual counterparties to these mini-bonds were even a question of political debate in Hong Kong. Immediately following Lehman's failure, the local Hong Kong banks eschewed any responsibility, calling themselves middlemen. Lehman was bankrupt and could not pay its bondholders, and the pensioners would therefore have to face the full brunt of their losses, irrespective of the fact that these bonds had been sold as safe investments. In the end, the Hong Kong Monetary Authority (HKMA) insisted that the local banks bear the full loss on the bonds, averting the significant social unrest that might otherwise have resulted.

The mini-bonds, however, put the problem of Lehman's management knowing its exposures, and the sensitivities of those exposures to changes in economic conditions, in context. If regulators, law-makers, attorneys, civil rights activists and even auditors cannot easily and accurately identify the market and credit risks of an asset or liability on Lehman's balance sheet – even under the scrutiny of congressional sub-committees incensed by an enraged public post-crisis – then what chance did Richard Fuld have?


The ubiquity of the problem of smart managers not knowing

This may seem a cynical question, for surely that is the purpose of having Harvard-educated top brass – to have the right people in control of this complex world of banking, to steer the ship? But the Lehman story, in one form or another, has continued to repeat itself since 2008, in one crisis after another, in almost all of the world's largest banks.

To be fair to Lehman, and to Richard Fuld, this problem of not knowing aggregate risk exposure or book sensitivity was not, and is not, unique to Lehman. It is ubiquitous across the banking fraternity worldwide. And it is not a function of banks employing people who are not smart enough or enabled enough to calculate these risks and exposures. In fact, ironically, the problem of risk aggregation can sometimes be exacerbated by having too many smart quantitative analysts. There have been several cases in recent years of substantial discord within specific institutions, with managers at odds with each other over the valuation of a particular trade or transaction. Often, there are simply too many quantitative analysts involved in the process of valuing or measuring the risk of a particular exposure.

The single most significant problem in banking is not in fact risk management per se; it is a problem of information management. To put it more bluntly: it is an organisational design and management problem that plagues the industry. Before embarking on a justification of this claim, it is worth touching on the more recent travails that Wells Fargo and John Stumpf, its then-CEO, faced in 2016.


The Wells Fargo fake account debacle

Wells Fargo – as has been widely reported – instituted, at senior levels of management, a policy of compensating its retail banking sales staff specifically for opening accounts and originating loans, as opposed to compensating them against a more nuanced metric, such as risk-adjusted return on capital (RAROC).
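
By way of contrast, RAROC weighs the return a position generates against the economic capital it consumes. A minimal sketch in Python, with invented figures and hypothetical field names, illustrates the metric:

    # Minimal RAROC sketch: revenue less costs and expected credit loss,
    # divided by the economic capital the position consumes.
    def raroc(revenue, operating_cost, expected_loss, economic_capital):
        risk_adjusted_return = revenue - operating_cost - expected_loss
        return risk_adjusted_return / economic_capital

    # A loan earning 12m in revenue against 4m of costs, 3m of expected
    # losses and 25m of allocated capital yields a RAROC of 20%.
    print(raroc(12e6, 4e6, 3e6, 25e6))  # 0.2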

Over several years, Wells Fargo's blunt reward and compensation system led to extensive and broad sales abuse, the most severe of which was the creation by many sales staff of fake accounts in order to inflate their bonuses. The deleterious effects of this practice not only distorted US government statistics on money supply and consumer borrowings, but also – and most importantly – fraudulently distorted the profit, value and growth figures of the firm, published in its annual and semi-annual financial statements.

Not a single person has been convicted of a crime for this practice, and no arrests were made, although many lost their jobs, John Stumpf's included. He was also made to pay back over 75 million dollars in bonuses and salary to compensate shareholders. The question that naturally springs to mind is whether he knew of the practice, or worse, whether he sanctioned it. In the financial industry, it would appear that this question is seldom successfully answered, nor even investigated, given that very few individuals were ever held criminally liable for the practices that led up to the 2008 financial crisis.

In the case of Richard Fuld, it would seem unlikely that he deliberately wished to risk the existence of Lehman Brothers. On the contrary, he was extremely surprised by the call Hank Paulson made to him on the fateful day of the firm’s demise. One suspects that the same is true of John Stumpf. It seems more likely that he did not know of the practice, since it would not make sense to risk his entire career on what really amounted to a marginal distortion in sales across the entire portfolio of one of the world’s largest banking groups.

The reason, however, that he did not know – and the reason that no-one in a regulatory oversight role would have known, had a whistle-blower not led the Department of Justice to the details of the practice – is once again a function of organisational failure, and specifically of information management failure. Given the trillion-dollar balance sheet that Wells Fargo holds, and given its breadth and depth across geographical, divisional and product dimensions, it would be almost impossible for John Stumpf to have noticed a distortion at that level.

That is, unless he had inculcated the correct set of values and principles throughout his executive management. And that is, of course, assuming he took sufficient care and attention to detail in the manner in which he consumed information.


Banks are not financial intermediaries, they are information processors

Rather than viewing banks as financial intermediaries, it is more useful in the context of this argument to view banks as information processing machines. After all, banks do not actually make things. They do not have storage facilities near airports or railways that hold stock. Their balance sheets are rich, but are made up most significantly of loans to individuals and corporations, mere numbers in a system, rather than physical things.

Their most significant costs are people costs and information technology costs. And their key asset – putting aside the obvious financial definition of loans as assets – is the data that represents those assets.

Imagine if the representation of the assets held by a large diversified retailer such as Walmart were not properly managed. Imagine if the labels on products as different as ketchup and prams were incorrect, or misleading, or exaggerated. This would be the precise equivalent of getting the labelling of financial assets wrong. But precisely because financial assets are not bricks-and-mortar assets, they ironically receive less care than they should in their definition and processing.

The problem of labelling, however, is only the beginning of the overall challenge for banks. Banks tend to exist across multiple geographic locations, and they tend to sell products that are quite different from each other, with different associated risks, to very different types of customers. Once again, in a bricks-and-mortar context, it is quite simple to store children's bicycles in a separate warehouse from perishable foods, for example. But in the case of financial assets, a leveraged-finance loan of several billion dollars to a public-private partnership is not in fact particularly discernible from a regular mortgage loan to a single parent in Kentucky, other than in the numbers. And to be even more precise, the distinction lies not in the numbers per se, but rather in the numbers about the numbers – in the codes. This is what is called metadata: the information about the numbers. It is these codes, interpreted by multiple systems across a bank, that actually help managers distinguish between their assets and their liabilities.
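
To make this concrete, consider a minimal sketch of two such exposures as they might be represented in a bank's systems (the field names and codes below are hypothetical). Stripped of their codes, both records are little more than an amount, a rate and a term:

    # Two loans distinguished almost entirely by their metadata codes.
    leveraged_finance_loan = {
        "principal": 2_500_000_000, "rate": 0.065, "term_months": 240,
        "product_code": "LEV-FIN", "counterparty_type": "PPP",
        "asset_class": "SPECIALISED_LENDING", "booking_entity": "NY01",
    }
    mortgage_loan = {
        "principal": 180_000, "rate": 0.045, "term_months": 240,
        "product_code": "RES-MTG", "counterparty_type": "RETAIL",
        "asset_class": "RETAIL_MORTGAGE", "booking_entity": "KY04",
    }
    # It is these codes - not the amounts - that downstream finance and
    # risk systems interpret in order to treat the two assets differently.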

Once the ‘OK’ button on the appropriate screen has been pushed by the appropriate banker – the multitude of legal documentation notwithstanding – the separation of the asset from its representation is, in the case of banking, immediate. The information about the asset is stored in the machine, somewhere. The asset itself – being the expected future cashflows emanating from the loan – is now actually a series of binary numbers, held more than likely in what is known as the ‘cloud’. It is a wonder, then – given the enormous complexity in properly representing information, storing it, and processing it – that there are not many more reported cases of severe information distortion in banking.


The first of five key challenges in banking today

There are five key challenges that banks face in transforming from sales organisations into efficient data processing organisations. The first is that they sell – both on the asset and liability sides of their balance sheet – a very broad range of products.

They sell retail deposits, as an example. But even this seemingly simple product has its own idiosyncratic complexities. These deposits can be of a fixed-term or of a notice-term nature. The term can range from one month to several years, and can include all tenors in between. Banks can and do sell these products with complex characteristics of optionality, and with trigger penalties that kick in should the client wish to exercise early redemption. The risks embedded in this relatively simple product are complex – roll-over risk, early redemption risk – and these risks can differ by customer, as well as by the customer's circumstances.
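
A minimal sketch of how even this 'simple' product might be represented makes the point (the fields below are hypothetical):

    # Hypothetical record for a retail term deposit, showing how tenor,
    # optionality and penalties attach to even the simplest product.
    retail_deposit = {
        "balance": 50_000,
        "term_type": "FIXED",              # or "NOTICE"
        "tenor_months": 12,                # one month to several years
        "early_redemption_allowed": True,
        "early_redemption_penalty": 0.02,  # trigger penalty on early exit
        "rollover_probability": 0.85,      # behavioural roll-over risk input
    }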

They also sell corporate loans, which may be backed by physical or financial collateral – or may not be collateralised at all, other than by the strength of a corporate customer's balance sheet at a point in time. The collateral that banks hold against loans made to corporates, as well as loans made to individuals, may change in value not only on an annual basis but on a daily basis – such as the value of an equity portfolio over which the bank has taken cession.

In the market of loans called specialised lending, it is the future cashflows generated by the special purpose vehicle that ultimately borrows from the bank – to build a school, or a road, or a prison – that serve as collateral. The value of this collateral in assessing the risks on this type of loan depends on macro-economic forecasts, and on the impact of these predicted economic variables on the likelihood and value of future free cashflow.

The assets and liabilities of a bank are complex not only because they are so broadly different in the nature of their origination, but also because their resultant cashflows and the value of their collateral must be predicted, recorded and constantly reassessed. Whereas in a large industrial firm asset values are often determined simply by market prices, in the case of banks each and every individual loan and deposit needs to be individually valued, both from a financial book value perspective and from a risk perspective.

The complexity and breadth of the range of products that a bank markets and sells, and the differences in the data that represents these assets and liabilities, massively compound the problem of information management. Managing data about transactions – each of which is in itself complex, and whose counterparties are unique and often related parties – is a difficult problem to solve. It is exacerbated by the sheer volume of this data: there may be several hundred million transactions per day passing through any of the larger multinational banks. This is the first problem in banking: the issue of managing and interpreting the complexity of information generated out of product diversity and complexity.


The gap between accounting rules and risk rules

The second major challenge faced by banks is that the accounting rules and the risk management rules governing the valuation and risk values of each and every asset and liability still differ substantially today, and have differed even more substantially in the past.

In the derivative trading market, for example, the value of a derivative – if it is a vanilla product, such as a single-stock equity option – is usually determined simply by calculating its market value, using either reference values in the exchange-traded market or closed-form mathematical solutions such as the well-worn Black-Scholes equation.
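
For a vanilla European equity call, for example, the closed-form value is only a few lines of code – a standard textbook sketch, ignoring dividends:

    from math import exp, log, sqrt
    from statistics import NormalDist

    def black_scholes_call(S, K, T, r, sigma):
        """Black-Scholes price of a European call option (no dividends)."""
        N = NormalDist().cdf
        d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return S * N(d1) - K * exp(-r * T) * N(d2)

    # A one-year at-the-money call on a 100-dollar stock, 2% rates, 25% vol:
    print(round(black_scholes_call(100, 100, 1.0, 0.02, 0.25), 2))  # ~10.87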

Recent accounting rules compel banks to treat the assets and liabilities made up of these derivative products as values that change daily, and whose profits and losses must pass through the income statement – and this treatment must be calculated to the day, using the most recent market data.

In the case of a loan to a corporate, however, the accounting treatment will depend on the intentions of the bank – on why it is holding the position in the first place. If the bank is simply holding a loan position for short-term purposes, expecting the position to change value in its favour, then the position is deemed a trading position, and its accounting treatment should be the same as that for derivatives, i.e. accounted for as profits and losses through the income statement. Whereas, if the intention of the bank is to hold the loan for the full period of its term, then the position should be treated, from an accounting perspective, as held to maturity.

These rules were implemented to prevent the accounting arbitrage that can exist for banks in treating balance sheet positions in ways that amount to distortive reflections to shareholders of their true exposures, and of the risks inherent in these exposures – from a financial perspective, as well as from a liquidity perspective. The bank, however, may not always be able to distinguish easily between the two treatments in terms of its intentions. It may not always know whether it will hold a loan, or a portion of a syndicated loan, to maturity, since that may depend on client behaviour, on changing market conditions, or even on changes in the bank's own aggregate liquidity position. This question of intent adds yet another dimension to the problem of accounting treatment.

On the other hand, the rules pertaining to the valuation of the risks embedded in holding derivative positions are quite substantially different from those governing accounting treatment. Under the 1996 amendment to the Basel rules for capital adequacy, the concept of Value-at-Risk (VaR) was introduced. This metric effectively measures the probability of loss on a portfolio of all market-traded assets and liabilities, under a presumption of future economic conditions, and under the assumption that the values of these derivative positions are not perfectly correlated. Put simply, market risk measurement assumes that there exists a diversification effect in holding a broad range of derivative and underlying positions in a trading portfolio, and this introduces a significant non-linear aspect to the measurement of risk in the change of value of market-traded positions.

The VaR metric has been much maligned since its initial take-up, because it is complex and non-linear and is prone to significant model error. Yet it has been adopted throughout the world of banking in at least one of its three forms: Historical Simulation, Monte Carlo Simulation, and the Variance-Covariance approach. Some banks, and even regulators, have moved away from VaR as a risk measure and have introduced what were meant to be simpler metrics, which in actual fact have their own idiosyncrasies and complexities. These are metrics that require, for example, distinguishing the portion of risk in a single derivative position that is deemed systemic risk from the portion of risk in that same position that is deemed specific risk.
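
Of the three, Historical Simulation is conceptually the simplest: rank the portfolio's historical daily profits and losses and read off a tail percentile. A minimal sketch, with an invented P&L history, illustrates the idea:

    import numpy as np

    def historical_var(pnl_history, confidence=0.99):
        """1-day VaR by historical simulation: the loss that historical
        daily P&L breached only (1 - confidence) of the time."""
        return -np.percentile(pnl_history, 100 * (1 - confidence))

    # Invented history: 500 days of daily portfolio P&L in dollars.
    rng = np.random.default_rng(0)
    pnl = rng.normal(loc=0.0, scale=1_000_000, size=500)
    print(f"99% one-day VaR: {historical_var(pnl):,.0f}")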

To make matters even more complex, post the 2008 financial crisis there has been far greater focus on what is known as Exposure-at-Default (EAD). In the case of an amortising loan made to an individual or a corporation, the exposure at the expected time of default – should default occur – is relatively straightforward to calculate. One simply makes an assumption about the point in time at which default occurs, and then projects the capital outstanding at that point.
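
For an amortising loan, this projection is a standard annuity calculation. A simple sketch, with invented figures:

    def outstanding_balance(principal, annual_rate, term_months, month):
        """Capital still owed on an amortising loan after `month` monthly
        instalments - the EAD if default is assumed to occur then."""
        r = annual_rate / 12
        instalment = principal * r / (1 - (1 + r) ** -term_months)
        growth = (1 + r) ** month
        return principal * growth - instalment * (growth - 1) / r

    # A 200,000 loan at 6% over 20 years, assumed to default at month 60:
    print(round(outstanding_balance(200_000, 0.06, 240, 60)))  # ~169,799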

In the case of facilities set up by the bank on behalf of corporate customers – who can then draw down funds from the bank as they require them, obviously within certain boundaries called limits – the calculation of Exposure-at-Default is far more complex. It depends on a number of factors, including the customer's historical draw-downs on facilities, any changes that may have taken place to the customer's balance sheet, and any delays or cost overruns that may have occurred on the customer's key capital expenditure projects.

In the case of derivative exposures with corporate customers or banks, the situation is even more complex. The exposure that a bank faces on a derivative product needs to be assessed on an ongoing daily basis – as opposed to a monthly or annual basis – since the credit risk inherent in the possibility that the counterparty to a trade will not pay the bank back is only relevant if that trade is presently valued in the bank's favour. Derivatives are often zero-sum games in which the bank and its counterparty are effectively taking a bet on the future value of a particular economic indicator, or on the underlying value of a particular equity or debt index. On any given day, this means that on a particular derivative trade the bank may actually owe money to its counterparty, rather than be owed money.

The credit exposure embedded in a derivative or trading transaction is therefore dependent on what is known as Potential Future Exposure (PFE), which itself depends on making assumptions about future underlying financial and economic indicators and feeding these assumptions into complex non-linear models. No wonder, then, that in the daily operations of a bank, enormous effort and large numbers of highly sophisticated banking staff are required simply to reconcile between these two very different worlds – the world of accounting valuations and the world of risk valuations.
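
A minimal Monte Carlo sketch of PFE for a single long equity forward makes this dependence on assumed future market paths explicit (all parameters below are invented):

    import numpy as np

    # Simulate the underlying at maturity under invented GBM assumptions.
    rng = np.random.default_rng(42)
    S0, K, mu, sigma, T, n_paths = 100.0, 100.0, 0.03, 0.25, 1.0, 100_000
    z = rng.standard_normal(n_paths)
    S_T = S0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)

    # Credit exposure exists only where the trade is in the bank's favour.
    exposure = np.maximum(S_T - K, 0.0)
    pfe_95 = np.percentile(exposure, 95)  # 95th-percentile potential exposure
    print(f"95% PFE at maturity: {pfe_95:.1f}")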

This complexity in the accounting valuation, as well as in the risk valuation, of the products within a bank demonstrates the substantial difference between these types of financial assets and liabilities and the non-financial assets that may be held by an industrial firm. From an information management perspective, the complex relationship between the accounting rules and the risk rules governing value requires banks to constantly, frequently and accurately manage market data, counterparty data, collateral data, and often cashflow data, on behalf of a single trade. And they need to be able to manage this data always in the context of its associated fields – i.e. the correct and appropriate market data needs to be associated not with any trade, but specifically with those trades that require that market data.

In a diversified bank that operates across multiple jurisdictions, the challenge of accurately performing the information management processes required to reconcile the world of accounting with the world of risk management is one of the most significant cost areas within banking. Many banks have attempted to solve this problem either by simplifying their systems architecture or by standardising their processes, but both approaches lead to enormous organisational challenges, and both carry significant disadvantages in terms of flexibility to respond to competition and rapidly changing market forces. To most front-office banking staff, these issues seem major hindrances to performing the sales and trading functions they were hired for. For regulators and auditors, but particularly for the finance and risk professionals who operate the systems and processes that manage the information generated by these trades, this is the heart and soul of the matter. Organisationally, a bank is far more likely to succeed if it performs these functions efficiently than if it focuses primarily on sales incentives.


The absence of a single complete banking platform

This then leads to the third reason that banks are substantially encumbered in the efficiency of their operations. The challenge is that there exists no single standard or system or banking platform that can handle the operational aspects of providing banking products, account for these products using a standard accounting rules engine, and assess the risks inherent in these products using a single risk measurement system.

In general, banks – especially the larger banks, diversified both in product range and across different countries and regions of the world – will possess literally hundreds of systems. Some of these systems will have been implemented as long as forty years ago, such as the basic credit card system, which may well have been written in COBOL.

Often, banks will have different accounting rules engines for different business divisions, as well as different sub-ledgers for each of those divisions. There will exist a raft of what is known as ETL (Extract, Transform, Load) code, which may be written in different programming languages – and there are many – acting as translation layers between the different core banking systems (known as the product systems) and the accounting rules engines. And then there will exist a further, and sometimes even more complex, ETL layer between these accounting rules engines and their respective sub-ledgers and general ledgers.

Naturally, the design of a sub-ledger and a general ledger will be based on transforming the origination of loans and advances on the asset side of the balance sheet, and deposits on the liability side of the balance sheet, as well as all of the subsequent cashflows that result from these originations, into debits and credits. These debits and credits will then be posted to the appropriate sub-ledger and ultimately to the general ledger to particular GL account codes, and this will allow the bank to possess a point-in-time view of its balance sheet.

These postings, however, will effectively aggregate values and data about banking transactions into a limited number of GL codes, which necessarily simplify the overall accounting view of the bank's balance sheet – usually into a few thousand rows of information. From an accounting perspective this is acceptable, and the process can be audited both internally and externally fairly easily, owing to the long tradition of audit standards that exists throughout banking.

The trouble is that the process of aggregation – embedded in complex rules built into the accounting rules engines, into the sub-ledgers and the general ledger, as well as into the ETL processes that prepare data for these engines – loses critical customer-related and risk-related information and distinctions along the way.

If, for example, the general ledger does not have a particular categorical distinction between revocable and irrevocable facilities to retail customers, then this distinction will be lost in the accounting process as it is applied to the facility overdraft book of the bank on a monthly basis. This distinction, however, from a risk management and liquidity risk assessment point of view, is a critical one for regulatory purposes as well as for the purpose of tactical and strategic management of the bank.
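
A toy posting rule makes this loss of information concrete (the GL code and field names below are hypothetical): if the chart of accounts carries a single code for retail facilities, the revocable/irrevocable flag is simply discarded at posting time:

    # Hypothetical posting rule: all retail facilities map to one GL code.
    facilities = [
        {"amount": 5_000, "revocable": True},
        {"amount": 12_000, "revocable": False},
    ]
    GL_CODE = "21400-RETAIL-FACILITIES"
    postings = {GL_CODE: sum(f["amount"] for f in facilities)}
    print(postings)  # {'21400-RETAIL-FACILITIES': 17000}
    # Downstream, liquidity-risk reporting can no longer tell the
    # revocable facilities from the irrevocable ones.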

Of course, one could simply implement a change to the general ledger so that it could absorb this distinction. However, upstream from the simplicity of adding a new GL code, there is an entire history of code that has been written by bank employees, contractors and consultants, over time frames often spanning more than a decade – often undocumented – that will need to be rewritten and massaged to meet the new GL code requirement. And bear in mind that this is only one data distinction for only one type of asset in only one product system.

Often the facility overdraft book of a diversified banking group will be made up of several books that sit on different origination product systems in different countries. This could be the result of system decisions having been made under a ‘federal’ model – one in which the local managers are empowered to make their own operational decisions and whose success is measured on a local income statement level. Or it could be a result of the banking group acquiring local banks in new markets and thereby inheriting their legacy systems.

Either way, the more diversified a bank, the more acquisitive it has been, and the longer its history, the greater this challenge is in integrating the finance and risk views of the world.

It is not going too far in fact to say that this integration problem is the single greatest problem facing banks going forward, particularly in a world of ever greater complexity, frequency and detail in the demands of regulation.

Once again, the reason Richard Fuld and John Stumpf are deemed by the market to have failed in their duty of being effective custodians of their banks’ respective balance sheets, is not because they were inherently dishonest, or even derelict in their duties. It is perhaps because they failed to understand that their key role was to manage an information organisation, rather than a sales and marketing organisation.

The fact that senior management in banking emerges primarily from trading or transacting backgrounds is evidence – across many successful banks worldwide – that in spite of the ongoing costs and frustrations of the complex problem of information management, the critical nature of this problem is not yet fully understood. Should banks be viewed first as information processing organisations, and only second as sales and marketing organisations, they would choose managers whose skills and purpose are better aligned to the task. And, as elucidated, it is a complex and ever-changing task, fought on a highly competitive playing field.


Banks have become an extension of state policy

The fourth significant challenge that banks face today is the degree to which they have been coerced by government, and by the prevailing orthodoxy of the state, to act as policemen in a world increasingly driven by fear.

There are many examples of this coercion and the manner in which the state is passing on responsibility and cost to banks throughout the world, but there are three areas of particular interest.

The first is tax. The imposition by the US tax authorities of FATCA (the Foreign Account Tax Compliance Act) has led almost all banks worldwide, irrespective of their own local laws, to implement complex FATCA indicia within their origination and client onboarding systems. Non-compliance, after all, can lead to the imposition of a significant withholding tax on their clients. This tax law has had an immediate and significant impact on so-called tax havens and on the existence of secret bank accounts, and has led to significant political changes in countries such as Switzerland, whose economy has historically been propped up by the concept of non-disclosed bank accounts held in the names of offshore clients.

Of course, the net effect of reining in US tax dodgers is a positive one, and it is an action being followed by most western nations through a process now being implemented, called Automatic Exchange. These pieces of legislation should, over time, substantially reduce the degree and extent of the tax avoidance that has led to such high and illogical tax rates in leading liberal democratic nations.

The question, of course, for banks run as privately-owned institutions, is why the responsibility and cost of the hard labour required to identify US citizens and US-owned entities within offshore correspondent banks, on a regular, ongoing basis, should be borne by them. Or, for that matter, why the correspondent bank – not even a US-based bank – should bear the cost on its side.

Irrespective, banks now face these costs, and any deviation – even to the slightest degree – from the US-written FATCA laws can and will lead to souring relationships with US-based banks, as well as to breaches of local laws in countries that have adopted the FATCA principles, sometimes under political pressure from their US ally. US tax laws have effectively – over a relatively short space of time – been written, codified, propagated, and implemented by banks worldwide. As Thomas Piketty – the renowned French economist – has pointed out on several occasions, FATCA and Automatic Exchange brought centuries of Swiss bank account secrecy and client tax avoidance to an end within a matter of years of FATCA first being conceived.


The coercion of banks into the fight against money-laundering and terror

The second and third significant examples of just how deeply banks have been drawn into the world of cross-border politics are the two requirements known as Anti-Money Laundering and Combating the Financing of Terrorism, commonly summarised as AML/CFT, and often combined into a single implementation within a bank.

The idea is simple: ISIS, its cohorts, and all terrorist organisations in general, would more easily be beaten down by starving their supply chain of funding than by fighting them in the streets at the cost of soldiers' lives.

Forcing banks to reveal the nature and identities of their customers – and to expose all suspicious transactions to the authorities – will eventually lead to the noose being tightened around the necks of these terrorist organisations, starving them of funding and of weapons – or so the theory goes. Whilst the CFT requirement is specifically targeted at fighting terrorism, the AML rules are targeted at other cross-border criminal activities, such as the laundering of money produced by the international drug trade, the trade in prostitution, or child-trafficking.

As with the tax example above, it is difficult to argue against this logic, and therefore it makes sense for governments, working in tandem with each other against these terrorist organisations, to force banks to bear the cost of implementing these systems. It is these systems that must possess the capability of constantly and diligently monitoring, observing and filtering all transactions, every day, to sift out the suspicious ones.
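
In practice these are large rule-and-model engines, but the shape of the task can be sketched in a few lines (the watch-list entry and thresholds below are invented for illustration):

    # Toy transaction-monitoring rules, invented for illustration only.
    WATCH_LIST = {"ACME TRADING CO"}

    def is_suspicious(txn):
        if txn["counterparty"] in WATCH_LIST:
            return True                       # sanctions / watch-list hit
        if txn["amount"] > 10_000 and txn["channel"] == "CASH":
            return True                       # large cash placement
        return False

    transactions = [
        {"amount": 15_000, "channel": "CASH", "counterparty": "J SMITH"},
        {"amount": 900, "channel": "EFT", "counterparty": "ACME TRADING CO"},
    ]
    flagged = [t for t in transactions if is_suspicious(t)]  # both flagged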

The trouble is that this cost is not one that will assist banks in attracting equity investment and fresh capital. It is a cost whose outcome is asymmetric: no news is good news, whereas bad news could lead to extensive and painful fines or, worse, to sanctions. In all three of the cases cited above, the tax case, the terrorist financing case and the anti-money laundering case, it is sensible for banks to act as agents on behalf of the state, but the costs and complexity borne by these banks make them less attractive to investors, and more difficult to manage.

Once again, there is a significant risk in banking that does not exist in most industrial organisations, and that has to do with the fact that the business of money is at the heart of illegal and terrorist activity. As such, banks face a greater burden than industrial firms typically would, and this burden is one that has enormous downside risk if not properly addressed.

The ideal bank is now one that is extremely suspicious of its customers, and is bound to the notion of having to constantly know – irrespective of an individual's right to privacy – everything to do with its customers. This is not only a moral problem; it is an information management problem. As evidenced by the enormous fines levied against banks that were caught trading with Syrian counterparties during the period of US sanctions on Syria, the financial risks of not implementing world-class monitoring systems across all transactions emanating from the bank are significant.


Increased capital and liquidity requirements on banks

Which brings the argument to the fifth major challenge facing banks today: the increased capital and liquidity requirements that have been imposed on all international banks since the advent of Basel III and Dodd-Frank, as a result of the anger generated by the 2008 financial crisis.

These capital and liquidity requirements are complex, but their net result is that the old-school banking model of the 1990s heyday has been broken. Banks used to borrow short and lend long, and this is now heavily disincentivised through the imposition of a metric called the Net Stable Funding Ratio (NSFR).
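
The ratio itself is simple arithmetic – available stable funding over required stable funding must exceed 100% – though the weights below are illustrative simplifications of the Basel III schedule:

    # Illustrative NSFR calculation; figures and weights are simplified.
    available_stable_funding = (
        1.00 * 100 +  # capital
        0.95 * 800 +  # stable retail deposits
        0.50 * 300    # short-term wholesale funding from non-financials
    )
    required_stable_funding = (
        0.65 * 900 +  # residential mortgages
        0.85 * 250    # other longer-dated loans
    )
    nsfr = available_stable_funding / required_stable_funding
    print(f"NSFR: {nsfr:.0%}")  # the ratio must exceed 100%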

Tier 1 equity capital adequacy requirements have effectively doubled since the 2008 crisis. From a systemic point of view this makes sense, since banks in aggregate will now have far more loss-absorbing capability. But at the individual bank level, raising the capital adequacy ratio ironically makes each bank more likely to fail, and therefore far riskier: although a bank holds more capital to absorb losses, it will fail at the very moment it needs to employ this capital to actually absorb them. If Lehman taught us anything, it is that banks will not lend to their competitors in a challenging market.

Banks will, under increased capital adequacy rules, be able to absorb greater losses for longer – in theory. But they will never reach that point, because they will see funding dry up in the interbank market far earlier, merely because they have crossed the capital adequacy thresholds imposed upon them.

In the context of a business environment in which there has been a significant explosion in the costs of managing and processing data, and in transforming banks from being sales organisations into information management organisations, the increased capital and liquidity requirements could not come at a worse time.

One can understand the necessity for regulators to buffer capital in aggregate, and to mitigate interbank liquidity dependency, but they impose these rules at a time when banks face innumerable challenges in simply managing the enormous complexity of information they already have at hand. Regulators also forget that banks need to remain attractive investment prospects to outside investors; otherwise their equity funding will become more expensive, and even harder to come by.

In terms of liquidity risk, and the imposition of the Net Stable Funding Ratio, one should once again recall that Lehman had enormous exposure to the CDO market, and that this was the reason interbank funding dried up. It is true that the death-blow was exacted by peer banks refusing to roll over short-term funding, but the underlying cause was a significant failure of judgement on the part of Lehman's management in building such a dangerous exposure to a single asset class. And this is perhaps the main point: the extent of this exposure was probably not known to them, owing to a failure in information management.


Summary

In summary, banks are now riskier than ever before. They face enormous regulatory requirements, from both a financial and a risk perspective. New regulations have reduced their ability to return value to shareholders, and have undermined the fundamental business model of banking, in which banks intermediate on the yield curve.

They also face enormously increased costs in implementing systems and processes that effectively make them an extension of government, but for which they are not compensated. For older, larger and more diversified banks, the challenges inherent in legacy systems, and of conflicting local and international legislation, have massively increased their costs of integration and reduced their ability to provide critical information to their managers accurately and timeously.

Banks are yet to understand, in the main, that they are information processing organisations, rather than brands and sophisticated salespeople. The CEOs of internationally diversified banks have emerged out of a Harvard Business School-inspired model – very much the Milton Friedman model – in which the focus is almost entirely on shareholder return.

In a world in which banks have been forced to take responsibility for imposing tax law and for fighting the war on terror – and to do so with increasing costs, increasing complexity and increasing capital requirements – shareholder returns are not likely to be that impressive for some time anyway.

Should bank executives take the long-term view, should they plan and execute information processing as their primary capability, and should they execute with the same kind of precision and care that one would expect in the aeronautical industry, for example, they will surely be the executives who excel and leave a legacy of success.

This would require them of course to dodge some bullets, absorb Department of Justice fines, and have very thick skins. But, ultimately, the information executive, rather than the sales executive, will prevail.
