When life throws you lemons….

Posted on Jan 4, 2013 in Dodd Frank, Noteworthy, Regulatory, Whitepaper and Downloads | 0 comments

… build a juicer, corner the melon market and prove that dyslexia never stopped anyone from succeeding.

In a 2011 presentation to the Commodity Futures Trading Commission's (CFTC) Technical Advisory Committee, the Tabb Group estimated that the largest US OTC derivatives dealers will spend a total of $1.8B on Dodd-Frank Act (DFA)-related technology costs, with the top eight spending over $1.5B.

An August 2012 update to a 2010 S&P analyst report puts its annualized estimate of DFA-related technology and related expenses for the top eight US banks at $2.0B to $2.5B.

Throw in the profound and fundamental changes the regulations have wrought on OTC derivatives market structure, business models, terms of competition and future earnings expectations – and that’s a lot of chucked lemons.

All those lemons, however, present an opportunity for competitive differentiation.

This new Acuity Derivatives client report, From Regulatory Compliance to Technological Advantage (making lemonade…), seeks to show that, given the fundamental nature of the changes to the OTC derivatives industry and the high technology costs involved, this technology spend should not be viewed solely as a sunk compliance cost, but as an investment in competitive technology advantage.

Read More

Unraveling Proprietary Trading – implementing the Volcker metrics

Posted on Oct 22, 2012 in Classification, Dodd Frank, P&L Attribution, Regulatory, Risk, Volcker | 0 comments

The Volcker Rule mandates that banking entities cease proprietary trading, subject to certain exceptions for “permitted activities” including market making and risk-mitigating hedging.

The currently proposed implementation of the rule includes recommendations for a framework of 17 quantitative metrics to be calculated and analyzed daily, and reported to regulators monthly.

The 17 quantitative metrics are grouped into 5 metrics groups.

Each metrics group variously seeks to establish that a bank's risk-taking appetite, revenue profile, trading inventory and origination are all consistent with those of a market maker providing liquidity and hedging any residual risks incurred in the provision of that service.

Risk Management: the 4 metrics in this group try to establish that the bank's trading units retain risk that does not exceed, in size or type, what is required to provide intermediation/market-making services to customers.

Sources of Revenue: the 5 metrics in this group try to establish that the bank's trading units' revenues are earned primarily from customer business (fees, commissions and bid-offer spreads) and not from price movements of retained principal positions.
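To make the Sources of Revenue idea concrete, here is a minimal sketch, in Python, of how cumulative revenue for one trading unit might be split between customer-facing sources and retained-inventory P&L. The field names, the single summary ratio and the example figures are illustrative assumptions for this sketch, not the metric definitions in the proposed rule.

```python
from dataclasses import dataclass

@dataclass
class DailyRevenueRecord:
    """One trading unit's revenue for a single day (illustrative fields)."""
    fees_and_commissions: float  # explicit customer charges
    spread_capture: float        # bid-offer spread earned on customer flow
    inventory_pnl: float         # mark-to-market P&L on retained principal positions

def customer_revenue_share(records):
    """Fraction of cumulative revenue earned from customer-facing sources.

    A unit acting as a market maker would be expected to show a high share
    over time; a unit whose revenue is dominated by inventory_pnl invites
    further scrutiny.
    """
    customer = sum(r.fees_and_commissions + r.spread_capture for r in records)
    total = customer + sum(r.inventory_pnl for r in records)
    return customer / total if total else float("nan")

# Illustrative history: three trading days for one unit
history = [
    DailyRevenueRecord(120_000, 80_000, 15_000),
    DailyRevenueRecord(95_000, 60_000, -40_000),
    DailyRevenueRecord(110_000, 70_000, 60_000),
]
print(f"customer revenue share: {customer_revenue_share(history):.1%}")
```

The point of the sketch is only the shape of the calculation: a daily record per trading unit, accumulated over a window, reduced to a ratio that can be monitored and reported on the required monthly cycle.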

Read More

For whom the bell tolls…or regulatory reporting compliance

Posted on Oct 15, 2012 in Regulatory, Uncategorized | 0 comments

For Swap Dealers (SD) and Major Swap Participants (MSP), Friday October 12, 2012 was the date on which compliance with the swap public and regulatory reporting rules of the Dodd-Frank Act became effective (for interest rate and credit swaps). Many financial institutions have implemented solutions to support these requirements. We published a note (downloadable here) providing an overview of the technical complexities that a reporting solution would need to resolve. Some of these complexities emerge from the following:

Read More

Whitepaper on P&L Attribution now available for download

Posted on Aug 22, 2012 in Classification, Download, P&L, P&L Attribution, Risk, Whitepaper and Downloads | 0 comments

Our whitepaper on P&L attribution (PLA) is now available for download.

The paper examines the practice of PLA production, analysis and reporting within banks. Given the recent regulatory focus on PLA and banks' capacity to produce it, the report also examines areas of potential interest, namely policy, governance, process capacity and the metrics that can be used to benchmark a bank's capacity to produce, analyze, monitor and report PLA.

Read More

P&L Attribution – Judging the weathermen

Posted on Aug 22, 2012 in Classification, Modeling, Noteworthy, P&L, P&L Attribution, Regulatory, Risk | 1 comment

“The storm starts, when the drops start dropping
When the drops stop dropping then the storm starts stopping.”
― Dr. Seuss, Oh Say Can You Say?

“Pray don’t talk to me about the weather, Mr. Worthing. Whenever people talk to me about the weather, I always feel quite certain that they mean something else. And that makes me so nervous.”
– Oscar Wilde, The Importance of Being Earnest, Act 1

We will talk about weathermen and the predictions they make. And we will mean something entirely different. By weathermen, we will mean the models in a bank and the predictions they make or the hypotheses they form. And for the realism of Dr. Seuss' drops dropping, we will substitute the realism of P&L. More specifically, we will talk about P&L attribution (PLA) and the role it plays in helping us use the realism of P&L to test the hypotheses posed by our various risk models, which is in fact its primary purpose in life.

We will focus specifically on 3 hypotheses, formulated respectively by a bank's risk (sensitivity) models, its VAR model and its CVA/EPE model. Namely, for a given bank:

I. Changes in the mark-to-market value of its positions are materially determined by changes to a specified set of variables and parameters (i.e. risk factors), and the expected change is quantified by the sensitivities to these risk factors obtained from its models;

II. There is a specified % probability that its positions will lose more in value than its VAR number over any given interval equal to the VAR holding period;

III. The cost of insuring its aggregate positions against the risk of counterparty Z defaulting is not expected to exceed the cumulative sum of the CVA fees charged to its trading desks for originating exposure to counterparty Z.
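As a concrete illustration of hypothesis I, the sketch below compares a book's actual daily P&L with the P&L predicted from its risk-factor sensitivities and isolates the unexplained residual, which is exactly the quantity a PLA process monitors. The factor names, units and numbers are invented for the illustration and are not drawn from any particular bank's models.

```python
import math

def predicted_pnl(sensitivities, factor_moves):
    """First-order 'risk-based' P&L: sum of sensitivity_i * move_i over the
    risk factors the model says matter (hypothesis I)."""
    return sum(sensitivities[f] * factor_moves.get(f, 0.0) for f in sensitivities)

def attribution_report(sensitivities, factor_moves, actual_pnl):
    """Compare the model's prediction with realized P&L; the unexplained
    residual is the evidence for (or against) the hypothesis."""
    explained = predicted_pnl(sensitivities, factor_moves)
    unexplained = actual_pnl - explained
    share = abs(unexplained) / abs(actual_pnl) if actual_pnl else math.nan
    return {"explained": explained,
            "unexplained": unexplained,
            "unexplained_share": share}

# Illustrative single-day example for one book
sensitivities = {"USD_IR_10Y": 25_000.0,   # P&L per 1bp move
                 "EUR_IR_10Y": -8_000.0,   # P&L per 1bp move
                 "USDEUR_FX": 150_000.0}   # P&L per 1% move
factor_moves = {"USD_IR_10Y": 2.0, "EUR_IR_10Y": -1.5, "USDEUR_FX": 0.4}
print(attribution_report(sensitivities, factor_moves, actual_pnl=130_000.0))
```

Run day after day, the size and persistence of the unexplained term tells us how well the weatherman's forecast is holding up; the same logic extends to hypotheses II and III with exceedance counts and cumulative CVA fees respectively.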

Read More

A Practical Example of Classification

Posted on Feb 27, 2012 in Bankruptcy, Classification, Edith Piaf, Lehman, Modeling | 0 comments

Of herding 1 million hissing cats onto a carousel somewhere a few blocks north of Bryant Park in New York. It must be said though that the music on this particular carousel had stopped (and Edith Piaf had most definitely left the building).

As part of the unwind of Lehman Brothers' derivatives portfolio for the post-bankruptcy Estate (a portfolio of over 1 million derivatives trades), the team conducted a classification exercise of the products in the portfolio, with underlyings covering all major asset classes and instruments running the gamut of complexity from vanilla single-factor, single-asset-class flow products to highly exotic, structured multi-factor hybrid products.

The context and objective were to value them in the shortest possible time, in the most efficient manner possible (given limited to no infrastructure), and in the most defensible manner possible (given their eventual day in Bankruptcy court), over the days in September/October 2008 when they were unwound.

This classification exercise was essential to developing and driving the valuation strategy for the portfolio, covering:

  • Team selection;
  • Valuation platform, model and method selection;
  • Computational resource provisioning;
  • Market data requirements definition;
  • Developing the netting and hedging assumptions needed to take a view on reasonable bid/offer and transaction costs;
  • Conducting self-consistency checks of the valuations.

A complex and gargantuan valuation exercise that could only be accomplished by intelligent abstraction of product commonality through classification.
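By way of illustration only (the actual taxonomy used in the exercise is not reproduced here), a classification key of this kind can be sketched as follows, so that valuation platform, model and data decisions are made per product class rather than per trade. The asset classes, complexity tiers, field names and toy mapping are all assumptions made for the sketch.

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class AssetClass(Enum):
    RATES = "rates"
    CREDIT = "credit"
    FX = "fx"
    EQUITY = "equity"
    COMMODITY = "commodity"

class Complexity(Enum):
    VANILLA_FLOW = 1   # single-factor, single-asset-class flow products
    EXOTIC = 2         # path-dependent or multi-factor, one asset class
    HYBRID = 3         # structured, multi-factor, multi-asset-class

@dataclass(frozen=True)
class ProductClass:
    """Classification key for a trade; trades sharing a key can share a
    valuation platform, model, compute and market-data requirements."""
    asset_class: AssetClass
    complexity: Complexity
    product_family: str   # e.g. "IR swap", "CDS", "PRDC"

def classify_portfolio(trades, classify):
    """Group a trade population by classification key so that valuation
    strategy is decided per class, not per trade."""
    return Counter(classify(t) for t in trades).most_common()

if __name__ == "__main__":
    # Hypothetical trade identifiers and a toy mapping, for illustration only.
    trades = ["IRS-USD-10Y", "CDS-ITRAXX", "PRDC-USDJPY", "IRS-USD-10Y"]

    def classify(trade_id):
        if trade_id.startswith("IRS"):
            return ProductClass(AssetClass.RATES, Complexity.VANILLA_FLOW, "IR swap")
        if trade_id.startswith("CDS"):
            return ProductClass(AssetClass.CREDIT, Complexity.VANILLA_FLOW, "CDS")
        return ProductClass(AssetClass.FX, Complexity.HYBRID, "PRDC")

    print(classify_portfolio(trades, classify))
```

The value of the abstraction is in the counts: a million trades typically collapse into a much smaller number of product classes, and it is at that level that team selection, model choice, compute provisioning and market data sourcing become tractable decisions.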

Read More