Customisable optimisation
26 November 2013
It is important that institutions consider their approach to the collateral challenges facing them, and to what extent data and flexible inventory functionality can help, says Lombard Risk’s Elaine MacAllan
The changing regulatory environment is creating increased demand for collateral in both the cleared and uncleared markets, across all financial products. The direct implications of current draft regulations (the US Dodd-Frank Act and the European Market Infrastructure Regulation) and supervisory frameworks (from the Basel Committee on Banking Supervision and the International Organization of Securities Commissions) are mandated higher-quality two-way margin, restrictions on rehypothecation, and both enforced and client-driven segregation of collateral. It is now undisputed that there will be a ‘collateral squeeze’; the only item left for debate is how severe and imminent the drain on liquidity will be.
In response, firms are reviewing their collateral programmes and looking for methods to maximise the efficiency and reduce the cost of their processes. Under intense scrutiny is the extent to which parties are posting the most optimal collateral allowed, under the terms of their margin agreements.
At Lombard Risk, we have found that our existing and prospective clients are all seeking an optimisation solution, although the drivers and business requirements vary widely. From a software development perspective, understanding that ‘one size does not fit all’ and offering a customisable solution is critical, as is the ongoing development of the product to meet evolving demands.
A prerequisite for an efficient optimisation process
Data access and consolidation issues are a constant inhibitor to an efficient optimisation process.
It is not cost-effective to maintain large collateral buffers, or to post high-quality, universally eligible assets where lower-quality assets would be acceptable under most margin agreements: it is unnecessary, it is too expensive, and soon there will not be enough to go around. Now that collateral is no longer cheap or readily available, the pressure is on to gain real-time efficiency, maximise the use of all available assets and simultaneously reduce the cost of the assets used wherever possible.
Product, process and technology silos prevent the rationalisation of inventory processes—the greatest challenge is often in bringing them all together. Inventory functions can be significantly more efficient in the utilisation of all resources where they are available to view on a single platform. A consolidated inventory manager should enable internal asset transfers and transformations to be effected on a real-time basis, or as close to real-time as settlement processes allow. It is often far cheaper to source collateral from internal sources, but if it takes a day to identify and transfer assets to the point of need, the efficiency opportunity has already been lost.
A flexible and configurable firm-wide inventory tool adds the most value where it can be supported on the same platform as the exposure management function, or at least be seamlessly integrated with it. There is a migration of collateral responsibility from the back office to the front office; most large institutions have already established credit valuation adjustment (CVA) desks and collateral optimisation programmes that can fully model trade costs, including offsets, regulatory charges and impact on balance sheet, and calculate which asset is the cheapest to deliver or most expensive to hold. However, they rarely have direct access to information such as where an asset may be eligible, or what limits or constraints may apply under legally documented terms.
At a time when eligibility, haircut, concentration rules and rehypo/segregation models are becoming significantly more complex, a single technology solution is the obvious place to support optimisation if it consolidates and seamlessly cross references the data elements required:
Cross-product exposures/requirements (how much do I need?)
Inventory positions, values and location/source (where are the assets?)
Eligibility and haircut rules (am I allowed to use the assets?)
Concentration and correlation limits (how much of each asset am I allowed to use?)
Segregation and rehypo constraints (am I allowed to re-use the assets?)
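To make the consolidation concrete, the five data elements above can be sketched as a minimal data model with an eligibility check. This is a hypothetical illustration only (the class and field names are invented for the sketch, not taken from any vendor schema):

```python
from dataclasses import dataclass

@dataclass
class InventoryPosition:
    asset_id: str
    asset_class: str       # e.g. "GOVT_BOND", "CORP_BOND", "CASH"
    quantity: float
    location: str          # custodian or depot holding the asset
    rehypothecable: bool   # may the asset be re-used?

@dataclass
class EligibilityRule:
    agreement_id: str
    asset_class: str
    haircut: float              # e.g. 0.02 = 2 percent
    concentration_limit: float  # max share of the requirement

def eligible(pos: InventoryPosition, rule: EligibilityRule,
             requires_rehypo: bool) -> bool:
    """An asset is usable for an agreement only if its class is
    eligible under the documented terms and any re-use (rehypo)
    constraint on the position is satisfied."""
    if pos.asset_class != rule.asset_class:
        return False
    if requires_rehypo and not pos.rehypothecable:
        return False
    return True
```

In a real platform each check would consult legally documented terms; the point of the sketch is that eligibility, location and rehypo status must sit on the same record for the cross-referencing described above to be possible.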
The providers and consumers of inventory should be brought together, with appropriate controls. The inventory should consolidate both trading and collateral positions, on a real-time basis, with forward ladder projections based on anticipated (agreed) and confirmed (settled) transactions and collateral pledges. Where this can be achieved, a firm is able to coordinate and maximise usage of all available assets: the trading desks can readily consume available/excess collateral, or provide it where and when required, and the collateral sourcing function can view where assets may become available for re-use as term trades or existing pledges roll on or off.
There are practicalities to consider. Integrated controls are required in such a model:
It should be possible to reserve/earmark assets, or identify them as not available for re-use (including reserves required under Basel III liquidity coverage ratio calculations);
Rehypo limitations should be identified—and use of non-rehypo assets prevented;
Segregated assets should be clearly marked, or removed from view where appropriate;
User restrictions and privileges are required, including the ability to filter certain properties and views according to role; and
A real-time automated substitution workflow is required to recall assets when required.
From a software design perspective, careful consideration needs to be given to the various needs of the users of the inventory. Trading desks may have different asset utilisation strategies than the collateral sourcing function, and require different views over what is essentially the same set of data. The buy side has different needs to the sell side. Brokers and clearing members may offer transformation or optimisation services, whereas those short of eligible collateral may need to transform the assets they have. Agent lenders, asset managers and collateral service providers may want to reflect their client investment strategies or inventory priorities—in this case, ‘cheapest to deliver’ is not necessarily the priority.
Configurable views are a fundamental requirement for a truly fit for purpose inventory. As markets and global regulations evolve and new asset segregation models appear, the inventory should be able to adapt with little or no additional development.
Trade, inventory and collateral optimisation distinctions
The lack of a standard lexicon is causing some confusion and frustration in the market.
Trade optimisation
In the new landscape, pre-trade decisions need to be made as to which is the optimal venue to execute and/or clear transactions, and what is the right price. Is the trade type subject to mandatory clearing? Or can it be bilaterally executed? In either case, different costs and charges are implied. Which central counterparty (CCP) or counterparty should be chosen? Factors that influence this decision include:
Cost, including transaction, settlement, custodial and operational costs; and
Initial margin requirement calculations—brokers/CCPs support various margin methodologies and will demand different levels and types of collateral, also taking into account:
Hedging opportunities with the existing portfolio, and resulting margin offset benefits;
Regulatory impacts/charges, including calculation of/leverage of LCRs (liquidity coverage ratios), and CVA adjustments against non-cleared trades; and
Impact on collateral usage and haircuts payable—can any assets be freed up and made available for funding or revenue-generating purposes?
Technology solutions are required that can consider and calculate all of the above factors on a real-time basis, identify the optimal trading or clearing venue, and provide the level of detailed analysis that clients ultimately require. Many legacy systems simply cannot support the complexity of calculating and validating against so many factors, so firms will need to examine their technical infrastructure to determine whether it is fit for purpose. The reality is that many of these systems may need to be replaced.
Inventory optimisation
There are multiple aspects to inventory optimisation:
Optimise use of available assets (what have I got?)
Optimise collateral sourcing (what do I need and what will I need in the future?)
Maintain assets in the optimal location (where is it and how much is it costing me?)
Inventory optimisation should be a proactive enterprise function, considering anticipated movements and exposures (PFEs), and reflecting evolving internal trading and collateral strategies. Central to this remains the need for a centralised inventory, capable of consolidating collateral positions, trading positions and externally fed inventory data, including optional forward tracking (‘ladder’) views: if I can project my future exposures, I can be far more efficient in sourcing the lowest quality or cheapest collateral eligible, in anticipation of the requirement.
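The forward ‘ladder’ view described above can be sketched as a simple projection: start from the settled balance and roll dated movements (pledges out, returns in) forward day by day. The function and its inputs are illustrative assumptions, not a production inventory engine:

```python
from collections import defaultdict
from datetime import date, timedelta

def forward_ladder(settled_qty, movements, start, days):
    """Project available inventory per day from a settled balance
    and a list of (settle_date, signed_quantity) movements.
    Pledges out are negative; returns and receipts are positive."""
    by_day = defaultdict(float)
    for d, qty in movements:
        by_day[d] += qty
    ladder, running = [], settled_qty
    for i in range(days):
        d = start + timedelta(days=i)
        running += by_day.get(d, 0.0)   # apply today's net movement
        ladder.append((d, running))
    return ladder
```

Such a projection is what lets a sourcing desk see that an asset pledged today comes back in two days and can be committed forward, rather than buying or borrowing cover it does not need.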
Collateral optimisation
For many, this may be based on the identification of ‘cheapest to deliver’ (CTD), or ‘most expensive to hold’ (ETH). Although CTD is a market-wide concept, it is relative and there is no universally applicable CTD calculation—what is cheap for one organisation may in fact be expensive for another, and therefore the optimisation solution needs to allow the user to define and calculate CTD before using it as an element in the optimisation calculation. For others, the CTD may only form part of their optimisation strategy, and be weighted against other factors such as limiting shortfall/maximising utilisation of available concentration limits or using certain asset types ahead of others in line with their client’s investment strategy.
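Because CTD is firm-relative, a solution has to let the user define the score. A minimal sketch of such a user-defined calculation might weight a funding cost against the haircut payable and a strategic preference; the weights, inputs and function names here are entirely hypothetical:

```python
def ctd_score(funding_cost_bps, haircut, strategy_weight,
              w_cost=0.6, w_haircut=0.3, w_strategy=0.1):
    """Lower score = cheaper to deliver under this firm's own model.
    Weights and inputs are illustrative; each firm defines its own."""
    # A haircut increases the nominal that must be delivered, so it
    # behaves like an extra cost on top of the funding spread.
    return (w_cost * funding_cost_bps
            + w_haircut * haircut * 10_000   # haircut expressed in bps
            + w_strategy * strategy_weight)

def cheapest(assets):
    """Pick the asset with the lowest firm-defined CTD score.
    Each entry is (name, funding_cost_bps, haircut, strategy_weight)."""
    return min(assets, key=lambda a: ctd_score(*a[1:]))[0]
```

Swap the weights and the same inventory produces a different ranking, which is precisely why ‘what is cheap for one organisation may be expensive for another’.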
The most effective collateral optimisation programme is:
Real-time, global and cross-product;
Automated, supported by algorithmic calculations;
Flexible and configurable (data elements and algorithm models);
Rules-based and goal-driven;
Comprehensive, taking into account all known exposures, available assets, and documented terms (thresholds, haircuts, eligibility/concentration constraints, etc); and
Integrated, with automated collateral booking facilities of optimised movements.
It must also provide quantifiable business benefits:
Reduce costs;
Improve efficiency, maximise use of available assets;
Improve liquidity;
Control asset selection in line with organisational strategies;
Reduce operational overheads; and
Provide a bespoke client service.
‘Optimisation’ is a bespoke concept
Simply put, firms need to invest in technology and data management if they want to achieve the benefits that an optimisation solution offers. Some organisations have built internal optimisation solutions with varying degrees of success, but the majority are still in the analysis phase, investigating what optimisation could or should mean to their firm, and how it could be implemented.
The global financial markets are still in a state of flux, so much so that the impacts of regulatory changes are still not fully understood. ‘Future-proof’ technology solutions must offer configurable and extendable functionality that has the best chance of meeting business needs today and in the future.
Configuration and rules
Users should be able to customise their own:
Optimisation goals;
Constraints;
Variables/filters;
Rules and rankings;
Parameters; and
Templates.
As market conditions and risk concerns change over time, optimisation rules will need to be adjusted. A user-friendly interface should enable real-time rule updates without technology intervention.
Functional flexibility
Optimisation calculations should be available across single or multiple agreements/regions/business lines so that, for example, repo, OTC, securities lending and clearing margin requirements are considered within the same calculation to achieve optimal firm-wide collateral allocations. For example, if eligibility terms on a repo agreement are less restrictive than on a clearing agreement, the clearing haircut payable can be improved by substituting lower grade collateral to the repo agreement.
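The repo-versus-clearing substitution above amounts to serving the most restrictive agreement first, so that scarce high-grade assets are not wasted on agreements that would accept lower-grade collateral. A hypothetical greedy pass (a sketch, not a full optimiser, with invented data shapes) makes the idea concrete:

```python
def allocate(requirements, inventory, eligibility):
    """requirements: {agreement: amount needed}
    inventory: {asset: (cost, quantity)} -- higher cost = scarcer asset
    eligibility: {agreement: set of eligible assets}
    Serve the agreement with the fewest eligible assets first, filling
    it with its cheapest eligible asset, so that flexible agreements
    end up absorbing the lower-grade collateral."""
    alloc = {}
    order = sorted(requirements, key=lambda ag: len(eligibility[ag]))
    for ag in order:
        need = requirements[ag]
        for asset in sorted(eligibility[ag], key=lambda a: inventory[a][0]):
            cost, avail = inventory[asset]
            take = min(need, avail)
            if take > 0:
                alloc[(ag, asset)] = take
                inventory[asset] = (cost, avail - take)  # consume stock
                need -= take
            if need <= 0:
                break
    return alloc
```

A real firm-wide calculation would add haircuts, concentration limits and settlement costs, but even this toy pass shows why the calculation must span agreements: optimising each agreement in isolation cannot free the government bonds from the repo leg.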
Optimisation calculations should run in real time, accessing the latest available data from source, and it should always be possible to re-run them. They should offer algorithmic flexibility, with multiple approaches available that alter the resulting allocation. It should also be possible for front-office users to customise algorithms and plug in proprietary cost models for enhanced, bespoke optimisation.
Extendable data parameters
Optimisation data attributes should be configurable and extendable. In addition to standard market data, institutions have data elements that are unique to them. Users should be able to define their own values and utilise them for filtering, aggregation and rule definition:
Benchmarking data;
Internally calculated cost value on a security;
Risk weighting value on an asset type;
CTD value on an asset class;
Funding spread per inventory source; and
Isolated securities, eg, specials.
Data simulations and scenario analysis
The user should be able to dynamically adjust values or define hypothetical events in a simulation environment for real-time and fully flexible scenario analysis. For example:
I anticipate a marked increase in exposure of x percent; where is my cheapest source of eligible collateral according to my own cost models?
What additional assets will I need, or what substitutions should I effect, if there is a shift in CCP haircuts?
What shortfalls will I need to cover in the event of market price shifts?
What collateral transformation opportunities are available to me?
What will the impact be on my collateral portfolio if eligibilities or ratings change?
If I lock/reserve asset pools, what will the result be on my available inventory?
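The CCP haircut question above can be sketched as a one-line simulation: re-value the pledged portfolio under shifted haircuts and compare the post-haircut cover against the exposure. This is a deliberate simplification (one exposure, a flat additive shift, invented inputs), intended only to show the shape of such a what-if:

```python
def shortfall_after_haircut_shift(pledged, haircuts, shift, exposure):
    """How much extra collateral value is needed if haircuts rise by
    `shift` (e.g. 0.02)? `pledged` holds pre-haircut market values per
    asset; `haircuts` holds the current haircut per asset."""
    cover = sum(v * (1 - (haircuts[a] + shift)) for a, v in pledged.items())
    return max(0.0, exposure - cover)
```

A full scenario engine would shift eligibility, ratings and prices as well as haircuts, but each of the questions listed above reduces to the same pattern: perturb one input, re-run the calculation, compare the result against the base case.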
Strategic considerations
It is important that institutions consider their approach to the collateral challenges facing them, and to what extent gathering firm-wide data, an optimisation programme and/or flexible inventory functionality can help them rationalise and reduce the cost of their collateral functions, or improve their client service offerings.
It is a fact that collateral optimisation is becoming a focus in the front office from a cost and inventory perspective, but the operational efficiencies and counterparty-level collateral allocation improvements offered by an optimisation solution are more likely to be felt downstream in the margin workflow. The maximum benefits are undoubtedly to be gained front-to-back where the architecture will allow.
An extended whitepaper on this topic, and demonstration of Lombard Risk’s Optimisation and Inventory Manager solutions, are now available.