What factors are driving increased demand for the use of execution algorithms by FX market participants?
NC: The continuous evolution of the FX market, coupled with a changing regulatory environment, has driven our clients to seek more sophisticated execution mechanisms. Aiming to better manage their risk and achieve best execution, market participants are increasingly adopting algorithms to reduce overall transaction costs, access liquidity from multiple sources, improve operational efficiency and increase transparency around order execution. Our tailored algorithmic execution services have grown out of this demand. Our clients have shown they are confident leaving order execution with our eFX teams, allowing them to focus on their core businesses, whether that’s alpha generation or corporate treasury services.
How can the use of algorithms help to give back more control to FX trading firms and assist them to recover ownership of their order flow and the prices they get?
DM: Algorithms offer powerful tools for traders to control the tradeoff between execution cost and immediacy.
Workflows for immediate risk transfer like RFQ can work well when liquidity is cheap and plentiful relative to the size of an order. But when liquidity is scarce or expensive, or an order is large, algorithms excel at sourcing liquidity efficiently from a wide range of venues and counterparties, and – by stretching an order out over time – can significantly reduce trading costs by not requiring any one counterparty to take on a large position.
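To illustrate the idea of stretching an order out over time, here is a minimal sketch of TWAP-style slicing in Python. The currency pair, order size, horizon and slice count are all hypothetical, and a production scheduler would layer randomisation and liquidity logic on top of this skeleton.

```python
from datetime import datetime, timedelta

def twap_schedule(total_qty: float, start: datetime,
                  horizon_minutes: int, n_slices: int):
    """Split a parent order into equal child orders spread evenly over the horizon."""
    slice_qty = total_qty / n_slices
    interval = timedelta(minutes=horizon_minutes / n_slices)
    return [(start + i * interval, slice_qty) for i in range(n_slices)]

# Example: work a hypothetical 50m EUR/USD parent order over 60 minutes in 12 slices.
for when, qty in twap_schedule(50_000_000, datetime(2018, 1, 15, 9, 0), 60, 12):
    print(when.time(), f"{qty:,.0f}")
```

Because no single counterparty ever sees more than one slice, the market has time to replenish between child orders.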
SW: There is sometimes a misconception that algorithms are black boxes that replace humans. This is far from reality. Algorithms require human input and ongoing facilitation. While algos help systematise execution styles and access liquidity pools more efficiently than doing everything manually, they need to be properly set up, managed and evaluated. This is especially true in highly fragmented markets like FX. This human touch is the reason why we see so much emphasis today on customisation, which allows clients to quickly and efficiently change an algo’s execution parameters, both at inception and on the fly. However, once a user has fine-tuned their algo strategies, there can be significant efficiencies and cost savings in incorporating these tools into an execution regime.
How can providers of FX algorithms help to avoid the stigma of “black-box technology” by making their workings more transparent and their specific design objectives easier to understand?
MG: If we start from the understanding that this is the client’s order, and we expect people to use our product, then being transparent about how it works is a prerequisite. This needs to happen from the outset of the discussion about our offering, when placing the order, during the trade and post-trade. One of the key ways we deliver this is through our specialist electronic sales trading team, whose sole focus is to ensure our clients get the optimal outcome from using our algorithms. By engaging with a human rather than only a machine, the client is better able to understand how to select the right algorithm and parameters, get real-time feedback during the trade and review performance post-trade. This also ensures an effective feedback loop into our product development, further enhancing the experience and essentially delivering transparency by partnering with the client in the development of the algorithm.
In what ways does the ability to automate best practices and harness the power of data play into the strengths of algorithmic FX trading?
DM: Algorithmic trading is fundamentally just the automation of best trading practices. Automation allows algorithms to master the velocity and volume of real time data that comes from our highly fragmented electronic marketplace in a way humans just can’t. The principal benefits that algorithms provide are re-aggregating that fragmented liquidity; reducing market impact by trading larger orders in smaller pieces; and, for patient traders, acting as a liquidity provider as well as a price taker to capture a portion of the bid-offer spread. Together these can have significant benefits in execution quality.
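As a toy illustration of the re-aggregation point, the sketch below combines hypothetical top-of-book quotes from three invented venues into a single best bid and offer; real aggregators also track depth, credit and venue reliability.

```python
# Hypothetical top-of-book (bid, ask) quotes from three invented venues.
quotes = {
    "VenueA": (1.19501, 1.19508),
    "VenueB": (1.19503, 1.19507),
    "VenueC": (1.19499, 1.19510),
}

best_bid_venue = max(quotes, key=lambda v: quotes[v][0])
best_ask_venue = min(quotes, key=lambda v: quotes[v][1])
bid, ask = quotes[best_bid_venue][0], quotes[best_ask_venue][1]

print(f"Aggregated book: {bid} ({best_bid_venue}) / {ask} ({best_ask_venue}), "
      f"spread {(ask - bid) * 1e4:.1f} pips")
```

The aggregated spread here is tighter than any single venue’s, which is precisely the benefit of re-aggregating fragmented liquidity.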
What are the key considerations that will determine in the first place whether a trading firm should be using FX algorithms?
FW: In dealing with clients across all segments, it’s obvious to me that there is no silver bullet or single consideration that swings the pendulum in favour of using an algo versus taking a risk price or using a benchmark execution. The same can be said for choosing which algo strategy to use should a client reach that decision. The client’s risk tolerance, time sensitivity, legal documentation, industry and even the make-up or jurisdiction of their underlying clients all contribute to the style of execution or the choice of strategy. Upcoming market events, liquidity and volatility all play a part too. Our goal is to provide tools and solutions that help our clients navigate these considerations so that they can make the most relevant and informed decision for themselves. As part of our Citi Velocity trading platform, we have just launched a new analytics dashboard that enables our clients to do exactly this.
What issues are generally likely to influence how different firms undertake their algorithmic FX trading? For example, the types of algo they might use, how long to run them for and the optimal times for doing so?
NC: Different types of clients have different risk management requirements and trading objectives. Risk tolerance is an important consideration when deciding which algorithm to use and defining the parameters. This depends on the organisation’s preference when faced with the trade-off between reducing execution cost versus minimising market risk.
A provider’s execution algorithm can be very different from another’s depending on the available liquidity sources, how that liquidity is optimised and how the strategy interacts with it. Despite the fancy names and buzzwords, most algorithmic providers offer variations on the same themes. Where we differentiate ourselves is by reducing implementation shortfall through advanced liquidity management and intelligent order routing. Commerzbank has focused on developing a small set of algorithmic strategies that efficiently achieve these core objectives.
We have an ongoing dialogue with our clients to determine their aims, and then advise them on the appropriate algorithm or parameters to achieve those aims. For example, if a client wants to execute an order that is large relative to the available liquidity, we would advise them to slow the execution down to reduce market impact, but potentially increase aggression at specific price points; this is done via the “I Would Price” parameter we offer.
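Commerzbank’s actual parameter logic is proprietary, but a minimal sketch of the general “I would” idea, assuming a buy order, a base slice size and an invented aggression multiplier, might look like this:

```python
def child_order_size(base_size: float, side: str, market_price: float,
                     i_would_price: float, aggression_multiplier: float = 3.0) -> float:
    """Trade at the base pace normally, but scale the child order up when the
    market trades through the 'I would' level in the order's favour."""
    favourable = (market_price <= i_would_price) if side == "buy" \
        else (market_price >= i_would_price)
    return base_size * aggression_multiplier if favourable else base_size

# Buying EUR/USD: work 1m per slice, but take 3m if the market dips below 1.1930.
print(child_order_size(1_000_000, "buy", 1.1945, 1.1930))  # -> 1,000,000
print(child_order_size(1_000_000, "buy", 1.1928, 1.1930))  # -> 3,000,000
```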
The work is not done once execution strategies are aligned to trading objectives. As the market and regulators demand more transparency, we are seeing organisations place more attention on objective Transaction Cost Analysis (TCA) to demonstrate best execution as they strive for unbiased execution analysis.
What steps can trading firms take to develop a more systematic methodology for selecting and deploying FX algorithms?
FW: Whilst not everyone may be willing to adopt a systematic approach to selecting and deploying an execution algorithm, I see decent benefit in doing so. Understanding the client’s needs and the problem they are trying to solve is, in my opinion, amongst the most important inputs in developing these tools. We strive to develop tools that are intuitive, use transparent techniques and, through our expert quantitative analysis, provide inputs that our clients can use in deciding whether or not to use an FX algorithm. Our clients are asking for these tools to help them identify and evaluate a variety of risk factors and costs that are relevant to them. Their motivations are many, including a need to optimise their trading performance, reduce execution costs and, in some cases, meet fiduciary obligations and regulation. The obvious tools of course are pre-trade analytics and post-trade TCA; the quality and depth of the underlying data, the innovation of the quantitative analysis and the ease of use of the interfaces will over time separate one provider from another. The analytics dashboard we have just launched in Citi Velocity leverages the breadth of our franchise to differentiate our offering.
Why is execution benchmark selection (alpha or beta) a key consideration when selecting algorithms?
MG: It can sometimes be misleading to think purely in benchmarks when selecting an algorithm. For example, it may be tempting to select an aggressive, DMA-style algorithm when targeting arrival price rather than a more passive TWAP. However, this ignores the trade-off between time risk and market impact. Understanding why you are trading an order, and over what time horizon that reason is valid, can be a much more significant factor than benchmark selection. For example, my benchmark may be arrival price, but I am executing a relatively large order with no short-term alpha and not linked to another trade; by using a more aggressive algorithm I am paying additional market impact with no benefit from reducing the time risk. In this case I would potentially get a better outcome from using a TWAP over a longer time period, even when measured against arrival price.
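A stylised numerical illustration of this point, with invented coefficients (impact falling as 1/T, time risk growing with the square root of the horizon), shows why a slower execution can win even against an arrival-price benchmark:

```python
import math

# Stylised expected cost (bps) vs arrival price: impact ~ 1/T, time risk ~ sqrt(T).
cost = lambda T: 20.0 / T + 0.4 * math.sqrt(T)  # invented coefficients

for T in (2, 10, 30, 60, 120):                  # execution horizon in minutes
    print(f"{T:>3} min horizon: {cost(T):5.2f} bps expected cost vs arrival")
```

In this toy model the aggressive two-minute execution is the most expensive against its own benchmark, and the cheapest horizon among those tested sits around the half-hour mark.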
What issues may determine how passive or aggressive an FX algorithm should be and the time frames over which it operates?
DM: There are three primary considerations. The first consideration, which can argue for a faster execution timeframe, is alpha. If a trader is trying to capture profit from a prediction on which way the market is going to move, they will tend to want to trade faster so that they don’t miss it. However, this is generally only a consideration for speculators; most market participants are trading currencies for settlement, hedging, or commercial purposes, and have no alpha.
The second consideration, which can also argue for faster execution, is timing risk. Even when there is no alpha, and favorable and unfavorable price moves are equally likely, firms are risk averse, and would be willing to pay something – though maybe only a little – to avoid very bad outcomes even if that means also avoiding very good outcomes.
The final consideration is market impact. Other things being equal, trading slower and more passively incurs less market impact and costs less on average; the market has time to absorb liquidity demands between orders, and patient traders can be liquidity providers as well as price takers, allowing them to capture a portion of the bid-offer spread. Of course, market impact – and what “large” and “small” orders or “fast” and “slow” execution mean – varies by currency pair and even by time of day. These three considerations tend to conflict with each other. One of the human trader’s jobs is to evaluate how they apply to each order – different flows within the firm may have very different requirements – and to select the right algorithm and time frame for execution given the alpha, the firm’s risk aversion and how often it trades similar orders, and how big the order is given the pair and time of day.
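Extending the earlier sketch, the toy model below adds an alpha term and a risk-aversion weight and then searches for the horizon with the lowest stylised total cost. All coefficients are invented; this illustrates the trade-off between the three considerations rather than any firm’s actual model.

```python
import math

def total_cost_bps(T: float, alpha_bps_per_min: float, risk_aversion: float,
                   vol_bps_per_sqrt_min: float, impact_coeff: float) -> float:
    """Stylised per-order cost over a horizon of T minutes: impact falls with T,
    while expected alpha decay and timing risk both grow with T."""
    impact = impact_coeff / T                    # slower execution -> less impact
    alpha_decay = alpha_bps_per_min * T / 2      # average drift suffered while working
    timing_risk = risk_aversion * vol_bps_per_sqrt_min * math.sqrt(T)
    return impact + alpha_decay + timing_risk

def best_horizon(alpha: float, risk_aversion: float,
                 vol: float = 1.2, impact: float = 20.0) -> int:
    return min(range(1, 241),
               key=lambda T: total_cost_bps(T, alpha, risk_aversion, vol, impact))

print(best_horizon(alpha=0.5, risk_aversion=0.3))  # speculator with alpha: short horizon
print(best_horizon(alpha=0.0, risk_aversion=0.3))  # hedger with no alpha: much longer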
Build versus buy is a common question when it comes to developing an algorithmic FX trading capability. What level of knowledge and technology is required to design, build, test and deploy FX algorithms and is this achievable for most trading firms?
DM: Building a competitive institutional-grade algorithmic offering is extremely challenging, expensive and time-consuming. It requires specialised software infrastructure, including an API-based liquidity aggregator; an “algorithmic container” that allows pluggable algorithmic strategies to run safely and that provides real-time visibility into what the engine is doing; and an analytical framework for running analytics against tick data and backtesting strategies. Building an algorithmic offering also requires a significant quant capability – a team of people who are skilled in capturing and working with large volumes of high-frequency data (we capture about 50GB of tick data each day for our typical FX client), and who understand trading, market microstructure and how to design and implement robust algorithms. In addition, an algorithmic offering requires a rigorous discipline around testing and deployment. Practically speaking, it’s out of reach for most firms, and even many firms who do have the capability find it not to be the best use of their key quant personnel.
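To give a flavour of the “algorithmic container” idea, here is a heavily simplified, hypothetical plugin interface; a real engine would add risk checks, order routing, telemetry and recovery logic around the same basic shape.

```python
from abc import ABC, abstractmethod

class Strategy(ABC):
    """Interface a hypothetical container exposes to plugged-in strategies."""
    @abstractmethod
    def on_market_data(self, tick: dict) -> list:
        """Return a list of child orders to send in response to a market data tick."""

class Container:
    """Hosts strategies and mediates all their interactions with the market."""
    def __init__(self):
        self.strategies = []

    def register(self, strategy: Strategy):
        self.strategies.append(strategy)

    def on_tick(self, tick: dict):
        for s in self.strategies:
            for order in s.on_market_data(tick):
                print("route:", order)  # a real engine adds risk checks and routing

class PassivePegger(Strategy):
    """Toy strategy: rest a small bid one pip below the mid on every tick."""
    def on_market_data(self, tick: dict) -> list:
        return [{"side": "buy", "qty": 1_000_000,
                 "price": round(tick["mid"] - 0.0001, 5)}]

engine = Container()
engine.register(PassivePegger())
engine.on_tick({"mid": 1.19505})
```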
NC: This is something Commerzbank examined extensively. After consultation with our core clients, we ultimately decided that the strategic long-term solution for our business was to leverage our existing eFX infrastructure. This draws on decades of experience in global liquidity aggregation, management and optimisation, combined with our quantitative analytics. A growing challenge for many trading firms is the changing regulatory landscape. For example, MiFID II introduces algorithmic trading requirements (Article 17) which are further defined in RTS 6. These specifically target stress testing and continuity arrangements whilst further strengthening pre-trade controls, real-time monitoring and post-trade controls for firms. This is not something that can be completely passed on to a turnkey solution provider, as the regulators put the onus on each investment firm to ensure it is compliant. Our approach is to partner with our clients, allowing them to focus on their core trading and business objectives.
Information leakage can be a concern for some users of algorithms. How can the predictability of how an algorithm trades be reduced or removed?
DM: First of all, sophisticated algorithms use randomisation in timing to make sure there are no obvious predictable patterns in how liquidity is sourced – for example, not taking liquidity on an “egg timer”. Second, algorithms can use a variety of execution techniques – both passive and aggressive – again to minimise the likelihood that a pattern can be observed that’s distinct from overall market activity. Finally, algorithms can make intelligent decisions about which counterparties or venues to use in order to minimise their footprint, to the extent that’s compatible with the algorithm’s trading goals. For example, banks who provide algorithmic services to their customers can use their internalisation pools to reduce potential information leakage on certain orders.
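A minimal sketch of the randomisation point: jittering both the size and the timing of child orders so the flow doesn’t tick like an egg timer. The jitter range, slice count and interval are arbitrary choices for illustration.

```python
import random

def randomised_schedule(total_qty: float, n_slices: int,
                        mean_interval_s: float, jitter: float = 0.5):
    """Jitter both slice sizes and inter-arrival times so child orders do not
    arrive on a fixed clock; sizes are renormalised to the parent quantity."""
    weights = [random.uniform(1 - jitter, 1 + jitter) for _ in range(n_slices)]
    total_w = sum(weights)
    t, schedule = 0.0, []
    for w in weights:
        t += random.uniform((1 - jitter) * mean_interval_s,
                            (1 + jitter) * mean_interval_s)
        schedule.append((round(t, 1), total_qty * w / total_w))
    return schedule

for when, qty in randomised_schedule(10_000_000, 8, mean_interval_s=30):
    print(f"t+{when:>6.1f}s  {qty:>12,.0f}")
```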
NC: We provide a multi-tiered approach within our execution strategies, allowing clients to trade without signalling to other market participants that they are trying to establish or liquidate a large position, thus minimising information leakage. Our strategies allow corporates to define parameters whereby we randomise the frequency, size and liquidity sources. Orders touching multiple platforms may increase information leakage, so our Smart Order Router (SOR) will look to match orders internally before externalising. We use the same principles in liquidity analytics and optimisation as our trading desk, which provides a systematic process to objectively penalise a trading venue for market impact and to self-adjust as trading behaviour on the various venues changes. This impact, and the opportunity cost of rejections, are further analysed in our TCA reporting.
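The internal-match-first idea can be sketched as follows; the venue names, impact scores and liquidity figures are invented, and a production SOR is of course far richer.

```python
def route(order_qty: float, internal_liquidity: float, venue_scores: dict):
    """Fill as much as possible internally, then send the remainder to the
    external venue with the best (lowest) observed impact score."""
    internal_fill = min(order_qty, internal_liquidity)
    remainder = order_qty - internal_fill
    routes = [("internal", internal_fill)] if internal_fill > 0 else []
    if remainder > 0:
        best_venue = min(venue_scores, key=venue_scores.get)
        routes.append((best_venue, remainder))
    return routes

# Invented impact scores: higher = worse observed market impact / rejection cost.
print(route(8_000_000, internal_liquidity=5_000_000,
            venue_scores={"ECN1": 0.8, "ECN2": 1.3, "ECN3": 1.1}))
# -> [('internal', 5000000), ('ECN1', 3000000)]
```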
What factors can significantly influence the effectiveness of FX algorithms and how should these be taken into account when deciding how to employ them?
SW: The effectiveness of FX algos depends very heavily on the objectives of a user. This could include any combination of: price achieved vs a myriad of benchmarks, required speed of execution, desired level of market impact, transparency, auditability, confidentiality, and time/trade tracking.
However, the key factors to consider are: time horizon, size of trade, liquidity (inherent and time of day), and volatility.
The cost of execution on all trades increases with a shorter time frame, higher volatility, wider spreads and lower liquidity, and this needs to be factored in when accepting a risk price or configuring an algo. However, it is the ability of algorithms to trade over a period of time, and the fact that algos don’t necessarily need to cross the spread to access liquidity, that sets them apart.
Selecting the best strategy and tuning the parameters of the algo, given the trade and market conditions, can lead to significant savings and optimal outcomes, especially when combined with detailed TCA over time.
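As a rough illustration of how those factors interact, the sketch below produces a stylised pre-trade estimate from order size, average daily volume, horizon, volatility and spread. The functional forms and coefficients are illustrative assumptions, not a calibrated model.

```python
import math

def pretrade_estimate_bps(size: float, adv: float, horizon_min: float,
                          vol_bps_per_day: float, spread_bps: float):
    """Stylised pre-trade estimate: half the spread plus impact growing with
    participation, and timing risk growing with the horizon."""
    window_volume = adv * horizon_min / (24 * 60)   # volume expected in the window
    participation = size / window_volume            # our share of that volume
    expected_impact = 0.5 * spread_bps + 4.0 * math.sqrt(participation)
    timing_risk = vol_bps_per_day * math.sqrt(horizon_min / (24 * 60))
    return expected_impact, timing_risk

impact, risk = pretrade_estimate_bps(size=100e6, adv=50e9, horizon_min=30,
                                     vol_bps_per_day=60, spread_bps=1.0)
print(f"expected impact {impact:.2f} bps, 1-sigma timing risk {risk:.2f} bps")
```

Shortening the horizon raises the impact term while shrinking the timing-risk term, which is exactly the trade-off described above.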
What steps can users of FX algorithms take to monitor how well they perform against other methods of trading and in different market conditions?
FW: First and foremost, the devil is in the detail, so accurately capturing all the pertinent inputs and market factors before, during and after an execution is key to the evaluation. The ones frequently discussed are arrival price, slippage/market impact, risk price, passive fill rates, internalisation, liquidity and volatility; not rocket science, but rather the basics. Once everything is captured, in my view it still comes down to the user-specific criteria outlined above, with a particular focus on each client’s objectives and risk criteria. What’s important to you, your boss, your company or your investors? What is your combined risk factor? These questions need to be answered before any meaningful and actionable comparison or benchmarking can be made.
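For example, a few of the basic per-order metrics mentioned above can be computed from fill records along these lines; the fills and arrival mid below are invented.

```python
def tca_summary(fills, arrival_mid: float):
    """Basic post-trade metrics for a buy order: average fill price, slippage
    versus the arrival mid in pips, and the passive fill rate."""
    qty = sum(q for q, _, _ in fills)
    avg_px = sum(q * px for q, px, _ in fills) / qty
    slippage_pips = (avg_px - arrival_mid) * 1e4
    passive_rate = sum(q for q, _, passive in fills if passive) / qty
    return avg_px, slippage_pips, passive_rate

# Invented fills: (quantity, price, filled passively?)
fills = [(2e6, 1.19505, True), (3e6, 1.19511, False), (5e6, 1.19507, True)]
avg_px, slip, passive = tca_summary(fills, arrival_mid=1.19504)
print(f"avg {avg_px:.5f}, slippage {slip:+.2f} pips, passive fills {passive:.0%}")
```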
TCA has been described as more art than science. How important are pre-trade TCA estimates in algorithmic trading and what advantages are there in combining them with post trade TCA results?
SW: Pre-trade TCA provides users with insight into prevailing market execution factors as well as the expected and possible outcomes of an algorithmic execution, which of course depend on the algo itself and the parameters selected. It gives an objective, analytical and quantitative perspective based on historical data, which can significantly enhance decision making when combined with experience and an awareness of prevailing market conditions.
Post-trade TCA tells you what happened and becomes a data reference for future executions. When pre-trade and post-trade TCA differ significantly, it basically means that there were factors in the latest execution that differed from the historical norms behind the pre-trade estimates. The most common reasons for this are unexpected events, user intervention, or users trading on proprietary or superior information relative to the TCA provider. Using both pre- and post-trade TCA will help users refine and optimise both expected and evidenced outcomes.
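One simple way to operationalise the pre/post comparison is to flag orders whose realised cost deviates from the pre-trade estimate by more than a chosen number of estimated standard deviations; the order records below are invented.

```python
def flag_outliers(orders, threshold_sigmas: float = 2.0):
    """Flag orders whose realised cost deviates from the pre-trade estimate by
    more than a set number of estimated standard deviations."""
    return [o["id"] for o in orders
            if abs(o["realised_bps"] - o["estimated_bps"])
            > threshold_sigmas * o["est_sigma_bps"]]

orders = [  # invented order records
    {"id": "A1", "estimated_bps": 2.0, "est_sigma_bps": 1.5, "realised_bps": 2.8},
    {"id": "A2", "estimated_bps": 2.0, "est_sigma_bps": 1.5, "realised_bps": 7.4},
]
print(flag_outliers(orders))  # -> ['A2']: worth investigating (news? intervention?)
```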
In what ways can adaptive FX algorithms now deliver a reduction in the digital footprints and patterns they leave in the marketplace and what advantages does that have?
MG: By adapting to current market conditions, the algorithm is less likely to stand out from the noise of overall trading. Trading schedules based exclusively on historical data can be very obvious when circumstances change – this is as important intraday as it is over a longer time period. Adaptive algorithms can reflect the circumstances in the market at the time of execution – for example, UBS TAP trades according to the relative level of activity in the current market rather than a historical volume curve, meaning its behaviour more accurately reflects the circumstances at the time of trade.
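UBS TAP’s internals are proprietary, but the general idea of pacing to live activity rather than a historical curve can be sketched as a simple participation rule; the target rate and volumes here are arbitrary.

```python
def adaptive_child_qty(remaining_qty: float, observed_interval_volume: float,
                       target_participation: float = 0.1) -> float:
    """Size the next child order as a fixed fraction of the market volume actually
    observed in the last interval, rather than following a historical curve."""
    return min(target_participation * observed_interval_volume, remaining_qty)

# The algo slows down in a quiet interval and speeds up in a busy one.
print(adaptive_child_qty(20e6, observed_interval_volume=5e6))   # -> 500,000
print(adaptive_child_qty(20e6, observed_interval_volume=60e6))  # -> 6,000,000
```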
What are the cost implications of deploying FX algorithms for firms with different trading styles and volumes?
SW: Clients with frequent positions to liquidate have a competitive advantage when optimising their flows. As they send orders and systematically observe the outcomes through TCA, they are able to fine-tune their algo settings to lower their execution costs. Therefore, regardless of passive or active stances, users tend to soften their trading style as they gain experience with algos and become able to measure the value of patience. Only those who are extremely good at picking intraday highs and lows, with very strong views about levels, stick to the most aggressively tuned algos. Even then, algos give traders the opportunity to systematise and optimise their liquidity access.
Is it useful to think of FX algorithms as having a “shelf-life” and if so can combinations and linkages between algos help to extend it?
MG: Whilst an individual algorithm may not have a shelf-life, the underlying assumptions on which it is built can become less relevant over time as market conditions, microstructure and participants change. To address this, the provider needs to ensure they have a rigorous quality-control process which identifies changes in relative performance over time. In addition, algorithms built on dynamic quantitative parameters – recalibrated as the market evolves so that those values reflect current conditions – are more likely to remain relevant even as markets change.
How much potential is there for algorithmic execution of currency derivatives and less liquid emerging market currencies and are trading solutions for these now becoming available?
SW: Algorithmic execution can help systematise any electronic market, and we have already seen algorithmic execution spread to multiple EM currency pairs as the depth of electronic liquidity grows.
Those markets that are still broker based, where the main pots of liquidity are provided primarily based on human interactions, are those where algorithmic execution is unlikely to flourish. However, most markets are relentlessly edging towards electronification.
Furthermore, all the existing and pending regulations seem to be pushing markets towards a more transparent and electronic world that effectively creates the right conditions for algorithmic execution.
What further innovation in the design and construction of FX algorithms is around the corner?
MG: As the market for FX algorithms evolves, we continually increase the amount of data available to us to better refine performance, and this will drive further innovation in the space, particularly in our ability to customise the client’s experience to better reflect the unique underlying drivers of their trading. The discussions around the use of artificial intelligence currently have all the hallmarks of hype, although there is generally an underlying truth. As FX algorithms evolve, we are looking at techniques such as machine learning to further enhance their ability to respond in real time and continually refine their performance. It is important to balance this level of innovation with the need to be transparent with our clients as to how the algorithm behaves, and this can be more challenging as the complexity of the algorithms increases.
Looking ahead are the traditional reasons for using FX algorithms (reduced market impact, liquidity access, execution consistency, trader productivity etc.) likely to be overtaken by other drivers and considerations?
NC: We will continue to see regulatory scrutiny across financial markets to further drive the electronification and evolution of FX as an asset class to an even more controlled, transparent and centrally cleared environment. Algorithmic execution will be crucial for controlling information and market impact as exchange style central limit order book (CLOB) venues gain momentum in FX markets. We see the upcoming MiFID II algorithm and TCA requirements as only a precursor.
In addition to covering new products such as Non-Deliverable Forwards (NDFs) and precious metals, the future evolution of execution algorithms will see more sophisticated, market-adaptive approaches that use artificial intelligence and machine learning techniques to consistently outperform standard algorithms such as Time-Weighted Average Price (TWAP).
Our goal is to partner with our core client base and work with them to further reduce their cost base, both in terms of infrastructure and transaction costs. We will work together as the global FX market evolves.