Last week, the FIX Trading Community hosted a regional meeting called “Trading and Transparency – the FIX Solution.” It was focused on electronic trading execution transparency and transaction cost analysis. The speakers were:
- Ian Domowitz, Managing Director, Head of Analytics, ITG
- Vlad Rashkovich, Global Business Manager, Bloomberg L.P.
- Brian Lees, AVP, Trading Application Manager, Capital Group
- Phil Mackintosh, Head of Trading Strategy & Analysis, KCG Holdings
- Kevin McPartland, Head of Research, Market Structure and Technology Practice, Greenwich Associates
The process of buying and selling securities incurs both explicit and implicit costs. Explicit costs include items like commissions, taxes and fees. Implicit costs are variable and are based on the effect that trading activity has on the security. Examples include costs around implementation shortfall, delay, opportunity cost, and information leakage. Transaction cost analysis (TCA) is the process for analyzing these implicit costs.
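To make the implicit-cost idea concrete, here is a minimal sketch of an implementation shortfall calculation for a buy order. The function and its parameters are my own illustration, not anything presented by the panel; real TCA systems handle far more inputs (benchmarks, timing, opportunity cost of unfilled shares).

```python
# Minimal implementation-shortfall sketch (illustrative only; the
# function name and parameters are my own, not from the panel).

def implementation_shortfall_bps(decision_price, fills, commission_per_share=0.0):
    """Total cost of a buy order vs. the price at decision time, in basis points.

    fills: list of (price, shares) tuples, one per execution.
    """
    shares = sum(qty for _, qty in fills)
    notional = sum(price * qty for price, qty in fills)
    avg_px = notional / shares
    explicit = commission_per_share * shares       # explicit cost: commissions/fees
    implicit = (avg_px - decision_price) * shares  # implicit cost: slippage vs. decision
    total = explicit + implicit
    return 10_000 * total / (decision_price * shares)

# Example: decided to buy at $50.00, filled at slightly worse prices,
# paying $0.01/share commission -> about 8.4 bps of total cost.
cost = implementation_shortfall_bps(50.00, [(50.02, 600), (50.05, 400)], 0.01)
```

The explicit component is known up front; the implicit component only emerges from comparing fills against the decision-time price, which is why it needs analysis rather than accounting.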
The meeting panel members discussed issues around execution and routing transparency and how to obtain the information needed to effectively analyze the true cost of execution. They covered a lot of ground. Being relatively new to this topic, I found the discussion quite interesting. Here are some of my key takeaways from the discussion. Forgive me if I sound like a TCA neophyte. It’s true…I am.
(Note: The meeting was conducted under the Chatham House Rule, which means we can write about the content but cannot attribute quotes to any speaker or organization. Therefore, I have not identified the panelists quoted below.)
Venues are Ahead of Sell-Side Firms in Providing Transparency
Since the FIX TCA Working Group added tags to FIX messages to provide execution information, some buy side firms have begun gathering this data directly from the venues' execution reports. The venues varied in how quickly they added the functionality. Some were able to add the information to their FIX messages very quickly and promptly began transmitting it with execution reports. Other exchanges, particularly those using vendor gateways and/or more complex legacy systems, took longer. One of the panelists reported that today, they're receiving this data from about 90% of US equities venues and about 80% of European venues. Asian venues have a different market structure, and I didn't catch the status with those exchanges. The panelist said he's seeing much less participation from their sell side counterparties.
One of the key goals for the buy side is evaluating how much information leakage is impacting their trading results. One of the speakers commented, “If you’re on a venue routing mechanism, then you’re probably leaking information.” The speaker was talking about the issue of routing orders to brokers with the goal of getting the best price. The broker routes the order to a venue, but if the venue doesn’t have the NBBO (national best bid/offer) at the moment when it receives the order, and if the order is not set to cancel immediately, then the venue is obligated to route it to where the best price can be obtained. In this case, the market can move during the routing process, resulting in slippage and the potential for the order to bounce around between venues like a ping pong ball. The buy side has little or no insight into those routing decisions or which venue actually executed the order.
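The routing obligation described above can be sketched as a simple decision rule. This is a toy model of the behavior the speaker described, not any venue's actual logic: if the venue can't match the NBBO and the order isn't set to cancel immediately, it must route out, and that is where the slippage and ping-ponging come from.

```python
# Toy sketch of the routing decision described above (illustrative, not
# any venue's actual matching logic). Prices are (bid, ask) tuples.

def venue_action(venue_quote, nbbo, side, immediate_or_cancel):
    """Return 'execute', 'cancel', or 'route_out' for a marketable order."""
    # For a buy we care about the offer side; for a sell, the bid side.
    venue_px = venue_quote[1] if side == "buy" else venue_quote[0]
    best_px = nbbo[1] if side == "buy" else nbbo[0]
    at_nbbo = (venue_px <= best_px) if side == "buy" else (venue_px >= best_px)
    if at_nbbo:
        return "execute"
    # Venue can't match the best price: an IOC order dies here, anything
    # else must be routed to where the best price can be obtained.
    return "cancel" if immediate_or_cancel else "route_out"

# A buy order hits a venue offering 10.03 while the NBBO offer is 10.02:
venue_action((10.00, 10.03), (10.01, 10.02), "buy", immediate_or_cancel=False)
```

Each `route_out` hop is a point where the market can move and where, as the panel noted, the buy side loses visibility into what happened.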
In this case, the speaker pointed out, it’s less about transparency and more about objectives. “Sometimes, it’s more important to sweep and signal less. It’s better to focus on avoiding leakage than getting the lowest price at any moment.”
Strategy Must Be Considered In Venue Analysis
The moderator reminded the group that we can’t compare venues without considering market conditions and trading strategy. He said, “We’ve generated 20 years of collective amnesia with respect to venue analysis. In the 90’s, we realized that you can’t analyze the venue without knowing something about the strategy. For example – OMX in London used to have an 80% cancellation rate. But it was because the equities exchange was linked to the options exchange, and the strategies required an execution on the options. So when you looked at the strategies being executed, the high cancellation rate made perfect sense. It’s not just a routing decision – it’s a strategy decision.”
The panel agreed. You'd never evaluate an algorithm without taking market conditions into account, and you shouldn't do it for venues or routing decisions either. The fee structure at each venue also has an impact on routing decisions and needs to be considered in any thorough venue analysis. I'm guessing that this is part of the issue for the buy side. Are routing decisions made because of fee and rebate structures, or because that route made the best sense for the objectives of the order?
Understanding the Impact of Market Movement
One panelist said, "When we're trying to understand how the brokers are behaving on our behalf related to a given strategy, we're not looking at all the routing decisions that were made. For example, we'd like to know if when an order hits one venue, the rest of the market runs away." In this case, the buy side is trying to understand the behavior of their selected algorithm so they can predict performance and potential market impact, and adjust their orders and strategy in real time. "We didn't write the algo, but it's our order out there, and we need to know how it's behaving." Another panelist said, "We need to understand how market movement affects us."
The Current FIX Message Structure is Insufficient
The panel talked about how much data this level of transparency would require. They all agreed that the current FIX specification does not include sufficient tags to cover the information needed. A panelist said, “You don’t just need the tags we have. You’d also need print size, market cap, volatility, and spread.”
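To illustrate the gap, here is a sketch of pulling the venue-analysis fields that do exist in a FIX execution report today. Tags 30 (LastMkt) and 851 (LastLiquidityInd: 1=added, 2=removed, 3=routed to another market) are standard FIX fields; the contextual data the panelist mentions, such as print size relative to volume, market cap, volatility, and spread, is not in the message and would have to be joined from market data. The sample message fragment is hypothetical.

```python
# Sketch: extracting the TCA-relevant tags that already exist in a FIX
# execution report. The message fragment below is hypothetical.

SOH = "\x01"  # FIX field delimiter

def parse_fix(msg):
    """Parse a raw FIX message into a {tag: value} dict."""
    return dict(field.split("=", 1) for field in msg.strip(SOH).split(SOH))

# Tag 851 (LastLiquidityInd) values, per the FIX specification.
LIQUIDITY = {"1": "added", "2": "removed", "3": "routed to another market"}

# Hypothetical execution report fragment: 35=8 (ExecutionReport),
# 31=LastPx, 32=LastQty, 30=LastMkt, 851=LastLiquidityInd.
report = SOH.join(["35=8", "31=50.02", "32=600", "30=XNAS", "851=2"])
fields = parse_fix(report)
venue = fields["30"]                  # which venue printed the fill
liquidity = LIQUIDITY[fields["851"]]  # whether the fill added or removed liquidity
```

The message tells you where the fill happened and how it interacted with the book, but none of the market context needed to judge whether that was a good outcome.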
Trading Transparency Requires Massive Data
To evaluate how a sell side algo is routing an order, the buy side would also need all the routing information. But that's a huge problem to manage. A panelist from the sell side commented, "It's your data, but it's so much data that it's a huge problem to manage. A single order might have 50-60 rows of routing." Another panelist added, "If you're looking at an average institutional order, you're likely to get 200 fills. If you look at them fill by fill and get 5 bad fills, it's just noise. If you get 100 bad fills, you'll see it in other places."
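The noise-versus-signal point can be sketched as a simple aggregation rule: flag an order only when bad fills make up a meaningful fraction of the total, rather than reacting fill by fill. The thresholds here are illustrative assumptions, not figures from the discussion.

```python
# Noise vs. signal in fill-by-fill review, per the panelist's point: a
# handful of bad fills out of ~200 is noise, a large fraction is a
# pattern. Thresholds are illustrative assumptions, not from the panel.

def flag_order(fill_markouts_bps, bad_threshold_bps=-5.0, min_bad_fraction=0.25):
    """Flag an order only when enough fills look bad to suggest a pattern.

    fill_markouts_bps: per-fill markout vs. arrival price; negative = adverse.
    """
    bad = sum(1 for m in fill_markouts_bps if m < bad_threshold_bps)
    return bad / len(fill_markouts_bps) >= min_bad_fraction

# 5 bad fills out of 200 -> noise; 100 bad fills out of 200 -> flagged.
noise = flag_order([-10.0] * 5 + [0.0] * 195)
pattern = flag_order([-10.0] * 100 + [0.0] * 100)
```

This is the same intuition the panelist voiced: a few bad fills are noise, but a high proportion of them will show up everywhere.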
All agreed that firms need to understand where they can get the highest ROI before investing in getting the information. Most buy sides lack the processing power, technical expertise and budget needed to actually digest and use this information, even if it were available to them.
Why Do You Care?
The moderator asked the panel, "Why do you care? Why are you so focused on the full process?" One of the panelists responded, "If I'm a trader on the buy side, my key priority is understanding the need, for example, the urgency of the order versus the TCA objective. What is my order? Who is the best broker and how will they take it to market?" Because the buy side doesn't have the visibility they want into the broker's decision process, they put pressure on the sell side to manage transaction cost and to provide transparency. However, the panelist conceded that even having the data doesn't really deliver the desired transparency, "The market structure is so complex that it's hard to understand even if I do have it."
The algos themselves add even more complexity. Brokers are allowing substantial customization of their algos, and each trader can configure as desired. A panelist asked, “How do I disentangle the performance of the algo from the customization and the strategy the trader is trying to execute? I need to tease out what’s gone on on the broker, algo and routing side.”
Obstacles to Real-Time Transparency
While delivering data for this kind of transparency is technically possible, it's nearly impossible to actually use it to impact trading strategy on a real-time basis. It's the classic big data problem. The moderator posed an important and valid question, "Is it just a silly exercise?"
A panelist responded, “My traders have not asked for anything in real-time, so we’re evaluating the data we have access to post trade, looking to see if there are anomalies and if a router disadvantaged us.”
There is a lot of information, and the data needs to be transferred either in batch or as it’s happening. FIX is suited to the latter. The TCA Working Group sketched out what an exchange of FIX messages would look like to transfer this information in real time. For example, they could stamp routing decisions on the fill and drop-copy the actual routes. This would prevent the fills from being backed up in queues and negatively affecting the order routing. All agreed that the information would need to be asynchronous.
One of the members of the Working Group conceded, “It would take a lot of resources to analyze this in real time, assuming the broker is able to feed the data, and we’re able to consume it in real time. When we got to that point in our process, no one on the committee was interested in moving forward. There was no budget and nowhere to store the data. But at least we’ve sketched out the process.”
For now, nobody has the answer of how they can use the data and actually change their trading strategy in real time. As a panelist put it, “The practical constraints are the cost. But to my mind, the end-of-day batch files are going to deliver very little information for a venue by venue comparison. The real-time data is what will actually work on the desk to impact results. For example, if you had intra-day information about fade, you could gain more insight about a strategy that’s running.”
What Should the FIX Community Do Next?
Before he closed the session, the moderator asked the panelists, "If you could wave a magic wand and ask the FIX community to do something to improve transparency, what would it be?" Here are their responses:
“I need these tags standardized. Tag 851 mapping needs to be done. I have traders and quants breathing down my neck needing this data to be complete. There are several large buy sides saying the same thing.”
“Get all the tags for the child orders and child routes linked to the parent orders, give it to the buy side and let them go crazy. It’s probably going to call their bluff to see if they can do something with it profitably.”
“The completely unrealistic idea would be to store all the data in the cloud in one place – let everyone in to see their specific data. Won’t ever happen, but…”
“FIX needs to continue hosting these discussions. We’ll see more tags and more transparency. I’m not keen on putting everything in one place. Brokers want to protect proprietary information, and there is a hacker risk. A cloud approach is unrealistic.”
My Takeaway
I have my doubts that we'll see a lot of progress in this area in the next 2-3 years. I don't think that most members of the buy side have the budget or processing power to actually make meaningful progress with the transparency data that is already accessible. But it will be interesting to see where we get in the next 5 years, as data volumes keep growing and computer processing power advances along the lines of Moore's Law.