In the latest issue of the London Review of Books, a sociology professor at the University of Edinburgh successfully explains the history of dark pools to a general audience.
In his article “Dark Markets”, Donald MacKenzie, whose research into automated trading has been supported by the European Research Council, goes right back to Instinet setting up the Crossing Network in 1986 and the launch of Posit by ITG and Barra the following year. He traces the debates over dark pool trading through the development of block-trading venues such as Liquidnet and internalisation by investment banks, and brings the piece bang up to date with the case brought by the New York attorney general, Eric Schneiderman, against Barclays, which has denied any wrongdoing.
MacKenzie said: “Note that Schneiderman isn’t claiming that institutional investors lost money because their orders were executed in Barclays’ dark pool, which would matter to anyone, say, whose pension or savings funds were being managed by an institutional-investment firm. Is the way the firm is doing its trading imposing unnecessary costs on those funds? You can’t really tell.”
He correctly highlights that it is difficult for fund managers to judge the true quality of their executions, even if they receive reports from their brokers, because there is no public database against which they can compare their results.
MacKenzie writes about Stéphane Tyč, co-founder of the low-latency market data provider Quincy Data, who proposed a solution in a paper last September: each matching engine would be given a unique reference, and all matching engines would be time-synchronised to within ten millionths of a second of UTC, the global time standard. Anonymised trade data could then be published in a standard format, consisting of the matching engine ID, the trade ID and the timestamp, to a publicly accessible data stream.
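Tyč's paper does not prescribe an exact wire format, so the following is only a minimal sketch of what such an anonymised record could look like; the field names, the comma-separated layout and the microsecond epoch timestamp are illustrative assumptions, not part of his proposal.

```python
# A minimal sketch of the kind of anonymised trade record Tyč describes.
# Field names and layout are illustrative assumptions; the proposal only
# specifies a matching engine ID, a trade ID and a UTC timestamp
# synchronised to within ten microseconds.
from dataclasses import dataclass

@dataclass(frozen=True)
class TradeRecord:
    engine_id: int   # unique reference assigned to the matching engine
    trade_id: int    # identifier assigned to the trade by that engine
    utc_ts_us: int   # microseconds since the Unix epoch, in UTC

    def to_line(self) -> str:
        # One terse, fixed-order line per trade for a public data stream.
        return f"{self.engine_id:04d},{self.trade_id:012d},{self.utc_ts_us}"

# Example: a trade matched by engine 42 at a given UTC microsecond.
print(TradeRecord(engine_id=42, trade_id=987654321,
                  utc_ts_us=1_409_222_400_123_456).to_line())
```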
Tyč wrote about the US market but said the same proposal would work in Europe, and he responded to the European Securities and Markets Authority’s consultation paper on MiFID II, the proposed regulations covering financial markets in the region.
MiFID II has much stronger best execution requirements, but the region does not even have a consolidated tape. In addition, many of the buy-side responses to the Esma consultation complained that the cost of market data in Europe is far too high compared with the US.
However, Tyč argues that consolidated tape proposals aim to improve the price formation process by providing rapid feedback on trades, and are useful for market makers, professional traders and brokers. He said: “Our proposal is aimed at providing transparency for the end users, the money managers or individuals. Some money managers or individuals do not have the means or the interest to perform real-time analysis of trades and alter their real-time trading patterns even with the existence of a consolidated tape.”
Tyč claimed that the data produced for all the orders executed on any given day, in a terse format, would fit on a small thumb drive, and that it would be cheap to produce because all electronic matching engines already hold the necessary data.
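The claim is easy to sanity-check with back-of-the-envelope arithmetic; the figures below, an assumed 50 million US equity trades a day and 32 bytes per record, are illustrative assumptions rather than Tyč's own numbers.

```python
# A rough sanity check of the "thumb drive" claim. The daily trade count
# and bytes-per-record figures are assumed orders of magnitude, not from
# Tyč's paper.
trades_per_day = 50_000_000      # assumed daily US equity trade count
bytes_per_record = 32            # engine ID + trade ID + timestamp, tersely encoded
total_bytes = trades_per_day * bytes_per_record
print(f"{total_bytes / 1e9:.1f} GB per day")  # ~1.6 GB: fits a small thumb drive
```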
However sensible Tyč’s proposal may sound, the biggest resistance will come from making trade data accessible free of charge. This is bound to provoke opposition from exchanges, which earn an ever-growing share of their revenue from market data fees, and from vendors. Their lack of support contributed to the failure in 2012 of the COBA project, a commercial venture that tried to produce a European consolidated tape.
Tyč argues that his proposal would narrow the regulatory gap between exchanges and dark pools and help lit venues compete more fairly, but this is unlikely to win them round. I would put my money on the vested interests having to be dragged kicking and screaming into a solution imposed by Esma, rather than on them voluntarily coming up with a proposal that benefits the whole market.