At its core, this is much of what we do as a research and advisory operation: looking for clues.
Ideally, we are looking for the kinds of clues that recur as patterns, and then telling the stories those clues and patterns suggest. (Better yet, if we can devise a mechanism to systematically discover more clues and more patterns with regularity, then we will have developed something quite valuable. But, I digress…)
So, it was with great fascination that we discovered one of the next important clues: evidence of the nature of transformation in the trading and investment world, and one that is indicative of so many sympathetic movements in the broader financial industry. This is the falling of dominoes that we often refer to.
Here’s the gist: Quantitative methods are set to pervade much more of the traditional asset management community and a broader cross-section of the strategy spectrum. Likely more than expected. Reason being: Fee compression renders traditional investment processes too expensive and the lack of systematic overlays to these processes renders their performance too inconsistent. The latest major clue to this shift is that spending on managed market data infrastructure is now eclipsing the spending on traditional desktop or screen solutions (at least at one of the leading vendors).
Of course, there has been both speculation and various evidence for this assertion for a while; however, Alphacution would like to add one more piece of evidence to the mix, one that may represent a tipping point.
Here are the details:
With few exceptions, quantitative methods – often translated to mean “automated methods” – began where market structure supported high levels of automation, which then led to liquidity interactions at high turnover frequencies. After all, average position duration is the greatest source of decay in risk-adjusted returns (i.e., Sharpe ratio).
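The intuition behind that claim can be sketched under a strong simplifying assumption: if each trade is an independent bet with the same small edge, the annualized Sharpe ratio grows with the square root of the number of bets per year, so shorter holding periods support higher risk-adjusted returns, all else equal. Here is a minimal Python illustration (the per-trade edge and trade counts are hypothetical, chosen only to show the scaling):

```python
import math

def annualized_sharpe(per_trade_sharpe: float, trades_per_year: float) -> float:
    """Annualized Sharpe ratio for i.i.d. bets with a fixed per-trade edge.

    Under the (idealized) assumption that every trade is an independent
    bet with identical expected return and volatility, the annualized
    Sharpe scales with the square root of trades per year.
    """
    return per_trade_sharpe * math.sqrt(trades_per_year)

# Hypothetical: the same tiny per-trade edge (Sharpe of 0.05 per bet)
# deployed at progressively shorter position durations.
for label, trades in [("monthly", 12), ("daily", 252), ("intraday", 252 * 50)]:
    print(f"{label:>9}: annualized Sharpe ~ {annualized_sharpe(0.05, trades):.2f}")
```

This is only the idealized square-root scaling; in practice, costs, capacity constraints, and correlated bets erode the edge at higher frequencies, which is exactly the capacity dynamic described next.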
Today, the “capacity” of the highest-turnover opportunities (those that yield the highest Sharpe ratios) – which is heavily influenced by volatility plus the aggregate inventory of securities – is being consumed by the few who possess the most sophisticated technical infrastructures, tuned for the highest speeds. Because of the “winner-take-all” nature of systematically mined opportunities, this significant barrier to entry leaves all other market participants to apply quantitative methods at slower turnover frequencies (that is, longer average position durations).
But, here’s the challenge: Different sources of alpha require different tools. As position duration lengthens – particularly, moving from intraday to daily and beyond – new factors related to underlying fundamentals and portfolio construction come into play. And, this development heavily influences how market participants consume market data (including an expanding array of non-market and new alternative data sources). In other words, just because an asset manager is not competing on speed does not necessarily mean that manager does not need high-performance tools.
So, as the portfolio of data sources expands from “relatively simple” and homogeneous market data to include a vast and diverse array of additional sources, so too comes the requirement for increasingly sophisticated data management infrastructure; the kind that most firms will not want to manage on a proprietary basis.
We can see this needle moving as follows: After years of loud noises about the decline of screens – driven by the decline of traditional “users” – and, conversely, the rise of managed services for market data infrastructure to replace expensive and brittle proprietary builds, Thomson Reuters’ “platform” finally, and without much noise at all, overtook its screens on a revenue basis during mid-2017. Note: In the charts below, Eikon represents desktops or screens and Elektron represents market data infrastructure, or platform.
Now, this is one piece of a larger story involving several other solution providers and various segments of the broad and global asset management community, which is beyond the scope of this short post. Suffice to say that each solution provider tends to have its “wheelhouse” of clients, typically segmented by asset class, region and/or role, such as financial advisors vs. hedge funds.
Our read of these tea leaves is that the natural buyer of “screens” – like the wealth management community – is moving to cheaper solutions while the natural buyers of “platform” – like sophisticated banks, asset managers and hedge funds – are moving to managed solutions and other supply chain configurations.
Also, there is a broader story here about the nature and growth of platforms. Alphacution’s recently published post – Rise of the Platform – covers additional territory on the topic, as does our recent case study on the IT outsourcing deal between Deutsche Bank and HPE.
In short: Alphacution expects that most large market participants will eventually outsource to managed infrastructure solutions, if they haven’t done so already, because the expense of proprietary infrastructure management will no longer deliver commensurate incremental benefits (except in rare cases).
Beyond this evidence, we suspect that the pace of additional clues will continue to increase. Stay tuned…
As always, if you value this work: Like it, share it, comment on it – or discuss it amongst your colleagues – and then send us your feedback at firstname.lastname@example.org.
As our “feedback loop” becomes more vibrant – given input from clients and other members of our network, especially around new questions to be answered – the value of this work will accelerate.
Don’t be shy…