The Dawn of Operational Alpha

They say it is always darkest before the dawn. What they don’t say, however, is that no one is up and working their asses off before the first light of a new day…

By many accounts, the concept of operational alpha has been around for years. I first heard the term in the context of Citadel’s launch of its eponymous mid- and back-office technology externalization effort, Citadel Solutions – later known as Omnium. Citadel even went so far as to trademark the term operational alpha. Those were the exuberant days of late 2006, and the timing for mainstream appreciation of such a seemingly hair-splitting concept was not particularly hospitable.

Not long after – during the immediate aftermath of the GFC – Till Guldimann, then Vice Chairman of SunGard, was loudly promoting his vision for operational analytics: a new category of data designed to help asset managers run their businesses better and respond to market shifts with greater agility. The timing was improving, but now market operators were far too distracted – by unprecedented dislocations and the specter of regulatory interventions – to focus on it. Plus, so much of the playing field, rules and gear was set to change dramatically.

Flash forward to the spring of 2016 and an analyst day meeting for the data products group of the Depository Trust & Clearing Corporation (DTCC), where part of the discussion focused on the introduction of a new category (for them) of data products called – wait for it – operational analytics. Launched off the back of Omgeo’s operational performance benchmark offering for broker-dealers and investment managers, this was a first response to the growing sentiment that this was a category of interest – an intuition of a need for future development. As with their recent announcement on the maturity of cloud offerings, you know that when the DTCC gets on board, mainstream adoption cannot be far behind.

These signals are why we are just now emerging, after a decade of incubation, into the dawn of operational alpha. We are at a tipping point. And, players of all stripes – sell-side and buy-side, traditional or alternative – actually need to wake up and smell the coffee.

Here’s more evidence for the case:

Those who are “up” before this particularly special “dawn” are – by endowment of vision or by the nature of their position in the ecosystem – intensely focused on processing. This is why the message seems to be coming exclusively from the vendor community – and a few visionaries from the edges of the landscape. These are folks who need to design for task or process effectiveness before their competitors do.

But, let’s face it: Talk of processing is boring. By my estimation, this is an artifact of trading culture. After all, the glory and fanfare of financial football-spiking and end-zone dancing can only be found in the front office – or so the thinking goes. Nothing sexy and no windfalls are ever found after the trade – or in the mundane trenches of getting to and making the trade…

And yet, by our estimation, leadership – as measured by persistent outperformance – is a direct result of maniacal sensitivity to processing. It is this sensitivity to the value of efficient and agile processing that allows leading firms to successfully navigate whatever uncertainties and opportunities lie ahead.

Bottom line: Perceptions are changing. In fact, Alphacution published a post entitled “Back to the Front: Post-Trade Processing Becoming Sexy-er” in September 2016. It was our most popular post of the year…

Many financial firms are being forced to contemplate the mechanisms for achieving operational alpha up and down the workflows, front to back. Gone are the days when fat margins and fatter profits disguised the unintended impacts of weak processing. In many ways, this is what digital transformation is all about.

The various stages of the evolving quant – now, data science – revolution get to the heart of the matter: In pursuit of the highest-Sharpe-ratio strategies, some asset managers – typically of the alternative or prop shop variety – focused on signal processing, a formerly manual component of trade workflows.

As quants and data scientists continue to rise in authority (within their respective trading firms – or by starting new ones), and as their sensitivity to processing permeates the culture of what it now means to be an asset manager, a broker or a bank, needs and skills are converging to bring the concept of operational analytics into common discourse.

The problem, however, is that operational data – the stuff that describes the range of outcomes from each task in a workflow, whatever that task’s position in the workflow, and which represents the precursor to operational analytics – cannot be squeezed from the end of a tube like a finely curated delicacy. This data is often far too cumbersome to collect and assemble without material disruption to the existing process, and it is difficult to normalize measurements over time or across workflows on a proprietary basis.

The better place to start is within your vendor solutions. Many of these solutions – particularly post-trade solutions where the least amount of attention to efficiency has been paid – are natural repositories for data about processing. Now, truth be told, the vendor community these days is none too comfortable walking further out onto brittle limbs. (And, post-trade solution vendors are an even more cautious group than the whole.)

So, here’s some advice for market participants in need of turning the right screws when it comes to process efficiency: Your supply-chain counterparts are not going to build the solutions you need unless you make some noise about it. Tell them. In fact, go find that dusty football that you haven’t spiked in a long time – and chuck it over to them. You will thank us later, when the dawn turns to something far more turbulent – and your brittle workflows have been upgraded to perform on target with agility.

Alphacution can help along the way, particularly when new data starts to flow and we get a better shot at measuring and benchmarking these impacts. Once that happens, you have a feedback loop. And, when you have a feedback loop, you possess the tools to navigate intelligently, much like operating a finely tuned trading system. I suppose that’s why they call it operational alpha.

May 25th, 2017 | Alphacution Feed

About the Author:

Paul Rowady is the Director of Research for Alphacution Research Conservatory, the first digitally-oriented research and strategic advisory business model focused on providing data, analytics and technical infrastructure intelligence within the financial services industry. He has 28 years of senior-level research, risk, technology, capital markets and proprietary trading experience. Contact: paul@alphacution.com; Follow: @alphacution.
