Originally published on the Thomas Murray website on August 20, 2018
Conversation with Paul Rowady, Director of Research, Alphacution Research Conservatory
What is Alphacution?
Alphacution is the first digitally-oriented research and strategic advisory platform focused on developing a centralized “market intelligence asset” by modeling, measuring, and benchmarking technology spending patterns, and the operational impacts of those investment decisions. The Alphacution platform is specifically designed to deliver an empirical and quantitatively-backed perspective on the effect of financial services industry spending for an institutional client network.
This mission starts by modeling individual companies – banks, asset managers, hedge funds, solution providers, and others – using publicly available data. Our model library currently consists of more than 250 such companies. From there, sector composite models – like global banking, IT services, and asset management – are developed by aggregating individual models. Ultimately, sector composite models come together to represent the full view of the largest parts of the financial services industry.
Where data on technology spending and other operational dynamics are not available, we can leverage observable data to improve the credibility of estimation. As Alphacution grows the comprehensiveness of its model library and diversity of data sourcing, we improve the contextual power of our platform, and, therefore, the credibility and accuracy of its estimation capabilities.
In other words, Alphacution is focused on measuring which institutions are getting the most payback, and why that is.
What are the most prominent shifts that your research methodology has discovered or validated so far?
Since its launch in mid-2015, Alphacution has developed three core studies focused on composite modeling of sector technology spending factors – and, what those patterns suggest about organizational transformation among the various players.
These studies include global banking (based on 58 large global banks), IT services (based on 27 global providers), and asset management (based on 158 asset managers, hedge funds, proxies and others related to asset management). As we update individual and composite models, we are able to harvest intelligence in greater detail. We always learn something new, and often much of it unexpected. For instance, the upcoming series of modeling will provide more detail on the state of workflow automation (such as front-, middle-, and back-office technology investment patterns) as well as business model segmentation (such as themes specific to retail banking, investment banking or custody banking, among others).
In terms of prominent shifts, post-Global Financial Crisis (GFC) regulatory requirements made the need for new software the primary driver. Alphacution’s modeling shows how ballooning software budget needs “crowded out” most other infrastructure spending, causing an expedited tipping point to cloud adoption, more urgent rationalization of fragmented and legacy infrastructure, and some dramatic moves to IT services and other managed solution offerings. Thereafter, we have witnessed steadily increasing demand for IT human capital to develop new regulatorily-responsive software on proprietary, outsourced and other third-party bases.
Thanks to the benchmarking methodology, which uses headcount to normalize absolute spending across entities of significantly different scale and then compares those measurements over time (usually starting in 2005), Alphacution is able to showcase numerous common operational tendencies and “technical signatures” among a community of players, like banks. We are able to see how most retail-centric banks spend a strikingly similar amount on hardware and software. These players tend to move in packs. We are able to see how Asian banks continued to thrive while EMEA and Americas banks’ technology spending stalled (on a per employee basis) in the post-GFC period. And, we are able to benchmark the leaders and laggards in the banking sector in terms of “technical leverage,” a measure of the return on technology that allows us to rank a community of banks based on the spread between revenue and technology spending.
When we add some of the intelligence harvested from our recent asset management study, Alphacution is, for instance, able to showcase a new analytic, “AuX/e”, which is the level of assets under management or under administration per employee. So, when we compare assets under management per employee among a group of competitors, we can see which players’ processing is able to handle which level of assets per employee over time – which then allows us to develop new modeling hypotheses to discover the drivers behind these variances.
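The AuX/e analytic described above is, at its core, a per-employee normalization. A minimal sketch of the idea follows; the firm names, asset figures, and headcounts are entirely hypothetical, used only to illustrate how the comparison across competitors would work:

```python
# Illustrative sketch of the "AuX/e" analytic: assets under management
# (or under administration) divided by headcount, so that firms of very
# different scale can be compared on a per-employee basis.
# All figures below are made up for illustration.

def aux_per_employee(assets: float, headcount: int) -> float:
    """Assets under management/administration per employee."""
    return assets / headcount

# Hypothetical competitors: (assets in USD, headcount)
firms = {
    "Firm A": (500_000_000_000, 1_000),
    "Firm B": (300_000_000_000, 400),
}

for name, (assets, staff) in firms.items():
    print(f"{name}: {aux_per_employee(assets, staff):,.0f} per employee")
```

Tracking this ratio over successive periods, rather than at a single point, is what surfaces the variances between players that drive new modeling hypotheses.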
As you can see, just because we start with a question like “What do banks spend on technology?” doesn’t necessarily mean that this is all we intend to learn. Measuring technology spending is a great doorway through which to enter the world of operational analytics, a field of study that we believe may be entering its golden age.
Define “Return on Technology.” And, can you detect changes in productivity or process efficiency with this analytic?
Alphacution defines Return on Technology (RoT) as the difference between revenue and technology spending, divided by headcount, per period. As long as the data inputs line up, RoT can be useful at the enterprise level or business unit level. And, since we have normalized for scale (by headcount), RoT can be benchmarked among groups of banks or other members of the ecosystem.
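The RoT definition above can be sketched in a few lines of code. The bank names, revenue figures, technology budgets, and headcounts below are hypothetical, chosen only to show how headcount normalization lets entities of very different scale be benchmarked against each other:

```python
# Minimal sketch of the Return on Technology (RoT) analytic as defined
# in the text: (revenue - technology spend) / headcount, per period.
# All figures are illustrative, not real bank data.

def return_on_technology(revenue: float, tech_spend: float, headcount: int) -> float:
    """Spread between performance and its technology cost, per employee."""
    return (revenue - tech_spend) / headcount

# Two hypothetical banks of very different scale, one period each
bank_a = return_on_technology(revenue=20_000_000_000,
                              tech_spend=3_000_000_000,
                              headcount=50_000)
bank_b = return_on_technology(revenue=8_000_000_000,
                              tech_spend=1_000_000_000,
                              headcount=20_000)

print(f"Bank A RoT: {bank_a:,.0f}")  # 340,000 per employee
print(f"Bank B RoT: {bank_b:,.0f}")  # 350,000 per employee
```

Note that the smaller bank in this toy example ranks higher once spending is normalized by headcount; comparing the raw spreads alone would have told the opposite story.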
In essence, this analytic highlights the spread between “performance” and the “cost of that performance” from a technology perspective. In subsequent iterations of our modeling efforts, Alphacution will be able to index workflow automation levels, thereby better balancing human capital and technology in the cost of performance component of this analytic.
Furthermore, Alphacution has begun to demonstrate that benchmarking RoT yields quantitative insights into the pace of adoption of automation tools and methods and allows us to visualize the stuff of the FinTech Revolution as it progresses. The challenge, however, is that RoT is influenced by factors in addition to technology spending (and automation), namely exogenous drivers that cause volatility in revenue plus changes in headcount. We mitigate this by benchmarking RoT over time and with a large sample of banks, so that we can detect and quantify overall sector changes in process efficiency while still being faced with the challenges of credibly quantifying individual bank improvements in process efficiency.
Enhancements are on the way. As we have demonstrated in the asset management sector modeling, when we supplement the RoT analysis, we can strengthen the credibility and accuracy of process efficiency estimation. So, as with all of our research so far, we always expect to improve on prior modeling, analysis and tools. Because there are few other tools available to quantify such transformations, even the early versions of Alphacution’s platform output demonstrate value to clients.
So, if we boil all of this down, Alphacution’s methodology and platform are becoming increasingly accurate at measuring the FinTech payback with all of its new artificial intelligence, blockchain, and cloud-based “toys”. Furthermore, because we have been unusually collaborative and transparent with most of our peers and competitors, we have not discovered another platform that is likely to provide a similar level of “navigational intelligence” on the critical areas of productivity gains and return on these very large sums being spent.
What is your opinion on the impact of distributed ledger technology (DLT)?
Distributed ledger technology has enjoyed, perhaps, an extra level of hype recently due to the cryptocurrency development that is one visible application of it. In brief, I remain skeptical on the long-term potential for the “creation of decentralized trust” offered by cryptocurrencies to fully, or even partially, disintermediate the “centralized trust” provided by the current monetary framework, with the possible exception of new hybridized solutions that offer a cryptocurrency pegged to the US dollar.
On the other hand, I am much more optimistic about the potential for DLT to improve current administrative burdens, costs and efficiencies for various transaction processing. For most institutions responsible for transaction processing today, any transition to DLT-based processing will be slow and painful.
In Chapter 5 of its recently published Annual Economic Report – “Cryptocurrencies: Looking Beyond the Hype” – the Swiss-based Bank for International Settlements (BIS) corroborates these sentiments with one of the more thorough and sober analyses of the utility of cryptocurrencies of late.
And, despite a rather gloomy assessment (which an article in the 18 June edition of the UK’s Telegraph calls “the final authoritative nail in the coffin”), the BIS report does shed some positive light on the applicability of crypto’s underlying blockchain technology to the reduction of various costs for transaction processing.
What advice do you have for a board of directors on their technology budgeting and strategy over the short- to medium-term?
Overall, thorough preparation for persistent change is critical. To me, the concept of transformation implies a common version of change management. However, when it comes to digital transformation, I interpret the qualifier “digital” to imply a radical version of change management. So, the preparation for a more radical pace of change is essential before making big decisions about adoption of new technologies and technically-biased shifts in human capital skills mix. All that said, I fully understand that new regulations and swift responses to new competitive threats often do not allow for the most thoughtful preparations, and so my overarching advice here is to resist the urge to be hasty. The mantra here is: “Slow is smooth; smooth is fast…”
My next recommendation to those in authority to influence technology strategy and budgets in financial institutions is to rediscover and reaffirm the company’s value proposition – which those of us that come from proprietary trading backgrounds often call “special sauce.” What is your special sauce? Chances are that if this exercise has not been performed in a while it will have changed. Due to innovations and competitive forces, many of the activities (like proprietary infrastructure management or proprietary solution development) that used to provide competitive advantage have since become democratized or commoditized. In other words, the bank or other kind of institution will need to draw a new line between that which it develops and manages itself and that which it outsources. This exercise will help decision-makers reimagine and redesign supply chains, which may be a relatively new concept for financial services, but one that is quite mature in, for example, manufacturing circles.
My third recommendation is to enhance the bank’s portfolio of operational analytics. Sometimes known as key performance indicators (KPIs), operational analytics are the metrics that measure and monitor the health and efficiency of various tasks, workflows and business units. A word of warning: for workflows that are biased to manual tasks, developing, collecting, and maintaining the consistency of KPIs will be challenging, and may in fact degrade somewhat the current efficiency of that workflow. Expect this and plow through it, because with a more detailed suite of operational analytics, you will have established a base-case “statistical footprint” of existing workflows. And, as the bank adopts new tools and methods, it will have a much clearer (and quantitative) measure of improving process efficiencies.
This leads us to my final recommendation: identify business segments, workflows, or segments of these where experimentation with the new fintech toolkit can take place with minimal disruption to core functionality. Now, for banks, technology strategy is often segmented between that which is intended to “run the bank” and that which is intended to “change the bank,” with new solutions assumed to be adopted within this latter segment. However, the new normal pace of change places significant pressures on this approach. It’s simply too challenging to stay up to date with the latest and greatest. So, what I am suggesting here is a third category which, in the case of many large banks today, has already been established but may not be viewed as such: an in-house innovation laboratory. Exchanges and market infrastructures have taken such a step. In partnership with independent advisors, the innovation laboratory is tasked with maintaining an inventory of current knowledge about the tools, technologies and methods that are likely to be most important to maximizing the enterprise’s value proposition, which had hopefully been renewed at the outset. The innovation lab can operate somewhat like a venture capital firm, but make sure it also places priority on maintaining a “technology intelligence asset,” too…
Note from the website editor: more information on estimating the value of FinTech can be found on https://alphacution.com
The author, Thomas Krantz, is Senior Advisor, Capital markets, in the firm of Thomas Murray; and served as Secretary General of the World Federation of Exchanges (2000-2012). The views expressed are his own, and not necessarily those of the firm.