There’s no doubt that starting a career in financial services during the Global Financial Crisis has greatly influenced who I am today.
How did you get into quant investing?
Arriving at MIT, I had no financial experience whatsoever – and I never really expected to go into the financial industry. Before Loomis Sayles, I spent most of my formative years at buy-side global macro hedge funds, so my approach to investing is really an amalgamation of what I learned from some of the best quants in the business.
Your first foray into financial services after MIT was on the mortgage trading desk of a global investment bank in 2007; what was that experience like?
As I’d had no real financial education, the bank sent me from New York to Stamford for two months of finance 101 training. By the time I returned, they had whittled the 400 traders down to about 60. So the mood was extremely depressing.
The cutback in traders increased my responsibilities quite significantly. Over the next nine months there, I learned some important lessons on tail risk, and that it is more important to protect your downside than to miss out on a few basis points of upside potential.
It also taught me to model distributions as ‘non-normal’, something people typically avoid because, first, non-normal distributions are much harder to generalize and, second, most algorithms are written for normal distributions. So, although it was a difficult period emotionally, being in New York during the Global Financial Crisis played a huge role in how I think about protecting the downside.
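The intuition behind the non-normal point can be sketched in a few lines. The snippet below is a purely hypothetical illustration (the distributions and numbers are invented, not the models discussed here): it simulates fat-tailed daily returns and compares the empirical frequency of a −5% day with what a normal distribution fitted to the same data would predict.

```python
# Hypothetical illustration: fat tails make a normal fit badly understate
# the chance of a large loss. Simulate Student-t returns and compare the
# empirical frequency of a -5% day with the fitted-normal prediction.
from math import erfc, sqrt

import numpy as np

rng = np.random.default_rng(0)
# Fat-tailed daily returns: Student-t with 3 degrees of freedom, ~1% scale
returns = rng.standard_t(df=3, size=100_000) * 0.01

mu, sigma = returns.mean(), returns.std()

# Empirical frequency of a -5% (or worse) day
empirical = (returns <= -0.05).mean()

# What a normal distribution with the same mean/std predicts for that tail:
# P(X <= -0.05) = 0.5 * erfc((mu + 0.05) / (sigma * sqrt(2)))
normal_pred = 0.5 * erfc((mu + 0.05) / (sigma * sqrt(2)))

print(f"empirical tail: {empirical:.4f}, normal prediction: {normal_pred:.4f}")
```

With these invented inputs, the empirical tail frequency comes out several times larger than the normal model predicts — exactly the downside a Gaussian assumption hides.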
Where did your career go from there?
I was asked to join Loomis Sayles in 2011 to lead the build-out of quant commodities, FX [foreign exchange] and rates strategies. The team I joined was very diverse in terms of having both fundamental and quantitative backgrounds. My boss and the head of our team, Kevin [Kearns], constantly reminded us – and still does – that it didn’t matter how we produced alpha, just as long as we concentrated on capital preservation and managed downside risk.
For the first few years at Loomis Sayles, I managed hedge fund portfolios in the commodities and global macro space and provided support to a credit long-short fund. Then, about four years ago, the CIO, Jae Park, the deputy CIO, David Waldman, and Kevin approached me about factor-based investing, which was happening in equities at the time but not in fixed income. They asked me to build a team and develop a multi-asset systematic and alternative risk premia unit.
It must be difficult to assemble a quantitative investment team from scratch. How did you go about it?
The culture of our team and the way we work with each other reflects how my professor Alan Edelman used to run our research meetings at MIT. He had a singular focus on innovation, with no concept of seniority or hierarchy within the team. Everyone I work with has great ideas, especially those without much financial industry experience. They are often the people who come up with the most creative solutions to problems.
We have Koushik Balasubramanian, a theoretical physicist, who thinks mainly in terms of regimes and brings astrophysical concepts to problem solving. Chetan Shinde is a materials science engineer who previously did a lot of work on DNA and protein folding. And then there’s Diqing Wu, who is like a human supercomputer. This diversity of thought has allowed us to build products and strategies that are clearly differentiated in the marketplace.
Do your backgrounds really help you to build better models?
Our strategies are, for want of a better word, ‘quantamental’ – we marry the fundamentals with our quantitative techniques. Kevin is a big believer that 70% of alpha is made by identifying top-down factors, so we essentially use probabilities of a crisis to forecast alpha and beta. Machine learning, together with concepts drawn from various engineering fields, allows us to do that with integrity.
How have you seen AI and machine learning techniques develop over time?
Think about trying to predict oil prices 10 years ago. That was an exercise based on three factors and fairly straightforward. Today, we have almost 100,000 data points to consider when thinking about oil prices. This is impossible to do without quantitative tools including machine learning and AI, which will become increasingly helpful.
Now it’s true that some, maybe even most, of the insights from the 100,000 data points are probably useless. But you will only know that by testing them diligently. And if you identify even 100 relevant data sets within the 100,000, that alone could revolutionize investing.
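The screening idea – most candidate signals are noise, and only diligent testing reveals the few that matter – can be sketched as follows. This is a scaled-down, hypothetical illustration (2,000 candidates rather than 100,000, with all data invented), not the team’s actual process:

```python
# Hypothetical sketch: bury a handful of true signals among thousands of
# noise features, then recover them by ranking absolute correlation with
# the target.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_features, n_true = 500, 2000, 5  # scaled down from 100,000 for speed

X = rng.standard_normal((n_obs, n_features))
# The target depends on the first n_true features only; the rest are noise
y = X[:, :n_true].sum(axis=1) + rng.standard_normal(n_obs)

# Rank every candidate by absolute correlation with the target
corr = np.abs(np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)]))
top = np.argsort(corr)[::-1][:n_true]
print(sorted(int(j) for j in top))  # → [0, 1, 2, 3, 4]: the true signals rise to the top
```

Even a crude screen like this separates signal from noise here because the true correlations (~0.4) dwarf what chance produces among 2,000 noise features (~0.17 at most); real market data is far noisier, which is why the testing has to be diligent.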
The members of my team are heavy users of machine learning in all facets of investing. They use it all the way from regime identification to alpha generation. We also use robotics for production automation and we do a lot of natural language processing and sentiment analysis across most asset classes.
How difficult is it to build contrarian algorithms or differentiated strategies?
What we believe in, and what we have been doing for the last few years, is recreating the algorithm itself. For that, you need to have a theoretical understanding of the loss functions, how the algorithm works and the optimisers needed to produce results. And, actually, this approach allows us to do things differently.
Why is recreating algorithms necessary for a successful quantitative strategy?
I’m not saying that alpha is useless, but when algorithms get crowded, investors want to find non-consensus alpha opportunities. That’s why we are attempting to create hundreds of thousands of alphas and use machine learning and AI to essentially pick out the alphas that work in each regime. We believe this is a better use of all the outputs. We don’t discard any alpha; rather, we hope the algorithm can find the right tail for each particular regime.
If quants are all using the same data sets, does this increase the risk of flash crashes or particularly violent market moves?
The vast majority of the time, though, these high-frequency trading algorithms have helped to provide liquidity to the market. So instead of spending a basis point buying a single share of an equity, you’re spending significantly less thanks to the liquidity being provided.
Are execution speed and trading costs important in systematic investing?
What’s vital is the technology and infrastructure. Things like co-location, where you are putting servers close to some of these exchanges, become very important.
Having said that, every firm has limited resources and needs to decide in which baskets to put its eggs. Trading is extremely expensive, which is why many firms outsource some of their trading operations to algorithms or investment banks.
What advice would you give to future science, technology, engineering and mathematics (STEM) graduates considering a career in financial services?
However, I learned that everything in finance is non-linear and that the problems finance needs to solve are actually very complex – often simply because the data is noisy and timing is imperative.
In my first attempts at investing on my own, I think I lost nearly everything. Yet the thing is, losing teaches you a lot about how to build better portfolios.
I learned to evaluate the key underlying components of asset allocation, how correlations can all go to one, and why different regimes are important. These experiences moulded me into the investor that I am today.
When we build strategies, it is always with the intention of providing our clients with security in all markets, even in the event of another global financial crisis.
There’s no doubt that starting a career in financial services during the Global Financial Crisis has greatly influenced who I am today, how I have assembled my team and how I’ve contributed to building quantitative strategies at Loomis Sayles.