How did you get into quant investing?
My entire background is quantitative. I graduated from MIT [Massachusetts Institute of Technology] in 2007 with a master's degree in applied mathematics, focused primarily on advanced optimization, numerical methods, machine learning and supercomputing technologies. I have been a big fan of maths since childhood, and that's essentially what made me switch from engineering, which I studied back in India, to pure mathematics.
Arriving at MIT, I had no financial experience whatsoever – and I never really expected to go into the financial industry. Before Loomis Sayles, I spent most of my formative years at buy-side global macro hedge funds, so my approach to investing is really an amalgamation of what I learned from some of the best quants in the business.
Your first foray into financial services after MIT was on the mortgage trading desk of a global investment bank in 2007; what was that experience like?
Honestly, in a word, ‘depressing’. When I joined in the summer of 2007, it followed a trading internship that I really enjoyed. At that time, there were about 400 traders on the mortgage desk.
As I’d had no real financial education, the bank sent me from New York to Stamford for two months of finance 101 training. By the time I returned, they had whittled the 400 traders down to about 60. So the mood was extremely depressing.
The cutback in traders increased my responsibilities quite significantly. Over the next nine months there, I learned some important lessons on tail risk, and that it is more important to protect your downside than to miss out on a few basis points of upside potential.
It also taught me to model distributions as ‘non-normal’, which people typically avoid because, first, they are much harder to generalize and, second, most algorithms are written for normal distributions. So, although it was a difficult period emotionally, being in New York during the Global Financial Crisis played a huge role in how I think about protecting downside.
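The practical stakes of that modelling choice are easy to demonstrate. Below is a minimal sketch, using synthetic fat-tailed data and illustrative parameters only (not the models described in this interview), comparing how a normal fit and a Student-t fit assess the probability of the same extreme down move:

```python
import numpy as np
from scipy import stats

# Synthetic daily returns with fat tails (Student-t, 3 degrees of freedom).
# Real return series are messier, but the point is the same.
returns = stats.t.rvs(df=3, scale=0.01, size=5000, random_state=42)

# Fit both a normal and a Student-t distribution to the same sample.
mu, sigma = stats.norm.fit(returns)
df_t, loc_t, scale_t = stats.t.fit(returns)

# Probability of a worse-than-5-sigma down day under each fitted model.
threshold = mu - 5 * sigma
p_normal = stats.norm.cdf(threshold, mu, sigma)
p_t = stats.t.cdf(threshold, df_t, loc_t, scale_t)

print(f"P(loss beyond 5 sigma), normal model:    {p_normal:.2e}")
print(f"P(loss beyond 5 sigma), Student-t model: {p_t:.2e}")
```

The normal fit treats a five-sigma loss as a near impossibility, while the t fit assigns it a materially higher probability from the very same data, which is why assuming normality systematically understates left-tail risk.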
Where did your career go from there?
Well, I have been lucky to have very smart and diverse mentors, ranging from a former Wharton professor and former Chief Risk Officer at Goldman Sachs to a doctor of nuclear physics and former CIO of Enterprise Risk Management at AIG. They were all instrumental in teaching me the fundamentals of asset allocation and how to look at data-driven strategies. Given the fall-out from the Global Financial Crisis, many of the discussions I had early in my career were around how to get the best risk-adjusted return while paying attention to the left tail of these distributions – something I learned first-hand during the infamous ‘Black Swan’ event.
I was asked to join Loomis Sayles in 2011 to lead the build-out of quant commodities, FX [foreign exchange] and rates strategies. The team I joined was very diverse in terms of having both fundamental and quantitative backgrounds. My boss and the head of our team, Kevin [Kearns], constantly reminded us – and still does – that it didn’t matter how we produced alpha, just as long as we concentrated on capital preservation and managed downside risk.
For the first few years at Loomis Sayles, I managed hedge fund portfolios in the commodities and global macro space and provided support to a credit long-short fund. Then, about four years ago, the CIO, Jae Park, the deputy CIO, David Waldman, and Kevin approached me about factor-based investing, which was happening in equities at the time but not in fixed income. They asked me to build a team and develop a multi-asset systematic and alternative risk premia unit.
It must be difficult to assemble a quantitative investment team from scratch. How did you go about it?
I'm a big believer in diversity so I take a lot of pride in the team I've built, which is made up of intelligent and creative problem-solvers. I try to hire people with the least amount of overlap, either in educational or financial background. It’s the same approach we take when we try to build strategies that are uncorrelated to each other. Together, we’ve developed a business that is really unique.
The culture of our team and the way we work with each other reflect how my professor Alan Edelman used to run our research meetings at MIT. He had a singular focus on innovation, with no concept of seniority or hierarchy within the team. Everyone I work with has great ideas, especially those without much financial industry experience. These are often the people who come up with the most creative solutions to problems.
We have Koushik Balasubramanian, a theoretical physicist, who thinks mainly in terms of regimes and brings astrophysical concepts to problem solving. Chetan Shinde is a material science engineer who previously did a lot of work on DNA and protein folding. And then there is Diqing Wu, who is like a human supercomputer. This diversity of thought has allowed us to build products and strategies that are clearly differentiated in the marketplace.
Do your backgrounds really help you to build better models?
I think the fact that all of us have engineering or scientific backgrounds makes a huge difference in creating mathematical models. When we think of building a model, the first thing that comes to mind is how to do so with appropriate integrity and robustness. We're not really looking to build strategies that are overly fine-tuned to achieve a Sharpe ratio of two or three in back-tests. Instead, our focus is on building robust strategies that will perform in reality, in sync with our simulations. Think of it this way: if an engineer building a skyscraper made even one small miscalculation, the consequences could be dire. We think about our models the same way.
Our strategies are, for want of a better word, ‘quantamental’ – we marry the fundamentals with our quantitative techniques. Kevin is a big believer that 70% of alpha is made by identifying top-down factors, so we essentially use probabilities of a crisis to forecast alpha and beta. The machine learning and concepts that derive from various engineering fields allow us to do that with integrity.
How have you seen AI and machine learning techniques develop over time?
Well, the amount of alternative data available has changed the face of finance. We know that human minds are only able to think in three dimensions; information becomes incomprehensible beyond those dimensions.
Think about trying to predict oil prices 10 years ago. That was an exercise based on three factors and fairly straightforward. Today, we have almost 100,000 data points to consider when thinking about oil prices. This is impossible to do without quantitative tools including machine learning and AI, which will become increasingly helpful.
Now, it's true that some, maybe even most, of the insights from those 100,000 data points are probably useless. But you will only know that when you test them diligently. And even if you identify only 100 relevant data sets within the 100,000, that is something that could revolutionize investing.
The members of my team are heavy users of machine learning in all facets of investing. They use it all the way from regime identification to alpha generation. We also use robotics for production automation and we do a lot of natural language processing and sentiment analysis across most asset classes.
How difficult is it to build contrarian algorithms or differentiated strategies?
You cannot simply reuse an algorithm invented to automate mundane human tasks. Think about the Roomba, the robotic vacuum cleaner – it knows exactly what to do over and over again, without having to adjust its operation to an unforeseen regime change. With financial data, there's a regime change every week. Take the coronavirus, for example: it completely and suddenly changed the marketplace regime. Financial data is predominantly non-stationary, and using structured algorithms on such datasets doesn't give investors any relevant insight or perspective.
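A toy illustration of that non-stationarity, with entirely synthetic returns and made-up regime parameters: a volatility estimate calibrated in a calm regime becomes badly miscalibrated the moment the regime shifts, which is exactly why a fixed, Roomba-style algorithm breaks on financial data.

```python
import numpy as np

# Two synthetic regimes: a calm period followed by a crisis period.
rng = np.random.default_rng(0)
calm = rng.normal(0.0005, 0.005, 250)    # low-volatility regime
crisis = rng.normal(-0.002, 0.03, 250)   # high-volatility regime
returns = np.concatenate([calm, crisis])

# Rolling volatility over a fixed 60-day window.
window = 60
rolling_vol = np.array([returns[max(0, i - window):i].std()
                        for i in range(1, len(returns) + 1)])

# A model calibrated on the calm regime alone badly underestimates
# the risk that prevails by the end of the sample.
calm_vol = rolling_vol[:250].mean()
print(f"Calm-regime vol estimate: {calm_vol:.4f}")
print(f"Vol at end of sample:     {rolling_vol[-1]:.4f}")
```

Any statistic estimated once and frozen, as in the calm-regime number above, stops describing the data as soon as the regime changes; a robust strategy has to detect and adapt to that shift rather than assume stationarity.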
What we believe in, and what we have been doing for the last few years, is recreating the algorithm itself. For that, you need to have a theoretical understanding of the loss functions, how the algorithm works and the optimisers needed to produce results. And, actually, this approach allows us to do things differently.
Why is recreating algorithms necessary for a successful quantitative strategy?
Ten years from now, robots will be building robots. At some point humans will just step away, because the robots will be able to build better robots than humans can. When that happens, machines will start using the same set of algorithms and data, which will result in similar outcomes because machines learn from each other. That will lead to overcrowded algorithms, which will eliminate some of the alpha opportunities we see right now.
I’m not saying that that alpha is useless but when algorithms get crowded, investors want to find non-consensus alpha opportunities. That’s why we are attempting to create hundreds of thousands of alphas and use machine learning and AI to essentially pick out the alphas that work in each regime. We believe this is a better use of all the outputs. We don’t discard any alpha, rather we’re hoping that the algorithm can find the right tail for each particular regime.
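To make that selection step concrete, here is a hedged sketch on synthetic data; the regime labels, thresholds and signal counts are invented for illustration and are not the team's actual process. The idea is to generate many candidate signals, then keep only those whose information coefficient holds up inside a given regime:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days, n_alphas = 1000, 500
regimes = np.repeat([0, 1], n_days // 2)        # two toy regimes
asset_returns = rng.normal(0, 0.01, n_days)

# Candidate signals: mostly noise, plus a handful that are genuinely
# informative about returns in regime 1 only.
signals = rng.normal(0, 1, (n_days, n_alphas))
signals[regimes == 1, :10] += 50 * asset_returns[regimes == 1, None]

def in_regime_ic(signal, rets, mask):
    """Information coefficient (correlation) of a signal within one regime."""
    return np.corrcoef(signal[mask], rets[mask])[0, 1]

# Score every candidate inside regime 1 and keep those above a threshold.
mask = regimes == 1
ics = np.array([in_regime_ic(signals[:, j], asset_returns, mask)
                for j in range(n_alphas)])
selected = np.where(ics > 0.3)[0]
print(f"Selected {len(selected)} of {n_alphas} candidates for regime 1")
```

No candidate is discarded outright; each is re-scored per regime, so a signal that is noise in one regime can still be picked up in another, which mirrors the "find the right tail for each particular regime" idea described above.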
If quants are all using the same data sets, does this increase the risk of flash crashes or particularly violent market moves?
Definitely. If you look at how high frequency trading firms function, their algorithms are designed to shut off when there is no volume or if they have a certain threshold for volume in the market. And when that happens, prices can move up or down very violently in a very short period of time. That will happen every once in a while. Maybe once every 10 years.
Yet the vast majority of the time, these high frequency trading algorithms have helped to provide liquidity to the market. So instead of spending a basis point on buying a single share of an equity, you’re spending significantly less thanks to the liquidity that's being provided.
Are execution speed and trading costs important in systematic investing?
Trading costs are a very important consideration, regardless of whether we're talking about systematic or fundamental investing. A lot of these high frequency firms are trying to pick off any patterns they see from the opposite side of every trade. And the higher the turnover – which is essentially where you can produce your higher-Sharpe strategies – the higher the transaction costs.
What’s vital is the technology and infrastructure. Things like co-location, where you are putting servers close to some of these exchanges, become very important.
Having said that, every firm has limited resources and needs to decide in which baskets to put its eggs. Trading is extremely expensive, which is why many firms outsource some of their trading operations to algorithms or investment banks.
What advice would you give to future science, technology, engineering and mathematics (STEM) graduates considering a career in financial services?
Before I arrived at MIT, there were only two places I wanted to work – at the CIA as a data scientist developing cutting-edge technology to solve crime, or at NASA.
However, I learned that everything in finance is non-linear and that the problems finance needs to solve are actually very complex. Often, it's simply because the data is noisy and timing is imperative.
In my first attempts at investing on my own, I think I lost nearly everything. Yet the thing is, losing teaches you a lot about how to build better portfolios.
I learned to evaluate the key underlying components of asset allocation, how correlations can all go to one, and why different regimes are important. These experiences moulded me into the investor I am today.
When we build strategies, it is always with the intention of providing our clients with security in all markets, even in the event of another global financial crisis.
There’s no doubt that starting a career in financial services during the Global Financial Crisis has greatly influenced who I am today, how I have assembled my team and how I’ve contributed to building quantitative strategies at Loomis Sayles.