How can investment operations teams at insurance organisations incorporate new technologies to meet shifting portfolio needs and strategies? That was the central question in a conversation on integrating technology to improve investing processes between Justin T. Palmer, Head of Strategic Change Initiatives at Barings, a subsidiary of MassMutual, and Simon Zais, Senior Consultant at Capco.
In a recent Clear Path Analysis report, “Insurance Asset Management, North America 2022”, several senior industry players – including Palmer – discussed how adapting to a shifting investment environment has become the foremost challenge for investment teams at insurance companies, particularly when it comes to technology systems.
Palmer was asked whether it is more beneficial for firms to buy, rent, or build IT systems. “Everyone has been part of the discussion on upgrading technology and whether you do that in-house, buy it, or go to [someone external],” said Zais, asking Palmer if he saw a solution in today’s market.
Palmer said that the picture was muddy – adding that often the question around what is needed is viewed through the wrong lens. “Information and insights is our basic anchoring point,” said Palmer on the topic of whether to keep systems internal or outsource.
When asked what strategies could be helpful, he said that most organisations first ask themselves what issues they are trying to solve, what they are doing well, and what their core competencies are. “Every firm has a different answer: some say ALM is their core competency whilst others might say allocation strategies. We would say that origination and research are our core competencies. We know that we are not going to buy those types of products per se because we need absolute control over that piece of it,” he said, with the caveat that, often, more granular information is needed.
He added that the way that the investment industry looks at the idea of transformation is often counterproductive: “We are solving the information layer first – what is the output to the data model, application layer and process,” he explained, which he said meant that companies were hindering themselves.
Palmer added that he tends to take a different stance. “I take a more simplistic view because there are a tonne of asset classes out there but not a tonne of data models,” he said. “There are fewer data models around assets than there are asset classes and what trips everything up is how we want to go to market, how we want to organise our business, why do we need segregation within certain teams, verticals and how we want to do things.”
This situation is exacerbated when a company adds further input points, which ultimately makes efficient change trickier to achieve, he continued. “There might be different approaches to how you assess a high-yield bond versus an investment grade bond, but there are not a lot of data processing points or points of entry that are needed.”
One example of this, said Palmer, is the distinction between a broadly syndicated loan and a middle market loan. Broadly syndicated loans are floating rate loans made to corporate borrowers that generally have greater than $50 million in EBITDA, and they typically carry a first claim on the assets of the borrower, according to Churchill Asset Management. Middle market loans are often made by a small number of co-lenders in a quasi-club structure in which the lenders cooperate closely. “[These are] different ways of funding, bringing to market different valuation techniques – but it is still a loan,” Palmer said, adding that oversight was still required.
Palmer noted that when the complications posed by these loan structures are dissected down to the nitty-gritty of the data, stakeholders can better determine how many input data points are necessary, minimising what is actually required from an IT perspective.
Given these myriad factors, technological processes and personal preferences often become entangled at this point. “A lot of this is tied up in the personal preference of investment and operations teams, and how we have all been built up organisationally and have built up this overhead of databases and data stores, and you are trying to work within those confines.”
Palmer added that he would “challenge everyone that you are not going to get value by just working in your current data environment; you might have to tear down the whole thing and start over because, most likely, established firms that have been around for decades are trying to integrate [this] into the spaghetti that has been created over this time.”
As a result, Palmer said he believed that the value could often be “fairly limited”, as realising it requires achieving scale whilst increasing confidence in the data processes implemented by the team.
The negative consequences of this knotty issue mean heavier workloads in the short term, Palmer warned. “Most people would ask why we are doing this, and, well, we want to improve our teams’ experience, bring products to market faster, and have confidence in what we are seeing daily. If you are trying to build into the old environments, it is pretty tough to get the last piece, which is the confidence you need.”