Fund administration: Ask the tech experts

JON ARCHER
TouchstoneCRM
Jon Archer leads the private equity practice at TouchstoneCRM. He has over seven years’ experience advising, planning and leading implementations of digital solutions for specialist sectors.

FLEUR HICKS
OneFourZero
Fleur Hicks is managing director with OneFourZero, a leading digital diligence agency, and has spent more than 16 years working with private equity firms and blue-chip brands in the B2C and B2B sectors.

KEVIN KELLY
Altvia Solutions
Kevin Kelly is founder and chief executive officer of Altvia Solutions, which designs advanced web-based applications tailored specifically for institutional investors, alternative asset fund managers and impact investors.

JAMES WALETZKY
Crosslake Technologies
James Waletzky is a partner at Crosslake Technologies with more than 20 years of experience in the software industry. At Crosslake, James leads due diligence activities and helps technology teams boost business results.

JAMES WATTS
Augentius
James Watts is group information officer with Augentius, a fund administrator, where he directs the planning and implementation of enterprise systems to support business operations.

What are the major technological developments that GPs should be aware of?
James Watts: With cloud-based systems gaining prominence, some regulators have expressed concerns over their use. The SEC in the US, FCA in the UK and CSSF in Luxembourg all have views on cloud-based systems in relation to factors such as risk management and access to data – and these must be taken into account when considering outsourcing to a cloud provider. 

Additionally, LPs themselves are changing their modus operandi and demanding more and more data, in some instances including data on portfolio companies so that valuations can be reviewed. GPs need systems containing all the relevant information to support these requests. Demand for this type of supporting data will only increase in the future, so now is the time to adapt.

Finally, cybercrime cannot be ignored. GPs must properly protect their technology systems and procedures – and those of their portfolio companies. The use of investor portals, incorporating additional layers of security and protection, needs to become the norm for investor communications in order to mitigate the growing risk of cybercrime. GPs are under pressure – to deliver more for their management fee, to comply with increasing regulatory demands, and to consistently provide good returns. Historically, not all GPs have used technology to their best advantage, but the ways in which such developments can help meet the industry’s growing demands are becoming a more prominent item on fund managers’ agendas.

How much should we be spending on technology?
Fleur Hicks:
Typically, companies spend between 3 percent and 10 percent of turnover on technology, but it really depends on what your company does. Smaller companies often spend a larger portion of turnover on technology, but spend means nothing if it is not apportioned correctly. Budgets are split according to needs and normally cover infrastructure, security, hosting and innovation.
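As a rough illustration of how such a budget might be apportioned, consider the sketch below. The turnover figure, percentage and category weights are all hypothetical assumptions, not a prescription:

```python
# Illustrative only: a hypothetical firm with GBP 10m turnover budgeting
# 5 percent for technology (within the 3-10 percent range cited above),
# apportioned across the typical categories with assumed weights.
turnover = 10_000_000   # annual turnover in GBP (assumed)
tech_pct = 0.05         # assumed, within the cited 3-10 percent range

tech_budget = turnover * tech_pct

# Example apportionment across the categories named above (assumed weights)
allocation = {
    "infrastructure": 0.40,
    "security": 0.25,
    "hosting": 0.20,
    "innovation": 0.15,
}
budget_by_area = {area: tech_budget * weight for area, weight in allocation.items()}
```

The point of laying it out this way is that the split, not the headline number, is what the budget discussion should focus on.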

What are the main trends in the way LPs use technology?
James Watts:
There now needs to be a smooth and efficient transfer of data between general partners and limited partners. The world of the investor is changing and demands for transparency are on the rise, with the Institutional Limited Partners Association calling for more and more data to be collected from GPs and analysed by LPs.

The SEC has recently highlighted areas where GPs have made some errors, resulting in a greater focus on management fees and carry calculations. LPs are looking to cross check these factors to ensure they are in line with side letters. They are building waterfall calculations into their own technology systems to track GPs’ carried interest, and analysing other factors such as ESG compliance reporting. 
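To make concrete what "building waterfall calculations into their own technology systems" involves, here is a heavily simplified sketch of a European-style distribution waterfall. The tier structure, rates and the use of simple interest are illustrative assumptions; real calculations follow the fund's LPA and side letters:

```python
def simple_waterfall(distribution, capital, pref_rate, years, carry=0.20):
    """Heavily simplified European-style waterfall, for illustration only.

    Tiers: (1) return of capital to LPs, (2) preferred return at a simple
    rate, (3) 100% GP catch-up until the GP holds `carry` of profits
    above returned capital, (4) residual split (1 - carry)/carry.
    """
    remaining = distribution
    lp, gp = 0.0, 0.0

    # Tier 1: return of capital to LPs
    t = min(remaining, capital)
    lp += t
    remaining -= t

    # Tier 2: preferred return (simple interest, for illustration)
    pref = capital * pref_rate * years
    t = min(remaining, pref)
    lp += t
    remaining -= t

    # Tier 3: GP catch-up to `carry` of profits above returned capital
    catch_up = pref * carry / (1 - carry)
    t = min(remaining, catch_up)
    gp += t
    remaining -= t

    # Tier 4: residual split between LPs and GP
    lp += remaining * (1 - carry)
    gp += remaining * carry
    return round(lp, 2), round(gp, 2)
```

An LP running such a model against a GP's reported distributions can quickly flag a carry figure that does not reconcile with the agreed terms.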

LPs require this data in specific formats that can be automatically loaded into their own systems to facilitate complex data analysis. They need information in formats tailored to them – which aren’t necessarily the standard PDF files that require re-keying. The ways in which LPs process information are evolving, and consequently GPs will have to change to meet these demands.

How can I manage our data better so I spend less time in the ‘swivel chair’ switching between disparate systems?
Kevin Kelly: Capturing data from multiple sources into a central system is one of the primary challenges for private equity firms. At the time of investment, LPs are seeking visibility into revenue multiples and other indicators that support the fund manager’s investment thesis. They are increasingly aware of the impact of multiples paid on expected returns and want the data to further inform this thinking and to be able to benchmark managers. For portfolio companies, LPs want to understand the numbers that the manager is using for the value of a company at any given time since that rolls into the net asset value of the fund. 

This is the puzzle, yet the pieces are strewn across several disconnected systems, causing massive amounts of ‘swivel chair’ work to pull them together and gain insights from the data. There are countless examples of disconnected data, ranging from investor transparency to ESG.

Data management is the key. A central system that can capture, consolidate and integrate data from these disparate systems is the cornerstone. As the data get connected, private equity firms have the full spectrum of data points to build models and benchmarks, and ultimately to transform data into insights and better decision-making.
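A minimal sketch of that consolidation step, assuming hypothetical CRM, portfolio-monitoring and fund-accounting exports keyed on a shared company identifier (a real integration would also have to map IDs and reconcile formats):

```python
# Hypothetical per-company records exported from three disconnected systems.
crm = {"AcmeCo": {"deal_stage": "closed"}}
portfolio_monitoring = {"AcmeCo": {"revenue_multiple": 6.2}}
fund_accounting = {"AcmeCo": {"nav_contribution": 12_500_000}}

def consolidate(*sources):
    """Merge per-company records from each source into one central view."""
    combined = {}
    for source in sources:
        for company, fields in source.items():
            combined.setdefault(company, {}).update(fields)
    return combined

central_view = consolidate(crm, portfolio_monitoring, fund_accounting)
```

With one consolidated record per company, the multiples, valuations and NAV figures an LP asks about come from a single query instead of three systems and a swivel chair.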

How can technology help us focus our investment approach?
Jon Archer: This is something we help to address on a regular basis in various ways. Private equity firms are having to evaluate increasing numbers of investment opportunities and need due diligence processes in place, along with ways to track the ‘play book’ using business process flows and automated workflow. On top of this, there also need to be live, dynamic and intuitive analytics to quickly identify where increased focus should be given to target companies, depending on the stage they have reached in the deal process.

The other key area is in understanding origination and the value not just of companies but of specific individuals who may have moved between organisations. Identifying who they are, their value and then finding ways to engage with them can help to ensure that the investment approach isn’t just more focused but that there is an opportunity to enter a bid early at a lower cost.

We are being asked for ever more complex sets of data. How should we be looking to improve our data management?
Kevin Kelly: Fund managers are looking to the value creation that more structured data management offers to address the ever-increasing complexity of data needed to fulfil regulatory requirements and investor demands, and to provide the intel to compete for capital and deals. There’s usually a two-fold process: establishing best practice workflows and then identifying the right tools to support and drive these practices. 

Once a baseline for the business has been established, you can align the data management system to those processes. While there are a number of systems available, it’s important to consider the level of customisation, integration and scalability you need over the longer term so that the system can evolve as your business grows. Moving to a more structured data management system comes at a price, but the opportunity cost of failing to recognise and invest in the software is significantly higher than the cost of implementation. Those private equity firms that invest in the right data architecture will gain a major informational and competitive advantage.

Is there any software that can help hone our compliance strategies?
Jon Archer: Yes, absolutely. At the end of the day, compliance revolves around a set of rules that state certain criteria need to be met at certain stages. As long as these can be mapped and laid out, we can set up the processes and parameters using Microsoft Dynamics, which doesn’t just help ensure compliance is adhered to but also automates it, reducing the strain on back-office staff.

Once implemented, these processes can help general partners build up a compliance track record. This can then be examined and analysed further to understand where things have perhaps slipped through the net or could be optimised to enhance future operations.
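The idea of mapping compliance criteria to automated checks can be sketched in a few lines. The rule names and deal fields below are hypothetical; in practice such rules would be configured in a workflow platform of the kind described above rather than hand-coded:

```python
# A minimal sketch: compliance rules expressed as named checks over a
# deal record. Rule names and fields are hypothetical examples.
RULES = [
    ("kyc_complete", lambda d: d.get("kyc_complete") is True),
    ("aml_screened", lambda d: d.get("aml_screened") is True),
    ("investment_within_limit", lambda d: d.get("amount", 0) <= d.get("limit", 0)),
]

def run_compliance_checks(deal):
    """Return the names of any rules the deal record fails."""
    return [name for name, check in RULES if not check(deal)]

deal = {"kyc_complete": True, "aml_screened": False, "amount": 5, "limit": 10}
failures = run_compliance_checks(deal)
```

Logging each run of such checks over time is what builds the auditable compliance track record the answer describes.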

How do I ensure that our platforms are scalable?
James Waletzky: Building applications and platforms that scale is extremely important in today’s world of software-as-a-service. To scale, a product architecture should be stateless so as not to rely on any specific server to fulfil a request, and support asynchronous operations to minimise bottlenecks.

One general strategy to scale is to break the software architecture into smaller components so that each component can scale independently. The microservices strategy leverages this fundamental principle. If there is a bottleneck in the system around one of these components, it’s easier to scale up or out in a cost-effective way and easier to fix.

In addition, containerise the application, which allows the components to be deployed in seconds. This helps scalability and enables auto-scale – the creation and destruction of resources as needed – which most cloud-based infrastructures support.

Finally, to validate scale, you need to determine your load/capacity objectives, create automated load tests in production-like environments to validate that those objectives have been met, and use monitoring tools to surface problems.
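A minimal sketch of such an automated load check follows: fire concurrent requests at a target and assert latency and error objectives. Here `fake_request` stands in for a real call against a production-like environment, and the objectives are illustrative assumptions:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request():
    """Stand-in for a real HTTP request; returns a status code."""
    time.sleep(0.01)  # placeholder for real request latency
    return 200

def load_test(target, n_requests=50, concurrency=10, max_p95_s=0.5):
    """Run `target` concurrently and check p95 latency and error count
    against illustrative objectives."""
    def timed_call(_):
        start = time.perf_counter()
        status = target()
        return status, time.perf_counter() - start

    latencies, errors = [], 0
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for status, latency in pool.map(timed_call, range(n_requests)):
            latencies.append(latency)
            if status >= 500:
                errors += 1

    latencies.sort()
    p95 = latencies[int(len(latencies) * 0.95) - 1]
    return {"p95_s": p95, "errors": errors,
            "passed": p95 <= max_p95_s and errors == 0}

result = load_test(fake_request)
```

Wiring a check like this into the deployment pipeline turns "validate that objectives have been met" from a periodic exercise into a gate every release must pass.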

How can we make sure our system is cyber secure?
Fleur Hicks: Cyber compliance comprises two elements: action and readiness for action. It has never been enough to have checked your company’s cyber compliance once. Hackers are always one step ahead.

So ongoing checks and robust processes are essential to ensure that you have done everything you can to prevent the misuse of your customers’ confidential details. There are three core elements to cybersecurity and so all of them should be looked at to ensure a basic level of robustness:

Platform: Are your systems, infrastructure and technologies all compliant? This is perhaps considered the most ‘basic’ element of cybersecurity and can be easily actioned. The trick is to maintain a basic level of compliance and check the security of all platforms regularly. Any technology update may have a knock-on effect on other areas, so regular security checks are advised. Be aware that it is your responsibility to carry out due diligence on suppliers and third-party technologies.

People: Arguably the most neglected area of cybersecurity is ensuring that staff know what it means to be cyber safe. You need to consider everything from email access on personal mobile phones to password strength and duplication. An IT policy is something that is often signed by staff but perhaps not ‘read’, so we recommend that cybersecurity forms part of the onboarding process for all new staff and that regular cyber training sessions are held.

Processes: Processes for cyber compliance are essential and should resonate throughout companies, from HR to IT. It is important to reiterate that both assurance and response processes are necessary in order to pass even the most basic cyber compliance check.
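The password-strength point under ‘People’ above is easy to make concrete. The sketch below shows the kind of check an IT policy might mandate; the length threshold and character-class rules are illustrative assumptions, not a recommendation of any specific standard:

```python
import string

def password_issues(pw, min_length=12):
    """Return a list of policy problems with a password.
    Thresholds are illustrative, not an endorsed standard."""
    issues = []
    if len(pw) < min_length:
        issues.append("too short")
    if not any(c in string.ascii_uppercase for c in pw):
        issues.append("no uppercase letter")
    if not any(c in string.digits for c in pw):
        issues.append("no digit")
    if not any(c in string.punctuation for c in pw):
        issues.append("no symbol")
    return issues
```

Enforcing a check like this at account creation is one small way the ‘People’ and ‘Processes’ elements reinforce each other.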

What should I look for in a CTO?
James Waletzky: An effective chief technology officer is a combination of leader and technologist. The ability to think strategically to form the company’s technology vision with an eye to achieving business results is paramount, as is communication of the vision. The CTO ensures that every major technology decision is backed with an ROI, which is a difficult exercise for software engineers. The CTO then delivers on the vision. The most capable CTOs demonstrate the ability to lead and coach a cross-functional software engineering team, build technical roadmaps that support the product roadmap and guide teams through technical initiatives such as a move to the cloud. They need to be extremely good communicators. 

Cultural fit is important, as is delivery experience using modern technologies, development principles and proven best practices with an emphasis on product value, quality, time to market and team productivity. Finally, the ability to work with the C-suite to translate business concepts, such as financials and impact to customers, to concepts the engineering team can relate to (and vice versa) is key to getting results.