
The Data Management Challenge: What it Means for Investors and Asset Managers

Data is at the heart of the asset management industry. Multi-million-dollar decisions are made and deals are won based on the insights and analysis that data provides. After all, data is an enabler for smarter, more fact-based decisions. However, it can also be an anchor as firms struggle to cope with volume, ensure quality, implement sound governance and extract value from a sea of constantly changing information.


These overarching issues represent the “data management challenge,” and they affect the entire asset management sector, from hedge funds and private equity firms to fund administrators and family offices.

The Data Deluge

The days of relying primarily on pricing vendors, annual reports and other traditional sources of information to inform investment decisions are long gone. A new data paradigm has emerged.

Asset managers are increasingly incorporating financial and non-financial data flows from the internet, social media and a multitude of other sources into their investment decisions. With the information feeding these sources growing exponentially, it’s no surprise that firms are drowning in data.

Two other factors are also contributing to the data deluge – the search for alpha in today’s low interest-rate environment and the need to mitigate risk. As a result, asset managers are spreading investments across a wider range of assets, such as cryptocurrencies and other alternatives. They are also investing more heavily in private equity, where an opportunity might as easily arise in a local market as in a company on the other side of the world. Between new asset classes and the globalization of investments, managers must pull in more information, in more languages, from more international and domestic sources than ever before.

The proliferation of data can also be seen in the public sphere where disclosure and reporting requirements are driving regulated companies to provide additional information for greater transparency.

Searching for Quality

“Garbage in, garbage out” (GIGO) is as true today as it was when the phrase was first coined in the mid-1960s. Yet the sheer volume of structured and unstructured data from an ever-growing number of sources makes it increasingly difficult to ensure quality. Not only must data be accurate at the source, but firms must also implement proper controls so that data integrity does not erode as information moves through the operational flow, from download to aggregation and reporting.

Standardizing data sources would help to eliminate the manual manipulation needed to reformat or normalize the data for integration with other systems. And it would improve data quality – as nothing compromises quality more than manual processes.

Unfortunately, efforts to establish standards based on government or industry requirements have fallen short because the standards must cover too many jurisdictions, each with its own individual issues and constantly changing rules. Without a definitive standardized model, firms continue to rely on manual input and spreadsheets, which increase the chance of error and put the firm at risk of making decisions based on inaccurate or incomplete information.

While artificial intelligence (AI) is much hyped, the surest way to address data quality is with technology that supports standard protocols (e.g., FTP) and APIs for connectivity and integration with the widest range of banks, custodians and brokers. These solutions automatically collect and aggregate information, eliminating the errors that result from manual data handling. Once the data is in the system, more advanced technology and reporting can complement other solutions to extract value.
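As a rough illustration of what automated collection and aggregation can look like, the Python sketch below downloads a positions file over FTP and combines it with other feeds. The host, credentials, file name and column names are hypothetical placeholders, not a description of any particular custodian feed or of FundCount’s own integrations.

```python
import csv
import io
from ftplib import FTP


def fetch_positions(host: str, user: str, password: str, filename: str) -> list[dict]:
    """Download a custodian's daily positions file over FTP and parse it as CSV."""
    buffer = io.BytesIO()
    with FTP(host) as ftp:                      # hypothetical custodian FTP host
        ftp.login(user=user, passwd=password)
        ftp.retrbinary(f"RETR {filename}", buffer.write)
    text = buffer.getvalue().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))


def aggregate_by_security(feeds: list[list[dict]]) -> dict[str, float]:
    """Combine position rows from several sources into one total per security."""
    totals: dict[str, float] = {}
    for rows in feeds:
        for row in rows:
            security = row["security_id"]        # assumed column name
            quantity = float(row["quantity"])    # assumed column name
            totals[security] = totals.get(security, 0.0) + quantity
    return totals
```

Even a simple pipeline like this removes the re-keying and copy-paste steps where errors typically creep in; a production system would add validation, exception handling and reconciliation on top.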

The Expanding Role of Data Governance

Madoff and other high-profile scandals have brought into sharp relief the need for firms to establish data governance programs to protect their financial assets and reputational integrity. These programs typically include policies and procedures for data management that address consistency, reliability, traceability, security and accountability.

Shadow accounting is one example of how firms have responded to the need for better data management. Maintaining a separate set of books to validate (“shadow”) a third-party administrator’s accounting records is not mandated, but is now considered best practice and used by firms of all sizes and for numerous reasons.

For example, firms trying to raise funds in the public sphere use shadow accounting to build trust. In the private sector, firms rely on shadow accounting as a way to verify to investors that they are achieving the advertised returns. Shadow accounting can be useful even to smaller companies that may need to validate revenue in order to secure a bank loan.

Technology plays a pivotal role in data management. It can record historic data points, track valuations over time and provide an audit trail of inputs and activity to identify data lineage. This information is as important in areas of asset management where transparency, regulatory oversight and reporting are critical as it is in family offices, where “key-man risk” is an ongoing concern. Whether data is held in the cloud or on premises, the critical element is to have everything in one place where it can be accessed by approved staff or family members.
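To make the idea of an audit trail and data lineage concrete, here is a minimal Python sketch of an append-only valuation history. The class and field names are assumptions chosen for illustration, not the design of any specific system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class ValuationRecord:
    asset_id: str
    value: float
    as_of: datetime        # valuation date
    source: str            # e.g. "custodian feed" or "manual override"
    entered_by: str        # user responsible for the input
    entered_at: datetime   # when the record was written


class ValuationHistory:
    """Append-only history: corrections add new records, nothing is overwritten."""

    def __init__(self) -> None:
        self._records: list[ValuationRecord] = []

    def record(self, asset_id: str, value: float, as_of: datetime,
               source: str, entered_by: str) -> None:
        self._records.append(ValuationRecord(
            asset_id, value, as_of, source, entered_by,
            entered_at=datetime.now(timezone.utc)))

    def lineage(self, asset_id: str) -> list[ValuationRecord]:
        """Every input ever recorded for an asset, in order of entry."""
        return [r for r in self._records if r.asset_id == asset_id]
```

Because records are never overwritten, every valuation can be traced back to who entered it, when and from which source, which is the essence of data lineage.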

Extracting Value – the Final Frontier

Asset managers that use data from online news sites, social media and the internet to supplement traditional sources can gain additional details regarding the strategies and long-term potential of companies. Mining this information helps to better predict outcomes, inform decisions and mitigate risk. However, much of the data from non-traditional sources is unstructured, making it hard to integrate into an objective decision process and hard to use without the right tools.

True data success is not just the ability to manage large troves of information, but to derive business intelligence from the information collected. If leveraged properly, data can be transformational. Technology provides the key to unlock the value of data. It not only improves efficiency and mitigates risk, but also enables firms to better understand and use data for competitive advantage.

Technology solutions that automatically collect and aggregate data from disparate systems and process it quickly can produce accurate net asset values, account for cash flows between entities, calculate fees, compile risk statistics and deliver other financial metrics that feed investment decisions. Flexible, user-friendly reporting should be an integral part of any technology solution, giving asset managers ready, hands-on access to manipulate data as needed and quickly produce reports for stakeholders.
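As a simple sketch of the arithmetic involved, the Python example below computes a net asset value from aggregated positions and accrues a management fee on it. The position data, fee rate and day-count convention are illustrative assumptions, not the calculation methodology of any particular fund or product.

```python
def net_asset_value(positions: dict[str, tuple[float, float]],
                    cash: float, liabilities: float) -> float:
    """NAV = market value of positions + cash - liabilities."""
    market_value = sum(qty * price for qty, price in positions.values())
    return market_value + cash - liabilities


def management_fee(nav: float, annual_rate: float, days: int) -> float:
    """Management fee accrued on NAV, assuming an actual/365 day count."""
    return nav * annual_rate * days / 365


if __name__ == "__main__":
    # quantity and price per security (illustrative figures)
    positions = {"AAPL": (1_000, 190.0), "BOND-XYZ": (500, 98.5)}
    nav = net_asset_value(positions, cash=25_000.0, liabilities=10_000.0)
    fee = management_fee(nav, annual_rate=0.02, days=30)   # 2% p.a. over 30 days
    print(f"NAV: {nav:,.2f}  Fee accrual: {fee:,.2f}")
```

The calculation itself is simple; the hard part, as the rest of this article argues, is getting clean, reconciled inputs into it automatically.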

Adopting technology is a wise long-term move. It brings greater value to data, improves decision making and enables cost-conscious firms to integrate with the rest of the world more efficiently. Firms that are slower to adopt technology will have a harder time adapting to the operational model needed to grow investments or safeguard wealth for future generations.

The good news is that the first step onto the technology ladder is lower than you might think.

Contact us to learn how FundCount can help you tackle the data challenge.
