Providing data to multiple databases is a fact of life for institutional asset managers who want to compete for new business opportunities.
The most important thing a manager can do to ensure success is to avoid triggering red flags that will exclude them from consideration by consultants and asset owners. That means data inputs must be timely, accurate and consistent across multiple databases for every single reporting period.
Sounds simple, right?
That was our main takeaway from “The Journey from Data to Success,” the information-packed webinar hosted by Assette partner eVestment this fall.
It was full of great insights and tips to help investment firms avoid the dreaded red flag by better managing the data required by manager databases. You can view it in its entirety and download the presentation slides here.
One of the most powerful—and actionable—segments was “Database Best Practices.” In it, eVestment highlighted seven things managers can do to make the data input process more efficient and to ensure their firm’s data are seen by as many potential buyers and influencers as possible.
Here they are, along with some of our thoughts:
- Answer what is being asked
  - Know what each database wants and how they define their fields.
  - Pay special attention to AUM fields—some are vague and/or broad, others are very specific.
  - When in doubt, ask.
- Don’t be an outlier
  - The wrong data in a field is a red flag.
  - Never force a strategy into a bucket where it doesn’t belong—it’ll be flagged when performance patterns don’t align in peer comparisons.
- Be consistent
  - Within databases, as well as across databases, RFPs and other marketing materials. Consultants often use multiple databases to screen managers, and they also rely on RFPs, fact sheets and pitch books.
  - With qualitative inputs and narratives. This can be a moving target, especially when it comes to investment strategy narratives. Be sure you are using the same narratives in RFPs, pitch books and all manager databases.
  - Have timely update procedures in place for when a Subject Matter Expert develops a new qualitative answer.
- Dedicate resources to the function
  - Include database specialists and create a cross-functional team with clear accountability and management of content.
  - Leverage technology and specialist database update services to streamline the process and help internal staff be more efficient.
- Review data frequently (and carefully)
  - Review all qualitative and quantitative elements before submitting to each database or updating service.
  - Utilize a manageable review schedule of existing database language and data, e.g., do five strategies every quarter.
  - Pay attention to details, e.g., dates on your GIPS® Verification letter and changes to minimum fee/account size. Little things like staff titles, credentials and years of experience can quickly become out of sync across different materials and databases, especially when they appear in different formats.
  - Pay attention to whether databases have added or changed questions or fields.
- Provide data promptly
  - Submit data as soon as possible. That is, once your accounts are reconciled, submit data within days instead of weeks. The sooner your data appears in databases, the sooner you may appear in shadow searches.
  - Keep strict internal deadlines and adhere to them.
  - Help senior management understand what’s at stake and the opportunity cost of missing deadlines.
- Get buy-in from your experts
  - Involve internal Subject Matter Experts in the process, especially for ensuring the latest qualitative content is reflected in all databases.
  - Have multiple stakeholders review qualitative narratives.
  - Employ workflow tools to manage the review process and keep track of a “single version of the truth.”
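For teams that pull field-level exports from each database, the consistency and review checks above can even be partially automated. Here is a minimal sketch in Python—purely illustrative, with hypothetical database names and fields not tied to any specific vendor—that flags fields whose values have drifted out of sync across databases:

```python
# Hypothetical sketch: flag fields whose values differ across database exports.
# Database names, field names and values below are illustrative only.

def find_inconsistencies(exports):
    """exports: {database_name: {field: value}}.
    Returns {field: {database_name: value}} for every field whose
    values differ (or are missing) across the databases."""
    all_fields = set()
    for fields in exports.values():
        all_fields.update(fields)

    flagged = {}
    for field in sorted(all_fields):
        values = {db: fields.get(field) for db, fields in exports.items()}
        # More than one distinct value (including None for a missing
        # field) means this entry is out of sync and needs review.
        if len(set(values.values())) > 1:
            flagged[field] = values
    return flagged

exports = {
    "DatabaseA": {"strategy_aum": "1.2B", "pm_years_experience": 14},
    "DatabaseB": {"strategy_aum": "1.2B", "pm_years_experience": 12},
}
print(find_inconsistencies(exports))
```

A script like this won’t replace human review of qualitative narratives, but it can catch the “little things”—titles, years of experience, minimum account sizes—before a consultant does.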
Following these best practices will go a long way toward making sure your firm isn’t missing out on mandates from institutional clients. If you would like more detailed insights into how to maximize your firm’s exposure in manager databases, here’s a link to our library of related blog posts.