How Are You Managing the Data Explosion?
As institutional asset managers seek better ways to manage an influx of data, new tools and collaboration are helping uncover the insights that matter.
In New York City, newly planted trees are tagged with metal cards that dangle from their branches. Passersby are encouraged to scan each card’s QR code to learn about the tree’s care, age, size, species, and dollar value. Alongside these details is an interactive map displaying the roughly half a million trees the city has identified, measured, mapped, and tracked.
Budding arborists and data enthusiasts alike can explore New York’s urban forest or navigate to the broader NYC OpenData Dashboard to view an extensive body of information aimed at helping New Yorkers use and learn about the city’s data. The city regularly publishes more than 3,000 data sets containing over 4.5 billion rows of data. Through its long-term commitment to data-driven decision-making and open access, the NYC OpenData platform has created countless ways for government, citizens, and businesses to use data that’s available almost literally at their fingertips.
As segment and product leads at Arcesium, we’re not surprised that data management comes up in so many of our discussions – both as asset managers’ greatest challenge and their greatest opportunity. Effective systems and data practices are critical to helping firms use their data intelligently, reshape their decision-making, and keep pace with rising competition.
New Data, New Tools
As investment managers become increasingly data-driven, they’re turning to new sources, including web-scraped data, credit card transactions, geo-location foot traffic, and more, for a competitive edge. As a result, data has become the cornerstone of the operational insights firms use to assess customer needs and drive their business forward.
Investment management firms understand the need to quickly aggregate information and help their teams make more-informed investment decisions. Yet, many still struggle to incorporate data quickly enough to give users access to timely, relevant information. The influx of third-party data adds to the challenge.
Tools that enable teams to ingest and enrich large volumes of information, resolve data quality issues, and aggregate and organize the data are critical to gaining an edge.
New Roles to Wrangle the Data Explosion
Data comes in various shapes and forms and is typically not ready for consumption until it’s been ingested, run through quality checks, and normalized. Even the best data often isn’t linked to securities or readily comparable across sources. To get data to a usable state, it must be queried, combined with other data, and run through models that help teams compare and make sense of the information.
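As a rough illustration of those steps, here is a minimal Python sketch of an ingest, quality-check, normalize, and link pass, assuming pandas and entirely hypothetical file and column names (vendor_feed.csv, security_master.csv, ticker, value); it is not a description of any particular platform’s pipeline.

```python
import pandas as pd

# Hypothetical raw vendor file; column names are assumptions for illustration.
raw = pd.read_csv("vendor_feed.csv")

# Quality checks: drop rows missing key fields or carrying obviously bad values.
checked = raw.dropna(subset=["ticker", "observation_date", "value"])
checked = checked[checked["value"] >= 0]

# Normalization: consistent types and a canonical date column.
checked["observation_date"] = pd.to_datetime(checked["observation_date"])
checked["value"] = checked["value"].astype(float)

# Link to securities via a hypothetical internal reference table keyed by ticker.
security_master = pd.read_csv("security_master.csv")  # ticker -> internal security_id
linked = checked.merge(security_master[["ticker", "security_id"]], on="ticker", how="left")

# Surface rows that failed to link so they are reviewed rather than silently dropped.
unlinked = linked[linked["security_id"].isna()]
print(f"{len(unlinked)} rows could not be linked to a security")
```

From there, the linked dataset can be joined with other sources and fed into whatever comparison models a team relies on.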
Getting the right tools into users’ hands is essential to strong data management. Data teams are expensive, though, so any work they take on needs to be productive and have a long shelf life.
As chief data and technology officers develop roadmaps to manage the data explosion, they’re evaluating the tools and resources necessary to help teams better understand data and use it in their decision-making.
Data engineers and platform teams lay the foundation and develop the toolsets that enable data scientists to be productive. Seamless access, analysis, and collaboration help teams better manage data and ensure all members play an active role in turning data into actionable intelligence.
Collaborating to Make the Best Use of Data
Asset managers are consciously factoring data management into their roadmaps, distinguishing exploratory datasets and nice-to-haves from their core needs.
A sandbox environment can enable asset managers to explore and test datasets before bringing them fully onto their data platform. This helps ensure proper data linking via security and entity identifiers, and the testing environment also lets teams build up sufficient history for data analytics.
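By way of example, the kind of pre-onboarding checks a sandbox supports might look like the pandas sketch below; the parquet paths, the ISIN join key, and the column names are all assumptions rather than a prescribed workflow.

```python
import pandas as pd

# Hypothetical candidate dataset and security master loaded into a sandbox area.
candidate = pd.read_parquet("sandbox/candidate_dataset.parquet")
security_master = pd.read_parquet("sandbox/security_master.parquet")

# Check how well the dataset links to known securities via a shared identifier (assumed: ISIN).
linked = candidate.merge(security_master[["isin", "security_id"]], on="isin", how="left")
link_rate = linked["security_id"].notna().mean()

# Check how much usable history each linked security carries.
linked["observation_date"] = pd.to_datetime(linked["observation_date"])
history_years = linked.groupby("security_id")["observation_date"].agg(
    lambda dates: (dates.max() - dates.min()).days / 365.25
)

print(f"Link rate: {link_rate:.1%}")
print(f"Median history: {history_years.median():.1f} years")
```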
A no- or low-code environment is another useful tool for simplifying how users quickly view charts, time series, and deviations. Low- or no-code tools also help with collaboration, an increasingly important step in making the best use of data. With datasets spread across a firm’s various systems, collaboration helps teams understand what data is available. Working together to eliminate data silos, teams can ensure they don’t overlook valuable pieces of information.
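Under the hood, the deviation views such tools surface often reduce to something as simple as a rolling z-score. The sketch below assumes a plain CSV time series with date and value columns; it stands in for whatever a given low-code tool actually generates.

```python
import pandas as pd

# Hypothetical daily time series with "date" and "value" columns.
ts = (
    pd.read_csv("metric_timeseries.csv", parse_dates=["date"])
    .set_index("date")
    .sort_index()
)

# Rolling mean and standard deviation over a 30-day window.
rolling = ts["value"].rolling(window=30)
zscore = (ts["value"] - rolling.mean()) / rolling.std()

# Flag observations that deviate by more than three standard deviations.
deviations = ts[zscore.abs() > 3]
print(deviations)
```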
Modern data platforms enable cataloging, lineage tracking, advanced data curation and transformation, and multiple mechanisms for ingress and egress – all vital to managing the data explosion. Seamless integration with data warehousing platforms like Snowflake, as well as with data lakes, is also critical to aggregating and mining the growing universe of datasets.
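As one example of that kind of integration, pulling a curated table out of Snowflake into Python can be as short as the sketch below, which uses the snowflake-connector-python package; the credentials, warehouse, database, and ALT_DATA_SIGNALS table are placeholders, not references to any real environment.

```python
import snowflake.connector

# Connection details are placeholders; in practice they come from a secrets manager.
conn = snowflake.connector.connect(
    user="<user>",
    password="<password>",
    account="<account_identifier>",
    warehouse="ANALYTICS_WH",
    database="MARKET_DATA",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # Hypothetical curated table produced by an upstream ingestion pipeline.
    cur.execute(
        "SELECT security_id, observation_date, value "
        "FROM ALT_DATA_SIGNALS "
        "WHERE observation_date >= DATEADD(year, -1, CURRENT_DATE())"
    )
    df = cur.fetch_pandas_all()  # requires the connector's pandas/pyarrow extras
    print(df.head())
finally:
    conn.close()
```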
Data Growing on Trees
Front-office data cannot exist in silos, separate from middle- and back-office information such as trades, positions, performance, attribution, and risk measures. Combining front-, middle-, and back-office data is critical for unified analysis. Much like the data practically growing on trees in New York, available to analyze everyday life in the city, firms have the opportunity to glean information on almost every aspect of the investment lifecycle. Gathering, ingesting, and normalizing that data on agile systems will be invaluable as firms look for new and better ways to use it to make intelligent decisions.