Going the Distance to Build a Robust Data and Analytics Infrastructure
Using data to derive actionable insights and create positive impact is a game-changer for any business.
Common goals of a robust data analytics program include greater visibility into business units, improved reporting, better forecasting capabilities, and more effective data-driven decisions. While firms often dedicate considerable attention to data analytics, a Gartner study of 566 data and analytics leaders found that fewer than half (44%) of analytics teams provide effective value to their organizations.[1]
In a three-part blog series, our professionals have been examining firms’ data strategy challenges and considerations to evaluate at each stage of the institutional investment data value chain.
Sponsors of data initiatives tend to focus their attention on analytics outcomes. But, as we uncovered in the first two articles of the series, investments in analytics are only valuable when they’re paired with effective data sourcing and a robust data management infrastructure. Even firms that successfully reach the analytics phase – which is often considered the last mile of the value chain – have high-stakes considerations for their analytics programs.
In this final piece of our series, our team looks at how to structure the data analytics function, best practices for insight generation, and how to future-proof analytics program capabilities.
Structuring Your Data Analytics Function
The first step in building a successful data analytics program is defining a target operating model for how key functions involved will work together.
Three common models for where analytics and other key functions sit within an organization are center of excellence (CoE), distributed, and hybrid:
1. Center of Excellence (CoE): In this model, central resources are either assigned on a project basis or indirectly aligned to business unit-specific work on a rotational basis. The challenges of this setup are that the analytics team may not have exposure to the domain-specific knowledge of the business unit and often must manage competing project priorities. Firms can mitigate these risks by establishing a regular cadence of communication.
2. Distributed: Under this model, the analytics function is fully embedded in the business unit and typically calls for a generalist skillset. A challenge with this model is that technical tooling and scale are often not priorities for the business. With a central data platform function, it is easier for the analytics role to generate insights at scale; if no data platform function exists, the infrastructure function can provide basic tooling to the embedded analyst to mitigate this risk.
3. Hybrid: Most firms have landed on some version of the hybrid model as a best practice for organizational structure. In this model, data platform functions such as sourcing, data pipelines, operations, and governance typically sit in a central CoE, while analytics is embedded in the business. This model ultimately calls for a unified data platform to solve more complex data entitlement and access schemes. The platform can also mitigate the risk of slower end-to-end implementation caused by additional project handoff points.
Operating Model Framework
To set firms up for success, each data analytics program should draw on diverse skillsets across functional competencies. A common hiring mistake is to look for the unicorn candidate who can fill all the skill and functional requirements for the program.
Data analytics programs should be resourced with the expectation that roughly 80% of project time will be spent on data sourcing, governance, and data and pipeline management. By the time data reaches the analytics team for insight generation, it should be cleansed, structured in a point-in-time format, and backed by sufficient history to set the program up for success. Dedicated data platform support for the data analytics practice allows each team to focus on its core competencies, which creates scalability and higher-quality outcomes.
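To make "structured in a point-in-time format" concrete, here is a minimal sketch, assuming a simple pandas table with hypothetical columns (security_id, as_of_date, value). The idea is that a query returns only the values that were known as of a given date, so later revisions never leak into historical analysis.

```python
# Minimal point-in-time lookup sketch; column names and data are hypothetical.
import pandas as pd

def point_in_time_view(history: pd.DataFrame, as_of: str) -> pd.DataFrame:
    """Return the latest value known for each security as of the given date,
    ignoring any revisions recorded after that date."""
    known = history[history["as_of_date"] <= as_of]
    latest = known.sort_values("as_of_date").groupby("security_id").tail(1)
    return latest.reset_index(drop=True)

history = pd.DataFrame({
    "security_id": ["ABC", "ABC", "XYZ"],
    "as_of_date": ["2024-01-31", "2024-02-15", "2024-01-31"],
    "value": [1.02, 1.05, 0.97],  # the 2024-02-15 row is a later revision
})

# An analysis run as of 2024-02-01 sees only the values known at that time.
print(point_in_time_view(history, "2024-02-01"))
```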
Insight Generation Best Practices
A common reason data analytics programs fail is that firms misinterpret data, resulting in incorrect or low-value insights for stakeholders. The best way to avoid this pitfall is to embed subject-matter expertise into analytics generation. Team members with subject-matter expertise can better reconcile industry-specific context with the data in front of them. They are also more adept at distinguishing causation from correlation, extracting high-value features from the data, and building an effective story around it for stakeholder consumption.
Beyond subject-matter expertise, successful analytics programs use the largest possible representative sample size of quality data to generate persistent insights and avoid overfitting. Teams must align the breadth and depth of coverage of the input data with the target insights they’re producing. For example, using a US geographic sample for a global forecast wouldn’t work in a macroeconomic context. However, it would work for a company-specific forecast where global revenues are entirely US-driven.
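To illustrate the overfitting risk, here is a minimal sketch, assuming scikit-learn is available and using purely synthetic data: an unconstrained model can memorize a small sample, and cross-validation shows whether the fitted relationship actually persists out of sample.

```python
# Overfitting check sketch; the data is synthetic and purely illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X[:, 0] * 2.0 + rng.normal(scale=0.5, size=200)  # only one feature matters

# An unconstrained tree can memorize the sample; out-of-sample scores reveal
# whether the relationship it learned persists beyond the data it has seen.
for depth in (None, 3):
    model = DecisionTreeRegressor(max_depth=depth, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"max_depth={depth}: mean out-of-sample R^2 = {scores.mean():.2f}")
```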
There is also a distinction between the types of desired insights and the corresponding data, techniques, and skillsets required. Desired insights can be classified as look-back (monitoring what happened or is happening) versus forecasting, which can be produced ad hoc or at a persistent frequency.
Forecasting typically requires more advanced data science techniques. This approach calls for subject-matter generalist hires with strong technical skills who can learn domain-specific knowledge on the job. These individuals often provide a novel perspective on how problems have historically been solved in the financial industry.
If the business calls for more look-back or ad-hoc insights, domain-specific knowledge is paramount in an analytics team hire. These team members can be enabled with low-code tooling and generally have a more attainable path to technical upskilling to fulfill insight generation requirements. The proliferation of data analytics boot camps has also opened opportunities to upskill teams through company-sponsored learning and development programs.
A key consideration for the analytics function is how the team delivers insights to stakeholders. A delivery toolkit underpinned by strong UX/UI principles will ensure stakeholders receive the most value from insights the analytics team generates. Analytics teams with robust delivery toolkits design interfaces that are easy to navigate and purpose-built for business analyst and executive stakeholder bases. Easily accessible databases with purpose-built SaaS microservices can also support technical stakeholders across functions or business units.
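As a rough illustration of that microservice pattern, the sketch below assumes FastAPI and a hypothetical /forecasts endpoint; the payload and stubbed data store are illustrative, not a prescribed interface.

```python
# Minimal insights microservice sketch; endpoint and payload are hypothetical.
from fastapi import FastAPI

app = FastAPI(title="Insights Service")

# In practice this would query a governed analytics store; here it is a stub.
FORECASTS = {"ABC": {"horizon_days": 30, "expected_return": 0.012}}

@app.get("/forecasts/{security_id}")
def get_forecast(security_id: str) -> dict:
    """Serve the latest forecast for a security to downstream consumers."""
    return FORECASTS.get(security_id, {"error": "no forecast available"})
```

Run with an ASGI server such as uvicorn to expose the endpoint to other teams or business units.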
Future-Proofing Analytics Program Capabilities
Successful analytics programs need to be designed for continuous evolution to preserve an edge in insights generation:
- For consistently reliable insights, firms must have processes to dynamically add new data or remove degraded data. The ability to adjust the weightings of input data in their models based on confidence in its performance in each market environment is also key (see the sketch after this list).
- Guess-and-check principles are a highly regarded learning process in science but underused in business environments. Embedding guess-and-check principles and strong peer review frameworks will enable a truly adaptive analytics practice and help firms develop more innovative, competitive insights.
- Capacity for pure research and development is important for firms whose long-term objective is technological superiority over their peers. Studying and contributing to the latest methodologies and techniques outside of specific business requirements enables firms to quickly embed bleeding-edge technologies when an application presents itself. One note of caution: this may accelerate scale requirements for the data platform teams supporting research and development projects.
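As one way to picture the confidence-based weighting mentioned in the first bullet above, here is a minimal sketch that assumes recent out-of-sample error is the measure of confidence; the source names and error values are hypothetical.

```python
# Confidence-based weighting sketch; sources and error values are hypothetical.
def confidence_weights(recent_errors: dict[str, float]) -> dict[str, float]:
    """Weight each data source inversely to its recent error, so degraded
    sources are automatically down-weighted (or effectively dropped)."""
    inverse = {name: 1.0 / max(err, 1e-9) for name, err in recent_errors.items()}
    total = sum(inverse.values())
    return {name: w / total for name, w in inverse.items()}

# The alternative-data feed has degraded recently, so its weight shrinks.
errors = {"market_data": 0.8, "alt_data_feed": 2.5, "fundamentals": 1.1}
print(confidence_weights(errors))
```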
Generating Value from Your Data Initiative
The value chain for data initiatives is complex and full of interdependencies, which makes launching, executing, and realizing value from a data program difficult. These programs are high in complexity but also high in upside and impact, which is why financial institutions seeking an edge invest considerably in their data programs and tout them.
Each major component of the value chain – data platform, data sourcing, and data analytics – is like a separate stage of a marathon. Struggling in the last mile at the analytics stage is common.
Most firms aren’t running the marathon alone. Many choose to buy from or partner with a third party, in addition to building in-house. This approach can enable firms to optimize their target operating model, decrease time to value for newly launched or re-platforming initiatives, enable scale, mitigate risk, and maintain control over key operations.
If you’re looking for frameworks and best practices to identify where partners can be beneficial, stay tuned for our next piece. Don’t want to wait that long? Arcesium’s technology is a trusted solution for firms at any stage of the value chain. Reach out; we love talking about data.
[1] Gartner, “Gartner Survey Reveals Less Than Half of Data and Analytics Teams Effectively Provide Value to the Organization,” March 21, 2023.