10 Imperatives for Data & AI in 2026

  • Tiago Almeida, Prashanth Sekar, Chris Probert
  • 16 February 2026

The past 12 months have underlined the foundational role that data plays in successful generative AI and agentic AI programs across the financial services and energy industries. Modern data architectures, high-quality data and the democratization of that data also remain critical to wider transformation initiatives – elevating the insights that inform strategic decision-making, spotlighting vectors of risk and threat, and surfacing new opportunities for efficiency gains and cost savings.

With the temperature only set to rise around innovation and competition in the months ahead, we are pleased to share our 10 Data and AI imperatives for 2026.

1. ROI measurement in Data & AI and funding model shifts

The hype around AI has raised expectations for data-driven value. However, many organizations are still struggling to translate AI investment into sustained, measurable returns. 

Future-facing organizations are redefining their funding models, moving from traditional project-based budgeting to portfolio, value-stream and outcome-linked approaches that prioritize continuous value delivery over one-off initiatives. At the same time, the value of AI increasingly extends beyond cost and time savings. Scalability, adaptability and operational resilience are becoming critical components of how AI return on investment is assessed.

 

2. Acceleration of data foundations required for AI implementation


AI depends on strong data foundations, but the challenge for many organizations is that they are still building those foundations using operating models designed for traditional analytics. As a result, advanced models and generative capabilities are often deployed via fragmented architectures, inconsistent semantic definitions and pipelines built for reporting rather than continuous, AI-driven consumption – limiting their ability to scale beyond pilots.

At the same time, AI is beginning to reshape how data foundations are built and governed. AI-assisted metadata, semantic consistency and data contracts are becoming critical as AI is embedded in decision-making and regulated processes, where weaknesses in data architecture surface directly as model risk and operational friction.
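As an illustrative sketch of the data contract idea: a contract can be as simple as a machine-checkable schema that a pipeline enforces before publishing data downstream. The field names and types below are invented examples, not any specific firm's implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FieldSpec:
    """One field in a data contract: name, expected type, nullability."""
    name: str
    dtype: type
    nullable: bool = False

# Hypothetical contract for a customer dataset (illustrative names only).
CUSTOMER_CONTRACT = [
    FieldSpec("customer_id", str),
    FieldSpec("balance", float),
    FieldSpec("segment", str, nullable=True),
]

def validate(record: dict, contract: list[FieldSpec]) -> list[str]:
    """Return a list of contract violations for one record (empty = compliant)."""
    errors = []
    for spec in contract:
        value = record.get(spec.name)
        if value is None:
            if not spec.nullable:
                errors.append(f"{spec.name}: missing required field")
        elif not isinstance(value, spec.dtype):
            errors.append(f"{spec.name}: expected {spec.dtype.__name__}")
    return errors
```

Because the contract is code, it can be versioned, tested and enforced automatically in CI – the property that makes weaknesses in data architecture visible before they surface as model risk.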

 

3. Impact of Gen AI on data user behavior


GenAI is becoming the default interface for discovery, decision support and service delivery as users shift from navigating menus to asking questions and expecting complete, contextual answers. This raises expectations around immediacy, relevance and conversational engagement across digital channels.

With AI embedded in daily workflows, users are delegating more judgement to machines. This introduces risks, including an over-reliance on automated outputs, reduced verification and the growing use of unsanctioned GenAI tools. 

Trust and transparency are now as critical as model performance. To remain relevant, organizations must embrace AI-first journeys, where trust is engineered through explainability while the risks of unmanaged GenAI use must be actively contained.

 

4. Operationalization of Agentic AI


As we highlighted last year, Agentic AI implementations are moving into the financial services mainstream, shifting from research environments into the enterprise delivery lifecycle.1 

Firms eager to reduce human workload are looking to AgentOps as a strategic tool for automating repetitive tasks – such as anomaly detection, metadata enrichment and data quality remediation – with human-in-the-loop checkpoints. Bounded operational domains with clearly defined policies help ensure auditability, particularly in the financial services industry.
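A human-in-the-loop checkpoint over a bounded operational domain can be sketched in a few lines. The action names and confidence threshold below are illustrative assumptions, not a specific product's API: agent-proposed fixes inside the approved domain run automatically, while anything else is routed to a reviewer.

```python
# Bounded operational domain: the only actions the agent may apply unattended.
AUTO_APPROVE = {"trim_whitespace", "normalize_currency_code"}

def route_action(action: str, confidence: float, threshold: float = 0.9) -> str:
    """Decide whether an agent-proposed fix runs automatically or goes to a human.

    Only actions inside the bounded domain, proposed with high confidence,
    bypass review; everything else is queued as a human-in-the-loop checkpoint.
    """
    if action in AUTO_APPROVE and confidence >= threshold:
        return "auto_apply"
    return "human_review"
```

Logging every routing decision alongside the policy version is what makes the workflow auditable rather than merely automated.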
 
The operationalization of Agentic AI is a growing trend, as enterprises seek to balance gains in autonomy with explainability, compliance, and business objectives.

 

5. Unstructured data management


Unstructured data such as text, images, audio and video accounts for up to 80% of enterprise information, yet it has historically been difficult to manage effectively.2 Financial services firms have now begun to access this previously untapped reservoir of data, building integrated platforms that automatically classify, index and enrich unstructured information. 

Improved unstructured data management unlocks richer customer insights, compliance readiness, and operational intelligence – making this an immediate strategic priority. We expect organizations to layer in AI-powered governance to enforce retention, privacy and usage policies across unstructured stores, while data mesh principles should be extended to unstructured domains, enabling subject-matter experts to curate and expose trusted content. 

 

6. Data marketplaces


As organizations scale analytics and AI, data access – rather than algorithms – has become the primary constraint. Valuable datasets remain underutilized due to poor discoverability, unclear ownership and inconsistent governance, creating delivery bottlenecks and unnecessary duplication of effort across teams. 

However, organizations are now beginning to treat data as an asset rather than a by-product, and data marketplaces are a key lever at their disposal. Combining discoverability, trust signals, access controls and commercial mechanisms, data marketplaces help clarify ownership, improve transparency around quality and usage and provide a foundation for more controlled data sharing beyond the enterprise.

 

7. Knowledge architectures and semantic layers


Bringing previously siloed data sources into a unified model, knowledge graph architectures – in which data is stored as a network of concepts linked by semantic relationships – have been a key driver of innovation in the data landscape, with over 72% of Fortune 500 enterprises having already adopted graph-based technologies.3

Financial firms are now exploring synergies between knowledge graphs and GenAI technologies to deliver context-rich, knowledge-grounded solutions to business problems. The strategic direction points towards a fully integrated enterprise semantic layer, acting as the ‘business brain’ of the ecosystem to provide shared definitions, relationships and rules. 
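The graph-of-concepts idea above can be illustrated with a toy example: facts stored as (subject, relation, object) triples and traversed by relationship. The entities and relations here are invented for illustration, not a real ontology.

```python
# Invented example triples – a real knowledge graph would hold millions.
TRIPLES = [
    ("Retail Mortgage", "is_a", "Loan Product"),
    ("Loan Product", "governed_by", "Credit Policy"),
    ("Retail Mortgage", "reported_in", "Regulatory Return A"),
]

def related(subject: str, relation: str) -> list[str]:
    """Return all objects linked to `subject` via `relation`."""
    return [o for s, r, o in TRIPLES if s == subject and r == relation]

def governing_policies(subject: str) -> list[str]:
    """Follow an `is_a` edge, then `governed_by` – a simple two-hop query."""
    return [policy
            for parent in related(subject, "is_a")
            for policy in related(parent, "governed_by")]
```

Multi-hop queries of this kind are what let a GenAI application answer with inherited context – here, that a retail mortgage falls under credit policy because it is a loan product.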

 

8. Business literacy across data teams


The ability to apply a financial, strategic and operational understanding to everyday decision-making is a powerful competitive differentiator for data professionals and teams. Data analytics and AI tools are no longer enough – data teams are now expected to fluently speak the language of business outcomes, KPIs and value chains. With almost half of business leaders identifying business acumen as their most critical skills shortfall, bridging this gap is a particularly pressing concern.4

That task increasingly falls to data leadership teams through structured cross-training, product-oriented data roles and a tighter alignment between data artefacts and revenue and efficiency levers. Financial services firms that cultivate business fluency in their data teams will accelerate insight adoption, reduce misinterpretation and more effectively prioritize analytics investment tied to measurable business impact.

 

9. Generative BI & MI


Despite decades of investment in dashboards and reporting platforms, many organizations still struggle to convert data into timely, actionable insight. Business users remain constrained by complex interfaces, rigid metrics and slow turnaround times for analysis. 

Generative BI and MI (Business Intelligence and Market Intelligence) combine governed data with natural-language interfaces and automated narrative generation, enabling users to ask questions in plain language and receive answers with context, drivers and explanations.

However, the value of generative BI depends less on tooling and more on semantic foundations. As analytical power moves closer to non-technical users, consistent definitions, governed metrics and transparent lineage become essential to maintaining trust.
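One way to picture the governed semantic foundation is a central metrics registry that a generative BI layer resolves natural-language questions against, so every answer uses the same definition. The metric names and definitions below are hypothetical illustrations.

```python
from typing import Optional

# Hypothetical governed-metrics registry: one authoritative definition per
# metric, rather than ad-hoc SQL written per dashboard or per conversation.
METRICS = {
    "net_revenue": "SUM(gross_revenue) - SUM(refunds)",
    "active_customers": "COUNT(DISTINCT customer_id)",
}

def resolve_metric(question: str) -> Optional[str]:
    """Naive lookup: map a plain-language question to a governed definition.

    A production system would use an LLM plus a semantic layer; the point
    here is only that the definition comes from one governed source.
    """
    for name, definition in METRICS.items():
        if name.replace("_", " ") in question.lower():
            return definition
    return None
```

When the question falls outside the governed registry, the system returns nothing rather than inventing a metric – the behavior that preserves trust as analytical power moves to non-technical users.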

 

10. Policy as Code


As cloud, data and AI environments scale, traditional governance models are struggling to keep pace. Policies defined in documents and enforced through manual reviews are increasingly ineffective in operating models shaped by continuous deployment, dynamic infrastructure and autonomous systems.

Policy as Code (PaC) shifts governance into a programmable control layer, where policies are versioned, tested and enforced automatically across infrastructure, data platforms and AI pipelines. This enables consistent enforcement at machine speed while producing auditable evidence by design.
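As a minimal sketch of the idea: a policy becomes a plain, versionable function evaluated automatically against declarative resource configurations. Real deployments typically use a dedicated engine such as Open Policy Agent; the Python below simply stands in for the same pattern, and the resource fields are illustrative assumptions.

```python
def deny_public_buckets(resource: dict) -> list[str]:
    """Policy: storage buckets must not be publicly accessible."""
    violations = []
    if resource.get("type") == "storage_bucket" and resource.get("public"):
        violations.append(f"{resource['name']}: public access is prohibited")
    return violations

# The policy set is code: versioned, unit-tested and run in every pipeline.
POLICIES = [deny_public_buckets]

def evaluate(resources: list[dict]) -> list[str]:
    """Evaluate every policy against every resource; return all violations."""
    return [v for r in resources for p in POLICIES for v in p(r)]
```

Because each run returns an explicit list of violations, the pipeline produces auditable evidence by design – every deployment carries a record of what was checked and what failed.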

As AI adoption accelerates and environments become more complex, organizations are embedding governance directly into their technology stacks to maintain control while preserving delivery velocity.


References

1 https://www.capco.com/intelligence/capco-intelligence/10-imperatives-for-data-and-ai-in-2025

2 https://www.gartner.com/en/documents/6847534

3 https://www.industryresearch.biz/market-reports/knowledge-graph-market-111234

4 https://culturepartners.com/insights/business-acumen-essential-skills-for-modern-leadership-success/

 

Get in touch

To find out more about working with Capco and how we can help you overcome any potential challenges, contact our experts or subscribe for the latest insights below.