As this year comes to a close, many experts have begun to look ahead to next year. Here are several predictions for how companies will manage their data in 2026.

Sijie Guo, CEO of StreamNative
A fundamental shift is happening in how we think about data engineering. For decades, data engineers prepared data for human consumption – analysts, data scientists, and business users. In 2026, AI agents will emerge as primary data consumers, and this changes everything. “Context engineering” isn’t just a rebrand – it’s a recognition that agents have different requirements than humans: they need fresh, streaming context delivered in milliseconds, not batch updates delivered overnight. The best data infrastructure companies will embrace this evolution, using their deep expertise in streaming, storage, and processing to solve genuinely new problems around agent-facing analytics and real-time context delivery. While the underlying principles of good data engineering remain constant, the application layer is transforming.
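
To make the shift concrete, here is a minimal sketch (our illustration, not StreamNative's implementation) of an agent-facing context store fed from a stream rather than a nightly batch. It assumes a local Apache Pulsar broker and a hypothetical "customer-events" topic with hypothetical event fields:

```python
# Minimal sketch: keep per-customer context fresh from a stream, not a batch job.
# Assumes a local Apache Pulsar broker; topic and event fields are hypothetical.
import json
import pulsar

client = pulsar.Client("pulsar://localhost:6650")
consumer = client.subscribe("customer-events", subscription_name="agent-context")

# In-memory context keyed by customer id; a real system would use a
# low-latency store so agents can read the latest context in milliseconds.
context: dict[str, dict] = {}

try:
    while True:
        msg = consumer.receive()
        event = json.loads(msg.data())
        # Fold each event into the customer's context as it arrives,
        # instead of waiting for an overnight batch rebuild.
        context.setdefault(event["customer_id"], {}).update(event["attributes"])
        consumer.acknowledge(msg)
finally:
    client.close()
```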

Chris Child, VP of product for Data Engineering at Snowflake
In 2026, the metadata layer will emerge as the critical control plane for modern data architecture. As open table formats like Apache Iceberg™ gain widespread adoption, and open source catalogs continue to mature, the abstraction of metadata from storage and compute has become not just possible — but essential. The organizations leading in data are no longer those with the biggest lakehouses, but those who can unify governance, discovery, and access across fragmented data ecosystems. The metadata layer is now where trust, transparency, and agility are won or lost. It’s the battleground for data leadership, and open standards are the strategic advantage. In 2026, this architectural shift will be the key differentiator, separating the market leaders from those left behind.
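
As an illustration of metadata acting as the control plane, here is a minimal sketch using the open source PyIceberg client: the engine discovers and reads an Apache Iceberg table through a catalog rather than by pointing at storage paths. The catalog endpoint, warehouse location, and table name are hypothetical placeholders:

```python
# Minimal sketch: discovery and access go through catalog metadata,
# not through hard-coded file listings. Endpoints and names are hypothetical.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "analytics",
    type="rest",
    uri="https://catalog.example.com",    # hypothetical REST catalog endpoint
    warehouse="s3://example-warehouse/",  # hypothetical storage location
)

# The catalog resolves the table's metadata, schema, and snapshots.
table = catalog.load_table("sales.orders")
print(table.schema())

# Any engine that speaks the catalog protocol can scan the same governed table.
print(table.scan().to_arrow())
```

The design point is that the catalog, not the storage layer, becomes the place where governance and discovery are enforced.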

Alan Peacock, general manager of IBM Cloud
We’ll see governments and regulated industries in particular move to a strategic mix of on-prem and cloud solutions for their data – the days of a one-size-fits-all approach will soon be over, and hybrid will be key. Although these organizations face the same rising demand for advanced compute workloads as any other, they have had to balance this demand with increasing concerns about cost predictability, sovereignty and operational control, all while managing security and compliance requirements. And while risk management remains paramount — organizations still need full control over where data is stored and processed, and must maintain compliance with local data protection laws — regulated industries will start to take a workload-by-workload approach, deciding where to host data and applications. They can now choose what’s best for them, and they will.

Genevieve Broadhead, global lead of retail solutions at MongoDB
As 2026 approaches, we are still seeing notable differences between retailers who have modernised their technology and those still relying on legacy systems. As speed and the ability to quickly pivot and adapt to market trends become more important, retailers have realised that flexibility needs to be at the core of their design. The ability to release iteratively, without downtime or complex schema changes, will be key to keeping development teams shipping at the pace of the industry.
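
To illustrate the point, here is a minimal sketch (ours, not MongoDB's own guidance) of shipping a new product attribute with no downtime and no schema migration; the connection string, collection, and field names are hypothetical:

```python
# Minimal sketch: old and new document shapes coexist, so releases can be
# iterative. Connection string, database, and field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
products = client["retail"]["products"]

# Existing documents keep their current shape.
products.insert_one({"sku": "A-100", "name": "Trainer", "price": 59.0})

# A new release simply starts writing the extra field; old documents are untouched.
products.insert_one(
    {"sku": "A-200", "name": "Runner", "price": 79.0, "sustainability_rating": "B"}
)

# Readers tolerate both shapes, so old and new application versions can run side by side.
for doc in products.find({}, {"_id": 0}):
    print(doc["sku"], doc.get("sustainability_rating", "not yet rated"))
```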

Deepak Singh, chief innovation officer of Adeptia
Enterprises will realize that AI’s real leverage point isn’t the model—it’s the First-Mile Data flowing into it: the messy, inconsistent information arriving from customers, partners, brokers, and legacy systems. As this scattered data becomes the biggest obstacle to automation and AI accuracy, organizations will shift attention upstream. The priority will be normalizing and enriching incoming data before it hits AI workflows. And companies that get it right will see faster operations, more dependable AI outputs, and a dramatically smoother path to true AI-driven transformation.
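
Here is a minimal sketch of what that upstream, first-mile normalization can look like (an illustration, not Adeptia's product): partner records arrive in inconsistent shapes and are mapped to one canonical form before any AI workflow sees them. The inbound field names and the canonical schema are hypothetical:

```python
# Minimal sketch: normalize messy inbound records into one canonical schema
# before they reach AI workflows. Field names and schema are hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CanonicalOrder:
    order_id: str
    customer_email: str
    amount_usd: float
    placed_at: datetime

def normalize(raw: dict) -> CanonicalOrder:
    """Map a messy inbound record onto the canonical schema."""
    return CanonicalOrder(
        order_id=str(raw.get("order_id") or raw.get("OrderNo") or raw["id"]),
        customer_email=(raw.get("email") or raw.get("customer_email", "")).strip().lower(),
        amount_usd=float(str(raw.get("amount") or raw.get("total_usd") or 0).replace(",", "")),
        placed_at=datetime.fromisoformat(raw.get("placed_at") or raw["order_date"]),
    )

# Two partners, two shapes, one canonical record downstream.
print(normalize({"OrderNo": 991, "email": " Ada@Example.com ", "amount": "1,240.50",
                 "order_date": "2026-01-15"}))
print(normalize({"id": "B-17", "customer_email": "grace@example.com",
                 "total_usd": 88, "placed_at": "2026-01-16T09:30:00"}))
```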

Tyler Akidau, CTO of Redpanda
By the end of 2026, connectivity, governance, and context provisioning for AI agents will be built into every serious data platform. SQL and open protocols like MCP will sit side by side, allowing both humans and machines to query, act, and collaborate safely within the same governed data plane.
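
To show how SQL and MCP can sit side by side over one governed query path, here is a minimal sketch (not Redpanda's design) using the MCP Python SDK and a local SQLite database; the read-only policy and warehouse file are hypothetical stand-ins for real governance:

```python
# Minimal sketch: one enforcement point serves both human analysts (plain SQL)
# and AI agents (an MCP tool). Database file and policy are hypothetical.
import sqlite3
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("governed-data-plane")

def run_readonly_sql(sql: str) -> list[dict]:
    # Shared governance check, applied whether the caller is a person or an agent.
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("only read-only SELECT statements are allowed")
    conn = sqlite3.connect("warehouse.db")
    try:
        conn.row_factory = sqlite3.Row
        return [dict(row) for row in conn.execute(sql)]
    finally:
        conn.close()

@mcp.tool()
def query(sql: str) -> list[dict]:
    """Run a read-only SQL query on behalf of an agent."""
    return run_readonly_sql(sql)

if __name__ == "__main__":
    # Agents connect over MCP; an analyst can call run_readonly_sql directly.
    mcp.run()
```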

Lisa Owings, chief privacy officer at Zoom
Regulators expect AI to meet long-standing requirements around consumer protection, data governance, transparency, and data minimization. With the power of AI increasing exponentially, applying privacy requirements to the AI world is simple in concept but challenging in execution unless privacy is included by design. In 2026, we’ll see a shift toward greater alignment between regulators and companies that proactively embed privacy and accountability into their AI systems.
