Databricks & Snowflake Enter Agentic AI's Orbit

By: Craig Matsumoto


Data is the lifeblood of AI. In more practical terms, that means AI is forcing enterprises and vendors to rethink how data is handled.

That's why Databricks and Snowflake both had to announce new databases last week — both powered by acquisition rather than internal tweaking. It's an acknowledgement that AI requires a level of agility and just-in-time service that neither platform was designed to deliver.

It's all part of a new data pipeline that both companies are now fighting to dominate. It's also true that both companies, despite their size, risk obsolescence if they don't prepare for the coming onslaught of AI agents.

Modernizing the Database

The major data platforms use architectures that are at least a decade old, from the days when a database was a monolith, impressive for its size and grandeur. Agentic AI pictures a different world where agents sporadically grab data and even build their own databases.

So, Databricks used its Data + AI Summit 2025 to announce Lakebase, a new class of online transactional processing (OLTP) database. (Databricks also espoused the general concept of the lakebase, lower-case.) It's powered by Neon, the PostgreSQL (Postgres) database startup that Databricks is acquiring for $1 billion, a deal announced in May.

By stunning coincidence, rival Snowflake pounced with its own new database, announced the day before Databricks'. The Snowflake Postgres database is likewise based on an acquisition: Crunchy Data, a deal announced in early June during Snowflake's own confab, the Snowflake Summit. Multiple news sources reported the price as $250 million.

Lakebase Neon Crunch (Your New Favorite Cereal)

Databricks' Lakebase, currently in public preview, is a fully managed serverless database, where "serverless" is a nod to the expectation that AI agents will be building these databases themselves. It's not far-fetched. According to Neon, 80% of databases today are already created by agents.

Traditional databases aren't built for on-the-fly behavior. Their fixed size alone is an issue — too small means risking a capacity crunch, but too large means wasting resources. That's not even considering the time it takes to spin up a database.

Databricks is now criticizing those older database architectures, including its own, as monolithic and slow. Lakebase, by comparison, would let agents create databases nearly instantly. By being serverless, the platform auto-scales the database, committing resources as needed.
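As a rough sketch of what "committing resources as needed" can mean in practice, consider scale-to-zero compute: idle databases hold no resources, and capacity grows with query load. The class, thresholds, and per-unit throughput below are illustrative assumptions, not Lakebase's actual mechanism.

```python
# Hypothetical sketch of serverless scale-to-zero behavior; the class name,
# the 100 queries/sec-per-unit figure, and the caps are all illustrative.

class ServerlessCompute:
    """Tracks compute units for a database that scales with demand."""

    def __init__(self, min_units=0, max_units=8):
        self.min_units = min_units
        self.max_units = max_units
        self.units = min_units  # scale to zero when idle

    def handle_load(self, queries_per_sec):
        # Assume ~100 queries/sec per compute unit (illustrative figure).
        needed = -(-queries_per_sec // 100)  # ceiling division
        self.units = max(self.min_units, min(self.max_units, needed))
        return self.units

db = ServerlessCompute()
assert db.units == 0               # idle: no resources committed
assert db.handle_load(250) == 3    # a burst scales compute up
assert db.handle_load(0) == 0      # idle again: back to zero
```

The contrast with a fixed-size database is the point: there is no up-front sizing decision for an agent (or a human) to get wrong.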

Snowflake's announcement highlighted another way the world is changing: App developers, not just data scientists, need to control and manage data. Crunchy Data built its database with developers in mind. At the same time, while Postgres is a developers' darling, Snowflake sees room to build a platform more enterprise-ready than off-the-shelf Postgres, including robust security and compliance controls.

Snowflake Summit included some other nods to the agentic AI world but focused more on using agents, encouraging users to unlock more power from the Snowflake platform. (Snowflake Intelligence, the agentic experience, will be in public preview soon.)

80% Seems Low

Getting back to Neon's statement that 80% of databases are being created by agents: That figure seems likely to increase, given developers' visions of AI agents continually tapping many sources for information and spinning up software infrastructure on demand.

So, Databricks emphasized a commitment to openness when it comes to Lakebase. That includes working with a commonly used open-source foundation (Postgres itself). This is also where the Model Context Protocol (MCP) or a similarly open option would come in, to expose lakebase creation and operation capabilities to agents.
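To make that concrete, exposing database creation to agents typically means publishing it as a tool with a JSON schema the agent can call. The tool name, schema, and dispatcher below are a hypothetical MCP-style sketch, not Databricks' or Snowflake's actual interface.

```python
# Illustrative MCP-style tool definition for provisioning a database.
# All names and fields here are hypothetical, for the sake of the sketch.

CREATE_DATABASE_TOOL = {
    "name": "create_database",  # hypothetical tool name
    "description": "Provision a new Postgres database for the calling agent.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "region": {"type": "string"},
        },
        "required": ["name"],
    },
}

def handle_tool_call(tool_name, arguments, registry):
    """Dispatch an agent's tool call to a handler (stand-in for a real MCP server)."""
    if tool_name not in registry:
        return {"isError": True, "content": f"unknown tool: {tool_name}"}
    return {"isError": False, "content": registry[tool_name](**arguments)}

# Stub handler standing in for the platform's provisioning backend.
registry = {
    "create_database": lambda name, region="us-east-1": f"created {name} in {region}"
}

result = handle_tool_call("create_database", {"name": "orders_db"}, registry)
assert result["content"] == "created orders_db in us-east-1"
```

The open part is the contract, not the backend: any agent that speaks the protocol can discover the tool's schema and provision a database without caring whose platform sits behind it.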

Snowflake has started down those paths as well; it's implied in the company's embrace of Postgres. Those angles weren't the headliners at Snowflake Summit but will doubtless be points of discussion now. That's part of the rivalry, yes, but it's also obvious that both companies need to revamp the data experience before AI nudges it beyond their zone of control.