Analyst(s): Nick Patience
Publication Date: June 5, 2025
What is Covered in this Article:
- Snowflake’s new AI-powered analytics features: Cortex AISQL and SnowConvert AI.
- Introduction of Snowflake Intelligence and Data Science Agent for enhanced AI/ML capabilities.
- Launch of Snowflake OpenFlow for improved data interoperability and movement.
- Advancements in compute performance with Standard Warehouse – Generation 2 and Adaptive Compute.
- Enhancements in AI-driven data governance with Snowflake Horizon Catalog updates.
The Event – Major Themes & Vendor Moves: Snowflake Summit 2025, held in San Francisco this week, attracted about 20,000 attendees and showcased the company’s deepened commitment to empowering enterprises with accessible and powerful AI capabilities built upon a unified data foundation. The event highlighted a significant shift towards making AI and machine learning (ML) more integrated, efficient, and actionable for a wide range of users, from business analysts to data scientists. Key themes included simplifying AI adoption, the critical role of data interoperability (especially for unstructured and on-premises data), substantial gains in compute efficiency as the platform moves towards an infrastructure-less experience, the embedding of AI into data governance, and richer tooling for developers.
Major announcements centered around expanding the Snowflake AI Data Cloud’s capabilities:
Cortex AI Enhancements: Snowflake introduced Cortex AISQL, embedding generative AI directly into queries to analyze diverse data types, including unstructured data, and build flexible pipelines with SQL. This represents the next step in bringing AI and SQL together across multimodal data. SnowConvert AI, an evolution of SnowConvert, was launched as an automation solution focused on end-to-end pipeline conversion to accelerate migrations from legacy platforms, automating code conversion, testing, and data validation. It reportedly makes these phases 2-3 times faster, with Snowflake’s own professional services team being a key customer for medium to large migrations. Cortex AI now offers a single API (via REST or SQL) to all frontier models, with new additions such as OpenAI models available to run within Snowflake’s security boundary. A new Cortex Knowledge Extension allows information providers to create a Cortex Search service over their content without copying or exposing the entire information corpus, enabling a “share without copying” model paid via Snowflake. Cortex Search itself is a key differentiator, providing robust retrieval with lexical and vector search, underpinning capabilities like Cortex Analyst for text-to-SQL. Cortex Agents provide the API to bring these components together.
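To make the “AI meets SQL” idea concrete, a query along the following lines shows how existing Cortex SQL functions already blend model inference into ordinary queries; AISQL extends this pattern to more data types. The table and column names here are hypothetical, and the model name is one of the Cortex-hosted options, so treat this as an illustrative sketch rather than Snowflake’s reference example:

```sql
-- Illustrative sketch: support_tickets and its columns are hypothetical.
-- SNOWFLAKE.CORTEX.SENTIMENT and SNOWFLAKE.CORTEX.COMPLETE are existing
-- Cortex LLM functions; Cortex AISQL builds on this style of in-query AI.
SELECT
    ticket_id,
    SNOWFLAKE.CORTEX.SENTIMENT(ticket_text) AS sentiment_score,
    SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large2',
        CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
    ) AS summary
FROM support_tickets
WHERE created_at >= DATEADD(day, -7, CURRENT_TIMESTAMP());
```

The appeal for business analysts is that the model call sits in the SELECT list like any scalar function, so familiar SQL tooling, warehouses, and governance controls apply unchanged.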
Snowflake Intelligence and Data Science Agent: To bridge the gap between data and business action, Snowflake unveiled Snowflake Intelligence, a unified conversational experience allowing natural language querying of both structured and unstructured data. This is positioned as a unique platform for bringing agents to business users, natively handling both data types. Complementing this, the new Data Science Agent aims to boost productivity for data scientists by automating routine ML model development tasks, focusing on the data preparation and training elements of the data science process.
Snowflake OpenFlow: Addressing data interoperability, OpenFlow, born from the DataVolo acquisition (closed in December), was announced for General Availability. This new multimodal data ingestion service, powered by Apache NiFi, aims to simplify real-time data movement from virtually any source, including on-premises data centers and other clouds, into Snowflake. It offers a deployable data plane with more than 200 connectors that can run in any Kubernetes cluster, enabling better integration across different architectures and facilitating AI-powered innovations. This can operate via private link, ensuring data never touches the public internet. OpenFlow competes with parts of tools like Fivetran but targets more advanced use cases beyond simple SaaS ingestion, at least according to Snowflake. This, along with partner S3-compatible APIs (e.g., from Pure Storage and Dell), provides two key methods for accessing on-premises data.
Compute Innovations: Snowflake unveiled Standard Warehouse – Generation 2 (Gen2), promising 2.1x faster analytics performance. Additionally, Snowflake Adaptive Compute (also referred to as Adaptive Warehouse) was introduced as a service to automatically manage resources, maximizing efficiency and performance. This marks a significant step towards users not needing to manage infrastructure directly, though adoption will take time, given the large existing customer base accustomed to virtual warehouses.
AI-Driven Data Governance: Updates to the Snowflake Horizon Catalog include Copilot for Horizon Catalog and AI-led monitoring, signifying a move towards more intelligent and automated data governance. A model catalog is also part of Snowflake’s AI strategy.
Developer Experience and Lakehouse Capabilities: Snowflake introduced Workspaces, a new UI for managing all Snowflake files and objects, offering a full IDE experience backed by Git and catering to everyone from analysts writing simple SQL queries to advanced developers. Support for dbt projects is integrated. The company reiterated its full embrace of Apache Iceberg for open lakehouse strategies, seeing it as a huge opportunity to query more data without ingestion and take on more data engineering workloads. While revenue from Iceberg workloads (primarily analytics) was not disclosed, it was implied to be significant. A new Data Projects object type was introduced to manage collections of Snowflake objects. In a pricing update, Snowpipe will now be priced per GB ingested.
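As a sketch of what the Iceberg embrace looks like in practice, Snowflake’s SQL surface already supports Iceberg tables whose data lives in open format on customer-controlled object storage. The volume, location, and table names below are placeholders, and this shows the Snowflake-managed-catalog variant; externally managed Iceberg catalogs are the path for querying existing lakehouse data without ingestion:

```sql
-- Sketch with placeholder names: a Snowflake-managed Iceberg table whose
-- data files are written in open Parquet/Iceberg format to an external
-- volume (customer-owned object storage), readable by other engines.
CREATE ICEBERG TABLE sales_events (
    event_id  NUMBER,
    event_ts  TIMESTAMP_NTZ,
    payload   VARCHAR
)
CATALOG = 'SNOWFLAKE'
EXTERNAL_VOLUME = 'my_external_volume'
BASE_LOCATION = 'sales_events/';
```

The design point is that the table behaves like any Snowflake table for queries and governance, while the underlying files remain in an open format the customer owns.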
Native Application Enhancements: Native Apps can now include semantic models and can also be leveraged as tools by Cortex AI.
Crunchy Data acquisition: On Day 1 of Summit, Snowflake announced the acquisition of Crunchy Data, an enterprise-grade PostgreSQL cloud technology provider, for approximately $250 million. This move was driven by a significant rise in customer requests for Postgres over the preceding nine months. See our separate report for a full analysis of what that acquisition means.
Snowflake Summit ’25: Accelerating AI with Unified Data & Compute
Analyst Take: Snowflake’s announcements at Summit 2025 signal a clear strategic direction: to become the central nervous system for enterprise AI, making its platform an “AI-powered data platform” and a “data-powered AI platform.” By deeply integrating AI capabilities across its stack – from data ingestion (OpenFlow, SnowConvert AI) and compute (Adaptive Compute) to analytics (Cortex AISQL, Snowflake Intelligence) and application development (Native Apps with semantic models, Data Science Agent) – Snowflake is addressing key friction points in AI adoption and aiming to democratize AI for every team. The core tenets are to make AI easy, efficient, and trusted.
The company’s AI strategy has two main prongs: enabling AI where customer data already resides, and empowering every team within an organization to leverage AI. A key theme, reminiscent of Snowflake’s early days unifying structured and semi-structured data, is now the unification of structured and unstructured data (UD) under one roof, enabling the querying of UD directly using SQL and natural language. This is crucial, as customers consistently raise UD and security as primary concerns when approaching AI. The emphasis on strong retrieval capabilities within Cortex Search (lexical and vector) for both data types is a foundational differentiator, as is the native support for capabilities like charting without relying on LLMs to build them.
The introduction of Gen2 Warehouses and Adaptive Compute directly tackles performance and cost concerns often associated with large-scale data analytics and AI workloads. Adaptive Compute, in particular, represents a historical shift for Snowflake, aiming for an infrastructure-less experience. While this is the future, the company acknowledges a transition period for its 11,000+ customers accustomed to managing virtual warehouses. The consumption-based business model remains central, even influencing partner compensation models to align with customer usage.
Data interoperability remains a critical focus. OpenFlow and expanded Iceberg support are crucial in a world of diverse data sources and architectures, including on-premises systems (which, for Snowflake, means parts of its product running in a customer’s VPC, not a full on-prem deployment). Snowflake’s stance on Delta Lake is pragmatic: customers own their data, and Snowflake will work with their choices, viewing potential convergence between Delta and Iceberg as positive.
The acquisition of Crunchy Data, spurred by rising demand for Postgres, and comments about not doing “a good enough job” with alliances like SAP and Palantir, signal an aggressive stance on expanding its ecosystem and capabilities, even if it means competing more directly with erstwhile partners or making bold acquisitions. In terms of go-to-market, Snowflake plans to continue to invest in direct sales but also talked of a greater need for lift from channel partners (SIs, hyperscalers, and resellers). It’s about selling business outcomes, rather than products.
However, the success of these new offerings will depend on execution and customer adoption, particularly for the newer AI tools and the more abstract Adaptive Compute concept. While the vision is compelling, organizations will need to see tangible benefits in ease of use, performance, and cost-effectiveness before fully embracing these advanced capabilities, and Snowflake will need to demonstrate clear ROI in a competitive AI platform market. The move towards AI-driven data governance with Horizon Catalog is a necessary step, as the power of AI must be coupled with robust security and compliance. Initial traction appears promising, with more than 5,200 customers reportedly using its AI products weekly.
What to Watch:
- Adoption Rate of New AI Tools & Adaptive Compute: How quickly and broadly will customers adopt Cortex AISQL, Snowflake Intelligence, the Data Science Agent, and particularly the new Adaptive Compute model? Real-world use cases, success stories, and TCO comparisons will be key.
- Impact of OpenFlow and Iceberg: Monitor the uptake of OpenFlow and its effectiveness in simplifying data ingestion from diverse sources, including its Bring Your Own Cloud model and on-premises environments. Track the growth of Iceberg workloads and Snowflake’s ability to capture more data engineering tasks.
- Crunchy Data Integration & Postgres Strategy: How will Crunchy Data be integrated, and what will be the impact on Snowflake’s appeal to developers and applications requiring transactional capabilities alongside analytics?
- Partner Ecosystem Evolution: We’ll be watching for changes in relationships with SIs, hyperscalers (especially regarding sovereign cloud offerings in Europe), and strategic alliances such as SAP and Palantir.
- Competitive Landscape & GTM Execution: Competitors, notably Databricks (which holds its conference in the same venue next week) but also the cloud hyperscalers, will be watching Snowflake’s increasingly integrated AI capabilities, compute efficiency, and GTM strategy focused on business outcomes and industry verticalization.
You can read more about the announcements made at Snowflake Summit on the company’s website.
Disclosure: Futurum is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of Futurum as a whole.
Other insights from Futurum:
Is Snowflake’s Crunchy Data Acquisition a Game-Changer in the AI Data Platform Race?
Previous Analysis: Snowflake Q4 FY2025 Earnings Report
Image Source: Nick Patience
Author Information
Nick is VP and Practice Lead for AI at The Futurum Group. Nick is a thought leader on the development, deployment and adoption of AI - an area he has been researching for 25 years. Prior to Futurum, Nick was a Managing Analyst with S&P Global Market Intelligence, with responsibility for 451 Research’s coverage of Data, AI, Analytics, Information Security and Risk. Nick became part of S&P Global through its 2019 acquisition of 451 Research, a pioneering analyst firm Nick co-founded in 1999. He is a sought-after speaker and advisor, known for his expertise in the drivers of AI adoption, industry use cases, and the infrastructure behind its development and deployment. Nick also spent three years as a product marketing lead at Recommind (now part of OpenText), a machine learning-driven eDiscovery software company. Nick is based in London.