Snowflake expands Cortex Code CLI with dbt, Airflow
Snowflake has expanded Cortex Code CLI, an AI coding agent that runs in local development environments, by adding support for dbt and Apache Airflow workflows. It also introduced a subscription option that does not depend on Snowflake usage.
Originally focused on Snowflake-native workflows, the product now reflects how data engineering teams build pipelines that span multiple platforms and tools.
Broader tool support
Cortex Code CLI now integrates with dbt, a widely used framework for data transformation, and Apache Airflow, an orchestration tool for scheduling and managing workflow execution. Snowflake called the additions a first step toward supporting data work across systems, regardless of where the underlying data sits.
Snowflake positioned the expansion as a response to operational friction in multi-environment pipelines. Broken workflows can trigger rework across teams and reduce confidence in downstream analytics. The AI agent is designed to help with model development, debugging, and optimisation across the supported tools.
"Developers don't operate in a single system, and AI coding assistants shouldn't either," said Christian Kleinerman, EVP of Product at Snowflake.
He said the product is being built around cross-system context and designed to fit into developers' existing practices and workflows.
Subscription shift
Snowflake also introduced what it called its first standalone subscription model. Because the plan runs independently of Snowflake compute and consumption, teams that do not run workloads on Snowflake can still buy Cortex Code CLI as a monthly subscription.
The change aims to address a common procurement barrier for developer tools. Data teams often evaluate assistants in their current environments before standardising on a platform-wide approach. A standalone subscription offers a clearer way to trial the tool without changing the data stack.
Snowflake described the model as self-service and aimed at developer teams that want to start using the CLI immediately.
Models and controls
Alongside expanded tool support and new packaging, Snowflake highlighted broader model choice within Cortex Code CLI. Customers can choose from models including Claude Opus 4.6 and OpenAI GPT-5.2, depending on requirements such as quality, latency, and cost.
The release also adds administrative controls and governance features, including tools to manage access, usage, and policy enforcement across teams. Governance has become a central concern as organisations roll out AI assistants in software development and data engineering, particularly where code may touch regulated datasets or production systems.
Customer use cases
Braze, a customer engagement platform, described Cortex Code as part of a broader approach to agent-based analytics and engineering workflows.
"Cortex Code is transforming how we approach agentic analytics at Braze," said Spencer Burke, SVP of Growth at Braze.
Burke said the tool's understanding of datasets, schemas, and columns changed how engineers worked with context and outputs, and that Braze was deploying it for more complex data integrations.
A consultancy user also pointed to productivity gains in development and engineering work.
"Cortex Code has transformed solution development at evolv Consulting by providing a direct, context-aware connection to the Snowflake ecosystem, allowing our team to interact seamlessly with databases, objects, and git repositories," said Trent Foley, Chief Technology Officer at evolv Consulting. "By leveraging the most advanced models available, such as Opus 4.6, Cortex Code can essentially 'do anything' through its CLI version, from full-featured React app development to complex data engineering tasks. This translated into over 500 hours in time savings, roughly $100,000 in value, in just the first 20 days of adoption."
Adoption signals
Snowflake said Cortex Code launched in November 2025 and has recorded more than 4,400 new users. It presented the figure as evidence of demand for developer-focused tools that reduce repetitive work and speed up changes across multi-system data pipelines.
Adding dbt and Apache Airflow also brings the product closer to the day-to-day tooling of data engineering teams. dbt projects often sit at the centre of transformation logic, while Airflow remains widely used for scheduling and coordinating workflows across warehouses, lakes, and operational systems.
Snowflake said it plans further expansions as it works toward broader coverage of data sources and systems. Kleinerman described the goal as building an agent that "understands and works across" the wider data ecosystem.