Most candidates assume SnowPro Advanced is the one that gets you hired. It isn't. Hiring managers skim for SnowPro Core and move on, because Core is the version that proves you've actually touched the platform. Advanced is a flex for people already working with Snowflake in production. If you're using the cert to break into a DE role, Core is the right line item and Advanced is mostly vanity. The market data is clear on this even if the Snowflake marketing page isn't.
At a glance: Core is 100 questions for $175, Advanced DE is $375, and both require 750/1000 to pass. Source: DataDriven analysis of 1,042 verified data engineering interview rounds.
People call Core "entry-level" and wildly underprepare. It covers architecture decisions most engineers never touch in the UI: virtual warehouse sizing, multi-cluster scaling policies, and the exact behavior of result caching across session resets. The pass rate is lower than the entry-level label suggests because the wrong answers on Core are plausible-sounding defaults. You fail by trusting intuition, not by missing obscure trivia.
| Detail | Info |
|---|---|
| Exam Code | COF-C02 |
| Question Count | 100 questions |
| Time Limit | 115 minutes |
| Passing Score | 750 / 1000 |
| Cost | $175 USD |
| Format | Multiple choice, multiple select (proctored online or test center) |
| Renewal | Every 2 years |
Snowflake publishes the domain weights in their exam guide. Here is where your study time should go, ordered by weight.
**Architecture and caching.** Virtual warehouses, storage layer, cloud services layer, micro-partitions, metadata, caching (result cache, local disk cache, remote disk cache). Know the three-layer architecture cold. Questions ask what happens when you resize a warehouse, how micro-partitions affect pruning, and when each cache layer is used.
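For example, resizing a warehouse and toggling the result cache look like this (the warehouse name `my_wh` is a placeholder):

```sql
-- Resizing takes effect for new queries; statements already running
-- finish on the old size.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';

-- The result cache lives in the cloud services layer, so it survives
-- warehouse suspension. Disable it per session when you need to
-- benchmark real compute:
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
```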
**Account access and security.** Role-based access control (RBAC), DAC, network policies, MFA, key-pair authentication, SCIM provisioning. Understand the role hierarchy: ACCOUNTADMIN, SECURITYADMIN, SYSADMIN, and custom roles. Know which role owns what and how privileges flow down.
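A minimal role-hierarchy sketch (the database, schema, and user names are placeholders):

```sql
-- Custom roles should roll up to SYSADMIN so admins can manage
-- everything created under them.
CREATE ROLE analyst;
GRANT ROLE analyst TO ROLE sysadmin;

-- Privileges flow through USAGE on the container objects first:
GRANT USAGE ON DATABASE prod TO ROLE analyst;
GRANT USAGE ON SCHEMA prod.reporting TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA prod.reporting TO ROLE analyst;

GRANT ROLE analyst TO USER jsmith;
```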
**Data loading.** COPY INTO, Snowpipe, external stages, internal stages, file formats, bulk loading vs continuous loading. Know the difference between Snowpipe (serverless, event-driven, continuous) and COPY INTO (batch, warehouse-powered). Understand how staging works with S3, GCS, and Azure Blob.
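Both loading paths in one sketch, assuming an existing storage integration (`my_s3_int`, the bucket, and table names are placeholders):

```sql
-- Batch load: COPY INTO runs on a warehouse you control.
CREATE FILE FORMAT csv_fmt TYPE = CSV SKIP_HEADER = 1;

CREATE STAGE raw_stage
  URL = 's3://my-bucket/events/'
  STORAGE_INTEGRATION = my_s3_int
  FILE_FORMAT = csv_fmt;

COPY INTO events FROM @raw_stage PATTERN = '.*[.]csv';

-- Continuous load: Snowpipe is serverless and fires on cloud storage
-- event notifications when AUTO_INGEST = TRUE.
CREATE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO events FROM @raw_stage;
```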
**Performance and cost.** Clustering keys, query profiling, EXPLAIN, warehouse sizing, auto-suspend, auto-resume, scaling policies. Know when to use clustering keys (large tables with selective filters on non-natural sort columns) and when they waste credits (small tables, tables already well-clustered by ingestion order).
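In practice that looks like (table and warehouse names are placeholders):

```sql
-- Define a clustering key on a large table filtered by date and customer.
ALTER TABLE big_events CLUSTER BY (event_date, customer_id);

-- Check how well-clustered the table actually is before paying for
-- automatic reclustering:
SELECT SYSTEM$CLUSTERING_INFORMATION('big_events', '(event_date, customer_id)');

-- Aggressive auto-suspend is the cheapest optimization on the exam
-- and in real life:
ALTER WAREHOUSE my_wh SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;
```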
**SQL and semi-structured data.** Snowflake SQL syntax, semi-structured data (VARIANT, OBJECT, ARRAY), FLATTEN, LATERAL, QUALIFY, window functions. The SQL section is lighter than you might expect because most SQL knowledge is assumed. Focus on Snowflake-specific syntax like QUALIFY and FLATTEN.
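The two Snowflake-specific constructs side by side (an `events` table with a VARIANT `payload` column is assumed):

```sql
-- QUALIFY filters on a window function without a subquery:
-- latest event per user.
SELECT user_id, event_ts, payload
FROM events
QUALIFY ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY event_ts DESC) = 1;

-- FLATTEN explodes an array inside a VARIANT column; colon notation
-- navigates the JSON and ::STRING casts the extracted value.
SELECT e.user_id, f.value:sku::STRING AS sku
FROM events e,
     LATERAL FLATTEN(input => e.payload:items) f;
```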
**Data sharing.** Secure data sharing, reader accounts, data marketplace, listings. Know that data sharing is zero-copy (the consumer accesses the provider's storage directly), that reader accounts let you share with non-Snowflake users, and how listings work in the marketplace.
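The provider side of a direct share, as a sketch (object names and the account locator are placeholders):

```sql
-- A share is a named wrapper around grants.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE prod TO SHARE sales_share;
GRANT USAGE ON SCHEMA prod.reporting TO SHARE sales_share;
GRANT SELECT ON TABLE prod.reporting.daily_sales TO SHARE sales_share;

-- Add the consumer account. No data is copied; the consumer queries
-- the provider's storage directly.
ALTER SHARE sales_share ADD ACCOUNTS = xy12345;
```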
The Advanced Data Engineer exam is the specialist certification for people who build and maintain data pipelines on Snowflake. It goes beyond Core by testing your ability to design ingestion workflows, automate transformations, handle schema evolution, and optimize pipeline performance. This is the cert that separates “I know Snowflake” from “I build production systems on Snowflake.”
| Detail | Info |
|---|---|
| Exam Code | DEA-C01 |
| Prerequisite | SnowPro Core (active) |
| Question Count | 65 questions |
| Time Limit | 115 minutes |
| Passing Score | 750 / 1000 |
| Cost | $375 USD |
| Format | Multiple choice, multiple select (proctored) |
**Data movement.** Snowpipe (REST API and auto-ingest), Snowpipe Streaming, COPY INTO with transformations, external tables, dynamic tables, Streams and Tasks for CDC pipelines. Know how to chain Streams and Tasks to build an incremental processing pipeline. Understand the difference between append-only and standard streams.
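A minimal Streams-and-Tasks chain, assuming placeholder `orders` tables and an `etl_wh` warehouse:

```sql
-- A standard stream records inserts, updates, and deletes on the table.
CREATE STREAM orders_stream ON TABLE orders;

-- The task only runs when the stream has rows, and consuming the
-- stream inside the task advances its offset.
CREATE TASK process_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
  INSERT INTO orders_clean
  SELECT order_id, amount
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; nothing runs until you resume them.
ALTER TASK process_orders RESUME;
```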
**Performance optimization.** Query profiling, warehouse sizing strategies, clustering key selection, search optimization service, materialized views vs dynamic tables. Know when each optimization strategy applies. The exam tests whether you can pick the right tool for a given scenario, not just list features.
**Security and governance.** Row access policies, column-level masking, tag-based masking, object tagging, data classification, access history. This domain tests pipeline-specific governance: how do you mask PII in a staging layer, how do you audit who accessed what data, and how do you enforce policies across shared datasets?
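A masking policy sketch (the `PII_READER` role and `customers` table are placeholders):

```sql
-- Mask email addresses for everyone except a privileged role.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to the column; enforcement happens at query time.
ALTER TABLE customers MODIFY COLUMN email
  SET MASKING POLICY email_mask;
```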
**Transformation and semi-structured data.** VARIANT column optimization, FLATTEN for nested JSON and arrays, schema detection, schema evolution, Parquet/ORC/Avro ingestion. Expect scenario questions: "Your JSON payloads added a new nested field. How does your pipeline handle it without breaking downstream consumers?"
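Schema detection and evolution together, as a sketch (the stage and file format names are placeholders, and a Parquet file format is assumed to exist):

```sql
-- Let INFER_SCHEMA build the table from staged Parquet files.
CREATE TABLE raw_parquet
  USING TEMPLATE (
    SELECT ARRAY_AGG(OBJECT_CONSTRUCT(*))
    FROM TABLE(INFER_SCHEMA(
      LOCATION => '@raw_stage',
      FILE_FORMAT => 'parquet_fmt'))
  );

-- New columns arriving in later files are added automatically:
ALTER TABLE raw_parquet SET ENABLE_SCHEMA_EVOLUTION = TRUE;

-- Load by column name rather than position, so column order
-- changes in source files don't break the load.
COPY INTO raw_parquet
FROM @raw_stage
FILE_FORMAT = (FORMAT_NAME = 'parquet_fmt')
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```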
This plan assumes you use Snowflake at work or have completed Snowflake's free trial tutorials. If you are starting from zero Snowflake experience, double the timeline.
**Week 1.** Read the Snowflake documentation on architecture (cloud services, compute, storage). Draw the three-layer diagram from memory. Study micro-partitions, metadata, and how pruning works. Build a cheat sheet of all cache types and when each is used. Create a free trial account and run queries while studying.
**Week 2.** Study the role hierarchy, privilege model, and network policies. Practice COPY INTO from staged files, configure Snowpipe on a test table, and set up file formats for CSV and JSON. Know the difference between internal stages (user, table, named) and external stages.
**Week 3.** Study clustering keys and the query profile UI. Practice reading EXPLAIN output. Review data sharing mechanics (shares, reader accounts, marketplace). Brush up on Snowflake-specific SQL: QUALIFY, FLATTEN, OBJECT_CONSTRUCT, PARSE_JSON, and semi-structured data access with colon notation.
**Week 4.** Take 2 to 3 practice exams. Snowflake provides a free sample exam on their certification page. For every question you get wrong, go back to the documentation and read the full page on that topic. Focus your last few days on your weakest domain.
The Advanced exam requires hands-on experience. Reading documentation alone will not prepare you. You need to build actual pipelines.
Build a CDC pipeline using Streams and Tasks. Insert, update, and delete rows in a source table. Watch how the stream captures changes. Create a task that processes the stream on a schedule. Then rebuild the same pipeline using dynamic tables. Compare the two approaches.
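For the dynamic-table rebuild, the declarative equivalent looks roughly like this (table and warehouse names are placeholders):

```sql
-- A dynamic table replaces the stream + task pair with a declared
-- freshness target; Snowflake schedules the refreshes for you.
CREATE DYNAMIC TABLE orders_clean
  TARGET_LAG = '5 minutes'
  WAREHOUSE = etl_wh
AS
  SELECT order_id, amount
  FROM orders
  WHERE status = 'complete';
```

Comparing the refresh history of this table against your task run history makes the trade-off (control vs simplicity) concrete.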
Set up Snowpipe auto-ingest from S3. Load nested JSON into VARIANT columns. Use FLATTEN and LATERAL to normalize it. Test schema detection on Parquet files. Practice handling schema evolution (adding fields, changing types) without breaking downstream views.
Implement row access policies and masking policies on test tables. Study access history queries. Review warehouse sizing under different workload patterns. Take practice exams and fill gaps.
The honest answer depends on where you are in your career. Here is a breakdown by situation.
You are switching into data engineering. The cert gives you something concrete to put on a resume that has no DE experience yet. Recruiters use it as a filter.
Your target companies run Snowflake. If 3 of the 5 companies you are applying to list Snowflake in their job descriptions, the cert shows you took the platform seriously.
Your employer pays for it. At $175 for Core and $375 for Advanced, the cost is minimal if your company covers certification expenses. The study process itself fills knowledge gaps you might not know you have.
You want to go independent or contract. Contractors and consultants benefit more from certs than full-time employees. Clients hiring a Snowflake consultant want proof of expertise.
You already have 3+ years of Snowflake on your resume. If your resume says “Built a 50TB incremental pipeline on Snowflake processing 2B events/day,” no recruiter cares whether you have the cert. Your work speaks for itself.
Your target stack is not Snowflake. If you are going into a Databricks or BigQuery shop, a Snowflake cert does not help. Get the cert for the platform you will use.
You are time-constrained and need to practice SQL instead. If your SQL skills are weak, 4 weeks of SQL practice will improve your interview performance more than 4 weeks of cert study. SQL is tested in every data engineering interview. Snowflake specifics are tested only at Snowflake-heavy companies.
The exam fee is only part of the total cost. Here is the full picture.
| Item | Core | Advanced DE |
|---|---|---|
| Exam fee | $175 | $375 |
| Study materials | $0 (docs are free) | $0 (docs are free) |
| Snowflake trial account | Free ($400 credit) | Free ($400 credit) |
| Study time | 2-4 weeks | 4-6 weeks |
| Renewal (every 2 years) | $175 | $375 |
Tip: Snowflake occasionally runs certification promotions with discounted exam fees. Check the Snowflake community forums and their certification page before booking. Some Snowflake Summit events include free exam vouchers with attendance.
Build the SQL first. The cert questions will read differently when window functions aren't the thing slowing you down.