Snowflake SnowPro Core Practice Test - Questions Answers, Page 8

Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?

A. Compute layer
B. Storage layer
C. Cloud infrastructure layer
D. Cloud services layer
Suggested answer: D

Explanation:

Query compilation in Snowflake occurs in the Cloud Services layer. This layer is responsible for coordinating and managing all aspects of the Snowflake service, including authentication, infrastructure management, metadata management, query parsing and optimization, and security. By handling these tasks, the Cloud Services layer enables the Compute layer to focus on executing queries, while the Storage layer is dedicated to persistently storing data.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowflake Architecture

Which is the MINIMUM required Snowflake edition that a user must have if they want to use AWS PrivateLink, Azure Private Link, or Google Cloud Private Service Connect?

A. Standard
B. Premium
C. Enterprise
D. Business Critical
Suggested answer: D

Explanation:

https://docs.snowflake.com/en/user-guide/admin-security-privatelink.html

In the query profiler view for a query, which components represent areas that can be used to help optimize query performance? (Select TWO)

A. Bytes scanned
B. Bytes sent over the network
C. Number of partitions scanned
D. Percentage scanned from cache
E. External bytes scanned
Suggested answer: A, C

Explanation:

In the query profiler view, the components that represent areas that can be used to help optimize query performance include 'Bytes scanned' and 'Number of partitions scanned'. 'Bytes scanned' indicates the total amount of data the query had to read and is a direct indicator of the query's efficiency. Reducing the bytes scanned can lead to lower data transfer costs and faster query execution. 'Number of partitions scanned' reflects how well the data is clustered; fewer partitions scanned typically means better performance because the system can skip irrelevant data more effectively.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Query Profiling
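Both metrics can also be inspected outside the profiler. As a hedged sketch, the ACCOUNT_USAGE.QUERY_HISTORY view exposes these statistics per query (column names as documented by Snowflake; the one-day window is an arbitrary example):

```sql
-- Compare bytes scanned and partition pruning for recent queries.
-- PARTITIONS_SCANNED close to PARTITIONS_TOTAL suggests poor pruning.
SELECT query_id,
       bytes_scanned,
       partitions_scanned,
       partitions_total,
       partitions_scanned / NULLIF(partitions_total, 0) AS scan_ratio
FROM snowflake.account_usage.query_history
WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
ORDER BY bytes_scanned DESC
LIMIT 10;
```

A low scan_ratio indicates effective partition pruning; queries near 1.0 are candidates for better clustering or more selective filters.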

A marketing co-worker has requested the ability to change a warehouse size on their medium virtual warehouse called mktg__WH.

Which of the following statements will accommodate this request?

A. ALLOW RESIZE ON WAREHOUSE MKTG__WH TO USER MKTG__LEAD;
B. GRANT MODIFY ON WAREHOUSE MKTG__WH TO ROLE MARKETING;
C. GRANT MODIFY ON WAREHOUSE MKTG__WH TO USER MKTG__LEAD;
D. GRANT OPERATE ON WAREHOUSE MKTG__WH TO ROLE MARKETING;
Suggested answer: B

Explanation:

The correct statement to accommodate the request for a marketing co-worker to change the size of their medium virtual warehouse called MKTG__WH is to grant the MODIFY privilege on the warehouse to the MARKETING role. This privilege allows the role to change the warehouse size, among other properties.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Access Control Privileges
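A minimal sketch of the pattern, assuming the co-worker holds the MARKETING role (warehouse and role names taken from the question):

```sql
-- Allow the MARKETING role to alter warehouse properties, including size.
GRANT MODIFY ON WAREHOUSE MKTG__WH TO ROLE MARKETING;

-- A member of that role can then resize the warehouse:
ALTER WAREHOUSE MKTG__WH SET WAREHOUSE_SIZE = 'LARGE';
```

By contrast, the OPERATE privilege only permits starting, stopping, suspending, and resuming a warehouse, not resizing it, which is why option D does not satisfy the request.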

When reviewing a query profile, what is a symptom that a query is too large to fit into the memory?

A. A single join node uses more than 50% of the query time
B. Partitions scanned is equal to partitions total
C. An AggregateOperator node is present
D. The query is spilling to remote storage
Suggested answer: D

Explanation:

When a query in Snowflake is too large to fit into the available memory, it will start spilling to remote storage. This is an indication that the memory allocated for the query is insufficient for its execution, and as a result, Snowflake uses remote disk storage to handle the overflow. This spill to remote storage can lead to slower query performance due to the additional I/O operations required.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Query Profile

Snowpro Core Certification Exam Flashcards
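Spilling can also be detected across many queries at once. As a hedged sketch, the ACCOUNT_USAGE.QUERY_HISTORY view exposes spill volumes per query (column names as documented by Snowflake):

```sql
-- Find recent queries that spilled to disk. Remote spilling is the more
-- severe symptom: memory and local disk were both exhausted.
SELECT query_id,
       warehouse_name,
       bytes_spilled_to_local_storage,
       bytes_spilled_to_remote_storage
FROM snowflake.account_usage.query_history
WHERE bytes_spilled_to_remote_storage > 0
ORDER BY bytes_spilled_to_remote_storage DESC
LIMIT 10;
```

Queries that spill heavily to remote storage are candidates for a larger warehouse size or a rewrite that reduces the working set.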

Which stage type can be altered and dropped?

A. Database stage
B. External stage
C. Table stage
D. User stage
Suggested answer: B

Explanation:

External stages can be altered and dropped in Snowflake. An external stage points to an external location, such as an S3 bucket, where data files are stored. Users can modify the stage's definition or drop it entirely if it is no longer needed. This is in contrast to table and user stages, which are implicit stages tied to a specific table or user and cannot be altered or dropped independently.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Stages
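The lifecycle of a named external stage can be sketched as follows (the bucket URL, paths, and stage name are placeholders; credentials or a storage integration would normally be required as well):

```sql
-- Create a named external stage pointing at an S3 location.
CREATE STAGE my_ext_stage
  URL = 's3://my-bucket/data/'
  FILE_FORMAT = (TYPE = 'CSV');

-- Alter its definition, e.g. repoint it at a new path.
ALTER STAGE my_ext_stage SET URL = 's3://my-bucket/new-path/';

-- Drop it when it is no longer needed.
DROP STAGE my_ext_stage;
```

No equivalent ALTER or DROP exists for the implicit table stage (@%table_name) or user stage (@~).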

Which command can be used to stage local files from which Snowflake interface?

A. SnowSQL
B. Snowflake classic web interface (UI)
C. Snowsight
D. .NET driver
Suggested answer: A

Explanation:

SnowSQL is the command-line client for Snowflake that allows users to execute SQL queries and perform DDL and DML operations, as well as stage local files with the PUT command for bulk data loading. It is specifically designed for scripting and automating tasks.

SnowPro Core Certification Exam Study Guide

Snowflake Documentation on SnowSQL

https://docs.snowflake.com/en/user-guide/snowsql-use.html
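A hedged sketch of staging a local file from a SnowSQL session (the local path and stage name are placeholders):

```sql
-- PUT runs only from a client such as SnowSQL, not from the web UIs.
PUT file:///tmp/sales.csv @my_internal_stage AUTO_COMPRESS = TRUE;

-- Verify the staged (and now gzip-compressed) file:
LIST @my_internal_stage;
```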

Why does Snowflake recommend file sizes of 100-250 MB compressed when loading data?

A. Optimizes the virtual warehouse size and multi-cluster setting to economy mode
B. Allows a user to import the files in a sequential order
C. Increases the latency staging and accuracy when loading the data
D. Allows optimization of parallel operations
Suggested answer: D

Explanation:

Snowflake recommends file sizes between 100-250 MB compressed when loading data in order to optimize parallel processing. Files in this range can be loaded in parallel across the threads of a virtual warehouse, which maximizes warehouse efficiency and speeds up the data loading process.
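The parallelism comes from loading many right-sized files in one operation, as in this hedged sketch (table, stage, and file-name pattern are placeholders):

```sql
-- Load a set of ~100-250 MB compressed files in one COPY; Snowflake
-- distributes the files across the warehouse's parallel load threads,
-- so many medium files load faster than one very large file.
COPY INTO sales
FROM @my_stage/sales/
PATTERN = '.*sales_part_.*[.]csv[.]gz';
```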

Which of the following features are available with the Snowflake Enterprise edition? (Choose two.)

A. Database replication and failover
B. Automated index management
C. Customer managed keys (Tri-secret secure)
D. Extended time travel
E. Native support for geospatial data
Suggested answer: A, D

Explanation:

The Snowflake Enterprise edition includes database replication and failover for business continuity and disaster recovery, as well as extended time travel capabilities for longer data retention periods.
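Extended Time Travel in practice means retention beyond the 1-day default, as in this sketch (the table name is a placeholder):

```sql
-- Enterprise edition and above allow up to 90 days of Time Travel;
-- Standard edition caps DATA_RETENTION_TIME_IN_DAYS at 1.
ALTER TABLE sales SET DATA_RETENTION_TIME_IN_DAYS = 90;
```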

What is the default file size when unloading data from Snowflake using the COPY command?

A. 5 MB
B. 8 GB
C. 16 MB
D. 32 MB
Suggested answer: C

Explanation:

When unloading data with the COPY INTO <location> command, Snowflake writes output files with a default maximum size of 16 MB per file. This limit is controlled by the MAX_FILE_SIZE copy option, which can be raised or lowered as needed.
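A hedged sketch of overriding the default (stage path and table name are placeholders; MAX_FILE_SIZE is specified in bytes):

```sql
-- Unload with an explicit file size cap; without MAX_FILE_SIZE,
-- output files are capped at the 16 MB default.
COPY INTO @my_stage/unload/
FROM sales
MAX_FILE_SIZE = 104857600;  -- ~100 MB per output file
```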

Total 627 questions