
Snowflake SnowPro Core Practice Test - Questions Answers, Page 8


Question 71


Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?

A. Compute layer
B. Storage layer
C. Cloud infrastructure layer
D. Cloud services layer
Suggested answer: D

Explanation:

Query compilation in Snowflake occurs in the Cloud Services layer. This layer is responsible for coordinating and managing all aspects of the Snowflake service, including authentication, infrastructure management, metadata management, query parsing and optimization, and security. By handling these tasks, the Cloud Services layer enables the Compute layer to focus on executing queries, while the Storage layer is dedicated to persistently storing data.
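
As a rough illustration, the EXPLAIN command surfaces the plan produced during compilation in the Cloud Services layer, before any Compute-layer execution (the table name and filter below are hypothetical):

-- EXPLAIN returns the compiled query plan without executing the query;
-- MY_TABLE and the REGION filter are illustrative only.
EXPLAIN SELECT COUNT(*) FROM MY_TABLE WHERE region = 'EMEA';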

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowflake Architecture


Question 72


Which is the MINIMUM required Snowflake edition that a user must have if they want to use AWS/Azure PrivateLink or Google Cloud Private Service Connect?

A. Standard
B. Premium
C. Enterprise
D. Business Critical
Suggested answer: D

Explanation:

Private connectivity to the Snowflake service, whether via AWS PrivateLink, Azure Private Link, or Google Cloud Private Service Connect, requires the Business Critical edition or higher.

https://docs.snowflake.com/en/user-guide/admin-security-privatelink.html
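
Once on a qualifying edition, an account administrator can inspect the private connectivity setup with the SYSTEM$GET_PRIVATELINK_CONFIG system function; a minimal check:

-- Returns the account's private connectivity configuration as JSON;
-- intended for accounts with private connectivity enabled (Business Critical or higher).
SELECT SYSTEM$GET_PRIVATELINK_CONFIG();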


Question 73


In the query profiler view for a query, which components represent areas that can be used to help optimize query performance? (Select TWO)

A. Bytes scanned
B. Bytes sent over the network
C. Number of partitions scanned
D. Percentage scanned from cache
E. External bytes scanned
Suggested answer: A, C

Explanation:

In the Query Profile view, the components most useful for optimizing query performance are 'Bytes scanned' and 'Number of partitions scanned'. 'Bytes scanned' indicates the total amount of data the query had to read and is a direct indicator of the query's efficiency; reducing it lowers data transfer costs and speeds up execution. 'Number of partitions scanned' reflects how well the data is clustered: the fewer partitions scanned relative to the total, the more effectively Snowflake can prune irrelevant micro-partitions.
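
The same signals can be queried outside the profiler from the ACCOUNT_USAGE.QUERY_HISTORY view; a sketch (the 0.9 threshold is illustrative) that flags queries with poor partition pruning:

-- Queries whose pruning is poor: partitions scanned close to partitions total.
SELECT query_id,
       bytes_scanned,
       partitions_scanned,
       partitions_total
FROM snowflake.account_usage.query_history
WHERE partitions_total > 0
  AND partitions_scanned / partitions_total > 0.9  -- illustrative threshold
ORDER BY bytes_scanned DESC
LIMIT 10;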

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Query Profiling


Question 74


A marketing co-worker has requested the ability to change the warehouse size on their medium virtual warehouse called MKTG_WH.

Which of the following statements will accommodate this request?

A. ALLOW RESIZE ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;
B. GRANT MODIFY ON WAREHOUSE MKTG_WH TO ROLE MARKETING;
C. GRANT MODIFY ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;
D. GRANT OPERATE ON WAREHOUSE MKTG_WH TO ROLE MARKETING;
Suggested answer: B

Explanation:

The correct statement is to grant the MODIFY privilege on the warehouse MKTG_WH to the MARKETING role. MODIFY allows the role to alter warehouse properties, including its size. Note that Snowflake privileges are granted to roles, never directly to users, which rules out the USER-targeted options (and ALLOW RESIZE is not valid Snowflake syntax).
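
A minimal sketch of the full flow, assuming the MARKETING role has already been granted to the co-worker:

-- As an administrator: allow the MARKETING role to modify the warehouse.
GRANT MODIFY ON WAREHOUSE MKTG_WH TO ROLE MARKETING;

-- As the co-worker, using the MARKETING role: resize the warehouse.
USE ROLE MARKETING;
ALTER WAREHOUSE MKTG_WH SET WAREHOUSE_SIZE = 'LARGE';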

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Access Control Privileges


Question 75


When reviewing a query profile, what is a symptom that a query is too large to fit into memory?

A. A single join node uses more than 50% of the query time
B. Partitions scanned is equal to partitions total
C. An AggregateOperator node is present
D. The query is spilling to remote storage
Suggested answer: D

Explanation:

When a query is too large to fit into the available memory of the warehouse, Snowflake spills intermediate results to disk: first to the local disk of the warehouse nodes and, once that is exhausted, to remote storage. Spilling to remote storage is therefore a clear symptom of memory pressure, and it slows the query considerably because of the additional I/O operations required.
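
Spilling is also visible outside the profiler via ACCOUNT_USAGE.QUERY_HISTORY; a sketch that surfaces the worst offenders:

-- Queries that overflowed memory: local spill is bad, remote spill is worse.
SELECT query_id,
       bytes_spilled_to_local_storage,
       bytes_spilled_to_remote_storage
FROM snowflake.account_usage.query_history
WHERE bytes_spilled_to_remote_storage > 0
ORDER BY bytes_spilled_to_remote_storage DESC
LIMIT 10;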

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Query Profile

SnowPro Core Certification Exam Flashcards


Question 76


Which stage type can be altered and dropped?

A. Database stage
B. External stage
C. Table stage
D. User stage
Suggested answer: B

Explanation:

External stages can be altered and dropped in Snowflake. An external stage is a named object that points to an external location, such as an S3 bucket, where data files are stored; its definition can be modified with ALTER STAGE or removed with DROP STAGE when no longer needed. By contrast, table stages and user stages are implicit stages that exist automatically for each table and user, and they cannot be altered or dropped independently.
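
A minimal lifecycle sketch for an external stage (the stage name and bucket URL are hypothetical):

-- Create an external stage pointing at an S3 location.
CREATE STAGE my_ext_stage URL = 's3://my-bucket/data/';

-- Alter it, e.g. to point at a new prefix.
ALTER STAGE my_ext_stage SET URL = 's3://my-bucket/data-v2/';

-- Drop it when no longer needed.
DROP STAGE my_ext_stage;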

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Stages


Question 77


From which Snowflake interface can a command be used to stage local files?

A. SnowSQL
B. Snowflake classic web interface (UI)
C. Snowsight
D. .NET driver
Suggested answer: A

Explanation:

SnowSQL is the command-line client for Snowflake. It allows users to execute SQL queries and perform all DDL and DML operations, including staging local files with the PUT command for bulk data loading, and it is specifically designed for scripting and automating tasks.
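
A minimal SnowSQL session sketch; PUT uploads the local file to a stage, and a COPY then loads it (the file path and table name are hypothetical):

-- From a SnowSQL session: upload a local file to the table stage of MY_TABLE,
-- then load it into the table.
PUT file:///tmp/data.csv @%my_table;
COPY INTO my_table FROM @%my_table FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);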

SnowPro Core Certification Exam Study Guide

Snowflake Documentation on SnowSQL

https://docs.snowflake.com/en/user-guide/snowsql-use.html


Question 78


Why does Snowflake recommend file sizes of 100-250 MB compressed when loading data?

A. Optimizes the virtual warehouse size and multi-cluster setting to economy mode
B. Allows a user to import the files in a sequential order
C. Increases the latency staging and accuracy when loading the data
D. Allows optimization of parallel operations
Suggested answer: D

Explanation:

Snowflake recommends compressed file sizes of roughly 100-250 MB when loading data so that the load can be parallelized. Files in this range can be distributed evenly across the threads of a virtual warehouse and loaded concurrently, maximizing warehouse utilization and speeding up the overall data loading process.
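
One way to end up with files in this range is to cap the file size when producing them; a sketch, with the stage and table names hypothetical:

-- Unload into ~250 MB chunks so a later reload can parallelize well.
COPY INTO @my_stage/export/
FROM my_table
MAX_FILE_SIZE = 250000000;  -- bytes; yields multiple right-sized files

-- Loading many such files lets the warehouse ingest them in parallel.
COPY INTO my_table_copy FROM @my_stage/export/;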


Question 79


Which of the following features are available with the Snowflake Enterprise edition? (Choose two.)

A. Database replication and failover
B. Automated index management
C. Customer managed keys (Tri-secret secure)
D. Extended time travel
E. Native support for geospatial data
Suggested answer: A, D

Explanation:

The Snowflake Enterprise edition includes database replication and failover for business continuity and disaster recovery, as well as extended Time Travel, which raises the maximum data retention period from the 1-day default up to 90 days.
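
As a concrete example of extended Time Travel, an Enterprise account can raise a table's retention beyond the 1-day default (the table name is hypothetical):

-- Enterprise edition allows up to 90 days of Time Travel retention.
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 90;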


Question 80


What is the default file size when unloading data from Snowflake using the COPY command?

A. 5 MB
B. 8 GB
C. 16 MB
D. 32 MB
Suggested answer: C

Explanation:

When unloading data with COPY INTO <location>, each output file is capped at 16 MB (16,777,216 bytes) by default. This cap is controlled by the MAX_FILE_SIZE copy option, which can be raised when larger output files are desired.
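
A sketch of an unload that overrides the 16 MB default (stage and table names are hypothetical):

-- Default MAX_FILE_SIZE is 16777216 bytes (16 MB); override it explicitly.
COPY INTO @my_stage/unload/
FROM my_table
MAX_FILE_SIZE = 104857600;  -- 100 MB per file instead of the 16 MB default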
