Snowflake COF-C02 Practice Test - Questions Answers, Page 8

Question 71

In the query profiler view for a query, which components represent areas that can be used to help optimize query performance? (Select TWO)

A. Bytes scanned
B. Bytes sent over the network
C. Number of partitions scanned
D. Percentage scanned from cache
E. External bytes scanned
Suggested answer: A, C

Explanation:

In the query profiler view, the components that represent areas that can be used to help optimize query performance include 'Bytes scanned' and 'Number of partitions scanned'. 'Bytes scanned' indicates the total amount of data the query had to read and is a direct indicator of the query's efficiency. Reducing the bytes scanned can lead to lower data transfer costs and faster query execution. 'Number of partitions scanned' reflects how well the data is clustered; fewer partitions scanned typically means better performance because the system can skip irrelevant data more effectively.
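For context, the same statistics surfaced in the Query Profile can also be pulled from the QUERY_HISTORY view in the SNOWFLAKE.ACCOUNT_USAGE schema. The query below is a minimal sketch, assuming access to that share; the time filter and limit are arbitrary.

-- Sketch: find recent queries that scanned many bytes or most of their partitions
SELECT query_id,
       bytes_scanned,
       partitions_scanned,
       partitions_total,
       percentage_scanned_from_cache
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
ORDER BY bytes_scanned DESC
LIMIT 20;

Queries where partitions_scanned is close to partitions_total are candidates for better pruning (for example, through clustering or more selective filters).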

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Query Profiling

Question 72

A marketing co-worker has requested the ability to change the size of their medium virtual warehouse, called mktg__WH.

Which of the following statements will accommodate this request?

A. ALLOW RESIZE ON WAREHOUSE MKTG__WH TO USER MKTG__LEAD;
B. GRANT MODIFY ON WAREHOUSE MKTG__WH TO ROLE MARKETING;
C. GRANT MODIFY ON WAREHOUSE MKTG__WH TO USER MKTG__LEAD;
D. GRANT OPERATE ON WAREHOUSE MKTG__WH TO ROLE MARKETING;
Suggested answer: B

Explanation:

The correct statement to accommodate the request is to grant the MODIFY privilege on the warehouse mktg__WH to the MARKETING role. This privilege allows the role to change the warehouse's properties, including its size.
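A minimal sketch of how this looks in practice, assuming the co-worker works under a role named MARKETING:

-- Grant the MODIFY privilege on the warehouse to the marketing role
GRANT MODIFY ON WAREHOUSE MKTG__WH TO ROLE MARKETING;

-- A user with the MARKETING role can then resize the warehouse, for example:
ALTER WAREHOUSE MKTG__WH SET WAREHOUSE_SIZE = 'LARGE';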

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Access Control Privileges

Question 73

When reviewing a query profile, what is a symptom that a query is too large to fit into memory?

A. A single join node uses more than 50% of the query time
B. Partitions scanned is equal to partitions total
C. An AggregateOperator node is present
D. The query is spilling to remote storage
Suggested answer: D

Explanation:

When a query in Snowflake is too large to fit into the available memory, it will start spilling to remote storage. This is an indication that the memory allocated for the query is insufficient for its execution, and as a result, Snowflake uses remote disk storage to handle the overflow. This spill to remote storage can lead to slower query performance due to the additional I/O operations required.
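Spilling can also be confirmed outside the profile view. The sketch below assumes access to SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY, which exposes the spill counters.

-- Sketch: list recent queries that spilled to remote storage
SELECT query_id,
       warehouse_size,
       bytes_spilled_to_local_storage,
       bytes_spilled_to_remote_storage
FROM snowflake.account_usage.query_history
WHERE bytes_spilled_to_remote_storage > 0
ORDER BY bytes_spilled_to_remote_storage DESC
LIMIT 20;

Queries that appear here are candidates for a larger warehouse or for rewriting to reduce the working set.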

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Query Profile

SnowPro Core Certification Exam Flashcards

Question 74

Which stage type can be altered and dropped?

A. Database stage
B. External stage
C. Table stage
D. User stage
Suggested answer: B

Explanation:

External stages can be altered and dropped in Snowflake. An external stage points to an external location, such as an S3 bucket, where data files are stored. Users can modify the stage's definition or drop it entirely if it's no longer needed. This is in contrast to table stages, which are tied to specific tables and cannot be altered or dropped independently.
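For illustration, the stage name, bucket URL, and storage integration below are placeholders; the point is that a named external stage supports both ALTER and DROP:

-- Create a named external stage pointing at an S3 bucket (names are examples)
CREATE STAGE my_ext_stage
  URL = 's3://my-bucket/data/'
  STORAGE_INTEGRATION = my_s3_int;

-- Alter the stage definition, e.g. point it at a different path
ALTER STAGE my_ext_stage SET URL = 's3://my-bucket/archive/';

-- Drop the stage when it is no longer needed
DROP STAGE my_ext_stage;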

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Stages

Question 75

From which Snowflake interface can the PUT command be used to stage local files?

A. SnowSQL
B. Snowflake classic web interface (UI)
C. Snowsight
D. .NET driver
Suggested answer: A

Explanation:

SnowSQL is the command-line client for Snowflake that allows users to execute SQL queries and perform all DDL and DML operations, including the PUT command for staging local files for bulk data loading. It is specifically designed for scripting and automating tasks; PUT is not supported from the web interfaces.
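For example, from a SnowSQL session PUT uploads a local file to a stage; the file path and stage name below are placeholders.

-- Upload a local file to the user stage; AUTO_COMPRESS gzips it on upload
PUT file:///tmp/sales.csv @~ AUTO_COMPRESS = TRUE;

-- Or upload to a named internal stage
PUT file:///tmp/sales.csv @my_int_stage;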

References:

SnowPro Core Certification Exam Study Guide

Snowflake Documentation on SnowSQL

https://docs.snowflake.com/en/user-guide/snowsql-use.html

Question 76

What is the recommended file sizing for data loading using Snowpipe?

A. A compressed file size greater than 100 MB, and up to 250 MB
B. A compressed file size greater than 100 GB, and up to 250 GB
C. A compressed file size greater than 10 MB, and up to 100 MB
D. A compressed file size greater than 1 GB, and up to 2 GB
Suggested answer: C

Explanation:

For data loading using Snowpipe, the recommended file size is a compressed file greater than 10 MB and up to 100 MB. This size range is optimal for Snowpipe's continuous, micro-batch loading process, allowing for efficient and timely data ingestion without overwhelming the system with files that are too large or too small.
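File sizing is a property of how files are produced upstream rather than of the pipe itself. For context, a minimal Snowpipe definition (all object names are placeholders, and auto-ingest additionally requires cloud event notifications to be configured) looks like this:

-- Minimal auto-ingest pipe; files landing in the stage should ideally be
-- 10-100 MB compressed so each micro-batch loads efficiently
CREATE PIPE sales_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO sales
  FROM @sales_stage
  FILE_FORMAT = (TYPE = 'CSV');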

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowpipe

Question 77

Which services does the Snowflake Cloud Services layer manage? (Select TWO).

A. Compute resources
B. Query execution
C. Authentication
D. Data storage
E. Metadata
Suggested answer: C, E

Explanation:

The Snowflake Cloud Services layer manages a variety of services that are crucial for the operation of the Snowflake platform. Among these services, Authentication and Metadata management are key components. Authentication is essential for controlling access to the Snowflake environment, ensuring that only authorized users can perform actions within the platform. Metadata management involves handling all the metadata related to objects within Snowflake, such as tables, views, and databases, which is vital for the organization and retrieval of data.
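One way to see the Cloud Services layer at work is that metadata-only operations complete without warehouse compute. The statements below are illustrative examples with placeholder object names:

-- Served from metadata by the Cloud Services layer; no warehouse compute is consumed
SHOW TABLES IN SCHEMA my_db.public;
DESCRIBE TABLE my_db.public.orders;
SELECT COUNT(*) FROM my_db.public.orders;  -- simple row count answered from micro-partition metadata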

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation

https://docs.snowflake.com/en/user-guide/intro-key-concepts.html

Question 78

What data is stored in the Snowflake storage layer? (Select TWO).

A. Snowflake parameters
B. Micro-partitions
C. Query history
D. Persisted query results
E. Standard and secure view results
Suggested answer: B, D

Explanation:

The Snowflake storage layer is responsible for storing data in an optimized, compressed, columnar format. This includes micro-partitions, which are the fundamental storage units that contain the actual data stored in Snowflake. Additionally, persisted query results, which are the results of queries that have been materialized and stored for future use, are also kept within this layer. This design allows for efficient data retrieval and management within the Snowflake architecture.
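Both kinds of stored data can be observed indirectly. The sketch below assumes a hypothetical table named orders with an order_date column; it inspects micro-partition clustering metadata and then reuses a persisted query result:

-- Inspect micro-partition clustering metadata for a table (name and column are examples)
SELECT SYSTEM$CLUSTERING_INFORMATION('orders', '(order_date)');

-- Reuse the persisted result of the previous query without re-executing it
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));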

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Key Concepts & Architecture | Snowflake Documentation2

Question 79

In which scenarios would a user have to pay Cloud Services costs? (Select TWO).

A. Compute Credits = 50; Cloud Services Credits = 10
B. Compute Credits = 80; Cloud Services Credits = 5
C. Compute Credits = 10; Cloud Services Credits = 9
D. Compute Credits = 120; Cloud Services Credits = 10
E. Compute Credits = 200; Cloud Services Credits = 26
Suggested answer: A, E

Explanation:

In Snowflake, Cloud Services usage is billed only when it exceeds 10% of the daily compute usage (measured in credits), and only the portion above that threshold is charged. In scenarios A and E, the Cloud Services credits (10 and 26) exceed 10% of the compute credits (5 and 20, respectively), so those scenarios incur Cloud Services charges.
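The arithmetic for the two scenarios discussed in the explanation can be worked through directly; the self-contained sketch below simply plugs in the credit figures from answer choices A and E.

-- Billable cloud services = cloud services credits above 10% of compute credits
SELECT scenario,
       compute_credits,
       cloud_services_credits,
       GREATEST(cloud_services_credits - 0.10 * compute_credits, 0) AS billable_cloud_services
FROM (VALUES ('A', 50, 10),
             ('E', 200, 26)) AS t(scenario, compute_credits, cloud_services_credits);

-- Scenario A: 10 - (0.10 * 50)  = 5 billable credits
-- Scenario E: 26 - (0.10 * 200) = 6 billable credits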

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake's official documentation on billing and usage

Question 80

What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....) statement? (Select TWO.)

A. Data can be filtered by an optional WHERE clause
B. Incoming data can be joined with other tables
C. Columns can be reordered
D. Columns can be omitted
E. Row level access can be defined
Suggested answer: A, D

Explanation:

In a CREATE PIPE ... AS COPY ... FROM (....) statement, the supported transformations include filtering data with an optional WHERE clause and omitting columns. The WHERE clause allows conditions to be specified so that only relevant data is loaded into the table, while omitting columns excludes columns from the load when the incoming data contains more columns than the target table needs.
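A minimal sketch of a pipe whose COPY statement reorders and omits staged columns; the table, stage, column names, and column positions are placeholders.

-- Load only the first and third columns of the staged CSV files, in a different order
CREATE PIPE orders_pipe
AS
  COPY INTO orders (order_id, amount)
  FROM (
    SELECT $1, $3
    FROM @orders_stage
  )
  FILE_FORMAT = (TYPE = 'CSV');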

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Simple Transformations During a Load
