Snowflake COF-C02 Practice Test - Questions Answers, Page 8
Question 71

In the query profiler view for a query, which components represent areas that can be used to help optimize query performance? (Select TWO)
Explanation:
In the query profiler view, the components that represent areas that can be used to help optimize query performance include 'Bytes scanned' and 'Number of partitions scanned'. 'Bytes scanned' indicates the total amount of data the query had to read and is a direct indicator of the query's efficiency. Reducing the bytes scanned can lead to lower data transfer costs and faster query execution. 'Number of partitions scanned' reflects how well the data is clustered; fewer partitions scanned typically means better performance because the system can skip irrelevant data more effectively.
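For illustration, the same two statistics can also be reviewed outside the profiler through the ACCOUNT_USAGE.QUERY_HISTORY view; the sketch below (standard view and columns, arbitrary ordering and limit) surfaces the queries that scanned the most data:

SELECT query_id,
       bytes_scanned,
       partitions_scanned,
       partitions_total
FROM snowflake.account_usage.query_history
WHERE partitions_total > 0          -- ignore metadata-only queries
ORDER BY bytes_scanned DESC
LIMIT 10;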
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Snowflake Documentation on Query Profiling
Question 72

A marketing co-worker has requested the ability to change a warehouse size on their medium virtual warehouse called mktg__WH.
Which of the following statements will accommodate this request?
Explanation:
The correct statement to accommodate the request for a marketing co-worker to change the size of their medium virtual warehouse called mktg__WH is to grant the MODIFY privilege on the warehouse to the MARKETING role. This privilege allows the role to change the warehouse size, among other properties.
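As a sketch, the grant and a subsequent resize could look like the following (the warehouse and role names come from the question; the target size is arbitrary):

GRANT MODIFY ON WAREHOUSE mktg__WH TO ROLE MARKETING;
-- A user running with the MARKETING role can then resize the warehouse:
ALTER WAREHOUSE mktg__WH SET WAREHOUSE_SIZE = 'LARGE';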
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Snowflake Documentation on Access Control Privileges
Question 73

When reviewing a query profile, what is a symptom that a query is too large to fit into memory?
Explanation:
When a query in Snowflake is too large to fit into the available memory, it will start spilling to remote storage. This is an indication that the memory allocated for the query is insufficient for its execution, and as a result, Snowflake uses remote disk storage to handle the overflow. This spill to remote storage can lead to slower query performance due to the additional I/O operations required.
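Besides the query profile itself, remote spilling can also be identified historically through the ACCOUNT_USAGE.QUERY_HISTORY view; a minimal sketch:

SELECT query_id,
       bytes_spilled_to_local_storage,
       bytes_spilled_to_remote_storage
FROM snowflake.account_usage.query_history
WHERE bytes_spilled_to_remote_storage > 0   -- symptom of insufficient warehouse memory
ORDER BY bytes_spilled_to_remote_storage DESC
LIMIT 10;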
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Snowflake Documentation on Query Profile
SnowPro Core Certification Exam Flashcards
Question 74

Which stage type can be altered and dropped?
Explanation:
External stages can be altered and dropped in Snowflake. An external stage points to an external location, such as an S3 bucket, where data files are stored. Users can modify the stage's definition or drop it entirely if it's no longer needed. This is in contrast to table stages, which are tied to specific tables and cannot be altered or dropped independently.
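A minimal sketch of that lifecycle (the stage name and S3 URLs are placeholders):

-- Create an external stage pointing at an S3 location
CREATE STAGE my_ext_stage
  URL = 's3://my-bucket/load/';
-- Alter the stage definition, for example to point at a new path
ALTER STAGE my_ext_stage
  SET URL = 's3://my-bucket/archive/';
-- Drop the stage when it is no longer needed
DROP STAGE my_ext_stage;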
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Snowflake Documentation on Stages
Question 75

Which command can be used to stage local files from which Snowflake interface?
Explanation:
SnowSQL is the command-line client for Snowflake that allows users to execute SQL queries and perform all DDL and DML operations, including staging files for bulk data loading. It is specifically designed for scripting and automating tasks.
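For example, within a SnowSQL session the PUT command uploads a local file to a stage (the file path is a placeholder; PUT cannot be executed from the classic web interface worksheets):

-- Upload and compress a local file into the user stage
PUT file:///tmp/sales_2024.csv @~/staged AUTO_COMPRESS = TRUE;
-- Confirm the file was staged
LIST @~/staged;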
References:
SnowPro Core Certification Exam Study Guide
Snowflake Documentation on SnowSQL
https://docs.snowflake.com/en/user-guide/snowsql-use.html
Question 76

What is the recommended file sizing for data loading using Snowpipe?
Explanation:
For data loading using Snowpipe, the recommended file size is a compressed file greater than 10 MB and up to 100 MB. This size range is optimal for Snowpipe's continuous, micro-batch loading process, allowing for efficient and timely data ingestion without overwhelming the system with files that are too large or too small.
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Snowflake Documentation on Snowpipe
Question 77

Which services does the Snowflake Cloud Services layer manage? (Select TWO).
Explanation:
The Snowflake Cloud Services layer manages a variety of services that are crucial for the operation of the Snowflake platform. Among these services, Authentication and Metadata management are key components. Authentication is essential for controlling access to the Snowflake environment, ensuring that only authorized users can perform actions within the platform. Metadata management involves handling all the metadata related to objects within Snowflake, such as tables, views, and databases, which is vital for the organization and retrieval of data.
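As a concrete illustration, metadata operations such as the following are served by the Cloud Services layer and do not require a running virtual warehouse (object names are placeholders):

SHOW TABLES IN SCHEMA sales_db.public;
DESCRIBE TABLE sales_db.public.orders;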
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Snowflake Documentation
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html
Question 78

What data is stored in the Snowflake storage layer? (Select TWO).
Explanation:
The Snowflake storage layer is responsible for storing data in an optimized, compressed, columnar format. This includes micro-partitions, which are the fundamental storage units that contain the actual data stored in Snowflake. Additionally, persisted query results, which are the results of queries that have been materialized and stored for future use, are also kept within this layer. This design allows for efficient data retrieval and management within the Snowflake architecture.
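As an illustrative sketch, a persisted query result can be reused without rescanning micro-partitions via RESULT_SCAN (the table name is a placeholder):

SELECT * FROM sales_db.public.orders;              -- reads micro-partitions from the storage layer
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())); -- reuses the persisted result of the previous query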
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Key Concepts & Architecture | Snowflake Documentation
Question 79

In which scenarios would a user have to pay Cloud Services costs? (Select TWO).
Explanation:
In Snowflake, Cloud Services costs are incurred when the Cloud Services usage exceeds 10% of the compute usage (measured in credits). Therefore, scenarios A and E would result in Cloud Services charges because the Cloud Services usage is more than 10% of the compute credits used.
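To illustrate the 10% rule with hypothetical numbers: if an account consumes 100 compute credits and 12 Cloud Services credits in a day, roughly 2 Cloud Services credits are billable (12 minus 10% of 100). A sketch for reviewing the daily breakdown, assuming the standard ACCOUNT_USAGE.METERING_DAILY_HISTORY view:

SELECT usage_date,
       credits_used_compute,
       credits_used_cloud_services,
       credits_adjustment_cloud_services,   -- the 10% adjustment applied by Snowflake
       credits_billed
FROM snowflake.account_usage.metering_daily_history
ORDER BY usage_date DESC;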
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Snowflake's official documentation on billing and usage
Question 80

What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....) statement? (Select TWO.)
Explanation:
In a CREATE PIPE ... AS COPY ... FROM (....) statement, the supported transformations include filtering data using an optional WHERE clause and omitting columns. The WHERE clause allows for the specification of conditions to filter the data that is being loaded, ensuring only relevant data is inserted into the table. Omitting columns enables the exclusion of certain columns from the data load, which can be useful when the incoming data contains more columns than are needed for the target table.
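A minimal sketch of the column-omission case (pipe, table, and stage names are placeholders; the staged CSV files are assumed to have three fields, of which the second is skipped):

CREATE PIPE sales_pipe AS
  COPY INTO sales
  FROM (SELECT $1, $3 FROM @sales_stage)   -- load only the first and third fields
  FILE_FORMAT = (TYPE = 'CSV');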
References:
[COF-C02] SnowPro Core Certification Exam Study Guide
Simple Transformations During a Load