ExamGecko
Home / Snowflake / COF-C02

Snowflake COF-C02 Practice Test - Questions Answers, Page 9


Question 81


What is a responsibility of Snowflake's virtual warehouses?

A. Infrastructure management
B. Metadata management
C. Query execution
D. Query parsing and optimization
E. Management of the storage layer
Suggested answer: C

Explanation:

The primary responsibility of Snowflake's virtual warehouses is to execute queries. Virtual warehouses are one of the key components of Snowflake's architecture, providing the compute power required to perform data processing tasks such as running SQL queries, performing joins, aggregations, and other data manipulations.
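As a sketch of this division of labor, the statements below create and use a warehouse (the name is hypothetical) to execute a query; parsing and optimization happen in the cloud services layer, while the warehouse supplies the compute that runs the plan. The sample query assumes the shared SNOWFLAKE_SAMPLE_DATA database is available in the account:

```sql
-- Hypothetical warehouse; any size works for illustration.
CREATE WAREHOUSE IF NOT EXISTS demo_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;

USE WAREHOUSE demo_wh;

-- The warehouse executes this query (joins, aggregation);
-- parsing and optimization are handled by cloud services.
SELECT c.c_custkey, SUM(o.o_totalprice) AS total_spend
FROM snowflake_sample_data.tpch_sf1.orders o
JOIN snowflake_sample_data.tpch_sf1.customer c
  ON o.o_custkey = c.c_custkey
GROUP BY c.c_custkey
LIMIT 10;
```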

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Virtual Warehouses

asked 23/09/2024
Olugbenga Fagbohun

Question 82


Which of the following compute resources or features are managed by Snowflake? (Select TWO).

A. Execute a COPY command
B. Updating data
C. Snowpipe
D. AUTOMATIC_CLUSTERING
E. Scaling up a warehouse
Suggested answer: C, D

Explanation:

Snowpipe and Automatic Clustering both run on serverless compute resources that Snowflake provisions, sizes, and manages entirely on the user's behalf. Snowpipe is Snowflake's continuous data ingestion service that loads data as soon as it becomes available; Automatic Clustering reclusters tables with a defined clustering key in the background. By contrast, executing a COPY command, updating data, and scaling up a warehouse all rely on a user-managed virtual warehouse or an explicit user action (a warehouse is resized with ALTER WAREHOUSE).
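Serverless, Snowflake-managed compute is billed separately from virtual warehouses and can be observed through ACCOUNT_USAGE views (requires appropriate privileges; a minimal sketch):

```sql
-- Snowpipe's serverless credit consumption:
SELECT pipe_name, start_time, credits_used
FROM snowflake.account_usage.pipe_usage_history
ORDER BY start_time DESC
LIMIT 10;

-- Automatic Clustering's serverless credit consumption:
SELECT table_name, start_time, credits_used
FROM snowflake.account_usage.automatic_clustering_history
ORDER BY start_time DESC
LIMIT 10;
```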

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowpipe and Virtual Warehouses

asked 23/09/2024
Rakesh Sharma

Question 83


What happens when a virtual warehouse is resized?

A. When increasing the size of an active warehouse, the compute resources for all running and queued queries on the warehouse are affected.
B. When reducing the size of a warehouse, the compute resources are removed only when they are no longer being used to execute any current statements.
C. The warehouse will be suspended while the new compute resources are provisioned and will resume automatically once provisioning is complete.
D. Users who are trying to use the warehouse will receive an error message until the resizing is complete.
Suggested answer: B

Explanation:

Resizing a running warehouse does not affect queries that are already being processed; the additional compute resources, once provisioned, are used only for queued and new queries. Conversely, when the size of a warehouse is reduced, the compute resources are removed only when they are no longer being used to execute any current statements. The warehouse is not suspended during resizing, and users do not receive error messages while it completes.
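A resize is a single ALTER WAREHOUSE statement (the warehouse name is hypothetical):

```sql
-- Grow the warehouse: the extra resources, once provisioned,
-- serve queued and new queries; running queries are unaffected.
ALTER WAREHOUSE demo_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Shrink the warehouse: resources are released only after the
-- statements currently using them finish.
ALTER WAREHOUSE demo_wh SET WAREHOUSE_SIZE = 'SMALL';
```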

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Virtual Warehouses

asked 23/09/2024
jateen chibabhai

Question 84


What tasks can be completed using the COPY command? (Select TWO)

A. Columns can be aggregated
B. Columns can be joined with an existing table
C. Columns can be reordered
D. Columns can be omitted
E. Data can be loaded without the need to spin up a virtual warehouse
Suggested answer: C, D

Explanation:

The COPY command in Snowflake allows for the reordering of columns as they are loaded into a table, and it also permits the omission of columns from the source file during the load process. This provides flexibility in handling the schema of the data being ingested. References: [COF-C02] SnowPro Core Certification Exam Study Guide
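Reordering and omitting columns is done with COPY's transformation syntax, selecting fields from the staged file by position ($1, $2, ...). A minimal sketch with hypothetical stage, table, and column names:

```sql
-- The target column list (city, id) differs from the file's order,
-- and the file's second field is omitted entirely.
COPY INTO my_table (city, id)
FROM (
  SELECT t.$3, t.$1          -- reorder: field 3 then field 1; $2 is skipped
  FROM @my_stage/data.csv.gz t
)
FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"');
```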

asked 23/09/2024
David Fernando del Villar

Question 85


What feature can be used to reorganize a very large table on one or more columns?

A. Micro-partitions
B. Clustering keys
C. Key partitions
D. Clustered partitions
Suggested answer: B

Explanation:

Clustering keys in Snowflake are used to reorganize large tables based on one or more columns. This feature optimizes the arrangement of data within micro-partitions to improve query performance, especially for large tables where efficient data retrieval is crucial. References: [COF-C02] SnowPro Core Certification Exam Study Guide

https://docs.snowflake.com/en/user-guide/tables-clustering-keys.html
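A clustering key is defined with ALTER TABLE, after which Snowflake reclusters the table's micro-partitions in the background using serverless compute. A sketch with hypothetical table and column names:

```sql
-- Reorganize a very large table on one or more columns:
ALTER TABLE sales CLUSTER BY (region, sale_date);

-- Inspect how well the table is clustered on those columns:
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(region, sale_date)');
```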

asked 23/09/2024
vceplus plus

Question 86


What SQL command would be used to view all roles that were granted to USER1?

A. show grants to user USER1;
B. show grants of user USER1;
C. describe user USER1;
D. show grants on user USER1;
Suggested answer: A

Explanation:

The correct command to view all roles granted to a specific user in Snowflake is SHOW GRANTS TO USER <user_name>;. This command lists all roles that have been explicitly granted to the specified user. References: SHOW GRANTS | Snowflake Documentation
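For contrast, the TO and OF variants answer different questions (the role name below is hypothetical):

```sql
-- Roles granted TO a user:
SHOW GRANTS TO USER USER1;

-- Users and roles that a given role was granted to:
SHOW GRANTS OF ROLE analyst_role;
```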

asked 23/09/2024
ROBERTO INFANTAS

Question 87


Which of the following can be executed/called with Snowpipe?

A. A User Defined Function (UDF)
B. A stored procedure
C. A single COPY INTO statement
D. A single INSERT INTO statement
Suggested answer: C

Explanation:

Snowpipe is used for continuous, automated data loading into Snowflake. It executes a single COPY INTO <table> statement, defined inside a pipe object, to load data from files as soon as they are available in a stage. Snowpipe does not execute UDFs, stored procedures, or INSERT statements. References: Snowpipe | Snowflake Documentation
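A pipe is simply a wrapper around one COPY INTO statement; Snowpipe runs it with serverless compute as files arrive. A sketch with hypothetical pipe, table, and stage names (AUTO_INGEST = TRUE assumes an external stage with event notifications configured):

```sql
CREATE PIPE my_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_events
  FROM @landing_stage
  FILE_FORMAT = (TYPE = 'JSON');
```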

asked 23/09/2024
Arturs Grigorjevs

Question 88


What Snowflake role must be granted for a user to create and manage accounts?

A. ACCOUNTADMIN
B. ORGADMIN
C. SECURITYADMIN
D. SYSADMIN
Suggested answer: B

Explanation:

The ORGADMIN (organization administrator) role is required to create and manage accounts in Snowflake. ORGADMIN operates at the organization level: it can create accounts, view account properties, and enable features that span accounts. ACCOUNTADMIN, by contrast, is the top-level role within a single account and cannot create new accounts.

https://docs.snowflake.com/en/user-guide/security-access-control-considerations.html
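Per the Snowflake documentation, account creation is performed with the organization-level ORGADMIN role; a minimal sketch (account name, admin credentials, and edition are all hypothetical placeholders):

```sql
USE ROLE ORGADMIN;

CREATE ACCOUNT my_new_account
  ADMIN_NAME = admin_user
  ADMIN_PASSWORD = '********'
  EMAIL = 'admin@example.com'
  EDITION = ENTERPRISE;
```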

asked 23/09/2024
Lucas de Paula Mello

Question 89


When unloading to a stage, which of the following is a recommended practice or approach?

A. Set SINGLE = TRUE for larger files
B. Use OBJECT_CONSTRUCT(*) when using Parquet
C. Avoid the use of the CAST function
D. Define an individual file format
Suggested answer: D

Explanation:

When unloading data to a stage, it is recommended to define an individual file format. This ensures that the data is unloaded in a consistent, expected format, which is important for downstream processing and analysis.
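A sketch of unloading with an explicitly defined, named file format rather than relying on defaults (all object names are hypothetical):

```sql
CREATE FILE FORMAT my_csv_unload_format
  TYPE = 'CSV'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  COMPRESSION = 'GZIP';

COPY INTO @my_stage/export/
FROM my_table
FILE_FORMAT = (FORMAT_NAME = 'my_csv_unload_format')
OVERWRITE = TRUE;
```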

asked 23/09/2024
william hwang

Question 90


When is the result set cache no longer available? (Select TWO)

A. When another warehouse is used to execute the query
B. When another user executes the query
C. When the underlying data has changed
D. When the warehouse used to execute the query is suspended
E. When it has been 24 hours since the last query
Suggested answer: C, E

Explanation:

The result set cache in Snowflake is invalidated when the underlying data of the query has changed, ensuring that queries return the most current data. In addition, a cached result expires once 24 hours have passed since it was last reused. The cache lives in the cloud services layer, not in a warehouse, so using a different warehouse, a different user running the query, or suspending the warehouse does not invalidate it.
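The effect of the result cache can be observed by toggling the USE_CACHED_RESULT session parameter (TRUE is the default):

```sql
-- Bypass the result cache for this session, forcing re-execution:
ALTER SESSION SET USE_CACHED_RESULT = FALSE;

-- Restore the default: an identical query can then return the cached
-- result without using a warehouse, provided the underlying data has
-- not changed and the cached result has not expired.
ALTER SESSION SET USE_CACHED_RESULT = TRUE;
```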

asked 23/09/2024
Kazi Basit
Total 716 questions