ExamGecko

Snowflake COF-C02 Practice Test - Questions Answers, Page 11


Question 101


If a size Small virtual warehouse is made up of two servers, how many servers make up a Large warehouse?

A. 4
B. 8
C. 16
D. 32
Suggested answer: B

Explanation:

In Snowflake, each size increase in virtual warehouses doubles the number of servers. Therefore, if a size Small virtual warehouse is made up of two servers, a Large warehouse, which is two sizes larger, would be made up of eight servers (2 servers for Small, 4 for Medium, and 8 for Large).

Size specifies the amount of compute resources available per cluster in a warehouse. Snowflake supports the following warehouse sizes:

[Image: table of warehouse sizes from the Snowflake documentation]

https://docs.snowflake.com/en/user-guide/warehouses-overview.html
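To illustrate, the size is chosen when a warehouse is created or resized; the warehouse names below are placeholders:

```sql
-- Create a Small warehouse (2 servers) and a Large one (8 servers).
CREATE WAREHOUSE small_wh WAREHOUSE_SIZE = 'SMALL';
CREATE WAREHOUSE large_wh WAREHOUSE_SIZE = 'LARGE';

-- Resizing moves a warehouse one step along the doubling progression:
ALTER WAREHOUSE small_wh SET WAREHOUSE_SIZE = 'MEDIUM';  -- 2 -> 4 servers
```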

asked 23/09/2024
Husein M

Question 102


Which command sets the Virtual Warehouse for a session?

A. COPY WAREHOUSE FROM <<config file>>;
B. SET WAREHOUSE = <<warehouse name>>;
C. USE WAREHOUSE <<warehouse name>>;
D. USE VIRTUAL_WAREHOUSE <<warehouse name>>;
Suggested answer: C

Explanation:

The command USE WAREHOUSE <<warehouse name>>; is used to set the virtual warehouse for the current session in Snowflake. This command specifies which virtual warehouse to use for executing queries in that session.
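For example, to point the current session at a hypothetical warehouse named ANALYTICS_WH:

```sql
-- Set the working warehouse for this session (warehouse name is a placeholder).
USE WAREHOUSE ANALYTICS_WH;

-- Subsequent queries in the session now run on ANALYTICS_WH.
SELECT CURRENT_WAREHOUSE();
```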


Question 103


What occurs when a pipe is recreated using the CREATE OR REPLACE PIPE command?

A. The Pipe load history is reset to empty.
B. The REFRESH command is executed.
C. The stage will be purged.
D. The destination table is truncated.
Suggested answer: A

Explanation:

When a pipe is recreated using the CREATE OR REPLACE PIPE command, the load history of the pipe is reset. This means that Snowpipe will consider all files in the stage as new and will attempt to load them, even if they were loaded previously by the old pipe.
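A minimal sketch of the command in question (pipe, stage, and table names are hypothetical):

```sql
-- Recreating a pipe with CREATE OR REPLACE resets its load history.
CREATE OR REPLACE PIPE my_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = 'CSV');
-- After this statement, files already loaded by the old pipe are
-- treated as new and may be loaded again (risking duplicates).
```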


Question 104


True or False: Snowpipe via REST API can only reference External Stages as source.

A. True
B. False
Suggested answer: B

Explanation:

Snowpipe via REST API can reference both named internal stages within Snowflake and external stages, such as Amazon S3, Google Cloud Storage, or Microsoft Azure. This means that Snowpipe is not limited to external stages as a source for data loading.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
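For instance, a pipe driven by REST API calls can read from a named internal stage (the stage, pipe, and table names below are hypothetical):

```sql
-- A named internal stage can serve as the Snowpipe source.
CREATE STAGE my_internal_stage;

CREATE PIPE my_rest_pipe AS
  COPY INTO my_table
  FROM @my_internal_stage
  FILE_FORMAT = (TYPE = 'CSV');
-- Loads are then triggered by calling the Snowpipe insertFiles
-- REST endpoint with the names of staged files.
```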


Question 105


Which of the following are best practices for loading data into Snowflake? (Choose three.)

A. Aim to produce data files that are between 100 MB and 250 MB in size, compressed.
B. Load data from files in a cloud storage service in a different region or cloud platform from the service or region containing the Snowflake account, to save on cost.
C. Enclose fields that contain delimiter characters in single or double quotes.
D. Split large files into a greater number of smaller files to distribute the load among the compute resources in an active warehouse.
E. When planning which warehouse(s) to use for data loading, start with the largest warehouse possible.
F. Partition the staged data into large folders with random paths, allowing Snowflake to determine the best way to load each file.
Suggested answer: A, C, D

Explanation:

Best practices for loading data into Snowflake include aiming for data file sizes between 100 MB and 250 MB when compressed, as this size is optimal for parallel processing and minimizes overhead. Enclosing fields with delimiter characters in quotes ensures proper field recognition during the load process. Splitting large files into smaller ones allows for better distribution of the load across compute resources, enhancing performance and efficiency.
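The quoting practice pairs with a matching file format on the load side; a sketch with placeholder table and stage names:

```sql
-- FIELD_OPTIONALLY_ENCLOSED_BY lets COPY parse fields that contain
-- the delimiter, provided they are wrapped in double quotes.
COPY INTO my_table
FROM @my_stage/batch/          -- e.g. many ~100-250 MB compressed files
FILE_FORMAT = (
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
);
```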


Question 106


What do the terms scale up and scale out refer to in Snowflake? (Choose two.)

A. Scaling out adds clusters of the same size to a virtual warehouse to handle more concurrent queries.
B. Scaling out adds clusters of varying sizes to a virtual warehouse.
C. Scaling out adds additional database servers to an existing running cluster to handle more concurrent queries.
D. Snowflake recommends using both scaling up and scaling out to handle more concurrent queries.
E. Scaling up resizes a virtual warehouse so it can handle more complex workloads.
F. Scaling up adds additional database servers to an existing running cluster to handle larger workloads.
Suggested answer: A, E

Explanation:

Scaling out in Snowflake involves adding clusters of the same size to a virtual warehouse, which allows for handling more concurrent queries without affecting the performance of individual queries. Scaling up refers to resizing a virtual warehouse to increase its compute resources, enabling it to handle more complex workloads and larger queries more efficiently.
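The two operations map onto different ALTER WAREHOUSE parameters (the warehouse name below is hypothetical):

```sql
-- Scale up: resize the warehouse for more complex queries.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Scale out: allow up to 4 same-size clusters for concurrency
-- (requires multi-cluster warehouses, an Enterprise edition feature).
ALTER WAREHOUSE my_wh SET MIN_CLUSTER_COUNT = 1, MAX_CLUSTER_COUNT = 4;
```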


Question 107


What is the minimum Snowflake edition that has column-level security enabled?

A. Standard
B. Enterprise
C. Business Critical
D. Virtual Private Snowflake
Suggested answer: B

Explanation:

Column-level security, which allows for the application of masking policies to columns in tables or views, is available starting from the Enterprise edition of Snowflake.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
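Column-level security is applied through masking policies; a sketch with hypothetical policy, table, column, and role names:

```sql
-- Define a masking policy that hides email addresses from most roles.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to a column (Enterprise edition or higher).
ALTER TABLE users MODIFY COLUMN email SET MASKING POLICY email_mask;
```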


Question 108


When cloning a database, what is cloned with the database? (Choose two.)

A. Privileges on the database
B. Existing child objects within the database
C. Future child objects within the database
D. Privileges on the schemas within the database
E. Only schemas and tables within the database
Suggested answer: A, B

Explanation:

When cloning a database in Snowflake, the clone includes all privileges on the database as well as existing child objects within the database, such as schemas, tables, and views. However, it does not include future child objects or privileges on the schemas within the database.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
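A database clone is created with the CLONE keyword (the database names below are hypothetical):

```sql
-- Zero-copy clone: existing schemas, tables, and views in prod_db
-- come along; objects created in the source after this statement
-- do not appear in the clone.
CREATE DATABASE dev_db CLONE prod_db;
```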


Question 109


Which of the following describes the Snowflake Cloud Services layer?

A. Coordinates activities in the Snowflake account
B. Executes queries submitted by the Snowflake account users
C. Manages quotas on the Snowflake account storage
D. Manages the virtual warehouse cache to speed up queries
Suggested answer: A

Explanation:

The Snowflake Cloud Services layer is a collection of services that coordinate activities across Snowflake, tying together all the different components to process user requests, from login to query dispatch.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation


Question 110


What is the maximum total Continuous Data Protection (CDP) charges incurred for a temporary table?

A. 30 days
B. 7 days
C. 48 hours
D. 24 hours
Suggested answer: D

Explanation:

For a temporary table, Continuous Data Protection (CDP) charges are incurred only while the table exists, up to a maximum of 24 hours (1 day) of Time Travel retention; temporary tables have no Fail-safe period, so no further CDP charges apply.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
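The retention limit can be made explicit when creating a temporary table (the table name below is hypothetical):

```sql
-- Temporary tables exist only for the session and allow at most
-- 1 day of Time Travel; they have no Fail-safe period.
CREATE TEMPORARY TABLE session_scratch (id INT, payload VARIANT)
  DATA_RETENTION_TIME_IN_DAYS = 1;
```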
