Snowflake COF-C02 Practice Test - Questions Answers, Page 11

If a size Small virtual warehouse is made up of two servers, how many servers make up a Large warehouse?

A. 4
B. 8
C. 16
D. 32
Suggested answer: B

Explanation:

In Snowflake, each increase in warehouse size doubles the number of servers. Therefore, if a size Small virtual warehouse is made up of two servers, a Large warehouse, which is two sizes larger, is made up of eight servers (2 servers for Small, 4 for Medium, and 8 for Large).

Size specifies the amount of compute resources available per cluster in a warehouse. Snowflake supports the following warehouse sizes:
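The approximate scaling pattern, assuming two servers for a Small warehouse as stated above, is:

X-Small: 1 server
Small: 2 servers
Medium: 4 servers
Large: 8 servers
X-Large: 16 servers
2X-Large: 32 servers
3X-Large: 64 servers
4X-Large: 128 servers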

https://docs.snowflake.com/en/user-guide/warehouses-overview.html

Which command sets the Virtual Warehouse for a session?

A. COPY WAREHOUSE FROM <<config file>>;
B. SET WAREHOUSE = <<warehouse name>>;
C. USE WAREHOUSE <<warehouse name>>;
D. USE VIRTUAL_WAREHOUSE <<warehouse name>>;
Suggested answer: C

Explanation:

The command USE WAREHOUSE <<warehouse name>>; sets the virtual warehouse for the current session in Snowflake. This command specifies which virtual warehouse executes the queries submitted in that session.
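For illustration, a minimal session setup; the warehouse name analytics_wh is a hypothetical example:

    -- Set the active warehouse for the current session
    USE WAREHOUSE analytics_wh;

    -- Confirm which warehouse the session is now using
    SELECT CURRENT_WAREHOUSE();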

What occurs when a pipe is recreated using the CREATE OR REPLACE PIPE command?

A. The Pipe load history is reset to empty.
B. The REFRESH command is executed.
C. The stage will be purged.
D. The destination table is truncated.
Suggested answer: A

Explanation:

When a pipe is recreated using the CREATE OR REPLACE PIPE command, the load history of the pipe is reset to empty. This means that Snowpipe considers all files in the stage as new and will attempt to load them, even if they were loaded previously by the old pipe.
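A minimal sketch of recreating a pipe; the pipe, table, and stage names are hypothetical:

    -- Recreating the pipe resets its load history, so files already
    -- loaded from @my_stage by the old pipe may be loaded again.
    CREATE OR REPLACE PIPE my_pipe
    AS
      COPY INTO my_table
      FROM @my_stage
      FILE_FORMAT = (TYPE = 'CSV');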

True or False: Snowpipe via REST API can only reference External Stages as source.

A. True
B. False
Suggested answer: B

Explanation:

Snowpipe via the REST API can reference both named internal stages within Snowflake and external stages, such as Amazon S3, Google Cloud Storage, or Microsoft Azure. This means that Snowpipe is not limited to external stages as a source for data loading.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
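As an illustration, a pipe can be defined over a named internal stage and then triggered through the Snowpipe REST API's insertFiles endpoint; the object names below are hypothetical:

    -- A pipe that reads from a named internal stage. Loads are
    -- triggered by calling the Snowpipe REST API for this pipe.
    CREATE PIPE internal_pipe
    AS
      COPY INTO my_table
      FROM @my_internal_stage
      FILE_FORMAT = (TYPE = 'JSON');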

Which of the following are best practices for loading data into Snowflake? (Choose three.)

A. Aim to produce data files that are between 100 MB and 250 MB in size, compressed.
B. Load data from files in a cloud storage service in a different region or cloud platform from the service or region containing the Snowflake account, to save on cost.
C. Enclose fields that contain delimiter characters in single or double quotes.
D. Split large files into a greater number of smaller files to distribute the load among the compute resources in an active warehouse.
E. When planning which warehouse(s) to use for data loading, start with the largest warehouse possible.
F. Partition the staged data into large folders with random paths, allowing Snowflake to determine the best way to load each file.
Suggested answer: A, C, D

Explanation:

Best practices for loading data into Snowflake include aiming for data file sizes between 100 MB and 250 MB when compressed, as this size is optimal for parallel processing and minimizes overhead. Enclosing fields with delimiter characters in quotes ensures proper field recognition during the load process. Splitting large files into smaller ones allows for better distribution of the load across compute resources, enhancing performance and efficiency.
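A sketch of a load that follows these practices; the table, stage, and format settings are hypothetical:

    -- Load many moderately sized, compressed files in parallel.
    -- FIELD_OPTIONALLY_ENCLOSED_BY tells Snowflake that fields
    -- containing the delimiter are wrapped in double quotes.
    COPY INTO sales
    FROM @sales_stage
    FILE_FORMAT = (
      TYPE = 'CSV'
      FIELD_DELIMITER = ','
      FIELD_OPTIONALLY_ENCLOSED_BY = '"'
      COMPRESSION = 'GZIP'
    );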

What do the terms scale up and scale out refer to in Snowflake? (Choose two.)

A. Scaling out adds clusters of the same size to a virtual warehouse to handle more concurrent queries.
B. Scaling out adds clusters of varying sizes to a virtual warehouse.
C. Scaling out adds additional database servers to an existing running cluster to handle more concurrent queries.
D. Snowflake recommends using both scaling up and scaling out to handle more concurrent queries.
E. Scaling up resizes a virtual warehouse so it can handle more complex workloads.
F. Scaling up adds additional database servers to an existing running cluster to handle larger workloads.
Suggested answer: A, E

Explanation:

Scaling out in Snowflake involves adding clusters of the same size to a virtual warehouse, which allows for handling more concurrent queries without affecting the performance of individual queries. Scaling up refers to resizing a virtual warehouse to increase its compute resources, enabling it to handle more complex workloads and larger queries more efficiently.
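As a sketch, the two approaches map to different warehouse properties; the warehouse name is hypothetical, and multi-cluster warehouses require Enterprise edition or higher:

    -- Scale up: resize the warehouse to handle more complex workloads
    ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE';

    -- Scale out: let a multi-cluster warehouse add same-size clusters
    -- as query concurrency rises
    ALTER WAREHOUSE etl_wh SET
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 4;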

What is the minimum Snowflake edition that has column-level security enabled?

A. Standard
B. Enterprise
C. Business Critical
D. Virtual Private Snowflake
Suggested answer: B

Explanation:

Column-level security, which allows masking policies to be applied to columns in tables or views, is available starting with the Enterprise edition of Snowflake.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
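For illustration, a minimal masking policy; the policy, table, column, and ANALYST role are hypothetical:

    -- Mask email values for every role except ANALYST
    CREATE MASKING POLICY email_mask AS (val STRING)
      RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('ANALYST') THEN val
        ELSE '*** MASKED ***'
      END;

    -- Attach the policy to a column
    ALTER TABLE customers MODIFY COLUMN email
      SET MASKING POLICY email_mask;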

When cloning a database, what is cloned with the database? (Choose two.)

A. Privileges on the database
B. Existing child objects within the database
C. Future child objects within the database
D. Privileges on the schemas within the database
E. Only schemas and tables within the database
Suggested answer: A, B

Explanation:

When cloning a database in Snowflake, the clone includes all privileges on the database as well as existing child objects within the database, such as schemas, tables, and views. However, it does not include future child objects or privileges on the schemas within the database.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
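A minimal sketch; the database names are hypothetical:

    -- Zero-copy clone: existing schemas, tables, and views in prod_db
    -- are included, but objects created in prod_db afterwards are not.
    CREATE DATABASE dev_db CLONE prod_db;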

Which of the following describes the Snowflake Cloud Services layer?

A. Coordinates activities in the Snowflake account
B. Executes queries submitted by the Snowflake account users
C. Manages quotas on the Snowflake account storage
D. Manages the virtual warehouse cache to speed up queries
Suggested answer: A

Explanation:

The Snowflake Cloud Services layer is a collection of services that coordinate activities across Snowflake, tying together all the different components to process user requests, from login to query dispatch.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation

What is the maximum total Continuous Data Protection (CDP) charges incurred for a temporary table?

A. 30 days
B. 7 days
C. 48 hours
D. 24 hours
Suggested answer: D

Explanation:

A temporary table exists only for the duration of the session in which it was created, supports at most one day of Time Travel, and has no Fail-safe period. As a result, the maximum total Continuous Data Protection (CDP) charges incurred for a temporary table cover 24 hours.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
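For illustration, a session-scoped table; the table name and columns are hypothetical:

    -- A temporary table is dropped when the session ends, allows at
    -- most 1 day of Time Travel, and has no Fail-safe period, which
    -- caps its CDP charges at 24 hours.
    CREATE TEMPORARY TABLE staging_rows (
      id   NUMBER,
      note STRING
    ) DATA_RETENTION_TIME_IN_DAYS = 1;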
