Snowflake COF-C02 Practice Test - Questions & Answers, Page 12

Question 111

What type of query benefits the MOST from search optimization?

A. A query that uses only disjunction (i.e., OR) predicates
B. A query that includes analytical expressions
C. A query that uses equality predicates or predicates that use IN
D. A query that filters on semi-structured data types
Suggested answer: C

Explanation:

Search optimization in Snowflake is designed to improve the performance of queries that are selective and involve point lookup operations using equality and IN predicates. It is particularly beneficial for queries that access columns with a high number of distinct values.

References: [COF-C02] SnowPro Core Certification Exam Study Guide; Snowflake Documentation
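
For illustration, a minimal sketch of enabling search optimization and the kind of selective point lookup it accelerates (the table and column names here are hypothetical):

    -- Enable search optimization on a table (it can also be scoped to
    -- specific columns, e.g. ADD SEARCH OPTIMIZATION ON EQUALITY(customer_id)).
    ALTER TABLE orders ADD SEARCH OPTIMIZATION;

    -- A selective lookup using an equality predicate:
    SELECT * FROM orders WHERE customer_id = 'C-10293';

    -- Predicates that use IN benefit as well:
    SELECT * FROM orders WHERE order_id IN (1001, 1002, 1003);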

Question 112

Which of the following are characteristics of Snowflake virtual warehouses? (Choose two.)

A. Auto-resume applies only to the last warehouse that was started in a multi-cluster warehouse.
B. The ability to auto-suspend a warehouse is only available in the Enterprise edition or above.
C. SnowSQL supports both a configuration file and a command line option for specifying a default warehouse.
D. A user cannot specify a default warehouse when using the ODBC driver.
E. The default virtual warehouse size can be changed at any time.
Suggested answer: C, E

Explanation:

In SnowSQL, a default warehouse can be specified either in the configuration file or with a command-line option (option C). Additionally, the size of a virtual warehouse can be changed at any time (option E). These features provide flexibility and ease of use in managing compute resources.

References: [COF-C02] SnowPro Core Certification Exam Study Guide; Snowflake Documentation
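
As a sketch of these two behaviors (the warehouse name my_wh is hypothetical): for option C, SnowSQL accepts a default warehouse through the warehousename setting in its configuration file or the -w/--warehouse command-line flag; for option E, a warehouse can be resized at any time with a single statement:

    -- Resize a warehouse at any time, even while it is running:
    ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';

    -- Auto-suspend is a plain warehouse property available in every
    -- edition, which is why option B is incorrect (value in seconds):
    ALTER WAREHOUSE my_wh SET AUTO_SUSPEND = 300;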

Question 113

Which command should be used to load data from a file, located in an external stage, into a table in Snowflake?

A. INSERT
B. PUT
C. GET
D. COPY
Suggested answer: D

Explanation:

The COPY command is used in Snowflake to load data from files located in an external stage into a table. This command allows for efficient, parallelized data loading from various file formats.

References: [COF-C02] SnowPro Core Certification Exam Study Guide; Snowflake Documentation
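
A minimal sketch of such a load, assuming a CSV file already sits in an external stage (the stage and table names are hypothetical):

    -- COPY INTO <table> loads staged files into a table. By contrast, PUT
    -- uploads local files to an internal stage and GET downloads staged files.
    COPY INTO my_table
      FROM @my_ext_stage/data/
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);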

Question 114

The Snowflake Cloud Data Platform is described as having which of the following architectures?

A. Shared-disk
B. Shared-nothing
C. Multi-cluster shared data
D. Serverless query engine
Suggested answer: C

Explanation:

Snowflake's architecture is described as a multi-cluster, shared data architecture. This design combines the simplicity of a shared-disk architecture with the performance and scale-out benefits of a shared-nothing architecture, using a central data repository accessible from all compute nodes.

References: [COF-C02] SnowPro Core Certification Exam Study Guide; Snowflake Documentation

Question 115

Which of the following is a data tokenization integration partner?

A. Protegrity
B. Tableau
C. DBeaver
D. SAP
Suggested answer: A

Explanation:

Protegrity is listed as a data tokenization integration partner for Snowflake. This partnership allows Snowflake users to utilize Protegrity's tokenization solutions within the Snowflake environment.

References: [COF-C02] SnowPro Core Certification Exam Study Guide; Snowflake Documentation

Question 116

What editions of Snowflake should be used to manage compliance with Personally Identifiable Information (PII) requirements? (Choose two.)

A. Custom Edition
B. Virtual Private Snowflake
C. Business Critical Edition
D. Standard Edition
E. Enterprise Edition
Suggested answer: B, C

Explanation:

To manage compliance with Personally Identifiable Information (PII) requirements, the Virtual Private Snowflake and Business Critical editions of Snowflake should be used. These editions provide the advanced security features necessary for handling sensitive data.

Question 117

What are supported file formats for unloading data from Snowflake? (Choose three.)

A. XML
B. JSON
C. Parquet
D. ORC
E. AVRO
F. CSV
Suggested answer: B, C, F

Explanation:

The supported file formats for unloading data from Snowflake include JSON, Parquet, and CSV. These formats are commonly used for their flexibility and compatibility with various data processing tools.
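
As a sketch (stage and table names hypothetical), unloading uses COPY INTO <location> with the desired format; for a JSON unload, OBJECT_CONSTRUCT(*) is one way to produce the single column of objects the format expects:

    -- Unload a table to a stage as Parquet:
    COPY INTO @my_stage/unload/
      FROM my_table
      FILE_FORMAT = (TYPE = 'PARQUET');

    -- Unload as JSON, one object per row:
    COPY INTO @my_stage/unload_json/
      FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_table)
      FILE_FORMAT = (TYPE = 'JSON');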

Question 118

The Snowflake cloud services layer is responsible for which tasks? (Choose two.)

A. Local disk caching
B. Authentication and access control
C. Metadata management
D. Query processing
E. Database storage
Suggested answer: B, C

Explanation:

The Snowflake cloud services layer is responsible for tasks such as authentication and access control, ensuring secure access to the platform, and metadata management, which involves organizing and maintaining information about the data stored in Snowflake.

Question 119

When publishing a Snowflake Data Marketplace listing into a remote region, what should be taken into consideration? (Choose two.)

A. There is no need to have a Snowflake account in the target region, a share will be created for each user.
B. The listing is replicated into all selected regions automatically, the data is not.
C. The user must have the ORGADMIN role available in at least one account to link accounts for replication.
D. Shares attached to listings in remote regions can be viewed from any account in an organization.
E. For a standard listing the user can wait until the first customer requests the data before replicating it to the target region.
Suggested answer: B, C

Explanation:

When publishing a Snowflake Data Marketplace listing into a remote region, it's important to note that while the listing is replicated into all selected regions automatically, the data itself is not. Therefore, the data must be replicated separately. Additionally, the user must have the ORGADMIN role in at least one account to link accounts for replication.
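
A hedged sketch of the ORGADMIN step (the organization and account names are hypothetical); SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER is one documented way to enable replication for an account in the organization:

    USE ROLE ORGADMIN;

    -- Enable database replication for the account so the data behind
    -- the listing can be replicated to the remote region:
    SELECT SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER(
      'myorg.my_account',
      'ENABLE_ACCOUNT_DATABASE_REPLICATION',
      'true');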

Question 120

When loading data into Snowflake via Snowpipe, what is the compressed file size recommendation?

A. 10-50 MB
B. 100-250 MB
C. 300-500 MB
D. 1000-1500 MB
Suggested answer: B

Explanation:

For loading data into Snowflake via Snowpipe, the recommended compressed file size is between 100 MB and 250 MB. This size range is optimal for balancing the performance of parallel processing against the overhead of handling many small files.
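
For context, a minimal auto-ingest pipe definition (the pipe, table, and stage names are hypothetical); the 100-250 MB recommendation applies to the compressed size of the files staged for such a pipe:

    CREATE PIPE my_pipe
      AUTO_INGEST = TRUE
      AS
      COPY INTO my_table
      FROM @my_ext_stage
      FILE_FORMAT = (TYPE = 'CSV');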
