Snowflake SnowPro Core Practice Test - Questions and Answers, Page 13

Question 121

What is the maximum total Continuous Data Protection (CDP) charges incurred for a temporary table?

A. 30 days
B. 7 days
C. 48 hours
D. 24 hours
Suggested answer: D

Explanation:

For a temporary table, CDP charges are incurred only while the table exists. A temporary table persists only for the duration of the session in which it was created, supports a maximum Time Travel retention of 1 day, and has no Fail-safe period, so the maximum total CDP charges incurred are for 24 hours.

Reference = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
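
As a brief, hedged illustration (the table and column names below are assumptions, not part of the exam item), a temporary table lives only for the session and supports at most 1 day of Time Travel with no Fail-safe:

    -- Sketch: a temporary table is dropped at session end and its data
    -- retention period can only be 0 or 1 day, so CDP charges are capped
    -- at 24 hours.
    CREATE TEMPORARY TABLE session_scratch (
        id INTEGER,
        payload VARIANT
    )
    DATA_RETENTION_TIME_IN_DAYS = 1;

    SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE session_scratch;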

Question 122

What type of query benefits the MOST from search optimization?

A. A query that uses only disjunction (i.e., OR) predicates
B. A query that includes analytical expressions
C. A query that uses equality predicates or predicates that use IN
D. A query that filters on semi-structured data types
Suggested answer: C

Explanation:

Search optimization in Snowflake is designed to improve the performance of selective queries that perform point lookups using equality and IN predicates. It is particularly beneficial for queries that filter on columns with a high number of distinct values.

Reference = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
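
As a hedged sketch (the table and column names are assumptions), search optimization is enabled on the table, after which selective point lookups with equality or IN predicates can use the search access path:

    -- Enable the search optimization service on an (assumed) large table.
    ALTER TABLE orders ADD SEARCH OPTIMIZATION;

    -- Queries of this shape benefit the most:
    SELECT * FROM orders WHERE order_id = 1234567;               -- equality predicate
    SELECT * FROM orders WHERE customer_id IN ('C100', 'C200');  -- IN predicate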

Question 123

Which of the following are characteristics of Snowflake virtual warehouses? (Choose two.)

A. Auto-resume applies only to the last warehouse that was started in a multi-cluster warehouse.
B. The ability to auto-suspend a warehouse is only available in the Enterprise edition or above.
C. SnowSQL supports both a configuration file and a command line option for specifying a default warehouse.
D. A user cannot specify a default warehouse when using the ODBC driver.
E. The default virtual warehouse size can be changed at any time.
Suggested answer: C, E

Explanation:

SnowSQL supports both a configuration file and a command-line option for specifying a default warehouse (C), and the size of a virtual warehouse can be changed at any time (E). These features provide flexibility and ease of use in managing compute resources.

Reference = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
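
For example (the connection values and warehouse name below are placeholders, not prescribed settings), SnowSQL accepts a default warehouse either in its configuration file or via the -w command-line option:

    # ~/.snowsql/config (sketch; all values are assumptions)
    [connections]
    accountname   = myorg-myaccount
    username      = analyst
    warehousename = ANALYTICS_WH

    # Equivalent override on the command line:
    #   snowsql -a myorg-myaccount -u analyst -w ANALYTICS_WH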

Question 124

Which command should be used to load data from a file, located in an external stage, into a table in Snowflake?

A. INSERT
B. PUT
C. GET
D. COPY
Suggested answer: D

Explanation:

The COPY INTO <table> command is used in Snowflake to load data from files located in an external stage into a table. It performs efficient, parallelized loading and supports a variety of file formats.

Reference = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
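
A minimal sketch (the stage, table, and file format settings are assumptions) of loading from an external stage with COPY INTO <table>:

    -- Load CSV files from a named external stage into a target table.
    COPY INTO raw_events
    FROM @my_ext_stage/events/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT';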

Question 125

The Snowflake Cloud Data Platform is described as having which of the following architectures?

A. Shared-disk
B. Shared-nothing
C. Multi-cluster shared data
D. Serverless query engine
Suggested answer: C

Explanation:

Snowflake's architecture is described as a multi-cluster, shared data architecture. This design combines the simplicity of a shared-disk architecture with the performance and scale-out benefits of a shared-nothing architecture, using a central data repository accessible from all compute nodes.

Reference = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation

Question 126

Which of the following is a data tokenization integration partner?

A. Protegrity
B. Tableau
C. DBeaver
D. SAP
Suggested answer: A

Explanation:

Protegrity is listed as a data tokenization integration partner for Snowflake. This partnership allows Snowflake users to use Protegrity's tokenization solutions within the Snowflake environment.

Reference = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation
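
As a hedged sketch of how a tokenization partner is typically integrated (the external function and object names below are assumptions, not Protegrity's actual API), detokenization can be exposed through an external function referenced by a masking policy:

    -- Sketch: detokenize values only for authorized roles; everyone else
    -- sees the stored token. detokenize_ssn_ext is an assumed external
    -- function backed by the tokenization partner.
    CREATE MASKING POLICY ssn_detokenize AS (val STRING) RETURNS STRING ->
        CASE
            WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN detokenize_ssn_ext(val)
            ELSE val
        END;

    ALTER TABLE customers MODIFY COLUMN ssn SET MASKING POLICY ssn_detokenize;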

Question 127

What versions of Snowflake should be used to manage compliance with Personal Identifiable Information (PII) requirements? (Choose two.)

A. Custom Edition
B. Virtual Private Snowflake
C. Business Critical Edition
D. Standard Edition
E. Enterprise Edition
Suggested answer: B, C

Explanation:

To manage compliance with Personally Identifiable Information (PII) requirements, the Virtual Private Snowflake (VPS) and Business Critical editions should be used. These editions provide the advanced security and compliance features necessary for handling sensitive data.

Question 128

What are supported file formats for unloading data from Snowflake? (Choose three.)

A. XML
B. JSON
C. Parquet
D. ORC
E. AVRO
F. CSV
Suggested answer: B, C, F

Explanation:

The supported file formats for unloading data from Snowflake are JSON, Parquet, and CSV; XML, ORC, and Avro can be loaded into Snowflake but not unloaded. The three unload formats are commonly used for their flexibility and compatibility with various data processing tools.
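
A minimal sketch (the stage and table names are assumptions) of unloading the same table in each of the three supported formats:

    COPY INTO @unload_stage/orders_csv/
    FROM orders
    FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');

    COPY INTO @unload_stage/orders_json/
    FROM (SELECT OBJECT_CONSTRUCT(*) FROM orders)   -- JSON unload needs a single VARIANT column
    FILE_FORMAT = (TYPE = 'JSON');

    COPY INTO @unload_stage/orders_parquet/
    FROM orders
    FILE_FORMAT = (TYPE = 'PARQUET');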

Question 129

The Snowflake cloud services layer is responsible for which tasks? (Choose two.)

A. Local disk caching
B. Authentication and access control
C. Metadata management
D. Query processing
E. Database storage
Suggested answer: B, C

Explanation:

The Snowflake cloud services layer is responsible for tasks such as authentication and access control, which ensure secure access to the platform, and metadata management, which organizes and maintains information about the data stored in Snowflake. Query processing is handled by the virtual warehouse (compute) layer, and database storage by the storage layer.

Question 130

When publishing a Snowflake Data Marketplace listing into a remote region, what should be taken into consideration? (Choose two.)

A. There is no need to have a Snowflake account in the target region, a share will be created for each user.
B. The listing is replicated into all selected regions automatically, the data is not.
C. The user must have the ORGADMIN role available in at least one account to link accounts for replication.
D. Shares attached to listings in remote regions can be viewed from any account in an organization.
E. For a standard listing the user can wait until the first customer requests the data before replicating it to the target region.
Suggested answer: B, C

Explanation:

When publishing a Snowflake Data Marketplace listing into a remote region, it is important to note that while the listing is replicated into all selected regions automatically, the data itself is not, so the data must be replicated separately. Additionally, the user must have the ORGADMIN role available in at least one account to link accounts for replication.
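
As a rough, hedged sketch of the replication side only (the organization, account, and database names are assumptions, and listing management itself happens through the Marketplace interface), linking accounts and replicating the underlying database looks approximately like this:

    -- As ORGADMIN: enable database replication for the remote-region account.
    USE ROLE ORGADMIN;
    SELECT SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER(
        'MYORG.TARGET_ACCOUNT',
        'ENABLE_ACCOUNT_DATABASE_REPLICATION',
        'true');

    -- In the source account: allow the listing's database to replicate.
    ALTER DATABASE listing_data ENABLE REPLICATION TO ACCOUNTS MYORG.TARGET_ACCOUNT;

    -- In the target account: create and refresh the secondary database.
    CREATE DATABASE listing_data AS REPLICA OF MYORG.SOURCE_ACCOUNT.listing_data;
    ALTER DATABASE listing_data REFRESH;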
