
Snowflake COF-C02 Practice Test - Questions Answers, Page 21


Which query profile statistics help determine if efficient pruning is occurring? (Choose two.)

A. Bytes sent over network
B. Percentage scanned from cache
C. Partitions total
D. Bytes spilled to local storage
E. Partitions scanned
Suggested answer: C, E

Explanation:

Efficient pruning in Snowflake is indicated by the number of partitions scanned out of the total available. If only a small fraction of the total partitions is scanned, pruning is effectively narrowing down the data that must be read, which improves query performance.
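
For illustration, the same two statistics are also exposed outside the query profile, in the SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view. A rough sketch for checking the scanned-to-total ratio of recent queries (assuming the current role can read the SNOWFLAKE shared database) might look like this:

-- Low scan_ratio values suggest effective partition pruning.
SELECT query_id,
       partitions_scanned,
       partitions_total,
       partitions_scanned / NULLIF(partitions_total, 0) AS scan_ratio
FROM snowflake.account_usage.query_history
WHERE partitions_total > 0
ORDER BY start_time DESC
LIMIT 20;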

Which TABLE function helps to convert semi-structured data to a relational representation?

A. CHECK_JSON
B. TO_JSON
C. FLATTEN
D. PARSE_JSON
Suggested answer: C

Explanation:

The FLATTEN table function in Snowflake is used to convert semi-structured data, such as JSON or XML, into a relational format. It expands nested arrays or objects into multiple rows, making the data suitable for relational querying.
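
A minimal sketch of how FLATTEN is typically used, assuming a hypothetical table RAW_EVENTS with a VARIANT column PAYLOAD whose "items" field is a JSON array:

-- Each element of the "items" array becomes its own row.
SELECT e.payload:id::STRING  AS event_id,
       f.value:sku::STRING   AS sku,
       f.value:qty::NUMBER   AS qty
FROM raw_events e,
     LATERAL FLATTEN(input => e.payload:items) f;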

Which URL type allows users to access unstructured data without authenticating into Snowflake or passing an authorization token?

A. Pre-signed URL
B. Scoped URL
C. Signed URL
D. File URL
Suggested answer: A

Explanation:

Pre-signed URLs in Snowflake allow users to access unstructured data without authenticating into Snowflake or passing an authorization token. These URLs are open and can be directly accessed or downloaded by any user or application, making them ideal for business intelligence or reporting tools that need to display unstructured file contents.
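
A short sketch of generating such a URL with the GET_PRESIGNED_URL function, using a hypothetical stage and file path; the returned URL can then be opened without logging in to Snowflake:

-- Expiration is given in seconds (here, one hour).
SELECT GET_PRESIGNED_URL(@my_unstructured_stage, 'reports/q1_summary.pdf', 3600);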

What is the recommended compressed file size range for continuous data loads using Snowpipe?

A. 8-16 MB
B. 16-24 MB
C. 10-99 MB
D. 100-250 MB
Suggested answer: D

Explanation:

For continuous data loads using Snowpipe, the recommended compressed file size range is 100-250 MB. This range optimizes the number of parallel operations for a load and avoids size limitations, ensuring efficient and cost-effective data loading.
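
As a rough way to sanity-check staged file sizes against this guideline (hypothetical stage name; the quoted column names come from the LIST output):

LIST @landing_stage;
-- Flag files smaller than ~100 MB compressed.
SELECT "name", "size"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "size" < 100 * 1024 * 1024;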

Which of the following statements describes a schema in Snowflake?

A. A logical grouping of objects that belongs to a single database
B. A logical grouping of objects that belongs to multiple databases
C. A named Snowflake object that includes all the information required to share a database
D. A uniquely identified Snowflake account within a business entity
Suggested answer: A

Explanation:

A schema in Snowflake is a logical grouping of database objects, such as tables and views, that belongs to a single database. Each schema is part of a namespace, which is inferred from the current database and schema in use for the session.
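
A minimal sketch of the database.schema namespace, using hypothetical names:

CREATE DATABASE IF NOT EXISTS sales_db;
CREATE SCHEMA IF NOT EXISTS sales_db.raw;        -- a schema belongs to exactly one database
CREATE TABLE sales_db.raw.orders (id NUMBER);    -- fully qualified: database.schema.object
USE SCHEMA sales_db.raw;                         -- sets the namespace for the session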

What is a responsibility of Snowflake's virtual warehouses?

A. Infrastructure management
B. Metadata management
C. Query execution
D. Query parsing and optimization
E. Permanent storage of micro-partitions
Suggested answer: C

Explanation:

Snowflake's virtual warehouses are responsible for query execution. They are clusters of compute resources that execute SQL statements, perform DML operations, and load data into tables.
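
A brief sketch, with a hypothetical warehouse name and sizing, showing that a warehouse simply provides the compute context in which statements run:

CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60        -- suspend after 60 seconds of inactivity
  AUTO_RESUME = TRUE;

USE WAREHOUSE analytics_wh;   -- subsequent statements execute on this warehouse
SELECT CURRENT_WAREHOUSE();   -- confirms the compute context for the session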

Which of the following are handled by the cloud services layer of the Snowflake architecture? (Choose two.)

A. Query execution
B. Data loading
C. Time Travel data
D. Security
E. Authentication and access control
Suggested answer: D, E

Explanation:

The cloud services layer of the Snowflake architecture handles security functions, authentication of user sessions, and access control, ensuring that only authorized users can access data and services.
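
The access control that this layer enforces is expressed through roles and grants. A small sketch, with hypothetical role, object, and user names:

CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE sales_db TO ROLE analyst_role;
GRANT USAGE ON SCHEMA sales_db.raw TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.raw TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER some_analyst;   -- hypothetical user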

Credit charges for Snowflake virtual warehouses are calculated based on which of the following considerations? (Choose two.)

A. The number of queries executed
B. The number of active users assigned to the warehouse
C. The size of the virtual warehouse
D. The length of time the warehouse is running
E. The duration of the queries that are executed
Suggested answer: C, D

Explanation:

Credit charges for Snowflake virtual warehouses are calculated based on the size of the virtual warehouse and the length of time the warehouse is running. The size determines the compute resources available, and charges are incurred for the time those resources are running.
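
Credit consumption per warehouse can be reviewed in the SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view; a sketch, assuming access to the SNOWFLAKE shared database:

-- Daily credits consumed per warehouse (larger sizes and longer run time mean more credits).
SELECT warehouse_name,
       DATE_TRUNC('day', start_time) AS usage_day,
       SUM(credits_used)             AS credits
FROM snowflake.account_usage.warehouse_metering_history
GROUP BY 1, 2
ORDER BY usage_day DESC, credits DESC;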

What file formats does Snowflake support for loading semi-structured data? (Choose three.)

A. TSV
B. JSON
C. PDF
D. Avro
E. Parquet
F. JPEG
Suggested answer: B, D, E

Explanation:

Snowflake supports several semi-structured data formats for loading data, including JSON, Avro, and Parquet. These formats allow for efficient storage and querying of data that does not conform to a traditional relational database schema.
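
A sketch of defining named file formats for these types and using one in a load, with hypothetical format, stage, and table names:

CREATE FILE FORMAT IF NOT EXISTS ff_json    TYPE = 'JSON';
CREATE FILE FORMAT IF NOT EXISTS ff_avro    TYPE = 'AVRO';
CREATE FILE FORMAT IF NOT EXISTS ff_parquet TYPE = 'PARQUET';

-- Assumes RAW_EVENTS has a single VARIANT column to receive the JSON documents.
COPY INTO raw_events
FROM @landing_stage/events/
FILE_FORMAT = (FORMAT_NAME = 'ff_json');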

Which Snowflake feature will allow small volumes of data to continuously load into Snowflake and will incrementally make the data available for analysis?

A. COPY INTO
B. CREATE PIPE
C. INSERT INTO
D. TABLE STREAM
Suggested answer: B

Explanation:

The Snowflake feature that allows small volumes of data to be continuously loaded into Snowflake and incrementally made available for analysis is Snowpipe, which is created with the CREATE PIPE command. Snowpipe is designed for near-real-time data loading, loading data as soon as it is available in the stage.
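
A minimal Snowpipe sketch, using hypothetical pipe, table, and stage names; AUTO_INGEST = TRUE lets cloud storage event notifications trigger the load automatically on an external stage:

CREATE PIPE IF NOT EXISTS events_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO raw_events
FROM @landing_stage/events/
FILE_FORMAT = (TYPE = 'JSON');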
