
Snowflake COF-C02 Practice Test - Questions Answers, Page 7


Where would a Snowflake user find information about query activity from 90 days ago?

A. account_usage.query_history view
B. account_usage.query_history_archive view
C. information_schema.query_history view
D. information_schema.query_history_by_session view
Suggested answer: B

Explanation:

To find information about query activity from 90 days ago, a Snowflake user should use the account_usage.query_history_archive view. This view is designed to provide access to historical query data beyond the short retention period of the standard information_schema.query_history view. It allows users to analyze and audit past query activities for up to 365 days after the date of execution, which includes the 90-day period mentioned.
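As a sketch, long-retention query history lives in the shared SNOWFLAKE database's ACCOUNT_USAGE schema and is queried like any other view. The example below uses the documented ACCOUNT_USAGE.QUERY_HISTORY view (which retains up to 365 days); the archive view named in the answer would be queried the same way, and the day offsets are illustrative:

```sql
-- Illustrative: query activity from roughly 90 days ago.
-- ACCOUNT_USAGE views retain up to 365 days of history.
SELECT query_id, user_name, start_time, total_elapsed_time
FROM snowflake.account_usage.query_history
WHERE start_time BETWEEN DATEADD(day, -91, CURRENT_TIMESTAMP())
                     AND DATEADD(day, -89, CURRENT_TIMESTAMP())
ORDER BY start_time;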

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Account Usage Schema

Which Snowflake technique can be used to improve the performance of a query?

A. Clustering
B. Indexing
C. Fragmenting
D. Using INDEX_HINTS
Suggested answer: A

Explanation:

Clustering is a technique used in Snowflake to improve the performance of queries. It involves organizing the data in a table into micro-partitions based on the values of one or more columns. This organization allows Snowflake to efficiently prune non-relevant micro-partitions during a query, which reduces the amount of data scanned and improves query performance.
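A minimal sketch of defining and inspecting a clustering key (the table and column names here are hypothetical):

```sql
-- Define a clustering key so micro-partitions align with common filter columns
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Inspect how well the table is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');
```

Snowflake then maintains the clustering in the background, improving partition pruning for queries that filter on the clustered columns.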

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Clustering

User-level network policies can be created by which of the following roles? (Select TWO).

A. ROLEADMIN
B. ACCOUNTADMIN
C. SYSADMIN
D. SECURITYADMIN
E. USERADMIN
Suggested answer: B, D

Explanation:

User-level network policies in Snowflake can be created by roles with the necessary privileges to manage security and account settings. The ACCOUNTADMIN role has the highest level of privileges across the account, including the ability to manage network policies. The SECURITYADMIN role is specifically responsible for managing security objects within Snowflake, which includes the creation and management of network policies.
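A minimal sketch of creating a network policy and applying it at the user level, run as ACCOUNTADMIN or SECURITYADMIN (the policy name, user name, and IP ranges are illustrative):

```sql
-- Create a network policy restricting allowed client IP ranges
CREATE NETWORK POLICY corp_vpn_only
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Apply the policy to a single user
ALTER USER jsmith SET NETWORK_POLICY = corp_vpn_only;
```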

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Network Policies

Section 1.3 - SnowPro Core Certification Study Guide

Which command can be used to load data into an internal stage?

A. LOAD
B. COPY
C. GET
D. PUT
Suggested answer: D

Explanation:

The PUT command is used to load data into an internal stage in Snowflake. This command uploads data files from a local file system to a named internal stage, making the data available for subsequent loading into a Snowflake table using the COPY INTO command.
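A minimal sketch of the two-step flow, run from SnowSQL or another client that supports local file access (the file path, stage, and table names are illustrative):

```sql
-- Upload a local file to a named internal stage
PUT file:///tmp/sales.csv @my_internal_stage AUTO_COMPRESS = TRUE;

-- Then load the staged file into a table
COPY INTO sales
FROM @my_internal_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```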

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Data Loading

What happens when an external or an internal stage is dropped? (Select TWO).

A. When dropping an external stage, the files are not removed and only the stage is dropped
B. When dropping an external stage, both the stage and the files within the stage are removed
C. When dropping an internal stage, the files are deleted with the stage and the files are recoverable
D. When dropping an internal stage, the files are deleted with the stage and the files are not recoverable
E. When dropping an internal stage, only selected files are deleted with the stage and are not recoverable
Suggested answer: A, D

Explanation:

When an external stage is dropped in Snowflake, the reference to the external storage location is removed, but the actual files within the external storage (like Amazon S3, Google Cloud Storage, or Microsoft Azure) are not deleted. This means that the data remains intact in the external storage location, and only the stage object in Snowflake is removed.

On the other hand, when an internal stage is dropped, any files that were uploaded to the stage are deleted along with the stage itself. These files are not recoverable once the internal stage is dropped, as they are permanently removed from Snowflake's storage.
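As a small illustration (stage name hypothetical), it can be worth listing a stage's contents before dropping it, since internal-stage files are unrecoverable afterwards:

```sql
-- List files currently in the stage before dropping it
LIST @my_internal_stage;

-- Dropping an internal stage permanently deletes its files;
-- dropping an external stage leaves the files in cloud storage
DROP STAGE my_internal_stage;
```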

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Stages

How long is Snowpipe data load history retained?

A. As configured in the CREATE PIPE settings
B. Until the pipe is dropped
C. 64 days
D. 14 days
Suggested answer: C

Explanation:

Snowpipe data load history is retained for 64 days. This retention period allows users to review and audit the data load operations performed by Snowpipe over a significant period of time, which can be crucial for troubleshooting and ensuring data integrity.
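As a sketch, recent load activity for a table (including Snowpipe loads) can be inspected with the COPY_HISTORY table function; the table name and time window below are illustrative:

```sql
-- Review recent load activity for a table
SELECT file_name, last_load_time, row_count, status
FROM TABLE(information_schema.copy_history(
  table_name => 'MY_TABLE',
  start_time => DATEADD(day, -7, CURRENT_TIMESTAMP())));
```

Note that the INFORMATION_SCHEMA function covers only a short recent window; the ACCOUNT_USAGE.COPY_HISTORY view covers the longer retention period.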

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowpipe

In which use cases does Snowflake apply egress charges?

A. Data sharing within a specific region
B. Query result retrieval
C. Database replication
D. Loading data into Snowflake
Suggested answer: C

Explanation:

Snowflake applies egress charges in the case of database replication when data is transferred out of a Snowflake region to another region or cloud provider. This is because the data transfer incurs costs associated with moving data across different networks. Egress charges are not applied for data sharing within the same region, query result retrieval, or loading data into Snowflake, as these actions do not involve data transfer across regions.
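A sketch of auditing such transfers, assuming the DATA_TRANSFER_HISTORY table function with a simple date range:

```sql
-- Inspect data transferred out of the region over the last week
SELECT *
FROM TABLE(information_schema.data_transfer_history(
  date_range_start => DATEADD(day, -7, CURRENT_DATE()),
  date_range_end   => CURRENT_DATE()));
```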

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Data Replication and Egress Charges

Which account_usage views are used to evaluate the details of dynamic data masking? (Select TWO)

A. ROLES
B. POLICY_REFERENCES
C. QUERY_HISTORY
D. RESOURCE_MONITORS
E. ACCESS_HISTORY
Answers
Suggested answer: B, E

Explanation:

To evaluate the details of dynamic data masking, the POLICY_REFERENCES and ACCESS_HISTORY views in the account_usage schema are used. The POLICY_REFERENCES view provides information about the objects to which a masking policy is applied, and the ACCESS_HISTORY view contains details about access to the masked data, which can be used to audit and verify the application of dynamic data masking policies.
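The two views can be combined into a simple audit, for example (filters are illustrative):

```sql
-- Which objects and columns have a masking policy attached?
SELECT policy_name, ref_entity_name, ref_column_name
FROM snowflake.account_usage.policy_references
WHERE policy_kind = 'MASKING_POLICY';

-- Who has recently accessed objects (including masked columns)?
SELECT user_name, query_start_time, direct_objects_accessed
FROM snowflake.account_usage.access_history
WHERE query_start_time > DATEADD(day, -7, CURRENT_TIMESTAMP());
```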

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Dynamic Data Masking

Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?

A. Compute layer
B. Storage layer
C. Cloud infrastructure layer
D. Cloud services layer
Answers
Suggested answer: D

Explanation:

Query compilation in Snowflake occurs in the Cloud Services layer. This layer is responsible for coordinating and managing all aspects of the Snowflake service, including authentication, infrastructure management, metadata management, query parsing and optimization, and security. By handling these tasks, the Cloud Services layer enables the Compute layer to focus on executing queries, while the Storage layer is dedicated to persistently storing data.

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowflake Architecture

Which is the MINIMUM required Snowflake edition that a user must have if they want to use AWS/Azure PrivateLink or Google Cloud Private Service Connect?

A. Standard
B. Premium
C. Enterprise
D. Business Critical
Suggested answer: D

Explanation:

Private connectivity to the Snowflake service (AWS PrivateLink, Azure Private Link, and Google Cloud Private Service Connect) requires the Business Critical edition or higher.

https://docs.snowflake.com/en/user-guide/admin-security-privatelink.html
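As a sketch, once on an eligible edition an administrator can retrieve the account's private connectivity configuration (used when setting up the cloud-side endpoint):

```sql
-- Returns the account's private connectivity configuration as JSON
SELECT SYSTEM$GET_PRIVATELINK_CONFIG();
```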

Total 716 questions