
Snowflake COF-C02 Practice Test - Questions Answers, Page 7


Question 61


Where would a Snowflake user find information about query activity from 90 days ago?

A. account_usage.query_history view
B. account_usage.query_history_archive view
C. information_schema.query_history view
D. information_schema.query_history_by_session view
Suggested answer: B
Explanation:

To find information about query activity from 90 days ago, a Snowflake user should use the account_usage.query_history_archive view. This view is designed to provide access to historical query data beyond the 7-day window of the INFORMATION_SCHEMA query history table functions. It allows users to analyze and audit past query activity for up to 365 days after the date of execution, which covers the 90-day period mentioned.
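A minimal sketch of such a lookup, using the view named in the suggested answer (the fully qualified name, column list, and date math here are illustrative assumptions):

    -- Query activity from roughly 90 days ago
    SELECT query_id, user_name, query_text, start_time
    FROM snowflake.account_usage.query_history_archive
    WHERE start_time >= DATEADD(day, -91, CURRENT_TIMESTAMP())
      AND start_time <  DATEADD(day, -89, CURRENT_TIMESTAMP());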

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Account Usage Schema1


Question 62


Which Snowflake technique can be used to improve the performance of a query?

A. Clustering
B. Indexing
C. Fragmenting
D. Using INDEX_HINTS
Suggested answer: A
Explanation:

Clustering is a technique used in Snowflake to improve query performance. Defining a clustering key on one or more columns co-locates rows with similar key values in the same micro-partitions. This organization allows Snowflake to efficiently prune non-relevant micro-partitions during a query, which reduces the amount of data scanned and improves performance.
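As a brief illustration (table and column names are hypothetical), a clustering key can be defined when a table is created or added afterwards:

    -- Define a clustering key at creation time
    CREATE TABLE sales (
      sale_date DATE,
      region    STRING,
      amount    NUMBER(12,2)
    ) CLUSTER BY (sale_date, region);

    -- Or add/change the clustering key on an existing table
    ALTER TABLE sales CLUSTER BY (sale_date);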

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Clustering


Question 63


User-level network policies can be created by which of the following roles? (Select TWO).

A. ROLEADMIN
B. ACCOUNTADMIN
C. SYSADMIN
D. SECURITYADMIN
E. USERADMIN
Suggested answer: B, D
Explanation:

User-level network policies in Snowflake can be created by roles with the necessary privileges to manage security and account settings. The ACCOUNTADMIN role has the highest level of privileges across the account, including the ability to manage network policies. The SECURITYADMIN role is specifically responsible for managing security objects within Snowflake, which includes the creation and management of network policies.
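A minimal sketch of creating and applying a user-level network policy (the policy name, CIDR range, and user name are hypothetical):

    -- Run with a sufficiently privileged role
    USE ROLE SECURITYADMIN;

    CREATE NETWORK POLICY corp_vpn_only
      ALLOWED_IP_LIST = ('203.0.113.0/24');

    -- Attach the policy to an individual user
    ALTER USER jsmith SET NETWORK_POLICY = 'corp_vpn_only';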

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Network Policies1

Section 1.3 - SnowPro Core Certification Study Guide


Question 64


Which command can be used to load data into an internal stage?

A. LOAD
B. COPY
C. GET
D. PUT
Suggested answer: D
Explanation:

The PUT command is used to load data into an internal stage in Snowflake. This command uploads data files from a local file system to a named internal stage, making the data available for subsequent loading into a Snowflake table using the COPY INTO command.
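A sketch of the typical flow from a client such as SnowSQL (the file path, stage, and table names are assumptions):

    -- Upload a local file to a named internal stage
    PUT file:///tmp/sales.csv @my_int_stage;

    -- Load the staged file into a table (PUT gzip-compresses files by default)
    COPY INTO sales
      FROM @my_int_stage/sales.csv.gz
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);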

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Data Loading


Question 65


What happens when an external or an internal stage is dropped? (Select TWO).

A. When dropping an external stage, the files are not removed and only the stage is dropped
B. When dropping an external stage, both the stage and the files within the stage are removed
C. When dropping an internal stage, the files are deleted with the stage and the files are recoverable
D. When dropping an internal stage, the files are deleted with the stage and the files are not recoverable
E. When dropping an internal stage, only selected files are deleted with the stage and are not recoverable
Suggested answer: A, D
Explanation:

When an external stage is dropped in Snowflake, the reference to the external storage location is removed, but the actual files within the external storage (like Amazon S3, Google Cloud Storage, or Microsoft Azure) are not deleted. This means that the data remains intact in the external storage location, and only the stage object in Snowflake is removed.

On the other hand, when an internal stage is dropped, any files that were uploaded to the stage are deleted along with the stage itself. These files are not recoverable once the internal stage is dropped, as they are permanently removed from Snowflake's storage.
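A short sketch of the difference (stage names are hypothetical):

    -- External stage: only the stage object is removed; the files remain in S3/GCS/Azure
    DROP STAGE my_ext_stage;

    -- Internal stage: the staged files are deleted with the stage and are not recoverable
    DROP STAGE my_int_stage;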

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Stages


Question 66


How long is Snowpipe data load history retained?

A. As configured in the CREATE PIPE settings
B. Until the pipe is dropped
C. 64 days
D. 14 days
Suggested answer: C
Explanation:

Snowpipe data load history is retained for 64 days. This retention period allows users to review and audit the data load operations performed by Snowpipe over a significant period of time, which can be crucial for troubleshooting and ensuring data integrity.
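Within that 64-day window, load history can be inspected with the COPY_HISTORY table function; a sketch, assuming a hypothetical table name and look-back period:

    -- Review Snowpipe (and bulk COPY) load activity for a table
    SELECT file_name, last_load_time, row_count, status
    FROM TABLE(information_schema.copy_history(
      TABLE_NAME => 'MY_TABLE',
      START_TIME => DATEADD(day, -14, CURRENT_TIMESTAMP())));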

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowpipe1


Question 67


In which use cases does Snowflake apply egress charges?

A. Data sharing within a specific region
B. Query result retrieval
C. Database replication
D. Loading data into Snowflake
Suggested answer: C
Explanation:

Snowflake applies egress charges in the case of database replication when data is transferred out of a Snowflake region to another region or cloud provider. This is because the data transfer incurs costs associated with moving data across different networks. Egress charges are not applied for data sharing within the same region, query result retrieval, or loading data into Snowflake, as these actions do not involve data transfer across regions.
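As a hedged illustration of the replication scenario that incurs egress (database and account identifiers are hypothetical):

    -- Enable replication of a database to an account in another region;
    -- data leaving the source region is billed as egress
    ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.account_eu;

    -- On the target account: create and refresh the secondary database
    CREATE DATABASE sales_db AS REPLICA OF myorg.account_us.sales_db;
    ALTER DATABASE sales_db REFRESH;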

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Data Replication and Egress Charges1


Question 68


Which account_usage views are used to evaluate the details of dynamic data masking? (Select TWO)

A. ROLES
B. POLICY_REFERENCES
C. QUERY_HISTORY
D. RESOURCE_MONITORS
E. ACCESS_HISTORY
Suggested answer: B, E
Explanation:

To evaluate the details of dynamic data masking, the POLICY_REFERENCES and ACCESS_HISTORY views in the account_usage schema are used. The POLICY_REFERENCES view provides information about the objects to which a masking policy is applied, and the ACCESS_HISTORY view contains details about access to the masked data, which can be used to audit and verify the application of dynamic data masking policies.
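A minimal sketch of such an audit (the filters and column lists are assumptions based on the documented view columns):

    -- Which columns have a masking policy attached?
    SELECT policy_name, ref_entity_name, ref_column_name
    FROM snowflake.account_usage.policy_references
    WHERE policy_kind = 'MASKING_POLICY';

    -- Recent access events, to audit who queried the governed objects
    SELECT query_id, user_name, query_start_time
    FROM snowflake.account_usage.access_history
    ORDER BY query_start_time DESC
    LIMIT 100;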

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Dynamic Data Masking1


Question 69


Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?

A. Compute layer
B. Storage layer
C. Cloud infrastructure layer
D. Cloud services layer
Suggested answer: D
Explanation:

Query compilation in Snowflake occurs in the Cloud Services layer. This layer is responsible for coordinating and managing all aspects of the Snowflake service, including authentication, infrastructure management, metadata management, query parsing and optimization, and security. By handling these tasks, the Cloud Services layer enables the Compute layer to focus on executing queries, while the Storage layer is dedicated to persistently storing data.
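One way to observe this split is through the timing columns of the account_usage.query_history view, where compilation time (cloud services layer) is reported separately from execution time (compute layer); a sketch:

    SELECT query_id, compilation_time, execution_time
    FROM snowflake.account_usage.query_history
    ORDER BY start_time DESC
    LIMIT 10;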

References:

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowflake Architecture1


Question 70


Which is the MINIMUM required Snowflake edition that a user must have if they want to use AWS/Azure PrivateLink or Google Cloud Private Service Connect?

A. Standard
B. Premium
C. Enterprise
D. Business Critical
Suggested answer: D
Explanation:

Private connectivity features such as AWS PrivateLink, Azure Private Link, and Google Cloud Private Service Connect require the Business Critical edition or higher, as they are part of Snowflake's enhanced security offering.

https://docs.snowflake.com/en/user-guide/admin-security-privatelink.html
