
Snowflake SnowPro Core Practice Test - Questions Answers, Page 9

Which features that are part of the Continuous Data Protection (CDP) feature set in Snowflake do not require additional configuration? (Choose two.)

A. Row level access policies
B. Data masking policies
C. Data encryption
D. Time Travel
E. External tokenization
Suggested answer: C, D

Explanation:

Data encryption and Time Travel are the parts of Snowflake's Continuous Data Protection (CDP) feature set that require no additional configuration. Data encryption is applied automatically to all data that Snowflake stores, and Time Travel allows querying and restoring historical data without any extra setup.
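A minimal sketch of Time Travel working out of the box (the table name `orders` is hypothetical; permanent tables get one day of retention by default):

```sql
-- Query the table as it existed 30 minutes ago.
SELECT * FROM orders AT(OFFSET => -60 * 30);

-- Recover the table if it was dropped within the retention window.
UNDROP TABLE orders;
```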

Which Snowflake layer is always leveraged when accessing a query from the result cache?

A. Metadata
B. Data Storage
C. Compute
D. Cloud Services
Suggested answer: D

Explanation:

The Cloud Services layer in Snowflake is responsible for managing the result cache. When a query is executed, its results are stored in this cache, and subsequent identical queries can be served from the cached results without re-executing the query.
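As a sketch, an exact rerun of a query can be answered from the result cache by the Cloud Services layer; the table name `orders` is hypothetical:

```sql
SELECT COUNT(*) FROM orders;  -- first run executes on a warehouse
SELECT COUNT(*) FROM orders;  -- identical rerun can be served from the result cache

-- Disable result reuse for the session to force re-execution:
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
```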

A Snowflake Administrator needs to ensure that sensitive corporate data in Snowflake tables is not visible to end users, but is partially visible to functional managers.

How can this requirement be met?

A. Use data encryption.
B. Use dynamic data masking.
C. Use secure materialized views.
D. Revoke all roles for functional managers and end users.
Suggested answer: B

Explanation:

Dynamic data masking is a feature in Snowflake that allows administrators to define masking policies to protect sensitive data. It enables partial visibility of the data for certain roles, such as functional managers, while hiding it from others, such as end users.
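A sketch of such a policy (the policy, role, table, and column names are hypothetical):

```sql
-- Functional managers see a partially masked value; everyone else sees nothing useful.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'FUNCTIONAL_MANAGER'
      THEN REGEXP_REPLACE(val, '^.*@', '*****@')  -- keep only the domain
    ELSE '*** MASKED ***'
  END;

ALTER TABLE employees MODIFY COLUMN email
  SET MASKING POLICY email_mask;
```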

Users are responsible for data storage costs until what occurs?

A. Data expires from Time Travel
B. Data expires from Fail-safe
C. Data is deleted from a table
D. Data is truncated from a table
Suggested answer: B

Explanation:

Users are responsible for data storage costs in Snowflake until the data expires from the Fail-safe period. Fail-safe is the final stage in the data lifecycle, following Time Travel, and provides additional protection against accidental data loss. Once data exits the Fail-safe state, users are no longer billed for its storage.
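The lifecycle stages can be inspected per table; as a sketch (the table name is hypothetical), the ACCOUNT_USAGE view below splits storage into active, Time Travel, and Fail-safe bytes, and billing stops once the Fail-safe figure reaches zero:

```sql
SELECT table_name, active_bytes, time_travel_bytes, failsafe_bytes
FROM snowflake.account_usage.table_storage_metrics
WHERE table_name = 'ORDERS';  -- hypothetical table
```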

What is the recommended file sizing for data loading using Snowpipe?

A. A compressed file size greater than 100 MB, and up to 250 MB
B. A compressed file size greater than 100 GB, and up to 250 GB
C. A compressed file size greater than 10 MB, and up to 100 MB
D. A compressed file size greater than 1 GB, and up to 2 GB
Suggested answer: C

Explanation:

For data loading using Snowpipe, the recommended file size is a compressed file greater than 10 MB and up to 100 MB. This size range is optimal for Snowpipe's continuous, micro-batch loading process, allowing for efficient and timely data ingestion without overwhelming the system with files that are too large or too small.
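A minimal auto-ingesting pipe for context (the pipe, table, and stage names are hypothetical); the file-size guidance applies to the compressed files that land in the stage:

```sql
CREATE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events
  FROM @events_stage
  FILE_FORMAT = (TYPE = 'JSON');  -- aim for ~10-100 MB compressed files
```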

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowpipe

Which services does the Snowflake Cloud Services layer manage? (Select TWO).

A. Compute resources
B. Query execution
C. Authentication
D. Data storage
E. Metadata
Suggested answer: C, E

Explanation:

The Snowflake Cloud Services layer manages a variety of services that are crucial for the operation of the Snowflake platform. Among these services, Authentication and Metadata management are key components. Authentication is essential for controlling access to the Snowflake environment, ensuring that only authorized users can perform actions within the platform. Metadata management involves handling all the metadata related to objects within Snowflake, such as tables, views, and databases, which is vital for the organization and retrieval of data.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation

https://docs.snowflake.com/en/user-guide/intro-key-concepts.html

What data is stored in the Snowflake storage layer? (Select TWO).

A. Snowflake parameters
B. Micro-partitions
C. Query history
D. Persisted query results
E. Standard and secure view results
Suggested answer: B, D

Explanation:

The Snowflake storage layer is responsible for storing data in an optimized, compressed, columnar format. This includes micro-partitions, which are the fundamental storage units that contain the actual data stored in Snowflake. Additionally, persisted query results, which are the results of queries that have been materialized and stored for future use, are also kept within this layer. This design allows for efficient data retrieval and management within the Snowflake architecture.
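A sketch of reusing a persisted result (the table name is hypothetical): RESULT_SCAN reads the stored result of a prior query instead of re-reading the table's micro-partitions:

```sql
SELECT COUNT(*) FROM orders;
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
```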

[COF-C02] SnowPro Core Certification Exam Study Guide

Key Concepts & Architecture | Snowflake Documentation

In which scenarios would a user have to pay Cloud Services costs? (Select TWO).

A. Compute Credits = 50; Cloud Services Credits = 10
B. Compute Credits = 80; Cloud Services Credits = 5
C. Compute Credits = 10; Cloud Services Credits = 9
D. Compute Credits = 120; Cloud Services Credits = 10
E. Compute Credits = 200; Cloud Services Credits = 26
Suggested answer: A, E

Explanation:

In Snowflake, Cloud Services costs are incurred when the Cloud Services usage exceeds 10% of the compute usage (measured in credits). Therefore, scenarios A and E would result in Cloud Services charges because the Cloud Services usage is more than 10% of the compute credits used.
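The 10% adjustment can be sketched as simple arithmetic (the column names below are illustrative, not a Snowflake view):

```sql
-- Billable Cloud Services credits = usage above 10% of compute usage.
-- Scenario A: 10 - (10% of 50)  -> 5 billable credits
-- Scenario B: 5  - (10% of 80)  -> negative, so nothing is billed
-- Scenario E: 26 - (10% of 200) -> 6 billable credits
SELECT GREATEST(cloud_services_credits - 0.10 * compute_credits, 0)
         AS billable_cloud_services
FROM (SELECT 50 AS compute_credits, 10 AS cloud_services_credits);
```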

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake's official documentation on billing and usage

What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....) statement? (Select TWO.)

A. Data can be filtered by an optional WHERE clause
B. Incoming data can be joined with other tables
C. Columns can be reordered
D. Columns can be omitted
E. Row level access can be defined
Suggested answer: C, D

Explanation:

In a CREATE PIPE ... AS COPY ... FROM (....) statement, the COPY command supports only simple transformations: column reordering, column omission, casts, and truncating text strings that exceed the target column length. Reordering lets the stage columns map to target columns in a different order, and omission lets the load skip columns the target table does not need. Filtering with a WHERE clause, joins with other tables, and row-level access definitions are not supported during a load.
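A sketch of a pipe using these simple transformations (the pipe, stage, table, and column names are hypothetical):

```sql
CREATE PIPE reorder_pipe AS
  COPY INTO orders (amount, order_id)  -- target columns listed explicitly
  FROM (
    SELECT $2, $1        -- stage columns reordered; $3 in the file is omitted
    FROM @orders_stage
  );
```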

[COF-C02] SnowPro Core Certification Exam Study Guide

Simple Transformations During a Load

What is a responsibility of Snowflake's virtual warehouses?

A. Infrastructure management
B. Metadata management
C. Query execution
D. Query parsing and optimization
E. Management of the storage layer
Suggested answer: C

Explanation:

The primary responsibility of Snowflake's virtual warehouses is to execute queries. Virtual warehouses are one of the key components of Snowflake's architecture, providing the compute power required to perform data processing tasks such as running SQL queries, performing joins, aggregations, and other data manipulations.
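As a sketch, creating a warehouse provisions compute only (the warehouse name is hypothetical); parsing, optimization, and metadata remain in the Cloud Services layer:

```sql
CREATE WAREHOUSE IF NOT EXISTS query_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60    -- seconds of idle time before suspending
  AUTO_RESUME = TRUE;  -- resume automatically when a query arrives
```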

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Virtual Warehouses
