ExamGecko

Snowflake SnowPro Core Practice Test - Questions Answers, Page 9

List of questions

Question 81


Which features that are part of the Continuous Data Protection (CDP) feature set in Snowflake do not require additional configuration? (Choose two.)

A. Row level access policies
B. Data masking policies
C. Data encryption
D. Time Travel
E. External tokenization
Suggested answer: C, D
Explanation:

Data encryption and Time Travel are the parts of Snowflake's Continuous Data Protection (CDP) feature set that do not require additional configuration. Data encryption is automatically applied to all files stored on internal stages, and Time Travel allows querying and restoring data without any extra setup.
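For illustration, Time Travel works out of the box on any permanent table within the default retention period (the table name here is hypothetical):

```sql
-- Query the table as it existed 5 minutes ago; no setup required
-- beyond the default 1-day retention period.
SELECT * FROM orders AT(OFFSET => -300);

-- Restore an accidentally dropped table from Time Travel.
UNDROP TABLE orders;
```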

asked 23/09/2024
Stefan Lundmark
48 questions

Question 82


Which Snowflake layer is always leveraged when accessing a query from the result cache?

A. Metadata
B. Data Storage
C. Compute
D. Cloud Services
Suggested answer: D
Explanation:

The Cloud Services layer in Snowflake is responsible for managing the result cache. When a query is executed, its results are stored in this cache, and subsequent identical queries can leverage these cached results without re-executing the entire query.
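As a sketch, an identical re-run of a query against unchanged data can be answered entirely from the result cache (table name hypothetical):

```sql
-- First execution runs on a virtual warehouse and persists the result.
SELECT COUNT(*) FROM orders;

-- An identical re-run (same query text, unchanged underlying data)
-- is served by the Cloud Services layer from the result cache,
-- without using a virtual warehouse.
SELECT COUNT(*) FROM orders;

-- For benchmarking, result-cache reuse can be disabled per session.
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
```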


Question 83


A Snowflake Administrator needs to ensure that sensitive corporate data in Snowflake tables is not visible to end users, but is partially visible to functional managers.

How can this requirement be met?

A. Use data encryption.
B. Use dynamic data masking.
C. Use secure materialized views.
D. Revoke all roles for functional managers and end users.
Suggested answer: B
Explanation:

Dynamic data masking is a feature in Snowflake that allows administrators to define masking policies to protect sensitive data. It enables partial visibility of the data to certain roles, such as functional managers, while hiding it from others, like end users.
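A minimal sketch of such a policy (role, table, and column names are hypothetical; masking policies require Enterprise Edition or higher):

```sql
-- Managers see the last four digits; everyone else sees a fully masked value.
CREATE MASKING POLICY ssn_partial AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'FUNCTIONAL_MANAGER'
      THEN CONCAT('***-**-', RIGHT(val, 4))
    ELSE '***-**-****'
  END;

-- Attach the policy to the sensitive column.
ALTER TABLE employees MODIFY COLUMN ssn SET MASKING POLICY ssn_partial;
```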


Question 84


Users are responsible for data storage costs until what occurs?

A. Data expires from Time Travel
B. Data expires from Fail-safe
C. Data is deleted from a table
D. Data is truncated from a table
Suggested answer: B
Explanation:

Users are responsible for data storage costs in Snowflake until the data expires from the Fail-safe period. Fail-safe is the final stage in the data lifecycle, following Time Travel, and provides additional protection against accidental data loss. Once data exits the Fail-safe state, users are no longer billed for its storage.
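As a point of reference, the Time Travel portion of this window is configurable, while Fail-safe is not (table name hypothetical):

```sql
-- Time Travel retention: 1 day by default, up to 90 days on Enterprise Edition.
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 30;
-- Fail-safe then adds a fixed, non-configurable 7 days for permanent tables;
-- storage is billed until the data ages out of Fail-safe.
```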


Question 85


What is the recommended file sizing for data loading using Snowpipe?

A. A compressed file size greater than 100 MB, and up to 250 MB
B. A compressed file size greater than 100 GB, and up to 250 GB
C. A compressed file size greater than 10 MB, and up to 100 MB
D. A compressed file size greater than 1 GB, and up to 2 GB
Suggested answer: C
Explanation:

For data loading using Snowpipe, the recommended file size is a compressed file greater than 10 MB and up to 100 MB. This size range is optimal for Snowpipe's continuous, micro-batch loading process, allowing for efficient and timely data ingestion without overwhelming the system with files that are too large or too small.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Snowpipe


Question 86


Which services does the Snowflake Cloud Services layer manage? (Select TWO).

A. Compute resources
B. Query execution
C. Authentication
D. Data storage
E. Metadata
Suggested answer: C, E
Explanation:

The Snowflake Cloud Services layer manages a variety of services that are crucial for the operation of the Snowflake platform. Among these services, Authentication and Metadata management are key components. Authentication is essential for controlling access to the Snowflake environment, ensuring that only authorized users can perform actions within the platform. Metadata management involves handling all the metadata related to objects within Snowflake, such as tables, views, and databases, which is vital for the organization and retrieval of data.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation

https://docs.snowflake.com/en/user-guide/intro-key-concepts.html


Question 87


What data is stored in the Snowflake storage layer? (Select TWO).

A. Snowflake parameters
B. Micro-partitions
C. Query history
D. Persisted query results
E. Standard and secure view results
Suggested answer: B, D
Explanation:

The Snowflake storage layer is responsible for storing data in an optimized, compressed, columnar format. This includes micro-partitions, which are the fundamental storage units that contain the actual data stored in Snowflake. Additionally, persisted query results, which are the results of queries that have been materialized and stored for future use, are also kept within this layer. This design allows for efficient data retrieval and management within the Snowflake architecture.

[COF-C02] SnowPro Core Certification Exam Study Guide

Key Concepts & Architecture | Snowflake Documentation


Question 88


In which scenarios would a user have to pay Cloud Services costs? (Select TWO).

A. Compute credits = 50; Cloud Services credits = 10
B. Compute credits = 80; Cloud Services credits = 5
C. Compute credits = 10; Cloud Services credits = 9
D. Compute credits = 120; Cloud Services credits = 10
E. Compute credits = 200; Cloud Services credits = 26
Suggested answer: A, E
Explanation:

In Snowflake, Cloud Services usage is billed only to the extent that it exceeds 10% of the daily compute (warehouse) credit usage. In scenario A, 10 Cloud Services credits exceed 10% of 50 compute credits (5), and in scenario E, 26 Cloud Services credits exceed 10% of 200 compute credits (20), so both scenarios incur Cloud Services charges.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake's official documentation on billing and usage
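The 10% adjustment can be worked through directly; the formula below restates the daily billing rule:

```sql
-- Daily billable Cloud Services credits = GREATEST(cloud_services - 0.10 * compute, 0)
-- Scenario A: 10 - (0.10 * 50)  = 10 - 5  -> 5 credits billed
-- Scenario E: 26 - (0.10 * 200) = 26 - 20 -> 6 credits billed
-- Scenario B:  5 - (0.10 * 80)  =  5 - 8  -> 0 credits billed
SELECT GREATEST(26 - 0.10 * 200, 0) AS billable_cloud_services_credits;
```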


Question 89


What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....) statement? (Select TWO.)

A. Data can be filtered by an optional WHERE clause
B. Incoming data can be joined with other tables
C. Columns can be reordered
D. Columns can be omitted
E. Row level access can be defined
Suggested answer: C, D
Explanation:

The simple transformations supported in a CREATE PIPE ... AS COPY ... FROM (....) statement are limited to column reordering, column omission, casts, and truncating text strings that exceed the target column length. Filtering rows with a WHERE clause, joining incoming data with other tables, and defining row-level access are not supported during a load. Reordering and omitting columns allow the load to target only the relevant columns, in the required order, even when the staged files contain additional fields.

[COF-C02] SnowPro Core Certification Exam Study Guide

Simple Transformations During a Load
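A minimal sketch of a pipe using the documented simple load transformations (stage, table, and column names are hypothetical):

```sql
-- The staged CSV files contain four columns; only two are loaded,
-- reordered relative to their position in the files.
CREATE PIPE my_pipe AS
  COPY INTO target_table (amount, id)
  FROM (
    SELECT t.$4, t.$1
    FROM @my_stage t
  )
  FILE_FORMAT = (TYPE = 'CSV');
```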


Question 90


What is a responsibility of Snowflake's virtual warehouses?

A. Infrastructure management
B. Metadata management
C. Query execution
D. Query parsing and optimization
E. Management of the storage layer
Suggested answer: C
Explanation:

The primary responsibility of Snowflake's virtual warehouses is to execute queries. Virtual warehouses are one of the key components of Snowflake's architecture, providing the compute power required to perform data processing tasks such as running SQL queries, performing joins, aggregations, and other data manipulations.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Virtual Warehouses

Total 627 questions