
Snowflake SnowPro Core Practice Test - Questions and Answers


Question 1


A user is loading JSON documents composed of a huge array containing multiple records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.

What does the STRIP_OUTER_ARRAY file format do?

A. It removes the last element of the outer array.
B. It removes the outer array structure and loads the records into separate table rows.
C. It removes the trailing spaces in the last element of the outer array and loads the records into separate table columns.
D. It removes the NULL elements from the JSON object, eliminating invalid data, and enables the ability to load the records.
Suggested answer: B
Explanation:

The STRIP_OUTER_ARRAY file format option in Snowflake is used when loading JSON documents that are composed of a large array containing multiple records. When this option is enabled, it removes the outer array structure, which allows each record within the array to be loaded as a separate row in the table. This is particularly useful for efficiently loading JSON data that is structured as an array of records.

Snowflake Documentation on JSON File Format

[COF-C02] SnowPro Core Certification Exam Study Guide
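
For illustration, a minimal sketch of the option in use; the stage, table, and file names (my_json_stage, raw_events, events.json) are hypothetical:

-- File format that flattens the outer JSON array into one row per element
CREATE OR REPLACE FILE FORMAT json_array_format
  TYPE = 'JSON'
  STRIP_OUTER_ARRAY = TRUE;

-- Landing table with a single VARIANT column for the raw records
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- Each record inside the outer array is loaded as a separate row
COPY INTO raw_events
FROM @my_json_stage/events.json
FILE_FORMAT = (FORMAT_NAME = 'json_array_format');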


Question 2


What are the default Time Travel and Fail-safe retention periods for transient tables?

A. Time Travel - 1 day, Fail-safe - 1 day
B. Time Travel - 0 days, Fail-safe - 1 day
C. Time Travel - 1 day, Fail-safe - 0 days
D. Transient tables are retained in neither Fail-safe nor Time Travel
Suggested answer: C
Explanation:

Transient tables in Snowflake have a default Time Travel retention period of 1 day, which allows users to access historical data within the last 24 hours. However, transient tables do not have a Fail-safe period. Fail-safe is an additional layer of data protection that retains data beyond the Time Travel period for recovery purposes in case of extreme data loss. Since transient tables are designed for temporary or intermediate workloads with no requirement for long-term durability, they do not include a Fail-safe period.

Snowflake Documentation on Storage Costs for Time Travel and Fail-safe
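
As a sketch, the retention behavior can be checked directly on a transient table; the table name staging_orders is hypothetical:

-- Transient tables default to 1 day of Time Travel and no Fail-safe
CREATE TRANSIENT TABLE staging_orders (id NUMBER, amount NUMBER);

-- Shows DATA_RETENTION_TIME_IN_DAYS = 1 (can be lowered to 0; the maximum
-- for transient tables is 1)
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE staging_orders;

-- Optionally disable Time Travel entirely for this table
ALTER TABLE staging_orders SET DATA_RETENTION_TIME_IN_DAYS = 0;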


Question 3


What is a best practice after creating a custom role?

A. Create the custom role using the SYSADMIN role.
B. Assign the custom role to the SYSADMIN role.
C. Assign the custom role to the PUBLIC role.
D. Add _CUSTOM to all custom role names.
Suggested answer: B
Explanation:

Assigning the custom role to the SYSADMIN role is considered a best practice because it allows the SYSADMIN role to manage objects created by the custom role. This is important for maintaining proper access control and ensuring that the SYSADMIN can perform necessary administrative tasks on objects created by users with the custom role.

[COF-C02] SnowPro Core Certification Exam Study Guide

Section 1.3 - SnowPro Core Certification Study Guide
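
A minimal sketch of this practice; the role and user names (data_engineer, jsmith) are hypothetical:

-- Create the custom role (USERADMIN holds the CREATE ROLE privilege)
USE ROLE USERADMIN;
CREATE ROLE data_engineer;

-- Best practice: attach the custom role to the SYSADMIN hierarchy
-- so SYSADMIN can manage objects created under it
GRANT ROLE data_engineer TO ROLE SYSADMIN;

-- Grant the role to the users who need it
GRANT ROLE data_engineer TO USER jsmith;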


Question 4


Which of the following Snowflake objects can be shared using a secure share? (Select TWO).

A. Materialized views
B. Sequences
C. Procedures
D. Tables
E. Secure User Defined Functions (UDFs)
Suggested answer: D, E
Explanation:

Secure sharing in Snowflake allows users to share specific objects with other Snowflake accounts without physically copying the data, thus not consuming additional storage. Tables and Secure User Defined Functions (UDFs) are among the objects that can be shared using this feature. Materialized views, sequences, and procedures are not shareable objects in Snowflake.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Secure Data Sharing
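
A minimal sketch of sharing a table and a secure UDF; the share, database, and object names (sales_share, sales_db, orders, mask_email) and the consumer account locator are hypothetical:

USE ROLE ACCOUNTADMIN;
CREATE SHARE sales_share;

-- The share needs usage on the containing database and schema
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;

-- Tables and secure UDFs are shareable objects
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;
GRANT USAGE ON FUNCTION sales_db.public.mask_email(VARCHAR) TO SHARE sales_share;

-- Make the share visible to a consumer account
ALTER SHARE sales_share ADD ACCOUNTS = xy12345;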


Question 5


Will data cached in a warehouse be lost when the warehouse is resized?

A. Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.
B. Yes, because the compute resource is replaced in its entirety with a new compute resource.
C. No, because the size of the cache is independent from the warehouse size.
D. Yes, because the new compute resource will no longer have access to the cache encryption key.
Suggested answer: C
Explanation:

When a Snowflake virtual warehouse is resized, the data cached in the warehouse is not lost. This is because the cache is maintained independently of the warehouse size. Resizing a warehouse, whether scaling up or down, does not affect the cached data, ensuring that query performance is not impacted by such changes.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Virtual Warehouse Performance
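
As a sketch of the contrast: resizing keeps the warehouse's cached data, while suspending releases the compute resources (and with them the local disk cache). The warehouse name analytics_wh is hypothetical:

-- Resizing does not discard locally cached table data
ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Suspending releases the compute resources, so the local cache is
-- typically dropped and must be rebuilt after the warehouse resumes
ALTER WAREHOUSE analytics_wh SUSPEND;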


Question 6


What happens when a virtual warehouse is resized?

A. When increasing the size of an active warehouse, the compute resources for all running and queued queries on the warehouse are affected.
B. When reducing the size of a warehouse, the compute resources are removed only when they are no longer being used to execute any current statements.
C. The warehouse will be suspended while the new compute resources are provisioned and will resume automatically once provisioning is complete.
D. Users who are trying to use the warehouse will receive an error message until the resizing is complete.
Suggested answer: A
Explanation:

When a virtual warehouse in Snowflake is resized, specifically when it is increased in size, the additional compute resources become immediately available to all running and queued queries. This means that the performance of these queries can improve due to the increased resources. Conversely, when the size of a warehouse is reduced, the compute resources are not removed until they are no longer being used by any current operations.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Virtual Warehouses
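
A brief sketch of both directions of a resize; the warehouse name analytics_wh is hypothetical:

-- Scaling up: the added compute is available to running and queued queries
ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'XLARGE';

-- Scaling down: resources are released only after they finish executing
-- any statements currently running on them
ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'MEDIUM';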


Question 7


A developer is granted ownership of a table that has a masking policy. The developer's role is not able to see the masked data. Will the developer be able to modify the table to read the masked data?

A. Yes, because a table owner has full control and can unset masking policies.
B. Yes, because masking policies only apply to cloned tables.
C. No, because masking policies must always reference specific access roles.
D. No, because ownership of a table does not include the ability to change masking policies.
Suggested answer: D
Explanation:

Even if a developer is granted ownership of a table with a masking policy, they will not be able to modify the table to read the masked data if their role does not have the necessary permissions. Ownership of a table does not automatically confer the ability to alter masking policies, which are designed to protect sensitive data. Masking policies are schema-level objects, and modifying or unsetting them requires specific privileges.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Masking Policies
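
For illustration, a minimal masking policy sketch; the policy, role, table, and column names (email_mask, ANALYST_FULL, customers, email) are hypothetical:

-- Created by a role holding the CREATE MASKING POLICY privilege
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- The table owner cannot simply remove the policy: unsetting it requires
-- the APPLY MASKING POLICY privilege or ownership of the policy itself
-- ALTER TABLE customers MODIFY COLUMN email UNSET MASKING POLICY;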


Question 8


Which of the following describes how clustering keys work in Snowflake?

A. Clustering keys update the micro-partitions in place with a full sort, and impact the DML operations.
B. Clustering keys sort the designated columns over time, without blocking DML operations.
C. Clustering keys create a distributed, parallel data structure of pointers to a table's rows and columns.
D. Clustering keys establish a hashed key on each node of a virtual warehouse to optimize joins at run-time.
Suggested answer: B
Explanation:

Clustering keys in Snowflake work by sorting the designated columns over time. This process is done in the background and does not block data manipulation language (DML) operations, allowing normal database operations to continue without interruption. The purpose of clustering keys is to organize the data within micro-partitions to optimize query performance.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Clustering
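
As a sketch, defining a clustering key and inspecting its effect; the table and column names (events, event_date, customer_id) are hypothetical:

-- Designate the clustering key; background automatic clustering then
-- re-sorts micro-partitions over time without blocking DML
ALTER TABLE events CLUSTER BY (event_date, customer_id);

-- Inspect how well the table is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('events', '(event_date, customer_id)');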


Question 9


What is a machine learning and data science partner within the Snowflake Partner Ecosystem?

A. Informatica
B. Power BI
C. Adobe
D. DataRobot
Suggested answer: D
Explanation:

DataRobot is recognized as a machine learning and data science partner within the Snowflake Partner Ecosystem. It provides an enterprise AI platform that enables users to build and deploy accurate predictive models quickly. As a partner, DataRobot integrates with Snowflake to enhance data science capabilities.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Machine Learning & Data Science Partners

https://docs.snowflake.com/en/user-guide/ecosystem-analytics.html


Question 10


Which of the following is a valid source for an external stage when the Snowflake account is located on Microsoft Azure?

A. An FTP server with TLS encryption
B. An HTTPS server with WebDAV
C. A Google Cloud storage bucket
D. A Windows server file share on Azure
Suggested answer: C
Explanation:

External stages in Snowflake must reference cloud object storage: Amazon S3, Google Cloud Storage, or Microsoft Azure Blob storage (accessed via the azure:// endpoint). The cloud platform hosting the Snowflake account does not restrict which of these services a stage can reference, so an account on Microsoft Azure can still use a Google Cloud Storage bucket as an external stage source. FTP servers, HTTPS/WebDAV servers, and Windows server file shares are not supported stage locations.

[COF-C02] SnowPro Core Certification Exam Study Guide
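
A minimal sketch of a cross-cloud external stage created from an Azure-hosted account; the bucket, container, and storage integration names are hypothetical:

-- External stage over a Google Cloud Storage bucket
CREATE OR REPLACE STAGE gcs_stage
  URL = 'gcs://my_bucket/data/'
  STORAGE_INTEGRATION = gcs_int;  -- integration configured for GCS access

-- External stage over Azure Blob storage, for comparison
CREATE OR REPLACE STAGE azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer/files/'
  STORAGE_INTEGRATION = azure_int;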
