
Snowflake COF-C02 Practice Test - Questions Answers, Page 5


Question 41


Which Snowflake features allow virtual warehouses to handle high concurrency workloads? (Select TWO)

A. The ability to scale up warehouses
B. The use of warehouse auto scaling
C. The ability to resize warehouses
D. Use of multi-clustered warehouses
E. The use of warehouse indexing
Suggested answer: B, D

Explanation:

Snowflake's architecture is designed to handle high concurrency workloads through several features, two of which are particularly effective:

B. The use of warehouse auto scaling: This feature allows Snowflake to automatically adjust the compute resources allocated to a virtual warehouse in response to the workload. If concurrent queries increase, Snowflake can automatically add compute clusters to maintain performance.

D. Use of multi-clustered warehouses: Multi-clustered warehouses enable Snowflake to run multiple clusters of compute resources simultaneously. This allows for the distribution of queries across clusters, thereby reducing the load on any single cluster and improving the system's ability to handle a high number of concurrent queries.

These features ensure that Snowflake can manage varying levels of demand without manual intervention, providing a seamless experience even during peak usage.
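
As a rough illustration of how these two features are configured together, the sketch below (warehouse name and parameter values are placeholders, not part of the question) creates a multi-cluster warehouse that auto-scales between one and four clusters:

    -- Illustrative only: multi-cluster warehouse running in Auto-scale mode
    CREATE WAREHOUSE concurrency_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 4        -- Snowflake adds clusters as concurrent queries start to queue
      SCALING_POLICY = 'STANDARD'  -- favors starting extra clusters over letting queries queue
      AUTO_SUSPEND = 300
      AUTO_RESUME = TRUE;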

References:

Snowflake Documentation on Virtual Warehouses

SnowPro Core Certification Study Guide


Question 42


When reviewing the load for a warehouse using the load monitoring chart, the chart indicates that a high volume of queries is always queuing in the warehouse.

According to recommended best practice, what should be done to reduce the queue volume? (Select TWO).

A. Use multi-clustered warehousing to scale out warehouse capacity.
B. Scale up the warehouse size to allow queries to execute faster.
C. Stop and start the warehouse to clear the queued queries.
D. Migrate some queries to a new warehouse to reduce load.
E. Limit user access to the warehouse so fewer queries are run against it.
Suggested answer: A, B

Explanation:

To address a high volume of queries queuing in a warehouse, Snowflake recommends two best practices:

A. Use multi-clustered warehousing to scale out warehouse capacity: This approach allows for the distribution of queries across multiple clusters within a warehouse, effectively managing the load and reducing the queue volume.

B. Scale up the warehouse size to allow queries to execute faster: Increasing the size of the warehouse provides more compute resources, which can reduce the time it takes for queries to execute and thus decrease the number of queries waiting in the queue.

These strategies help to optimize the performance of the warehouse by ensuring that resources are scaled appropriately to meet demand.
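
A minimal sketch of both adjustments, assuming an existing warehouse named reporting_wh (the name and values are placeholders):

    -- Scale out: let Snowflake start extra clusters to absorb concurrent queries
    ALTER WAREHOUSE reporting_wh SET
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 3;

    -- Scale up: move to a larger size so individual queries finish sooner
    ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE';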

References:

Snowflake Documentation on Multi-Cluster Warehousing

SnowPro Core Certification best practices


Question 43


Which of the following objects can be shared through secure data sharing?

A. Masking policy
B. Stored procedure
C. Task
D. External table
Suggested answer: D

Explanation:

Secure data sharing in Snowflake allows users to share various objects between Snowflake accounts without physically copying the data, thus not consuming additional storage. Among the options provided, external tables can be shared through secure data sharing. External tables are used to query data directly from files in a stage without loading the data into Snowflake tables, making them suitable for sharing across different Snowflake accounts.
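
A minimal provider-side sketch, assuming hypothetical object names (sales_db, ext_orders, sales_share); the key step is granting SELECT on the external table to the share, after which a consumer account can create a database from the share:

    -- Illustrative only: share an external table with a consumer account
    CREATE SHARE sales_share;
    GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
    GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
    GRANT SELECT ON EXTERNAL TABLE sales_db.public.ext_orders TO SHARE sales_share;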

References:

Snowflake Documentation on Secure Data Sharing

SnowPro Core Certification Companion: Hands-on Preparation and Practice


Question 44


Which of the following commands cannot be used within a reader account?

A. CREATE SHARE
B. ALTER WAREHOUSE
C. DROP ROLE
D. SHOW SCHEMAS
E. DESCRIBE TABLE
Suggested answer: A

Explanation:

In Snowflake, a reader account is a type of account that is intended for consuming shared data rather than performing any data management or DDL operations. The CREATE SHARE command is used to share data from your account with another account, which is not a capability provided to reader accounts. Reader accounts are typically restricted from creating shares, as their primary purpose is to read shared data rather than to share it themselves.
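
For context, reader accounts are created and managed by the provider account; a brief sketch (account name and credentials are placeholders) is shown below. Inside the resulting reader account, data-sharing DDL such as CREATE SHARE is rejected; the account can only query the data shared with it.

    -- Illustrative only: provider creates a managed (reader) account
    CREATE MANAGED ACCOUNT sales_reader
      ADMIN_NAME = reader_admin,
      ADMIN_PASSWORD = 'Str0ngPassw0rd!',
      TYPE = READER;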

References:

Snowflake Documentation on Reader Accounts

SnowPro Core Certification Study Guide


Question 45


A user unloaded a Snowflake table called mytable to an internal stage called mystage.

Which command can be used to view the list of files that have been uploaded to the stage?

A. list @mytable;
B. list @%mytable;
C. list @%mystage;
D. list @mystage;
Suggested answer: D

Explanation:

The command list @mystage; is used to view the list of files that have been uploaded to an internal stage in Snowflake. The LIST command displays the metadata for all files in the specified stage, which in this case is mystage. This command is particularly useful for verifying that files have been successfully unloaded from a Snowflake table to the stage and for managing the files within the stage.
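
Putting the unload and the check together, a short sketch using the names from the question (the file prefix and format options are illustrative):

    -- Unload the table into the named internal stage, then inspect the staged files
    COPY INTO @mystage/mytable_ FROM mytable
      FILE_FORMAT = (TYPE = CSV);
    LIST @mystage;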

References:

Snowflake Documentation on Stages

SnowPro Core Certification Study Guide


Question 46


Which of the following Snowflake capabilities are available in all Snowflake editions? (Select TWO)

A. Customer-managed encryption keys through Tri-Secret Secure
B. Automatic encryption of all data
C. Up to 90 days of data recovery through Time Travel
D. Object-level access control
E. Column-level security to apply data masking policies to tables and views
Suggested answer: B, D

Explanation:

In all Snowflake editions, two key capabilities are universally available:

B. Automatic encryption of all data: Snowflake automatically encrypts all data stored in its platform, ensuring security and compliance with various regulations. This encryption is transparent to users and does not require any configuration or management.

D. Object-level access control: Snowflake provides granular access control mechanisms that allow administrators to define permissions at the object level, including databases, schemas, tables, and views. This ensures that only authorized users can access specific data objects.

These features are part of Snowflake's commitment to security and governance, and they are included in every edition of the Snowflake Data Cloud.
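
As an example of object-level access control (available in every edition), the grants below give a role read access to a specific table; the role, database, and table names are hypothetical:

    -- Illustrative only: object-level grants to a role
    GRANT USAGE ON DATABASE sales_db TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA sales_db.public TO ROLE analyst_role;
    GRANT SELECT ON TABLE sales_db.public.orders TO ROLE analyst_role;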

References:

Snowflake Documentation on Security Features

SnowPro Core Certification Exam Study Guide


Question 47


Which command is used to unload data from a Snowflake table into a file in a stage?

A. COPY INTO
B. GET
C. WRITE
D. EXTRACT INTO
Suggested answer: A

Explanation:

The COPY INTO command is used in Snowflake to unload data from a table into files in a stage. This command allows for the export of data from Snowflake tables into flat files, which can then be used for further analysis, processing, or storage in external systems.
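
A minimal unload sketch, with hypothetical stage, file prefix, and table names and an illustrative file format:

    -- Write the table's rows to compressed CSV files in a named stage
    COPY INTO @my_unload_stage/orders_
      FROM orders
      FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
      HEADER = TRUE;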

References:

Snowflake Documentation on Unloading Data

Snowflake SnowPro Core: Copy Into Command to Unload Rows to Files in Named Stage


Question 48


How often are encryption keys automatically rotated by Snowflake?

A. 30 Days
B. 60 Days
C. 90 Days
D. 365 Days
Suggested answer: A

Explanation:

Snowflake automatically rotates encryption keys when they are more than 30 days old. Active keys are retired, and new keys are created. This process is part of Snowflake's comprehensive security measures to ensure data protection and is managed entirely by the Snowflake service without requiring user intervention.

References:

Understanding Encryption Key Management in Snowflake


Question 49


Which value types can a VARIANT column store? (Select TWO)

A. STRUCT
B. OBJECT
C. BINARY
D. ARRAY
E. CLOB
Suggested answer: B, D

Explanation:

A VARIANT column in Snowflake can store semi-structured data types. This includes:

B. OBJECT: An object is a collection of key-value pairs in JSON, and a VARIANT column can store this type of data structure.

D. ARRAY: An array is an ordered list of zero or more values, which can be of any variant-supported data type, including objects or other arrays.

The VARIANT data type is specifically designed to handle semi-structured data like JSON, Avro, ORC, Parquet, or XML, allowing for the storage of nested and complex data structures.
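
A short sketch (table and column names are hypothetical) showing a VARIANT column holding an OBJECT with a nested ARRAY, and path notation reading both back:

    -- Store semi-structured JSON in a VARIANT column
    CREATE OR REPLACE TABLE raw_events (payload VARIANT);

    INSERT INTO raw_events
      SELECT PARSE_JSON('{"user": "alice", "tags": ["snowflake", "variant"]}');

    -- Extract values from the OBJECT and the ARRAY it contains
    SELECT payload:user::STRING AS user_name,
           payload:tags[0]::STRING AS first_tag
    FROM raw_events;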

References:

Snowflake Documentation on Semi-Structured Data Types

SnowPro Core Certification Study Guide


Question 50


A user has an application that writes a new file to a cloud storage location every 5 minutes.

What would be the MOST efficient way to get the files into Snowflake?

A. Create a task that runs a COPY INTO operation from an external stage every 5 minutes.
B. Create a task that puts the files in an internal stage and automate the data loading wizard.
C. Create a task that runs a GET operation to intermittently check for new files.
D. Set up cloud provider notifications on the file location and use Snowpipe with auto-ingest.
Suggested answer: D

Explanation:

The most efficient way to get files into Snowflake, especially when new files are being written to a cloud storage location at frequent intervals, is to use Snowpipe with auto-ingest. Snowpipe is Snowflake's continuous data ingestion service that loads data as soon as it becomes available in a cloud storage location. By setting up cloud provider notifications, Snowpipe can be triggered automatically whenever new files are written to the storage location, ensuring that the data is loaded into Snowflake with minimal latency and without the need for manual intervention or scheduling frequent tasks.
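
A minimal Snowpipe sketch, assuming a hypothetical external stage, target table, and pipe name; the notification channel reported by SHOW PIPES is what gets wired into the cloud provider's event notifications:

    -- Continuously load new files as the cloud provider announces them
    CREATE PIPE ingest_pipe
      AUTO_INGEST = TRUE
      AS
      COPY INTO raw_files
      FROM @my_external_stage
      FILE_FORMAT = (TYPE = JSON);

    -- The notification_channel column identifies the queue to target with event notifications
    SHOW PIPES;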

References:

Snowflake Documentation on Snowpipe

SnowPro Core Certification Study Guide
