
Snowflake COF-C02 Practice Test - Questions Answers, Page 70


What optional properties can a Snowflake user set when creating a virtual warehouse? (Select TWO).

A. Auto-suspend
B. Cache size
C. Default role
D. Resource monitor
E. Storage size
Suggested answer: A, D

Explanation:

When creating a virtual warehouse in Snowflake, users have the option to set several properties to manage its behavior and resource usage. Two of these optional properties are Auto-suspend and Resource monitor.

Auto-suspend: This property defines the period of inactivity after which the warehouse will automatically suspend. This helps in managing costs by stopping the warehouse when it is not in use.

CREATE WAREHOUSE my_warehouse
  WITH WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300; -- Auto-suspend after 5 minutes of inactivity

Resource monitor: Users can assign a resource monitor to a warehouse to control and limit the amount of credit usage. Resource monitors help in setting quotas and alerts for warehouse usage.

CREATE WAREHOUSE my_warehouse
  WITH WAREHOUSE_SIZE = 'XSMALL'
  RESOURCE_MONITOR = 'my_resource_monitor';

References:

Snowflake Documentation: Creating Warehouses

Snowflake Documentation: Resource Monitors

What is the purpose of the use of the VALIDATE command?

A. To view any queries that encountered an error
B. To verify that a SELECT query will run without error
C. To prevent a PUT statement from running if an error occurs
D. To see all errors from a previously run COPY INTO <table> statement
Suggested answer: D

Explanation:

The VALIDATE table function in Snowflake is used to return the errors encountered during the execution of a previous COPY INTO <table> statement. It helps users identify and resolve data loading issues.

Run the COPY INTO Statement: Execute the COPY INTO <table> command to load data from a stage into a table.

COPY INTO my_table

FROM @my_stage

FILE_FORMAT = (FORMAT_NAME = 'my_format');

Validate the Load: Use the VALIDATE function to see if there were any errors during the data load.

SELECT *
FROM TABLE(VALIDATE(my_table, JOB_ID => 'my_copy_job_id'));
-- JOB_ID takes the query ID of the COPY statement; '_last' validates the most recent load run in the session.

Review Errors: The VALIDATE function will return details about any errors that occurred, such as parsing errors or data type mismatches.

References:

Snowflake Documentation: Validating Data Loads

Snowflake Documentation: COPY INTO <table>

Which function is used to unload a relational table into a JSON file?

A. PARSE_JSON
B. JSON_EXTRACT_PATH_TEXT
C. OBJECT_CONSTRUCT
D. TO_JSON
Suggested answer: D

Explanation:

The TO_JSON function in Snowflake converts a VARIANT or OBJECT value into a JSON-formatted string. Combined with OBJECT_CONSTRUCT, it can turn each row of a relational table into a JSON document, which makes it the right choice for exporting relational data as JSON.

Using TO_JSON Function:

SELECT TO_JSON(OBJECT_CONSTRUCT(*))
FROM my_table;

Exporting Data: The resulting JSON strings can then be unloaded to files, for example with COPY INTO <location>.

References:

Snowflake Documentation: TO_JSON Function

Snowflake Documentation: Exporting Data

Use of which file function allows a user to share unstructured data from an internal stage with an external reporting tool that does not have access to Snowflake?

A. BUILD_SCOPED_FILE_URL
B. GET_PRESIGNED_URL
C. BUILD_STAGE_FILE_URL
D. GET_STAGE_LOCATION
Suggested answer: B

Explanation:

The GET_PRESIGNED_URL function in Snowflake generates a pre-signed URL for a file in an internal stage. This URL can be shared with external tools or users who do not have direct access to Snowflake, allowing them to download the file.

Generate Pre-Signed URL:

SELECT GET_PRESIGNED_URL(@my_stage, 'file.txt');

Share the URL: The generated URL can be shared with external users or applications, enabling them to access the file directly.

References:

Snowflake Documentation: GET_PRESIGNED_URL

Snowflake Documentation: Working with Stages


What is the primary purpose of using a masking policy in Snowflake?

A. To protect sensitive data from unauthorized access when queries are run.
B. To automatically encrypt sensitive data when data is stored in Snowflake.
C. To protect multiple columns that have different data types in a given table.
D. To protect both column-level and row-level data.
Suggested answer: A

Explanation:

Masking policies in Snowflake are designed to protect sensitive information by dynamically hiding or obfuscating data based on the role of the user executing the query. This helps enforce data privacy and security by allowing only authorized users to see sensitive information. Masking policies do not encrypt data but apply rules to limit data visibility, ensuring sensitive data is protected during query execution without altering the underlying data.
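
As a minimal illustrative sketch (the policy name, role, table, and column below are hypothetical), a masking policy can be defined and attached to a column as follows:

CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val
    ELSE '*** MASKED ***'
  END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

Users whose active role is not ANALYST_FULL see the masked value at query time, while the stored data remains unchanged.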

When used with the UNLOAD command, which parameter specifies the destination of unloaded data?

A. COPY INTO <table name>
B. COPY INTO <stage name>
C. GET <file name>
D. PUT <file name>
Suggested answer: B

Explanation:

In Snowflake, unloading is performed with the COPY INTO <stage name> syntax (more generally, COPY INTO <location>), which specifies the target location where the data should be written, typically a named stage or cloud storage such as Amazon S3 or Azure Blob Storage. This command unloads data from a Snowflake table into files in the specified destination, enabling export and external storage of data. The GET and PUT commands transfer files between a local system and a stage, but they do not unload table data directly.
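
A minimal unload sketch (the stage, path, and table names are illustrative):

COPY INTO @my_unload_stage/exports/
FROM my_table
FILE_FORMAT = (TYPE = CSV);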

What happens when a table or schema with a standard retention period is dropped?

A. The object is immediately removed from the system.
B. The object is instantaneously moved to Fail-safe.
C. The object is retained but all associated data is immediately purged.
D. The object is retained for the data retention period.
Suggested answer: D

Explanation:

In Snowflake, when a table or schema is dropped, it is not immediately deleted but retained for the configured data retention period, also known as 'Time Travel.' During this period, users can use commands like UNDROP to recover the dropped object if needed. After the retention period expires, the object is then moved to Fail-safe (if applicable) for an additional seven days before being permanently removed. This feature is intended to provide data protection and recovery options in case of accidental deletions.
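
As a sketch (the table name is illustrative):

DROP TABLE my_table;
-- While the retention period has not elapsed, the dropped table can be restored:
UNDROP TABLE my_table;
-- Dropped tables still within retention can be listed with:
SHOW TABLES HISTORY LIKE 'my_table';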

How can a data provider validate that a secure view is configured to display only the data the provider wishes to expose?

A. Log in to the data consumer account and check if the secure view data is appearing as expected.
B. Create a data share for a test data consumer account and check if the secure view data is appearing as expected.
C. Query the secure view from a consumer account by setting the share_restrictions parameter.
D. Simulate querying the secure view by setting the simulated_data_sharing_consumer session parameter.
Suggested answer: B

Explanation:

The most effective way for a data provider to validate secure view configurations is to create a data share for a test data consumer account. This method allows the provider to review and confirm that only the intended data is accessible in the secure view. Secure views are designed to mask or restrict data visibility, so creating a test share replicates the consumer's experience and ensures data security before sharing with actual consumers.
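
A minimal sketch of such a test share (all object names and the consumer account identifier are hypothetical):

CREATE SHARE test_share;
GRANT USAGE ON DATABASE my_db TO SHARE test_share;
GRANT USAGE ON SCHEMA my_db.public TO SHARE test_share;
GRANT SELECT ON VIEW my_db.public.my_secure_view TO SHARE test_share;
ALTER SHARE test_share ADD ACCOUNTS = test_consumer_account;

The provider can then confirm from the test consumer account that only the intended rows and columns are visible through the secure view.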

When unloading data, which combination of parameters should be used to differentiate between empty strings and NULL values? (Select TWO).

A. ESCAPE_UNENCLOSED_FIELD
B. REPLACE_INVALID_CHARACTERS
C. FIELD_OPTIONALLY_ENCLOSED_BY
D. EMPTY_FIELD_AS_NULL
E. SKIP_BLANK_LINES
Suggested answer: C, D

Explanation:

When unloading data in Snowflake, it is essential to differentiate between empty strings and NULL values to preserve data integrity. The parameters FIELD_OPTIONALLY_ENCLOSED_BY and EMPTY_FIELD_AS_NULL are used together to address this:

FIELD_OPTIONALLY_ENCLOSED_BY: This parameter specifies the character used to enclose fields, which can differentiate between empty strings (as enclosed fields) and NULLs.

EMPTY_FIELD_AS_NULL: By setting this parameter, Snowflake interprets empty fields as NULL values when unloading data, ensuring accurate representation of NULLs versus empty strings.

These parameters are crucial when exporting data for systems that need explicit differentiation between NULL and empty string values.
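
An illustrative unload sketch (the stage, path, and table are hypothetical; the exact option values depend on the file layout you want to produce):

COPY INTO @my_stage/unload/
FROM my_table
FILE_FORMAT = (TYPE = CSV
               FIELD_OPTIONALLY_ENCLOSED_BY = '"'
               EMPTY_FIELD_AS_NULL = FALSE);
-- Together, these two file format options control whether empty strings are written
-- enclosed (for example as "") and kept distinct from NULL values in the output files.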

How does the search optimization service improve query performance?

A. By clustering the tables
B. By creating a persistent data structure
C. By using caching
D. By optimizing the use of micro-partitions
Suggested answer: B

Explanation:

The Search Optimization Service in Snowflake enhances query performance by creating a persistent data structure that enables faster access to specific data, particularly for queries with selective filters on columns not often used in clustering. This persistent structure accelerates data retrieval without depending on clustering or caching, thereby improving response times for targeted queries.

Snowflake's micro-partitioning automatically manages table structure, but search optimization allows further enhancement for certain high-frequency, specific access patterns.
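
As a sketch (the table and column names are illustrative):

ALTER TABLE my_table ADD SEARCH OPTIMIZATION;
-- Or scope the service to specific columns and lookup types:
ALTER TABLE my_table ADD SEARCH OPTIMIZATION ON EQUALITY(customer_id);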
