
Snowflake SnowPro Core Practice Test - Questions Answers, Page 16


Question 151


Which statement is true about running tasks in Snowflake?

A. A task can be called using a CALL statement to run a set of predefined SQL commands.
B. A task allows a user to execute a single SQL statement/command using a predefined schedule.
C. A task allows a user to execute a set of SQL commands on a predefined schedule.
D. A task can be executed using a SELECT statement to run a predefined SQL command.
Suggested answer: B

Explanation:

In Snowflake, a task executes a single SQL statement or command on a predefined schedule (B). That single statement can be a CALL to a stored procedure, which is how multi-statement logic gets scheduled, but the task definition itself contains exactly one statement.
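As a sketch of the behavior described in option B, a minimal task definition (the warehouse, task, and table names are illustrative placeholders):

```sql
-- A task wraps exactly one SQL statement and runs it on a schedule.
-- my_wh, daily_rollup, summary, and raw_events are placeholder names.
CREATE TASK daily_rollup
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'   -- every day at 02:00 UTC
AS
  INSERT INTO summary
    SELECT event_date, COUNT(*) FROM raw_events GROUP BY event_date;

-- Tasks are created in a suspended state; resume to start the schedule.
ALTER TASK daily_rollup RESUME;
```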

asked 23/09/2024
Vilfride Lutumba
41 questions

Question 152


In an auto-scaling multi-cluster virtual warehouse with the setting SCALING_POLICY = ECONOMY enabled, when is another cluster started?

A. When the system has enough load for 2 minutes
B. When the system has enough load for 6 minutes
C. When the system has enough load for 8 minutes
D. When the system has enough load for 10 minutes
Suggested answer: B

Explanation:

With SCALING_POLICY = ECONOMY, Snowflake starts an additional cluster only if it estimates there is enough query load to keep that cluster busy for at least 6 minutes (B). This policy favors conserving credits over minimizing queuing, starting additional clusters only when the sustained load justifies them.
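A minimal sketch of a multi-cluster warehouse with this policy (the warehouse name, size, and cluster counts are illustrative):

```sql
-- Auto-scaling multi-cluster warehouse; ECONOMY favors credit
-- conservation over immediately starting clusters under load.
CREATE WAREHOUSE IF NOT EXISTS etl_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY    = 'ECONOMY';
```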


Question 153


Which of the following describes a Snowflake stored procedure?

A. They can be created as secure and hide the underlying metadata from the user.
B. They can only access tables from a single database.
C. They can contain only a single SQL statement.
D. They can be created to run with a caller's rights or an owner's rights.
Suggested answer: D

Explanation:

Snowflake stored procedures can be created to execute with the privileges of the role that owns the procedure (owner's rights) or with the privileges of the role that calls it (caller's rights). This allows flexibility in managing security and access control within Snowflake.
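The EXECUTE AS clause is where this choice is made. A minimal Snowflake Scripting sketch (procedure and table names are illustrative):

```sql
-- EXECUTE AS controls whose privileges the procedure runs with:
-- CALLER uses the invoking role; OWNER (the default) uses the owning role.
CREATE OR REPLACE PROCEDURE purge_old_audit_rows()
  RETURNS STRING
  LANGUAGE SQL
  EXECUTE AS CALLER
AS
$$
BEGIN
  DELETE FROM audit_log
    WHERE event_ts < DATEADD(day, -90, CURRENT_TIMESTAMP());
  RETURN 'done';
END;
$$;
```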


Question 154


Which columns are part of the result set of the Snowflake LATERAL FLATTEN command? (Choose two.)

A. CONTENT
B. PATH
C. BYTE_SIZE
D. INDEX
E. DATATYPE
Suggested answer: B, D

Explanation:

The LATERAL FLATTEN command in Snowflake produces a result set containing the columns SEQ, KEY, PATH, INDEX, VALUE, and THIS; of the options listed, PATH and INDEX are among them. PATH gives the path to the element within the data structure being flattened, and INDEX gives the element's position when the element is in an array (it is NULL otherwise).
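A small query showing these columns in the output (the JSON literal is illustrative):

```sql
-- FLATTEN emits SEQ, KEY, PATH, INDEX, VALUE, and THIS per element.
SELECT f.path, f.index, f.value
FROM (SELECT PARSE_JSON('{"tags": ["a", "b"]}') AS doc) t,
     LATERAL FLATTEN(input => t.doc:tags) f;
-- INDEX is 0 and 1 for the two array elements; PATH identifies each
-- element's location within the flattened input.
```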


Question 155


Which Snowflake function will interpret an input string as a JSON document, and produce a VARIANT value?

A. parse_json()
B. json_extract_path_text()
C. object_construct()
D. flatten
Suggested answer: A

Explanation:

The parse_json() function in Snowflake interprets an input string as a JSON document and produces a VARIANT value containing that document. It is specifically designed for parsing strings that contain valid JSON.
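A quick illustration; the resulting VARIANT can be traversed with the : and :: operators:

```sql
-- PARSE_JSON turns a JSON string into a VARIANT value.
SELECT PARSE_JSON('{"name": "snowflake", "year": 2012}')               AS v,
       PARSE_JSON('{"name": "snowflake", "year": 2012}'):name::STRING AS name;
-- TRY_PARSE_JSON behaves the same but returns NULL instead of
-- raising an error on invalid JSON input.
```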


Question 156


How are serverless features billed?

A. Per second multiplied by an automatic sizing for the job
B. Per minute multiplied by an automatic sizing for the job, with a minimum of one minute
C. Per second multiplied by the size, as determined by the SERVERLESS_FEATURES_SIZE account parameter
D. Serverless features are not billed, unless the total cost for the month exceeds 10% of the warehouse credits, on the account
Suggested answer: B

Explanation:

Serverless features in Snowflake are billed based on the time they are used, measured in minutes. The cost is calculated by multiplying the duration of the job by an automatic sizing determined by Snowflake, with a minimum billing increment of one minute. This means that even if a serverless feature is used for less than a minute, it will still be billed for the full minute.
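One way to observe serverless consumption after the fact, assuming access to the SNOWFLAKE database's ACCOUNT_USAGE schema (view and column names per that schema; this sketch covers serverless tasks specifically):

```sql
-- Credits consumed by serverless tasks, aggregated by day.
SELECT DATE_TRUNC('day', start_time) AS usage_day,
       SUM(credits_used)             AS credits
FROM SNOWFLAKE.ACCOUNT_USAGE.SERVERLESS_TASK_HISTORY
GROUP BY usage_day
ORDER BY usage_day;
```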


Question 157


Which Snowflake architectural layer is responsible for a query execution plan?

A. Compute
B. Data storage
C. Cloud services
D. Cloud provider
Suggested answer: C

Explanation:

In Snowflake's architecture, the Cloud Services layer is responsible for generating the query execution plan. This layer handles all the coordination, optimization, and management tasks, including query parsing, optimization, and compilation into an execution plan that can be processed by the Compute layer.
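You can see the plan the cloud services layer produces with EXPLAIN, which compiles the query without executing it (table and column names below are illustrative):

```sql
-- EXPLAIN returns the execution plan compiled by the cloud services
-- layer; no warehouse compute is used to run the query itself.
EXPLAIN
SELECT c.name, SUM(o.amount)
FROM orders o
JOIN customers c ON o.customer_id = c.id
GROUP BY c.name;
```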


Question 158


Which SQL commands, when committed, will consume a stream and advance the stream offset? (Choose two.)

A. UPDATE TABLE FROM STREAM
B. SELECT FROM STREAM
C. INSERT INTO TABLE SELECT FROM STREAM
D. ALTER TABLE AS SELECT FROM STREAM
E. BEGIN COMMIT
Suggested answer: A, C

Explanation:

A stream's offset advances only when the stream is read by a DML statement inside a committed transaction. Of the options, 'UPDATE TABLE FROM STREAM' (A) and 'INSERT INTO TABLE SELECT FROM STREAM' (C) are DML operations, so committing them consumes the stream and moves its offset forward. A plain SELECT from the stream (B) returns the pending change records but does not advance the offset.
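The consuming-versus-reading distinction can be sketched as follows (table and stream names are illustrative):

```sql
-- A stream's offset advances only when the stream is read by DML
-- inside a committed transaction.
CREATE STREAM orders_stream ON TABLE orders;

BEGIN;
INSERT INTO orders_history       -- DML reading the stream
  SELECT * FROM orders_stream;
COMMIT;                          -- offset advances at commit

SELECT * FROM orders_stream;     -- plain SELECT: offset unchanged
```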


Question 159


Which methods can be used to delete staged files from a Snowflake stage? (Choose two.)

A. Use the DROP <file> command after the load completes.
B. Specify the TEMPORARY option when creating the file format.
C. Specify the PURGE copy option in the COPY INTO <table> command.
D. Use the REMOVE command after the load completes.
E. Use the DELETE LOAD HISTORY command after the load completes.
Suggested answer: C, D

Explanation:

To delete staged files from a Snowflake stage, you can specify the PURGE copy option in the COPY INTO <table> command, which automatically deletes the files after they have been successfully loaded. Alternatively, you can run the REMOVE command after the load completes to delete the files from the stage manually.

Reference: COPY INTO <table> (PURGE copy option), REMOVE
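Both methods in one sketch (stage, table, and path names are illustrative):

```sql
-- Option C: delete files automatically after a successful load.
COPY INTO my_table
  FROM @my_stage/data/
  FILE_FORMAT = (TYPE = CSV)
  PURGE = TRUE;

-- Option D: delete the staged files explicitly after the load.
REMOVE @my_stage/data/;
```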


Question 160


Assume there is a table consisting of five micro-partitions with values ranging from A to Z.

Which diagram indicates a well-clustered table?

A. Option A
B. Option B
C. Option C
D. Option D
Suggested answer: C

Explanation:

A well-clustered table in Snowflake stores similar values close together, so each micro-partition covers a narrow, largely non-overlapping range of the clustering key; queries can then prune entire micro-partitions and scan less data. Of the diagrams, option C shows micro-partitions whose value ranges across A to Z overlap the least, which is what a well-clustered table looks like.

Reference: Snowflake Micro-partitions & Table Clustering
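Snowflake exposes clustering quality directly, which is the programmatic equivalent of reading such a diagram (table and column names are illustrative):

```sql
-- Reports clustering quality for the given columns; lower average
-- depth and fewer overlapping micro-partitions indicate a
-- well-clustered table.
SELECT SYSTEM$CLUSTERING_INFORMATION('my_table', '(order_date)');
```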
