Snowflake SnowPro Core Practice Test - Questions Answers, Page 16
Question 151

Which statement is true about running tasks in Snowflake?
Explanation:
In Snowflake, a task allows a user to execute a single SQL statement/command using a predefined schedule (B). Tasks are used to automate the execution of SQL statements at scheduled intervals.
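A minimal sketch of such a task (warehouse, table, and task names here are illustrative, not from the question):

```sql
-- Hypothetical task running one SQL statement on a predefined schedule.
CREATE TASK refresh_sales_daily
  WAREHOUSE = my_wh
  SCHEDULE  = '60 MINUTE'
AS
  INSERT INTO sales_daily
    SELECT * FROM sales_raw
    WHERE loaded_at >= DATEADD(hour, -1, CURRENT_TIMESTAMP());

-- Tasks are created in a suspended state; resume to start the schedule.
ALTER TASK refresh_sales_daily RESUME;
```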
Question 152

In an auto-scaling multi-cluster virtual warehouse with the setting SCALING_POLICY = ECONOMY enabled, when is another cluster started?
Explanation:
In an auto-scaling multi-cluster virtual warehouse with the SCALING_POLICY set to ECONOMY, another cluster is started when the system has enough load for 2 minutes (A). This policy is designed to optimize the balance between performance and cost, starting additional clusters only when the sustained load justifies it.
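The scaling policy is set on the warehouse itself; a sketch of a multi-cluster warehouse definition using ECONOMY (warehouse name and sizes are illustrative):

```sql
CREATE WAREHOUSE my_mc_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY    = 'ECONOMY';  -- favor cost: add clusters only under sustained load
```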
Question 153

Which of the following describes a Snowflake stored procedure?
Explanation:
Snowflake stored procedures can be created to execute with the privileges of the role that owns the procedure (owner's rights) or with the privileges of the role that calls the procedure (caller's rights). This allows for flexibility in managing security and access control within Snowflake.
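The choice is made with the EXECUTE AS clause at creation time; a minimal SQL-scripting sketch (procedure and table names are hypothetical):

```sql
CREATE OR REPLACE PROCEDURE cleanup_old_rows()
  RETURNS STRING
  LANGUAGE SQL
  EXECUTE AS OWNER   -- owner's rights; use EXECUTE AS CALLER for caller's rights
AS
$$
BEGIN
  -- Runs with the privileges of the procedure's owning role.
  DELETE FROM audit_log WHERE event_time < DATEADD(day, -90, CURRENT_TIMESTAMP());
  RETURN 'done';
END;
$$;
```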
Question 154

Which columns are part of the result set of the Snowflake LATERAL FLATTEN command? (Choose two.)
Explanation:
The LATERAL FLATTEN command in Snowflake produces a result set that includes several columns, among which are PATH and INDEX. PATH indicates the path to the element within the data structure being flattened, and INDEX represents the index of the element if it is an array.
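A small sketch showing those columns when flattening an inline JSON array:

```sql
SELECT f.path, f.index, f.value
FROM (SELECT PARSE_JSON('["a","b","c"]') AS arr) t,
     LATERAL FLATTEN(input => t.arr) f;
-- The FLATTEN result set also exposes SEQ, KEY, and THIS columns.
```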
Question 155

Which Snowflake function will interpret an input string as a JSON document, and produce a VARIANT value?
Explanation:
The PARSE_JSON() function in Snowflake interprets an input string as a JSON document and produces a VARIANT value containing the JSON document. This function is specifically designed for parsing strings that contain valid JSON information.
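A quick sketch of PARSE_JSON producing a VARIANT, which can then be navigated with path notation:

```sql
SELECT PARSE_JSON('{"name": "Ada", "tags": ["x", "y"]}') AS v;

-- The resulting VARIANT supports path navigation and casting:
SELECT PARSE_JSON('{"name": "Ada"}'):name::STRING AS name_value;
```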
Question 156

How are serverless features billed?
Explanation:
Serverless features in Snowflake are billed based on the time they are used, measured in minutes. The cost is calculated by multiplying the duration of the job by an automatic sizing determined by Snowflake, with a minimum billing increment of one minute. This means that even if a serverless feature is used for less than a minute, it will still be billed for the full minute.
Question 157

Which Snowflake architectural layer is responsible for a query execution plan?
Explanation:
In Snowflake's architecture, the Cloud Services layer is responsible for generating the query execution plan. This layer handles all the coordination, optimization, and management tasks, including query parsing, optimization, and compilation into an execution plan that can be processed by the Compute layer.
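The plan that the Cloud Services layer produces can be inspected without running the query, using EXPLAIN (table names here are illustrative):

```sql
EXPLAIN
SELECT c.name, SUM(o.amount)
FROM customers c
JOIN orders o ON o.customer_id = c.id
GROUP BY c.name;
```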
Question 158

Which SQL commands, when committed, will consume a stream and advance the stream offset? (Choose two.)
Explanation:
The SQL commands that consume a stream and advance the stream offset are DML statements that read from the stream within a transaction. Specifically, 'UPDATE ... FROM stream' and 'INSERT INTO ... SELECT ... FROM stream' will consume the stream: when the transaction commits, the offset moves forward past the change records that were read.
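A sketch of the behavior (stream and table names are hypothetical):

```sql
-- Track changes on the orders table.
CREATE STREAM orders_stream ON TABLE orders;

-- DML that reads the stream consumes it: on commit, the offset advances
-- past the change records this statement read.
INSERT INTO orders_history
  SELECT * FROM orders_stream;

-- A plain SELECT on a stream does NOT advance the offset.
SELECT COUNT(*) FROM orders_stream;
```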
Question 159

Which methods can be used to delete staged files from a Snowflake stage? (Choose two.)
Explanation:
To delete staged files from a Snowflake stage, you can specify the PURGE option in the COPY INTO <table> command, which will automatically delete the files after they have been successfully loaded. Additionally, you can use the REMOVE command after the load completes to manually delete the files from the stage.
Reference = DROP STAGE, REMOVE
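A sketch of both methods (stage, path, and table names are illustrative):

```sql
-- Method 1: delete files automatically after a successful load.
COPY INTO my_table
  FROM @my_stage/data/
  FILE_FORMAT = (TYPE = 'CSV')
  PURGE = TRUE;

-- Method 2: delete files manually after the load completes.
REMOVE @my_stage/data/;
```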
Question 160

Assume there is a table consisting of five micro-partitions with values ranging from A to Z.
Which diagram indicates a well-clustered table?
Explanation:
A well-clustered table in Snowflake means that the data is organized in such a way that related data points are stored close to each other within the micro-partitions. This optimizes query performance by reducing the amount of scanned data. The diagram indicated by option C shows a well-clustered table, as it likely represents a more evenly distributed range of values across the micro-partitions.
Reference = Snowflake Micro-partitions & Table Clustering
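Clustering quality can be checked programmatically; a sketch using Snowflake's system function (table and column names are hypothetical):

```sql
-- Returns clustering depth and overlap statistics for the given column(s);
-- lower average depth indicates better-clustered micro-partitions.
SELECT SYSTEM$CLUSTERING_INFORMATION('my_table', '(order_date)');
```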
Question