Snowflake SnowPro Core Practice Test - Questions Answers, Page 62

Which SQL statement will require a virtual warehouse to run?

A. SELECT COUNT(*) FROM TBL_EMPLOYEE;

B. ALTER TABLE TBL_EMPLOYEE ADD COLUMN EMP_REGION VARCHAR(20);

C. INSERT INTO TBL_EMPLOYEE(EMP_ID, EMP_NAME, EMP_SALARY, DEPT) VALUES(1,'Adam',20000,'Finance');

D. CREATE OR REPLACE TABLE TBL_EMPLOYEE ( EMP_ID NUMBER, EMP_NAME VARCHAR(30), EMP_SALARY NUMBER, DEPT VARCHAR(20) );
Suggested answer: C

Explanation:

A virtual warehouse in Snowflake is used to perform data processing tasks that require computational resources, such as queries that modify data or perform significant computation. Of the options provided:

C. INSERT INTO TBL_EMPLOYEE(EMP_ID, EMP_NAME, EMP_SALARY, DEPT) VALUES(1,'Adam',20000,'Finance'); is a data modification (DML) statement that writes new micro-partitions, so it needs the compute resources of a virtual warehouse. The other three statements are metadata-only operations: COUNT(*) on a table is answered from statistics kept in the cloud services layer, and the ALTER TABLE and CREATE TABLE statements are DDL handled entirely by that layer.

Snowflake Documentation: Understanding Virtual Warehouses
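For illustration, here is a sketch (reusing the question's TBL_EMPLOYEE table) of which statements Snowflake can serve from metadata alone and which need compute; this describes expected behavior under default settings, not output from a live account:

    -- Served by the cloud services layer; no warehouse needed:
    SELECT COUNT(*) FROM TBL_EMPLOYEE;                            -- answered from table metadata
    ALTER TABLE TBL_EMPLOYEE ADD COLUMN EMP_REGION VARCHAR(20);   -- metadata-only DDL
    CREATE OR REPLACE TABLE TBL_EMPLOYEE_COPY LIKE TBL_EMPLOYEE;  -- metadata-only DDL (hypothetical copy)

    -- Requires a running virtual warehouse; DML writes micro-partitions:
    INSERT INTO TBL_EMPLOYEE(EMP_ID, EMP_NAME, EMP_SALARY, DEPT)
      VALUES (1, 'Adam', 20000, 'Finance');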

Which statement accurately describes how a virtual warehouse functions?

A. Increasing the size of a virtual warehouse will always improve data loading performance.

B. Each virtual warehouse is an independent compute cluster that shares compute resources with other warehouses.

C. Each virtual warehouse is a compute cluster composed of multiple compute nodes allocated by Snowflake from a cloud provider.

D. All virtual warehouses share the same compute resources, so performance degradation of one warehouse can significantly affect all the other warehouses.
Suggested answer: C

Explanation:

A virtual warehouse in Snowflake is an independent compute cluster that performs data processing tasks such as executing SQL queries. Each virtual warehouse is dynamically allocated by Snowflake from the cloud provider's resources and does not share compute resources with other warehouses. This architecture ensures that the performance of one warehouse does not impact the performance of another. Adjusting the size of a virtual warehouse affects its computational power by increasing or decreasing the number of compute nodes, which can improve the performance of data processing tasks depending on the workload.

Snowflake Documentation: Understanding Virtual Warehouses
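As a minimal sketch, a warehouse is created and resized with standard DDL; the warehouse name and sizes below are illustrative:

    -- Each warehouse is its own cluster; size controls the node count.
    CREATE WAREHOUSE reporting_wh
      WAREHOUSE_SIZE = 'SMALL'
      AUTO_SUSPEND = 300        -- suspend after 5 minutes of inactivity
      AUTO_RESUME = TRUE;

    -- Each size step doubles the compute nodes; other warehouses are unaffected.
    ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'MEDIUM';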

From what stage can a Snowflake user omit the FROM clause while loading data into a table?

A. The user stage

B. The table stage

C. The internal named stage

D. The external named stage
Suggested answer: B

Explanation:

In Snowflake, the FROM clause of the COPY INTO <table> command can be omitted when the files are in the table stage, the stage Snowflake automatically allocates for each table (referenced as @%table_name). Because the table stage is implicitly tied to the target table, COPY INTO defaults to it, simplifying loads of files that have already been uploaded there without requiring the stage to be named explicitly.

Snowflake Documentation: Loading Data into Tables
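A minimal sketch, assuming a local CSV file is first uploaded to the table stage of the question's TBL_EMPLOYEE table (the local file path is hypothetical):

    -- Upload to the table stage, referenced as @%tbl_employee.
    PUT file:///tmp/employees.csv @%tbl_employee;

    -- FROM can be omitted: COPY INTO defaults to the table's own stage.
    COPY INTO tbl_employee
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);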

A Snowflake user needs to share unstructured data from an internal stage to a reporting tool that does not have Snowflake access.

Which file function should be used?

A. BUILD_SCOPED_FILE_URL

B. BUILD_STAGE_FILE_URL

C. GET_PRESIGNED_URL

D. GET_STAGE_LOCATION
Suggested answer: C

Explanation:

The GET_PRESIGNED_URL function generates a presigned URL for a file in a stage, giving temporary, direct HTTPS access to the file without any Snowflake session. This makes it the right choice for sharing unstructured data from an internal stage with a reporting tool or other external application that cannot authenticate to Snowflake. By contrast, the URLs produced by BUILD_SCOPED_FILE_URL and BUILD_STAGE_FILE_URL both require the caller to authenticate to Snowflake.

Snowflake Documentation: Generating Presigned URLs
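A sketch of the call, assuming a hypothetical internal stage named reports_stage holding a file quarterly.pdf; the third argument is the URL's validity period in seconds:

    -- Internal stages generally need server-side encryption for URL access.
    CREATE STAGE reports_stage ENCRYPTION = (TYPE = 'SNOWFLAKE_SSE');

    -- Generate a temporary HTTPS URL valid for one hour.
    SELECT GET_PRESIGNED_URL(@reports_stage, 'quarterly.pdf', 3600) AS file_url;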

A Snowflake account administrator has set the resource monitors as shown in the diagram, with actions defined for each resource monitor as 'Notify & Suspend Immediately'.

What is the MAXIMUM limit of credits that Warehouse 2 can consume?

A. 0

B. 1500

C. 3500

D. 5000
Suggested answer: B

Explanation:

In the exhibit, Warehouse 2 is assigned to Resource Monitor 2, which has a credit quota of 1000 credits, while Resource Monitor 1 is set at the account level with a credit quota of 500 credits. Since account-level credits are also consumed by the warehouses in the account, the most Warehouse 2 could draw is the sum of the two quotas:

1000 credits governed by Resource Monitor 2

500 credits governed by the account-level Resource Monitor 1

Because each monitor is configured to 'Notify & Suspend Immediately' when its quota is reached, Warehouse 2 is suspended at these limits and cannot exceed a combined total of 1500 credits. The correct answer is therefore B, 1500.

Snowflake Documentation on Resource Monitors: Managing Resource Monitors
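A sketch of how such a configuration could be defined; the monitor and warehouse names are illustrative, and the quotas follow the explanation above:

    -- Account-level monitor (500 credits), 'Notify & Suspend Immediately'.
    CREATE RESOURCE MONITOR resource_monitor_1 WITH CREDIT_QUOTA = 500
      TRIGGERS ON 100 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND_IMMEDIATE;
    ALTER ACCOUNT SET RESOURCE_MONITOR = resource_monitor_1;

    -- Warehouse-level monitor (1000 credits) assigned to Warehouse 2.
    CREATE RESOURCE MONITOR resource_monitor_2 WITH CREDIT_QUOTA = 1000
      TRIGGERS ON 100 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND_IMMEDIATE;
    ALTER WAREHOUSE warehouse_2 SET RESOURCE_MONITOR = resource_monitor_2;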

What is a feature of column-level security in Snowflake?

A. Role access policies

B. Network policies

C. Internal tokenization

D. External tokenization
Suggested answer: D

Explanation:

Column-level security in Snowflake consists of two features, both applied through masking policies: Dynamic Data Masking and External Tokenization. External Tokenization lets data be tokenized before it is loaded into Snowflake and detokenized at query runtime by a masking policy that calls an external function, so only authorized roles see the original values. 'Role access policies' is not a Snowflake feature (Snowflake provides row access policies, which implement row-level rather than column-level security), and network policies restrict client connections rather than access to column data.

Snowflake Documentation: Implementing Column-level Security
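Both column-level security features are applied through masking policies. The following is a minimal Dynamic Data Masking sketch (the role, table, and policy names are illustrative); an External Tokenization policy would instead call an external detokenization function inside the policy body:

    -- Reveal salaries only to the PAYROLL role; mask for everyone else.
    CREATE MASKING POLICY salary_mask AS (val NUMBER) RETURNS NUMBER ->
      CASE WHEN CURRENT_ROLE() = 'PAYROLL' THEN val ELSE NULL END;

    -- Attach the policy to a column.
    ALTER TABLE tbl_employee MODIFY COLUMN emp_salary
      SET MASKING POLICY salary_mask;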

What operations can be performed while loading a simple CSV file into a Snowflake table using the COPY INTO command? (Select TWO).

A. Performing aggregate calculations

B. Reordering the columns

C. Grouping by operations

D. Converting the datatypes

E. Selecting the first few rows
Suggested answer: B, D

Explanation:

When loading a simple CSV file into a Snowflake table using the COPY INTO command, you can perform various transformations and adjustments on the data as part of the loading process. Specifically, you can:

B. Reordering the columns: the COPY INTO statement can reference staged file fields by position ($1, $2, ...) in any order, so the load can match a target table whose column order differs from the source CSV.

D. Converting the datatypes: the data being loaded can be explicitly cast so that it matches the datatypes of the target table's columns when the source format differs from what the table expects.

Aggregations, GROUP BY, and row-limiting clauses are not supported in COPY transformations, which rules out the other options.

Snowflake Documentation: Using the COPY INTO Command for Data Loading
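A sketch showing both permitted operations in one statement, assuming a staged CSV whose fields arrive in the order name, id, dept, salary (the stage and table names are illustrative):

    COPY INTO tbl_employee (emp_id, emp_name, emp_salary, dept)
    FROM (
      SELECT
        t.$2::NUMBER,         -- reorder: the id is the file's second field
        t.$1,                 -- the name is the first field
        t.$4::NUMBER(10,2),   -- convert the datatype during the load
        t.$3
      FROM @my_stage t
    )
    FILE_FORMAT = (TYPE = 'CSV');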

What consideration should be made when loading data into Snowflake?

A. Create small data files and stage them in cloud storage frequently.

B. Create large data files to maximize the processing overhead for each file.

C. The number of load operations that run in parallel can exceed the number of data files to be loaded.

D. The number of data files that are processed in parallel is determined by the virtual warehouse.
Suggested answer: D

Explanation:

When loading data into Snowflake, a key consideration is the parallelism of the virtual warehouse running the load. The number of data files that can be processed in parallel is determined by the size, and therefore the number of compute nodes, of the warehouse: a larger warehouse ingests more files simultaneously. Sizing the warehouse to match the number and size of the files to be loaded can significantly improve loading throughput; Snowflake also recommends splitting very large inputs into multiple files of roughly 100-250 MB compressed so the warehouse can parallelize the work.

Snowflake Documentation: Optimizing Data Loading
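As a guideline sketch (the file names, warehouse name, and sizes are illustrative), splitting input into multiple moderately sized files lets the warehouse's nodes load them in parallel:

    -- Stage several files of roughly 100-250 MB compressed; PUT can
    -- parallelize its upload threads.
    PUT file:///tmp/load/employees_*.csv @%tbl_employee PARALLEL = 8;

    -- A larger warehouse processes more staged files concurrently.
    ALTER WAREHOUSE load_wh SET WAREHOUSE_SIZE = 'LARGE';
    COPY INTO tbl_employee FILE_FORMAT = (TYPE = 'CSV');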

Which Snowflake feature can be used to find sensitive data in a table or column?

A. Masking policies

B. Data classification

C. Row level policies

D. External functions
Suggested answer: B

Explanation:

Data classification in Snowflake is a feature that allows organizations to identify and categorize data stored in tables or columns based on its sensitivity level or content type. This feature can be used to find sensitive data within the database by classifying data as confidential, personal, public, etc., making it easier to apply appropriate security measures, such as masking policies or row-level security, to protect sensitive information.

Snowflake Documentation: Data Classification
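A sketch using Snowflake's classification functions (the fully qualified table name is illustrative, and newer accounts may expose this capability through SYSTEM$CLASSIFY instead): EXTRACT_SEMANTIC_CATEGORIES analyzes the columns, and ASSOCIATE_SEMANTIC_CATEGORY_TAGS applies the resulting system tags:

    -- Detect likely semantic and privacy categories (e.g., NAME, SALARY).
    SELECT EXTRACT_SEMANTIC_CATEGORIES('MYDB.HR.TBL_EMPLOYEE');

    -- Tag the columns with the detected categories.
    CALL ASSOCIATE_SEMANTIC_CATEGORY_TAGS(
      'MYDB.HR.TBL_EMPLOYEE',
      EXTRACT_SEMANTIC_CATEGORIES('MYDB.HR.TBL_EMPLOYEE'));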

What can the Snowflake SCIM API be used to manage? (Select TWO).

A. Integrations

B. Network policies

C. Session policies

D. Roles

E. Users
Suggested answer: D, E

Explanation:

The Snowflake SCIM (System for Cross-domain Identity Management) API is used for automated user and role management. It enables integration with identity providers (IdPs) for the provisioning and deprovisioning of user accounts and roles in Snowflake. This helps in managing access control and permissions systematically and aligns with identity governance practices.

Snowflake Documentation: Managing Users and Roles with SCIM API
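SCIM provisioning is enabled by creating a security integration, after which the identity provider calls Snowflake's SCIM REST endpoint to manage users and roles; a sketch using Okta as the IdP (the provisioner role name is illustrative):

    -- Role that will own SCIM-provisioned users and roles.
    CREATE ROLE IF NOT EXISTS okta_provisioner;
    GRANT CREATE USER ON ACCOUNT TO ROLE okta_provisioner;
    GRANT CREATE ROLE ON ACCOUNT TO ROLE okta_provisioner;

    -- SCIM security integration for the identity provider.
    CREATE SECURITY INTEGRATION okta_scim
      TYPE = SCIM
      SCIM_CLIENT = 'OKTA'
      RUN_AS_ROLE = 'OKTA_PROVISIONER';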

