Snowflake ADA-C01 Practice Test - Questions Answers, Page 2

A virtual warehouse report_wh is configured with AUTO_RESUME=TRUE and AUTO_SUSPEND=300. A user has been granted the role accountant.

An application with the accountant role should use this warehouse to run financial reports, and should keep track of compute credits used by the warehouse.

What minimal privileges on the warehouse should be granted to the role to meet the requirements for the application? (Select TWO).

A. OPERATE
B. MODIFY
C. MONITOR
D. USAGE
E. OWNERSHIP
Suggested answer: C, D

Explanation:

According to the Snowflake documentation1, the MONITOR privilege on a warehouse grants the ability to view the warehouse usage and performance metrics, such as the number of credits consumed, the average and maximum run time, and the number of queries executed. The USAGE privilege on a warehouse grants the ability to use the warehouse to execute queries and load data. Therefore, the minimal privileges on the warehouse that should be granted to the role to meet the requirements for the application are MONITOR and USAGE. Option A is incorrect because the OPERATE privilege on a warehouse grants the ability to start, stop, resume, and suspend the warehouse, which is not required for the application. Option B is incorrect because the MODIFY privilege on a warehouse grants the ability to alter the warehouse properties, such as the size, auto-suspend, and auto-resume settings, which is not required for the application. Option E is incorrect because the OWNERSHIP privilege on a warehouse grants the ability to drop the warehouse, grant or revoke privileges on the warehouse, and transfer the ownership to another role, which is not required for the application.
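Under the suggested answer, the grants can be sketched as follows (a minimal sketch using the warehouse and role names from the question; the metering query is one way the application could track credits once it holds MONITOR):

```sql
-- USAGE lets the application run queries on the warehouse;
-- AUTO_RESUME=TRUE resumes it automatically, so OPERATE is not needed.
-- MONITOR lets the application view credit consumption.
GRANT USAGE, MONITOR ON WAREHOUSE report_wh TO ROLE accountant;

-- Example credit-tracking query available with the MONITOR privilege:
SELECT start_time, end_time, credits_used
FROM TABLE(INFORMATION_SCHEMA.WAREHOUSE_METERING_HISTORY(
  DATEADD('day', -7, CURRENT_DATE()), CURRENT_DATE(), 'REPORT_WH'));
```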

What is required for stages, without credentials, to limit data exfiltration after a storage integration and associated stages are created?

A. ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_CREATION = true; ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_OPERATION = true; ALTER ACCOUNT my_account SET PREVENT_UNLOAD_TO_INLINE_URL = false;
B. ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_CREATION = false; ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_OPERATION = false; ALTER ACCOUNT my_account SET PREVENT_UNLOAD_TO_INLINE_URL = true;
C. ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_CREATION = false; ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_OPERATION = false; ALTER ACCOUNT my_account SET PREVENT_UNLOAD_TO_INLINE_URL = false;
D. ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_CREATION = true; ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_OPERATION = true; ALTER ACCOUNT my_account SET PREVENT_UNLOAD_TO_INLINE_URL = true;
Suggested answer: D

Explanation:

According to the Snowflake documentation1, stages without credentials are a way to create external stages that use storage integrations to access data files in cloud storage without providing any credentials to Snowflake. Storage integrations are objects that define a trust relationship between Snowflake and a cloud provider, allowing Snowflake to authenticate and authorize access to the cloud storage. To limit data exfiltration after a storage integration and associated stages are created, the following account-level parameters can be set:

* REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_CREATION: This parameter enforces that all external stages must be created using a storage integration. This prevents users from creating external stages with inline credentials or URLs that point to unauthorized locations.

* REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_OPERATION: This parameter enforces that all operations on external stages, such as PUT, GET, COPY, and LIST, must use a storage integration. This prevents users from performing operations on external stages with inline credentials or URLs that point to unauthorized locations.

* PREVENT_UNLOAD_TO_INLINE_URL: This parameter prevents users from unloading data from Snowflake tables to inline URLs that do not use a storage integration. This prevents users from exporting data to unauthorized locations.

Therefore, the correct answer is option D, which sets all these parameters to true. Option A is incorrect because it sets PREVENT_UNLOAD_TO_INLINE_URL to false, which allows users to unload data to inline URLs that do not use a storage integration. Option B is incorrect because it sets both REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_CREATION and REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_OPERATION to false, which allows users to create and operate on external stages without using a storage integration. Option C is incorrect because it sets all the parameters to false, which does not enforce any restrictions on data exfiltration.
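Formatted one statement per line, the option D settings read:

```sql
-- All external stages must be created with a storage integration
ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_CREATION = true;
-- All stage operations (PUT, GET, COPY, LIST) must go through a storage integration
ALTER ACCOUNT my_account SET REQUIRE_STORAGE_INTEGRATION_FOR_STAGE_OPERATION = true;
-- Block unloading table data to ad hoc inline URLs
ALTER ACCOUNT my_account SET PREVENT_UNLOAD_TO_INLINE_URL = true;
```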

A Snowflake Administrator has a multi-cluster virtual warehouse and is using the Snowflake Business Critical edition. The minimum number of clusters is set to 2 and the maximum number of clusters is set to 10. This configuration works well for the standard workload, rarely exceeding 5 running clusters. However, once a month the Administrator notes that there are a few complex long-running queries that are causing increased queue time and the warehouse reaches its maximum limit at 10 clusters.

Which solutions will address the issues happening once a month? (Select TWO).

A. Use a task to increase the cluster size for the time period that the more complex queries are running and another task to reduce the size of the cluster once the complex queries complete.
B. Have the group running the complex monthly queries use a separate appropriately-sized warehouse to support their workload.
C. Increase the multi-cluster maximum to 20 or more clusters.
D. Examine the complex queries and determine if they can be made more efficient using clustering keys or materialized views.
E. Increase the minimum number of clusters started in the multi-cluster configuration to 5.
Suggested answer: A, B

Explanation:

According to the Snowflake documentation1, a multi-cluster warehouse consists of multiple clusters of compute resources that scale out automatically to handle the concurrency and performance needs of the queries submitted to the warehouse, between a minimum and maximum number of clusters specified by the administrator.

Option A addresses the monthly issue: one task can increase the warehouse size for the period when the more complex queries run, and another task can reduce it once they complete. The warehouse then has more resources available for the complex queries without reaching the 10-cluster maximum, and returns to the normal size afterwards to save costs.

Option B also addresses it: the group running the complex monthly queries can use a separate, appropriately-sized warehouse, which isolates those queries from the standard workload and avoids queue time and resource contention.

Option C is not recommended, as it would increase the cost and complexity of managing the multi-cluster warehouse without solving the underlying problem of inefficient queries. Option D is good practice for improving query performance, but it is not a direct solution, since optimizing the complex queries with clustering keys or materialized views may not be feasible or effective in all cases. Option E is not recommended either, as it would increase costs and waste resources by starting more clusters than the standard workload needs.
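Option A could be implemented with a pair of scheduled tasks along these lines (a sketch only: the task names, CRON schedules, and warehouse sizes are illustrative assumptions, and serverless tasks are assumed so no separate task warehouse is configured):

```sql
-- Scale up shortly before the monthly complex queries start
CREATE TASK scale_up_report_wh
  SCHEDULE = 'USING CRON 0 1 1 * * UTC'  -- 01:00 UTC on the 1st (assumed window)
AS
  ALTER WAREHOUSE report_wh SET WAREHOUSE_SIZE = 'XXLARGE';

-- Scale back down after the monthly window closes
CREATE TASK scale_down_report_wh
  SCHEDULE = 'USING CRON 0 6 1 * * UTC'  -- 06:00 UTC on the 1st (assumed window)
AS
  ALTER WAREHOUSE report_wh SET WAREHOUSE_SIZE = 'MEDIUM';

-- Tasks are created suspended; resume them to activate the schedules
ALTER TASK scale_up_report_wh RESUME;
ALTER TASK scale_down_report_wh RESUME;
```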

Which masking policy will mask a column whenever it is queried through a view owned by a role named MASKED_VIEW_ROLE?

A. create or replace masking policy maskString as (val string) returns string -> case when is_role_in_session('MASKED_VIEW_ROLE') then '*****' else val end;
B. create or replace masking policy maskString as (val string) returns string -> case when array_contains('MASKED_VIEW_ROLE'::variant, parse_json(current_available_roles())) then '*****' else val end;
C. create or replace masking policy maskString as (val string) returns string -> case when invoker_role() in ('MASKED_VIEW_ROLE') then '*****' else val end;
D. create or replace masking policy maskString as (val string) returns string -> case when current_role() in ('MASKED_VIEW_ROLE') then '*****' else val end;
Suggested answer: A

Explanation:

A masking policy is a SQL expression that transforms the data in a column based on the role that queries the column1. The is_role_in_session function returns true if the specified role is in the current session2. Therefore, the masking policy in option A will mask the column data with asterisks whenever it is queried through a view owned by the MASKED_VIEW_ROLE3. The other options use different functions that do not check the ownership of the view, but rather the current role, the invoker role, or the available roles in the session45. These functions may not return the desired result if the role that owns the view is different from the role that queries the view.
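For the policy to take effect, it must also be attached to a column; a minimal sketch (the table and column names here are hypothetical):

```sql
-- Hypothetical table and column; once attached, the policy masks the value
-- whenever the query reaches it through a view owned by MASKED_VIEW_ROLE
ALTER TABLE finance.payroll MODIFY COLUMN salary
  SET MASKING POLICY maskString;
```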

What session parameter can be used to test the integrity of secure views based on the account that is accessing that view?

A. MIMIC_CONSUMER_ACCOUNT
B. TEST_ACCOUNT_ID
C. PRODUCER_TEST_ACCT
D. SIMULATED_DATA_SHARING_CONSUMER
Suggested answer: D

Explanation:

The SIMULATED_DATA_SHARING_CONSUMER session parameter allows a data provider to test the integrity of secure views based on the account that is accessing that view2. By setting this parameter to the name of the consumer account, the data provider can query the secure view and see the results that a user in the consumer account will see2. This helps to ensure that sensitive data in a shared database is not exposed to unauthorized users1. The other options are not valid session parameters in Snowflake3
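A provider-side test might look like this (the consumer account name and view name are hypothetical):

```sql
-- See the secure view's results as the named consumer account would
ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'consumer_acct';
SELECT * FROM shared_db.public.secure_sales_view;

-- Return to the normal provider context when testing is done
ALTER SESSION UNSET SIMULATED_DATA_SHARING_CONSUMER;
```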

A user has enrolled in Multi-factor Authentication (MFA) for connecting to Snowflake. The user informs the Snowflake Administrator that they lost their mobile phone the previous evening.

Which step should the Administrator take to allow the user to log in to the system, without revoking their MFA enrollment?

A. Alter the user and set MINS_TO_BYPASS_MFA to a value that will disable MFA long enough for the user to log in.
B. Alter the user and set DISABLE_MFA to true, which will suspend the MFA requirement for 24 hours.
C. Instruct the user to connect to Snowflake using SnowSQL, which does not support MFA authentication.
D. Instruct the user to append the normal URL with /?mode=mfa_bypass&code= to log on.
Suggested answer: A

Explanation:

The MINS_TO_BYPASS_MFA property allows the account administrator to temporarily disable MFA for a user who has lost their phone or changed their phone number1. The user can log in without MFA for the specified number of minutes, and then re-enroll in MFA using their new phone1. This does not revoke their MFA enrollment, unlike the DISABLE_MFA property, which cancels their enrollment and requires them to re-enroll from scratch1. The other options are not valid ways to bypass MFA, as SnowSQL does support MFA authentication2, and there is no such URL parameter as /?mode=mfa_bypass&code= for Snowflake3
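A sketch of the fix (the user name and the number of minutes are illustrative):

```sql
-- Let user JSMITH log in without MFA for the next 30 minutes;
-- their MFA enrollment remains intact
ALTER USER jsmith SET MINS_TO_BYPASS_MFA = 30;
```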

A company enabled replication between accounts and is ready to replicate data across regions in the same cloud service provider.

The primary database object is : PROD_AWS_EAST. Location : AWS_EAST

The secondary database object is : PROD_AWS_WEST. Location : AWS_WEST

What command and account location is needed to refresh the data?

A. Location: AWS_WEST Command: REFRESH DATABASE PROD_AWS_WEST REFRESH;
B. Location: AWS_WEST Command: ALTER DATABASE PROD_AWS_WEST REFRESH;
C. Location: AWS_EAST Command: REFRESH DATABASE PROD_AWS_WEST REFRESH;
D. Location: AWS_EAST Command: ALTER DATABASE PROD_AWS_WEST REFRESH;
Suggested answer: B

Explanation:

A secondary database is refreshed with the ALTER DATABASE <name> REFRESH command, which must be executed in the target account where the secondary database resides. Therefore, the answer is B: the location is AWS_WEST and the command is ALTER DATABASE PROD_AWS_WEST REFRESH;. Options A and C use an invalid command (there is no REFRESH DATABASE statement in Snowflake), and options C and D run the command in the primary account (AWS_EAST) rather than the account containing the secondary database.
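For reference, Snowflake's documented syntax for refreshing a secondary database is ALTER DATABASE ... REFRESH, run in the account holding the secondary:

```sql
-- Run in the AWS_WEST account, where the secondary database resides
ALTER DATABASE PROD_AWS_WEST REFRESH;
```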

What roles can be used to create network policies within Snowflake accounts? (Select THREE).

A. SYSADMIN
B. SECURITYADMIN
C. ACCOUNTADMIN
D. ORGADMIN
E. Any role with the global permission of CREATE NETWORK POLICY
F. Any role that owns the database where the network policy is created
Suggested answer: B, C, E

Explanation:

Network policies are used to restrict access to the Snowflake service and internal stages based on user IP address1. To create network policies, a role must have the global permission of CREATE NETWORK POLICY2. By default, the system-defined roles of SECURITYADMIN and ACCOUNTADMIN have this permission3. However, any other role can be granted this permission by an administrator4. Therefore, the answer is B, C, and E. The other options are incorrect because SYSADMIN and ORGADMIN do not have the CREATE NETWORK POLICY permission by default3, and network policies are not tied to specific databases5.
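Granting the permission to another role might look like this (the custom role name is hypothetical):

```sql
-- Delegate network policy administration to a custom role
GRANT CREATE NETWORK POLICY ON ACCOUNT TO ROLE network_admin;
```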

In general, the monthly billing for database replication is proportional to which variables? (Select TWO).

A. The frequency of changes to the primary database as a result of data loading or DML operations
B. The amount of table data in the primary database that changes as a result of data loading or DML operations
C. The frequency of the secondary database refreshes from the primary database
D. The number of times data moves across regions and/or cloud service providers between the primary and secondary database accounts
E. The number and size of warehouses defined in the primary account
Suggested answer: A, B

Explanation:

Snowflake charges for database replication based on two categories: data transfer and compute resources1. Data transfer costs depend on the amount of data that is transferred from the primary database to the secondary database across regions and/or cloud service providers2. Compute resource costs depend on the use of Snowflake-provided compute resources to copy data between accounts across regions1. Both data transfer and compute resource costs are proportional to the frequency and amount of changes to the primary database as a result of data loading or DML operations3. Therefore, the answer is A and B. The other options are not directly related to the replication billing, as the frequency of secondary database refreshes does not affect the amount of data transferred or copied4, and the number and size of warehouses defined in the primary account do not affect the replication process5.

Which statement allows this user to access this Snowflake account from a specific IP address (192.168.1.100) while blocking their access from anywhere else?

A. CREATE NETWORK POLICY ADMIN_POLICY ALLOWED_IP_LIST = ('192.168.1.100'); ALTER USER ABC SET NETWORK_POLICY = 'ADMIN_POLICY'; User ABC is the only user with an ACCOUNTADMIN role.
B. CREATE NETWORK POLICY ADMIN_POLICY ALLOWED_IP_LIST = ('192.168.1.100'); ALTER ROLE ACCOUNTADMIN SET NETWORK_POLICY = 'ADMIN_POLICY';
C. CREATE NETWORK POLICY ADMIN_POLICY ALLOWED_IP_LIST = ('192.168.1.100') BLOCKED_IP_LIST = ('0.0.0.0/0'); ALTER USER ABC SET NETWORK_POLICY = 'ADMIN_POLICY';
D. CREATE OR REPLACE NETWORK POLICY ADMIN_POLICY ALLOWED_IP_LIST = ('192.168.1.100/0'); ALTER USER ABC SET NETWORK_POLICY = 'ADMIN_POLICY';
Suggested answer: C

Explanation:

Option C creates a network policy that allows only the IP address 192.168.1.100 and blocks all other IP addresses using the CIDR notation 0.0.0.0/01. It then applies the network policy to the user ABC, who has the ACCOUNTADMIN role. This ensures that only this user can access the Snowflake account from the specified IP address, while blocking their access from anywhere else. Option A does not block any other IP addresses, option B applies the network policy to the role instead of the user, and option D uses an invalid CIDR notation.
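Cleanly formatted, the statements in option C read:

```sql
CREATE NETWORK POLICY ADMIN_POLICY
  ALLOWED_IP_LIST = ('192.168.1.100')   -- the only permitted address
  BLOCKED_IP_LIST = ('0.0.0.0/0');      -- everything else blocked
ALTER USER ABC SET NETWORK_POLICY = 'ADMIN_POLICY';
```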

Total 72 questions