
Snowflake ARA-C01 Practice Test - Questions Answers, Page 16


A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

What step can be taken to improve the pruning of the reporting tables?

A. Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.
B. Increase the size of the virtual warehouse to a size 5X-Large.
C. Use an ORDER BY <cluster_key(s)> command to load the reporting tables.
D. Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.
Suggested answer: C

Explanation:

Effective pruning in Snowflake depends on how data is organized within micro-partitions. By using an ORDER BY clause with the clustering keys when loading data into the reporting tables, Snowflake co-locates rows with similar key values in the same micro-partitions. This organization lets Snowflake skip irrelevant micro-partitions during a query, improving performance and reducing the amount of data scanned.

* Snowflake Documentation on micro-partitions and data clustering

* Community article on recognizing unsatisfactory pruning and improving it
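The approach in option C can be sketched as follows; the table and column names are placeholders, assuming the reporting table is filtered most often on order_date and region:

```sql
-- Rebuild the reporting table with rows ordered by the candidate cluster
-- key(s), so similar key values are co-located in the same micro-partitions
-- and queries filtering on them can prune effectively.
INSERT OVERWRITE INTO reporting.orders
SELECT *
FROM staging.orders
ORDER BY order_date, region;
```

Loading in sorted order achieves a well-clustered layout without the ongoing credit cost of automatic clustering.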


An Architect is using SnowCD to investigate a connectivity issue.

Which system function will provide a list of endpoints that the network must be able to access to use a specific Snowflake account, leveraging private connectivity?

A. SYSTEM$ALLOWLIST()
B. SYSTEM$GET_PRIVATELINK
C. SYSTEM$AUTHORIZE_PRIVATELINK
D. SYSTEM$ALLOWLIST_PRIVATELINK()
Suggested answer: D
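As a quick check for the suggested answer, the function can be called directly; a minimal sketch, assuming the account has private connectivity (AWS PrivateLink or Azure Private Link) configured:

```sql
-- Returns a JSON list of hostnames and ports that must be reachable over
-- the private connection; the output can be saved and passed to SnowCD.
SELECT SYSTEM$ALLOWLIST_PRIVATELINK();
```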

An Architect with the ORGADMIN role wants to change a Snowflake account from an Enterprise edition to a Business Critical edition.

How should this be accomplished?

A. Run an ALTER ACCOUNT command and create a tag of EDITION and set the tag to Business Critical.
B. Use the account's ACCOUNTADMIN role to change the edition.
C. Failover to a new account in the same region and specify the new account's edition upon creation.
D. Contact Snowflake Support and request that the account's edition be changed.
Suggested answer: D

Explanation:

To change the edition of a Snowflake account, an organization administrator (ORGADMIN) cannot directly alter the account settings through SQL commands or the Snowflake interface. The proper procedure is to contact Snowflake Support to request an edition change for the account. This ensures that the change is managed correctly and aligns with Snowflake's operational protocols.

An Architect is implementing a CI/CD process. When attempting to clone a table from a production to a development environment, the cloning operation fails.

What could be causing this to happen?

A. The table is transient.
B. The table has a masking policy.
C. The retention time for the table is set to zero.
D. Tables cannot be cloned from a higher environment to a lower environment.
Suggested answer: B

Explanation:

Cloning a table that has a masking policy can cause the operation to fail because the masking policy is not automatically cloned with the table. The masking policy is a separate schema-level object with its own set of privileges.

Snowflake Documentation on Cloning Considerations.
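As an illustration of the failing operation, a hypothetical clone from production to development (database, schema, and table names are placeholders):

```sql
-- If PROD_DB.SALES.ORDERS has a masking policy applied to a column, this
-- clone can fail because the policy object itself is not cloned with the
-- table and its references must be resolvable in the target context.
CREATE TABLE DEV_DB.SALES.ORDERS CLONE PROD_DB.SALES.ORDERS;
```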

An Architect needs to design a solution for building environments for development, test, and pre-production, all located in a single Snowflake account. The environments should be based on production data.

Which solution would be MOST cost-effective and performant?

A. Use zero-copy cloning into transient tables.
B. Use zero-copy cloning into permanent tables.
C. Use CREATE TABLE ... AS SELECT (CTAS) statements.
D. Use a Snowflake task to trigger a stored procedure to copy data.
Suggested answer: A

Explanation:

Zero-copy cloning is a feature in Snowflake that allows a database, schema, or table to be cloned without duplicating any data, which saves on storage costs. Transient tables have no Fail-safe period and limited Time Travel retention, which reduces storage costs further, making them a cost-effective option for development, test, and pre-production environments that do not require the durability guarantees of permanent tables.

* Snowflake Documentation on Zero-Copy Cloning

* Articles discussing the cost-effectiveness and performance benefits of zero-copy cloning
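A minimal sketch of option A (database and table names are hypothetical):

```sql
-- Clone the production table into a transient development table: no data
-- is physically copied, and the transient table carries no Fail-safe
-- storage charges.
CREATE TRANSIENT TABLE DEV_DB.PUBLIC.ORDERS
  CLONE PROD_DB.PUBLIC.ORDERS;
```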

A user has activated primary and secondary roles for a session.

What operation is the user prohibited from using as part of SQL actions in Snowflake using the secondary role?

A. Insert
B. Create
C. Delete
D. Truncate
Suggested answer: B

Explanation:

In Snowflake, when a user activates secondary roles in a session, the privileges of those roles apply to most SQL actions, but the authorization to create objects comes only from the session's primary role. As a result, CREATE statements cannot rely on privileges granted through a secondary role. This limitation enforces role-based access control and keeps schema modifications under the explicit control of the primary role.

Reference: Snowflake's security and access control documentation specifying the limitations and capabilities of primary versus secondary roles in session management.
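The behavior can be sketched as follows; role and object names are placeholders:

```sql
USE ROLE ANALYST_ROLE;     -- primary role for the session
USE SECONDARY ROLES ALL;   -- activate all other granted roles as secondary

-- DML such as INSERT, DELETE, and TRUNCATE can draw on secondary-role
-- privileges, but CREATE is authorized against the primary role only:
CREATE TABLE SANDBOX.PUBLIC.T1 (ID NUMBER);  -- needs CREATE TABLE via ANALYST_ROLE
```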

Two queries are run on the customer_address table:

create or replace TABLE CUSTOMER_ADDRESS (
  CA_ADDRESS_SK NUMBER(38,0), CA_ADDRESS_ID VARCHAR(16), CA_STREET_NUMBER VARCHAR(10),
  CA_STREET_NAME VARCHAR(60), CA_STREET_TYPE VARCHAR(15), CA_SUITE_NUMBER VARCHAR(10),
  CA_CITY VARCHAR(60), CA_COUNTY VARCHAR(30), CA_STATE VARCHAR(2), CA_ZIP VARCHAR(10),
  CA_COUNTRY VARCHAR(20), CA_GMT_OFFSET NUMBER(5,2), CA_LOCATION_TYPE VARCHAR(20)
);

ALTER TABLE DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS ADD SEARCH OPTIMIZATION ON SUBSTRING(CA_ADDRESS_ID);

Which queries will benefit from the use of the search optimization service? (Select TWO).

A. select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where substring(CA_ADDRESS_ID,1,8) = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,8);
B. select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,16);
C. select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%BAAASKD%';
D. select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%PHPP%';
E. select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID NOT LIKE '%AAAAAAAAPHPPL%';
Suggested answer: A, B

Explanation:

The use of the search optimization service in Snowflake is particularly effective when queries involve operations that match exact substrings or start from the beginning of a string. The ALTER TABLE command adding search optimization specifically for substrings on the CA_ADDRESS_ID field allows the service to create an optimized search path for queries using substring matches.

Option A benefits because it matches a substring starting at the beginning of CA_ADDRESS_ID, aligning with the optimization's ability to quickly locate records based on the leading segments of strings.

Option B also benefits: although it performs a full equality check, it compares CA_ADDRESS_ID against a 16-character substring (the full declared length of the column), so the substring index can be used for efficient retrieval.

Options C, D, and E involve patterns that do not anchor to the beginning of the string, or use negation, which are not accelerated by a search optimization configuration built for leading-substring matches.

Reference: Snowflake's documentation on the use of search optimization for substring matching in SQL queries.
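To verify which search access paths exist on the table, the configuration can be inspected; a minimal sketch:

```sql
-- Lists the search methods and target columns enabled on the table
-- (here, SUBSTRING on CA_ADDRESS_ID).
DESCRIBE SEARCH OPTIMIZATION ON DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS;
```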

A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service that extracts data from Snowflake.

What steps should be taken to allow access from the required IP addresses? (Select TWO).

A. ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY='ANALYST_POLICY';
B. ALTER USER ANALYST_USER SET NETWORK_POLICY='ANALYST_POLICY';
C. ALTER USER ANALYST_USER SET NETWORK_POLICY='10.1.1.20';
D. USE ROLE SECURITYADMIN; CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
E. USE ROLE USERADMIN; CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
Suggested answer: B, D

Explanation:

To ensure that an analyst_user can only access Snowflake from specific IP addresses, the following steps are required:

Option B: This assigns the network policy directly to analyst_user. Setting a network policy at the user level ensures that the specified network restrictions apply directly and exclusively to this user.

Option D: Before a network policy can be set or altered, the appropriate role with permission to manage network policies must be used. SECURITYADMIN is typically the role that has privileges to create and manage network policies in Snowflake. Creating a network policy that specifies allowed IP addresses ensures that only requests coming from those IPs can access Snowflake under this policy. After creation, this policy can be linked to specific users or roles as needed.

Option A attempts to set a network policy on a role, which is not supported; option E uses the wrong role (USERADMIN does not typically manage network security settings); and option C incorrectly supplies an IP address where a network policy name is expected, which is neither syntactically nor functionally valid.

Reference: Snowflake's security management documentation covering network policies and role-based access controls.
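Putting the two selected steps together, with the policy name and IP address taken from the question (create the policy before attaching it):

```sql
-- 1. Create the network policy (SECURITYADMIN holds the privilege to
--    manage network policies).
USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

-- 2. Attach the policy to the user so it governs analyst_user's sessions.
ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';
```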



A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

What will happen to the consumer account if a new table (table_6) is added to the provider schema?

A. The consumer role will automatically see the new table and no additional grants are needed.
B. The consumer role will see the table only after this grant is given on the consumer side: grant imported privileges on database PSHARE_EDW_4TEST_DB to DEV_ROLE;
C. The consumer role will see the table only after this grant is given on the provider side: use role accountadmin; grant select on table EDW.ACCOUNTING.Table_6 to share PSHARE_EDW_4TEST;
D. The consumer role will see the table only after this grant is given on the provider side: use role accountadmin; grant usage on database EDW to share PSHARE_EDW_4TEST; grant usage on schema EDW.ACCOUNTING to share PSHARE_EDW_4TEST; grant select on table EDW.ACCOUNTING.Table_6 to database PSHARE_EDW_4TEST_DB;
Suggested answer: D

Explanation:

When a new table (table_6) is added to a schema in the provider's account that is part of a data share, the consumer will not automatically see the new table. The consumer will only be able to access the new table once the appropriate privileges are granted by the provider. The correct process, as outlined in option D, involves using the provider's ACCOUNTADMIN role to grant USAGE privileges on the database and schema, followed by SELECT privileges on the new table, specifically to the share that includes the consumer's database. This ensures that the consumer account can access the new table under the established data sharing setup.

Reference:

Snowflake Documentation on Managing Access Control

Snowflake Documentation on Data Sharing
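A cleaned-up sketch of the provider-side grants intended by option D, written against the share (object names are taken from the question):

```sql
USE ROLE ACCOUNTADMIN;

-- Expose the containing database and schema through the share, then the
-- new table itself; the consumer sees TABLE_6 once these grants exist.
GRANT USAGE ON DATABASE EDW TO SHARE PSHARE_EDW_4TEST;
GRANT USAGE ON SCHEMA EDW.ACCOUNTING TO SHARE PSHARE_EDW_4TEST;
GRANT SELECT ON TABLE EDW.ACCOUNTING.TABLE_6 TO SHARE PSHARE_EDW_4TEST;
```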

A company is following the Data Mesh principles, including domain separation, and chose one Snowflake account for its data platform.

An Architect created two data domains to produce two data products. The Architect needs a third data domain that will use both of the data products to create an aggregate data product. The read access to the data products will be granted through a separate role.

Based on the Data Mesh principles, how should the third domain be configured to create the aggregate product if it has been granted the two read roles?

A. Use secondary roles for all users.
B. Create a hierarchy between the two read roles.
C. Request a technical ETL user with the sysadmin role.
D. Request that the two data domains share data using the Data Exchange.
Suggested answer: D

Explanation:

In the scenario described, where a third data domain needs access to two existing data products in a Snowflake account structured according to Data Mesh principles, the best approach is to utilize Snowflake's Data Exchange functionality. Option D is correct as it facilitates the sharing and governance of data across different domains efficiently and securely. Data Exchange allows domains to publish and subscribe to live data products, enabling real-time data collaboration and access management in a governed manner. This approach is in line with Data Mesh principles, which advocate for decentralized data ownership and architecture, enhancing agility and scalability across the organization.

Reference:

Snowflake Documentation on Data Exchange

Articles on Data Mesh Principles in Data Management

Total 162 questions