ExamGecko

Snowflake SnowPro Core Practice Test - Questions Answers, Page 19


Question 181


Which of the following accurately describes shares?

A. Tables, secure views, and secure UDFs can be shared
B. Shares can be shared
C. Data consumers can clone a new table from a share
D. Access to a share cannot be revoked once granted
Suggested answer: A

Explanation:

Shares in Snowflake are named objects that encapsulate all the information required to share databases, schemas, tables, secure views, and secure UDFs. Objects are added to a share by granting privileges on them to the share, either directly or via a database role.
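As a sketch of the mechanics (the share, database, and consumer account names below are hypothetical):

```sql
-- Create a share and grant it access to a database, schema, and table.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;

-- Make the share visible to a consumer account.
ALTER SHARE sales_share ADD ACCOUNTS = consumer_account;

-- Access can later be revoked, e.g.:
-- REVOKE SELECT ON TABLE sales_db.public.orders FROM SHARE sales_share;
```

Note that the last line also illustrates why option D is wrong: grants to a share can be revoked at any time.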

asked 23/09/2024
Alemu, Fissha

Question 182


What are best practice recommendations for using the ACCOUNTADMIN system-defined role in Snowflake? (Choose two.)

A. Ensure all ACCOUNTADMIN roles use Multi-factor Authentication (MFA).
B. All users granted ACCOUNTADMIN role must be owned by the ACCOUNTADMIN role.
C. The ACCOUNTADMIN role must be granted to only one user.
D. Assign the ACCOUNTADMIN role to at least two users, but as few as possible.
E. All users granted ACCOUNTADMIN role must also be granted SECURITYADMIN role.
Suggested answer: A, D

Explanation:

Best practices for using the ACCOUNTADMIN role include ensuring that all users with this role use Multi-factor Authentication (MFA) for added security. Additionally, it is recommended to assign the ACCOUNTADMIN role to at least two users, to avoid lockout in case of password recovery issues, but to as few users as possible to maintain strict control over account-level operations.
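A minimal sketch of the recommendation (the user names are hypothetical; MFA enrollment itself is performed per user through the Snowflake web interface):

```sql
-- Grant ACCOUNTADMIN to a small, named set of users -- at least two.
GRANT ROLE ACCOUNTADMIN TO USER alice;
GRANT ROLE ACCOUNTADMIN TO USER bob;

-- Periodically audit who holds the role.
SHOW GRANTS OF ROLE ACCOUNTADMIN;
```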

asked 23/09/2024
Frank van Hout

Question 183


What is the minimum Snowflake edition required for row level security?

A. Standard
B. Enterprise
C. Business Critical
D. Virtual Private Snowflake
Suggested answer: B

Explanation:

Row level security in Snowflake is available starting with the Enterprise edition. This feature allows the creation of row access policies that control access to data at the row level within tables and views.
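For illustration, a row access policy might look like the following (the mapping table, role names, and sales table are hypothetical):

```sql
-- Rows are visible only to roles mapped to the row's region.
CREATE ROW ACCESS POLICY region_policy
  AS (region VARCHAR) RETURNS BOOLEAN ->
    CURRENT_ROLE() = 'ACCOUNTADMIN'
    OR EXISTS (
      SELECT 1
      FROM security.region_map m
      WHERE m.role_name = CURRENT_ROLE()
        AND m.region = region
    );

-- Attach the policy to a table column.
ALTER TABLE sales ADD ROW ACCESS POLICY region_policy ON (region);
```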

asked 23/09/2024
Sonjoy Kanwal

Question 184


What is the minimum Fail-safe retention time period for transient tables?

A. 1 day
B. 7 days
C. 12 hours
D. 0 days
Suggested answer: D

Explanation:

Transient tables in Snowflake have a Fail-safe retention period of 0 days. Once the Time Travel retention period ends, there is no additional Fail-safe period for transient tables.
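For example (table name hypothetical), a transient table can have Time Travel of up to 1 day but never accrues Fail-safe storage:

```sql
-- Transient table: Time Travel of 0 or 1 day, Fail-safe always 0 days.
CREATE TRANSIENT TABLE staging_events (
  event_id NUMBER,
  payload  VARIANT
)
DATA_RETENTION_TIME_IN_DAYS = 1;  -- Time Travel only; no Fail-safe follows
```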

asked 23/09/2024
KENEILWE DITHLAGE

Question 185


Which statements are correct concerning the leveraging of third-party data from the Snowflake Data Marketplace? (Choose two.)

A. Data is live, ready-to-query, and can be personalized.
B. Data needs to be loaded into a cloud provider as a consumer account.
C. Data is not available for copying or moving to an individual Snowflake account.
D. Data is available without copying or moving.
E. Data transformations are required when combining Data Marketplace datasets with existing data in Snowflake.
Suggested answer: A, D

Explanation:

When leveraging third-party data from the Snowflake Data Marketplace, the data is live, ready-to-query, and can be personalized. Additionally, the data is available without copying or moving it into the consumer's account, allowing seamless integration with existing data.
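Conceptually, once a listing has been added to the consumer account it appears as a read-only database that is queried in place (the database, schema, table, and column names below are illustrative):

```sql
-- No loading or copying -- query the shared Marketplace data directly.
SELECT city, forecast_date, temperature
FROM weather_data.public.daily_forecast
WHERE city = 'London';
```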

asked 23/09/2024
Fiertelmeister Tibor

Question 186


What impacts the credit consumption of maintaining a materialized view? (Choose two.)

A. Whether or not it is also a secure view
B. How often the underlying base table is queried
C. How often the base table changes
D. Whether the materialized view has a cluster key defined
E. How often the materialized view is queried
Suggested answer: C, D

Explanation:

The credit consumption for maintaining a materialized view is impacted by how often the base table changes (C) and whether the materialized view has a cluster key defined (D). Changes to the base table trigger a background refresh of the materialized view, which consumes credits, and a cluster key adds reclustering work on top of that maintenance.
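As a sketch (the table and column names are hypothetical), both cost drivers are visible in the view definition:

```sql
-- Every DML change to SALES triggers background refresh credits;
-- the cluster key adds reclustering cost during maintenance.
CREATE MATERIALIZED VIEW daily_totals
  CLUSTER BY (sale_date)
AS
  SELECT sale_date, SUM(amount) AS total_amount
  FROM sales
  GROUP BY sale_date;
```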

asked 23/09/2024
Elliott Fields

Question 187


What COPY INTO SQL command should be used to unload data into multiple files?

A. SINGLE=TRUE
B. MULTIPLE=TRUE
C. MULTIPLE=FALSE
D. SINGLE=FALSE
Suggested answer: D

Explanation:

The COPY INTO <location> command with the option SINGLE=FALSE unloads data into multiple files; this is also the default behavior. Setting SINGLE=TRUE forces the output into a single file, and MULTIPLE is not a valid COPY INTO option.
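A sketch of an unload (the stage, table, and size values are illustrative):

```sql
COPY INTO @my_stage/unload/orders_
  FROM orders
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  SINGLE = FALSE              -- default: split output into multiple files
  MAX_FILE_SIZE = 52428800;   -- target roughly 50 MB per file
```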

asked 23/09/2024
Charles Marlin

Question 188


When cloning a database containing stored procedures and regular views, that have fully qualified table references, which of the following will occur?

A. The cloned views and the stored procedures will reference the cloned tables in the cloned database.
B. An error will occur, as views with qualified references cannot be cloned.
C. An error will occur, as stored objects cannot be cloned.
D. The stored procedures and views will refer to tables in the source database.
Suggested answer: D

Explanation:

Fully qualified references include the database name, so after cloning, the views and stored procedures in the clone continue to point at tables in the source database (D). Only unqualified or schema-qualified references resolve to the cloned objects within the cloned database; this is a documented cloning consideration in Snowflake.
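The caveat is easiest to see in a sketch (the database and object names are hypothetical):

```sql
CREATE DATABASE dev_db CLONE prod_db;

-- A view in prod_db defined with a fully qualified reference:
--   CREATE VIEW public.v AS SELECT * FROM prod_db.public.orders;
-- still reads prod_db.public.orders after being cloned into dev_db.

-- A view defined with an unqualified reference:
--   CREATE VIEW public.v AS SELECT * FROM orders;
-- resolves to dev_db.public.orders inside the clone.
```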

asked 23/09/2024
Sam Patel

Question 189


When loading data into Snowflake, how should the data be organized?

A. Into single files with 100-250 MB of compressed data per file
B. Into single files with 1-100 MB of compressed data per file
C. Into files of maximum size of 1 GB of compressed data per file
D. Into files of maximum size of 4 GB of compressed data per file
Suggested answer: A

Explanation:

When loading data into Snowflake, it is recommended to organize the data into single files with 100-250 MB of compressed data per file. This size range is optimal for parallel processing and can help achieve better performance during data loading operations. Reference: [COF-C02] SnowPro Core Certification Exam Study Guide
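For example (the stage, table, and file names are illustrative), files pre-split into this size range load in parallel across warehouse threads:

```sql
-- Files such as data_000.csv.gz ... data_041.csv.gz,
-- each roughly 100-250 MB compressed.
COPY INTO my_table
  FROM @my_stage/load/
  PATTERN = '.*data_[0-9]+[.]csv[.]gz'
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP);
```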

asked 23/09/2024
Simone Perego

Question 190


Which of the following objects can be directly restored using the UNDROP command? (Choose two.)

A. Schema
B. View
C. Internal stage
D. Table
E. User
F. Role
Suggested answer: A, D

Explanation:

The UNDROP command in Snowflake can directly restore schemas and tables (as well as databases), provided the dropped object is still within its Time Travel retention period. There is no UNDROP command for views, stages, users, or roles, although a view is recovered indirectly when its containing schema or database is undropped. Reference: [COF-C02] SnowPro Core Certification Exam Study Guide
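For instance (the object names are hypothetical):

```sql
DROP TABLE orders;
UNDROP TABLE orders;     -- works within the Time Travel retention period

DROP SCHEMA analytics;
UNDROP SCHEMA analytics; -- schemas (and databases) can also be undropped
```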

asked 23/09/2024
Alex Agenjo Fernandez
Total 627 questions