ExamGecko

DEA-C01: SnowPro Advanced: Data Engineer

Vendor: Snowflake

Exam Questions: 130
This study guide should help you understand what to expect on the exam and includes a summary of the topics the exam might cover and links to additional resources. The information and materials in this document should help you focus your studies as you prepare for the exam.

Related questions

Can the Snowflake web interface be used to create users with no passwords or to remove passwords from existing users?


Select the correct usage statements with regard to SQL UDFs.


Mark the Correct Statements:

Statement 1. Snowflake's zero-copy cloning feature provides a convenient way to quickly take a "snapshot" of any table, schema, or database.

Statement 2. Data engineers can use the zero-copy cloning feature to create instant backups that do not incur any additional costs (until changes are made to the cloned object).

A. Statement 1
B. Statement 2
C. Both are False.
D. Statement 1 & 2 are correct.

Suggested answer: D

Explanation:

Snowflake's zero-copy cloning feature provides a convenient way to quickly take a "snapshot" of any table, schema, or database and create a derived copy of that object which initially shares the underlying storage. This can be extremely useful for creating instant backups that do not incur any additional costs (until changes are made to the cloned object).

For example, when a clone is created of a table, the clone utilizes no data storage because it shares all the existing micro-partitions of the original table at the time it was cloned; however, rows can then be added, deleted, or updated in the clone independently from the original table. Each change to the clone results in new micro-partitions that are owned exclusively by the clone and are protected through CDP (Continuous Data Protection).
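
As a concrete sketch (the table names are hypothetical; CREATE TABLE ... CLONE is the documented syntax):

-- take an instant "snapshot" of a table; the clone shares the original's
-- micro-partitions, so it consumes no storage until it diverges
create table orders_backup clone orders;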


As a data engineer, when might you consider disabling auto-suspend for a warehouse?


The COPY command supports several options for identifying the data files to load from a stage, i.e.:

I. By path.
II. Specifying a list of specific files to load.
III. Using pattern matching to identify specific files by pattern.
IV. Organizing files into logical paths that reflect a scheduling pattern.

Of the aforesaid options for identifying/specifying data files to load from a stage, which is generally the fastest and most advisable?

A. I
B. II
C. III
D. IV

Suggested answer: B

Explanation:

Of the above options for identifying/specifying data files to load from a stage, providing a discrete list of files is generally the fastest; however, the FILES parameter supports a maximum of 1,000 files, meaning a COPY command executed with the FILES parameter can only load up to 1,000 files.

For example:

copy into load1 from @%load1/Snow1/ files=('mydata1.csv', 'mydata2.csv', 'mydata3.csv')
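
For contrast, option III would use the PATTERN parameter instead; a hypothetical sketch (pattern matching is generally slower than an explicit FILES list):

-- load every staged file whose path matches the regular expression
copy into load1 from @%load1/Snow1/ pattern='.*mydata[0-9][.]csv';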


Clones can be cloned, with no limitations on the number or iterations of clones that can be created (e.g., you can create a clone of a clone of a clone, and so on), which results in an n-level hierarchy of cloned objects, each with its own portion of shared and independent data storage?

A. TRUE
B. FALSE

Suggested answer: A
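
A minimal sketch with hypothetical table names, showing a three-level clone hierarchy:

-- each clone initially shares the micro-partitions of its source;
-- later changes at any level are stored independently of the others
create table t1_clone1 clone t1;
create table t1_clone2 clone t1_clone1;
create table t1_clone3 clone t1_clone2;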

Streams record the differences between two offsets. If a row is added and then updated in the current offset, what will be the value of the METADATA$ISUPDATE column in this scenario?

A. TRUE
B. FALSE
C. UPDATE
D. INSERT

Suggested answer: B

Explanation:

Stream Columns

A stream stores an offset for the source object and not any actual table columns or data. When queried, a stream accesses and returns the historic data in the same shape as the source object (i.e. the same column names and ordering) with the following additional columns:

METADATA$ACTION Indicates the DML operation (INSERT, DELETE) recorded.

METADATA$ISUPDATE Indicates whether the operation was part of an UPDATE statement. Updates to rows in the source object are represented as a pair of DELETE and INSERT records in the stream, with the METADATA$ISUPDATE column set to TRUE.

METADATA$ROW_ID Specifies the unique and immutable ID for the row, which can be used to track changes to specific rows over time.

Note that streams record the differences between two offsets. If a row is added and then updated in the current offset, the delta change is a single new row. The METADATA$ISUPDATE column records a FALSE value.
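
A minimal sketch of this behavior, assuming hypothetical table and stream names:

-- create a source table and a stream that tracks its changes
create or replace table orders (id integer, status varchar);
create or replace stream orders_stream on table orders;

-- add a row and then update it within the same (current) offset
insert into orders values (1, 'NEW');
update orders set status = 'SHIPPED' where id = 1;

-- the stream collapses both operations into a single INSERT row:
-- METADATA$ACTION = 'INSERT', METADATA$ISUPDATE = FALSE
select metadata$action, metadata$isupdate, id, status from orders_stream;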


Select the incorrect statement about external functions in Snowflake.

A. An external function is a type of UDF.
B. An external function does not contain its own code; instead, the external function calls code that is stored and executed outside Snowflake.
C. Inside Snowflake, the external function is stored as an API integration object.
D. Inside Snowflake, the external function is stored as a database object that contains information that Snowflake uses to call the remote service.

Suggested answer: C
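
For reference, a hedged sketch of the two objects involved (the integration name, role ARN, and URLs are placeholders); note that the function itself is a database object that merely references the account-level API integration:

-- account-level API integration holding the details for reaching the proxy service
create or replace api integration my_api_integration
  api_provider = aws_api_gateway
  api_aws_role_arn = 'arn:aws:iam::123456789012:role/my-role'  -- placeholder
  api_allowed_prefixes = ('https://abc123.execute-api.us-east-1.amazonaws.com/prod/')
  enabled = true;

-- the external function is a database object; its code runs outside Snowflake
create or replace external function my_echo(input varchar)
  returns variant
  api_integration = my_api_integration
  as 'https://abc123.execute-api.us-east-1.amazonaws.com/prod/echo';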

To enable non-ACCOUNTADMIN roles to perform data sharing tasks, which two global/account privileges does Snowflake provide?

A. CREATE SHARE
B. IMPORT SHARE
C. REFERENCE USAGE
D. OPERATE

Suggested answer: A, B

Explanation:

CREATE SHARE

In a provider account, this privilege enables creating and managing shares (for sharing data with consumer accounts).

IMPORT SHARE

In a consumer account, this privilege enables viewing the inbound shares shared with the account.

Also enables creating databases from inbound shares; requires the global CREATE DATABASE privilege.

By default, these privileges are granted only to the ACCOUNTADMIN role, ensuring that only account administrators can perform these tasks. However, the privileges can be granted to other roles, enabling the tasks to be delegated to other users in the account.
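
A sketch of that delegation, assuming a hypothetical role name:

-- provider account: allow the role to create and manage shares
grant create share on account to role data_sharing_admin;

-- consumer account: allow the role to view inbound shares and create databases from them
grant import share on account to role data_sharing_admin;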


Which statement about the star schema is false?

A. Star schemas are denormalized.
B. The star schema separates business process data into facts, which hold the measurable, quantitative data about a business, and dimensions, which are descriptive attributes related to the fact data.
C. The star schema is more flexible in terms of analytical needs compared to Data Vault modelling.
D. The star schema is an important special case of the snowflake schema and is more effective for handling simpler queries.

Suggested answer: C
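
To make option B concrete, a minimal star-schema fragment with hypothetical table names:

-- dimension: descriptive attributes related to the fact data
create table dim_customer (
  customer_key  integer primary key,
  customer_name varchar,
  region        varchar
);

-- fact: measurable, quantitative data, keyed to the dimension
create table fact_sales (
  sale_id      integer,
  customer_key integer references dim_customer (customer_key),
  amount       number(12,2)
);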