Snowflake DEA-C01 Practice Test - Questions Answers, Page 8

List of questions
Question 71

For the most efficient and cost-effective data load experience, which of the following considerations does the Data Engineer need to take into account?
Split larger files into a greater number of smaller files to distribute the load among the compute resources in an active warehouse. This minimizes, rather than maximizes, the processing overhead.
The rest are all recommended data loading considerations.
Question 72

The COPY command supports several options for loading data files from a stage: by path, by providing a discrete list of files, or by pattern matching.
Of these options for identifying/specifying the data files to load from a stage, providing a discrete list of files is generally the fastest; however, the FILES parameter supports a maximum of 1,000 files, meaning a COPY command executed with the FILES parameter can load no more than 1,000 files.
For example:
copy into load1 from @%load1/Snow1/ files=('mydata1.csv', 'mydata2.csv', 'mydata3.csv')
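As a sketch of the alternative selection options (the stage path and file names are carried over from the example above and remain illustrative), the same load could instead target files by path prefix or by a regular-expression pattern; note that pattern matching is generally the slowest of the three options:

```sql
-- Load every file under a path prefix in the table stage
copy into load1 from @%load1/Snow1/;

-- Load only files whose names match a regular expression
copy into load1
  from @%load1/Snow1/
  pattern = '.*mydata[0-9]+\\.csv';
```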
Question 73

As a Data Engineer, you have a requirement to load a set of new product files containing product-relevant information into Snowflake internal tables. You later discover that some of the source files were already loaded in a historical batch. You pre-checked the LAST_MODIFIED metadata column for the staged data files and found that the LAST_MODIFIED date is older than 64 days for a few files, and that the initial set of data was loaded into the table more than 64 days earlier. Which is the best approach to load the source files with expired load metadata, along with the set of files whose metadata might still be available, while avoiding data duplication?
To load files whose metadata has expired, set the LOAD_UNCERTAIN_FILES copy option to TRUE. The copy option references load metadata, if available, to avoid data duplication, but also attempts to load files with expired load metadata.
Alternatively, set the FORCE option to load all files, ignoring load metadata if it exists. Note that this option reloads files, potentially duplicating data in a table.
Please refer to the example in the link below:
https://docs.snowflake.com/en/user-guide/data-load-considerations-load.html#loading-older-files
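A minimal sketch of the two approaches described above (table and stage names are illustrative, not from the original):

```sql
-- Recommended: load files with expired metadata while still using
-- load metadata, where available, to skip already-loaded files
copy into products
  from @product_stage
  load_uncertain_files = true;

-- Alternative: ignore load metadata entirely and reload all files;
-- this can duplicate rows already present in the table
copy into products
  from @product_stage
  force = true;
```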
Question 74

If external software (e.g. TIBCO) exports data fields enclosed in quotes but inserts a leading space before the opening quotation character for each field, how does Snowflake handle it? [Select 2]
If your external software exports fields enclosed in quotes but inserts a leading space before the opening quotation character for each field, Snowflake reads the leading space rather than the opening quotation character as the beginning of the field. The quotation characters are interpreted as string data.
Use the TRIM_SPACE file format option to remove undesirable spaces during the data load.
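A sketch of a COPY statement applying this fix (table, stage, and file names are illustrative): TRIM_SPACE removes the leading space, after which FIELD_OPTIONALLY_ENCLOSED_BY lets the quotation characters be honored as field enclosures rather than read as string data.

```sql
copy into mytable
  from @mystage/tibco_export.csv
  file_format = (type = csv
                 trim_space = true
                 field_optionally_enclosed_by = '"');
```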
Question 75

A Data Engineer is loading a file named snowdata.tsv from the /datadir directory on his local machine to a Snowflake stage and wants to prefix the file with a folder named tablestage. Which command helps him load the file's data into the Snowflake internal table stage?
Execute PUT to upload (stage) local data files into an internal stage.
The @% character combination identifies a table stage.
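A hedged sketch of the PUT command (the table name snowtable is assumed for illustration; the question does not name the target table). The @% prefix references that table's stage, and /tablestage becomes the folder prefix for the uploaded file:

```sql
put file:///datadir/snowdata.tsv @%snowtable/tablestage;
```

PUT must be run from a client such as SnowSQL; it cannot be executed from the Snowflake web interface.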
Question 76

Mark the correct statements about the VALIDATION_MODE option used by a Data Engineer for data loading operations in his/her COPY INTO <table> command:
All the statements are correct except the one saying that VALIDATION_MODE only supports data loading operations.
VALIDATION_MODE can also be used with the COPY INTO <location> command, i.e. for data unloading operations.
VALIDATION_MODE = RETURN_ROWS can be used at the time of Data unloading.
This option instructs the COPY command to return the results of the query in the SQL statement instead of unloading the results to the specified cloud storage location. The only supported validation option is RETURN_ROWS. This option returns all rows produced by the query.
When you have validated the query, you can remove the VALIDATION_MODE to perform the unload operation.
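The unload-side workflow above can be sketched as follows (stage and table names are illustrative):

```sql
-- Validate: return the query results instead of unloading them
copy into @mystage/unload/
  from (select * from mytable)
  validation_mode = return_rows;

-- Once validated, remove VALIDATION_MODE to perform the actual unload
copy into @mystage/unload/
  from (select * from mytable);
```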
Question 77

To troubleshoot a data load failure in one of your COPY statements, the Data Engineer executed a COPY statement with the VALIDATION_MODE copy option set to RETURN_ALL_ERRORS, referencing the set of files he had attempted to load. Which functions below can facilitate analysis of the problematic records on top of the results produced? [Select 2]
LAST_QUERY_ID() Function
Returns the ID of a specified query in the current session. If no query is specified, the most recently executed query is returned.
RESULT_SCAN() Function
Returns the result set of a previous command (within 24 hours of when you executed the query) as if the result was a table.
The following example validates a set of files (SFfile.csv.gz) that contain errors. To facilitate analysis of the errors, a COPY INTO <location> statement then unloads the problematic records into a text file so they can be analyzed and fixed in the original data files. The statement queries the RESULT_SCAN table function.
copy into Snowtable
  from @SFstage/SFfile.csv.gz
  validation_mode = return_all_errors;

set qid = last_query_id();

copy into @SFstage/errors/load_errors.txt
  from (select rejected_record from table(result_scan($qid)));

Note: The other options are not valid functions.
Question 78

As part of table design, a Data Engineer added a timestamp column that inserts the current timestamp as the default value as records are loaded into a table. The intent is to capture the time when each record was loaded into the table; however, the timestamps are earlier than the LOAD_TIME column values returned by the COPY_HISTORY view (Account Usage). What could be the reason for this issue?
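An illustrative table definition for the scenario above (names are assumptions, not from the original). Per Snowflake's documentation, a CURRENT_TIMESTAMP default is evaluated when the load operation is compiled in cloud services, not when each record is physically inserted, which is why it can precede the LOAD_TIME recorded by COPY_HISTORY:

```sql
-- load_ts defaults to the current timestamp at load time
create or replace table product_events (
  event_id  number,
  payload   variant,
  load_ts   timestamp_ltz default current_timestamp()
);
```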
Question 79

Snowpipe loads data from files as soon as they are available in a stage. Automated data loads leverage event notifications for cloud storage to inform Snowpipe of the arrival of new data files to load.
Which cloud-hosted platform provides cross-cloud support for automated data loading via Snowpipe?
Question 80

Find the odd one out: