Which of the following are best practices for loading data into Snowflake? (Choose three.)
A.
Aim to produce data files that are between 100 MB and 250 MB in size, compressed.
B.
Load data from files in a cloud storage service in a different region or cloud platform from the service or region containing the Snowflake account, to save on cost.
C.
Enclose fields that contain delimiter characters in single or double quotes.
D.
Split large files into a greater number of smaller files to distribute the load among the compute resources in an active warehouse.
E.
When planning which warehouse(s) to use for data loading, start with the largest warehouse possible.
F.
Partition the staged data into large folders with random paths, allowing Snowflake to determine the best way to load each file.
Suggested answer: A, C, D
Explanation:
Best practices for loading data into Snowflake include producing compressed data files of roughly 100 MB to 250 MB, the sweet spot for parallel processing with minimal per-file overhead (A). Enclosing fields that contain delimiter characters in quotes ensures those fields are parsed correctly during the load (C). Splitting large files into a greater number of smaller files distributes the work across the compute resources of an active warehouse (D). The remaining options are not recommended: loading across regions or cloud platforms adds egress cost and latency rather than saving money (B); load throughput is driven by per-file parallelism, not raw warehouse size, so starting with the largest possible warehouse wastes credits (E); and staged data should be partitioned into logical paths (for example, by date or source), not random ones, so that COPY commands can target specific subsets (F).
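The file-sizing and quoting guidance in options A, C, and D can be sketched in Python. The helper below (`split_csv_for_load` is a hypothetical name, not part of any Snowflake SDK) splits a large CSV into gzip-compressed chunks sized to land near the recommended compressed range, and quotes any field containing the delimiter. The 4:1 compression ratio is an assumption typical of text data; measure your own data to tune it.

```python
import csv
import gzip
import io
import os

def split_csv_for_load(src_path, out_dir,
                       target_compressed_mb=150, assumed_ratio=4):
    """Hypothetical helper: split a CSV into gzip-compressed chunks whose
    compressed size lands near target_compressed_mb, assuming roughly an
    assumed_ratio:1 gzip compression ratio for text data."""
    target_raw = int(target_compressed_mb * 1024 * 1024 * assumed_ratio)
    os.makedirs(out_dir, exist_ok=True)
    chunks = []

    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        buf = io.StringIO()
        # QUOTE_MINIMAL encloses any field that contains the delimiter,
        # the quote character, or a newline (option C).
        writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
        writer.writerow(header)
        part = 0
        rows_in_chunk = 0

        def flush_chunk():
            """Compress the buffered rows into the next chunk file and
            start a fresh buffer with a repeated header row."""
            nonlocal buf, writer, part, rows_in_chunk
            path = os.path.join(out_dir, f"part_{part:04d}.csv.gz")
            with gzip.open(path, "wt", newline="") as gz:
                gz.write(buf.getvalue())
            chunks.append(path)
            part += 1
            rows_in_chunk = 0
            buf = io.StringIO()
            writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
            writer.writerow(header)

        for row in reader:
            writer.writerow(row)
            rows_in_chunk += 1
            # Cut a new chunk once the estimated uncompressed size is
            # reached, yielding many smaller files (options A and D).
            if buf.tell() >= target_raw:
                flush_chunk()
        if rows_in_chunk:
            flush_chunk()

    return chunks
```

The resulting `part_*.csv.gz` files could then be staged and loaded with a `COPY INTO` statement, letting the warehouse process the files in parallel.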