Question 68 - DAS-C01 discussion

A retail company is building its data warehouse solution using Amazon Redshift. As a part of that effort, the company is loading hundreds of files into the fact table created in its Amazon Redshift cluster. The company wants the solution to achieve the highest throughput and optimally use cluster resources when loading data into the company’s fact table. How should the company meet these requirements?

A. Use multiple COPY commands to load the data into the Amazon Redshift cluster.
B. Use S3DistCp to load multiple files into the Hadoop Distributed File System (HDFS) and use an HDFS connector to ingest the data into the Amazon Redshift cluster.
C. Use LOAD commands equal to the number of Amazon Redshift cluster nodes and load the data in parallel into each node.
D. Use a single COPY command to load the data into the Amazon Redshift cluster.
Suggested answer: D

A single COPY command that references all of the files (for example, through a common Amazon S3 prefix or a manifest) lets Amazon Redshift split the load across every slice in the cluster and ingest the files in parallel, which is the documented best practice for maximum throughput. Issuing multiple concurrent COPY or LOAD commands forces Redshift to serialize the loads, and staging the data in HDFS adds an unnecessary hop.
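As a rough illustration of option D, the sketch below submits one COPY statement over an S3 prefix through the Amazon Redshift Data API with boto3. The cluster identifier, database, user, bucket, IAM role, table name, and file format are placeholders assumed for the example, not values given in the question.

import boto3

# Hypothetical identifiers -- replace with your own cluster, database, bucket, and IAM role.
CLUSTER_ID = "retail-dwh-cluster"
DATABASE = "dev"
DB_USER = "awsuser"
IAM_ROLE = "arn:aws:iam::123456789012:role/RedshiftCopyRole"
S3_PREFIX = "s3://retail-staging-bucket/fact_sales/"

# One COPY referencing the common S3 prefix lets Redshift distribute the
# hundreds of files across all slices and load them in parallel.
# The CSV/GZIP options are assumptions; match them to the actual file format.
copy_sql = f"""
    COPY fact_sales
    FROM '{S3_PREFIX}'
    IAM_ROLE '{IAM_ROLE}'
    FORMAT AS CSV
    GZIP;
"""

client = boto3.client("redshift-data")

# The Data API runs the statement asynchronously and returns a statement Id
# that can later be polled with describe_statement to check completion.
response = client.execute_statement(
    ClusterIdentifier=CLUSTER_ID,
    Database=DATABASE,
    DbUser=DB_USER,
    Sql=copy_sql,
)
print("Submitted COPY, statement id:", response["Id"])

The same single-COPY idea applies if the statement is run from a SQL client instead of the Data API; the key point is one COPY command covering all the files rather than one command per file or per node.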
Asked 16/09/2024 by Ramon Lim