Question 137 - ARA-C01 discussion


A company is designing a process for importing a large amount of IoT JSON data from cloud storage into Snowflake. New sets of IoT data are generated and uploaded approximately every 5 minutes.

Once the IoT data is in Snowflake, the company needs up-to-date information from an external vendor to join to the data. The joined data is then presented to users through a dashboard that shows different levels of aggregation. The external vendor is a Snowflake customer.

Which solution will MINIMIZE complexity and MAXIMIZE performance?

A.
1. Create an external table over the JSON data in cloud storage.
2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp.
3. Ask the vendor to expose an API so an external function can be used to generate a call to join the data back to the IoT data in the transformation procedure.
4. Give the dashboard tool access to the transformed table.
5. Perform the aggregations in the dashboard tool.
B.
1. Create an external table over the JSON data in cloud storage.
2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp.
3. Ask the vendor to create a data share with the required data that can be imported into the company's Snowflake account.
4. Join the vendor's data back to the IoT data using a transformation procedure.
5. Create views over the larger dataset to perform the aggregations required by the dashboard.
6. Give the dashboard tool access to the views.
C.
1. Create a Snowpipe to bring the JSON data into Snowflake.
2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.
3. Ask the vendor to expose an API so an external function call can be made to join the vendor's data back to the IoT data in a transformation procedure.
4. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.
5. Give the dashboard tool access to the materialized views.
D.
1. Create a Snowpipe to bring the JSON data into Snowflake.
2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.
3. Ask the vendor to create a data share with the required data that is then imported into the company's Snowflake account.
4. Join the vendor's data back to the IoT data in a transformation procedure.
5. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.
6. Give the dashboard tool access to the materialized views.
Suggested answer: D
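
As a rough illustration of steps 1-2 in answer D, the ingestion and triggering setup could look like the following Snowflake SQL sketch. All object names (stage, table, pipe, stream, task, warehouse, procedure) are hypothetical, and the storage integration plus the cloud event notifications needed for auto-ingest are assumed to be configured separately.

-- Hypothetical landing table for the raw JSON documents
CREATE TABLE raw_iot (payload VARIANT, file_name STRING);

-- External stage over the cloud storage location (storage integration assumed to exist)
CREATE STAGE iot_stage
  URL = 's3://example-bucket/iot/'
  STORAGE_INTEGRATION = iot_storage_int
  FILE_FORMAT = (TYPE = 'JSON');

-- Snowpipe with auto-ingest: cloud event notifications trigger a load as each file lands
CREATE PIPE iot_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_iot (payload, file_name)
  FROM (SELECT $1, METADATA$FILENAME FROM @iot_stage);

-- The stream records newly loaded rows; the task runs the transformation only when new data exists
CREATE STREAM iot_stream ON TABLE raw_iot;

CREATE TASK transform_iot
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('IOT_STREAM')
AS
  CALL transform_iot_proc();  -- hypothetical stored procedure (see the join sketch below)

ALTER TASK transform_iot RESUME;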

Explanation:

Using Snowpipe for continuous, automated ingestion minimizes manual intervention and makes the data available in Snowflake shortly after it is generated, while streams and tasks trigger the transformation only when new data actually arrives, avoiding fixed-schedule polling and saved-timestamp bookkeeping. Consuming the vendor's data through Snowflake Secure Data Sharing provides efficient, governed access to up-to-date vendor data without building or maintaining an external API integration. Materialized views keep the aggregations pre-computed, which is ideal for a dashboard that needs fast queries at different levels of aggregation.
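
For steps 3-4 of answer D, the vendor's share can be mounted as a read-only database and joined inside the transformation procedure. The sketch below is illustrative only: the share identifier, the target table iot_enriched, and the JSON field and column names are assumptions about data the question does not spell out.

-- Mount the vendor's share (provider account locator and share name are hypothetical)
CREATE DATABASE vendor_db FROM SHARE vendor_account.vendor_share;

-- Core of the transformation: flatten the JSON from the stream and join the shared vendor data
INSERT INTO iot_enriched
SELECT
    s.payload:device_id::STRING       AS device_id,
    s.payload:reading::FLOAT          AS reading,
    s.payload:event_ts::TIMESTAMP_NTZ AS event_ts,
    v.region,
    v.device_model
FROM iot_stream s
JOIN vendor_db.public.device_reference v
  ON v.device_id = s.payload:device_id::STRING;

Because the stream is consumed in a DML statement, its offset advances and each batch of newly ingested rows is processed only once.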

* Snowflake Documentation on Snowpipe

* Snowflake Documentation on Secure Data Sharing

* Best Practices for Data Ingestion with Snowflake
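
Continuing the sketch, steps 5-6 of answer D pre-aggregate the enriched data and expose it to the dashboard tool's role. The view definition, grouping columns, and role name below are illustrative; note that a Snowflake materialized view must reference a single table, so the aggregation is defined over the already-joined table rather than over the join itself.

-- Pre-aggregated, automatically maintained view over the enriched table
CREATE MATERIALIZED VIEW iot_daily_by_region AS
SELECT
    region,
    DATE_TRUNC('day', event_ts) AS event_day,
    COUNT(*)     AS readings,
    AVG(reading) AS avg_reading
FROM iot_enriched
GROUP BY region, DATE_TRUNC('day', event_ts);

-- The dashboard tool's role also needs USAGE on the containing database and schema
GRANT SELECT ON MATERIALIZED VIEW iot_daily_by_region TO ROLE dashboard_role;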

Asked 23/09/2024 by Martin White