Question 84 - BDS-C00 discussion

An organization has 10,000 devices that generate 10 GB of telemetry data per day, with each record around 10 KB in size. Each record has 100 fields, and one field consists of unstructured log data with a "String" data type in the English language. Some fields are required for the real-time dashboard, but all fields must be available for long-term trend generation.

The organization also has 10 PB of previously cleaned and structured data, partitioned by date, in a SAN that must be migrated to AWS within one month. Currently, the organization does not have any real-time capabilities in its solution. Because of storage limitations in the on-premises data warehouse, only selected data is loaded while generating the long-term trend with ANSI SQL queries through JDBC for visualization. In addition to the one-time data loading, the organization needs a cost-effective and real-time solution. How can these requirements be met? (Choose two.)
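A quick back-of-the-envelope check of the figures above is useful when weighing the options: it sizes the streaming ingest rate and the sustained bandwidth a purely network-based migration of 10 PB would need inside the one-month window. This is only a rough sketch using decimal units, not part of the question itself.

# Rough sizing from the figures in the question (decimal units for simplicity).
DAY_SECONDS = 24 * 60 * 60

# Streaming side: 10 GB/day of telemetry at ~10 KB per record.
records_per_day = (10 * 10**9) / (10 * 10**3)        # 1,000,000 records/day
records_per_second = records_per_day / DAY_SECONDS   # ~11.6 records/sec
print(f"{records_per_day:,.0f} records/day ≈ {records_per_second:.1f} records/sec")

# Migration side: 10 PB must reach AWS within roughly one month.
MONTH_SECONDS = 30 * DAY_SECONDS
required_gbps = (10 * 10**15 * 8) / MONTH_SECONDS / 10**9
print(f"10 PB in 30 days needs a sustained ~{required_gbps:.0f} Gbps of throughput")

The streaming volume is modest (on the order of a dozen records per second), while moving 10 PB over the network in 30 days would require roughly 30 Gbps of sustained throughput for the entire month.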

A.
Use AWS IoT to send data from devices to an Amazon SQS queue, create a set of workers in an Auto Scaling group, and read records in batch from the queue to process and save the data. Fan out to an Amazon SNS queue attached with an AWS Lambda function to filter the request dataset and save it to Amazon Elasticsearch Service for real-time analytics.
B.
Create a Direct Connect connection between AWS and the on-premises data center and copy the data to Amazon S3 using S3 Acceleration. Use Amazon Athena to query the data.
C.
Use AWS IoT to send the data from devices to Amazon Kinesis Data Streams with the IoT rules engine. Use one Kinesis Data Firehose stream attached to a Kinesis stream to batch and stream the data partitioned by date. Use another Kinesis Firehose stream attached to the same Kinesis stream to filter out the required fields to ingest into Elasticsearch for real-time analytics.
D.
Use AWS IoT to send the data from devices to Amazon Kinesis Data Streams with the IoT rules engine. Use one Kinesis Data Firehose stream attached to a Kinesis stream to stream the data into an Amazon S3 bucket partitioned by date. Attach an AWS Lambda function with the same Kinesis stream to filter out the required fields for ingestion into Amazon DynamoDB for real-time analytics.
E.
Use multiple AWS Snowball Edge devices to transfer data to Amazon S3, and use Amazon Athena to query the data.
Suggested answer: A, D
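For context on the streaming half of option D, the sketch below shows one way to wire the IoT rules engine to a Kinesis data stream and attach a Kinesis Data Firehose delivery stream that batches the full records into S3 partitioned by date. All names (topic filter, stream, bucket, role ARNs) are illustrative placeholders, not values from the question.

import boto3

iot = boto3.client("iot")
firehose = boto3.client("firehose")

# IoT topic rule forwarding every telemetry message into a Kinesis data stream.
# Stream name, topic filter, and role ARN are illustrative placeholders.
iot.create_topic_rule(
    ruleName="telemetry_to_kinesis",
    topicRulePayload={
        "sql": "SELECT * FROM 'devices/+/telemetry'",
        "ruleDisabled": False,
        "actions": [{
            "kinesis": {
                "streamName": "telemetry-stream",
                "partitionKey": "${topic()}",
                "roleArn": "arn:aws:iam::111122223333:role/iot-to-kinesis",
            }
        }],
    },
)

# Firehose delivery stream reading the same Kinesis stream and batching all
# fields into S3 under a date-based prefix for long-term storage.
firehose.create_delivery_stream(
    DeliveryStreamName="telemetry-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": "arn:aws:kinesis:us-east-1:111122223333:stream/telemetry-stream",
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-read-kinesis",
    },
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-to-s3",
        "BucketARN": "arn:aws:s3:::telemetry-archive",
        "Prefix": "telemetry/dt=!{timestamp:yyyy-MM-dd}/",
        "ErrorOutputPrefix": "errors/!{firehose:error-output-type}/",
        "BufferingHints": {"SizeInMBs": 128, "IntervalInSeconds": 300},
    },
)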
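The real-time half of option D attaches a Lambda function to the same Kinesis stream to keep only the dashboard fields and write them to DynamoDB. A minimal handler might look like the following; the table name, key fields, and field list are assumptions for illustration.

import base64
import json
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
# Table name and dashboard field names are placeholders; the table is assumed
# to be keyed on device_id (partition key) and timestamp (sort key).
table = dynamodb.Table("telemetry-dashboard")
DASHBOARD_FIELDS = {"device_id", "timestamp", "temperature", "status"}


def handler(event, context):
    """Kinesis-triggered: keep only the fields the dashboard needs and write
    them to DynamoDB for low-latency reads."""
    with table.batch_writer() as batch:
        for record in event["Records"]:
            payload = json.loads(
                base64.b64decode(record["kinesis"]["data"]),
                parse_float=Decimal,  # DynamoDB rejects Python floats
            )
            item = {k: v for k, v in payload.items() if k in DASHBOARD_FIELDS}
            if item:
                batch.put_item(Item=item)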
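Options B and E both end with Athena over S3. Once the 10 PB of date-partitioned data is in S3 and registered as a partitioned table, the existing ANSI SQL/JDBC-style queries translate directly. The database, table, and bucket names below are made up for the sketch.

import boto3

athena = boto3.client("athena")

# Assumed setup: a Glue/Athena table "telemetry_archive" in database "analytics"
# over the migrated S3 data, partitioned by a dt=YYYY-MM-DD key.
query = """
SELECT dt, COUNT(*) AS records
FROM telemetry_archive
WHERE dt BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY dt
ORDER BY dt
"""

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(response["QueryExecutionId"])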
asked 16/09/2024 by Velli Mutham