Question 81 - BDS-C00 discussion
A real-time bidding company is rebuilding their monolithic application and is focusing on serving real-time data. A large number of reads and writes are generated from thousands of concurrent users who follow items and bid on the company's sale offers.
The company is experiencing high latency during special event spikes, with millions of concurrent users.
The company needs to analyze and aggregate a part of the data in near real time to feed an internal dashboard.
What is the BEST approach for serving and analyzing data, considering the constraint of low latency on the highly demanded data?
A.
Use Amazon Aurora with Multi Availability Zone and read replicas. Use Amazon ElastiCache in front of the read replicas to serve read-only content quickly. Use the same database as the data source for the dashboard.
B.
Use Amazon DynamoDB to store real-time data with Amazon DynamoDB Accelerator (DAX) to serve content quickly. Use Amazon DynamoDB Streams to replay all changes to the table, then process and stream them to Amazon Elasticsearch Service with AWS Lambda.
C.
Use Amazon RDS with Multi Availability Zone and a Provisioned IOPS EBS volume for storage. Enable up to five read replicas to serve read-only content quickly. Use Amazon EMR with Sqoop to import Amazon RDS data into HDFS for analysis.
D.
Use Amazon Redshift with a DC2 node type and a multi-node cluster. Create an Amazon EC2 instance with pgpool installed. Create an Amazon ElastiCache cluster and route read requests through pgpool, and use Amazon Redshift for analysis.
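For context on the pattern described in option B, here is a minimal sketch of an AWS Lambda handler that consumes DynamoDB Streams records and indexes them into an Amazon Elasticsearch Service domain for near-real-time dashboard aggregation. The environment variable names, endpoint, and index name are illustrative assumptions, not part of the question.

# Sketch of option B's streaming leg: DynamoDB Streams -> Lambda -> Elasticsearch.
# ES_ENDPOINT and INDEX_NAME below are hypothetical placeholders.
import os

import boto3
from boto3.dynamodb.types import TypeDeserializer
from elasticsearch import Elasticsearch, RequestsHttpConnection
from requests_aws4auth import AWS4Auth

REGION = os.environ.get("AWS_REGION", "us-east-1")
ES_ENDPOINT = os.environ["ES_ENDPOINT"]          # e.g. the domain's search endpoint host name
INDEX_NAME = os.environ.get("INDEX_NAME", "bids")

# Sign requests to the Amazon ES domain with the Lambda role's credentials.
credentials = boto3.Session().get_credentials()
aws_auth = AWS4Auth(credentials.access_key, credentials.secret_key,
                    REGION, "es", session_token=credentials.token)

es = Elasticsearch(
    hosts=[{"host": ES_ENDPOINT, "port": 443}],
    http_auth=aws_auth,
    use_ssl=True,
    verify_certs=True,
    connection_class=RequestsHttpConnection,
)

deserializer = TypeDeserializer()


def lambda_handler(event, context):
    """Replay each stream record into Elasticsearch so the internal
    dashboard can aggregate bids in near real time."""
    for record in event["Records"]:
        # Build a document id from the table's key attributes.
        keys = record["dynamodb"]["Keys"]
        doc_id = "|".join(str(deserializer.deserialize(v)) for v in keys.values())

        if record["eventName"] == "REMOVE":
            es.delete(index=INDEX_NAME, id=doc_id, ignore=[404])
            continue

        # INSERT / MODIFY: convert the DynamoDB-typed image to plain JSON.
        image = record["dynamodb"]["NewImage"]
        document = {k: deserializer.deserialize(v) for k, v in image.items()}
        es.index(index=INDEX_NAME, id=doc_id, body=document)

    return {"processed": len(event["Records"])}

The same table keeps serving low-latency reads through DAX, while the stream consumer feeds the analytics index without adding read load to the primary data path.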