Amazon BDS-C00 Practice Test - Questions Answers, Page 9

A real-time bidding company is rebuilding their monolithic application and is focusing on serving real-time data. A large number of reads and writes are generated from thousands of concurrent users who follow items and bid on the company's sale offers.

The company is experiencing high latency during special event spikes, with millions of concurrent users.

The company needs to analyze and aggregate a part of the data in near real time to feed an internal dashboard.

What is the BEST approach for serving and analyzing data, considering the constraint of low latency on the highly demanded data?

A.
Use Amazon Aurora with Multi-AZ and read replicas. Use Amazon ElastiCache in front of the read replicas to serve read-only content quickly. Use the same database as the data source for the dashboard.
B.
Use Amazon DynamoDB to store real-time data, with Amazon DynamoDB Accelerator (DAX) to serve content quickly. Use Amazon DynamoDB Streams to replay all changes to the table, then process and stream them to Amazon Elasticsearch Service with AWS Lambda.
C.
Use Amazon RDS with Multi-AZ and a Provisioned IOPS EBS volume for storage. Enable up to five read replicas to serve read-only content quickly. Use Amazon EMR with Sqoop to import Amazon RDS data into HDFS for analysis.
D.
Use Amazon Redshift with a DC2 node type and a multi-node cluster. Create an Amazon EC2 instance with pgpool installed. Create an Amazon ElastiCache cluster and route read requests through pgpool, and use Amazon Redshift for analysis.
Suggested answer: D
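
For reference, the DynamoDB Streams fan-out described in option B is usually implemented as an AWS Lambda function subscribed to the table's stream. Below is a minimal Python sketch of such a function; the Elasticsearch endpoint, index name, and key attribute are placeholders rather than details from the question.

import requests  # assumes the requests library is bundled with the Lambda deployment package

ES_ENDPOINT = "https://search-example-domain.us-east-1.es.amazonaws.com"  # placeholder domain
ES_INDEX = "offers"  # placeholder index name

def handler(event, context):
    # Triggered by a DynamoDB Streams event source mapping on the table.
    for record in event["Records"]:
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        image = record["dynamodb"]["NewImage"]
        # Strip the DynamoDB attribute-type descriptors ({"S": ...}, {"N": ...}) for indexing.
        doc = {attr: list(value.values())[0] for attr, value in image.items()}
        doc_id = record["dynamodb"]["Keys"]["item_id"]["S"]  # "item_id" is an assumed key attribute
        # Request signing/authentication against the Elasticsearch domain is omitted here.
        requests.put(f"{ES_ENDPOINT}/{ES_INDEX}/_doc/{doc_id}", json=doc, timeout=5)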

A gas company needs to monitor gas pressure in its pipelines. Pressure data is streamed from sensors placed throughout the pipelines so it can be monitored in real time. When an anomaly is detected, the system must send a notification to open a valve. An Amazon Kinesis stream collects the data from the sensors, and an anomaly Kinesis stream triggers an AWS Lambda function to open the appropriate valve. Which solution is the MOST cost-effective for responding to anomalies in real time?

A.
Attach a Kinesis Firehose to the stream and persist the sensor data in an Amazon S3 bucket. Schedule an AWS Lambda function to run a query in Amazon Athena against the data in Amazon S3 to identify anomalies. When a change is detected, the Lambda function sends a message to the anomaly stream to open the valve.
B.
Launch an Amazon EMR cluster that uses Spark Streaming to connect to the Kinesis stream and Spark machine learning to detect anomalies. When a change is detected, the Spark application sends a message to the anomaly stream to open the valve.
C.
Launch a fleet of Amazon EC2 instances with a Kinesis Client Library application that consumes the stream and aggregates sensor data over time to identify anomalies. When an anomaly is detected, the application sends a message to the anomaly stream to open the valve.
D.
Create a Kinesis Analytics application by using the RANDOM_CUT_FOREST function to detect an anomaly. When the anomaly score that is returned from the function is outside of an acceptable range, a message is sent to the anomaly stream to open the valve.
Suggested answer: A
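
All four options share the downstream step stated in the question: the anomaly Kinesis stream triggers an AWS Lambda function that opens the appropriate valve. A minimal Python sketch of that consumer follows; the valve-control call and the message field name are placeholders.

import base64
import json

def open_valve(valve_id):
    # Placeholder: the actual actuator integration (e.g. a command to the pipeline
    # control system) is not specified in the question.
    print(f"Opening valve {valve_id}")

def handler(event, context):
    # Triggered by the anomaly Kinesis stream through an event source mapping.
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        open_valve(payload["valve_id"])  # "valve_id" is an assumed field in the anomaly message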

A gaming organization is developing a new game and would like to offer real-time competition to their users. The data architecture has the following characteristics:

The game application is writing events directly to Amazon DynamoDB from the user's mobile device.

Users from the website can access their statistics directly from DynamoDB.

The game servers are accessing DynamoDB to update the user's information.

The data science team extracts data from DynamoDB for various applications.

The engineering team has already agreed to the IAM roles and policies to use for the data science team and the application.

Which actions will provide the MOST security, while maintaining the necessary access to the website and game application? (Choose two.)

A.
Use Amazon Cognito user pool to authenticate to both the website and the game application.
B.
Use IAM identity federation to authenticate to both the website and the game application.
C.
Create an IAM policy with PUT permission for both the website and the game application.
D.
Create an IAM policy with fine-grained permission for both the website and the game application.
E.
Create an IAM policy with PUT permission for the game application and an IAM policy with GET permission for the website.
Suggested answer: B, E
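
Option E, part of the suggested answer, separates write and read permissions into two policies. The sketch below expresses that split as Python dictionaries; the table ARN and the exact DynamoDB actions are assumptions for illustration.

# Policy for the game application: write-only access (actions and ARN are illustrative).
game_app_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:PutItem", "dynamodb:UpdateItem"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/GameData",
        }
    ],
}

# Policy for the website: read-only access to the same table.
website_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/GameData",
        }
    ],
}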

An organization has 10,000 devices that generate 10 GB of telemetry data per day, with each record size around 10 KB. Each record has 100 fields, and one field consists of unstructured log data with a "String" data type in the English language. Some fields are required for the real-time dashboard, but all fields must be available for long-term trend generation.

The organization also has 10 PB of previously cleaned and structured data, partitioned by Date, in a SAN that must be migrated to AWS within one month. Currently, the organization does not have any real-time capabilities in their solution. Because of storage limitations in the on-premises data warehouse, selective data is loaded while generating the long-term trend with ANSI SQL queries through JDBC for visualization. In addition to the one-time data loading, the organization needs a cost-effective and real-time solution. How can these requirements be met? (Choose two.)

A.
Use AWS IoT to send data from devices to an Amazon SQS queue, create a set of workers in an Auto Scaling group, and read records in batches from the queue to process and save the data. Fan out to an Amazon SNS queue attached with an AWS Lambda function to filter the required dataset and save it to Amazon Elasticsearch Service for real-time analytics.
B.
Create a Direct Connect connection between AWS and the on-premises data center and copy the data to Amazon S3 using S3 Acceleration. Use Amazon Athena to query the data.
C.
Use AWS IoT to send the data from devices to Amazon Kinesis Data Streams with the IoT rules engine. Use one Kinesis Data Firehose stream attached to a Kinesis stream to batch and stream the data partitioned by date. Use another Kinesis Firehose stream attached to the same Kinesis stream to filter out the required fields to ingest into Elasticsearch for real-time analytics.
D.
Use AWS IoT to send the data from devices to Amazon Kinesis Data Streams with the IoT rules engine. Use one Kinesis Data Firehose stream attached to a Kinesis stream to stream the data into an Amazon S3 bucket partitioned by date. Attach an AWS Lambda function to the same Kinesis stream to filter out the required fields for ingestion into Amazon DynamoDB for real-time analytics.
E.
Use multiple AWS Snowball Edge devices to transfer data to Amazon S3, and use Amazon Athena to query the data.
Suggested answer: A, D
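
Option D, part of the suggested answer, relies on an AWS Lambda function that reads the Kinesis stream, keeps only the fields the dashboard needs, and writes them to DynamoDB. A rough Python sketch follows; the table name and field names are assumptions.

import base64
import json
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("TelemetryDashboard")  # assumed table name
# Assumed subset of the 100 fields that the dashboard actually needs.
DASHBOARD_FIELDS = ("device_id", "timestamp", "metric_value", "status")

def handler(event, context):
    # Triggered by the Kinesis stream; keeps only the dashboard fields.
    for record in event["Records"]:
        raw = base64.b64decode(record["kinesis"]["data"])
        # parse_float=Decimal because the DynamoDB resource API rejects Python floats.
        full_record = json.loads(raw, parse_float=Decimal)
        item = {f: full_record[f] for f in DASHBOARD_FIELDS if f in full_record}
        table.put_item(Item=item)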

An organization is designing a public web application and has a requirement that states all application users must be centrally authenticated before any operations are permitted. The organization will need to create a user table with fast data lookup for the application in which a user can read only his or her own data. All users already have an account with amazon.com. How can these requirements be met?

A.
Create an Amazon RDS Aurora table, with Amazon_ID as the primary key. The application uses amazon.com web identity federation to get a token that is used to assume an IAM role from AWS STS. Use IAM database authentication by using the rds:db-tag IAM authentication policy and GRANT Amazon RDS row-level read permission per user.
B.
Create an Amazon RDS Aurora table, with Amazon_ID as the primary key for each user. The application uses amazon.com web identity federation to get a token that is used to assume an IAM role. Use IAM database authentication by using the rds:db-tag IAM authentication policy and GRANT Amazon RDS row-level read permission per user.
C.
Create an Amazon DynamoDB table, with Amazon_ID as the partition key. The application uses amazon.com web identity federation to get a token that is used to assume an IAM role from AWS STS. In the role, use the IAM condition context key dynamodb:LeadingKeys with the IAM substitution variable ${www.amazon.com:user_id}, and allow the required DynamoDB API operations in the IAM JSON policy Action element for reading the records.
D.
Create an Amazon DynamoDB table, with Amazon_ID as the partition key. The application uses amazon.com web identity federation to assume an IAM role from AWS STS. In the role, use the IAM condition context key dynamodb:LeadingKeys with the IAM substitution variable ${www.amazon.com:user_id}, and allow the required DynamoDB API operations in the IAM JSON policy Action element for reading the records.
Suggested answer: C
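
The fine-grained access in the suggested answer comes from an IAM policy on the federated role that uses dynamodb:LeadingKeys with the ${www.amazon.com:user_id} substitution variable. A minimal sketch of such a policy, written here as a Python dictionary, follows; the region, account ID, table name, and action list are placeholders.

# Attached to the role assumed through amazon.com web identity federation.
user_read_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/UserStats",
            "Condition": {
                # Limits every request to items whose partition key (Amazon_ID)
                # matches the caller's federated amazon.com user ID.
                "ForAllValues:StringEquals": {
                    "dynamodb:LeadingKeys": ["${www.amazon.com:user_id}"]
                }
            },
        }
    ],
}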
