Amazon SAP-C01 Practice Test - Questions Answers, Page 63

A company is planning the migration of several lab environments used for software testing. An assortment of custom tooling is used to manage the test runs for each lab. The labs use immutable infrastructure for the software test runs, and the results are stored in a highly available SQL database cluster. Although completely rewriting the custom tooling is out of scope for the migration project, the company would like to optimize workloads during the migration. Which application migration strategy meets this requirement?

A. Re-host
B. Re-platform
C. Re-factor/re-architect
D. Retire
Suggested answer: B

Explanation:

Re-platforming keeps the core architecture and the custom test tooling intact while making targeted cloud optimizations (for example, moving the SQL database cluster to a managed database service), which matches the requirement to optimize workloads during the migration without rewriting the tooling. Re-hosting makes no optimizations, and re-factoring/re-architecting would require rewriting the tooling.

Reference:

https://aws.amazon.com/blogs/enterprise-strategy/6-strategies-for-migrating-applications-to-the-cloud/

While troubleshooting AWS CloudFormation, you need to view the cloud-init and cfn logs for more information about Amazon EC2 issues. Identify the directory to which these logs are published.

A. /var/opt/log/ec2
B. /var/log/lastlog
C. /var/log/
D. /var/log/ec2
Suggested answer: C

Explanation:

When you use AWS CloudFormation, you might encounter issues when you create, update, or delete AWS CloudFormation stacks. For Amazon EC2 issues, view the cloud-init and cfn logs. These logs are published on the Amazon EC2 instance in the /var/log/ directory. These logs capture processes and command outputs while AWS CloudFormation is setting up your instance. For Windows, view the EC2Configure service and cfn logs in %ProgramFiles%\Amazon\EC2ConfigService and C:\cfn\log. You can also configure your AWS CloudFormation template so that the logs are published to Amazon CloudWatch, which displays logs in the AWS Management Console so you don't have to connect to your Amazon EC2 instance.

Reference: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/troubleshooting.html
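
As a rough illustration only (not part of the exam answer), the following minimal Python sketch prints the tail of the CloudFormation-related logs on a Linux instance. The candidate file names below are assumptions and vary by AMI and template, and reading /var/log typically requires root privileges.

from pathlib import Path

LOG_DIR = Path("/var/log")
# Assumed typical log file names; adjust to what your AMI actually writes.
CANDIDATE_LOGS = ["cloud-init.log", "cloud-init-output.log", "cfn-init.log", "cfn-wire.log"]

def tail_cfn_logs(lines: int = 50) -> None:
    # Print the last `lines` lines of each CloudFormation-related log that exists.
    for name in CANDIDATE_LOGS:
        path = LOG_DIR / name
        if path.exists():
            print(f"===== {path} =====")
            print("\n".join(path.read_text(errors="replace").splitlines()[-lines:]))

if __name__ == "__main__":
    tail_cfn_logs()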

Which of the following steps must be completed before connecting to Amazon Virtual Private Cloud (Amazon VPC) using AWS Direct Connect?

A. Provide a public Autonomous System Number (ASN) to identify your network on the Internet.
B. Create a virtual private gateway and attach it to your Virtual Private Cloud (VPC).
C. Allocate a private IP address to your network in the 122.x.x.x range.
D. Provide a public IP address for each Border Gateway Protocol (BGP) session.
Suggested answer: B

Explanation:

To connect to Amazon Virtual Private Cloud (Amazon VPC) by using AWS Direct Connect, you must first do the following:

Provide a private Autonomous System Number (ASN) to identify your network on the Internet. Amazon then allocates a private IP address in the 169.x.x.x range to you.

Create a virtual private gateway and attach it to your VPC.

Reference: http://docs.aws.amazon.com/directconnect/latest/UserGuide/Welcome.html
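
Purely as an illustration of the prerequisite in option B, here is a minimal Python (boto3) sketch that creates a virtual private gateway and attaches it to a VPC. The VPC ID is a placeholder, and the calling principal is assumed to have the corresponding EC2 permissions.

import boto3

ec2 = boto3.client("ec2")

# Create the virtual private gateway (the VPC-side anchor for the Direct Connect private virtual interface).
vgw = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]

# Attach it to the target VPC (placeholder VPC ID).
ec2.attach_vpn_gateway(VpcId="vpc-0123456789abcdef0", VpnGatewayId=vgw["VpnGatewayId"])

# Confirm the attachment state before creating the private virtual interface.
print(ec2.describe_vpn_gateways(VpnGatewayIds=[vgw["VpnGatewayId"]])["VpnGateways"][0]["VpcAttachments"])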

A Development team is deploying new APIs as serverless applications within a company. The team is currently using the AWS Management Console to provision Amazon API Gateway, AWS Lambda, and Amazon DynamoDB resources. A Solutions Architect has been tasked with automating the future deployments of these serverless APIs.

How can this be accomplished?

A. Use AWS CloudFormation with a Lambda-backed custom resource to provision API Gateway. Use the AWS::DynamoDB::Table and AWS::Lambda::Function resources to create the Amazon DynamoDB table and Lambda functions. Write a script to automate the deployment of the CloudFormation template.
B. Use the AWS Serverless Application Model to define the resources. Upload a YAML template and application files to the code repository. Use AWS CodePipeline to connect to the code repository and to create an action to build using AWS CodeBuild. Use the AWS CloudFormation deployment provider in CodePipeline to deploy the solution.
C. Use AWS CloudFormation to define the serverless application. Implement versioning on the Lambda functions and create aliases to point to the versions. When deploying, configure weights to implement shifting traffic to the newest version, and gradually update the weights as traffic moves over.
D. Commit the application code to the AWS CodeCommit code repository. Use AWS CodePipeline and connect to the CodeCommit code repository. Use AWS CodeBuild to build and deploy the Lambda functions using AWS CodeDeploy. Specify the deployment preference type in CodeDeploy to gradually shift traffic over to the new version.
Suggested answer: B

Explanation:

Reference: https://aws.amazon.com/quickstart/architecture/serverless-cicd-for-enterprise/

https://awsquickstart.s3.amazonaws.com/quickstart-trek10-serverless-enterprise-cicd/doc/serverless-cicd-for-the-enterprise-on-theaws-cloud.pdf
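
For context only, the CloudFormation deployment provider in option B deploys the packaged SAM template through change sets. A minimal Python (boto3) sketch of the equivalent API calls is below; the stack name, change set name, and template URL are placeholders.

import boto3

cfn = boto3.client("cloudformation")

STACK = "my-serverless-api"     # placeholder stack name
CHANGE_SET = "pipeline-deploy"  # placeholder change set name

# SAM templates need CAPABILITY_AUTO_EXPAND so the Serverless transform can run.
cfn.create_change_set(
    StackName=STACK,
    ChangeSetName=CHANGE_SET,
    TemplateURL="https://s3.amazonaws.com/<bucket>/packaged-template.yaml",  # placeholder
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_AUTO_EXPAND"],
    ChangeSetType="UPDATE",
)
cfn.get_waiter("change_set_create_complete").wait(StackName=STACK, ChangeSetName=CHANGE_SET)
cfn.execute_change_set(StackName=STACK, ChangeSetName=CHANGE_SET)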

A company is running a batch analysis every hour on their main transactional database, which runs on an RDS MySQL instance, to populate their central data warehouse running on Redshift. During the execution of the batch, their transactional applications are very slow. When the batch completes, they need to update the top management dashboard with the new data. The dashboard is produced by another system running on-premises that is currently started when a manually sent email notifies that an update is required. The on-premises system cannot be modified because it is managed by another team. How would you optimize this scenario to solve performance issues and automate the process as much as possible?

A. Replace RDS with Redshift for the batch analysis and use SNS to notify the on-premises system to update the dashboard.
B. Replace RDS with Redshift for the batch analysis and use SQS to send a message to the on-premises system to update the dashboard.
C. Create an RDS Read Replica for the batch analysis and use SNS to notify the on-premises system to update the dashboard.
D. Create an RDS Read Replica for the batch analysis and use SQS to send a message to the on-premises system to update the dashboard.
Suggested answer: C
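
Roughly, offloading the hourly batch onto a read replica and replacing the manually sent email with an SNS notification could look like the Python (boto3) sketch below. The instance identifiers and topic ARN are placeholders, and the topic is assumed to have an email subscription that the on-premises team already watches.

import boto3

rds = boto3.client("rds")
sns = boto3.client("sns")

# One-time setup: create a read replica so the hourly batch reads from the
# replica instead of the primary transactional instance (placeholder identifiers).
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="transactional-db-replica",
    SourceDBInstanceIdentifier="transactional-db",
)

# After each batch load into Redshift completes, publish a notification.
# An email subscription on this topic replaces the manually sent email.
sns.publish(
    TopicArn="arn:aws:sns:us-east-1:123456789012:dashboard-refresh",  # placeholder ARN
    Subject="Data warehouse refresh complete",
    Message="The hourly batch load has finished; the dashboard can be updated.",
)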

A large real-estate brokerage is exploring the option of adding a cost-effective, location-based alert to their existing mobile application. The application backend infrastructure currently runs on AWS. Users who opt in to this service will receive alerts on their mobile device regarding real-estate offers in proximity to their location. For the alerts to be relevant, delivery time needs to be in the low minute count. The existing mobile app has 5 million users across the US. Which one of the following architectural suggestions would you make to the customer?

A. The mobile application will submit its location to a web service endpoint utilizing Elastic Load Balancing and EC2 instances. DynamoDB will be used to store and retrieve relevant offers. EC2 instances will communicate with mobile carriers/device providers to push alerts back to the mobile application.
B. Use AWS Direct Connect or VPN to establish connectivity with mobile carriers. EC2 instances will receive the mobile application's location through the carrier connection; RDS will be used to store and retrieve relevant offers. EC2 instances will communicate with mobile carriers to push alerts back to the mobile application.
C. The mobile application will send device location using SQS. EC2 instances will retrieve the relevant offers from DynamoDB. AWS Mobile Push will be used to send offers to the mobile application.
D. The mobile application will send device location using AWS Mobile Push. EC2 instances will retrieve the relevant offers from DynamoDB. EC2 instances will communicate with mobile carriers/device providers to push alerts back to the mobile application.
Suggested answer: C
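
A simplified Python (boto3) sketch of the two messaging legs in option C is shown below: the app drops its location onto SQS, and a backend worker later pushes an offer through SNS mobile push. The queue URL, device endpoint ARN, and payload fields are placeholders.

import boto3
import json

sqs = boto3.client("sqs")
sns = boto3.client("sns")

# Leg 1: the mobile app (or an API fronting it) enqueues the device location (placeholder queue URL).
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/device-locations",
    MessageBody=json.dumps({"deviceId": "abc-123", "lat": 40.7128, "lon": -74.0060}),
)

# Leg 2: after EC2 workers look up nearby offers in DynamoDB, push the alert to the
# device's SNS platform endpoint (placeholder endpoint ARN).
sns.publish(
    TargetArn="arn:aws:sns:us-east-1:123456789012:endpoint/APNS/realestate-app/<endpoint-id>",
    MessageStructure="json",
    Message=json.dumps({"default": "A new real-estate offer is available near you."}),
)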

A software company is using three AWS accounts for each of its 10 development teams. The company has developed an AWS CloudFormation standard VPC template that includes three NAT gateways. The template is added to each account for each team. The company is concerned that network costs will increase each time a new development team is added. A solutions architect must maintain the reliability of the company’s solutions and minimize operational complexity.

What should the solutions architect do to reduce the network costs while meeting these requirements?

A. Create a single VPC with three NAT gateways in a shared services account. Configure each account VPC with a default route through a transit gateway to the NAT gateway in the shared services account VPC. Remove all NAT gateways from the standard VPC template.
B. Create a single VPC with three NAT gateways in a shared services account. Configure each account VPC with a default route through a VPC peering connection to the NAT gateway in the shared services account VPC. Remove all NAT gateways from the standard VPC template.
C. Remove two NAT gateways from the standard VPC template. Rely on the NAT gateway SLA to cover reliability for the remaining NAT gateway.
D. Create a single VPC with three NAT gateways in a shared services account. Configure a Site-to-Site VPN connection from each account to the shared services account. Remove all NAT gateways from the standard VPC template.
Suggested answer: A

Explanation:

A NAT gateway cannot be used by resources on the other side of a VPC peering connection or a Site-to-Site VPN connection, so options B and D do not work. Routing each account VPC's default route through a transit gateway to the NAT gateways in a shared services VPC centralizes the NAT gateways, keeps the design reliable, and stops network costs from growing with every new team account.
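
As a minimal Python (boto3) sketch of the centralized NAT pattern, each spoke VPC is attached to the transit gateway and given a default route toward it; all of the IDs below are placeholders, and the transit gateway route tables on the shared services side still need to point 0.0.0.0/0 at the NAT gateway VPC attachment.

import boto3

ec2 = boto3.client("ec2")

# Attach a team (spoke) VPC to the transit gateway (placeholder IDs).
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId="tgw-0123456789abcdef0",
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-0123456789abcdef0"],
)

# Send the spoke VPC's internet-bound traffic to the transit gateway, which forwards
# it to the NAT gateways in the shared services VPC (placeholder route table ID).
ec2.create_route(
    RouteTableId="rtb-0123456789abcdef0",
    DestinationCidrBlock="0.0.0.0/0",
    TransitGatewayId="tgw-0123456789abcdef0",
)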

What is the role of the PollForTask action when it is called by a task runner in AWS Data Pipeline?

A. It is used to retrieve the pipeline definition.
B. It is used to report the progress of the task runner to AWS Data Pipeline.
C. It is used to receive a task to perform from AWS Data Pipeline.
D. It is used to inform AWS Data Pipeline of the outcome when the task runner completes a task.
Suggested answer: C

Explanation:

Task runners call PollForTask to receive a task to perform from AWS Data Pipeline. If tasks are ready in the work queue, PollForTask returns a response immediately. If no tasks are available in the queue, PollForTask uses long-polling and holds on to a poll connection for up to 90 seconds, during which time any newly scheduled tasks are handed to the task agent. Your remote worker should not call PollForTask again on the same worker group until it receives a response, and this may take up to 90 seconds.

Reference: http://docs.aws.amazon.com/datapipeline/latest/APIReference/API_PollForTask.html
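
A minimal Python (boto3) sketch of a task runner's polling step is shown below; the worker group name and hostname are placeholders.

import boto3

dp = boto3.client("datapipeline")

# Long poll for work: if the queue is empty, this call holds the connection
# for up to 90 seconds before returning without a task.
response = dp.poll_for_task(workerGroup="wg-lab-runner", hostname="task-runner-01")

task = response.get("taskObject")
if task:
    print("Received task:", task["taskId"])
    # ...perform the task, then report back with ReportTaskProgress / SetTaskStatus...
else:
    print("No task available; poll again.")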

A company’s solutions architect is analyzing costs of a multi-application environment. The environment is deployed across multiple Availability Zones in a single AWS Region. After a recent acquisition, the company manages two organizations in AWS Organizations. The company has created multiple service provider applications as AWS PrivateLink-powered VPC endpoint services in one organization. The company has created multiple service consumer applications in the other organization.

Data transfer charges are much higher than the company expected, and the solutions architect needs to reduce the costs. The solutions architect must recommend guidelines for developers to follow when they deploy services. These guidelines must minimize data transfer charges for the whole environment. Which guidelines meet these requirements? (Choose two.)

A. Use AWS Resource Access Manager to share the subnets that host the service provider applications with other accounts in the organization.
B. Place the service provider applications and the service consumer applications in AWS accounts in the same organization.
C. Turn off cross-zone load balancing for the Network Load Balancer in all service provider application deployments.
D. Ensure that service consumer compute resources use the Availability Zone-specific endpoint service by using the endpoint’s local DNS name.
E. Create a Savings Plan that provides adequate coverage for the organization’s planned inter-Availability Zone data transfer usage.
Suggested answer: A, C

Explanation:

Reference: https://docs.aws.amazon.com/ram/latest/userguide/getting-started-sharing.html

https://docs.aws.amazon.com/elasticloadbalancing/latest/classic/enable-disable-crosszone-lb.html
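
To illustrate guideline C only, a short Python (boto3) sketch that disables cross-zone load balancing on the Network Load Balancer behind an endpoint service is shown below; the load balancer ARN is a placeholder.

import boto3

elbv2 = boto3.client("elbv2")

# Keep traffic within the consumer's Availability Zone by turning off cross-zone
# load balancing on the provider's NLB (placeholder ARN).
elbv2.modify_load_balancer_attributes(
    LoadBalancerArn="arn:aws:elasticloadbalancing:us-east-1:123456789012:loadbalancer/net/provider-nlb/1234567890abcdef",
    Attributes=[{"Key": "load_balancing.cross_zone.enabled", "Value": "false"}],
)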

A company is running an Apache Hadoop cluster on Amazon EC2 instances. The Hadoop cluster stores approximately 100 TB of data for weekly operational reports and allows occasional access for data scientists to retrieve data. The company needs to reduce the cost and operational complexity for storing and serving this data.

Which solution meets these requirements in the MOST cost-effective manner?

A. Move the Hadoop cluster from EC2 instances to Amazon EMR. Allow data access patterns to remain the same.
B. Write a script that resizes the EC2 instances to a smaller instance type during downtime and resizes the instances to a larger instance type before the reports are created.
C. Move the data to Amazon S3 and use Amazon Athena to query the data for reports. Allow the data scientists to access the data directly in Amazon S3.
D. Migrate the data to Amazon DynamoDB and modify the reports to fetch data from DynamoDB. Allow the data scientists to access the data directly in DynamoDB.
Suggested answer: C
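
Once the data is in Amazon S3, the weekly report can be run as an Athena query; a minimal Python (boto3) sketch is shown below, with the database, table, query, and output location all placeholders.

import boto3

athena = boto3.client("athena")

# Run the weekly operational report directly against the data in S3 (placeholders throughout).
athena.start_query_execution(
    QueryString="SELECT report_week, COUNT(*) AS events FROM operational_events GROUP BY report_week",
    QueryExecutionContext={"Database": "hadoop_archive"},
    ResultConfiguration={"OutputLocation": "s3://<results-bucket>/athena-output/"},
    WorkGroup="primary",
)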