Amazon SAP-C01 Practice Test - Questions Answers, Page 7
List of questions
Question 61
A medical company is running an application in the AWS Cloud. The application simulates the effect of medical drugs in development. The application consists of two parts: configuration and simulation. The configuration part runs in AWS Fargate containers in an Amazon Elastic Container Service (Amazon ECS) cluster. The simulation part runs on large, compute optimized Amazon EC2 instances. Simulations can restart if they are interrupted.
The configuration part runs 24 hours a day with a steady load. The simulation part runs only for a few hours each night with a variable load. The company stores simulation results in Amazon S3, and researchers use the results for 30 days. The company must store simulations for 10 years and must be able to retrieve the simulations within 5 hours. Which solution meets these requirements MOST cost-effectively?
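One plausible mapping of the storage requirements onto an S3 lifecycle configuration (the `results/` prefix and exact day counts are illustrative assumptions): results stay in S3 Standard for the 30 days researchers use them, then transition to S3 Glacier Flexible Retrieval, whose standard retrievals complete in 3–5 hours, and expire after roughly 10 years.

```python
# Sketch of an S3 lifecycle configuration for the simulation results.
# The "results/" prefix is an assumed naming convention, not from the question.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "simulation-results",
            "Filter": {"Prefix": "results/"},
            "Status": "Enabled",
            "Transitions": [
                # Glacier Flexible Retrieval: standard retrievals take 3-5 hours,
                # which satisfies the 5-hour retrieval requirement.
                {"Days": 30, "StorageClass": "GLACIER"},
            ],
            # Retain for about 10 years (approximated as 3,650 days), then expire.
            "Expiration": {"Days": 3650},
        }
    ]
}
print(lifecycle_configuration["Rules"][0]["Expiration"]["Days"])
```

A configuration like this could then be applied to the results bucket (for example with the S3 `PutBucketLifecycleConfiguration` API).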
Question 62
A company is currently running a production workload on AWS that is very I/O intensive. Its workload consists of a single tier with 10 c4.8xlarge instances, each with a 2 TB gp2 volume. The number of processing jobs has recently increased, and latency has increased as well. The team realizes that they are constrained on IOPS. For the application to perform efficiently, they need to increase the IOPS by 3,000 for each of the instances. Which of the following designs will meet the performance goal MOST cost-effectively?
Explanation:
Reference: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_Storage.html
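The IOPS arithmetic behind the question can be sketched as follows (taking 2 TB as 2,048 GiB; gp2's baseline is 3 IOPS per GiB, capped at 16,000):

```python
GP2_IOPS_PER_GIB = 3
GP2_MAX_IOPS = 16_000

volume_size_gib = 2_048                  # 2 TB gp2 volume, per instance
baseline_iops = min(volume_size_gib * GP2_IOPS_PER_GIB, GP2_MAX_IOPS)
required_iops = baseline_iops + 3_000    # the stated per-instance increase

print(baseline_iops, required_iops)      # 6144 9144
```

Since the required ~9,144 IOPS is still under the 16,000 IOPS that gp3 volumes can provision independently of size, one common cost-effective direction is gp3 with provisioned IOPS rather than larger gp2 volumes or io1.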
Question 63
A company that tracks medical devices in hospitals wants to migrate its existing storage solution to the AWS Cloud. The company equips all of its devices with sensors that collect location and usage information. This sensor data is sent in unpredictable patterns with large spikes. The data is stored in a MySQL database running on premises at each hospital. The company wants the cloud storage solution to scale with usage. The company’s analytics team uses the sensor data to calculate usage by device type and hospital. The team needs to keep analysis tools running locally while fetching data from the cloud. The team also needs to use existing Java application and SQL queries with as few changes as possible.
How should a solutions architect meet these requirements while ensuring the sensor data is secure?
Question 64
A company has an application. Once a month, the application creates a compressed file that contains every object within an Amazon S3 bucket. The total size of the objects before compression is 1 TB. The application runs by using a scheduled cron job on an Amazon EC2 instance that has a 5 TB Amazon Elastic Block Store (Amazon EBS) volume attached. The application downloads all the files from the source S3 bucket to the EBS volume, compresses them into a single file, and uploads that file to a target S3 bucket. Every invocation of the application takes 2 hours from start to finish. Which combination of actions should a solutions architect take to OPTIMIZE costs for this application? (Choose two.)
Question 65
A company has an internal AWS Elastic Beanstalk worker environment inside a VPC that must access an external payment gateway API available on an HTTPS endpoint on the public internet. Because of security policies, the payment gateway’s Application team can grant access to only one public IP address.
Which architecture will set up an Elastic Beanstalk environment to access the company’s application without making multiple changes on the company’s end?
Explanation:
Reference: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/vpc.html
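A common pattern for presenting exactly one public IP address is a NAT gateway bound to a single Elastic IP in a public subnet, with the worker environment's private subnets routing outbound traffic through it. A minimal sketch of the pieces involved (all identifiers below are placeholders, not real resources):

```python
# Placeholder identifiers; in practice these come from the VPC setup.
single_ip_egress = {
    # One Elastic IP -- this is the address the payment gateway team whitelists.
    "ElasticIP": {"Domain": "vpc"},
    # NAT gateway in a public subnet, bound to that Elastic IP.
    "NatGateway": {
        "SubnetId": "subnet-public-placeholder",
        "AllocationId": "eipalloc-placeholder",
    },
    # Default route for the private subnets through the NAT gateway, so all
    # outbound HTTPS calls to the gateway API share the one public IP.
    "PrivateDefaultRoute": {
        "DestinationCidrBlock": "0.0.0.0/0",
        "NatGatewayId": "nat-placeholder",
    },
}
print(single_ip_egress["PrivateDefaultRoute"]["DestinationCidrBlock"])
```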
Question 66
A solutions architect needs to implement a client-side encryption mechanism for objects that will be stored in a new Amazon S3 bucket. The solutions architect created a CMK that is stored in AWS Key Management Service (AWS KMS) for this purpose.
The solutions architect created the following IAM policy and attached it to an IAM role:
During tests, the solutions architect was able to successfully get existing test objects in the S3 bucket. However, attempts to upload a new object resulted in an error message. The error message stated that the action was forbidden. Which action must the solutions architect add to the IAM policy to meet all the requirements?
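Since reads already succeed, the role can presumably call `kms:Decrypt`; client-side encryption also generates a fresh data key for every upload, so a plausible missing permission is `kms:GenerateDataKey` on the CMK. A sketch of the additional statement (the key ARN is a placeholder):

```python
import json

# Hypothetical extra statement for the existing IAM policy; the account ID
# and key ID are placeholders, not values from the question.
additional_statement = {
    "Effect": "Allow",
    # Needed so the encryption client can request a new data key per upload.
    "Action": "kms:GenerateDataKey",
    "Resource": "arn:aws:kms:us-east-1:111122223333:key/placeholder-key-id",
}
print(json.dumps(additional_statement, indent=2))
```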
Question 67
An organization is setting up their website on AWS. The organization is working on various security measures to be performed on the AWS EC2 instances. Which of the below mentioned security mechanisms will not help the organization to avoid future data leaks and identify security weaknesses?
Explanation:
AWS security follows the shared responsibility model, in which the user is as responsible as Amazon. Because AWS is a public cloud, it is bound to be targeted by hackers. If an organization is planning to host its application on AWS EC2, it should perform the below mentioned security checks as a measure to find any security weaknesses or data leaks:

- Perform penetration testing, as attackers would, to find any vulnerabilities. The organization must obtain approval from AWS before performing penetration testing.
- Perform hardening testing to find whether any unnecessary ports are open.
- Perform SQL injection testing to find any database security issues.

Code memory checks, by contrast, are generally useful when the organization wants to improve application performance, not to identify security weaknesses.
Reference: http://aws.amazon.com/security/penetration-testing/
Question 68
When does an AWS Data Pipeline terminate the AWS Data Pipeline-managed compute resources?
Explanation:
Compute resources will be provisioned by AWS Data Pipeline when the first activity for a scheduled time that uses those resources is ready to run, and those instances will be terminated when the final activity that uses the resources has completed successfully or failed.
Reference: https://aws.amazon.com/datapipeline/faqs/
Question 69
You have just added a new instance to your Auto Scaling group, which receives ELB health checks. An ELB health check says the new instance's state is Out of Service. What does Auto Scaling do in this particular scenario?
Explanation:
If you have attached a load balancer to your Auto Scaling group, you can have Auto Scaling include the results of Elastic Load Balancing health checks when it determines the health status of an instance. After you add ELB health checks, Auto Scaling will mark an instance as unhealthy if Elastic Load Balancing reports the instance state as Out of Service. Frequently, an Auto Scaling instance that has just come into service needs to warm up before it can pass the Auto Scaling health check.
Auto Scaling waits until the health check grace period ends before checking the health status of the instance. While the EC2 status checks and ELB health checks can complete before the health check grace period expires, Auto Scaling does not act on them until the health check grace period expires. To provide ample warm-up time for your instances, ensure that the health check grace period covers the expected startup time for your application.
Reference: http://docs.aws.amazon.com/autoscaling/latest/userguide/healthcheck.html
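The settings the explanation refers to can be sketched as Auto Scaling group parameters (the values are illustrative; the grace period should cover your application's actual startup time):

```python
# Illustrative Auto Scaling group health-check settings.
asg_health_settings = {
    # Include ELB health check results, not just EC2 status checks.
    "HealthCheckType": "ELB",
    # Seconds to wait after launch before acting on health check results,
    # giving the new instance time to warm up and pass its first checks.
    "HealthCheckGracePeriod": 300,
}
print(asg_health_settings["HealthCheckType"])
```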
Question 70
A company has an application that uses Amazon EC2 instances in an Auto Scaling group. The Quality Assurance (QA) department needs to launch a large number of short-lived environments to test the application. The application environments are currently launched by the Manager of the department using an AWS CloudFormation template. To launch the stack, the Manager uses a role with permission to use CloudFormation, EC2, and Auto Scaling APIs. The Manager wants to allow testers to launch their own environments, but does not want to grant broad permissions to each user. Which set up would achieve these goals?
Explanation:
Reference: https://aws.amazon.com/ru/blogs/mt/how-to-launch-secure-and-governed-aws-resources-with-aws-cloudformation-and-aws-service-catalog/
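The Service Catalog approach from the reference can be sketched as a least-privilege IAM policy for the testers: they browse and launch a pre-approved product (backed by the Manager's CloudFormation template, with a launch constraint role holding the EC2 and Auto Scaling permissions) without being granted those broad permissions themselves. The action list below is an assumed minimal set:

```python
# Hypothetical least-privilege policy for the QA testers; resource scoping
# would normally be narrowed to the specific portfolio/product ARNs.
tester_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "servicecatalog:SearchProducts",
                "servicecatalog:DescribeProduct",
                "servicecatalog:ProvisionProduct",
                "servicecatalog:TerminateProvisionedProduct",
            ],
            "Resource": "*",
        }
    ],
}
print(len(tester_policy["Statement"][0]["Action"]))
```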