Amazon SAP-C02 Practice Test - Questions Answers, Page 29
Question 281

A company wants to manage the costs associated with a group of 20 applications that are infrequently used, but are still business-critical, by migrating to AWS. The applications are a mix of Java and Node.js spread across different instance clusters. The company wants to minimize costs while standardizing by using a single deployment methodology.
Most of the applications are part of month-end processing routines with a small number of concurrent users, but they are occasionally run at other times. Average application memory consumption is less than 1 GB, though some applications use as much as 2.5 GB of memory during peak processing. The most important application in the group is a billing report written in Java that accesses multiple data sources and often runs for several hours.
Which is the MOST cost-effective solution?
Explanation:
https://docs.aws.amazon.com/whitepapers/latest/serverless-architectures-lambda/timeout.html
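The linked whitepaper page is the key constraint: Lambda invocations are capped at 15 minutes, so the multi-hour billing report cannot run as a single Lambda function even though the memory numbers fit. A minimal boto3 sketch of that ceiling (the function name is hypothetical):

```python
import boto3

lambda_client = boto3.client("lambda")

# Lambda caps Timeout at 900 seconds (15 minutes): a billing report that
# runs for several hours cannot finish in one invocation, no matter how
# much memory is configured.
lambda_client.update_function_configuration(
    FunctionName="billing-report",   # hypothetical function name
    Timeout=900,                     # the hard upper limit
    MemorySize=3008,                 # comfortably above the 2.5 GB peak
)
```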
Question 282

During an audit, a security team discovered that a development team was embedding IAM user secret access keys in their code and then committing the code to an AWS CodeCommit repository. The security team wants to automatically find and remediate instances of this security vulnerability.
Which solution will ensure that the credentials are appropriately secured automatically?
Explanation:
CodeCommit may use Amazon S3 on the back end (and it also uses DynamoDB on the back end), but I don't think the repositories are stored in buckets that you can see or point Macie to. In fact, there are even solutions out there describing how to copy your repo from CodeCommit into S3 to back it up: https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/automate-event-driven-backups-from-codecommit-to-amazon-s3-using-codebuild-and-cloudwatch-events.html
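For context, an automated approach usually pairs an EventBridge rule on CodeCommit push events with a Lambda function that scans the new commit for key material. A rough sketch, assuming a hypothetical event shape; the regex only catches access key IDs, and real remediation would also deactivate the exposed key in IAM:

```python
import re
import boto3

codecommit = boto3.client("codecommit")

# AWS access key IDs start with AKIA; this deliberately narrow pattern
# avoids flagging every 40-character string in the diff.
ACCESS_KEY_RE = re.compile(r"AKIA[0-9A-Z]{16}")

def handler(event, context):
    # Hypothetical EventBridge payload for a CodeCommit push
    # ("CodeCommit Repository State Change", event "referenceUpdated").
    detail = event["detail"]
    repo = detail["repositoryName"]
    commit_id = detail["commitId"]

    diff = codecommit.get_differences(
        repositoryName=repo, afterCommitSpecifier=commit_id
    )
    for difference in diff["differences"]:
        blob = difference.get("afterBlob")
        if not blob:
            continue
        content = codecommit.get_blob(
            repositoryName=repo, blobId=blob["blobId"]
        )["content"]
        if ACCESS_KEY_RE.search(content.decode("utf-8", errors="ignore")):
            # Real remediation would deactivate the key via IAM and notify
            # the security team; printing stands in for that here.
            print(f"Possible committed credential in {repo}:{blob['path']}")
```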
Question 283

A company has an application that generates reports and stores them in an Amazon S3 bucket. When a user accesses their report, the application generates a signed URL to allow the user to download the report. The company's security team has discovered that the files are public and that anyone can download them without authentication. The company has suspended the generation of new reports until the problem is resolved.
Which set of actions will immediately remediate the security issue without impacting the application's normal workflow?
Explanation:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-control-block-public-access.html
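Because presigned URLs authenticate as the IAM principal that signed them, turning on S3 Block Public Access shuts off anonymous downloads without breaking the application's normal workflow. A minimal boto3 sketch, with a hypothetical bucket name:

```python
import boto3

s3 = boto3.client("s3")

# Enable all four Block Public Access settings; presigned URLs keep
# working because they are authenticated requests, not public access.
s3.put_public_access_block(
    Bucket="example-reports-bucket",  # hypothetical bucket name
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```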
Question 284

A large company recently experienced an unexpected increase in Amazon RDS and Amazon DynamoDB costs. The company needs to increase visibility into details of AWS Billing and Cost Management. There are various accounts associated with AWS Organizations, including many development and production accounts. There is no consistent tagging strategy across the organization, but there are guidelines in place that require all infrastructure to be deployed using AWS CloudFormation with consistent tagging. Management requires cost center numbers and project ID numbers for all existing and future DynamoDB tables and RDS instances.
Which strategy should the solutions architect provide to meet these requirements?
Explanation:
Using Tag Editor to remediate untagged resources is a best practice (page 14 of the AWS Tagging Best Practices whitepaper). However, that is where answer A stops: it doesn't address the requirement that "Management requires cost center numbers and project ID numbers for all existing and future DynamoDB tables and RDS instances." That is where answer C comes in, addressing that requirement with SCPs in the company's AWS Organization. AWS Tagging Best Practices:
https://d1.awsstatic.com/whitepapers/aws-tagging-best-practices.pdf
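A hedged sketch of the SCP side of answer C, assuming hypothetical tag keys cost-center and project-id: each statement denies creation when its tag is absent from the request (the Null condition), so both tags must be supplied on every new DynamoDB table or RDS instance:

```python
import json

import boto3

orgs = boto3.client("organizations")

REQUIRED_TAGS = ["cost-center", "project-id"]  # hypothetical tag keys

def deny_if_tag_missing(tag_key):
    # Deny the create calls whenever the request omits this tag; one
    # statement per tag means BOTH tags must be present to succeed.
    return {
        "Effect": "Deny",
        "Action": ["dynamodb:CreateTable", "rds:CreateDBInstance"],
        "Resource": "*",
        "Condition": {"Null": {f"aws:RequestTag/{tag_key}": "true"}},
    }

scp = {
    "Version": "2012-10-17",
    "Statement": [deny_if_tag_missing(t) for t in REQUIRED_TAGS],
}

orgs.create_policy(
    Content=json.dumps(scp),
    Description="Require cost center and project ID tags on create",
    Name="require-cost-allocation-tags",   # hypothetical policy name
    Type="SERVICE_CONTROL_POLICY",
)
```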
Question 285

A company is serving files to its customers through an SFTP server that is accessible over the internet. The SFTP server is running on a single Amazon EC2 instance with an Elastic IP address attached. Customers connect to the SFTP server through its Elastic IP address and use SSH for authentication. The EC2 instance also has an attached security group that allows access from all customer IP addresses.
A solutions architect must implement a solution to improve availability, minimize the complexity of infrastructure management, and minimize the disruption to customers who access files. The solution must not change the way customers connect.
Which solution will meet these requirements?
Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/aws-sftp-endpoint-type/
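The usual fix here is AWS Transfer Family: a managed SFTP endpoint of type VPC can reuse the instance's existing Elastic IP, so customers keep connecting to the same address with the same SSH keys. A boto3 sketch with hypothetical VPC, subnet, and allocation IDs:

```python
import boto3

transfer = boto3.client("transfer")

# A managed, multi-AZ SFTP endpoint that reuses the existing Elastic IP,
# so the customer-facing address and SSH authentication are unchanged.
transfer.create_server(
    Protocols=["SFTP"],
    IdentityProviderType="SERVICE_MANAGED",   # per-user SSH public keys
    EndpointType="VPC",
    EndpointDetails={
        "VpcId": "vpc-0123456789abcdef0",                       # hypothetical
        "SubnetIds": ["subnet-0123456789abcdef0"],              # hypothetical
        "AddressAllocationIds": ["eipalloc-0123456789abcdef0"], # the existing EIP
    },
)
```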
Question 286

A solutions architect is creating an application that stores objects in an Amazon S3 bucket. The solutions architect must deploy the application in two AWS Regions that will be used simultaneously. The objects in the two S3 buckets must remain synchronized with each other.
Which combination of steps will meet these requirements with the LEAST operational overhead?
(Select THREE.)
Explanation:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/MultiRegionAccessPointRequestRouting.html
https://stackoverflow.com/questions/60947157/aws-s3-replication-without-versioning
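Both buckets need versioning before replication can be configured, and each direction needs its own replication rule; a Multi-Region Access Point can then route requests to the nearest bucket. A one-direction sketch with hypothetical bucket names and role ARN (repeat with the buckets swapped for two-way sync):

```python
import boto3

s3 = boto3.client("s3")

# Versioning must be enabled on both buckets before replication works.
for bucket in ("reports-use1", "reports-euw1"):   # hypothetical names
    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

# One direction (reports-use1 -> reports-euw1); apply the mirror-image
# configuration on the other bucket for two-way synchronization.
s3.put_bucket_replication(
    Bucket="reports-use1",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/s3-replication",  # hypothetical
        "Rules": [{
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},
            "DeleteMarkerReplication": {"Status": "Enabled"},
            "Destination": {"Bucket": "arn:aws:s3:::reports-euw1"},
        }],
    },
)
```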
Question 287

A solutions architect is designing an application to accept timesheet entries from employees on their mobile devices. Timesheets will be submitted weekly, with most of the submissions occurring on Friday. The data must be stored in a format that allows payroll administrators to run monthly reports. The infrastructure must be highly available and scale to match the rate of incoming data and reporting requests.
Which combination of steps meets these requirements while minimizing operational overhead?
(Select TWO.)
Explanation:
https://aws.amazon.com/blogs/architecture/create-dynamic-contact-forms-for-s3-static-websites-using-aws-lambda-amazon-api-gateway-and-amazon-ses/
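The serverless pattern from the linked post maps onto this problem: API Gateway in front of a Lambda function that writes each submission to DynamoDB, which absorbs the Friday spike without capacity planning. A hypothetical handler sketch (table name and payload fields are assumptions):

```python
import json
from decimal import Decimal

import boto3

table = boto3.resource("dynamodb").Table("Timesheets")  # hypothetical name

def handler(event, context):
    # API Gateway proxy integration: the mobile app POSTs one weekly entry.
    # parse_float=Decimal because DynamoDB rejects Python floats.
    entry = json.loads(event["body"], parse_float=Decimal)
    table.put_item(Item={
        "employee_id": entry["employee_id"],   # partition key (assumed)
        "week_ending": entry["week_ending"],   # sort key, e.g. "2024-06-07"
        "hours": entry["hours"],
    })
    return {"statusCode": 201, "body": json.dumps({"status": "saved"})}
```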
Question 288

A company wants to send data from its on-premises systems to Amazon S3 buckets. The company created the S3 buckets in three different accounts. The company must send the data privately, without the data traveling across the internet. The company has no existing dedicated connectivity to AWS.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)
Explanation:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/privatelink-interface-endpoints.html#types-of-vpc-endpoints-for-s3
https://aws.amazon.com/premiumsupport/knowledge-center/s3-bucket-access-direct-connect/
Use a private IP address over Direct Connect (with an interface VPC endpoint). To access Amazon S3 using a private IP address over Direct Connect, perform the following steps:
...
3. Create a private virtual interface for your connection.
...
5. Create an interface VPC endpoint for Amazon S3 in a VPC that is associated with the virtual private gateway. The VGW must connect to a Direct Connect private virtual interface. This interface VPC endpoint resolves to a private IP address even if you also enable a gateway VPC endpoint for S3.
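A hedged boto3 sketch of step 5, with hypothetical VPC, subnet, and security group IDs; note that S3 interface endpoints use endpoint-specific DNS names rather than the default private DNS:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Interface endpoint for S3 in the VPC that the Direct Connect private
# virtual interface reaches via the virtual private gateway. It resolves
# to private IPs that on-premises hosts can route to; a gateway endpoint
# cannot be reached from on premises.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",               # hypothetical IDs
    ServiceName="com.amazonaws.us-east-1.s3",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
)
```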
Question 289

A company operates quick-service restaurants. The restaurants follow a predictable model with high sales traffic for 4 hours daily. Sales traffic is lower outside of those peak hours.
The point of sale and management platform is deployed in the AWS Cloud and has a backend that is based on Amazon DynamoDB. The database table uses provisioned throughput mode with 100,000 RCUs and 80,000 WCUs to match known peak resource consumption.
The company wants to reduce its DynamoDB cost and minimize the operational overhead for the IT staff.
Which solution meets these requirements MOST cost-effectively?
Explanation:
https://aws.amazon.com/blogs/database/amazon-dynamodb-auto-scaling-performance-and-cost-optimization-at-any-scale/
"As you can see, there are compelling reasons to use DynamoDB auto scaling with actively changing traffic. Auto scaling responds quickly and simplifies capacity management, which lowers costs by scaling your table's provisioned capacity and reducing operational overhead."
Question 290

A company manages hundreds of AWS accounts centrally in an organization in AWS Organizations.
The company recently started to allow product teams to create and manage their own S3 access points in their accounts. The S3 access points must be accessible only from within VPCs, not from the internet.
What is the MOST operationally efficient way to enforce this requirement?
Explanation:
https://aws.amazon.com/s3/features/access-points/
https://aws.amazon.com/blogs/storage/managing-amazon-s3-access-with-vpc-endpoints-and-s3-access-points/
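The operationally efficient enforcement point is a single SCP at the organization root using the s3:AccessPointNetworkOrigin condition key, rather than per-account policies. A hedged sketch with a hypothetical policy name:

```python
import json

import boto3

orgs = boto3.client("organizations")

# Deny creation of any S3 access point whose network origin is not "VPC",
# enforced once for every account in the organization.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": "s3:CreateAccessPoint",
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {"s3:AccessPointNetworkOrigin": "VPC"}
        },
    }],
}

orgs.create_policy(
    Content=json.dumps(scp),
    Description="S3 access points must be restricted to a VPC",
    Name="vpc-only-s3-access-points",   # hypothetical policy name
    Type="SERVICE_CONTROL_POLICY",
)
```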