
Amazon SAA-C03 Practice Test - Questions Answers, Page 72


A company has an application that runs on Amazon EC2 instances in a private subnet. The application needs to process sensitive information from an Amazon S3 bucket. The application must not use the internet to connect to the S3 bucket.

Which solution will meet these requirements?

A.
Configure an internet gateway. Update the S3 bucket policy to allow access from the internet gateway. Update the application to use the new internet gateway.
B.
Configure a VPN connection. Update the S3 bucket policy to allow access from the VPN connection. Update the application to use the new VPN connection.
C.
Configure a NAT gateway. Update the S3 bucket policy to allow access from the NAT gateway. Update the application to use the new NAT gateway.
D.
Configure a VPC endpoint. Update the S3 bucket policy to allow access from the VPC endpoint. Update the application to use the new VPC endpoint.
Suggested answer: D

Explanation:

Understanding the Requirement: The application running on EC2 instances in a private subnet needs to process sensitive information from an S3 bucket without using the internet.

Analysis of Options:

Internet Gateway: This would expose the application to the internet, which is not suitable for accessing sensitive information securely.

VPN Connection: A VPN provides secure connectivity between on-premises networks and AWS VPCs; it does not create a private path from the VPC to S3.

NAT Gateway: This allows instances in a private subnet to connect to the internet, but the goal is to avoid internet access.

VPC Endpoint: Provides a private connection between the VPC and S3 without using the internet, ensuring secure access to the S3 bucket.

Best Solution:

VPC Endpoint: Configuring a VPC endpoint allows secure, private communication between the EC2 instances and the S3 bucket without using the internet, ensuring data security and compliance.

Amazon VPC Endpoints

Amazon S3 VPC Endpoint
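As a concrete sketch of the recommended option, the bucket policy below denies all S3 access unless the request arrives through a specific gateway VPC endpoint, using the `aws:SourceVpce` condition key. This is a minimal illustration; the bucket name and endpoint ID are placeholders.

```python
import json

# Placeholder values; replace with your own bucket and endpoint ID.
BUCKET = "example-sensitive-bucket"
VPCE_ID = "vpce-0123456789abcdef0"

# Bucket policy that denies every S3 action unless the request comes
# through the specified gateway VPC endpoint.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyAccessExceptFromVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"StringNotEquals": {"aws:SourceVpce": VPCE_ID}},
        }
    ],
}

print(json.dumps(bucket_policy, indent=2))
```

With a gateway endpoint, the private subnet's route table sends S3 traffic to the endpoint, so no internet gateway or NAT device is needed on the path.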

A company uses Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS) to run its self-managed database. The company has 350 TB of data spread across all EBS volumes. The company takes daily EBS snapshots and keeps the snapshots for 1 month. The daily change rate is 5% of the EBS volumes.

Because of new regulations, the company needs to keep the monthly snapshots for 7 years. The company needs to change its backup strategy to comply with the new regulations and to ensure that data is available with minimal administrative effort.

Which solution will meet these requirements MOST cost-effectively?

A.
Keep the daily snapshots in the EBS snapshot standard tier for 1 month. Copy the monthly snapshot to Amazon S3 Glacier Deep Archive with a 7-year retention period.
B.
Continue with the current EBS snapshot policy. Add a new policy to move the monthly snapshot to Amazon EBS Snapshots Archive with a 7-year retention period.
C.
Keep the daily snapshots in the EBS snapshot standard tier for 1 month. Keep the monthly snapshots in the standard tier for 7 years. Use incremental snapshots.
D.
Keep the daily snapshots in the EBS snapshot standard tier. Use EBS direct APIs to take snapshots of all the EBS volumes every month. Store the snapshots in an Amazon S3 bucket in the Infrequent Access tier for 7 years.
Suggested answer: B

Explanation:

Understanding the Requirement: The company needs to keep daily EBS snapshots for 1 month and retain monthly snapshots for 7 years due to new regulations.

Analysis of Options:

S3 Glacier Deep Archive: Moving snapshots to S3 Glacier Deep Archive involves additional complexity and might not be the most straightforward approach for EBS snapshots.

EBS Snapshots Archive: This is a cost-effective solution designed specifically for long-term storage of EBS snapshots.

Standard Tier for 7 Years: Keeping snapshots in the standard tier for 7 years is more expensive and does not optimize costs.

EBS Direct APIs to S3: This involves additional operational overhead and is not the most cost-effective approach compared to using EBS Snapshots Archive.

Best Solution:

EBS Snapshots Archive: Adding a policy to move monthly snapshots to the EBS Snapshots Archive for long-term retention is the most cost-effective and administratively simple solution.

Amazon EBS Snapshots

Amazon EBS Snapshots Archive
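The scenario's numbers can be sized with some back-of-the-envelope arithmetic. This is only a rough sketch: real snapshot sizes depend on block-level change patterns, and archiving a snapshot converts it to a full copy in the archive tier.

```python
# Illustrative sizing for the scenario; real sizes vary with block churn.
full_tb = 350.0             # total data across the EBS volumes
daily_change = 0.05         # 5% daily change rate
daily_retention = 30        # daily snapshots kept ~1 month
monthly_retention = 7 * 12  # 84 monthly snapshots over 7 years

# Daily snapshots are incremental: each stores roughly the changed blocks.
incremental_per_day_tb = full_tb * daily_change            # ~17.5 TB/day
standard_tier_tb = full_tb + incremental_per_day_tb * daily_retention

# EBS Snapshots Archive stores each archived snapshot as a full copy,
# but at a much lower per-GB price than the standard tier.
archive_tier_tb = full_tb * monthly_retention

print(f"standard tier (full + 1 month of dailies): ~{standard_tier_tb:.0f} TB")
print(f"archive tier (84 monthly full copies):     ~{archive_tier_tb:.0f} TB")
```

The point of the comparison: keeping 84 monthly snapshots in the standard tier (option C) pays standard-tier rates on a very large footprint, while the archive tier trades slower restores for a much lower storage price on data that is only needed for compliance.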

A company is migrating five on-premises applications to VPCs in the AWS Cloud. Each application is currently deployed in isolated virtual networks on premises and should be deployed similarly in the AWS Cloud. The applications need to reach a shared services VPC. All the applications must be able to communicate with each other.

If the migration is successful, the company will repeat the migration process for more than 100 applications.

Which solution will meet these requirements with the LEAST administrative overhead?

A.
Deploy software VPN tunnels between the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets to the shared services VPC.
B.
Deploy VPC peering connections between the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets to the shared services VPC through the peering connection.
C.
Deploy an AWS Direct Connect connection between the application VPCs and the shared services VPC. Add routes from the application VPCs in their subnets to the shared services VPC and the application VPCs. Add routes from the shared services VPC subnets to the application VPCs.
D.
Deploy a transit gateway with associations between the transit gateway and the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets and the application VPCs to the shared services VPC through the transit gateway.
Suggested answer: D

Explanation:

Understanding the Requirement: The company needs to migrate applications to AWS, maintaining isolated networks while allowing communication with a shared services VPC and among the applications.

Analysis of Options:

Software VPN Tunnels: This approach involves high administrative overhead and complexity in managing multiple VPN connections.

VPC Peering: While suitable for smaller numbers of VPCs, it becomes complex and hard to manage at scale with over 100 applications.

Direct Connect: Primarily used for high-bandwidth, low-latency connections to on-premises networks, not inter-VPC communication.

Transit Gateway: Simplifies network management by acting as a central hub, allowing easy routing and scalability as more applications are migrated.

Best Solution:

Transit Gateway: This provides a scalable, efficient solution with minimal administrative overhead for managing network connections between multiple VPCs and the shared services VPC.

AWS Transit Gateway

Building a Transit Gateway
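The scaling argument can be made concrete by counting connections. With full-mesh VPC peering, connections grow quadratically with the number of VPCs; with a transit gateway hub, each VPC needs only one attachment. The counts below use the scenario's numbers (5 application VPCs plus 1 shared services VPC now, 100+ applications later).

```python
# Connection counts behind the "least administrative overhead" argument.
def full_mesh_peerings(n_vpcs):
    # Every VPC peered with every other: n * (n - 1) / 2 connections,
    # each with its own route table entries to maintain.
    return n_vpcs * (n_vpcs - 1) // 2

def transit_gateway_attachments(n_vpcs):
    # Hub-and-spoke: one transit gateway attachment per VPC.
    return n_vpcs

for n in (6, 101):
    print(f"{n} VPCs: {full_mesh_peerings(n)} peerings vs "
          f"{transit_gateway_attachments(n)} TGW attachments")
```

At 6 VPCs the difference is modest (15 peerings vs 6 attachments), but at 101 VPCs a full mesh would need 5,050 peering connections, which is why peering does not scale for this migration.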

A company has two AWS accounts: Production and Development. The company needs to push code changes in the Development account to the Production account. In the alpha phase, only two senior developers on the development team need access to the Production account. In the beta phase, more developers will need access to perform testing.

Which solution will meet these requirements?

A.
Create two policy documents by using the AWS Management Console in each account. Assign the policy to developers who need access.
B.
Create an IAM role in the Development account. Grant the IAM role access to the Production account. Allow developers to assume the role.
C.
Create an IAM role in the Production account. Define a trust policy that specifies the Development account. Allow developers to assume the role.
D.
Create an IAM group in the Production account. Add the group as a principal in a trust policy that specifies the Production account. Add developers to the group.
Suggested answer: C

Explanation:

Understanding the Requirement: Developers in the Development account need to push code changes to the Production account, with phased access control for different stages of the project.

Analysis of Options:

Policy Documents in Each Account: This approach increases complexity and is harder to manage compared to role-based access.

IAM Role in Development Account: Roles in the Development account cannot directly control access to resources in the Production account.

IAM Role in Production Account: Creating a role in the Production account with a trust policy that allows the Development account to assume it provides controlled, secure access.

IAM Group in Production Account: This approach does not provide the required cross-account access control.

Best Solution:

IAM Role in the Production Account: This method allows precise control over who can access the Production account from the Development account, with the ability to manage permissions and access levels effectively.

IAM Roles with Cross-Account Access

Creating a Role for Cross-Account Access
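The trust-policy half of the recommended option can be sketched as follows. The account ID is a placeholder; the role lives in the Production account and names the Development account as the trusted principal.

```python
import json

# Placeholder Development account ID for illustration.
DEV_ACCOUNT_ID = "111122223333"

# Trust policy attached to a role in the Production account: principals
# in the Development account may call sts:AssumeRole to obtain temporary
# Production credentials.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{DEV_ACCOUNT_ID}:root"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

This structure also handles the phased access requirement: in the alpha phase, only the two senior developers are granted `sts:AssumeRole` permission on the role's ARN inside the Development account; in the beta phase, more developers get that permission, with no change needed in the Production account.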

A robotics company is designing a solution for medical surgery. The robots will use advanced sensors, cameras, and AI algorithms to perceive their environment and to complete surgeries.

The company needs a public load balancer in the AWS Cloud that will ensure seamless communication with backend services. The load balancer must be capable of routing traffic based on the query strings to different target groups. The traffic must also be encrypted.

Which solution will meet these requirements?

A.
Use a Network Load Balancer with a certificate attached from AWS Certificate Manager (ACM). Use query parameter-based routing.
B.
Use a Gateway Load Balancer. Import a generated certificate in AWS Identity and Access Management (IAM). Attach the certificate to the load balancer. Use HTTP path-based routing.
C.
Use an Application Load Balancer with a certificate attached from AWS Certificate Manager (ACM). Use query parameter-based routing.
D.
Use a Network Load Balancer. Import a generated certificate in AWS Identity and Access Management (IAM). Attach the certificate to the load balancer. Use query parameter-based routing.
Suggested answer: C

Explanation:

Understanding the Requirement: The robotics company needs a public load balancer to ensure seamless communication with backend services, route traffic based on query strings, and encrypt traffic.

Analysis of Options:

Network Load Balancer with ACM Certificate: NLBs primarily operate at the transport layer (Layer 4) and do not natively support query parameter-based routing, which is a Layer 7 feature.

Gateway Load Balancer with IAM Certificate: Gateway Load Balancers are designed for deploying, scaling, and managing third-party virtual appliances and do not support HTTP path-based or query parameter-based routing.

Application Load Balancer with ACM Certificate: ALBs operate at the application layer (Layer 7), supporting features like query parameter-based routing and SSL/TLS termination with ACM certificates.

Network Load Balancer with IAM Certificate: As with the first option, NLBs do not support query parameter-based routing, making it unsuitable for this requirement.

Best Solution:

Application Load Balancer with ACM Certificate: This option provides the necessary Layer 7 routing capabilities and SSL/TLS termination to meet the requirements for query parameter-based routing and encrypted communication.

Application Load Balancer

AWS Certificate Manager
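The query-string routing feature looks like this in practice. The dictionaries below follow the shape of an Elastic Load Balancing v2 listener rule (`Field: "query-string"` with a `QueryStringConfig`); the parameter name, value, and target group ARN are made-up placeholders.

```python
# Sketch of an ALB listener rule that routes on a query-string parameter.
# Example request it would match: https://example.com/api?procedure=imaging
rule_conditions = [
    {
        "Field": "query-string",
        "QueryStringConfig": {
            "Values": [{"Key": "procedure", "Value": "imaging"}]
        },
    }
]

rule_actions = [
    {
        "Type": "forward",
        # Placeholder target group ARN.
        "TargetGroupArn": (
            "arn:aws:elasticloadbalancing:us-east-1:123456789012:"
            "targetgroup/imaging/0123456789abcdef"
        ),
    }
]

print(rule_conditions[0]["Field"])
```

A Network Load Balancer cannot express a rule like this: it forwards at Layer 4 and never inspects the query string, which is why options A and D fail the routing requirement even though an NLB can carry a TLS certificate.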

A company has multiple VPCs across AWS Regions to support and run workloads that are isolated from workloads in other Regions. Because of a recent application launch requirement, the company's VPCs must communicate with all other VPCs across all Regions.

Which solution will meet these requirements with the LEAST amount of administrative effort?

A.
Use VPC peering to manage VPC communication in a single Region. Use VPC peering across Regions to manage VPC communications.
B.
Use AWS Direct Connect gateways across all Regions to connect VPCs across Regions and manage VPC communications.
C.
Use AWS Transit Gateway to manage VPC communication in a single Region and Transit Gateway peering across Regions to manage VPC communications.
D.
Use AWS PrivateLink across all Regions to connect VPCs across Regions and manage VPC communications.
Suggested answer: C

Explanation:

Understanding the Requirement: The company needs to enable communication between VPCs across multiple AWS Regions with minimal administrative effort.

Analysis of Options:

VPC Peering: Managing multiple VPC peering connections across regions is complex and difficult to scale, leading to significant administrative overhead.

AWS Direct Connect Gateways: Primarily used for creating private connections between AWS and on-premises environments, not for inter-VPC communication across regions.

AWS Transit Gateway: Simplifies VPC interconnections within a region and supports Transit Gateway peering for cross-region connectivity, reducing administrative complexity.

AWS PrivateLink: Used for accessing AWS services and third-party services over a private connection, not for inter-VPC communication.

Best Solution:

AWS Transit Gateway with Transit Gateway Peering: This option provides a scalable and efficient solution for managing VPC communications both within a single region and across multiple regions with minimal administrative overhead.

AWS Transit Gateway

Transit Gateway Peering
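The multi-Region version of the scaling argument can also be counted. A full mesh of cross-Region VPC peerings grows quadratically in the total VPC count, while one transit gateway per Region needs one attachment per VPC plus a small mesh of inter-Region Transit Gateway peerings. The VPC and Region counts below are illustrative, not from the question.

```python
# Connection counts for the multi-Region case (numbers illustrative).
def vpc_peering_full_mesh(vpcs_per_region, regions):
    n = vpcs_per_region * regions
    return n * (n - 1) // 2  # every VPC peered with every other

def tgw_design(vpcs_per_region, regions):
    attachments = vpcs_per_region * regions      # one VPC attachment each
    tgw_peerings = regions * (regions - 1) // 2  # mesh between the TGWs only
    return attachments, tgw_peerings

print(vpc_peering_full_mesh(10, 4))  # peering connections to manage
print(tgw_design(10, 4))             # (attachments, TGW peerings)
```

With 10 VPCs in each of 4 Regions, full-mesh peering needs 780 connections; the transit gateway design needs 40 attachments plus 6 Transit Gateway peerings, which is the administrative-effort difference the explanation describes.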

A company has migrated a fleet of hundreds of on-premises virtual machines (VMs) to Amazon EC2 instances. The instances run a diverse fleet of Windows Server versions along with several Linux distributions. The company wants a solution that will automate inventory and updates of the operating systems. The company also needs a summary of common vulnerabilities of each instance for regular monthly reviews.

What should a solutions architect recommend to meet these requirements?

A.
Set up AWS Systems Manager Patch Manager to manage all the EC2 instances. Configure AWS Security Hub to produce monthly reports.
B.
Set up AWS Systems Manager Patch Manager to manage all the EC2 instances. Deploy Amazon Inspector, and configure monthly reports.
C.
Set up AWS Shield Advanced, and configure monthly reports. Deploy AWS Config to automate patch installations on the EC2 instances.
D.
Set up Amazon GuardDuty in the account to monitor all EC2 instances. Deploy AWS Config to automate patch installations on the EC2 instances.
Suggested answer: B

Explanation:

Understanding the Requirement: The company needs to automate inventory and updates of diverse OS versions on EC2 instances and summarize common vulnerabilities for monthly reviews.

Analysis of Options:

Systems Manager Patch Manager and Security Hub: Patch Manager automates patching, but Security Hub is more focused on compliance and security posture rather than inventory and vulnerability management.

Systems Manager Patch Manager and Amazon Inspector: Patch Manager automates OS updates, and Amazon Inspector provides vulnerability assessments, making this a comprehensive solution for the requirements.

AWS Shield Advanced and AWS Config: Shield Advanced is for DDoS protection, not suitable for OS patch management and vulnerability reporting.

Amazon GuardDuty and AWS Config: GuardDuty is for threat detection and monitoring, not specifically for patch management and vulnerability assessments.

Best Solution:

Systems Manager Patch Manager and Amazon Inspector: This combination automates OS updates and provides detailed vulnerability assessments, meeting both the inventory and security reporting needs effectively.

AWS Systems Manager Patch Manager

Amazon Inspector
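To make the Patch Manager half concrete: patching a mixed Windows/Linux fleet is driven by a patch baseline per OS family. The dictionary below follows the shape of a Systems Manager patch baseline definition; the name and approval values are illustrative assumptions, not from the question.

```python
# Hedged sketch of a patch baseline for the Windows Server instances;
# a similar baseline would exist per Linux distribution in the fleet.
windows_baseline = {
    "Name": "windows-server-baseline",      # illustrative name
    "OperatingSystem": "WINDOWS",
    "ApprovalRules": {
        "PatchRules": [
            {
                "PatchFilterGroup": {
                    "PatchFilters": [
                        {
                            "Key": "CLASSIFICATION",
                            "Values": ["SecurityUpdates", "CriticalUpdates"],
                        }
                    ]
                },
                # Auto-approve matching patches 7 days after release.
                "ApproveAfterDays": 7,
            }
        ]
    },
}

print(windows_baseline["Name"])
```

Patch Manager then applies the baseline on a schedule and records patch compliance per instance, while Amazon Inspector separately scans the same instances for CVEs and produces the vulnerability findings used in the monthly reviews.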

A company wants to use Amazon Elastic Container Service (Amazon ECS) to run its on-premises application in a hybrid environment. The application currently runs on containers on premises.

The company needs a single container solution that can scale in an on-premises, hybrid, or cloud environment. The company must run new application containers in the AWS Cloud and must use a load balancer for HTTP traffic.

Which combination of actions will meet these requirements? (Select TWO.)

A.
Set up an ECS cluster that uses the AWS Fargate launch type for the cloud application containers. Use an Amazon ECS Anywhere external launch type for the on-premises application containers.
B.
Set up an Application Load Balancer for cloud ECS services.
C.
Set up a Network Load Balancer for cloud ECS services.
D.
Set up an ECS cluster that uses the AWS Fargate launch type. Use Fargate for the cloud application containers and the on-premises application containers.
E.
Set up an ECS cluster that uses the Amazon EC2 launch type for the cloud application containers. Use Amazon ECS Anywhere with an AWS Fargate launch type for the on-premises application containers.
Suggested answer: A, B

Explanation:

Understanding the Requirement: The company needs a container solution that can scale across on-premises, hybrid, and cloud environments, with a load balancer for HTTP traffic.

Analysis of Options:

Fargate Launch Type and ECS Anywhere: Using Fargate for cloud-based containers and ECS Anywhere for on-premises containers provides a unified management experience across environments without needing to manage infrastructure.

Application Load Balancer: Suitable for HTTP traffic and can distribute requests to the ECS services, ensuring scalability and performance.

Network Load Balancer: Typically used for TCP/UDP traffic, not specifically optimized for HTTP traffic.

EC2 Launch Type for ECS and ECS Anywhere with Fargate: Involves managing infrastructure for EC2 instances, increasing operational overhead.

Best Combination of Solutions:

ECS with Fargate Launch Type and ECS Anywhere: This provides flexibility and scalability across hybrid environments with minimal operational overhead.

Application Load Balancer: Optimized for HTTP traffic, ensuring efficient load distribution and scaling for the ECS services.

Amazon ECS on AWS Fargate

Amazon ECS Anywhere

Application Load Balancer
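The two halves of the answer map to two ECS service definitions: the cloud service uses the `FARGATE` launch type behind an Application Load Balancer target group, and the on-premises workload runs under ECS Anywhere, which uses the `EXTERNAL` launch type on registered on-premises hosts. The names and ARN below are placeholders.

```python
# Sketch of the two service shapes implied by options A and B.
cloud_service = {
    "serviceName": "app-cloud",          # placeholder name
    "launchType": "FARGATE",             # serverless containers in AWS
    "loadBalancers": [
        {
            # Placeholder ALB target group ARN for HTTP traffic.
            "targetGroupArn": (
                "arn:aws:elasticloadbalancing:us-east-1:123456789012:"
                "targetgroup/app/0123456789abcdef"
            ),
            "containerName": "app",
            "containerPort": 80,
        }
    ],
}

onprem_service = {
    "serviceName": "app-onprem",
    # ECS Anywhere: tasks scheduled onto customer-managed hosts that were
    # registered as external instances; no load balancer attachment here.
    "launchType": "EXTERNAL",
}

print(cloud_service["launchType"], onprem_service["launchType"])
```

Note why option D fails: Fargate only runs in AWS, so it cannot host the on-premises containers; and option E inverts the launch types (ECS Anywhere is the external launch type, not Fargate).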

A company is migrating its workloads to AWS. The company has sensitive and critical data in on-premises relational databases that run on SQL Server instances. The company wants to use the AWS Cloud to increase security and reduce operational overhead for the databases. Which solution will meet these requirements?

A.
Migrate the databases to Amazon EC2 instances. Use an AWS Key Management Service (AWS KMS) AWS managed key for encryption.
B.
Migrate the databases to a Multi-AZ Amazon RDS for SQL Server DB instance. Use an AWS Key Management Service (AWS KMS) AWS managed key for encryption.
C.
Migrate the data to an Amazon S3 bucket. Use Amazon Macie to ensure data security.
D.
Migrate the databases to an Amazon DynamoDB table. Use Amazon CloudWatch Logs to ensure data security.
Suggested answer: B

Explanation:

Understanding the Requirement: The company needs to migrate sensitive and critical data from on-premises SQL Server databases to AWS, aiming to increase security and reduce operational overhead.

Analysis of Options:

EC2 Instances with KMS: Running SQL Server on EC2 provides control but requires significant operational overhead for management, backups, patching, and high availability.

Multi-AZ Amazon RDS for SQL Server with KMS: Amazon RDS for SQL Server offers managed database services, reducing operational overhead. Multi-AZ deployment provides high availability, and KMS encryption ensures data security.

Amazon S3 and Macie: S3 is not a suitable replacement for relational databases, and Macie is used for data security and compliance but not for database operations.

Amazon DynamoDB and CloudWatch Logs: DynamoDB is a NoSQL database and does not support SQL Server workloads directly. CloudWatch Logs are used for monitoring, not for ensuring database security.

Best Solution:

Multi-AZ Amazon RDS for SQL Server with KMS: This solution meets the requirements for security, high availability, and reduced operational overhead by using a managed database service with encryption.

Amazon RDS for SQL Server

AWS Key Management Service (KMS)
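The recommended configuration boils down to three settings on the DB instance: a SQL Server engine, Multi-AZ enabled, and storage encryption with the AWS managed KMS key. The parameter dictionary below is a sketch; the identifier, instance class, and storage size are illustrative.

```python
# Sketch of the key create parameters for option B (values illustrative).
db_instance_params = {
    "DBInstanceIdentifier": "reporting-sqlserver",  # placeholder name
    "Engine": "sqlserver-se",        # SQL Server Standard Edition
    "DBInstanceClass": "db.m5.xlarge",
    "AllocatedStorage": 500,         # GiB, illustrative
    "MultiAZ": True,                 # synchronous standby in a second AZ
    "StorageEncrypted": True,        # encrypts storage, snapshots, replicas
    "KmsKeyId": "alias/aws/rds",     # AWS managed key for RDS
    "LicenseModel": "license-included",
}

print(db_instance_params["Engine"])
```

Encryption must be chosen at creation time (an unencrypted RDS instance cannot be encrypted in place), which is another reason to set `StorageEncrypted` during the migration rather than after it.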

A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible.

The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud.

Which solution will meet these requirements with the LEAST operational overhead?

A.
Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
B.
Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
C.
Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
D.
Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.
Suggested answer: C

Explanation:

Understanding the Requirement: The company needs to transfer 50 TB of data to AWS with minimal operational overhead and no available network bandwidth for the transfer. The transformation job must continue running in the AWS Cloud.

Analysis of Options:

AWS DataSync and AWS Glue: DataSync is suitable for online data transfer, but there is no available network bandwidth. AWS Glue can be used for data transformation but does not solve the bandwidth issue.

AWS Snowcone: Snowcone offers only about 8 TB of usable storage per device, far too small for a 50 TB transfer, and it is not intended to host the transformation workload long term.

AWS Snowball Edge Storage Optimized with Glue: This device is designed for large data transfers. Copying the data to the device is straightforward, and AWS Glue can handle data transformation in the cloud.

AWS Snowball Edge Storage Optimized with EC2: This involves setting up EC2 instances for transformation, adding operational complexity compared to using AWS Glue.

Best Solution:

AWS Snowball Edge Storage Optimized with AWS Glue: This provides the least operational overhead for transferring large amounts of data and setting up the transformation job in the cloud.

AWS Snowball Edge

AWS Glue
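The case for an offline device can be checked with simple arithmetic: even on a dedicated high-speed link, 50 TB takes days to move, and this data center has no spare bandwidth at all. The link speeds and 80% utilization factor below are illustrative assumptions.

```python
# Rough transfer-time estimate for moving 50 TB over a network link.
def transfer_days(data_tb, link_gbps, utilization=0.8):
    bits = data_tb * 1e12 * 8                      # decimal TB -> bits
    seconds = bits / (link_gbps * 1e9 * utilization)
    return seconds / 86400

print(f"{transfer_days(50, 1.0):.1f} days at 1 Gbps")
print(f"{transfer_days(50, 10.0):.1f} days at 10 Gbps")
```

At 1 Gbps and 80% utilization the transfer alone takes almost six days of saturated bandwidth, which the data center cannot spare; shipping a Snowball Edge Storage Optimized device sidesteps the network entirely.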

Total 886 questions