Amazon PAS-C01 Practice Test - Questions Answers, Page 6

A company has deployed its SAP applications into multiple Availability Zones in the same AWS Region. To accommodate storage of media files, database table exports and imports, and files dropped by third-party tools, the company has mounted Amazon Elastic File System (Amazon EFS) file systems between the SAP instances. The company needs to retrieve the files quickly for installations, updates, and system refreshes. Over time, the EFS file systems have grown to multiple terabytes. An SAP solutions architect must optimize storage cost for the files that are stored in Amazon EFS. Which solution will meet this requirement with the LEAST administrative overhead?

A. Scan the files manually to identify unnecessary files. Delete the unnecessary files.
B. Move the files to Amazon S3 Glacier Deep Archive.
C. Apply a lifecycle policy on the files in Amazon EFS to move the files to EFS Standard-Infrequent Access (Standard-IA).
D. Move the files to Amazon S3 Glacier. Apply an S3 Glacier vault lock policy to the files.
Suggested answer: C

Explanation:
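C is correct because Amazon EFS lifecycle management can automatically move files that have not been accessed for a defined period to the EFS Standard-Infrequent Access (Standard-IA) storage class, which lowers storage cost without deleting or relocating the files and without ongoing manual work. As a minimal sketch of how little administration this takes, the following boto3 call applies such a lifecycle policy; the file system ID, Region, and the 30-day window are assumptions, not values from the question.

import boto3

efs = boto3.client("efs", region_name="us-east-1")  # assumed Region

# Move files that have not been accessed for 30 days to Standard-IA, and
# bring a file back to Standard storage the first time it is accessed again.
efs.put_lifecycle_configuration(
    FileSystemId="fs-0123456789abcdef0",  # hypothetical file system ID
    LifecyclePolicies=[
        {"TransitionToIA": "AFTER_30_DAYS"},
        {"TransitionToPrimaryStorageClass": "AFTER_1_ACCESS"},
    ],
)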


An SAP specialist is building an SAP environment. The SAP environment contains Amazon EC2 instances that run in a private subnet in a VPC. The VPC includes a NAT gateway. The SAP specialist is setting up IBM Db2 high availability disaster recovery for the SAP cluster. After configuration of the overlay IP address routing, traffic is not routing to the database EC2 instances. What should the SAP specialist do to resolve this issue?

A. Open a security group for SAP ports to allow traffic on port 443.
B. Create route table entries to allow traffic from the database EC2 instances to the NAT gateway.
C. Turn off the source/destination check for the database EC2 instances.
D. Create an IAM role that has permission to access network traffic. Associate the role with the database EC2 instances.
Suggested answer: C

Explanation:

C is correct because turning off the source/destination check for the database EC2 instances is required to enable overlay IP address routing for IBM Db2 high availability disaster recovery. The source/destination check prevents an instance from sending or receiving traffic that is not addressed to one of its own IP addresses, which would block traffic sent to the overlay IP. The other options are not relevant to resolving this issue. Reference: https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/high-availability.html https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-eni.html#change_source_dest_check
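For reference, disabling the check is a single attribute change on each database instance; this is a minimal boto3 sketch, and the instance ID and Region are placeholders.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

# Turn off the source/destination check so the instance can receive traffic
# addressed to the overlay IP, which is not one of its assigned private IPs.
ec2.modify_instance_attribute(
    InstanceId="i-0123456789abcdef0",  # hypothetical database instance
    SourceDestCheck={"Value": False},
)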


A company uses an SAP application that runs batch jobs that are performance sensitive. The batch jobs can be restarted safely. The SAP application has six application servers. The SAP application functions reliably as long as the SAP application availability remains greater than 60%. The company wants to migrate the SAP application to AWS. The company is using a cluster with two Availability Zones. How should the company distribute the SAP application servers to maintain system reliability?

A. Distribute the SAP application servers equally across three partition placement groups.
B. Distribute the SAP application servers equally across three Availability Zones.
C. Distribute the SAP application servers equally across two Availability Zones.
D. Create an Amazon EC2 Auto Scaling group across two Availability Zones. Set a minimum capacity value of 4.
Suggested answer: C

Explanation:

C is correct because distributing the SAP application servers equally across two Availability Zones provides system reliability by ensuring that at least three application servers are available in each Availability Zone. This meets the requirement that the SAP application availability remains greater than 60%. This will also provide an additional layer of redundancy and protection against a single point of failure. Reference: https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/high-availability.html https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/amazon-ec2.html


A company wants to migrate its SAP landscape from on premises to AWS.

What are the MINIMUM requirements that the company must meet to ensure full support of SAP on AWS? (Select THREE.)

A. Enable detailed monitoring for Amazon CloudWatch on each instance in the landscape.
B. Deploy the infrastructure by using SAP Cloud Appliance Library.
C. Install, configure, and run the AWS Data Provider for SAP on each instance in the landscape.
D. Protect all production instances by using Amazon EC2 automatic recovery.
E. Deploy the infrastructure for the SAP landscape by using AWS Launch Wizard for SAP.
F. Deploy the SAP landscape on an AWS account that has either an AWS Business Support plan or an AWS Enterprise Support plan.
Suggested answer: C, D, F

Explanation:

C is correct because installing, configuring, and running the AWS Data Provider for SAP on each instance in the landscape is required to provide the performance metrics that SAP support needs. D is correct because protecting all production instances by using Amazon EC2 automatic recovery is required to ensure high availability and fault tolerance for SAP systems. F is correct because deploying the SAP landscape on an AWS account that has either an AWS Business Support plan or an AWS Enterprise Support plan is required for full support of SAP on AWS. The other options are not required. Reference: https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/security.html https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/high-availability.html https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/support.html
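One common way to set up the Amazon EC2 automatic recovery called for in option D is a CloudWatch alarm on the system status check with the EC2 recover action; the sketch below assumes a Region and instance ID.

import boto3

region = "us-east-1"                  # assumed Region
instance_id = "i-0123456789abcdef0"   # hypothetical production SAP instance

cloudwatch = boto3.client("cloudwatch", region_name=region)

# Recover the instance automatically if the system status check fails
# for two consecutive one-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName=f"recover-{instance_id}",
    Namespace="AWS/EC2",
    MetricName="StatusCheckFailed_System",
    Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
    Statistic="Maximum",
    Period=60,
    EvaluationPeriods=2,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=[f"arn:aws:automate:{region}:ec2:recover"],
)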



A company wants to migrate its SAP S/4HANA software from on premises to AWS in a few weeks. An SAP solutions architect plans to use AWS Launch Wizard for SAP to automate the SAP deployment on AWS. Which combination of steps must the SAP solutions architect take to use Launch Wizard to meet these requirements? (Select TWO.)

A. Download the SAP software files from the SAP Support Portal. Upload the SAP software files to Amazon S3. Provide the S3 bucket path as an input to Launch Wizard.
B. Provide the SAP S-user ID and password as inputs to Launch Wizard to download the software automatically.
C. Format the S3 file path syntax according to the Launch Wizard deployment recommendation.
D. Use an AWS CloudFormation template for the automated deployment of the SAP landscape.
E. Provision Amazon EC2 instances. Tag the instances to install SAP S/4HANA on them.
Suggested answer: A, C
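
Explanation:

AWS Launch Wizard for SAP does not download software on your behalf with SAP credentials; you download the installation media from the SAP Support Portal, stage it in an Amazon S3 bucket using the file path syntax that the Launch Wizard documentation prescribes, and provide that S3 path as an input to the deployment. As a rough illustration only (the bucket name, key prefix, and file name below are hypothetical, not the documented path structure), staging a media file could look like this:

import boto3

s3 = boto3.client("s3")

# Upload one SAP installation media file to the bucket that Launch Wizard
# will read from. The prefix below is illustrative; the real layout must
# follow the Launch Wizard deployment recommendation.
s3.upload_file(
    "SWPM20SP09_latest.SAR",               # example local media file
    "my-sap-software-bucket",              # hypothetical bucket
    "s4hana/SWPM/SWPM20SP09_latest.SAR",   # hypothetical prefix
)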

A company runs its SAP ERP 6.0 EHP 8 system on SAP HANA on AWS. The system is deployed on an r4.16xlarge Amazon EC2 instance with default tenancy. The company needs to migrate the SAP HANA database to an x2gd.16xlarge High Memory instance. After an operations engineer changes the instance type and starts the instance, the AWS Management Console shows a failed instance status check. What is the cause of this problem?

A. The operations engineer missed the network configuration step during the post-migration activities.
B. The operations engineer missed the Amazon CloudWatch configuration step during the post-migration activities.
C. The operations engineer did not install Elastic Network Adapter (ENA) drivers before changing the instance type.
D. The operations engineer did not create a new AMI from the original instance and did not launch a new instance with dedicated tenancy from the AMI.
Suggested answer: C

Explanation:

The Elastic Network Adapter (ENA) provides the enhanced networking that newer instance types require. If the ENA drivers are not installed before the instance type is changed, the instance cannot communicate with the network, which results in a failed instance status check. C is correct because installing the ENA drivers before changing the instance type is required for the target instance type. The other options are not relevant to this failure. Reference: https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/amazon-ec2.html https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/enhanced-networking.html
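After the ENA driver is installed in the guest operating system, the ENA attribute also has to be enabled on the stopped instance before the new instance type will pass its status checks; a minimal boto3 sketch (instance ID and Region are placeholders):

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region
instance_id = "i-0123456789abcdef0"                 # hypothetical instance

# Enable ENA support on the stopped instance (the ENA driver must already
# be present in the operating system), then verify the attribute.
ec2.modify_instance_attribute(InstanceId=instance_id, EnaSupport={"Value": True})

attr = ec2.describe_instance_attribute(InstanceId=instance_id, Attribute="enaSupport")
print(attr["EnaSupport"])  # expected: {'Value': True}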


A company is running SAP on anyDB at a remote location that has slow and inconsistent internet connectivity. The company wants to migrate its system to AWS and wants to convert its database to SAP HANA during this process. Because of the inconsistent internet connection, the company has not established connectivity between the remote location and the company's VPC in the AWS Cloud. How should the company perform this migration?

A. Migrate by using SAP HANA system replication over the internet connection. Specify a public IP address on the target system.
B. Migrate by using SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move. Use an AWS Snowball Edge Storage Optimized device to transfer the SAP export files to AWS.
C. Migrate by using SAP HANA system replication with initialization through backup and restore. Use an AWS Snowball Edge Storage Optimized device to transfer the SAP export files to AWS.
D. Migrate by using SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move. Use Amazon Elastic File System (Amazon EFS) to transfer the SAP export files to AWS.
Suggested answer: B

Explanation:

Snowball Edge allows for offline data transfer, which is suitable for cases where there is slow and inconsistent internet connectivity. The data can be transferred to the Snowball Edge device, shipped to the AWS Region, and then imported into the new SAP HANA system in the VPC. This is a suitable option for migrating large amounts of data from a remote location to AWS. https://docs.aws.amazon.com/sap/latest/sap-hana/migrating-hana-tools.html#migrating-hana-snowball


A financial services company is implementing SAP core banking on AWS. The company must not allow any system information to traverse the public internet. The company needs to implement secure monitoring of its SAP ERP Central Component (SAP ECC) system to check for performance issues and faults in its application. The solution must maximize security and must be supported by SAP and AWS. How should the company integrate AWS metrics with its SAP system to meet these requirements?

A. Set up SAP Solution Manager to call Amazon CloudWatch and Amazon EC2 endpoints with REST-based calls to populate SAPOSCOL details. Use SAP transaction ST06N to monitor CPU and memory utilization on each EC2 instance.
B. Install the AWS Data Provider for SAP on the Amazon EC2 instances that host SAP. Allow access to the Amazon CloudWatch and EC2 endpoints through a NAT gateway. Create an IAM policy that allows the ec2:DescribeInstances action, the cloudwatch:GetMetricStatistics action, and the ec2:DescribeVolumes action for all EC2 resources.
C. Install the AWS Data Provider for SAP on the Amazon EC2 instances that host SAP. Create VPC endpoints for Amazon CloudWatch and Amazon EC2. Allow access through these endpoints. Create an IAM policy that allows the ec2:DescribeInstances action, the cloudwatch:GetMetricStatistics action, and the ec2:DescribeVolumes action for all EC2 resources.
D. Install the AWS Data Provider for SAP on the Amazon EC2 instances that host SAP. Create VPC endpoints for Amazon CloudWatch and Amazon EC2. Allow access through these endpoints. Create an IAM policy that allows all actions for all EC2 resources.
Suggested answer: C

Explanation:

C is correct because VPC endpoints ensure that traffic to and from the CloudWatch and EC2 services stays within the VPC and never traverses the public internet. Additionally, the IAM policy grants only the necessary actions, such as ec2:DescribeInstances, cloudwatch:GetMetricStatistics, and ec2:DescribeVolumes, following the principle of least privilege. This approach provides secure monitoring of the SAP system while maximizing security and is supported by both SAP and AWS. https://docs.aws.amazon.com/sap/latest/general/data-provider-req.html#vpc-endpoints
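A rough sketch of the two pieces this answer combines, using boto3; the VPC, subnet, and security group IDs, the Region in the endpoint service names, and the policy name are placeholders.

import json
import boto3

region = "us-east-1"                       # assumed Region
vpc_id = "vpc-0123456789abcdef0"           # hypothetical IDs
subnet_ids = ["subnet-0123456789abcdef0"]
security_group_ids = ["sg-0123456789abcdef0"]

ec2 = boto3.client("ec2", region_name=region)

# Interface VPC endpoints so CloudWatch and EC2 API calls from the
# AWS Data Provider for SAP never leave the VPC.
for service in (f"com.amazonaws.{region}.monitoring", f"com.amazonaws.{region}.ec2"):
    ec2.create_vpc_endpoint(
        VpcEndpointType="Interface",
        VpcId=vpc_id,
        ServiceName=service,
        SubnetIds=subnet_ids,
        SecurityGroupIds=security_group_ids,
        PrivateDnsEnabled=True,
    )

# Least-privilege policy limited to the actions named in option C.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ec2:DescribeInstances",
                "ec2:DescribeVolumes",
                "cloudwatch:GetMetricStatistics",
            ],
            "Resource": "*",
        }
    ],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="aws-data-provider-for-sap",  # hypothetical policy name
    PolicyDocument=json.dumps(policy_document),
)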


Business users are reporting timeouts during periods of peak query activity on an enterprise SAP HANA data mart. An SAP system administrator has discovered that at peak volume, the CPU utilization increases rapidly to 100% for extended periods on the x1.32xlarge Amazon EC2 instance where the database is installed. However, the SAP HANA database is occupying only 1,120 GiB of the available 1,952 GiB on the instance, and I/O wait times are not increasing. Extensive query tuning and system tuning have not resolved this performance problem. Which solutions should the SAP system administrator use to improve the performance? (Select TWO.)

A. Reduce the global_allocation_limit parameter to 1,120 GiB.
B. Migrate the SAP HANA database to an EC2 High Memory instance with a larger number of available vCPUs.
C. Move to a scale-out architecture for SAP HANA with at least three x1.16xlarge instances.
D. Modify the Amazon Elastic Block Store (Amazon EBS) volume type from General Purpose to Provisioned IOPS for all SAP HANA data volumes.
E. Change to a supported compute optimized instance type for SAP HANA.
Suggested answer: C, E

Explanation:

Reference: https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/sap-hana.html https://docs.aws.amazon.com/whitepapers/latest/sap-on-aws-technical-deployment-guide/amazon-ec2.html


A company is moving to the AWS Cloud gradually. The company has multiple SAP landscapes on VMware. The company already has sandbox, development, and QA systems on AWS. The company's production system is still running on premises. The company has 2 months to cut over the entire landscape to the AWS Cloud. The company has adopted a hybrid architecture for the next 2 months and needs to synchronize its shared file systems between the landscapes. These shared file systems include trans directory mounts, /software directory mounts, and third-party integration mounts. In the on-premises landscape, the company has NFS mounts between the servers. On the AWS infrastructure side, the company is using Amazon Elastic File System (Amazon EFS) to share the common files. An SAP solutions architect needs to design a solution to schedule transfer of these shared files bidirectionally four times each day. The data transfer must be encrypted. Which solution will meet these requirements?

A. Write an rsync script. Schedule the script through cron for four times each day on the on-premises VMware servers to transfer the data from on premises to AWS.
B. Install an AWS DataSync agent on the on-premises VMware platform. Use the DataSync endpoint to synchronize between the on-premises NFS server and Amazon EFS on AWS.
C. Order an AWS Snowcone device. Use the Snowcone device to transfer data between the on-premises servers and AWS.
D. Set up a separate AWS Direct Connect connection for synchronization between the on-premises servers and AWS.
Suggested answer: B

Explanation:

AWS DataSync can be used to schedule and automate the transfer of data between on-premises NFS servers and Amazon EFS on AWS, and it encrypts data in transit, which meets the requirement for encryption. A DataSync task schedule can run the transfer four times each day.
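A minimal boto3 sketch of the DataSync side of this answer; the agent, subnet, security group, and file system ARNs are placeholders, and the cron expression runs the task every six hours, which gives the four transfers each day. A second task with the source and destination swapped would cover the opposite direction.

import boto3

datasync = boto3.client("datasync", region_name="us-east-1")  # assumed Region
account = "111122223333"                                      # hypothetical account

# On-premises NFS share, reached through the DataSync agent on VMware.
nfs = datasync.create_location_nfs(
    ServerHostname="nfs.onprem.example.com",  # hypothetical host
    Subdirectory="/export/sapshare",
    OnPremConfig={
        "AgentArns": [f"arn:aws:datasync:us-east-1:{account}:agent/agent-0123456789abcdef0"]
    },
)

# Amazon EFS file system that the SAP instances on AWS already mount.
efs = datasync.create_location_efs(
    EfsFilesystemArn=f"arn:aws:elasticfilesystem:us-east-1:{account}:file-system/fs-0123456789abcdef0",
    Ec2Config={
        "SubnetArn": f"arn:aws:ec2:us-east-1:{account}:subnet/subnet-0123456789abcdef0",
        "SecurityGroupArns": [f"arn:aws:ec2:us-east-1:{account}:security-group/sg-0123456789abcdef0"],
    },
)

# One direction of the synchronization, scheduled every six hours.
datasync.create_task(
    SourceLocationArn=nfs["LocationArn"],
    DestinationLocationArn=efs["LocationArn"],
    Name="onprem-to-efs-sap-share",
    Schedule={"ScheduleExpression": "cron(0 0/6 * * ? *)"},
)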


