Amazon SCS-C01 Practice Test - Questions Answers, Page 30


Your current setup in AWS consists of the following architecture: 2 public subnets, one subnet hosting the web servers accessed by users across the internet, and the other subnet hosting the database server. Which of the following changes to the architecture would add a better security boundary to the resources hosted in your setup? Please select:

A. Consider moving the web server to a private subnet
B. Consider moving the database server to a private subnet
C. Consider moving both the web and database server to a private subnet
D. Consider creating a private subnet and adding a NAT instance to that subnet
Suggested answer: B

Explanation:

The ideal setup is to ensure that the web server is hosted in the public subnet so that it can be accessed by users on the internet, while the database server is hosted in the private subnet. The diagram in the AWS documentation for VPC Scenario 2 shows how this can be set up.

Options A and C are invalid because if you move the web server to a private subnet, it can no longer be accessed by users on the internet. Option D is invalid because NAT instances should be placed in the public subnet.

For more information on public and private subnets in AWS, please visit the following URL: https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/VPC_Scenario2.html

The correct answer is: Consider moving the database server to a private subnet

Submit your Feedback/Queries to our Experts
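The public/private distinction above comes down to routing: a subnet is effectively public when its route table contains a route to an internet gateway. As a minimal sketch (the route-table entries below are hypothetical sample data; in practice they would come from `ec2.describe_route_tables()`):

```python
# Sketch: a subnet is "public" when its route table has a route targeting an
# internet gateway (igw-*). Route data here is hypothetical sample data.
def is_public_subnet(routes):
    """Return True if any route targets an internet gateway."""
    return any(r.get("GatewayId", "").startswith("igw-") for r in routes)

web_subnet_routes = [
    {"DestinationCidrBlock": "10.0.0.0/16", "GatewayId": "local"},
    {"DestinationCidrBlock": "0.0.0.0/0", "GatewayId": "igw-0abc123"},
]
db_subnet_routes = [  # private: no internet gateway route
    {"DestinationCidrBlock": "10.0.0.0/16", "GatewayId": "local"},
]

print(is_public_subnet(web_subnet_routes))  # True
print(is_public_subnet(db_subnet_routes))   # False
```

Moving the database server to a private subnet therefore means placing it behind a route table with no internet gateway route, while the web server's subnet keeps its 0.0.0.0/0 route.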

Your company has confidential documents stored in the Simple Storage Service (S3). Due to compliance requirements, you have to ensure that the data in the S3 bucket is available in a different geographical location. As an architect, what change would you make to comply with this requirement?

Please select:

A. Apply Multi-AZ for the underlying S3 bucket
B. Copy the data to an EBS Volume in another Region
C. Create a snapshot of the S3 bucket and copy it to another region
D. Enable Cross region replication for the S3 bucket
Suggested answer: D

Explanation:

This is mentioned clearly as a use case for S3 cross-region replication. You might configure cross-region replication on a bucket for various reasons, including the following:

• Compliance requirements - Although by default Amazon S3 stores your data across multiple geographically distant Availability Zones, compliance requirements might dictate that you store data at even greater distances. Cross-region replication allows you to replicate data between distant AWS Regions to satisfy these compliance requirements.

Option A is invalid because Multi-AZ cannot be applied to S3 buckets.

Option B is invalid because copying it to an EBS volume is not a recommended practice

Option C is invalid because creating snapshots is not possible in S3.

For more information on S3 cross-region replication, please visit the following URL: https://docs.aws.amazon.com/AmazonS3/latest/dev/crr.html

The correct answer is: Enable Cross region replication for the S3 bucket
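As a hedged sketch of what enabling cross-region replication involves, the configuration below is the kind of document passed to S3's `put_bucket_replication` API. The bucket names, role ARN, and rule ID are hypothetical placeholders, not values from the question:

```python
import json

# Sketch of a cross-region replication configuration. All names/ARNs below
# are hypothetical placeholders.
replication_config = {
    "Role": "arn:aws:iam::111122223333:role/s3-crr-role",  # replication role
    "Rules": [
        {
            "ID": "compliance-replication",
            "Status": "Enabled",
            "Prefix": "",
            "Destination": {
                # The destination bucket must live in a different region
                "Bucket": "arn:aws:s3:::confidential-docs-replica",
            },
        }
    ],
}

# With boto3 this would be applied roughly as:
#   s3 = boto3.client("s3")
#   s3.put_bucket_replication(Bucket="confidential-docs",
#                             ReplicationConfiguration=replication_config)
print(json.dumps(replication_config, indent=2))
```

Note that versioning must be enabled on both the source and destination buckets before replication can be configured.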

When managing permissions for API Gateway, what can be used to ensure that the right level of permissions is given to developers, IT admins, and users? These permissions should be easily managed. Please select:

A. Use the secure token service to manage the permissions for the different users
B. Use IAM Policies to create different policies for the different types of users.
C. Use the AWS Config tool to manage the permissions for the different users
D. Use IAM Access Keys to create sets of keys for the different types of users.
Suggested answer: B

Explanation:

The AWS Documentation mentions the following

You control access to Amazon API Gateway with IAM permissions by controlling access to the following two API Gateway component processes:

* To create, deploy, and manage an API in API Gateway, you must grant the API developer permissions to perform the required actions supported by the API management component of API Gateway.

* To call a deployed API or to refresh the API caching, you must grant the API caller permissions to perform required IAM actions supported by the API execution component of API Gateway.

Options A, C and D are invalid because these cannot be used to control access to AWS services; this needs to be done via policies.

For more information on permissions with API Gateway, please visit the following URL: https://docs.aws.amazon.com/apigateway/latest/developerguide/permissions.html

The correct answer is: Use IAM Policies to create different policies for the different types of users.
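The two component processes described above map to two different kinds of IAM policy. The sketch below illustrates both; the region, account ID, and API ID in the ARNs are hypothetical placeholders:

```python
import json

# Sketch of the two policy types the documentation distinguishes.
# Account ID, region, and API ID are hypothetical placeholders.

# 1) Management policy for API developers (API Gateway management component).
developer_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["apigateway:GET", "apigateway:POST", "apigateway:PUT"],
        "Resource": "arn:aws:apigateway:us-east-1::/restapis/*",
    }],
}

# 2) Execution policy for API callers (API execution component).
caller_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "execute-api:Invoke",
        "Resource": "arn:aws:execute-api:us-east-1:111122223333:a1b2c3d4e5/*/GET/*",
    }],
}

print(json.dumps(developer_policy, indent=2))
print(json.dumps(caller_policy, indent=2))
```

Attaching the first policy to a developers group and the second to a users group is what makes the permissions "easily managed", as the question requires.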

A company hosts data in S3. There is a requirement to control access to the S3 buckets. Which two of the following can be used to achieve this? Please select:

A. Use Bucket policies
B. Use the Secure Token service
C. Use IAM user policies
D. Use AWS Access Keys
Suggested answer: A, C

Explanation:

The AWS Documentation mentions the following

Amazon S3 offers access policy options broadly categorized as resource-based policies and user policies. Access policies you attach to your resources (buckets and objects) are referred to as resource-based policies. For example, bucket policies and access control lists (ACLs) are resource-based policies. You can also attach access policies to users in your account. These are called user policies. You may choose to use resource-based policies, user policies, or some combination of these to manage permissions to your Amazon S3 resources.

Options B and D are invalid because these cannot be used to control access to S3 buckets.

For more information on S3 access control, please refer to the following URL: https://docs.aws.amazon.com/AmazonS3/latest/dev/s3-access-control.html

The correct answers are: Use Bucket policies, Use IAM user policies
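The contrast between the two mechanisms can be sketched as follows. The bucket name, account ID, and user name are hypothetical placeholders; the key structural difference is that a resource-based (bucket) policy names a `Principal`, while a user policy does not:

```python
import json

# Resource-based policy: attached to the bucket, so it must name a Principal.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:user/analyst"},
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::company-data/*",
    }],
}

# User policy: attached to the IAM user, so no Principal element is needed.
user_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::company-data",
                     "arn:aws:s3:::company-data/*"],
    }],
}

print(json.dumps(bucket_policy, indent=2))
print(json.dumps(user_policy, indent=2))
```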

You are responsible for deploying a critical application onto AWS. Part of the requirements for this application is to ensure that the controls set for it meet PCI compliance. There is also a need to monitor web application logs to identify any malicious activity. Which of the following services can be used to fulfil this requirement? Choose 2 answers from the options given below. Please select:

A. Amazon Cloudwatch Logs
B. Amazon VPC Flow Logs
C. Amazon AWS Config
D. Amazon Cloudtrail
Suggested answer: A, D

Explanation:

The AWS Documentation mentions the following about these services

AWS CloudTrail is a service that enables governance, compliance, operational auditing, and risk auditing of your AWS account. With CloudTrail, you can log, continuously monitor, and retain account activity related to actions across your AWS infrastructure. CloudTrail provides event history of your AWS account activity, including actions taken through the AWS Management Console, AWS SDKs, command line tools, and other AWS services. This event history simplifies security analysis, resource change tracking, and troubleshooting.

Option B is incorrect because VPC Flow Logs can only capture network flow to and from instances in a VPC.

Option C is incorrect because AWS Config can check for configuration changes only.

For more information on CloudTrail, please refer to the following URL: https://aws.amazon.com/cloudtrail/

You can use Amazon CloudWatch Logs to monitor, store, and access your log files from Amazon Elastic Compute Cloud (Amazon EC2) instances, AWS CloudTrail, Amazon Route 53, and other sources. You can then retrieve the associated log data from CloudWatch Logs.

For more information on CloudWatch Logs, please refer to the following URL: http://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/WhatIsCloudWatchLogs.html

The correct answers are: Amazon Cloudwatch Logs, Amazon Cloudtrail
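One common way the two services work together for "identify malicious activity" scenarios is to deliver CloudTrail events into CloudWatch Logs and attach a metric filter. The sketch below shows such a filter for unauthorized API calls; the log group, filter, and namespace names are hypothetical placeholders:

```python
# Sketch of a CloudWatch Logs metric filter over CloudTrail events that
# counts unauthorized API calls. All names are hypothetical placeholders.
metric_filter = {
    "logGroupName": "CloudTrail/DefaultLogGroup",
    "filterName": "UnauthorizedAPICalls",
    "filterPattern": (
        '{ ($.errorCode = "*UnauthorizedOperation") || '
        '($.errorCode = "AccessDenied*") }'
    ),
    "metricTransformations": [{
        "metricName": "UnauthorizedAPICallCount",
        "metricNamespace": "Security",
        "metricValue": "1",
    }],
}

# With boto3 this would be applied roughly as:
#   logs = boto3.client("logs")
#   logs.put_metric_filter(**metric_filter)
print(metric_filter["filterPattern"])
```

A CloudWatch alarm on the resulting metric can then notify the security team when the count exceeds a threshold.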

A company wishes to enable Single Sign-On (SSO) so its employees can log in to the management console using their corporate directory identity. Which steps below are required as part of the process? Select 2 answers from the options given below.

Please select:

A. Create a Direct Connect connection between on-premise network and AWS. Use an AD connector for connecting AWS with on-premise active directory.
B. Create IAM policies that can be mapped to group memberships in the corporate directory.
C. Create a Lambda function to assign IAM roles to the temporary security tokens provided to the users.
D. Create IAM users that can be mapped to the employees' corporate identities
E. Create an IAM role that establishes a trust relationship between IAM and the corporate directory identity provider (IdP)
Suggested answer: A, E

Explanation:

Create a Direct Connect connection so that corporate users can access the AWS account.

Option B is incorrect because IAM policies are not directly mapped to group memberships in the corporate directory; it is IAM roles which are mapped. Option C is incorrect because a Lambda function is not the mechanism for assigning roles.

Option D is incorrect because IAM users are not directly mapped to employees' corporate identities.

For more information on Direct Connect, please refer to the following URL: https://aws.amazon.com/directconnect/

From the AWS Documentation, for federated access you also need to ensure the right policy permissions are in place: configure permissions in AWS for your federated users. The next step is to create an IAM role that establishes a trust relationship between IAM and your organization's IdP, identifying your IdP as a principal (trusted entity) for purposes of federation. The role also defines what users authenticated by your organization's IdP are allowed to do in AWS. You can use the IAM console to create this role. When you create the trust policy that indicates who can assume the role, you specify the SAML provider that you created earlier in IAM along with one or more SAML attributes that a user must match to be allowed to assume the role. For example, you can specify that only users whose SAML eduPersonOrgDN value is ExampleOrg are allowed to sign in. The role wizard automatically adds a condition to test the saml:aud attribute to make sure that the role is assumed only for sign-in to the AWS Management Console.
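A sketch of what such a SAML trust policy commonly looks like is shown below. The SAML provider ARN is a hypothetical placeholder; the `SAML:aud` condition is the console sign-in check the role wizard adds automatically:

```python
import json

# Sketch of a SAML federation trust policy of the kind described above.
# The SAML provider ARN is a hypothetical placeholder.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {
            "Federated": "arn:aws:iam::111122223333:saml-provider/ExampleOrgIdP"
        },
        "Action": "sts:AssumeRoleWithSAML",
        "Condition": {
            "StringEquals": {
                # Ensures the role is assumed only for console sign-in
                "SAML:aud": "https://signin.aws.amazon.com/saml"
            }
        },
    }],
}

print(json.dumps(trust_policy, indent=2))
```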

For more information on SAML federation, please refer to below URL:

https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_providers_enabli

Note:

What directories can I use with AWS SSO?

You can connect AWS SSO to Microsoft Active Directory, running either on-premises or in the AWS Cloud. AWS SSO supports AWS Directory Service for Microsoft Active Directory, also known as AWS Managed Microsoft AD, and AD Connector. AWS SSO does not support Simple AD. See AWS Directory Service Getting Started to learn more. To connect to your on-premises directory with AD Connector, you need the following:

VPC

Set up a VPC with the following:

• At least two subnets. Each of the subnets must be in a different Availability Zone.

• The VPC must be connected to your on-premises network through a virtual private network (VPN) connection or AWS Direct Connect.

• The VPC must have default hardware tenancy.

• https://aws.amazon.com/single-sign-on/

• https://aws.amazon.com/single-sign-on/faqs/

• https://aws.amazon.com/bloj using-corporate-credentials/

• https://docs.aws.amazon.com/directoryservice/latest/admin-

The correct answers are: Create a Direct Connect connection between on-premise network and AWS. Use an AD connector for connecting AWS with on-premise active directory. Create an IAM role that establishes a trust relationship between IAM and the corporate directory identity provider (IdP)

A company continually generates sensitive records that it stores in an S3 bucket. All objects in the bucket are encrypted using SSE-KMS using one of the company's CMKs. Company compliance policies require that no more than one month of data be encrypted using the same encryption key.

What solution below will meet the company's requirements?

Please select:

A. Trigger a Lambda function with a monthly CloudWatch event that creates a new CMK and updates the S3 bucket to use the new CMK.
B. Configure the CMK to rotate the key material every month.
C. Trigger a Lambda function with a monthly CloudWatch event that creates a new CMK, updates the S3 bucket to use the new CMK, and deletes the old CMK.
D. Trigger a Lambda function with a monthly CloudWatch event that rotates the key material in the CMK.
Suggested answer: A

Explanation:

You can use a Lambda function to create a new key and then update the S3 bucket to use the new key. Remember not to delete the old key, or you will not be able to decrypt the documents in the S3 bucket that were encrypted with it.

Option B is incorrect because AWS KMS cannot automatically rotate keys on a monthly basis.

Option C is incorrect because deleting the old key means that you cannot access the older objects

Option D is incorrect because rotating key material is not possible.

For more information on AWS KMS keys, please refer to below URL:

https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html

The correct answer is: Trigger a Lambda function with a monthly CloudWatch event that creates a new CMK and updates the S3 bucket to use the new CMK.
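The Lambda approach can be sketched as below. The boto3 calls are shown as comments so the sketch stays self-contained; the key ID and bucket name are hypothetical placeholders, and, as noted above, the old CMK is deliberately kept:

```python
# Sketch of the monthly-rotation Lambda: create a new CMK, then point the
# bucket's default SSE-KMS encryption at it. Names are hypothetical.
def build_sse_kms_rule(key_id):
    """Default-encryption configuration for S3's put_bucket_encryption."""
    return {
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": key_id,
            }
        }]
    }

def monthly_rotation_handler(event, context):
    # kms = boto3.client("kms"); s3 = boto3.client("s3")
    # new_key = kms.create_key(Description="monthly CMK")["KeyMetadata"]["KeyId"]
    new_key = "1234abcd-12ab-34cd-56ef-1234567890ab"  # placeholder key id
    config = build_sse_kms_rule(new_key)
    # s3.put_bucket_encryption(Bucket="sensitive-records",
    #                          ServerSideEncryptionConfiguration=config)
    # Note: the old CMK is NOT deleted, so older objects stay decryptable.
    return config

result = monthly_rotation_handler(None, None)
print(result["Rules"][0]["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"])  # aws:kms
```

Scheduling is then a CloudWatch Events (EventBridge) rule with a monthly `rate(30 days)` or cron expression targeting this function.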

Company policy requires that all insecure server protocols, such as FTP, Telnet, HTTP, etc., be disabled on all servers. The security team would like to regularly check all servers to ensure compliance with this requirement by using a scheduled CloudWatch event to trigger a review of the current infrastructure. What process will check compliance of the company's EC2 instances? Please select:

A. Trigger an AWS Config Rules evaluation of the restricted-common-ports rule against every EC2 instance.
B. Query the Trusted Advisor API for all best practice security checks and check for "action recommended" status.
C. Enable a GuardDuty threat detection analysis targeting the port configuration on every EC2 instance.
D. Run an Amazon Inspector assessment using the Runtime Behavior Analysis rules package against every EC2 instance.
Suggested answer: D

Explanation:

Option B is incorrect because Trusted Advisor's best practice checks do not cover insecure server protocols running on instances.

Option C is incorrect because GuardDuty is used to detect threats, not to check compliance of security protocols.

Option D is correct: running Amazon Inspector with the Runtime Behavior Analysis rules package analyzes the behavior of your instances during an assessment run and provides guidance about how to make your EC2 instances more secure. Its Insecure Server Protocols rule helps determine whether your EC2 instances allow support for insecure and unencrypted ports/services such as FTP, Telnet, HTTP, IMAP, POP version 3, SMTP, SNMP versions 1 and 2, rsh, and rlogin.

For more information, please refer to the following URL: https://docs.aws.amazon.com/inspector/latest/userguide/inspector_runtime-behavior-analysis.html#insecure-protocols

The correct answer is: Run an Amazon Inspector assessment using the Runtime Behavior Analysis rules package against every EC2 instance. Submit your Feedback/Queries to our Experts
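To make the rule concrete, the protocols it flags can be expressed as a simple port lookup. This is only an illustration of which services count as insecure; the open-ports list is hypothetical sample data, and Inspector performs the real assessment on the instance itself:

```python
# Illustration of the insecure protocols the rules package flags, as a
# port-to-service lookup. Open-port lists below are hypothetical samples.
INSECURE_PORTS = {
    21: "FTP", 23: "Telnet", 80: "HTTP", 143: "IMAP",
    110: "POP3", 25: "SMTP", 161: "SNMP", 514: "rsh", 513: "rlogin",
}

def flag_insecure(open_ports):
    """Return the insecure services found among a host's open ports."""
    return sorted(INSECURE_PORTS[p] for p in open_ports if p in INSECURE_PORTS)

print(flag_insecure([22, 443, 23, 80]))  # ['HTTP', 'Telnet']
print(flag_insecure([22, 443]))          # []
```

A host exposing only SSH (22) and HTTPS (443) passes; anything in the lookup table is a compliance finding.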

A web application runs in a VPC on EC2 instances behind an ELB Application Load Balancer. The application stores data in an RDS MySQL DB instance. A Linux bastion host is used to apply schema updates to the database - administrators connect to the host via SSH from a corporate workstation.

The following security groups are applied to the infrastructure:

* sgLB - associated with the ELB

* sgWeb - associated with the EC2 instances

* sgDB - associated with the database

* sgBastion - associated with the bastion host

Which security group configuration will allow the application to be secure and functional? Please select:

A. sgLB: allow port 80 and 443 traffic from 0.0.0.0/0; sgWeb: allow port 80 and 443 traffic from 0.0.0.0/0; sgDB: allow port 3306 traffic from sgWeb and sgBastion; sgBastion: allow port 22 traffic from the corporate IP address range
B. sgLB: allow port 80 and 443 traffic from 0.0.0.0/0; sgWeb: allow port 80 and 443 traffic from sgLB; sgDB: allow port 3306 traffic from sgWeb and sgLB; sgBastion: allow port 22 traffic from the VPC IP address range
C. sgLB: allow port 80 and 443 traffic from 0.0.0.0/0; sgWeb: allow port 80 and 443 traffic from sgLB; sgDB: allow port 3306 traffic from sgWeb and sgBastion; sgBastion: allow port 22 traffic from the VPC IP address range
D. sgLB: allow port 80 and 443 traffic from 0.0.0.0/0; sgWeb: allow port 80 and 443 traffic from sgLB; sgDB: allow port 3306 traffic from sgWeb and sgBastion; sgBastion: allow port 22 traffic from the corporate IP address range
Suggested answer: D

Explanation:

The Load Balancer should accept traffic on ports 80 and 443 from 0.0.0.0/0. The backend EC2 instances should accept traffic only from the Load Balancer. The database should allow traffic from the web server and the bastion host, and the bastion host should only allow traffic from a specific corporate IP address range.

Option A is incorrect because the web group should only allow traffic from the Load Balancer.

For more information on AWS Security Groups, please refer to the following URL: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-network-security.html

The correct answer is: sgLB: allow port 80 and 443 traffic from 0.0.0.0/0; sgWeb: allow port 80 and 443 traffic from sgLB; sgDB: allow port 3306 traffic from sgWeb and sgBastion; sgBastion: allow port 22 traffic from the corporate IP address range
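Option D's rules can be written out as data to check the chain of trust — internet to load balancer, load balancer to web tier, web tier and bastion to database. The security group names come from the question; the corporate CIDR is a hypothetical placeholder:

```python
# Option D's security group rules as data. The corporate IP range
# (203.0.113.0/24) is a hypothetical placeholder.
rules = {
    "sgLB":      [{"port": 80,   "source": "0.0.0.0/0"},
                  {"port": 443,  "source": "0.0.0.0/0"}],
    "sgWeb":     [{"port": 80,   "source": "sgLB"},
                  {"port": 443,  "source": "sgLB"}],
    "sgDB":      [{"port": 3306, "source": "sgWeb"},
                  {"port": 3306, "source": "sgBastion"}],
    "sgBastion": [{"port": 22,   "source": "203.0.113.0/24"}],
}

def sources(sg, port):
    """All sources a security group allows on a given port."""
    return {r["source"] for r in rules[sg] if r["port"] == port}

# The database is reachable only from the web tier and the bastion,
# and SSH to the bastion only from the corporate range:
print(sorted(sources("sgDB", 3306)))   # ['sgBastion', 'sgWeb']
print(sorted(sources("sgBastion", 22)))  # ['203.0.113.0/24']
```

Referencing security groups (rather than CIDR blocks) as sources is what makes the chain hold even as instance IPs change.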

A company is planning on extending their on-premise infrastructure to the AWS Cloud. They need a solution that gives the core benefit of traffic encryption and ensures latency is kept to a minimum. Which of the following would help fulfil this requirement? Choose 2 answers from the options given below. Please select:

A. AWS VPN
B. AWS VPC Peering
C. AWS NAT gateways
D. AWS Direct Connect
Suggested answer: A, D

Explanation:

A VPN connection encrypts traffic between the on-premises network and AWS, while AWS Direct Connect provides a dedicated network connection that keeps latency to a minimum.

Option B is invalid because VPC peering is only used for connections between VPCs and cannot be used to connect on-premises infrastructure to the AWS Cloud. Option C is invalid because NAT gateways are used to connect instances in a private subnet to the internet.

For more information on VPN connections, please visit the following URL: https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/vpn-connections.html

The correct answers are: AWS VPN, AWS Direct Connect

Total 590 questions