
Amazon SCS-C01 Practice Test - Questions Answers, Page 33

You have a requirement to conduct penetration testing on the AWS Cloud for a couple of EC2 Instances. How could you go about doing this? Choose 2 answers from the options given below.

Please select:

A. Get prior approval from AWS for conducting the test
B. Use a pre-approved penetration testing tool.
C. Work with an AWS partner, and no prior approval request from AWS is needed
D. Choose any of the AWS instance types
Suggested answer: A, B

Explanation:

You can use a pre-approved solution from the AWS Marketplace, but to date the AWS documentation still states that you must get prior approval before conducting a penetration test on EC2 Instances in the AWS Cloud. Options C and D are invalid because prior approval is required first.

AWS Docs Provides following details:

"For performing a penetration test on AWS resources first of all we need to take permission from AWS and complete a requisition form and submit it for approval. The form should contain information about the instances you wish to test identify the expected start and end dates/times of your test and requires you to read and agree to Terms and Conditions specific to penetration testing and to the use of appropriate tools for testing. Note that the end date may not be more than 90 days from the start date."

At this time, our policy does not permit testing small or micro RDS instance types. Testing of m1.small, t1.micro or t2.nano EC2 instance types is not permitted. For more information on penetration testing please visit the following URL:

https://aws.amazon.com/security/penetration-testing/

The correct answers are: Get prior approval from AWS for conducting the test; Use a pre-approved penetration testing tool.

You currently have an S3 bucket hosted in an AWS Account. It holds information that needs to be accessed by a partner account. Which is the MOST secure way to allow the partner account to access the S3 bucket in your account? Select 3 options.

Please select:

A. Ensure an IAM role is created which can be assumed by the partner account.
B. Ensure an IAM user is created which can be assumed by the partner account.
C. Ensure the partner uses an external ID when making the request
D. Provide the ARN for the role to the partner account
E. Provide the Account ID to the partner account
F. Provide access keys for your account to the partner account
Suggested answer: A, C, D

Explanation:

Option B is invalid because roles are assumed, not IAM users.

Option E is invalid because you should not give the account ID to the partner.

Option F is invalid because you should not give the access keys to the partner.

A diagram in the AWS documentation showcases an example of this, wherein an IAM role and an external ID are used to access resources in an AWS account.

For more information on creating roles for external IDs please visit the following URL:

The correct answers are: Ensure an IAM role is created which can be assumed by the partner account; Ensure the partner uses an external ID when making the request; Provide the ARN for the role to the partner account.
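As a rough illustration of this pattern (the account IDs, names and external ID below are hypothetical placeholders, not values from the question), the role could be created and assumed with boto3 along these lines:

```python
import json
import boto3

# Hypothetical values for illustration only
PARTNER_ACCOUNT_ID = "111122223333"
EXTERNAL_ID = "example-external-id"

# Trust policy: only the partner account, supplying the agreed external ID,
# may assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{PARTNER_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
        "Condition": {"StringEquals": {"sts:ExternalId": EXTERNAL_ID}},
    }],
}

iam = boto3.client("iam")
role = iam.create_role(
    RoleName="PartnerS3AccessRole",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
role_arn = role["Role"]["Arn"]  # this is the ARN you share with the partner

# In the partner account: assume the role using the shared ARN and external
# ID, then use the temporary credentials to reach the S3 bucket.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn=role_arn,
    RoleSessionName="partner-session",
    ExternalId=EXTERNAL_ID,
)["Credentials"]

s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```

Note that a permissions policy granting S3 access would still need to be attached to the role separately; create_role only sets the trust policy.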

Your company has created a set of keys using the AWS KMS service. They need to ensure that each key is only used for certain services. For example, they want one key to be used only for the S3 service. How can this be achieved?

Please select:

A. Create an IAM policy that allows the key to be accessed by only the S3 service.
B. Create a bucket policy that allows the key to be accessed by only the S3 service.
C. Use the kms:ViaService condition in the Key policy
D. Define an IAM user, allocate the key and then assign the permissions to the required service
Suggested answer: C

Explanation:

Options A and B are invalid because mapping keys to services cannot be done via either an IAM policy or a bucket policy. Option D is invalid because keys for IAM users cannot be assigned to services.

This is mentioned in the AWS documentation: the kms:ViaService condition key limits use of a customer-managed CMK to requests from particular AWS services. (AWS managed CMKs in your account, such as aws/s3, are always restricted to the AWS service that created them.) For example, you can use kms:ViaService to allow a user to use a customer managed CMK only for requests that Amazon S3 makes on their behalf. Or you can use it to deny the user permission to a CMK when a request on their behalf comes from AWS Lambda.

For more information on key policy's for KMS please visit the following URL:

https://docs.aws.amazon.com/kms/latest/developerguide/policy-conditions.html

The correct answer is: Use the kms:ViaService condition in the Key policy.
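To make the mechanism concrete, here is a minimal sketch of a key policy statement using kms:ViaService (the account ID, user name and region are hypothetical):

```python
import json

# Sketch of a KMS key policy statement restricting use of the key to
# requests that arrive via Amazon S3 in one region (hypothetical values).
key_policy_statement = {
    "Sid": "AllowUseOnlyViaS3",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111122223333:user/example-user"},
    "Action": ["kms:Encrypt", "kms:Decrypt", "kms:GenerateDataKey*"],
    "Resource": "*",
    "Condition": {
        "StringEquals": {"kms:ViaService": "s3.us-east-1.amazonaws.com"}
    },
}
print(json.dumps(key_policy_statement, indent=2))
```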

You have a set of customer keys created using the AWS KMS service. These keys have been used for around 6 months. You are now trying to use the new KMS features for the existing set of keys but are not able to do so. What could be the reason for this?

Please select:

A. You have not explicitly given access via the key policy
B. You have not explicitly given access via the IAM policy
C. You have not given access via the IAM roles
D. You have not explicitly given access via IAM users
Suggested answer: A

Explanation:

By default, keys created in KMS are created with the default key policy. When features are added to KMS, you need to explicitly update the default key policy for these keys. Options B, C and D are invalid because the key policy is the main entity used to provide access to the keys. For more information on upgrading key policies please visit the following URL:

https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-upgrading.html

The correct answer is: You have not explicitly given access via the key policy.
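As a sketch of how a key's default policy could be updated with boto3 (the key ID, account ID and policy content below are placeholders):

```python
import json
import boto3

kms = boto3.client("kms")

# Hypothetical key ID; a KMS key has a single policy, named "default".
KEY_ID = "1234abcd-12ab-34cd-56ef-1234567890ab"

updated_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "EnableRootAccountAccess",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
        "Action": "kms:*",
        "Resource": "*",
    }],
}

# Replace the key's default policy with the updated document
kms.put_key_policy(
    KeyId=KEY_ID,
    PolicyName="default",
    Policy=json.dumps(updated_policy),
)
```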

You are planning on hosting a web application on AWS. You create an EC2 Instance in a public subnet. This instance needs to connect to an EC2 Instance that will host an Oracle database. Which of the following steps should be followed to ensure a secure setup is in place? Select 2 answers.

Please select:

A. Place the EC2 Instance with the Oracle database in the same public subnet as the Web server for faster communication
B. Place the EC2 Instance with the Oracle database in a separate private subnet
C. Create a database security group and ensure the web security group is allowed incoming access
D. Ensure the database security group allows incoming traffic from 0.0.0.0/0
Suggested answer: B, C

Explanation:

The most secure option is to place the database in a private subnet. A diagram in the AWS documentation shows this setup. Also ensure that access is not allowed from all sources, but only from the web servers.

Option A is invalid because databases should not be placed in the public subnet.

Option D is invalid because the database security group should not allow traffic from the internet. For more information on this type of setup, please refer to the below URL:

https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/VPC_Scenario2.html

The correct answers are: Place the EC2 Instance with the Oracle database in a separate private subnet; Create a database security group and ensure the web security group is allowed incoming access.
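As a sketch of the security group rule this implies (the security group IDs are hypothetical), the database group would allow the Oracle listener port only from the web tier's security group:

```python
import boto3

ec2 = boto3.client("ec2")

# Hypothetical security group IDs for the web tier and the database tier
WEB_SG_ID = "sg-0123456789abcdef0"
DB_SG_ID = "sg-0fedcba9876543210"

# Allow Oracle traffic (default listener port 1521) into the database
# security group only from members of the web security group,
# never from 0.0.0.0/0.
ec2.authorize_security_group_ingress(
    GroupId=DB_SG_ID,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 1521,
        "ToPort": 1521,
        "UserIdGroupPairs": [{"GroupId": WEB_SG_ID}],
    }],
)
```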

An EC2 Instance hosts a Java based application that accesses a DynamoDB table. This EC2 Instance is currently serving production users. Which of the following is a secure way of ensuring that the EC2 Instance can access the DynamoDB table?

Please select:

A. Use IAM Roles with permissions to interact with DynamoDB and assign it to the EC2 Instance
B. Use KMS keys with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
C. Use IAM Access Keys with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
D. Use IAM Access Groups with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
Suggested answer: A

Explanation:

To ensure secure access to AWS resources from EC2 Instances, always assign a Role to the EC2 Instance. Option B is invalid because KMS keys are not used as a mechanism for providing EC2 Instances access to AWS services. Option C is invalid because access keys are not a safe mechanism for providing EC2 Instances access to AWS services. Option D is invalid because there is no way access groups can be assigned to EC2 Instances. For more information on IAM Roles, please refer to the below URL:

https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html

The correct answer is: Use IAM Roles with permissions to interact with DynamoDB and assign it to the EC2 Instance.
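As an illustrative sketch (the role, profile, table and instance names are hypothetical), the role is attached to the instance through an instance profile, after which code on the instance needs no hard-coded credentials:

```python
import boto3

iam = boto3.client("iam")
ec2 = boto3.client("ec2")

# Wrap an existing role (trusted by ec2.amazonaws.com and granted DynamoDB
# permissions) in an instance profile and attach it to the instance.
iam.create_instance_profile(InstanceProfileName="AppDynamoProfile")
iam.add_role_to_instance_profile(
    InstanceProfileName="AppDynamoProfile",
    RoleName="AppDynamoRole",
)
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "AppDynamoProfile"},
    InstanceId="i-0123456789abcdef0",
)

# On the instance itself, the SDK transparently picks up temporary
# credentials from the instance metadata service; nothing is hard-coded.
table = boto3.resource("dynamodb").Table("example-table")
item = table.get_item(Key={"id": "42"})
```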

An application running on EC2 instances processes sensitive information stored on Amazon S3. The information is accessed over the Internet. The security team is concerned that the Internet connectivity to Amazon S3 is a security risk. Which solution will resolve the security concern?

Please select:

A. Access the data through an Internet Gateway.
B. Access the data through a VPN connection.
C. Access the data through a NAT Gateway.
D. Access the data through a VPC endpoint for Amazon S3
Suggested answer: D

Explanation:

The AWS Documentation mentions the following:

A VPC endpoint enables you to privately connect your VPC to supported AWS services and VPC endpoint services powered by PrivateLink without requiring an internet gateway, NAT device, VPN connection, or AWS Direct Connect connection. Instances in your VPC do not require public IP addresses to communicate with resources in the service. Traffic between your VPC and the other service does not leave the Amazon network.

Options A, B and C are all invalid because the question specifically mentions that access should not be provided via the Internet. For more information on VPC endpoints, please refer to the below URL:

The correct answer is: Access the data through a VPC endpoint for Amazon S3.
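As a sketch (the VPC ID, route table ID and region are hypothetical), a gateway endpoint for S3 could be created like this, after which S3-bound traffic from the VPC stays on the Amazon network:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Hypothetical VPC and route table IDs; the gateway endpoint adds a route
# so that S3-bound traffic never traverses the public Internet.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],
)
```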

Development teams in your organization use S3 buckets to store the log files for various applications hosted in development environments in AWS. The developers want to keep the logs for one month for troubleshooting purposes, and then purge the logs. What feature will enable this requirement?

Please select:

A. Adding a bucket policy on the S3 bucket.
B. Configuring lifecycle configuration rules on the S3 bucket.
C. Creating an IAM policy for the S3 bucket.
D. Enabling CORS on the S3 bucket.
Suggested answer: B

Explanation:

The AWS Documentation mentions the following on lifecycle policies:

Lifecycle configuration enables you to specify the lifecycle management of objects in a bucket. The configuration is a set of one or more rules, where each rule defines an action for Amazon S3 to apply to a group of objects. These actions can be classified as follows:

Transition actions - In which you define when objects transition to another storage class. For example, you may choose to transition objects to the STANDARD_IA (IA, for infrequent access) storage class 30 days after creation, or archive objects to the GLACIER storage class one year after creation.

Expiration actions - In which you specify when the objects expire. Then Amazon S3 deletes the expired objects on your behalf.

Options A and C are invalid because neither bucket policies nor IAM policies can control the purging of logs. Option D is invalid because CORS is used for accessing objects across domains, not for purging logs. For more information on AWS S3 lifecycle policies, please visit the following URL:

https://docs.aws.amazon.com/AmazonS3/latest/dev/...

The correct answer is: Configuring lifecycle configuration rules on the S3 bucket.
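As a sketch of the expiration rule in question (the bucket name and prefix are hypothetical), a 30-day expiration can be set with boto3:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and prefix; objects under logs/ are deleted
# 30 days after creation, matching the one-month retention requirement.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-dev-logs-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-logs-after-30-days",
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            "Expiration": {"Days": 30},
        }],
    },
)
```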

A company is using a Redshift cluster to store their data warehouse. There is a requirement from the internal IT Security team to ensure that data is encrypted for the Redshift database. How can this be achieved?

Please select:

A. Encrypt the EBS volumes of the underlying EC2 Instances
B. Use AWS KMS Customer Default master key
C. Use SSL/TLS for encrypting the data
D. Use S3 Encryption
Suggested answer: B

Explanation:

The AWS Documentation mentions the following

Amazon Redshift uses a hierarchy of encryption keys to encrypt the database. You can use either AWS Key Management Service (AWS KMS) or a hardware security module (HSM) to manage the top-level encryption keys in this hierarchy. The process that Amazon Redshift uses for encryption differs depending on how you manage keys.

Option A is invalid because it is the cluster that needs to be encrypted.

Option C is invalid because this encrypts data in transit, not data at rest.

Option D is invalid because this is used only for objects in S3 buckets. For more information on Redshift encryption, please visit the following URL:

https://docs.aws.amazon.com/redshift/latest/mgmt/working-with-db-encryption.html

The correct answer is: Use AWS KMS Customer Default master key.
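As a sketch (the identifiers, password and key ID are placeholders), KMS encryption is chosen when the cluster is created:

```python
import boto3

redshift = boto3.client("redshift")

# Hypothetical identifiers; Encrypted=True together with a KMS key ID
# enables at-rest encryption for the entire cluster.
redshift.create_cluster(
    ClusterIdentifier="example-warehouse",
    NodeType="dc2.large",
    MasterUsername="admin",
    MasterUserPassword="Example-Password-1",  # placeholder only
    Encrypted=True,
    KmsKeyId="1234abcd-12ab-34cd-56ef-1234567890ab",
)
```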

A company has resources hosted in their AWS Account. There is a requirement to monitor all API activity for all regions, and the audit needs to apply to future regions as well. Which of the following can be used to fulfil this requirement?

Please select:

A. Enable CloudTrail for each region. Then enable it for each future region.
B. Ensure one CloudTrail trail is enabled for all regions.
C. Create a CloudTrail trail for each region. Use CloudFormation to enable the trail for all future regions.
D. Create a CloudTrail trail for each region. Use AWS Config to enable the trail for all future regions.
Suggested answer: B

Explanation:

The AWS Documentation mentions the following

You can now turn on a trail across all regions for your AWS account. CloudTrail will deliver log files from all regions to the Amazon S3 bucket and an optional CloudWatch Logs log group you specified. Additionally, when AWS launches a new region, CloudTrail will create the same trail in the new region. As a result, you will receive log files containing API activity for the new region without taking any action.

Options A and C are invalid because enabling CloudTrail for every region would be a maintenance overhead. Option D is invalid because AWS Config cannot be used to enable trails. For more information on this feature, please visit the following URL:

https://aws.amazon.com/about-aws/whats-new/2015/12/turn-on-cloudtrail-across-all-regions-and-support-for-multiple-trails/

The correct answer is: Ensure one CloudTrail trail is enabled for all regions.
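As a sketch (the trail and bucket names are hypothetical), a single multi-region trail can be created with boto3:

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Hypothetical names; one multi-region trail captures API activity from
# all current regions and from regions AWS launches in the future.
cloudtrail.create_trail(
    Name="org-wide-trail",
    S3BucketName="example-cloudtrail-logs-bucket",
    IsMultiRegionTrail=True,
)
cloudtrail.start_logging(Name="org-wide-trail")
```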
