Amazon SCS-C01 Practice Test - Questions Answers, Page 33
List of questions
Question 321
You have a requirement to conduct penetration testing on the AWS Cloud for a couple of EC2 Instances. How could you go about doing this? Choose 2 answers from the options given below.
Explanation:
You can use a pre-approved solution from the AWS Marketplace, but to date the AWS Documentation still states that you must get prior approval before conducting a penetration test on EC2 Instances in the AWS Cloud. Options C and D are invalid because you have to get prior approval first.
The AWS Documentation provides the following details:
"For performing a penetration test on AWS resources, first of all we need to take permission from AWS and complete a requisition form and submit it for approval. The form should contain information about the instances you wish to test, identify the expected start and end dates/times of your test, and requires you to read and agree to Terms and Conditions specific to penetration testing and to the use of appropriate tools for testing. Note that the end date may not be more than 90 days from the start date."
At this time, our policy does not permit testing small or micro RDS instance types. Testing of m1.small, t1.micro, or t2.nano EC2 instance types is not permitted. For more information on penetration testing, please visit the following URL:
https://aws.amazon.com/security/penetration-testing/
The correct answers are: Get prior approval from AWS for conducting the test. Use a pre-approved penetration testing tool.
Question 322
You currently have an S3 bucket hosted in an AWS Account. It holds information that needs to be accessed by a partner account. Which is the MOST secure way to allow the partner account to access the S3 bucket in your account? Select 3 options.
Explanation:
Option B is invalid because roles are assumed; they are not IAM users.
Option E is invalid because you should not give the account ID to the partner.
Option F is invalid because you should not give the access keys to the partner. The AWS documentation includes a diagram showing an example in which an IAM role and an external ID are used to access another AWS account's resources.
For more information on creating roles for external IDs, please visit the following URL:
The correct answers are: Ensure an IAM role is created which can be assumed by the partner account. Ensure the partner uses an external ID when making the request. Provide the ARN for the role to the partner account.
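As a minimal sketch, the trust policy for such a role might look like the following. The account ID, external ID, and role ARN here are hypothetical placeholders, not values from the question:

```python
import json

# Hypothetical values for illustration -- substitute your own.
PARTNER_ACCOUNT_ID = "111122223333"   # the partner's AWS account (assumption)
EXTERNAL_ID = "partner-unique-id"     # agreed out-of-band with the partner (assumption)

# Trust policy attached to the role in YOUR account: the partner account may
# assume the role only when it supplies the agreed external ID.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{PARTNER_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
        "Condition": {"StringEquals": {"sts:ExternalId": EXTERNAL_ID}},
    }],
}

# The partner then calls STS with the role ARN you provide, e.g.:
#   sts.assume_role(RoleArn=role_arn, RoleSessionName="partner-session",
#                   ExternalId="partner-unique-id")
print(json.dumps(trust_policy, indent=2))
```

The external ID in the `Condition` block is what prevents the "confused deputy" problem: even a caller who learns the role ARN cannot assume the role without it.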
Question 323
Your company has created a set of keys using the AWS KMS service. They need to ensure that each key is only used for certain services. For example, they want one key to be used only for the S3 service. How can this be achieved?
Explanation:
Options A and B are invalid because mapping keys to services cannot be done via either an IAM policy or a bucket policy. Option D is invalid because keys for IAM users cannot be assigned to services. This is mentioned in the AWS Documentation:
The kms:ViaService condition key limits use of a customer-managed CMK to requests from particular AWS services. (AWS managed CMKs in your account, such as aws/s3, are always restricted to the AWS service that created them.) For example, you can use kms:ViaService to allow a user to use a customer managed CMK only for requests that Amazon S3 makes on their behalf. Or you can use it to deny the user permission to a CMK when a request on their behalf comes from AWS Lambda.
For more information on key policies for KMS, please visit the following URL:
https://docs.aws.amazon.com/kms/latest/developerguide/policy-conditions.html
The correct answer is: Use the kms:ViaService condition in the key policy.
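A key-policy statement using this condition could look roughly like the sketch below. The principal ARN and region are hypothetical; only the `kms:ViaService` condition key itself comes from the documentation quoted above:

```python
import json

# Sketch of a key-policy statement restricting CMK use to requests that
# Amazon S3 makes on the user's behalf. Principal and region are assumptions.
key_policy_statement = {
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111122223333:user/s3-app-user"},
    "Action": ["kms:Encrypt", "kms:Decrypt", "kms:GenerateDataKey*"],
    "Resource": "*",
    "Condition": {
        "StringEquals": {"kms:ViaService": "s3.us-east-1.amazonaws.com"}
    },
}
print(json.dumps(key_policy_statement, indent=2))
```

With this statement in the key policy, a direct `kms:Decrypt` call by the same user (not routed through S3) would be denied, because the condition would not match.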
Question 324
You have a set of customer keys created using the AWS KMS service. These keys have been used for around 6 months. You are now trying to use the new KMS features for the existing set of keys but are not able to do so. What could be the reason for this?
Explanation:
By default, keys created in KMS are created with the default key policy. When features are added to KMS, you need to explicitly update the default key policy for these keys. Options B, C, and D are invalid because the key policy is the main entity used to provide access to the keys. For more information on upgrading key policies, please visit the following URL:
https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-upgrading.html
The correct answer is: You have not explicitly given access via the key policy.
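Updating a key policy in practice means editing the policy document and pushing the whole document back. A rough sketch, with hypothetical account and principal names:

```python
import json

# Simplified version of a key's default policy (account ID is hypothetical).
key_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "Enable IAM User Permissions",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
        "Action": "kms:*",
        "Resource": "*",
    }],
}

# Explicitly grant an administrator the newer management actions by adding a
# statement, then push the updated document back with boto3, e.g.:
#   kms.put_key_policy(KeyId=key_id, PolicyName="default",
#                      Policy=json.dumps(key_policy))
key_policy["Statement"].append({
    "Sid": "Allow key administration",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111122223333:user/key-admin"},
    "Action": ["kms:DescribeKey", "kms:EnableKeyRotation", "kms:TagResource"],
    "Resource": "*",
})
print(len(key_policy["Statement"]))  # → 2
```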
Question 325
You are planning on hosting a web application on AWS. You create an EC2 Instance in a public subnet. This instance needs to connect to an EC2 Instance that will host an Oracle database. Which of the following steps should be followed to ensure a secure setup is in place? Select 2 answers.
Explanation:
The most secure option is to place the database in a private subnet; the AWS Documentation includes a diagram of this setup. Also ensure that access is allowed only from the web servers, not from all sources.
Option A is invalid because databases should not be placed in the public subnet.
Option D is invalid because the database security group should not allow traffic from the internet. For more information on this type of setup, please refer to the below URL:
https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/VPC_Scenario2.html
The correct answers are: Place the EC2 Instance with the Oracle database in a separate private subnet. Create a database security group and ensure the web security group is allowed incoming access.
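The database security group's ingress rule can reference the web tier's security group rather than an IP range. A sketch of the rule, in the shape that boto3's `ec2.authorize_security_group_ingress` expects (the group ID is hypothetical):

```python
# Hypothetical security-group ID for illustration.
WEB_SG_ID = "sg-0aaa1111"   # security group attached to the web servers (assumption)

# Ingress rule for the database security group: allow the default Oracle
# listener port (1521) only from the web tier's security group -- note there
# is deliberately no IpRanges entry, so nothing like 0.0.0.0/0 is permitted.
db_ingress = {
    "IpProtocol": "tcp",
    "FromPort": 1521,
    "ToPort": 1521,
    "UserIdGroupPairs": [{"GroupId": WEB_SG_ID}],
}
# Applied with boto3:
#   ec2.authorize_security_group_ingress(GroupId=db_sg_id,
#                                        IpPermissions=[db_ingress])
```

Referencing a security group instead of a CIDR means the rule keeps working even as web instances are replaced and their IPs change.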
Question 326
An EC2 Instance hosts a Java-based application that accesses a DynamoDB table. This EC2 Instance is currently serving production users. Which of the following is a secure way of ensuring that the EC2 Instance can access the DynamoDB table?
Explanation:
To ensure secure access to AWS resources from EC2 Instances, always assign a role to the EC2 Instance. Option B is invalid because KMS keys are not used as a mechanism for providing EC2 Instances access to AWS services. Option C is invalid because access keys are not a safe mechanism for providing EC2 Instances access to AWS services. Option D is invalid because there is no way access groups can be assigned to EC2 Instances. For more information on IAM Roles, please refer to the below URL:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html
The correct answer is: Use an IAM Role with permissions to interact with DynamoDB and assign it to the EC2 Instance.
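A sketch of the permissions policy such a role might carry, scoped to a single table. The table ARN and action list are hypothetical examples, not taken from the question:

```python
import json

# Hypothetical table ARN for illustration.
TABLE_ARN = "arn:aws:dynamodb:us-east-1:111122223333:table/Orders"

# Permissions policy attached to the EC2 instance's role. The application
# then creates its client with no hard-coded keys -- the AWS SDK picks up the
# role's temporary credentials from instance metadata automatically, e.g.:
#   dynamodb = boto3.resource("dynamodb")   # no access keys anywhere in code
role_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
        "Resource": TABLE_ARN,
    }],
}
print(json.dumps(role_policy, indent=2))
```

Scoping `Resource` to one table ARN, rather than `*`, keeps the instance's blast radius small if it is ever compromised.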
Question 327
An application running on EC2 instances processes sensitive information stored on Amazon S3. The information is accessed over the Internet. The security team is concerned that the Internet connectivity to Amazon S3 is a security risk. Which solution will resolve the security concern?
Explanation:
The AWS Documentation mentions the following:
A VPC endpoint enables you to privately connect your VPC to supported AWS services and VPC endpoint services powered by PrivateLink without requiring an internet gateway, NAT device, VPN connection, or AWS Direct Connect connection. Instances in your VPC do not require public IP addresses to communicate with resources in the service. Traffic between your VPC and the other service does not leave the Amazon network. Options A, B, and C are all invalid because the question specifically mentions that access should not be provided via the Internet. For more information on VPC endpoints, please refer to the below URL:
The correct answer is: Access the data through a VPC endpoint for Amazon S3.
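Creating a gateway endpoint for S3 takes only a few parameters. A sketch of the request, with hypothetical VPC and route-table IDs:

```python
# Hypothetical VPC and route-table IDs for illustration.
endpoint_request = {
    "VpcId": "vpc-0abc1234",
    "ServiceName": "com.amazonaws.us-east-1.s3",  # S3 service name in us-east-1
    "VpcEndpointType": "Gateway",                 # S3 supports gateway endpoints
    "RouteTableIds": ["rtb-0def5678"],            # route tables to update
}
# Applied with boto3:
#   ec2.create_vpc_endpoint(**endpoint_request)
```

Once created, the listed route tables gain a route sending S3-bound traffic through the endpoint, so instances in those subnets reach S3 without traversing the Internet.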
Question 328
Development teams in your organization use S3 buckets to store the log files for various applications hosted in development environments in AWS. The developers want to keep the logs for one month for troubleshooting purposes, and then purge them. What feature will enable this requirement?
Explanation:
The AWS Documentation mentions the following on lifecycle policies:
Lifecycle configuration enables you to specify the lifecycle management of objects in a bucket. The configuration is a set of one or more rules, where each rule defines an action for Amazon S3 to apply to a group of objects. These actions can be classified as follows:
Transition actions - In which you define when objects transition to another storage class. For example, you may choose to transition objects to the STANDARD_IA (IA, for infrequent access) storage class 30 days after creation, or archive objects to the GLACIER storage class one year after creation.
Expiration actions - In which you specify when the objects expire. Then Amazon S3 deletes the expired objects on your behalf. Options A and C are invalid because neither bucket policies nor IAM policies can control the purging of logs. Option D is invalid because CORS is used for accessing objects across domains, not for purging logs. For more information on AWS S3 lifecycle policies, please visit the following URL:
https://docs.aws.amazon.com/AmazonS3/latest/dev/object-lifecycle-mgmt.html
The correct answer is: Configuring lifecycle configuration rules on the S3 bucket.
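A lifecycle rule matching the requirement (keep logs one month, then delete) might look like this sketch; the bucket name and `logs/` prefix are hypothetical:

```python
# Expire log objects 30 days after creation; the rule applies only to keys
# under the hypothetical "logs/" prefix.
lifecycle_configuration = {
    "Rules": [{
        "ID": "purge-dev-logs",
        "Filter": {"Prefix": "logs/"},
        "Status": "Enabled",
        "Expiration": {"Days": 30},   # S3 deletes matching objects after 30 days
    }]
}
# Applied with boto3:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="dev-app-logs",
#       LifecycleConfiguration=lifecycle_configuration)
```

Amazon S3 evaluates the rule daily and performs the deletions itself, so no scheduled cleanup job is needed.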
Question 329
A company is using a Redshift cluster to store their data warehouse. There is a requirement from the internal IT Security team to ensure that data is encrypted for the Redshift database. How can this be achieved?
Explanation:
The AWS Documentation mentions the following:
Amazon Redshift uses a hierarchy of encryption keys to encrypt the database. You can use either AWS Key Management Service (AWS KMS) or a hardware security module (HSM) to manage the top-level encryption keys in this hierarchy. The process that Amazon Redshift uses for encryption differs depending on how you manage keys.
Option A is invalid because it is the cluster that needs to be encrypted.
Option C is invalid because this encrypts objects in transit and not objects at rest.
Option D is invalid because this is used only for objects in S3 buckets. For more information on Redshift encryption, please visit the following URL:
https://docs.aws.amazon.com/redshift/latest/mgmt/working-with-db-encryption.html
The correct answer is: Use the AWS KMS Customer Default master key.
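Encryption is enabled when the cluster is created (or via a cluster modification). A sketch of the creation parameters, with hypothetical identifiers; `alias/aws/redshift` is the AWS-managed default KMS key for Redshift:

```python
# Hypothetical identifiers for illustration.
cluster_request = {
    "ClusterIdentifier": "dw-cluster",
    "NodeType": "dc2.large",
    "MasterUsername": "admin",
    "MasterUserPassword": "REPLACE_ME",   # placeholder, not a real secret
    "Encrypted": True,                    # encrypt the cluster at rest
    "KmsKeyId": "alias/aws/redshift",     # default KMS key; a customer-managed
                                          # CMK ARN could be supplied instead
}
# Applied with boto3:
#   redshift.create_cluster(**cluster_request)
```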
Question 330
A company has resources hosted in their AWS Account. There is a requirement to monitor all API activity across all regions. The audit needs to apply to future regions as well. Which of the following can be used to fulfil this requirement?
Explanation:
The AWS Documentation mentions the following:
You can now turn on a trail across all regions for your AWS account. CloudTrail will deliver log files from all regions to the Amazon S3 bucket and an optional CloudWatch Logs log group you specified. Additionally, when AWS launches a new region, CloudTrail will create the same trail in the new region. As a result, you will receive log files containing API activity for the new region without taking any action. Options A and C are invalid because enabling CloudTrail separately in every region would be a maintenance overhead. Option D is invalid because AWS Config cannot be used to enable trails. For more information on this feature, please visit the following URL:
https://aws.amazon.com/about-aws/whats-new/2015/12/turn-on-cloudtrail-across-all-regions-and-support-for-multiple-trails/
The correct answer is: Ensure one CloudTrail trail is enabled for all regions.
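A sketch of the trail-creation request; the trail and bucket names are hypothetical, and the one flag that matters for this question is `IsMultiRegionTrail`:

```python
# Hypothetical trail and bucket names for illustration.
trail_request = {
    "Name": "org-audit-trail",
    "S3BucketName": "org-cloudtrail-logs",  # bucket must already grant CloudTrail write access
    "IsMultiRegionTrail": True,             # one trail covering all regions,
                                            # including regions launched later
}
# Applied with boto3:
#   cloudtrail.create_trail(**trail_request)
#   cloudtrail.start_logging(Name="org-audit-trail")
```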