Amazon SCS-C01 Practice Test - Questions Answers, Page 37

A large organization is planning on using AWS to host its resources. It has a number of autonomous departments that wish to use AWS. What strategy should be adopted for managing the accounts? Please select:

A. Use multiple VPCs in the account, one VPC for each department
B. Use multiple IAM groups, one group for each department
C. Use multiple IAM roles, one role for each department
D. Use multiple AWS accounts, one account for each department
Suggested answer: D

Explanation:

A recommendation for this is given in the AWS Security Best Practices whitepaper.

Option A is incorrect since a VPC only segregates resources at the network level within a single account. Options B and C are incorrect since operationally they would be difficult to manage. For more information on AWS security best practices, please refer to the following URL: https://d1.awsstatic.com/whitepapers/Security/AWS_Security_Best_Practices.pdf

The correct answer is: Use multiple AWS accounts, one account for each department
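As an illustration only, a minimal boto3 sketch of how the one-account-per-department strategy could be automated with AWS Organizations; the department names and email addresses are placeholder assumptions, not part of the question:

```python
import boto3

# Organizations lets the management account create a member account per
# department; the names and emails below are placeholders.
org = boto3.client("organizations")

departments = {
    "finance": "aws-finance@example.com",
    "engineering": "aws-engineering@example.com",
    "marketing": "aws-marketing@example.com",
}

for name, email in departments.items():
    response = org.create_account(Email=email, AccountName=f"{name}-dept")
    print(name, response["CreateAccountStatus"]["State"])
```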

An employee keeps terminating EC2 instances in the production environment. You've determined the best way to ensure this doesn't happen is to add an extra layer of defense against terminating the instances. What is the best method to ensure the employee does not terminate the production instances? Choose the 2 correct answers from the options below. Please select:

A. Tag the instance with a production-identifying tag and add resource-level permissions to the employee user with an explicit deny on the terminate API call for instances with the production tag.
B. Tag the instance with a production-identifying tag and modify the employee's group to allow only start, stop, and reboot API calls and not the terminate instance call.
C. Modify the IAM policy on the user to require MFA before deleting EC2 instances and disable MFA access for the employee.
D. Modify the IAM policy on the user to require MFA before deleting EC2 instances.
Suggested answer: A, B

Explanation:

Tags enable you to categorize your AWS resources in different ways, for example, by purpose, owner, or environment. This is useful when you have many resources of the same type: you can quickly identify a specific resource based on the tags you've assigned to it. Each tag consists of a key and an optional value, both of which you define. Options C and D are incorrect because they will not ensure that the employee cannot terminate the instance. For more information on tagging resources, please refer to the following URL: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html

The correct answers are: Tag the instance with a production-identifying tag and add resource-level permissions to the employee user with an explicit deny on the terminate API call for instances with the production tag; Tag the instance with a production-identifying tag and modify the employee's group to allow only start, stop, and reboot API calls and not the terminate instance call.
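As a rough sketch of option A (not part of the original question), the explicit deny could be attached to the user with boto3; the tag key Environment=production and the user name are assumptions:

```python
import json
import boto3

iam = boto3.client("iam")

# Explicit deny on TerminateInstances for any instance carrying the
# production tag; the tag key/value and user name are placeholders.
deny_terminate_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": "ec2:TerminateInstances",
        "Resource": "arn:aws:ec2:*:*:instance/*",
        "Condition": {
            "StringEquals": {"ec2:ResourceTag/Environment": "production"}
        },
    }],
}

iam.put_user_policy(
    UserName="example-employee",
    PolicyName="DenyTerminateProductionInstances",
    PolicyDocument=json.dumps(deny_terminate_policy),
)
```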

You have been given a new brief from your supervisor for a client who needs a web application set up on AWS. The most important requirement is that MySQL must be used as the database, and this database must not be hosted in the public cloud, but rather at the client's data center due to security risks. Which of the following solutions would be the best to ensure that the client's requirements are met? Choose the correct answer from the options below. Please select:

A. Build the application server on a public subnet and the database at the client's data center. Connect them with a VPN connection which uses IPsec.
B. Use the public subnet for the application server and use RDS with a storage gateway to access and synchronize the data securely from the local data center.
C. Build the application server on a public subnet and the database on a private subnet with a NAT instance between them.
D. Build the application server on a public subnet and build the database in a private subnet with a secure SSH connection to the private subnet from the client's data center.
Suggested answer: A

Explanation:

Since the database should not be hosted in the cloud, all other options are invalid.

The best option is to create an IPsec VPN connection to secure the traffic between the VPC and the client's data center.

Option B is invalid because this is an incorrect use of the Storage Gateway.

Option C is invalid since this is an incorrect use of a NAT instance.

Option D is invalid since this is an incorrect configuration. For more information on VPN connections, please visit the following URL: http://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/VPC_VPN.html

The correct answer is: Build the application server on a public subnet and the database at the client's data center. Connect them with a VPN connection which uses IPsec.
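A minimal sketch of how the site-to-site IPsec VPN from option A could be provisioned with boto3; the customer gateway IP, ASN and VPC ID are placeholder assumptions:

```python
import boto3

ec2 = boto3.client("ec2")

# Customer gateway represents the client's on-premises VPN device.
cgw = ec2.create_customer_gateway(
    Type="ipsec.1", PublicIp="203.0.113.12", BgpAsn=65000
)["CustomerGateway"]

# Virtual private gateway attached to the VPC hosting the application server.
vgw = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]
ec2.attach_vpn_gateway(VpnGatewayId=vgw["VpnGatewayId"], VpcId="vpc-0123456789abcdef0")

# Site-to-site IPsec VPN connection between the two gateways.
vpn = ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId=cgw["CustomerGatewayId"],
    VpnGatewayId=vgw["VpnGatewayId"],
    Options={"StaticRoutesOnly": True},
)
```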

You are planning on using the AWS KMS service for managing keys for your application. Which of the following can the KMS CMK be used to encrypt? Choose 2 answers from the options given below. Please select:

A. Image Objects
B. Large files
C. Password
D. RSA Keys
Suggested answer: C, D

Explanation:

The CMK itself can only be used to encrypt data that is a maximum of 4 KB in size. Hence it can be used for encrypting information such as passwords and RSA keys. Options A and B are invalid because the CMK can only be used to encrypt small amounts of data, not large amounts of data. You have to generate a data key from the CMK in order to encrypt large amounts of data. For more information on the concepts for KMS, please visit the following URL: https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html

The correct answers are: Password, RSA Keys
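A short sketch of the difference between encrypting directly under the CMK and using a data key for larger payloads; the key alias is a placeholder assumption:

```python
import boto3

kms = boto3.client("kms")
key_id = "alias/example-app-key"  # placeholder CMK alias

# Small secrets (up to 4 KB) can be encrypted directly under the CMK.
secret = kms.encrypt(KeyId=key_id, Plaintext=b"database-password")
ciphertext = secret["CiphertextBlob"]

# For larger payloads, generate a data key: encrypt the data locally with
# the plaintext key and store only the encrypted copy of the data key.
data_key = kms.generate_data_key(KeyId=key_id, KeySpec="AES_256")
plaintext_key = data_key["Plaintext"]       # use for local AES-256 encryption
encrypted_key = data_key["CiphertextBlob"]  # store alongside the ciphertext
```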

A company has been using the AWS KMS service for managing its keys. It is planning on carrying out housekeeping activities and deleting keys which are no longer in use. What are the ways to see which keys are in use? Choose 2 answers from the options given below. Please select:

A. Determine the age of the master key
B. See who is assigned permissions to the master key
C. See CloudTrail for usage of the key
D. Use AWS CloudWatch Events for events generated for the key
Suggested answer: B, C

Explanation:

The direct ways to see how the key is being used are to review the current access permissions and the CloudTrail logs. Option A is invalid because seeing how long ago the key was created would not determine the usage of the key. Option D is invalid because CloudTrail, not CloudWatch Events, is the appropriate service for reviewing the events generated by the key. This is also mentioned in the AWS documentation:

Examining CMK Permissions to Determine the Scope of Potential Usage

Determining who or what currently has access to a customer master key (CMK) might help you determine how widely the CMK was used and whether it is still needed. To learn how to determine who or what currently has access to a CMK, go to Determining Access to an AWS KMS Customer Master Key.

Examining AWS CloudTrail Logs to Determine Actual Usage

AWS KMS is integrated with AWS CloudTrail, so all AWS KMS API activity is recorded in CloudTrail log files. If you have CloudTrail turned on in the region where your customer master key (CMK) is located, you can examine your CloudTrail log files to view a history of all AWS KMS API activity for a particular CMK, and thus its usage history. You might be able to use a CMK's usage history to help you determine whether or not you still need it. For more information on determining the usage of CMK keys, please visit the following URL: https://docs.aws.amazon.com/kms/latest/developerguide/deleting-keys-determining-usage.html

The correct answers are: See who is assigned permissions to the master key; See CloudTrail for usage of the key
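A minimal sketch of both checks with boto3 (key policy and grants for permissions, CloudTrail for actual usage); the key ID is a placeholder assumption:

```python
import boto3

kms = boto3.client("kms")
cloudtrail = boto3.client("cloudtrail")
key_id = "1234abcd-12ab-34cd-56ef-1234567890ab"  # placeholder CMK ID

# Who can use the key: inspect the key policy and any grants on the CMK.
policy = kms.get_key_policy(KeyId=key_id, PolicyName="default")["Policy"]
grants = kms.list_grants(KeyId=key_id)["Grants"]

# How the key has actually been used: look up recent KMS API events
# recorded by CloudTrail that reference this key.
events = cloudtrail.lookup_events(
    LookupAttributes=[{"AttributeKey": "ResourceName", "AttributeValue": key_id}],
    MaxResults=50,
)["Events"]
```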

Which of the following is the correct sequence of how KMS manages the keys when used along with the Redshift cluster service? Please select:

A. The master key encrypts the cluster key. The cluster key encrypts the database key. The database key encrypts the data encryption keys.
B. The master key encrypts the database key. The database key encrypts the data encryption keys.
C. The master key encrypts the data encryption keys. The data encryption keys encrypt the database key.
D. The master key encrypts the cluster key, database key and data encryption keys.
Suggested answer: A

Explanation:

This is mentioned in the AWS Documentation

Amazon Redshift uses a four-tier, key-based architecture for encryption. The architecture consists of data encryption keys, a database key, a cluster key, and a master key. Data encryption keys encrypt data blocks in the cluster. Each data block is assigned a randomly generated AES-256 key. These keys are encrypted by using the database key for the cluster. The database key encrypts data encryption keys in the cluster. The database key is a randomly generated AES-256 key. It is stored on disk in a separate network from the Amazon Redshift cluster and passed to the cluster across a secure channel.

The cluster key encrypts the database key for the Amazon Redshift cluster.

Option B is incorrect because the master key encrypts the cluster key and not the database key

Option C is incorrect because the master key encrypts the cluster key and not the data encryption keys. Option D is incorrect because the master key encrypts the cluster key only. For more information on how keys are used in Redshift, please visit the following URL:

https://docs.aws.amazon.com/kms/latest/developerguide/services-redshift.html

The correct answer is: The master key encrypts the cluster key. The cluster key encrypts the database key. The database key encrypts the data encryption keys.
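For context only, a minimal sketch of enabling this KMS-backed key hierarchy when creating a Redshift cluster; the cluster identifier, credentials and key alias are placeholder assumptions:

```python
import boto3

redshift = boto3.client("redshift")

redshift.create_cluster(
    ClusterIdentifier="example-encrypted-cluster",
    NodeType="dc2.large",
    MasterUsername="awsuser",
    MasterUserPassword="Example-Passw0rd",     # placeholder credential
    Encrypted=True,                            # turns on the four-tier key hierarchy
    KmsKeyId="alias/example-redshift-key",     # CMK acting as the master key
)
```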

A company wants to use CloudTrail for logging all API activity. It wants to segregate the logging of data events and management events. How can this be achieved? Choose 2 answers from the options given below. Please select:

A. Create one CloudTrail log group for data events
B. Create one trail that logs data events to an S3 bucket
C. Create another trail that logs management events to another S3 bucket
D. Create another CloudTrail log group for management events
Suggested answer: B, C

Explanation:

The AWS Documentation mentions the following

You can configure multiple trails differently so that the trails process and log only the events that you specify. For example, one trail can log read-only data and management events, so that all read-only events are delivered to one S3 bucket. Another trail can log only write-only data and management events, so that all write-only events are delivered to a separate S3 bucket. Options A and D are invalid because you have to create a trail, not a log group. For more information on managing events with CloudTrail, please visit the following URL: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/logging-management-and-data-events-with-cloudtrail.html

The correct answers are: Create one trail that logs data events to an S3 bucket; Create another trail that logs management events to another S3 bucket
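A minimal sketch of the two-trail setup with boto3; the trail names and bucket names are placeholder assumptions (the buckets must already exist with an appropriate CloudTrail bucket policy):

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Trail 1: S3 object-level data events only, no management events.
cloudtrail.create_trail(Name="data-events-trail", S3BucketName="example-data-events-bucket")
cloudtrail.put_event_selectors(
    TrailName="data-events-trail",
    EventSelectors=[{
        "ReadWriteType": "All",
        "IncludeManagementEvents": False,
        "DataResources": [{"Type": "AWS::S3::Object",
                           "Values": ["arn:aws:s3:::example-app-bucket/"]}],
    }],
)

# Trail 2: management events only, delivered to a separate bucket.
cloudtrail.create_trail(Name="management-events-trail", S3BucketName="example-mgmt-events-bucket")
cloudtrail.put_event_selectors(
    TrailName="management-events-trail",
    EventSelectors=[{"ReadWriteType": "All",
                     "IncludeManagementEvents": True,
                     "DataResources": []}],
)
```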

Your company has been using AWS for the past 2 years. It has separate S3 buckets for logging the various AWS services that have been used, and it has hired an external vendor for analyzing the log files. The vendor has its own AWS account. What is the best way to ensure that the vendor's account can access the log files in the company account for analysis? Choose 2 answers from the options given below. Please select:

A. Create an IAM user in the company account
B. Create an IAM Role in the company account
C. Ensure the IAM user has read-only access to the S3 buckets
D. Ensure the IAM Role has read-only access to the S3 buckets
Suggested answer: B, D

Explanation:

The AWS Documentation mentions the following

To share log files between multiple AWS accounts, you must perform the following general steps.

These steps are explained in detail later in this section.

Create an IAM role for each account that you want to share log files with.

For each of these IAM roles, create an access policy that grants read-only access to the account you want to share the log files with. Have an IAM user in each account programmatically assume the appropriate role and retrieve the log files. Options A and C are invalid because creating an IAM user and then sharing the IAM user's credentials with the vendor is bad practice from a security perspective. For more information on sharing CloudTrail log files, please visit the following URL: https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-sharing-logs.html

The correct answers are: Create an IAM Role in the company account; Ensure the IAM Role has read-only access to the S3 buckets
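A minimal sketch of the cross-account role with boto3; the vendor account ID and log bucket name are placeholder assumptions:

```python
import json
import boto3

iam = boto3.client("iam")

VENDOR_ACCOUNT_ID = "111122223333"          # placeholder vendor account
LOG_BUCKET = "example-company-log-bucket"   # placeholder log bucket

# Trust policy: allow principals in the vendor account to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{VENDOR_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="VendorLogAnalysisRole",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Permissions policy: read-only access to the log bucket.
read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            f"arn:aws:s3:::{LOG_BUCKET}",
            f"arn:aws:s3:::{LOG_BUCKET}/*",
        ],
    }],
}

iam.put_role_policy(
    RoleName="VendorLogAnalysisRole",
    PolicyName="ReadOnlyLogBucketAccess",
    PolicyDocument=json.dumps(read_only_policy),
)
```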

Your company has been using AWS for hosting EC2 Instances for their web and database applications.

They want a compliance check to verify the following:

Whether any ports are left open other than admin ones like SSH and RDP
Whether any ports on the database server other than the ones from the web server security group are open

Which of the following can help achieve this in the easiest way possible? You don't want to carry out any extra configuration changes.

Please select:

A. AWS Config
B. AWS Trusted Advisor
C. AWS Inspector
D. AWS GuardDuty
Suggested answer: B

Explanation:

Trusted Advisor checks for compliance with the following security recommendations:

Limited access to common administrative ports to only a small subset of addresses. This includes ports 22 (SSH), 23 (Telnet), 3389 (RDP), and 5500 (VNC).
Limited access to common database ports. This includes ports 1433 (MSSQL Server), 1434 (MSSQL Monitor), 3306 (MySQL), 1521 (Oracle) and 5432 (PostgreSQL).
Option A is partially correct, but you would need to write custom rules for these checks, whereas Trusted Advisor gives you all of these checks on its dashboard. Option C is incorrect. Amazon Inspector needs a software agent to be installed on all EC2 instances that are included in the assessment target, the security of which you want to evaluate with Amazon Inspector. It monitors the behavior of the EC2 instance on which it is installed, including network, file system, and process activity, and collects a wide set of behavior and configuration data (telemetry), which it then passes to the Amazon Inspector service. The question's requirement is to choose the option that is easiest to implement, hence Trusted Advisor is more appropriate for this question. Option D is invalid because GuardDuty does not provide these details.

For more information on Trusted Advisor, please visit the following URL: https://aws.amazon.com/premiumsupport/trustedadvisor/

The correct answer is: AWS Trusted Advisor
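For reference, Trusted Advisor check results can also be read programmatically through the AWS Support API (a Business or Enterprise support plan is required); a minimal sketch:

```python
import boto3

# The Support API (and therefore Trusted Advisor data) is served from us-east-1.
support = boto3.client("support", region_name="us-east-1")

checks = support.describe_trusted_advisor_checks(language="en")["checks"]

# Print the status of the security-category checks, which include the
# unrestricted administrative and database port checks mentioned above.
for check in checks:
    if check["category"] == "security":
        result = support.describe_trusted_advisor_check_result(
            checkId=check["id"], language="en"
        )["result"]
        print(check["name"], "-", result["status"])
```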

A company is planning on using AWS for hosting its applications. It wants complete separation and isolation of its production, testing and development environments. Which of the following is an ideal way to design such a setup? Please select:

A. Use separate VPCs for each of the environments
B. Use separate IAM Roles for each of the environments
C. Use separate IAM Policies for each of the environments
D. Use separate AWS accounts for each of the environments
Suggested answer: D

Explanation:

A recommendation from the AWS Security Best Practices whitepaper highlights this as well.

Option A is partially valid: you can segregate resources with separate VPCs, but the best practice for complete isolation is to use multiple accounts. Options B and C are invalid because from a maintenance perspective this could become very difficult. For more information on the security best practices, please visit the following URL: https://d1.awsstatic.com/whitepapers/Security/AWS_Security_Best_Practices.pdf

The correct answer is: Use separate AWS accounts for each of the environments
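A minimal sketch of carving the organization into one organizational unit (and ultimately one account) per environment with AWS Organizations; the root lookup and environment names are placeholder assumptions:

```python
import boto3

org = boto3.client("organizations")

# One organizational unit per environment under the organization root.
root_id = org.list_roots()["Roots"][0]["Id"]

for env in ("production", "testing", "development"):
    ou = org.create_organizational_unit(ParentId=root_id, Name=env)
    print(env, ou["OrganizationalUnit"]["Id"])
    # The dedicated account for each environment would then be created or
    # invited and moved under its OU with org.move_account(...).
```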
