
Isaca CISA Practice Test - Questions Answers, Page 68

An IS auditor is reviewing the service agreement with a technology company that provides IT help desk services to the organization. Which of the following monthly performance metrics is the BEST indicator of service quality?

A. The total number of users requesting help desk services
B. The average call waiting time on each request
C. The percent of issues resolved by the first contact
D. The average turnaround time spent on each reported issue

Suggested answer: C

Explanation:

The percent of issues resolved by the first contact, also known as the first contact resolution (FCR) rate, is a metric that measures the effectiveness and efficiency of the IT help desk services. It indicates how many customer support issues are resolved on the first interaction with the IT help desk, without requiring any follow-up calls, emails, chats, or escalations. The FCR rate is calculated by dividing the number of issues resolved on the first contact by the total number of customer support issues, and multiplying by 100%.
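
To make the calculation concrete, here is a minimal Python sketch of the FCR formula described above; the ticket counts are illustrative only.

```python
def fcr_rate(resolved_first_contact: int, total_issues: int) -> float:
    """First contact resolution rate as a percentage."""
    if total_issues <= 0:
        raise ValueError("total_issues must be positive")
    return resolved_first_contact / total_issues * 100

# Illustrative month: 740 of 1,000 tickets closed on the first interaction.
print(f"FCR = {fcr_rate(740, 1000):.1f}%")  # -> FCR = 74.0%
```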

The FCR rate is the best indicator of service quality among the four monthly performance metrics, because it reflects the following aspects of the IT help desk services:

Customer satisfaction: Customers are more likely to be satisfied with the IT help desk services if their issues are resolved quickly and effectively on the first contact, without having to wait for a response or repeat their problem to multiple agents. A high FCR rate can improve customer loyalty, retention, and advocacy.

Cost efficiency: Resolving issues on the first contact can reduce the operational costs of the IT help desk services, such as labor costs, phone costs, or overhead costs. A high FCR rate can also increase the productivity and utilization of the IT help desk agents, as they can handle more issues in less time.

Service level: Resolving issues on the first contact can improve the service level of the IT help desk services, such as reducing the average handle time (AHT), increasing service level agreement (SLA) compliance, or decreasing the backlog of unresolved issues. A high FCR rate can also enhance the reputation and credibility of the IT help desk services.

Therefore, an IS auditor should review the FCR rate as a key performance indicator (KPI) of the IT help desk services and compare it with industry standards and benchmarks. According to MetricNet's benchmarking database, the FCR industry standard is 74 percent. This number varies widely, however, from a low of about 41 percent to a high of 94 percent. An IS auditor should also recommend ways to improve the FCR rate, such as:

Training and empowering the IT help desk agents to handle a wide range of issues and provide accurate and consistent solutions

Implementing a knowledge base or a self-service portal that provides relevant and updated information and guidance for common or simple issues

Improving communication and collaboration between different departments or teams that may be involved in resolving complex or escalated issues

Using feedback and analytics tools to monitor and measure customer satisfaction and identify areas for improvement

Which of the following recommendations would BEST help an organization achieve a reasonable level of data quality?

A. Reviewing data against data classification standards
B. Outsourcing data cleansing to skilled service providers
C. Analyzing the data against predefined specifications
D. Consolidating data stored across separate databases into a warehouse

Suggested answer: C

Explanation:

Analyzing the data against predefined specifications is a method of data quality assessment that can help the organization achieve a reasonable level of data quality. Data quality assessment is the process of measuring and evaluating the accuracy, completeness, consistency, timeliness, validity, and usability of the data. Predefined specifications are the criteria or standards that define the expected or desired quality of the data. By comparing the actual data with the predefined specifications, the organization can identify and quantify any gaps, errors, or deviations in data quality and take corrective action accordingly.
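
To make the idea concrete, here is a minimal Python sketch of analyzing records against predefined specifications. The field names and rules are hypothetical; a real assessment would draw its specifications from the organization's own data quality standards.

```python
# Hypothetical predefined specifications for a customer record.
SPECS = {
    "customer_id": lambda v: v.isdigit(),             # validity: numeric ID
    "email":       lambda v: "@" in v,                # validity: rough email shape
    "country":     lambda v: v in {"US", "DE", "JP"}, # consistency: allowed codes
}

def assess(records: list[dict]) -> dict:
    """Count specification violations per field; the counts quantify
    the gaps between the actual data and the predefined specifications."""
    failures = {field: 0 for field in SPECS}
    for rec in records:
        for field, check in SPECS.items():
            value = rec.get(field, "")
            if not value or not check(value):  # completeness + validity
                failures[field] += 1
    return failures

sample = [
    {"customer_id": "42", "email": "a@example.com", "country": "US"},
    {"customer_id": "n/a", "email": "", "country": "UK"},
]
print(assess(sample))  # {'customer_id': 1, 'email': 1, 'country': 1}
```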

Reviewing data against data classification standards (A) is not the best answer, because it is not a method of data quality assessment, but rather a method of data security management. Data classification standards are the rules or guidelines that define the level of sensitivity and confidentiality of the data and determine the appropriate security and access controls for it. For example, data can be classified into public, internal, confidential, or restricted categories. Reviewing data against data classification standards can help the organization protect the data from unauthorized or inappropriate use or disclosure, but it does not directly improve data quality.

Outsourcing data cleansing to skilled service providers (B) is not the best answer, because it is not a method for achieving a reasonable level of data quality in itself, but rather a decision to delegate the responsibility for data quality management to external parties. Data cleansing is the process of detecting and correcting errors, inconsistencies, or anomalies in the data. Skilled service providers are third-party vendors or contractors with the expertise and resources to perform data cleansing tasks. Outsourcing data cleansing may offer benefits such as cost savings, efficiency, or scalability, but it also carries risks such as loss of control, dependency, or liability.

Consolidating data stored across separate databases into a warehouse (D) is not the best answer, because it is a method of data integration and storage, not data quality assessment. Data integration is the process of combining and transforming data from different sources and formats into a unified and consistent view. A data warehouse is a centralized repository that stores integrated and historical data for analytical purposes. Consolidation can improve the availability and accessibility of the data, but it does not necessarily improve its quality.

Which of the following should be identified FIRST during the risk assessment process?

A. Vulnerability to threats
B. Existing controls
C. Information assets
D. Legal requirements

Suggested answer: C

Explanation:

The risk assessment process involves identifying the information assets that are at risk, analyzing the threats and vulnerabilities that could affect them, evaluating the impact and likelihood of a risk event, and determining the appropriate controls to mitigate the risk. The first step is to identify the information assets, as they are the objects of protection and the basis for the rest of the process. Without knowing what assets are at risk, it is not possible to assess their value, exposure, or protection level. Reference: ISACA Frameworks: Blueprints for Success

Which of the following should be done FIRST to minimize the risk of unstructured data?

A. Identify repositories of unstructured data.
B. Purchase tools to analyze unstructured data.
C. Implement strong encryption for unstructured data.
D. Implement user access controls to unstructured data.

Suggested answer: A

Explanation:

Unstructured data is data that does not have a predefined model or organization, making it difficult to store, process, and analyze using traditional relational databases or spreadsheets. Unstructured data can pose a risk to an organization if it contains sensitive, confidential, or regulated information that is not properly secured, managed, or governed. To minimize the risk of unstructured data, the first step is to identify the repositories of unstructured data, such as file servers, cloud storage, email systems, social media platforms, etc. This will help to understand the scope, volume, and nature of unstructured data in the organization, and to prioritize the areas that need further analysis and action. Reference: Unstructured data - Wikipedia
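
As a concrete illustration, here is a minimal Python sketch of a first-pass inventory of one suspected repository: it tallies file extensions under a root path to gauge what kind of content (documents, spreadsheets, mail archives, media) lives there. The mount point is hypothetical, and a real discovery effort would also cover cloud storage and email systems.

```python
import os
from collections import Counter

def inventory(root: str) -> Counter:
    """Tally file extensions under a repository root as a rough
    profile of the unstructured content stored there."""
    counts = Counter()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            ext = os.path.splitext(name)[1].lower() or "<no extension>"
            counts[ext] += 1
    return counts

# Hypothetical file-share mount point; print the ten most common types.
for ext, n in inventory("/mnt/fileshare").most_common(10):
    print(f"{ext:15} {n}")
```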

An organization has an acceptable use policy in place, but users do not formally acknowledge the policy. Which of the following is the MOST significant risk from this finding?

A. Lack of data for measuring compliance
B. Violation of industry standards
C. Noncompliance with documentation requirements
D. Lack of user accountability

Suggested answer: D

Explanation:

An acceptable use policy (AUP) is a document that defines the rules and guidelines for using an organization's IT resources, such as networks, devices, and software. It aims to protect the organization's assets, security, and productivity. An AUP should be formally acknowledged by users to ensure that they are aware of their responsibilities and obligations when using the IT resources. Without formal acknowledgment, users may not be held accountable for violating the AUP or may claim ignorance of the policy. This can expose the organization to legal, regulatory, reputational, or operational risks. Lack of data for measuring compliance, violation of industry standards, and noncompliance with documentation requirements are also possible risks from not having users acknowledge the AUP, but they are less significant than lack of user accountability. Reference: Workable: Acceptable use policy template; Wikipedia: Acceptable use policy

An organization is concerned about duplicate vendor payments on a complex system with a high volume of transactions. Which of the following would be MOST helpful to an IS auditor to determine whether duplicate vendor payments exist?

A. Computer-assisted technique
B. Stratified sampling
C. Statistical sampling
D. Process walk-through

Suggested answer: A

Explanation:

A computer-assisted technique, often called a computer-assisted audit technique (CAAT), is the most helpful method for an IS auditor to determine whether duplicate vendor payments exist on a complex system with a high volume of transactions. A computer-assisted technique is a tool or procedure that can be used to perform audit tests or procedures on data stored in electronic form. Examples of computer-assisted techniques include data analysis software, query tools, scripting languages, and specialized audit software. A computer-assisted technique can help an IS auditor identify and extract duplicate payments from a large data set, perform calculations and comparisons, and generate reports and summaries. It can also provide more accuracy, efficiency, and coverage than manual methods.
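
To illustrate, here is a minimal Python sketch of one such computer-assisted test: it flags payments that share a vendor, invoice number, and amount. The file name and column headings are hypothetical; specialized audit software would typically offer richer matching, such as near-duplicate amounts or transposed invoice numbers.

```python
import csv
from collections import defaultdict

def find_duplicate_payments(path: str) -> list[list[dict]]:
    """Group payments by (vendor, invoice number, amount); any group
    with more than one row is a candidate duplicate payment."""
    groups = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["vendor_id"], row["invoice_no"], row["amount"])
            groups[key].append(row)
    return [rows for rows in groups.values() if len(rows) > 1]

# Hypothetical export of the payment transactions under audit.
for group in find_duplicate_payments("payments.csv"):
    print("Candidate duplicate:", group)
```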

Stratified sampling, statistical sampling, and process walk-through are not as helpful as a computer-assisted technique for this purpose. Stratified sampling is a sampling method that divides the population into subgroups based on certain characteristics and selects samples from each subgroup. Statistical sampling is a sampling method that uses probability theory to determine the sample size and selection criteria. A process walk-through is a review technique that involves following a transaction or process from start to finish and observing the inputs, outputs, controls, and documentation. These methods may be useful for other audit objectives, but they are not as effective as a computer-assisted technique for detecting duplicate payments in a complex, high-volume system. Reference: ISACA Frameworks: Blueprints for Success; ISACA Glossary of Terms

An organization has recently become aware of a pervasive chip-level security vulnerability that affects all of its processors. Which of the following is the BEST way to prevent this vulnerability from being exploited?

A. Implement security awareness training.
B. Install vendor patches.
C. Review hardware vendor contracts.
D. Review security log incidents.

Suggested answer: B

Explanation:

The best way to prevent a chip-level security vulnerability from being exploited is to install vendor patches. A chip-level security vulnerability is a flaw in the design or implementation of a processor that allows an attacker to bypass the normal security mechanisms and access privileged information or execute malicious code. A vendor patch is a software update provided by the manufacturer of the processor that fixes or mitigates the vulnerability. Installing vendor patches can help to protect the system from known exploits and reduce the risk of data leakage or compromise.

Security awareness training, reviewing hardware vendor contracts, and reviewing security log incidents are not as effective as installing vendor patches for preventing a chip-level security vulnerability from being exploited. Security awareness training is an educational program that teaches users about the importance of security and how to avoid common threats. Reviewing hardware vendor contracts is a legal process that evaluates the terms and conditions of the agreement between the organization and the processor supplier. Reviewing security log incidents is an analytical process that examines the records of security events and activities on the system. These methods may be useful for other security purposes, but they do not directly address the root cause of the chip-level vulnerability or prevent its exploitation. Reference: Protecting your device against chip-related security vulnerabilities; New 'Downfall' Flaw Exposes Valuable Data in Generations of Intel Chips

Which of the following should be the GREATEST concern to an IS auditor reviewing an organization's method to transport sensitive data between offices?

A. The method relies exclusively on the use of public key infrastructure (PKI).
B. The method relies exclusively on the use of digital signatures.
C. The method relies exclusively on the use of asymmetric encryption algorithms.
D. The method relies exclusively on the use of 128-bit encryption.

Suggested answer: C

Explanation:

The greatest concern to an IS auditor reviewing an organization's method to transport sensitive data between offices is that the method relies exclusively on the use of asymmetric encryption algorithms. Asymmetric encryption, also known as public key encryption, uses two different keys: a public key that can be distributed openly, and a private key that is kept secret by its owner; data encrypted with the public key can be decrypted only with the corresponding private key. Asymmetric algorithms solve the key-distribution problem of symmetric encryption, which uses the same key for both encryption and decryption, but they are far slower and more computationally intensive. Therefore, relying exclusively on asymmetric encryption may not be efficient or practical for transporting large amounts of sensitive data between offices. A better method is a hybrid approach: use asymmetric encryption to exchange a symmetric key, then use symmetric encryption to encrypt and decrypt the bulk data.
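
The hybrid approach is easy to sketch. The following minimal Python example uses the third-party cryptography package (an assumption; any comparable library would do): the bulk data is encrypted with a fast symmetric key, and only that small key is encrypted asymmetrically. Key sizes and padding here are illustrative, not a recommendation.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Receiving office generates an asymmetric (RSA) key pair once.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sending office: encrypt the bulk data with a fresh symmetric key (fast).
sym_key = Fernet.generate_key()
ciphertext = Fernet(sym_key).encrypt(b"large batch of sensitive records ...")

# Sending office: encrypt only the small symmetric key asymmetrically
# (slow, but the payload is tiny), then transmit ciphertext + wrapped key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(sym_key, oaep)

# Receiving office: unwrap the symmetric key, then decrypt the bulk data.
recovered = private_key.decrypt(wrapped_key, oaep)
assert Fernet(recovered).decrypt(ciphertext).startswith(b"large batch")
```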

The other options are not as concerning as option C. Relying exclusively on public key infrastructure (PKI) is not a concern, because PKI is a system that provides the services and mechanisms for creating, managing, distributing, using, storing, and revoking digital certificates based on asymmetric encryption; it enables secure, authenticated communication between parties who do not have a prior trust relationship. Relying exclusively on digital signatures is not a concern, because digital signatures verify the authenticity and integrity of a message or document using asymmetric encryption: the sender cannot deny sending it, and the receiver can detect any tampering or alteration. Relying exclusively on 128-bit encryption is not a concern, because a 128-bit key is considered strong enough to resist brute-force attacks by modern computers. Reference: Asymmetric vs Symmetric Encryption: What are differences?; Public Key Infrastructure (PKI); Digital Signature; What is 128-bit Encryption?

Which of the following is the BEST point in time to conduct a post-implementation review?

A. After a full processing cycle
B. Immediately after deployment
C. After the warranty period
D. Prior to the annual performance review

Suggested answer: A

Explanation:

The best point in time to conduct a post-implementation review is after a full processing cycle. A post-implementation review is a process to evaluate whether the objectives of the project were met, how effectively the project was managed, what benefits were realized, and what lessons were learned. It should be conducted after a full processing cycle, which is the period of time required for a system or process to complete all its functions and produce its outputs. This allows for a more accurate and comprehensive assessment of the project's performance, outcomes, impacts, and issues.

The other options are not as good as option A. Conducting a post-implementation review immediately after deployment is too soon, because it does not allow enough time for the project's product or service to operate in the real world and generate measurable results. Conducting a post-implementation review after the warranty period is too late, because it may miss some important feedback or opportunities for improvement that could have been addressed earlier. Conducting a post-implementation review prior to the annual performance review is irrelevant, because it does not align with the project's life cycle or objectives. Reference: What is Post-Implementation Review in Project Management?; What Is the Post-Implementation Review (PIR) Process?; Post-implementation review in project management?

During a project audit, an IS auditor notes that project reporting does not accurately reflect current progress. Which of the following is the GREATEST resulting impact?

A. The project manager will have to be replaced.
B. The project reporting to the board of directors will be incomplete.
C. The project steering committee cannot provide effective governance.
D. The project will not withstand a quality assurance (QA) review.

Suggested answer: C

Explanation:

The greatest resulting impact of project reporting not accurately reflecting current progress is that the project steering committee cannot provide effective governance. The project steering committee is a group of senior executives or stakeholders who oversee the project and provide strategic direction, guidance, and support. The project steering committee relies on accurate and timely project reporting to monitor the project's status, performance, risks, issues, and changes. If the project reporting is inaccurate, the project steering committee cannot make informed decisions, resolve problems, allocate resources, or ensure alignment with the organizational goals and objectives.

The other options are not as impactful as option C.

The project manager will have to be replaced (A) is a possible consequence, but not the greatest impact, of inaccurate project reporting. The project manager is responsible for planning, executing, monitoring, controlling, and closing the project, and may face disciplinary action or termination for failing to provide accurate and honest project reporting. However, this does not necessarily affect the overall governance of the project.

The project reporting to the board of directors will be incomplete (B) is a potential risk, but not the greatest impact. The board of directors is the highest governing body of an organization, setting its vision, mission, values, and policies. The board may receive periodic or ad hoc project reporting to ensure that the project is aligned with the organizational strategy and delivers value; if that reporting is inaccurate, the board may lose confidence in the project or intervene in its management. However, this does not directly affect the day-to-day governance of the project.

The project will not withstand a quality assurance (QA) review (D) is a possible outcome, but not the greatest impact. A QA review evaluates the quality of the project's processes and deliverables against predefined standards and criteria, and may reveal discrepancies or errors in the project reporting that affect the project's credibility and reliability. However, this does not necessarily affect the governance of the project.

Reference: Project Steering Committee - Roles & Responsibilities; Project Reporting Best Practices; Quality Assurance in Project Management
