Question 101 - CIPP-US discussion

Acme Student Loan Company has developed an artificial intelligence algorithm that determines whether an individual is likely to pay their bill or default. A person who is determined by the algorithm to be more likely to default will receive frequent payment reminder calls, while those who are less likely to default will not receive payment reminders.

Which of the following most accurately reflects the privacy concerns with Acme Student Loan Company using artificial intelligence in this manner?

A.

If the algorithm uses risk factors that impact the automatic decision engine, Acme must ensure that the algorithm does not have a disparate impact on protected classes in the output.

B.

If the algorithm makes automated decisions based on risk factors and public information, Acme need not determine if the algorithm has a disparate impact on protected classes.

C.

If the algorithm's methodology is disclosed to consumers, then it is acceptable for Acme to have a disparate impact on protected classes.

D.

If the algorithm uses information about protected classes to make automated decisions, Acme must ensure that the algorithm does not have a disparate impact on protected classes in the output.

Suggested answer: D

Explanation:

The correct answer is D. If the algorithm uses information about protected classes to make automated decisions, Acme must ensure that the algorithm does not have a disparate impact on protected classes in the output.

The Equal Credit Opportunity Act (ECOA) prohibits creditors from discriminating on the basis of race, color, religion, national origin, sex, marital status, age, or receipt of income from a public assistance program. The Fair Credit Reporting Act (FCRA) separately protects consumers from inaccurate and unfair treatment based on consumer report information. Under these laws, even a facially neutral credit practice can be unlawful if it has a disparate impact on a protected class.

Acme's algorithm makes automated decisions about which borrowers receive frequent payment reminder calls. If the model uses information about protected classes, or risk factors that act as proxies for them, its output could disproportionately burden those classes. For example, borrowers of color might be flagged as likely to default, and subjected to frequent reminder calls, even when they are just as likely to repay as other borrowers. Acme must therefore ensure that the algorithm does not have a disparate impact on protected classes, using methods such as:

Testing the algorithm for accuracy, fairness, and bias before and after deployment (see the sketch after this list)

Providing consumers with notice and consent options for the use of their data

Allowing consumers to access, correct, or delete their data

Implementing accountability and oversight mechanisms for the algorithm

Ensuring compliance with applicable laws and regulations
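
As a rough illustration of the first method above, the sketch below applies the four-fifths (80%) rule, a common screening heuristic for disparate impact: if one group receives the adverse outcome at a rate far out of proportion to another's, the model warrants closer review. The group names and counts are hypothetical, and the heuristic is a screen, not a legal determination of discrimination.

```python
# Hypothetical sketch of a disparate impact screen using the four-fifths
# rule. "Flagged" = the algorithm marked a borrower for frequent payment
# reminder calls (the adverse outcome). All data below is illustrative.

def flag_rates(flagged: dict[str, int], totals: dict[str, int]) -> dict[str, float]:
    """Per-group rate of receiving the adverse outcome."""
    return {group: flagged[group] / totals[group] for group in totals}

def fails_four_fifths(rates: dict[str, float], threshold: float = 0.8) -> bool:
    """True if the least-flagged group's rate is below 80% of the
    most-flagged group's rate -- a screening heuristic, not a legal test."""
    return min(rates.values()) / max(rates.values()) < threshold

if __name__ == "__main__":
    # Hypothetical audit counts: adverse flags per group / group size.
    flagged = {"group_a": 420, "group_b": 250}
    totals = {"group_a": 1000, "group_b": 1000}
    rates = flag_rates(flagged, totals)
    print(rates)                     # {'group_a': 0.42, 'group_b': 0.25}
    print(fails_four_fifths(rates))  # True: 0.25 / 0.42 ≈ 0.60 < 0.8
```

In practice a check like this would run over the model's actual decisions both before and after deployment, and a failing ratio would trigger deeper review of the risk factors driving the disparity.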

https://pupuweb.com/iapp-cipp-us-qa-privacy-concerns-acme-student-loan-company-artificial-intelligence/
