Question 28 - H13-311_V3.5 discussion


Which of the following statements are true about decision trees?

A. The common decision tree algorithms include ID3, C4.5, and CART.

B. Quantitative indicators of purity can only be obtained by using information entropy.

C. Building a decision tree means selecting feature attributes and determining their tree structure.

D. A key step to building a decision tree involves dividing all feature attributes and comparing the purity of the division's result sets.
Suggested answer: A, C, D

Explanation:

A. TRUE. The common decision tree algorithms include ID3, C4.5, and CART. These are the most widely used algorithms for decision tree generation.

B. FALSE. Purity in decision trees can be measured with several metrics, not only information entropy. For example, ID3 and C4.5 use entropy-based information gain, while CART uses the Gini index; misclassification error is another common purity measure.
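To make the point in B concrete, here is a minimal sketch (not from the HCIA-AI course material) of two common purity metrics computed over a set of class labels; the function names are illustrative:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy: -sum(p * log2(p)) over class proportions p."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity: 1 - sum(p^2) over class proportions p."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

labels = ["yes", "yes", "no", "no"]
print(entropy(labels))  # 1.0 for a perfectly mixed 50/50 node
print(gini(labels))     # 0.5 for the same node
```

Both metrics reach their maximum on a perfectly mixed node and drop to 0 on a pure node, which is why either can serve as the quantitative indicator of purity.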

C. TRUE. Building a decision tree involves selecting the best feature attributes and determining their order in the tree structure so as to split the data effectively.

D. TRUE. One key step in decision tree generation is evaluating candidate splits on each feature attribute and comparing the purity of the resulting subsets (i.e., how well each split segregates the target variable), using metrics such as information gain or the Gini index.
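The step described in D can be sketched as follows; this is an illustrative toy implementation (not Huawei course code), with hypothetical names like `information_gain`, that picks the split feature with the highest entropy reduction:

```python
from collections import Counter, defaultdict
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature_index):
    """Entropy reduction obtained by splitting on one feature."""
    groups = defaultdict(list)
    for row, label in zip(rows, labels):
        groups[row[feature_index]].append(label)
    n = len(labels)
    # Weighted average entropy of the subsets produced by the split
    weighted = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - weighted

# Toy data: features = (outlook, windy), label = whether to play
rows = [("sunny", True), ("sunny", False), ("rain", True), ("rain", False)]
labels = ["no", "yes", "no", "yes"]

# Compare the purity of each candidate division and keep the best
gains = {i: information_gain(rows, labels, i) for i in range(2)}
best = max(gains, key=gains.get)  # here feature 1 (windy) wins: gain 1.0 vs 0.0
```

In this toy set the label is fully determined by `windy`, so splitting on it yields pure subsets (gain 1.0), while `outlook` leaves both subsets mixed (gain 0.0); a decision tree builder would therefore place `windy` at the root.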

HCIA AI

Machine Learning Overview: Covers decision tree algorithms and their use cases.

Deep Learning Overview: While this focuses on neural networks, it touches on how decision-making algorithms are used in structured data models.

asked 26/09/2024
Ange YAO