Question 123 - MLS-C01 discussion

While working on a neural network project, a Machine Learning Specialist discovers that some features in the data have very high magnitude, resulting in this data being weighted more in the cost function. What should the Specialist do to ensure better convergence during backpropagation?

A. Dimensionality reduction

B. Data normalization

C. Model regularization

D. Data augmentation for the minority class
Suggested answer: B

Explanation:

Data normalization is a data preprocessing technique that scales features to a common range, such as [0, 1] or [-1, 1]. This reduces the impact of high-magnitude features on the cost function and improves convergence during backpropagation. Normalization can be performed with different methods, such as min-max scaling, z-score standardization, or unit vector normalization. It is distinct from dimensionality reduction, which reduces the number of features; from model regularization, which adds a penalty term to the cost function to prevent overfitting; and from data augmentation, which increases the amount of data by creating synthetic samples.
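For illustration, here is a minimal sketch of the two normalization methods mentioned above, using scikit-learn. The feature matrix and its values are hypothetical and not part of the original question; they only show how a high-magnitude column is brought onto a comparable scale.

```python
# Minimal sketch (hypothetical data): min-max scaling vs. z-score standardization.
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Hypothetical feature matrix: the second column has a much larger magnitude
# than the first, so it would dominate the cost function if left unscaled.
X = np.array([[1.0, 20000.0],
              [2.0, 50000.0],
              [3.0, 80000.0]])

# Min-max scaling: rescales each feature column to the [0, 1] range.
X_minmax = MinMaxScaler().fit_transform(X)

# Z-score standardization: centers each feature to zero mean and unit variance.
X_zscore = StandardScaler().fit_transform(X)

print(X_minmax)
print(X_zscore)
```

After either transformation, both columns contribute on a comparable scale, which is what helps gradient-based training converge more smoothly during backpropagation.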

References:

Data processing options for AI/ML | AWS Machine Learning Blog

Data preprocessing - Machine Learning Lens

How to Normalize Data Using scikit-learn in Python

Normalization | Machine Learning | Google for Developers
