ExamGecko
Question 135 - MLS-C01 discussion


An interactive online dictionary wants to add a widget that displays words used in similar contexts. A Machine Learning Specialist is asked to provide word features for the downstream nearest neighbor model powering the widget.

What should the Specialist do to meet these requirements?

A. Create one-hot word encoding vectors.
B. Produce a set of synonyms for every word using Amazon Mechanical Turk.
C. Create word embedding vectors that store edit distance with every other word.
D. Download word embeddings pre-trained on a large corpus.
Suggested answer: D

Explanation:

Word embeddings are dense vector representations of words that encode semantic meaning. They are typically pre-trained on a large corpus of text, such as books, news articles, or web pages, and so capture the contexts in which words are used. Because words with similar meanings end up close together in the embedding space, these vectors work well as features for a nearest neighbor model that finds words used in similar contexts. Downloading pre-trained embeddings lets the Specialist start quickly and leverage representations optimized on far more data than the dictionary itself could provide. This yields more accurate and reliable features than the alternatives: one-hot vectors are sparse and carry no notion of similarity, edit distance measures spelling rather than meaning, and crowdsourced synonym lists from Amazon Mechanical Turk are expensive to produce and do not generalize to unseen contexts.
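To make the answer concrete, here is a minimal sketch of how pre-trained embeddings feed a nearest neighbor lookup. The embedding values below are hypothetical toy vectors standing in for real downloaded embeddings (e.g. GloVe or fastText), and the similarity metric is the usual cosine similarity:

```python
import math

# Toy stand-in for pre-trained word embeddings (hypothetical values;
# in practice these would be downloaded vectors such as GloVe or fastText).
embeddings = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.12],
    "apple": [0.10, 0.20, 0.95],
    "pear":  [0.12, 0.18, 0.90],
}

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nearest_neighbor(word):
    # Rank every other word by cosine similarity to `word`
    # and return the closest one -- the core of the widget's model.
    others = (w for w in embeddings if w != word)
    return max(others, key=lambda w: cosine(embeddings[word], embeddings[w]))

print(nearest_neighbor("king"))   # "queen" with these toy vectors
print(nearest_neighbor("apple"))  # "pear"
```

One-hot vectors would make every cosine similarity between distinct words exactly zero, which is why option A cannot power a "similar context" widget; the dense pre-trained vectors of option D are what make the ranking meaningful.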

asked 16/09/2024
Jordan Pfingsten