Question 233 - Professional Machine Learning Engineer discussion

You are developing a custom TensorFlow classification model based on tabular data. Your raw data, which is stored in BigQuery, contains hundreds of millions of rows and includes both categorical and numerical features. You need to apply a MaxMin scaler to some numerical features and one-hot encoding to some categorical features, such as SKU names. Your model will be trained over multiple epochs. You want to minimize the effort and cost of your solution. What should you do?

A.
1. Write a SQL query to create a separate lookup table to scale the numerical features. 2. Deploy a TensorFlow-based model from Hugging Face to BigQuery to encode the text features. 3. Feed the resulting BigQuery view into Vertex AI Training.
B.
1. Use BigQuery to scale the numerical features. 2. Feed the features into Vertex AI Training. 3. Allow TensorFlow to perform the one-hot text encoding.
C.
1. Use TFX components with Dataflow to encode the text features and scale the numerical features. 2. Export the results to Cloud Storage as TFRecords. 3. Feed the data into Vertex AI Training.
D.
1. Write a SQL query to create a separate lookup table to scale the numerical features. 2. Perform the one-hot text encoding in BigQuery. 3. Feed the resulting BigQuery view into Vertex AI Training.
Suggested answer: C

Explanation:

TFX (TensorFlow Extended) is a platform for building end-to-end machine learning pipelines, with components for data ingestion, preprocessing, validation, model training, serving, and monitoring. Dataflow is a fully managed service for scalable data processing. Using TFX components with Dataflow lets you perform feature engineering on large-scale tabular data in a distributed, efficient way: the ExampleGen component reads the raw data from BigQuery, and the Transform component applies the MaxMin scaler to the numerical features and the one-hot encoding to the categorical features. Transform materializes its output as TFRecord files, a binary format for storing TensorFlow data, which you can export to Cloud Storage and feed into Vertex AI Training, Google Cloud's managed service for training custom machine learning models. Because the preprocessed TFRecords are written once, they can be read repeatedly across multiple training epochs without recomputing the transformations, which keeps cost down. A minimal sketch of this approach appears below.
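The two snippets below are an illustrative sketch of this approach, not a complete implementation. All project, dataset, bucket, and column names (my-project, my_dataset, price, quantity, sku_name, label) are hypothetical placeholders. The first snippet is the tf.Transform module file: tft.scale_by_min_max computes each feature's minimum and maximum over the full dataset in a Dataflow analysis pass and rescales it to [0, 1], while tft.compute_and_apply_vocabulary builds a vocabulary over the SKU names and maps each one to an integer id, which the model can expand to a one-hot vector in its input layer.

# preprocessing.py -- tf.Transform module file (column names are placeholders)
import tensorflow_transform as tft

NUMERIC_FEATURES = ['price', 'quantity']  # hypothetical numerical columns
CATEGORICAL_FEATURES = ['sku_name']       # hypothetical categorical column
LABEL_KEY = 'label'                       # hypothetical label column

def preprocessing_fn(inputs):
    outputs = {}
    for key in NUMERIC_FEATURES:
        # Full-pass analysis finds the min/max over the whole dataset,
        # then scales the feature to [0, 1].
        outputs[key + '_scaled'] = tft.scale_by_min_max(inputs[key])
    for key in CATEGORICAL_FEATURES:
        # Builds a vocabulary over the whole dataset and maps each string
        # to an integer id; unseen values fall into an OOV bucket. The id
        # can then be one-hot encoded cheaply inside the model.
        outputs[key + '_id'] = tft.compute_and_apply_vocabulary(
            inputs[key], num_oov_buckets=1)
    outputs[LABEL_KEY] = inputs[LABEL_KEY]
    return outputs

The second snippet wires the components into a TFX pipeline that reads from BigQuery and runs the heavy analyze-and-transform passes on Dataflow; the query, pipeline root, and Beam options are again placeholders you would replace with your own values.

# pipeline.py -- TFX pipeline sketch (all resource names are placeholders)
from tfx import v1 as tfx

QUERY = 'SELECT * FROM `my-project.my_dataset.training_table`'

def create_pipeline():
    # Reads the raw tabular rows directly from BigQuery.
    example_gen = tfx.extensions.google_cloud_big_query.BigQueryExampleGen(
        query=QUERY)
    # Computes statistics and infers a schema for Transform to use.
    statistics_gen = tfx.components.StatisticsGen(
        examples=example_gen.outputs['examples'])
    schema_gen = tfx.components.SchemaGen(
        statistics=statistics_gen.outputs['statistics'])
    # Applies preprocessing_fn and materializes TFRecords that
    # Vertex AI Training can read every epoch without recomputation.
    transform = tfx.components.Transform(
        examples=example_gen.outputs['examples'],
        schema=schema_gen.outputs['schema'],
        module_file='preprocessing.py')
    return tfx.dsl.Pipeline(
        pipeline_name='tabular-preprocessing',
        pipeline_root='gs://my-bucket/pipeline-root',  # placeholder bucket
        components=[example_gen, statistics_gen, schema_gen, transform],
        beam_pipeline_args=[
            '--runner=DataflowRunner',      # run the Beam work on Dataflow
            '--project=my-project',         # placeholder project
            '--region=us-central1',
            '--temp_location=gs://my-bucket/tmp',
        ])

Materializing the transformed TFRecords once is what makes multi-epoch training cheap here: the min-max statistics and the SKU vocabulary are computed in a single distributed pass, and the training job only streams the preprocessed files from Cloud Storage.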

Reference:
TFX | TensorFlow
Dataflow | Google Cloud
Vertex AI Training | Google Cloud
