Question 202 - Professional Machine Learning Engineer discussion
You need to use TensorFlow to train an image classification model. Your dataset is located in a Cloud Storage directory and contains millions of labeled images. Before training the model, you need to prepare the data. You want the data preprocessing and model training workflow to be as efficient, scalable, and low-maintenance as possible. What should you do?
A.
1. Create a Dataflow job that creates sharded TFRecord files in a Cloud Storage directory.
2. Reference tf.data.TFRecordDataset in the training script.
3. Train the model by using Vertex AI Training with a V100 GPU.
B.
1. Create a Dataflow job that moves the images into multiple Cloud Storage directories, where each directory is named according to the corresponding label.
2. Reference tfds.folder_dataset.ImageFolder in the training script.
3. Train the model by using Vertex AI Training with a V100 GPU.
C.
1. Create a Jupyter notebook that uses an n1-standard-64, V100 GPU Vertex AI Workbench instance.
2. Write a Python script that creates sharded TFRecord files in a directory inside the instance.
3. Reference tf.data.TFRecordDataset in the training script.
4. Train the model by using the Workbench instance.
D.
1. Create a Jupyter notebook that uses an n1-standard-64, V100 GPU Vertex AI Workbench instance.
2. Write a Python script that copies the images into multiple Cloud Storage directories, where each directory is named according to the corresponding label.
3. Reference tfds.folder_dataset.ImageFolder in the training script.
4. Train the model by using the Workbench instance.
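
For context on the TFRecord-based options (A and C), here is a minimal sketch of what "reference tf.data.TFRecordDataset in the training script" typically looks like. The bucket path, feature names, image size, and batch size below are illustrative assumptions, not part of the question:

import tensorflow as tf

# Hypothetical pattern for the sharded TFRecord files (assumption).
FILE_PATTERN = "gs://my-bucket/tfrecords/train-*.tfrecord"

# Hypothetical feature schema written by the Dataflow job (assumption).
feature_spec = {
    "image": tf.io.FixedLenFeature([], tf.string),
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse_example(serialized):
    # Decode one serialized tf.train.Example into (image, label).
    parsed = tf.io.parse_single_example(serialized, feature_spec)
    image = tf.io.decode_jpeg(parsed["image"], channels=3)
    image = tf.image.resize(image, [224, 224]) / 255.0
    return image, parsed["label"]

files = tf.data.Dataset.list_files(FILE_PATTERN)
dataset = (
    files.interleave(tf.data.TFRecordDataset,
                     num_parallel_calls=tf.data.AUTOTUNE)
         .map(parse_example, num_parallel_calls=tf.data.AUTOTUNE)
         .shuffle(10_000)
         .batch(128)
         .prefetch(tf.data.AUTOTUNE)
)

Because tf.data.TFRecordDataset can read gs:// paths directly, a pipeline like this streams sharded records from Cloud Storage into training without staging data on the training VM.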
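Options B and D instead rely on the label-per-directory layout that tfds.folder_dataset.ImageFolder expects. A rough sketch, assuming a hypothetical directory organized as images/train/<label>/*.jpg (the path and layout are assumptions for illustration):

import tensorflow_datasets as tfds

# Hypothetical layout (assumption, not from the question):
#   images/train/cat/001.jpg
#   images/train/dog/002.jpg
builder = tfds.folder_dataset.ImageFolder("images/")
ds = builder.as_dataset(split="train", shuffle_files=True,
                        as_supervised=True)

for image, label in ds.take(1):
    print(image.shape, label.numpy())

ImageFolder infers the label set by scanning the directory tree, which is the design trade-off these options hinge on when the dataset contains millions of images.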