Question 303 - Professional Data Engineer discussion

You want to create a machine learning model using BigQuery ML and create an endpoint for hosting the model using Vertex AI. This will enable the processing of continuous streaming data in near real time from multiple vendors. The data may contain invalid values. What should you do?

A.
Create a new BigQuery dataset and use streaming inserts to land the data from multiple vendors. Configure your BigQuery ML model to use the 'ingestion' dataset as the training data.
B.
Use BigQuery streaming inserts to land the data from multiple vendors where your BigQuery ML model is deployed.
C.
Create a Pub/Sub topic and send all vendor data to it. Connect a Cloud Function to the topic to process the data and store it in BigQuery.
D.
Create a Pub/Sub topic and send all vendor data to it. Use Dataflow to process and sanitize the Pub/Sub data and stream it to BigQuery.
Suggested answer: D

Explanation:

Dataflow provides a scalable and flexible way to process and sanitize the incoming streaming data in near real time before loading it into BigQuery, which addresses the invalid vendor values that plain streaming inserts (options A and B) would land unchanged.
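
As a rough sketch of option D, a streaming Apache Beam (Dataflow) pipeline might read from Pub/Sub, drop invalid records, and write the clean rows to BigQuery. The topic, table, schema, field names, and the sanitation rule below are hypothetical placeholders, not values from the question:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resource names for illustration only.
TOPIC = "projects/my-project/topics/vendor-data"
TABLE = "my-project:vendor_dataset.clean_events"
SCHEMA = "vendor_id:STRING,metric:FLOAT64,event_time:TIMESTAMP"


def parse_and_sanitize(message_bytes):
    """Parse a Pub/Sub message and drop records with invalid values."""
    try:
        record = json.loads(message_bytes.decode("utf-8"))
    except (UnicodeDecodeError, json.JSONDecodeError):
        return  # skip unparseable messages
    # Example rule: require a vendor_id and a non-negative numeric metric.
    metric = record.get("metric")
    if record.get("vendor_id") and isinstance(metric, (int, float)) and metric >= 0:
        yield {
            "vendor_id": record["vendor_id"],
            "metric": float(metric),
            "event_time": record.get("event_time"),
        }


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
            | "Sanitize" >> beam.FlatMap(parse_and_sanitize)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema=SCHEMA,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The cleaned table can then serve as training data for a BigQuery ML model, which in turn can be registered and hosted on a Vertex AI endpoint for online prediction.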

Asked 18/09/2024 by Aung Zin