Question 19 - DEA-C01


Which connector creates the RECORD_CONTENT and RECORD_METADATA columns in an existing Snowflake table when connecting to Snowflake?

A. Python Connector
B. Spark Connector
C. Node.js Connector
D. Kafka Connector
Suggested answer: D

Explanation:

Apache Kafka software uses a publish-and-subscribe model to write and read streams of records, similar to a message queue or enterprise messaging system. Kafka allows processes to read and write messages asynchronously. A subscriber does not need to be connected directly to a publisher; a publisher can queue a message in Kafka for the subscriber to receive later.

An application publishes messages to a topic, and an application subscribes to a topic to receive those messages. Kafka can process, as well as transmit, messages; however, that is outside the scope of this document. Topics can be divided into partitions to increase scalability.
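As a minimal sketch of this publish/subscribe flow, assuming a broker at localhost:9092 and a hypothetical topic named "orders" (the confluent-kafka Python client is used here purely for illustration):

# Minimal publish/subscribe sketch; broker address, topic name, and
# group id are illustrative assumptions, not part of the question.
from confluent_kafka import Producer, Consumer

producer = Producer({"bootstrap.servers": "localhost:9092"})
# The publisher queues a message on the "orders" topic; no subscriber
# needs to be connected at this moment.
producer.produce("orders", key="order-1", value='{"item": "widget"}')
producer.flush()  # block until the broker has accepted the message

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-group",          # consumer group for offset tracking
    "auto.offset.reset": "earliest",   # start from the beginning if no offset exists
})
consumer.subscribe(["orders"])
msg = consumer.poll(timeout=10.0)      # receive the queued message asynchronously
if msg is not None and msg.error() is None:
    print(msg.topic(), msg.value())
consumer.close()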

Kafka Connect is a framework for connecting Kafka with external systems, including databases. A Kafka Connect cluster is a separate cluster from the Kafka cluster. The Kafka Connect cluster supports running and scaling out connectors (components that support reading and/or writing between external systems).

The Kafka connector is designed to run in a Kafka Connect cluster to read data from Kafka topics and write the data into Snowflake tables.
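A hedged sketch of how such a connector might be registered with a Kafka Connect cluster through its REST API follows; the endpoint, key material, topic, and database/schema names are assumptions, and the full set of required snowflake.* properties is documented by Snowflake:

# Sketch: register a Snowflake sink connector via the Kafka Connect REST API.
# Endpoint URL, private key, and object names are placeholders.
import json
import requests

connector = {
    "name": "snowflake-sink-demo",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "topics": "orders",
        "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
        "snowflake.user.name": "kafka_connector_user",
        "snowflake.private.key": "<private-key-contents>",
        "snowflake.database.name": "DEMO_DB",
        "snowflake.schema.name": "PUBLIC",
        "tasks.max": "1",
    },
}

resp = requests.post(
    "http://localhost:8083/connectors",   # Kafka Connect REST endpoint
    data=json.dumps(connector),
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
print(resp.json())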

Every Snowflake table loaded by the Kafka connector has a schema consisting of two VARIANT columns:

RECORD_CONTENT. This contains the Kafka message.

RECORD_METADATA. This contains metadata about the message, for example, the topic from which the message was read.
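To illustrate the resulting schema, a query against a connector-loaded table might extract fields from those VARIANT columns as below; the connection parameters and table name are assumptions, and the Snowflake Python connector is used only as a convenient client:

# Sketch: read the two VARIANT columns the Kafka connector maintains.
# Account, credentials, warehouse, and table name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount",
    user="analyst",
    password="***",
    warehouse="DEMO_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # RECORD_METADATA is VARIANT; ':' navigates into it and '::' casts.
    cur.execute("""
        SELECT RECORD_METADATA:topic::STRING     AS source_topic,
               RECORD_METADATA:partition::NUMBER AS source_partition,
               RECORD_CONTENT                    AS message_body
        FROM ORDERS_TABLE
        LIMIT 10
    """)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()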
