Question 33 - Certified Heroku Architecture Designer discussion


Universal Containers (UC) uses Apache Kafka on Heroku to stream shipment inventory data in real time throughout the world. A Kafka topic is used to send messages with updates to shipping containers' GPS coordinates while they are in transit. UC is using a Heroku Kafka basic-0 plan. The topic was provisioned with 8 partitions, 1 week of retention, and no compaction. The keys for the events are being assigned by Heroku Kafka, which means that they will be randomly distributed among the partitions.

UC has a single-dyno consumer application that persists the data to their Enterprise Data Warehouse (EDW). Recently, they've been noticing data loss in the EDW.

What should an Architect with Kafka experience recommend?

A.
Enable compaction on the topic, which will drop older messages with the same key.
B.
Upgrade to a larger Apache Kafka on Heroku plan, which has greater data capacity.
C.
Use Heroku Redis to store message receipt information to account for "at-least-once" delivery, which will guarantee that messages are never processed more than once. Scale up the consumer dynos to match the number of partitions so that there is one process for each partition.
Suggested answer: C
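
For illustration, here is a minimal sketch of the approach in option C, assuming a Python consumer built with the kafka-python and redis-py clients. The topic name "shipment-gps", the group ID "edw-loader", and the persist_to_edw function are placeholders, and Heroku Kafka's SSL certificate configuration (KAFKA_TRUSTED_CERT and related config vars) is omitted for brevity. Each dyno runs one copy of this process; with 8 dynos in the same consumer group, Kafka assigns each process one of the topic's 8 partitions.

```python
import os

import redis
from kafka import KafkaConsumer

# Hypothetical warehouse writer -- stands in for the real EDW client.
def persist_to_edw(payload: bytes) -> None:
    ...

# Heroku Redis exposes REDIS_URL; used here as the message-receipt store.
receipts = redis.from_url(os.environ["REDIS_URL"])

# Heroku Kafka's KAFKA_URL lists brokers as kafka+ssl://host:port pairs;
# strip the scheme for the client (SSL setup itself is omitted here).
brokers = [u.split("://")[-1] for u in os.environ["KAFKA_URL"].split(",")]

# Every dyno runs one consumer in the same group, so with 8 dynos
# Kafka assigns each process one of the topic's 8 partitions.
consumer = KafkaConsumer(
    "shipment-gps",            # placeholder topic name
    bootstrap_servers=brokers,
    group_id="edw-loader",     # placeholder group ID
    enable_auto_commit=False,  # commit only after processing the message
)

for msg in consumer:
    # Message keys are randomly assigned, so deduplicate on the unique
    # topic/partition/offset coordinate instead of the key.
    receipt_key = f"{msg.topic}:{msg.partition}:{msg.offset}"

    # SET ... NX succeeds only if the key is new; a falsy result means
    # this message was already seen (an at-least-once redelivery).
    # The receipt expires with the topic's one-week retention window.
    if receipts.set(receipt_key, 1, nx=True, ex=7 * 24 * 3600):
        persist_to_edw(msg.value)

    # Committing after processing means a crash causes redelivery
    # rather than silent data loss.
    consumer.commit()
```

Note that the Redis receipt and the EDW write are not atomic: a crash between the two can still drop or duplicate a single message, so truly exactly-once delivery would need additional care (for example, a transactional write to the warehouse). The sketch only illustrates the receipt-store pattern the option describes.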
Asked 23/09/2024 by Koen Poos