Question 351 - Professional Data Engineer discussion


You recently deployed several data processing jobs into your Cloud Composer 2 environment. You notice that some tasks are failing in Apache Airflow. On the monitoring dashboard, you see an increase in the total workers' memory usage, and there were worker pod evictions. You need to resolve these errors. What should you do?

Choose 2 answers

A. Increase the directed acyclic graph (DAG) file parsing interval.

B. Increase the memory available to the Airflow workers.

C. Increase the maximum number of workers and reduce worker concurrency.

D. Increase the memory available to the Airflow triggerer.

E. Increase the Cloud Composer 2 environment size from medium to large.
Suggested answer: B, C

Explanation:

To resolve issues related to increased memory usage and worker pod evictions in your Cloud Composer 2 environment, the following steps are recommended:

Increase Memory Available to Airflow Workers:

Increasing the memory allocated to each Airflow worker lets memory-intensive tasks complete without exceeding the pod's memory limit, which is what triggers worker pod evictions.

Increase Maximum Number of Workers and Reduce Worker Concurrency:

Raising the maximum number of workers distributes the workload across more pods, so no single pod becomes overloaded. Reducing worker concurrency limits how many tasks each worker runs simultaneously, lowering the peak memory consumption per worker.

Steps to Implement:

Increase Worker Memory:

Modify the configuration settings in Cloud Composer to allocate more memory to Airflow workers. This can be done through the environment configuration settings.

Adjust Worker and Concurrency Settings:

Increase the maximum number of workers in the Cloud Composer environment settings.

Reduce the concurrency setting for Airflow workers to ensure that each worker handles fewer tasks at a time, thus consuming less memory per worker.
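The steps above can be sketched with the gcloud CLI. This is a minimal illustration, not a prescribed configuration: the environment name, location, and all numeric values (memory size, worker counts, concurrency) are placeholder assumptions you should tune to your own workload.

```shell
# Illustrative values only -- adjust to your environment and workload.
# Increase per-worker memory and raise the worker autoscaling ceiling.
gcloud composer environments update example-environment \
    --location us-central1 \
    --worker-memory 8GB \
    --min-workers 2 \
    --max-workers 6

# Reduce worker concurrency via an Airflow config override
# (gcloud uses the "section-property" key format).
gcloud composer environments update example-environment \
    --location us-central1 \
    --update-airflow-configs celery-worker_concurrency=8
```

Each update triggers a rolling rebuild of the environment's workers, so apply the changes during a low-traffic window if possible.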

References: Cloud Composer Worker Configuration; Scaling Airflow Workers.

asked 18/09/2024
Bogdan Paun