Send task logs from KubeExecutor tasks to pod stdout too #47731
Merged: jedcunningham merged 1 commit into apache:main on Mar 13, 2025
Conversation
dstandish reviewed on Mar 13, 2025
dstandish reviewed on Mar 13, 2025
dstandish reviewed on Mar 13, 2025
Force-pushed from 85749d8 to 5cddc04
This is needed so that when the KubeExecutor is asked to get logs for a running pod they show up in the pod. This changes the `airflow.sdk.execution_time.execute_workload` entrypoint to a) produce all logs as JSON, and b) ask the SDK to send all output log messages to the top-level logger too, so they appear on stdout. In order to make the output a bit nicer this also tidies up/removes some of the logging from dispose_orm so that it doesn't "pollute" the logs (this was required because, due to the current hack we use to upload remote logs, we ended up calling dispose_orm at the end and that _wasn't_ JSON formatted). Closes apache#46894
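The JSON-to-stdout behaviour described above can be illustrated with a minimal, self-contained sketch using plain stdlib `logging` (this is an assumption-laden stand-in, not the actual Airflow SDK/structlog configuration):

```python
import json
import logging
import sys


class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line (illustrative only)."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "event": record.getMessage(),
        })


# Route everything through the root logger and onto stdout, where the
# container runtime (and `kubectl logs`) can pick the lines up.
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
logging.basicConfig(level=logging.INFO, handlers=[handler])

# A child logger propagates to root, so its messages land on stdout as JSON.
logging.getLogger("airflow.task").info("task started")
```

The point is simply that sending task-level log messages up to the top-level logger is what makes them visible to `kubectl logs` on the pod.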
Force-pushed from 5cddc04 to 9950707
jedcunningham approved these changes on Mar 13, 2025
nailo2c pushed a commit to nailo2c/airflow that referenced this pull request on Apr 4, 2025
taranlu-houzz added a commit to taranlu-houzz/airflow that referenced this pull request on Mar 20, 2026
LocalExecutor's `_execute_work()` calls `supervise()` without passing `subprocess_logs_to_stdout=True`, so task logs are only written to log files and never reach the container's stdout. This breaks log collection in Kubernetes deployments that rely on container stdout for aggregation (e.g., Fluentd, Coralogix, Datadog). The containerized executor (`execute_workload.py`) already passes this flag (added in apache#47731), but LocalExecutor was not updated. Closes apache#54501
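As a toy illustration of what the missing flag changes (this stand-in is hypothetical; the real `supervise()` signature and behaviour live in the task SDK):

```python
import sys
from pathlib import Path


def supervise_logs(lines, log_file: Path, logs_to_stdout: bool = False) -> None:
    """Toy stand-in for the supervisor's log handling (not the real API):
    always append task log lines to the log file; when logs_to_stdout is
    True, also mirror them to stdout so container-level log collectors
    (Fluentd, Datadog, etc.) can see them."""
    with log_file.open("a") as fh:
        for line in lines:
            fh.write(line + "\n")
            if logs_to_stdout:
                sys.stdout.write(line + "\n")
```

With `logs_to_stdout=False` (the LocalExecutor behaviour described above), the lines exist only in the log file and never reach the container's stdout stream.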
eladkal pushed a commit to taranlu-houzz/airflow that referenced this pull request on Mar 22, 2026