Apache Airflow version
3.0.0+astro.2
If "Other Airflow 2 version" selected, which one?
No response
What happened?
Relates to #51062 and #48554, but I decided to open a new issue because neither of those open issues specifically relates to dag.test().
Received this error:
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/context.py", line 150, in _get_connection
from airflow.sdk.execution_time.task_runner import SUPERVISOR_COMMS
ImportError: cannot import name 'SUPERVISOR_COMMS' from 'airflow.sdk.execution_time.task_runner' (/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py)
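For reference, the failure does not appear to be specific to the operator. Here is a minimal sketch (an assumption on my part: Connection.get routes through SUPERVISOR_COMMS whenever it runs outside a supervised task runner) that presumably triggers the same ImportError directly:

from airflow.sdk.definitions.connection import Connection

# Presumably raises the same error when run outside a supervised task runner:
#   ImportError: cannot import name 'SUPERVISOR_COMMS' ...
Connection.get("s3_internal")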
What you think should happen instead?
I would expect the task to either succeed if the connection exists, or fail with "connection s3_internal does not exist" if it does not.
How to reproduce
Below is a minimal example that reproduces the error. Invoke it as a test, i.e. run it from the CLI with python path/to/file.py. The error does not show up when the DAG runs in the UI, for example. I believe that when a DAG is not invoked "normally" (the related issues invoke DAGs from the CLI or from a separate virtual environment), the secrets backend is not configured properly. Given the example below, I believe dag.test() is more or less unusable, because most DAGs use connections and/or variables.
from airflow.models.dag import DAG
from airflow.providers.standard.operators.python import PythonOperator
from airflow.sdk.definitions.connection import Connection

MSSQL_CONN_ID = "s3_internal"

with DAG(
    dag_id="test_dag",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="foo",
        python_callable=Connection.get,
        op_args=[MSSQL_CONN_ID],
    )

if __name__ == "__main__":
    dag.test()
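As a stopgap, one workaround sketch (untested; it assumes the standard AIRFLOW_CONN_<CONN_ID> environment-variable lookup is consulted before the supervisor channel in dag.test(), which the traceback does not confirm) would be to define the connection via an environment variable before the test run:

import os

# Untested workaround sketch: placeholder URI, substitute real credentials.
# Assumes dag.test() consults AIRFLOW_CONN_* before SUPERVISOR_COMMS.
os.environ["AIRFLOW_CONN_S3_INTERNAL"] = "aws://access-key:secret-key@"

if __name__ == "__main__":
    dag.test()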
Operating System
Linux
Versions of Apache Airflow Providers
apache-airflow-providers-amazon==9.6.1
apache-airflow-providers-celery==3.10.6
apache-airflow-providers-common-compat==1.6.0
apache-airflow-providers-common-io==1.5.3
apache-airflow-providers-common-sql==1.25.0
apache-airflow-providers-elasticsearch==6.2.1
apache-airflow-providers-http==5.3.0
apache-airflow-providers-microsoft-azure==12.3.1
apache-airflow-providers-microsoft-mssql==4.2.2
apache-airflow-providers-mysql==6.2.1
apache-airflow-providers-odbc==4.9.2
apache-airflow-providers-openlineage==2.2.0
apache-airflow-providers-postgres==6.1.2
apache-airflow-providers-samba==4.9.2
apache-airflow-providers-sftp==5.2.1
apache-airflow-providers-slack==9.0.5
apache-airflow-providers-smtp==2.0.2
apache-airflow-providers-snowflake==6.2.2
apache-airflow-providers-ssh==4.1.0
apache-airflow-providers-standard==1.0.0
Deployment
Other Docker-based deployment
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
Code of Conduct