Hi everyone,

I’m running Apache Airflow 2.9.3 with the CeleryKubernetesExecutor, deployed on Kubernetes.
Current setup
• Airflow is deployed in Kubernetes
• Application bootstrap secrets (DB, broker, etc.) are stored in HashiCorp Vault
• Those bootstrap secrets are injected using ExternalSecrets
• Vault is reachable via HTTPS
Now I’m trying to solve the following task:

I want to store Airflow Connections (visible in the Airflow UI → Connections) in Vault, instead of defining them directly in Airflow.

I’m following the official documentation:
https://airflow.apache.org/docs/apache-airflow-providers-hashicorp/2.2.0/secrets-backends/hashicorp-vault.html

Airflow Vault Secrets Backend configuration:

ENV:
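Roughly, the backend is configured through environment variables like the following (the URL, mount point and AppRole credentials below are placeholders, not my real values):

```
# Illustrative values only
AIRFLOW__SECRETS__BACKEND=airflow.providers.hashicorp.secrets.vault.VaultBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_path": "connections", "mount_point": "airflow", "kv_engine_version": 2, "url": "https://vault.example.internal:8200", "auth_type": "approle", "role_id": "<role-id>", "secret_id": "<secret-id>"}
```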
DAG:

```python
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.hooks.base import BaseHook
from datetime import datetime


def test_conn():
    # Resolve the "psql" connection; with the Vault secrets backend enabled,
    # this lookup should go to Vault before falling back to the metadata DB.
    conn = BaseHook.get_connection("psql")
    print(conn.get_uri())


with DAG(
    dag_id="vault_test_conn",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    PythonOperator(
        task_id="test",
        python_callable=test_conn,
    )
```
AppRole issues (tokens + SSL errors)
Using AppRole creates many Vault tokens — one per Airflow component and per task execution.
I tried tuning token TTL, which helps a bit, but not enough.
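For reference, this is roughly how I tuned the role’s token TTLs on the Vault side (role name and values are illustrative, not my exact settings):

```
vault write auth/approle/role/airflow \
    token_ttl=15m \
    token_max_ttl=1h \
    token_num_uses=0 \
    secret_id_ttl=0
```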
From my investigation, AppRole + Airflow + CeleryKubernetesExecutor looks like an unstable combination, especially because every task runs in its own Kubernetes pod and performs a fresh AppRole login.
If someone is successfully using AppRole in a similar setup, I’d really appreciate seeing how you solved this (token reuse, TTLs, architecture, etc.).
More importantly, when tasks run, I consistently get SSL errors during AppRole login, for example:
```
[2026-02-09 12:19:50,972: WARNING/ForkPoolWorker-1] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get issuer certificate (_ssl.c:1007)'))': /v1/auth/approle/login
[2026-02-09 12:19:51,185: WARNING/ForkPoolWorker-1] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get issuer certificate (_ssl.c:1007)'))': /v1/auth/approle/login
[2026-02-09 12:19:51,599: WARNING/ForkPoolWorker-1] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get issuer certificate (_ssl.c:1007)'))': /v1/auth/approle/login
[2026-02-09 12:19:51,629: ERROR/ForkPoolWorker-1] Unable to retrieve connection from secrets backend (VaultBackend). Checking subsequent secrets backend.
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 468, in _make_request
    self._validate_conn(conn)
  File "/home/airflow/.local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1097, in _validate_conn
    conn.connect()
  File "/home/airflow/.local/lib/python3.10/site-packages/urllib3/connection.py", line 642, in connect
    sock_and_verified = _ssl_wrap_socket_and_match_hostname(
  File "/home/airflow/.local/lib/python3.10/site-packages/urllib3/connection.py", line 783, in _ssl_wrap_socket_and_match_hostname
    ssl_sock = ssl_wrap_socket(
  File "/home/airflow/.local/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 471, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
  File "/home/airflow/.local/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 515, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "/usr/local/lib/python3.10/ssl.py", line 513, in wrap_socket
    return self.sslsocket_class._create(
  File "/usr/local/lib/python3.10/ssl.py", line 1104, in _create
    self.do_handshake()
  File "/usr/local/lib/python3.10/ssl.py", line 1375, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get issuer certificate (_ssl.c:1007)
```
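My working assumption is that the task pods don’t have the internal CA that issued Vault’s certificate in their trust store. If that’s right, something along these lines is what I’d expect to need in the task pods (path is illustrative):

```
# Hypothetical: internal CA bundle mounted into the task pod, and the Python
# HTTP stack (requests, which hvac uses) pointed at it.
REQUESTS_CA_BUNDLE=/etc/ssl/certs/internal-ca-bundle.pem
```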
Using auth_type=token is not straightforward
If I switch to auth_type=token, the token expires due to TTL.
That means:
• The token must be renewed continuously
• Otherwise Airflow breaks once TTL expires
This can be solved with Vault Agent, but I’m currently not using Vault Agent in my setup.
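For completeness, my understanding is that the token-based backend configuration would look roughly like this (values are placeholders; the provider also seems to accept token_path, i.e. a file that something like Vault Agent would keep refreshed, but I haven’t verified that end to end):

```
AIRFLOW__SECRETS__BACKEND=airflow.providers.hashicorp.secrets.vault.VaultBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_path": "connections", "mount_point": "airflow", "url": "https://vault.example.internal:8200", "auth_type": "token", "token_path": "/vault/secrets/token"}
```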
Can Vault Injector be used to inject connections into Airflow?
I do have Vault Injector enabled in my Helm chart.
Question:
Can Vault Injector be used to fetch secrets from Vault and write them into application config files, so that Airflow can read them and use them as Connections?
In other words:
• Vault Injector → writes secrets to files
• Airflow → reads those files and applies connections dynamically (a rough sketch of what I imagine is below)
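The only built-in way I know of for Airflow to read connections from a file is the local filesystem secrets backend, so I imagine the wiring would be something like this (file path, connection name and URI are hypothetical; I haven’t tried it):

```
# Vault Agent Injector renders the secret to a file in the pod, e.g.
#   /vault/secrets/connections.json  containing  {"psql": "postgresql://user:pass@db.internal:5432/mydb"}
# and Airflow reads it through the local filesystem secrets backend:
AIRFLOW__SECRETS__BACKEND=airflow.secrets.local_filesystem.LocalFilesystemBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_file_path": "/vault/secrets/connections.json"}
```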
Is this possible with Airflow, or does Airflow strictly require direct Vault API access via Secrets Backend?
If AppRole is the only authentication method available, how can it be implemented correctly with Apache Airflow?