I have an AWS EKS cluster on which I am running Airflow, installed via Helm. I'm using the apache/airflow:2.9.3 container and Kubernetes 1.30. I'm trying to get started with the EksPodOperator using this very simple DAG:
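It's essentially the following (simplified here, with placeholder names for the DAG, cluster, namespace, and image):

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.eks import EksPodOperator

with DAG(
    dag_id="eks_pod_test",               # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    EksPodOperator(
        task_id="run_clusterv2",
        cluster_name="clusterv2",        # placeholder cluster name
        namespace="airflow",             # placeholder namespace
        pod_name="run-clusterv2",
        image="amazon/aws-cli:latest",   # placeholder image
        cmds=["sh", "-c", "echo hello"],
        in_cluster=True,
        get_logs=True,
    )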
I want the operator to create the pod in the same cluster (hence in_cluster=True); the task itself doesn't do anything useful yet because I haven't been able to get past this step. When I manually trigger the DAG and watch the logs of the worker pod, it always runs into this:
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/taskinstance.py", line 465, in _execute_task
    result = _execute_callable(context=context, **execute_callable_kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/taskinstance.py", line 432, in _execute_callable
    return execute_callable(context=context, **execute_callable_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/baseoperator.py", line 401, in wrapper
    return func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/amazon/aws/operators/eks.py", line 1103, in execute
    return super().execute(context)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/baseoperator.py", line 401, in wrapper
    return func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/cncf/kubernetes/operators/pod.py", line 591, in execute
    return self.execute_sync(context)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/cncf/kubernetes/operators/pod.py", line 599, in execute_sync
    self.pod_request_obj = self.build_pod_request_obj(context)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/cncf/kubernetes/operators/pod.py", line 1130, in build_pod_request_obj
    "airflow_kpo_in_cluster": str(self.hook.is_in_cluster),
                                  ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/cncf/kubernetes/hooks/kubernetes.py", line 283, in is_in_cluster
    self.api_client # so we can determine if we are in_cluster or not
    ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/functools.py", line 995, in __get__
    val = self.func(instance)
          ^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/cncf/kubernetes/hooks/kubernetes.py", line 291, in api_client
    return self.get_conn()
           ^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/cncf/kubernetes/hooks/kubernetes.py", line 212, in get_conn
    raise AirflowException(
airflow.exceptions.AirflowException: Invalid connection configuration. Options kube_config_path, kube_config, in_cluster are mutually exclusive. You can only use one option at a time.
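Digging into the traceback, eks.py line 1103 is EksPodOperator.execute, which, as far as I can tell from the provider source, first writes a temporary kubeconfig for the target EKS cluster and hands it to the underlying KubernetesPodOperator as config_file before delegating. Paraphrased (not the exact provider code, which varies by version):

# Paraphrase of EksPodOperator.execute from the amazon provider
# (not verbatim; exact code varies by provider version).
def execute(self, context):
    eks_hook = EksHook(aws_conn_id=self.aws_conn_id, region_name=self.region)
    with eks_hook.generate_config_file(
        eks_cluster_name=self.cluster_name, pod_namespace=self.namespace
    ) as self.config_file:
        # config_file and in_cluster both reach the KubernetesHook, which
        # treats kube_config_path, kube_config, and in_cluster as mutually
        # exclusive configuration sources.
        return super().execute(context)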
I'm confused because I obviously have not provided kube_config_path or kube_config myself; if the operator is generating a config file internally as above, that would at least explain where the conflicting option comes from. If I omit in_cluster entirely, it does attempt to launch the pod, but I end up getting a different error:
{
"kind": "Status",
"apiVersion": "v1",
"metadata": {},
"status": "Failure",
"message": "pods \"run-clusterv2-ohba5ny7\" is forbidden: pod does not have \"kubernetes.io/config.mirror\" annotation, node <node> can only create mirror pods",
"reason": "Forbidden",
"details": { "name": "run-clusterv2-ohba5ny7", "kind": "pods" },
"code": 403
}
This error doesn't make sense to me either, since the workers should be bound to the airflow-pod-launcher-role created by the Helm chart, which, as the name suggests, grants permission to launch pods.
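One way to sanity-check the RBAC side from inside a worker pod is a SelfSubjectAccessReview via the kubernetes Python client (a sketch; "airflow" is a placeholder for the namespace the workers run in):

# In-cluster RBAC check, meant to be run from inside an Airflow worker pod.
# Assumes the worker's service account token is mounted; "airflow" is a
# placeholder namespace.
from kubernetes import client, config

config.load_incluster_config()

review = client.AuthorizationV1Api().create_self_subject_access_review(
    body=client.V1SelfSubjectAccessReview(
        spec=client.V1SelfSubjectAccessReviewSpec(
            resource_attributes=client.V1ResourceAttributes(
                namespace="airflow", verb="create", resource="pods"
            )
        )
    )
)
print("can create pods:", review.status.allowed)

That said, the 403 above is attributed to a node identity ("node <node> can only create mirror pods"), which suggests the API server authenticated the request as the node itself rather than as the worker's service account, so the role binding may never come into play at all.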