feat: Add webHDFS support. Fixes #7540 #8443
Closed
Signed-off-by: Alexander Dittmann alexander.dittmann@sap.com
Fixes #7540
This PR adds support for webHDFS input/output artifacts, supporting two ways of authentication: via a `client_id`, a `client_secret`, and a `tokenURL`. An example yaml can be found here: https://github.com/argoproj/argo-workflows/compare/master...alexdittmann:feat-add-webhdfs-support?expand=1#diff-36d89ba256bdc0e7c8728a206b718b821dd7e38db2043abb699290d1204be7bb
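For reference, a minimal sketch of what such a workflow might look like is below. The field names (`oauth2`, the `*Secret` references, and the webHDFS URL) are assumptions based on this description, not the final merged schema; the linked example yaml is authoritative:

```yaml
# Hypothetical sketch only — field names are assumptions, not the merged spec.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: webhdfs-input-
spec:
  entrypoint: main
  templates:
    - name: main
      inputs:
        artifacts:
          - name: my-art
            path: /my-artifact
            http:
              # webHDFS is addressed through its REST API (assumed endpoint)
              url: "https://mywebhdfsprovider.example/webhdfs/v1/file.txt?op=OPEN"
              auth:
                oauth2:
                  # client_id, client_secret and tokenURL pulled from a k8s Secret
                  clientIDSecret:
                    name: oauth-secret
                    key: clientID
                  clientSecretSecret:
                    name: oauth-secret
                    key: clientSecret
                  tokenURLSecret:
                    name: oauth-secret
                    key: tokenURL
      container:
        image: debian:latest
        command: [sh, -c]
        args: ["cat /my-artifact"]
```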
Note:
I initially created a (private) fork of this repo, where I successfully tested these changes against both an Azure and an SAP HANA Data Lake store. When creating this official fork from the latest `master`, however, submitting new workflows always resulted in a `StartError` of the workflow pod. Here is the events log:
I'm using k3d, by the way. Can this error somehow be mitigated? (I have already run `make argo-exec`.)
Any feedback is also appreciated!
Thanks
Alex