-
I'm pretty sure I have done this a while back. If you set up credentials correctly, there shouldn't be anything stopping you. I'll try to dig up my old notes. The availability of conda-store environments is of course the biggest usability issue. There is a way to make conda environments available via the Argo workflow controller (https://www.nebari.dev/docs/how-tos/setup-argo/), but how to use it is not well documented right now. @kcpevey was doing some experiments with Hera and was struggling with activating environments, etc.
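For anyone hitting the same wall, here is a rough sketch of the kind of Hera template involved (not @kcpevey's actual code; the image, namespace, environment name, and script path are placeholder assumptions, and the conda environment is activated by wrapping the command in `conda run`):

```python
# Sketch only (assumes Hera v5+): image, namespace, env name, and script path
# are placeholders, not values from this thread.
from hera.workflows import Container, Workflow

with Workflow(
    generate_name="conda-env-test-",
    entrypoint="run-script",
    namespace="dev",
) as w:
    # Activate the conda-store environment for the duration of the command
    # by wrapping the script in `conda run`.
    Container(
        name="run-script",
        image="<image-with-conda-store-envs-mounted>",
        command=["bash", "-c"],
        args=["conda run -n my-env python /shared/my_script.py"],
    )

# Inspect the generated manifest; it can then be submitted via the Argo UI
# or a configured WorkflowsService.
print(w.to_yaml())
```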
-
The outcome of my struggles with Argo/Hera is summarized here in the docs. I ended up creating some helper functions (included) that help you construct a single-line command to execute a Python script in a conda environment via the Argo scheduler.
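The actual helper functions are in the linked docs; purely as an illustration of the idea, a hypothetical command builder like the one below (names and paths are made up, and it assumes the environment is activated via `conda run`) produces the kind of single-line command mentioned above:

```python
import shlex


def build_conda_command(env_name: str, script_path: str, *script_args: str) -> str:
    """Build a single shell command that runs a Python script inside a named
    conda environment. Hypothetical helper for illustration; the real ones
    live in the linked docs."""
    args = " ".join(shlex.quote(a) for a in script_args)
    base = f"conda run -n {shlex.quote(env_name)} python {shlex.quote(script_path)}"
    return f"{base} {args}".strip()


# Example: pass the result as the container args of an Argo/Hera template.
cmd = build_conda_command("analysis-env", "/shared/users/me/train.py", "--epochs", "10")
print(cmd)  # conda run -n analysis-env python /shared/users/me/train.py --epochs 10
```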
-
I tried out using Argo from the Argo UI, and it worked! Awesome!
Now I'm wondering whether it is possible to launch an Argo workflow on the Nebari Kubernetes cluster from a remote machine (e.g. my laptop) via CLI commands and still be able to use existing conda environments via the Nebari Workflow Controller?
Aside: Although I couldn't figure out how to run Argo workflows on the Nebari k8s cluster from my laptop, I did manage to launch jobs there using SkyPilot. I couldn't take advantage of any of the Nebari goodies like conda-store environments, of course.
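Not a definitive answer to the remote-submission question, but something along these lines may work with Hera, provided the Nebari Argo server is reachable and you can pass it a valid token. A minimal sketch, assuming Hera v5+, that the server is exposed at https://<nebari-domain>/argo, and that a token can be copied from the Argo UI (all assumptions, not verified against a Nebari deployment; whether the Nebari Workflow Controller will mount conda-store environments for externally submitted workflows is exactly the open question):

```python
# Sketch of submitting a workflow from a laptop. Host, token, and namespace
# are assumptions; replace them with values for your deployment.
from hera.shared import global_config
from hera.workflows import Container, Workflow

global_config.host = "https://<nebari-domain>/argo"
global_config.token = "<argo-token-copied-from-the-argo-ui>"
global_config.namespace = "dev"

with Workflow(generate_name="remote-submit-", entrypoint="hello") as w:
    Container(
        name="hello",
        image="python:3.11",
        command=["python", "-c"],
        args=["print('submitted from my laptop')"],
    )

w.create()  # submits to the remote Argo server over HTTPS
```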