Alternative method of submitting jobs to DF Runner #756
Running a Dataflow job via the `axlearn gcp vm start` command is neither necessary nor intuitive for non-Apple users. Additionally, if you are already running your commands from a VM (e.g. from a remote desktop), this process does not work. I recognize that there are still scenarios where you would want to launch Dataflow jobs from a VM, so rather than replacing that ability, I am adding an alternative.
In order to submit jobs to the Dataflow runner without using `axlearn gcp vm start`, the quoting behavior of dataflow.py has to change. Unfortunately, there is no obviously elegant way to provide two versions of dataflow.py, so if you can think of a better option, please let me know.
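The quoting problem can be sketched in isolation. The snippet below is a general illustration of shell-quoting layers, not the actual dataflow.py code: an argument quoted for one shell parse is consumed by that parse, so a launcher (such as a VM wrapper) that relays the command through an extra shell needs the argument quoted one more time to survive intact.

```python
import shlex

# A flag whose value needs shell quoting because it contains ';'
# (taken from the example in this PR description).
flag = ("--dataflow_service_options="
        "'worker_accelerator=type:nvidia-tesla-t4;count:1;install-nvidia-driver'")

# One shell parse strips the inner quotes and yields a single argument:
print(shlex.split(flag))

# If a launcher relays the command through a second shell, the string
# must be re-quoted so the extra parse does not split it at the ';':
requoted = shlex.quote(flag)
print(shlex.split(requoted))  # round-trips back to the original string
```

This is why a command that works when typed directly into a terminal can break when wrapped by a launcher: each additional shell layer consumes one layer of quoting.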
What I've done is this: dataflow.py remains as it was, and I've added dataflow.alt.py. In the directions, I've added instructions to replace the original module if the user wants to submit jobs to the Dataflow runner without needing to create a VM.
Additional note: PR #711 changes the quoting behavior so that Dataflow jobs can be submitted without `axlearn gcp vm start`; however, that fix will not work for commands with parameters that require quotes, e.g. `--dataflow_service_options='worker_accelerator=type:nvidia-tesla-t4;count:1;install-nvidia-driver'`. The dataflow.alt.py module DOES work with these kinds of parameters.
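One general way to sidestep this class of bug entirely (a sketch of the underlying technique, not necessarily what dataflow.alt.py does) is to execute the subprocess with an argument list rather than a shell string: when no shell parses the command, values containing `;` or quotes need no escaping at all.

```python
import subprocess
import sys

# Sketch: passing argv as a list bypasses the shell, so the
# semicolon-laden value arrives as a single argument unmodified.
flag = ("--dataflow_service_options="
        "worker_accelerator=type:nvidia-tesla-t4;count:1;install-nvidia-driver")
args = [
    sys.executable, "-c",
    "import sys; print(sys.argv[1])",  # child just echoes the argument back
    flag,
]
out = subprocess.run(args, capture_output=True, text=True, check=True)
print(out.stdout.strip())  # the flag, byte-for-byte intact
```

The trade-off is that list-style invocation gives up shell features (globbing, pipelines, `&&` chaining), which is presumably why the launcher path uses a shell string and needs careful quoting in the first place.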