If you want to submit parallel jobs with condor_submit:
condor.sub
universe = docker
docker_image = {{docker_url}}
executable = {{bash_filename}}
should_transfer_files = YES
transfer_input_files = {{bash_filename}}
when_to_transfer_output = ON_EXIT
output = log/$(cluster).$(process).out
error = log/$(cluster).$(process).err
log = log/$(cluster).$(process).log
request_cpus = 1
request_memory = {{MEM}}
request_disk = 500MB
max_materialize = 20
arguments = "$(state)"
queue state from my.args
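For reference, `my.args` holds one value per line, and `queue state from my.args` queues one process per line, substituting the line into `$(state)`. A minimal sketch (the state values below are made-up placeholders):

```shell
# Generate a hypothetical my.args file: one line per parallel job.
# Each line becomes $(state) for one queued process.
printf '%s\n' IC86_2011 IC86_2012 IC86_2013 > my.args

# Then submit once; HTCondor queues one process per line:
#   condor_submit condor.sub
```

With three lines in `my.args`, a single condor_submit produces three processes ($(process) = 0, 1, 2) in one cluster.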
If you need to submit parallel jobs via condor_submit_dag:

Per the HTCondor documentation, condor_submit_dag will not respect max_materialize in the sub file, nor multiple lines of arguments. See DAGMAN_USE_DIRECT_SUBMIT in https://htcondor.readthedocs.io/en/latest/admin-manual/configuration-macros.html: "But this method will ignore some submit file features such as max_materialize and more than one QUEUE statement."

So either change DAGMAN_USE_DIRECT_SUBMIT, or move the parallel jobs into condor.dag (ref https://github.com/WIPACrepo/simprod_condor_dag/blob/master/dag.sub ):
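For the first option, a minimal sketch of a per-DAG configuration file (the filename dagman.config is illustrative; pass it with condor_submit_dag -config):

```
# dagman.config -- per-DAG configuration (filename is an assumption)
# Disable direct submission so DAGMan shells out to condor_submit,
# which honors max_materialize and multiple queue statements.
DAGMAN_USE_DIRECT_SUBMIT = False
```

Submit with something like `condor_submit_dag -config dagman.config condor.dag`.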
condor.sub
arguments = "$(MYARGS)"
condor.dag
JOB gen1 condor.sub
VARS gen1 MYARGS="{{myarguments}}" MEM="12GB"
JOB genAAA condorAAA.sub
VARS genAAA MYARGS="{{myargumentsAAA}}" MEM="1GB"
PARENT gen1 CHILD genAAA
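For the MEM value set in VARS to take effect, the submit file must actually reference it. A sketch of the relevant condor.sub fragment (to be combined with the full submit settings shown earlier):

```
# condor.sub fragment: consume the DAG-provided variables
arguments      = "$(MYARGS)"
request_memory = $(MEM)
queue
```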
Note: the condor.dag above does not work if condor.sub still queues multiple jobs (e.g. `queue state from my.args`) — each DAG node may have only "one queue"!
https://www-auth.cs.wisc.edu/lists/htcondor-users/2005-September/msg00209.shtml
The way to simulate multiple queue statements is to use VARS in the dag file:
https://web.ma.utexas.edu/condor/manual/2_10DAGMan_Applications.html
See the condor.dag in the original post. An alternative is to also specify MAXJOBS: in the condor submit file, add MYARGS as an argument placeholder, then define MYARGS per node in the dag file.
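A sketch of what that alternative could look like (node names, the category name "parallel", and all argument values are placeholders):

```
# condor.sub uses: arguments = "$(MYARGS)" and a single bare "queue"
# condor.dag: one JOB line per parallel task, each with its own VARS
JOB run0 condor.sub
VARS run0 MYARGS="--state IC86_2011" MEM="2GB"
JOB run1 condor.sub
VARS run1 MYARGS="--state IC86_2012" MEM="2GB"
JOB run2 condor.sub
VARS run2 MYARGS="--state IC86_2013" MEM="2GB"

# Throttle concurrency (analogous to max_materialize):
CATEGORY run0 parallel
CATEGORY run1 parallel
CATEGORY run2 parallel
MAXJOBS parallel 20
```

MAXJOBS limits how many nodes in the category run at once, which recovers the throttling that max_materialize provided in the plain condor_submit case.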