matsci_meta_workflow_driver.py Command Help
Command: $SCHRODINGER/run matsci_meta_workflow_gui_dir/matsci_meta_workflow_driver.py
usage: $SCHRODINGER/run matsci_meta_workflow_gui_dir/matsci_meta_workflow_driver.py
[-h] -stages STAGES [-format] [-combine_input] [-cpu_host HOSTNAME:X]
[-gpu_host HOSTNAME:X] [-md_umbrella]
[-stage_umbrella {never,always,per_flag}] [-HOST <hostname>] [-D]
[-VIEWNAME <viewname>] [-OPLSDIR OPLSDIR] [-JOBNAME JOBNAME]
input_files [input_files ...]
A driver to run a workflow of workflows. Copyright Schrodinger, LLC. All rights reserved.
positional arguments:
input_files Input structure files. For most supported workflows,
these will be .mae, .maegz or .cms files. Other input
structure formats such as .smi, .pdb and .sdf are also
supported. If the file is a .mae, .maegz or .smi file
and contains multiple structures/SMILES strings, each
structure will be run in a separate workflow. Other
custom input such as .csv or .txt files is also
supported if the initial step in the workflow is a
custom script that handles such input. For workflows
that begin with a polymer builder step, the input
structure must be a multi-structure .mae/.maegz file
that contains all the required polymer components, and
only a single workflow will be run.
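For example, a minimal invocation might look like the following (the stages file and structure file names here are hypothetical placeholders):

```shell
# Run the meta-workflow driver with a stages file and a multi-structure
# input file; one workflow is launched per structure in the .maegz file.
$SCHRODINGER/run matsci_meta_workflow_gui_dir/matsci_meta_workflow_driver.py \
    -stages stages.txt \
    monomers.maegz
```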
options:
-h, -help Show this help message and exit.
-stages STAGES The text file containing information for the workflow
stages (default: None)
-format Print extended information about the input format and
exit (default: None)
-combine_input Combine structures in the input file into a single
workflow rather than run one workflow per structure.
Only valid if the initial steps of the workflow are
capable of accepting multiple structures as input - i.e. steps
such as the Disordered System Builder or Polymer
Builder or a custom script with input=other. Note that
having a root (first) stage that is a custom stage
that takes input=other will cause this behavior even
if -combine_input is not given on the command line.
(default: False)
-cpu_host HOSTNAME:X Specify the host, HOSTNAME, for CPU subjobs and the
number of allowed simultaneous CPU subjobs, X
(default: None)
-gpu_host HOSTNAME:X Specify the host, HOSTNAME, for GPU subjobs and the
number of allowed simultaneous GPU subjobs, X
(default: None)
-md_umbrella Allow some stages that run GPU (Desmond) tasks to run
locally on the driver machine, one at a time, rather
than resubmitting to the queue. The default behavior
is to allow only CPU tasks to run locally on the
driver machine. Use of this flag will request a
Desmond license from the queue to ensure that one is
available for local Desmond tasks. (default: False)
-stage_umbrella {never,always,per_flag}
Stages that combine a Python driver with a GPU subjob
and implement the -md_umbrella flag can submit the
driver to the CPU queue and the subjob to the GPU
queue, or the driver to the GPU queue and then run the
subjob on that GPU without submitting the subjob back
to the queue. The latter is known as "umbrella" mode.
The value of the -stage_umbrella flag defines the
behavior for these stages. "per_flag" means that the
behavior for each stage is controlled by whether the
-md_umbrella flag is included in the command for that
stage. "always" means these stages will always use
umbrella mode, and "never" means these stages will
never use umbrella mode. With "always" or "never",
the presence or absence of the -md_umbrella flag in
the stage command will be ignored. Note that if
-cpu_host and -gpu_host are not given, the value of
this flag is ignored and the behavior of "always" is
used. (default: per_flag)
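As a sketch (hostnames and file names hypothetical), forcing umbrella mode for all stages that support it:

```shell
# With -stage_umbrella always, per-stage -md_umbrella flags are ignored
# and every supporting stage runs its GPU subjob under the driver's GPU job.
$SCHRODINGER/run matsci_meta_workflow_gui_dir/matsci_meta_workflow_driver.py \
    -stages stages.txt \
    -cpu_host cpu_cluster:4 \
    -gpu_host gpu_cluster:2 \
    -stage_umbrella always \
    input.cms
```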
Job Control Options:
-HOST <hostname> Run job remotely on the indicated host entry.
(default: localhost)
-D, -DEBUG Show details of Job Control operation. (default:
False)
-VIEWNAME <viewname> Specifies the viewname used for job filtering in Maestro.
(default: False)
-OPLSDIR OPLSDIR Specifies directory for custom forcefield parameters.
(default: None)
-JOBNAME JOBNAME Provide an explicit name for the job. (default: None)