Re: Multiple Jobs

From: Mike McCallum (mmccallum_at_PACIFIC.EDU)
Date: Sun Feb 06 2022 - 10:29:07 CST

Hi Kelly,
We usually create a bunch of directories, one for each run, with the same file names in all (.conf, .pdb, .psf, .cons, etc.), so the .conf file is the same. Then a shell script scatters the files to each directory. I don't know how different each of your runs is, but this works for us when each system is the same.
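The scatter step described above might look something like this, a minimal sketch; the directory and file names (master/, run_01, system.conf, ...) are placeholders, not from the original post:

```shell
#!/bin/sh
# Copy one master set of input files into a separate directory per run,
# keeping identical file names so the same .conf works everywhere.
set -e

mkdir -p master
# stand-in master files for the sketch
for f in system.conf system.pdb system.psf system.cons; do
    : > "master/$f"
done

# three runs here; in practice this would loop over all 100
for i in 01 02 03; do
    mkdir -p "run_$i"
    cp master/system.conf master/system.pdb \
       master/system.psf  master/system.cons "run_$i/"
done
```

Each run_NN directory can then be submitted as its own SLURM job with no per-run edits to the configuration file.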


On Feb 6, 2022, at 12:43 AM, McGuire, Kelly <<>> wrote:

Has anyone ever submitted lots of jobs in parallel with SLURM (let's say, for example, 100 minimization jobs for 100 different protein systems) using only one minimization configuration file, but defining environment variables for the coordinates, structure, consref, conskfile, outputname, dcdfile, and restartname that are passed from your bash script to the configuration file?

Or is there no way around making 100 minimization files, 100 annealing files, and 100 equilibration files, one of each for each of the 100 protein systems?
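For what it's worth, the env-variable approach in the question should be workable in principle, since NAMD config files are interpreted as Tcl and can read $env(NAME). A minimal sketch (the variable and file names, and the protein_042 system, are illustrative assumptions):

```shell
#!/bin/sh
# Sketch: one generic minimization config reads its inputs from the
# environment; a submit script exports per-system values before launch.
set -e

# --- generic config, written once (min.conf) ---
# quoted heredoc so $env(...) reaches the file literally for Tcl
cat > min.conf <<'EOF'
coordinates   $env(COORDS)
structure     $env(STRUCT)
consref       $env(CONSREF)
conskfile     $env(CONSK)
outputName    $env(OUTNAME)
# ... rest of the shared minimization settings ...
EOF

# --- per-job environment, e.g. inside an sbatch --array script,
#     where SYSTEM would be derived from SLURM_ARRAY_TASK_ID ---
SYSTEM="protein_042"
export COORDS="$SYSTEM.pdb"  STRUCT="$SYSTEM.psf"
export CONSREF="$SYSTEM.cons" CONSK="$SYSTEM.cons"
export OUTNAME="${SYSTEM}_min"

# namd2 min.conf    # actual launch line, omitted in this sketch
echo "$COORDS -> $OUTNAME"
```

With a SLURM job array, each array task exports its own set of names and all 100 tasks share the single min.conf, so no per-system copies of the minimization, annealing, or equilibration files are needed.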

Dr. Kelly L. McGuire
PhD Biophysics
Department of Physiology and Developmental Biology
Brigham Young University
LSB 3050
Provo, UT 84602

C. Michael McCallum
Department of Chemistry, UOP
mmccallum .at. pacific .dot. edu (209) 946-2636 v / (209) 946-2607 fax

This archive was generated by hypermail 2.1.6 : Tue Dec 13 2022 - 14:32:44 CST