Tuning QM-MM with namd-orca on one cluster node

From: Francesco Pietra (chiendarret_at_gmail.com)
Date: Thu Jan 31 2019 - 12:25:15 CST

Hello
Having obtained very good performance with NAMD (nightly build) + MOPAC on one
cluster node for my system (large QM part, see below, including two iron
ions), I am now trying the same with NAMD (nightly build) + ORCA on the same
cluster (36 cores across two sockets). So far I have been unable to get namd
and orca to run on more than one core each.

namd.conf
qmConfigLine "! UKS BP86 RI SV def2/J enGrad SlowConv"
qmConfigLine "%%output Printlevel Mini Print\[ P_Mulliken \] 1
Print\[P_AtCharges_M\] 1 end"
(SCF already converged by omitting "enGrad")
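
I wonder whether the core count for ORCA has to be requested in the ORCA input
itself rather than through NAMD, e.g. by passing a %pal block (or a PALn
keyword) through qmConfigLine. Something along these lines, where "nprocs 8"
is only a guess on my part, not a tested value:

qmConfigLine "%%pal nprocs 8 end"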

namd.job
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=36
/galileo/home/userexternal/fpietra0/NAMD_Git-2018-11-22_Linux-x86_64-multicore/namd2 \
    namd-01.conf +p5 +setcpuaffinity +showcpuaffinity > namd-01.log
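
As far as I understand, "+p5" only sets the number of NAMD threads and does
not propagate to ORCA, so on a single node most cores should probably be left
to the QM program, which NAMD waits for at every step. If that is right, the
launch line would look more like the following (core counts are placeholders,
not a tested setting):

/galileo/home/userexternal/fpietra0/NAMD_Git-2018-11-22_Linux-x86_64-multicore/namd2 \
    namd-01.conf +p4 +setcpuaffinity +showcpuaffinity > namd-01.log
# remaining cores left for the ORCA workers requested via the %pal block above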

namd.log
Info: Running on 5 processors, 1 nodes, 1 physical nodes.
Info: Number of QM atoms (excluding Dummy atoms): 315
Info: We found 26 QM-MM bonds.
Info: Applying user defined multiplicity 1 to QM group ID 1
Info: 1) Group ID: 1 ; Group size: 315 atoms ; Total PSF charge: -1
Info: Found user defined charge 1 for QM group ID 1. Will ignore PSF charge.
Info: MM-QM pair: 180:191 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 208:195 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 243:258 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 273:262 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 296:313 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 324:317 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 358:373 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 394:377 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 704:724 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 742:728 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 756:769 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 799:788 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 820:830 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 864:851 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 1461:1479 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 1511:1500 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 1532:1547 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 1566:1551 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 1933:1946 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 1991:1974 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 2011:2018 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 2050:2037 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 2072:2083 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 2098:2087 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 2139:2154 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
Info: MM-QM pair: 2174:2158 -> Value (distance or ratio): 1.09 (QM Group 0 ID 1)
TCL: Minimizing for 200 steps
Info: List of ranks running QM simulations: 0.
Nothing about affinity is reported (it was clearly displayed in the MOPAC case).

/0/qmm_0_input.TmpOut shows SCF ITERATIONS

"top" shown a single PR for both namd and orca.
---
I had already tried a different job setup:
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=4
#SBATCH --ntasks-per-socket=2
module load profile/archive
module load autoload openmpi/2.1.1--gnu--6.1.0
/galileo/home/userexternal/fpietra0/NAMD_Git-2018-11-22_Linux-x86_64-multicore/namd2 \
    namd-01.conf +p5 > namd-01.log

Here too, "top" showed a single PR for both namd and orca, so that in about
20 hous, namd.log was at "ENERGY 2", indicating that 1400 hrs were needed
to complete the simulation.
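
Could it also be that a parallel ORCA run needs orca to be invoked through its
full absolute path, with a matching OpenMPI in PATH and LD_LIBRARY_PATH for
the worker processes it spawns? In that case the relevant namd.conf lines
would be something like (the path below is only a placeholder for my
installation):

qmSoftware "orca"
qmExecPath "/full/path/to/orca"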

Thanks for advice
francesco pietra
