From: Francesco Pietra (chiendarret_at_gmail.com)
Date: Tue Jan 17 2017 - 12:10:52 CST
Hi Marcelo:
Adding "ENGRAD" let the simulation running. It was running on 6 cores,
therefore much too slow for this system. Killed while minimization was
slowly moving atoms.
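
For reference, the ORCA part of the configuration now reads roughly as below;
the %%pal line is only my guess at how one would ask ORCA for more cores and
has not been tested here (the core count is a placeholder):

qmConfigLine    "! UKS BP86 def2-TZVP def2-TZVP/J KDIIS SOSCF ENGRAD"
qmConfigLine    "%%pal nprocs 16 end"
qmConfigLine    "%%output PrintLevel Mini Print\[ P_Mulliken \] 1 Print\[P_AtCharges_M\] 1 end"
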
Probably an obvious question: can the minimization be run separately, so that
already minimized (and perhaps also heated) files are then submitted to
"prepare.qm.region.tcl"?
Nonetheless, I would be happy to have the simulation running with MOPAC too;
with DFT, and especially with BP86, one never knows where one stands as to the
spin state.
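
For MOPAC, I imagine the spin state would be pinned down with something along
the following line (just a sketch; the net charge and the choice of PM7 are
assumptions to be adapted):

qmConfigLine    "PM7 XYZ T=2M 1SCF CHARGE=0 UHF SEPTET AUX LARGE GRAD QMMM GEO-OK"
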
Thanks a lot
francesco pietra
On Mon, Jan 16, 2017 at 11:17 PM, Marcelo C. R. Melo <melomcr_at_gmail.com>
wrote:
> In the case of ORCA, you should always have the keyword "ENGRAD" in your
> qmConfigLine. This tells ORCA to write a file ending in "engrad" where the
> gradient is written.
>
> That should solve your issue.
>
> Marcelo
>
> ---
> Marcelo Cardoso dos Reis Melo
> PhD Candidate
> Luthey-Schulten Group
> University of Illinois at Urbana-Champaign
> crdsdsr2_at_illinois.edu
> +1 (217) 244-5983
>
> On 16 January 2017 at 01:55, Francesco Pietra <chiendarret_at_gmail.com>
> wrote:
>
>> Hello:
>>
>> This mail is in parallel to my previous mail about the same system and the
>> same issue with QM/MM MOPAC.
>>
>> The system has total spin multiplicity 7 (six unpaired electrons in total,
>> on two open-shell molecules).
>>
>> qmConfigLine "! UKS BP86 def2-TZVP def2-TZVP/J KDIIS SOSCF"
>> qmConfigLine "%%output PrintLevel Mini Print\[ P_Mulliken \] 1
>> Print\[P_AtCharges_M\] 1 end"
>> # Multiplicity of the QM region. This is needed for proper
>> # construction of ORCA's input file.
>> qmMult "1 7"
>>
>> The qmConfigLine above is the best for transition metals in my experience
>> with ORCA.
>>
>> Folder /0 contains
>> qmmm_0.input
>> qmmm_0.input.gbw
>> qmmm_0.input.pntchrg
>> qmmm_0.input.prop
>> qmmm_0.input.TmpOut
>>
>> The TmpOut file:
>>
>> Total SCF time: 0 days 1 hours 14 min 18 sec
>>
>> ------------------------- --------------------
>> FINAL SINGLE POINT ENERGY nan
>> ------------------------- --------------------
>>
>>
>> ***************************************
>> * ORCA property calculations *
>> ***************************************
>>
>> ---------------------
>> Active property flags
>> ---------------------
>> (+) Dipole Moment
>>
>>
>> --------------------------------------------------------------------------------
>>                        ORCA ELECTRIC PROPERTIES CALCULATION
>> --------------------------------------------------------------------------------
>>
>> Dipole Moment Calculation ... on
>> Quadrupole Moment Calculation ... off
>> Polarizability Calculation ... off
>> GBWName ... /dev/shm/NAMD_4IEV/0/qmmm_0.input.gbw
>> Electron density file ... /dev/shm/NAMD_4IEV/0/qmmm_0.input.scfp.tmp
>>
>> -------------
>> DIPOLE MOMENT
>> -------------
>> X Y Z
>> Electronic contribution: 4.59465 27.21307 -7.80847
>> Nuclear contribution : -16.01483 -32.18596 12.99059
>> -----------------------------------------
>> Total Dipole Moment : -11.42018 -4.97289 5.18211
>> -----------------------------------------
>> Magnitude (a.u.) : 13.49090
>> Magnitude (Debye) : 34.29115
>>
>>
>> Timings for individual modules:
>>
>> Sum of individual times ... 4508.572 sec (= 75.143 min)
>> GTO integral calculation ... 8.134 sec (= 0.136 min) 0.2 %
>> SCF iterations ... 4500.439 sec (= 75.007 min) 99.8 %
>> ****ORCA TERMINATED NORMALLY****
>> TOTAL RUN TIME: 0 days 1 hours 15 minutes 27 seconds 331 msec
>>
>>
>> The NAMD log:
>>
>> TCL: Minimizing for 100 steps
>> Info: List of ranks running QM simulations: 0.
>> ERROR: Could not find QM output file!
>> FATAL ERROR: No such file or directory
>>
>> i.e., the same error as with MOPAC for the same system. Comparing with the
>> supplied Example1 - which also finished fine in my hands - I was unable to
>> work out which file corresponds to the "QM output file" that Charm++ is
>> complaining about.
>>
>> Thanks for help
>>
>> francesco pietra
>>
>
>