From: Marcelo C. R. Melo (melomcr_at_gmail.com)
Date: Fri Jan 20 2017 - 08:33:18 CST
Hi Francesco,
There is nothing that would keep NAMD from applying constraints like that in a
QM/MM simulation, just as it would in a "regular", classical MD
simulation.
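
For example, a minimal colvars sketch along these lines (the atom numbers,
the 3 Angstrom target, and the force constant are placeholders, not values
from any specific system) keeps a ligand atom near a metal center, and NAMD
applies it identically whether those atoms end up in the QM or the MM region:

    colvar {
        name metalLigandDist
        distance {
            group1 { atomNumbers { 1234 } }   # placeholder: the metal atom
            group2 { atomNumbers { 5678 } }   # placeholder: an atom of the bound molecule
        }
    }

    harmonic {
        colvars metalLigandDist
        centers 3.0          # placeholder target distance (Angstrom)
        forceConstant 10.0   # placeholder, kcal/mol/A^2
    }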
Best,
Marcelo
---
Marcelo Cardoso dos Reis Melo
PhD Candidate
Luthey-Schulten Group
University of Illinois at Urbana-Champaign
crdsdsr2_at_illinois.edu
+1 (217) 244-5983

On 19 January 2017 at 11:56, Francesco Pietra <chiendarret_at_gmail.com> wrote:

> Hi Marcelo:
>
> What about colvars in the config.namd file (the same colvars that were
> used in minimizing/heating/NPT)? They are accepted and the simulation is
> running. Actually, these colvars act on the QM part, keeping a foreign
> molecule near the transition-metal atom, which might well be unreasonable
> for a QM/MM run.
>
> Thanks
>
> francesco pietra
>
> On Tue, Jan 17, 2017 at 10:58 PM, Marcelo C. R. Melo <melomcr_at_gmail.com> wrote:
>
>> Sure, you could minimize, equilibrate (with and/or without constraints),
>> and run your system for some time before selecting a conformation to be
>> used in the TCL script, and then initiate the QM/MM runs. There is
>> nothing in the method or code that would prevent you from doing that.
>> Some discussion could ensue regarding the difference between the force
>> fields used for minimizing/equilibrating and those used for running the
>> simulations, so you should check the literature for your particular case.
>>
>> Best,
>> Marcelo
>>
>> ---
>> Marcelo Cardoso dos Reis Melo
>> PhD Candidate
>> Luthey-Schulten Group
>> University of Illinois at Urbana-Champaign
>> crdsdsr2_at_illinois.edu
>> +1 (217) 244-5983
>>
>> On 17 January 2017 at 12:10, Francesco Pietra <chiendarret_at_gmail.com> wrote:
>>
>>> Hi Marcelo:
>>>
>>> Adding "ENGRAD" got the simulation running. It was running on 6 cores,
>>> and therefore much too slowly for this system; I killed it while the
>>> minimization was slowly moving atoms.
>>>
>>> Probably obvious: can the minimization be run separately, submitting
>>> minimized (perhaps also "heated") files to "prepare.qm.region.tcl"?
>>>
>>> Nonetheless, I would be happy to have the simulation running with MOPAC
>>> too; with DFT, especially with BP86, one does not know where one stands
>>> as to the spin state.
>>>
>>> Thanks a lot
>>>
>>> francesco pietra
>>>
>>> On Mon, Jan 16, 2017 at 11:17 PM, Marcelo C. R. Melo <melomcr_at_gmail.com> wrote:
>>>
>>>> In the case of ORCA, you should always have the keyword "ENGRAD" in
>>>> your qmConfigLine. This tells ORCA to write a file ending in "engrad",
>>>> where the gradient is written.
>>>>
>>>> That should solve your issue.
>>>>
>>>> Marcelo
>>>>
>>>> ---
>>>> Marcelo Cardoso dos Reis Melo
>>>> PhD Candidate
>>>> Luthey-Schulten Group
>>>> University of Illinois at Urbana-Champaign
>>>> crdsdsr2_at_illinois.edu
>>>> +1 (217) 244-5983
>>>>
>>>> On 16 January 2017 at 01:55, Francesco Pietra <chiendarret_at_gmail.com> wrote:
>>>>
>>>>> Hello:
>>>>>
>>>>> This mail is in parallel to my previous mail about the same system and
>>>>> the same issue with QM/MM MOPAC.
>>>>>
>>>>> The system has total spin multiplicity 7 (six unpaired electrons in
>>>>> total, on two open-shell molecules):
>>>>>
>>>>> qmConfigLine "! UKS BP86 def2-TZVP def2-TZVP/J KDIIS SOSCF"
>>>>> qmConfigLine "%%output PrintLevel Mini Print\[ P_Mulliken \] 1 Print\[P_AtCharges_M\] 1 end"
>>>>> # Multiplicity of the QM region. This is needed for proper
>>>>> # construction of ORCA's input file.
>>>>> qmMult "1 7"
>>>>>
>>>>> (This qmConfigLine is the best for transition metals, in my experience
>>>>> with ORCA.)
>>>>>
>>>>> Folder /0 contains:
>>>>>
>>>>> qmmm_0.input
>>>>> qmmm_0.input.gbw
>>>>> qmmm_0.input.pntchrg
>>>>> qmmm_0.input.prop
>>>>> qmmm_0.input.TmpOut
>>>>>
>>>>> The TmpOut file:
>>>>>
>>>>> Total SCF time: 0 days 1 hours 14 min 18 sec
>>>>>
>>>>> -------------------------   --------------------
>>>>> FINAL SINGLE POINT ENERGY                    nan
>>>>> -------------------------   --------------------
>>>>>
>>>>> ***************************************
>>>>> *     ORCA property calculations      *
>>>>> ***************************************
>>>>>
>>>>> ---------------------
>>>>> Active property flags
>>>>> ---------------------
>>>>> (+) Dipole Moment
>>>>>
>>>>> ------------------------------------------------------------------------------
>>>>>                     ORCA ELECTRIC PROPERTIES CALCULATION
>>>>> ------------------------------------------------------------------------------
>>>>>
>>>>> Dipole Moment Calculation     ... on
>>>>> Quadrupole Moment Calculation ... off
>>>>> Polarizability Calculation    ... off
>>>>> GBWName                       ... /dev/shm/NAMD_4IEV/0/qmmm_0.input.gbw
>>>>> Electron density file         ... /dev/shm/NAMD_4IEV/0/qmmm_0.input.scfp.tmp
>>>>>
>>>>> -------------
>>>>> DIPOLE MOMENT
>>>>> -------------
>>>>>                                  X           Y           Z
>>>>> Electronic contribution:     4.59465    27.21307    -7.80847
>>>>> Nuclear contribution   :   -16.01483   -32.18596    12.99059
>>>>>                          -----------------------------------------
>>>>> Total Dipole Moment    :   -11.42018    -4.97289     5.18211
>>>>>                          -----------------------------------------
>>>>> Magnitude (a.u.)       :    13.49090
>>>>> Magnitude (Debye)      :    34.29115
>>>>>
>>>>> Timings for individual modules:
>>>>>
>>>>> Sum of individual times  ... 4508.572 sec (= 75.143 min)
>>>>> GTO integral calculation ...    8.134 sec (=  0.136 min)   0.2 %
>>>>> SCF iterations           ... 4500.439 sec (= 75.007 min)  99.8 %
>>>>>
>>>>> ****ORCA TERMINATED NORMALLY****
>>>>> TOTAL RUN TIME: 0 days 1 hours 15 minutes 27 seconds 331 msec
>>>>>
>>>>> The NAMD log:
>>>>>
>>>>> TCL: Minimizing for 100 steps
>>>>> Info: List of ranks running QM simulations: 0.
>>>>> ERROR: Could not find QM output file!
>>>>> FATAL ERROR: No such file or directory
>>>>>
>>>>> i.e., the same error as with MOPAC for the same system. Comparing with
>>>>> the furnished Example 1, which also ended OK in my hands, I was unable
>>>>> to work out which file corresponds to the "QM output file" Charm++ is
>>>>> complaining about.
>>>>>
>>>>> Thanks for help
>>>>>
>>>>> francesco pietra
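
For reference, the fix Marcelo describes above amounts to adding "ENGRAD" to
the first qmConfigLine, which makes ORCA write the qmmm_0.input.engrad file
NAMD reads the gradient from; without it, NAMD aborts with the "Could not
find QM output file!" error shown in the log. A minimal sketch of the
corresponding QM/MM block in the NAMD configuration file (the PDB file name
and ORCA path are placeholders; the qmConfigLine, qmMult, and qmBaseDir
values are taken from this thread, and the qmCharge value is a placeholder):

    qmForces        on
    qmParamPDB      "system_qm.pdb"        ;# placeholder: PDB with QM atoms flagged
    qmColumn        "beta"                 ;# column marking the QM region
    qmSoftware      "orca"
    qmExecPath      "/path/to/orca"        ;# placeholder
    qmBaseDir       "/dev/shm/NAMD_4IEV"
    # "ENGRAD" is required so ORCA writes the gradient file NAMD expects:
    qmConfigLine    "! UKS BP86 def2-TZVP def2-TZVP/J KDIIS SOSCF ENGRAD"
    qmConfigLine    "%%output PrintLevel Mini Print\[ P_Mulliken \] 1 Print\[P_AtCharges_M\] 1 end"
    qmMult          "1 7"                  ;# QM region 1 has multiplicity 7
    qmCharge        "1 0"                  ;# placeholder: total charge of QM region 1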