From: Jérôme Hénin (jhenin_at_ifr88.cnrs-mrs.fr)
Date: Sat Sep 04 2010 - 02:50:54 CDT
Hi,
If you use a scripted NAMD config file (one containing a series of "run" or
"minimize" commands), only a small number of parameters can be changed
between those commands: see
http://www.ks.uiuc.edu/Research/namd/2.7b2/ug/node10.html#843 for the list.
The colvars module cannot be enabled or disabled at that point, so the
"colvars" option has to be turned on before the first "run" command (it is
not available during minimization). In practice, that means you should just
use the restart files from this run to start a separate NAMD simulation.
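For illustration, here are the relevant lines of such a follow-up job, as a
minimal sketch (the file names are hypothetical, and the rest of the usual
configuration is omitted):

    bincoordinates    first_run.restart.coor  ;# restart files written by the first run
    binvelocities     first_run.restart.vel
    extendedSystem    first_run.restart.xsc

    colvars           on                      ;# must be enabled before the first "run"
    colvarsConfig     colvars.conf

    run               200000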
Best,
Jerome
On 4 September 2010 06:26, dhacademic <dhacademic_at_gmail.com> wrote:
> I find that combining TMD and colvars may work well in NAMD.
>
> However, I still have a problem at the end of the calculation. The run is
> 200000 steps long, and after step 200000 the following error message
> appears. Can anyone give some help on this issue?
>
> ENERGY: 200000 146049.7538 127806.8847 27617.5521 1209.0265 -1445341.5183 104279.5996 0.0000 0.0000 394277.1053 -644101.5964 310.1640 -1038378.7016 -640605.3333 309.3770 -10.3663 44.3511 4082608.4264 0.2499 1.7430
> WRITING EXTENDED SYSTEM TO RESTART FILE AT STEP 200000
> WRITING COORDINATES TO DCD FILE AT STEP 200000
> WRITING COORDINATES TO RESTART FILE AT STEP 200000
> FINISHED WRITING RESTART COORDINATES
> WRITING VELOCITIES TO RESTART FILE AT STEP 200000
> FINISHED WRITING RESTART VELOCITIES
> TCL: Setting parameter colvars to on
> FATAL ERROR: Setting parameter colvars from script failed!
>
>
>
> On Fri, Sep 3, 2010 at 1:30 PM, dhacademic <dhacademic_at_gmail.com> wrote:
>>
>> Hi Jerome,
>>
>> Thanks for the reply. Following your suggestion, I have tried the colvars
>> module in NAMD, but I am still quite confused about the colvars
>> calculations. I have a few questions about the parameters:
>>
>> (1) The first two colvars are defined as distance restraints (my
>> dist1colvar and dist2colvar). For each distance, the "lowerWallConstant"
>> (or "upperWallConstant") imposes a potential centered on the
>> "lowerBoundary" (or "upperBoundary"), so the distance is restricted to
>> within that range. Does this mean that the "harmonic" bias (the "distbias"
>> in the conf file below) is not necessary in this case?
>>
>> (2) The third colvar is defined on an RMSD (rmsdcolvar), through which I
>> want to implement TMD using colvars. In TMD, the RMSD gradually decreases
>> to a target value; similarly, I specify the keywords "targets 0.0" and
>> "targetsNumsteps 10000" in the harmonic restraint of colvars. But I am not
>> sure whether this is correct, and I run into this problem after a
>> 10000-step run:
>> TCL: Setting parameter colvars to on
>> FATAL ERROR: Setting parameter colvars from script failed!
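>>
>> For comparison, here is a sketch of the same restraint written with the
>> keyword spellings given in the colvars documentation, "targetCenters" and
>> "targetNumSteps" (the initial center of 4.0 is an assumed starting RMSD
>> value):
>>
>> harmonic {
>>     name rmsdbias
>>     colvars rmsdcolvar
>>     forceConstant 1000.0
>>     centers 4.0             # assumed starting RMSD
>>     targetCenters 0.0       # final restraint center
>>     targetNumSteps 10000    # steps over which the center moves
>> }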
>>
>> (3) The histogram block is included in the colvars configuration file, but
>> no histogram information can be found in the output file.
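>>
>> For reference, a complete histogram block would also name the variable(s)
>> to histogram with a "colvars" keyword; a sketch assuming the rmsdcolvar
>> defined below (the block name is hypothetical):
>>
>> histogram {
>>     name rmsdhist
>>     colvars rmsdcolvar      # variable(s) to histogram
>>     outputFreq 10           # steps between histogram outputs
>> }
>>
>> (Note also that the colvars documentation lists "analysis" as a
>> colvars.conf keyword rather than a NAMD config option.)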
>>
>> Below are my NAMD configuration lines for colvars and the corresponding
>> colvars.conf file. Can you please help me figure out what is wrong in
>> these files? Many thanks.
>>
>> #### collective variables ####
>> colvars on
>> analysis on
>> colvarsConfig colvars.conf
>>
>>
>> #### colvars.conf ####
>> colvar {
>>     name dist1colvar
>>     width 0.1
>>     lowerBoundary 3.4
>>     upperBoundary 4.2
>>     lowerWallConstant 10.0
>>     upperWallConstant 10.0
>>     outputAppliedForce on
>>     distance {
>>         group1 { atomNumbers 100970 }
>>         group2 { atomNumbers 94229 }
>>     }
>> }
>>
>> colvar {
>>     name dist2colvar
>>     width 0.1
>>     lowerBoundary 3.4
>>     upperBoundary 4.2
>>     lowerWallConstant 10.0
>>     upperWallConstant 10.0
>>     outputAppliedForce on
>>     distance {
>>         group1 { atomNumbers 107523 }
>>         group2 { atomNumbers 87680 }
>>     }
>> }
>>
>> colvar {
>>     name rmsdcolvar
>>     lowerBoundary 0.0
>>     upperBoundary 4.0
>>     lowerWallConstant 1000.0
>>     upperWallConstant 1000.0
>>     outputAppliedForce on
>>     rmsd {
>>         atoms {
>>             atomNameResidueRange 386-485
>>             psfSegID A
>>             atomNameResidueRange 386-485
>>             psfSegID B
>>             atomNameResidueRange 386-485
>>             psfSegID C
>>             atomNameResidueRange 386-485
>>             psfSegID D
>>             atomNameResidueRange 628-763
>>             psfSegID A
>>             atomNameResidueRange 628-763
>>             psfSegID B
>>             atomNameResidueRange 628-763
>>             psfSegID C
>>             atomNameResidueRange 628-763
>>             psfSegID D
>>         }
>>         refPositionsFile colvar.pdb
>>         refPositionsCol O
>>         refPositionsColValue 5
>>     }
>> }
>>
>> harmonic {
>>     name distbias
>>     colvars dist1colvar dist2colvar
>>     forceConstant 10.0
>> }
>>
>> harmonic {
>>     name rmsdbias
>>     colvars rmsdcolvar
>>     forceConstant 1000.0
>>     targets 0.0
>>     targetsNumsteps 10000
>> }
>>
>> histogram {
>>     outputFreq 10
>> }
>>
>>
>> On Tue, Aug 31, 2010 at 11:22 AM, Jérôme Hénin <jhenin_at_ifr88.cnrs-mrs.fr>
>> wrote:
>>>
>>> Hi,
>>>
>>> You could implement both TMD and the distance restraint as colvar
>>> biases - see "Collective Variable Calculations" in the user's guide.
>>>
>>> Cheers,
>>> Jerome
>>>
>>> On 31 August 2010 15:25, dhacademic <dhacademic_at_gmail.com> wrote:
>>> > Hi everyone,
>>> >
>>> > I ran into this "FATAL ERROR: Due to a design error, GlobalMasterServer
>>> > does not support individual atom requests from multiple global force
>>> > clients on parallel runs" problem when running TMD together with Tcl
>>> > forces. I have searched the mailing list and found that someone else
>>> > has also had this problem. It seems that two user-defined forces (TMD
>>> > and Tcl forces) cannot be used at the same time.
>>> >
>>> > In my system, TMD is applied to segment A and a distance restraint is
>>> > imposed on segment B (using Tcl forces). Is there any alternative in
>>> > NAMD that can fulfill this requirement?
>>> >
>>> > Many thanks for any suggestion.
>>> >
>>> >
>>> >
>>
>
>