**From:** Brian Radak (*brian.radak_at_gmail.com*)

**Date:** Fri Mar 16 2018 - 09:26:38 CDT

**Next message:** Nick Guros: "Lowe-Andersen Thermostat Speed" | **Previous message:** Srijita Paul: "(no subject)" | **Maybe in reply to:** Victor Ovchinnikov: "Re:" | **Next in thread:** Srijita Paul: "Re:" | **Messages sorted by:** [ date ] [ thread ] [ subject ] [ author ] [ attachment ]

Please make responses to namd-l (reply all is fine), so that the discussion

is archived for the benefit of others.

The acceptance probability has very little to do with the temperature range

and way more to do with the temperature *spacing*. The prevailing wisdom is

to use exponential spacing:

T_(i+1) = T_i * alpha

where alpha is a constant slightly greater than one. The closer alpha is to

one, the higher the acceptance probability, but a high acceptance

probability does not guarantee that the simulation explores the temperature

space efficiently. The seminal work on this is from Kofke and later Nadler

and Hansmann - take a look if you are curious (shameless plug, I also

recently submitted a paper on a related topic).
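The exponential spacing rule above is easy to sketch in code. A minimal example, assuming illustrative endpoints of 280 K and 350 K and 8 replicas (these numbers are for demonstration only, not a recommendation):

```python
# Sketch: build an exponentially spaced temperature ladder, T_(i+1) = T_i * alpha,
# where alpha is fixed by the endpoints and the number of replicas.

def temperature_ladder(t_min, t_max, n_replicas):
    """Return n_replicas temperatures spaced geometrically from t_min to t_max."""
    alpha = (t_max / t_min) ** (1.0 / (n_replicas - 1))
    return [t_min * alpha ** i for i in range(n_replicas)]

ladder = temperature_ladder(280.0, 350.0, 8)
print(["%.2f" % t for t in ladder])
```

Note that for fixed endpoints, adding replicas pushes alpha closer to one and raises the acceptance probability, which is exactly the trade-off discussed above.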

The above only covers efficiency in the temperature exchange, but does

NOTHING to consider efficiency in the configuration space. I don't think

there are any strong agreements on what improves efficiency the most here.

Presumably a high temperature is better (how high? 400K, 500K, 600K?), but

there are some system specific considerations:

1) does the system degrade in an effectively irreversible way?

2) is the integrator still valid?

You hit upon the first of these. I will only add that:

1) the force field melting point and the experimental melting point are not

guaranteed to be the same, and the force field value may in fact be

much higher

2) a simulation may be immune to the kinds of degradation that you are

speaking of since the valence is fixed and covalent bonds cannot break. You

may be able to go to much higher temperatures than you think. I suggest you

try a fixed temperature simulation to see if the degradation actually

occurs.

I don't see the second concern considered as often, but it can be serious.

For example, using a 2 fs timestep at high temperature (even with rigid

heavy-light atom bonds) can lead to unstable and ill-defined results (i.e.

the simulation will never converge). Again, a simulation at the highest

proposed temperature should reveal this kind of problem: run Newtonian

dynamics (not Langevin!) and track energy conservation for, say, a few

hundred ps.
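A simple way to quantify the energy-conservation check is to fit a line to total energy versus time from the NVE test run; the slope is the drift. A minimal sketch, assuming you have already extracted times (ps) and total energies (kcal/mol) from the log (the extraction itself, e.g. from NAMD's ENERGY: records, is not shown):

```python
# Sketch: least-squares slope of total energy vs. time from an NVE test run.
# A well-behaved integrator should give a drift near zero over a few hundred ps.

def energy_drift(times_ps, total_energies):
    """Return the least-squares slope (kcal/mol per ps) of energy vs. time."""
    n = len(times_ps)
    mt = sum(times_ps) / n
    me = sum(total_energies) / n
    num = sum((t - mt) * (e - me) for t, e in zip(times_ps, total_energies))
    den = sum((t - mt) ** 2 for t in times_ps)
    return num / den

# Demo with synthetic, perfectly conserved energies (slope is exactly zero):
print(energy_drift([0.0, 100.0, 200.0, 300.0], [-5000.0] * 4))  # → 0.0
```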

HTH,

BKR

On Fri, Mar 16, 2018 at 2:05 AM, Srijita Paul <srijitap91_at_gmail.com> wrote:

> Hi Brian,
>
> Thanks for your reply. Actually the temperature range of my REMD
> simulation is very short, that is 280-350 K, as the molecule breaks after
> 350 K. The short temperature limit may be the cause of this low acceptance
> probability. Is it a valid way to do REMD in this low temperature limit?
> Generally people do REMD for enhanced sampling in the higher temperature
> range, that is why I am asking.
>
> Srijita Paul
> IIT Guwahati
>
> On Thu, Mar 15, 2018 at 6:54 PM, Brian Radak <brian.radak_at_gmail.com>
> wrote:
>
>> I think you are looking for the *mean* acceptance probability, that is,
>> the average probability with which an exchange is accepted. You can compute
>> this in two different ways which, to my knowledge, are essentially
>> equivalent for any reasonably large number of REMD cycles:
>>
>> 1) take the expectation of the Metropolis criterion P = min[1,
>> exp(-Delta_ij)], where Delta_ij contains the observed energies and
>> temperatures -- this method is cumbersome and requires sifting through a
>> lot of data
>>
>> 2) just divide the number of successes by the number of attempts: (50 /
>> 450) = 0.11 = 11%
>>
>> The reasonableness of the observed value depends on what your exchange
>> scheme is. I assume that your script implements nearest neighbor
>> sampling? If the neighbors are chosen stochastically with equal
>> probabilities, then 11% is very close to optimal. If the neighbors are
>> chosen in the deterministic "up/down" strategy, then something closer to 20%
>> is preferred.
>>
>> If you are unhappy with your acceptance probability you have two options:
>> 1) assume that you have bad statistics and keep running until the
>> performance numbers change, or 2) start over and choose more replicas over
>> the temperature range.
>>
>> HTH,
>> BKR
>>
>> On Thu, Mar 15, 2018 at 7:44 AM, Srijita Paul <srijitap91_at_gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> Can anybody explain to me the output file obtained from a REMD simulation,
>>> .job0.restart900.0.tcl?
>>>
>>> array set replica {index.b 4 index 3 temperature.a 283.48
>>> exchanges_attempted 450 loc.a 4 temperature.b 289.07 temperature 286.26
>>> exchanges_accepted 50 loc.b 18 index.a 2}
>>>
>>> exchanges_attempted 450
>>> exchanges_accepted 50
>>>
>>> Is it a good result for REMD? How can I find the acceptance probability for
>>> my system?


This archive was generated by hypermail 2.1.6: Mon Dec 31 2018 - 23:20:55 CST