Date: Fri Jun 04 2010 - 01:22:07 CDT
A good value will depend on the autocorrelation properties of your system - if
you are plotting the trajectory of a protein, then 1000 frames per ns is pretty
close to the maximum that is useful. For example, if it takes (plucking a
number out of thin air) 20 ps for a sidechain to flip 180 degrees, then saving
frames every 0.5 ps really won't tell you anything more than saving them every
2 ps. If you can tell us what you're working with and what you're trying to
find out, answering will be easier.
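To make the arithmetic concrete: a 1 fs timestep with dcdfreq 50 means one
frame every 50 fs, i.e. 20,000 frames per ns. A small sketch of how the stride
and resulting file size work out (the numbers are taken from the message below;
the catdcd command at the end is illustrative - the file names are
placeholders):

```python
# Frame bookkeeping for subsampling a DCD trajectory with catdcd.
# Inputs taken from the original message: 1 fs timestep, dcdfreq 50, 22 GB/ns.

timestep_fs = 1        # MD integration timestep
dcdfreq = 50           # steps between saved frames
gb_per_ns = 22.0       # current trajectory size per ns of simulation

frame_interval_ps = timestep_fs * dcdfreq / 1000.0  # 0.05 ps between frames
frames_per_ns = int(1000 / frame_interval_ps)       # 20000 frames per ns

target_frames_per_ns = 1000
stride = frames_per_ns // target_frames_per_ns      # keep every 20th frame

print(f"stride = {stride}")
print(f"new frame spacing = {frame_interval_ps * stride:.1f} ps")
print(f"new size = {gb_per_ns / stride:.1f} GB/ns")

# The corresponding catdcd call would look something like this
# (sub.dcd / traj.dcd are hypothetical file names):
print(f"catdcd -o sub.dcd -stride {stride} traj.dcd")
```

So dropping to 1000 frames per ns keeps a frame every 1 ps - still finer than
the ~20 ps sidechain example above - while cutting the data by a factor of 20,
from 22 GB/ns to about 1.1 GB/ns.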
Quoting Philip Peartree <p.peartree_at_postgrad.manchester.ac.uk>:
> Hi All,
> I've run 13 ns of dynamics on a 99,000-atom system with a 1 fs timestep (2 fs
> kept dying on me), and having used a dcdfreq of 50 I've ended up with quite a
> large amount of data (22 GB/ns). I'm looking to use catdcd to reduce the
> number of frames in each file to get a more reasonable size; what I'm
> wondering is what the best thing to aim for is: 1000 frames per ns? 2000? I
> don't really want to throw away data, but I'm maxing out my ability to
> load trajectories!
> Phil Peartree
> University of Manchester
This archive was generated by hypermail 2.1.6 : Wed Feb 29 2012 - 15:54:12 CST