From: Axel Kohlmeyer
Date: Fri Feb 15 2008 - 17:00:40 CST

On Fri, 15 Feb 2008, Dong Xu wrote:

DX> Hi,
DX> We have large trajectory (dcd) files (>60GB) and are planning to
DX> purchase a visualization computer that comes with 32-64GB RAM. Before
DX> making this purchase, our concern is whether memory size is the only
DX> limiting factor for loading large trajectory files to VMD? We'd like


I have been using a machine with 16GB of RAM (dual Woodcrest) to
process even larger data sets, and so far main memory has been the
only limitation. The data sets I operate on total more than 130GB
(mostly in compressed .xtc format, so even more than your 60GB of
data once loaded into VMD), but I have found that it is rather
pointless to count on loading a whole trajectory into VMD; it is
much more efficient to reduce the data up front with
preprocessing/analysis scripts (using mechanisms like bigdcd).
So before you dump loads of money into a machine, please keep in
mind that many machines become slower when you add more memory
modules, and current high-end memory gets _very_ hot. I would
therefore also put more effort into smart Tcl/Python/VMD
programming, and mainly buy a _fast_ machine with decent but not
excessive memory and good graphics...
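To illustrate the bigdcd approach mentioned above: the idea is that
the trajectory is streamed through VMD one frame at a time, with a
user-supplied procedure called on each frame, so the whole file never
has to fit in memory. The sketch below is based on the bigdcd script
from the VMD script library; the file names, the atom selections, and
the RMSD analysis are placeholder assumptions (not from this message),
and the exact bigdcd argument signature may differ between versions of
the script.

```tcl
# Sketch: per-frame RMSD analysis with bigdcd (VMD script library).
# bigdcd reads the trajectory frame by frame and invokes the callback
# on each frame, so only one frame is resident in memory at a time.
# protein.psf / big.dcd are placeholder file names.
source bigdcd.tcl

proc myrmsd { frame } {
    global ref sel all
    # Align the current frame to the reference, then record the RMSD.
    $all move [measure fit $sel $ref]
    puts "$frame: [measure rmsd $sel $ref]"
}

set mol [mol new protein.psf waitfor all]
set all [atomselect $mol all]
set ref [atomselect $mol "backbone" frame 0]
set sel [atomselect $mol "backbone"]

bigdcd myrmsd big.dcd    ;# stream the trajectory frame by frame
bigdcd_wait              ;# block until all frames are processed
```

The same pattern works for any per-frame reduction (distances, counts,
averages): accumulate results in the callback and write out only the
reduced data, instead of holding 60GB of coordinates in RAM.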


DX> to know if there are other hardware/software requirements in play
DX> so that we can come to the correct purchasing specifications.
DX> Thanks,
DX> Dong Xu

Axel Kohlmeyer
   Center for Molecular Modeling   --   University of Pennsylvania
Department of Chemistry, 231 S.34th Street, Philadelphia, PA 19104-6323
tel: 1-215-898-1582,  fax: 1-215-573-6233,  office-tel: 1-215-898-5425
If you make something idiot-proof, the universe creates a better idiot.