From: John Stone (johns_at_ks.uiuc.edu)
Date: Fri Feb 15 2008 - 17:03:15 CST

Hi,
  For loading large trajectories, the only real limitations are
the amount of physical memory on your machine and making sure that
you're running a 64-bit kernel and a 64-bit build of VMD.
The Windows and Mac OS X versions of VMD are still 32-bit, so at present
you'll need to run one of the 64-bit Unix versions of VMD for datasets
of this size.

We regularly do visualization and analysis work with huge trajectories on
machines with 16GB and 32GB of local memory, on both Linux and Solaris hosts.
This is currently the easiest way to work with huge trajectories efficiently.
If you need any help getting things set up, let us know.

I'm also working on a future design change for the VMD internals that will
enable it to work with trajectories that are far larger than the amount
of physical memory in the machine through a new out-of-core trajectory
plugin API. I will likely implement this first using my own special
trajectory format and use mmap() and related kernel VM calls to allow
VMD to map monstrously huge MD trajectories into virtual memory.
The trick will be to add pre-fetching threads for trajectory analysis
and playback, and to give the OS kernel "hints" about which timesteps
need to be in-core and which can optionally be paged out.
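
As a rough sketch of the mmap()-plus-hints scheme (this is not VMD code;
the page-aligned frame size, file layout, and function names are invented
for the example, and the madvise() calls assume a Unix system with
Python 3.8 or later):

```python
import mmap
import os
import tempfile

# Hypothetical padded frame size: a real out-of-core format would pad
# frames to page boundaries so madvise() ranges line up cleanly.
FRAME_BYTES = mmap.ALLOCATIONGRANULARITY
N_FRAMES = 64

# Build a throwaway "trajectory" file so the sketch is self-contained.
path = os.path.join(tempfile.mkdtemp(), "traj.bin")
with open(path, "wb") as f:
    f.write(b"\0" * (FRAME_BYTES * N_FRAMES))

with open(path, "rb") as f:
    # Map the whole trajectory into virtual memory; no frame data is
    # read from disk until its pages are actually touched.
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)

def hint_needed(first_frame, n_frames):
    """Hint that these timesteps should be faulted in ahead of use."""
    start = first_frame * FRAME_BYTES
    length = min(n_frames * FRAME_BYTES, len(mm) - start)
    mm.madvise(mmap.MADV_WILLNEED, start, length)

def hint_done(first_frame, n_frames):
    """Hint that these timesteps may be paged out again."""
    start = first_frame * FRAME_BYTES
    length = min(n_frames * FRAME_BYTES, len(mm) - start)
    mm.madvise(mmap.MADV_DONTNEED, start, length)

# Playback loop: ask for a few frames ahead, release frames behind.
for i in range(N_FRAMES):
    if i + 1 < N_FRAMES:
        hint_needed(i + 1, 4)
    frame = mm[i * FRAME_BYTES:(i + 1) * FRAME_BYTES]  # "render" frame i
    if i >= 8:
        hint_done(i - 8, 1)

mm.close()
```
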
Later on, I hope to have a more general implementation that can work with
any reasonable trajectory format (and without the need for mmap()), where
VMD will keep a working set of frames in-core, and will dynamically
load/free frames as analysis/visualization operations demand. This too
will attempt to use scout threads to prefetch frames on-the-fly before
they are needed so that the user "feels" like they were already in memory.
I don't have a timeline for these developments yet; I'll know more once
my experiments with the initial Unix-specific mmap()-based implementation
have made significant progress.
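
The working-set approach can be sketched in a few lines (hypothetical
names throughout; load_frame() stands in for whatever a real trajectory
plugin would provide, and the cache and lookahead sizes are arbitrary):

```python
import collections
import queue
import threading

N_FRAMES = 100
WORKING_SET = 16   # max frames kept in-core at once

def load_frame(i):
    """Stand-in for a real per-timestep read through a trajectory plugin."""
    return bytes([i % 256]) * 64

class FrameCache:
    """Bounded working set of frames, plus a 'scout' thread that
    prefetches upcoming frames before the consumer asks for them."""

    def __init__(self):
        self._frames = collections.OrderedDict()  # frame index -> data, LRU order
        self._lock = threading.Lock()
        self._wanted = queue.Queue()
        threading.Thread(target=self._scout, daemon=True).start()

    def _insert(self, i):
        # Load frame i (if absent), mark it most recently used, and
        # evict least-recently-used frames beyond the working set.
        with self._lock:
            if i not in self._frames:
                self._frames[i] = load_frame(i)
            self._frames.move_to_end(i)
            while len(self._frames) > WORKING_SET:
                self._frames.popitem(last=False)
            return self._frames[i]

    def _scout(self):
        # Prefetch thread: pull requested lookahead indices and load them.
        while True:
            i = self._wanted.get()
            if 0 <= i < N_FRAMES:
                self._insert(i)

    def get(self, i, lookahead=4):
        # Tell the scout which frames we will probably want next, then
        # return frame i (loaded synchronously only on a cache miss).
        for j in range(i + 1, min(i + 1 + lookahead, N_FRAMES)):
            self._wanted.put(j)
        return self._insert(i)

cache = FrameCache()
playback = [cache.get(i) for i in range(N_FRAMES)]
```
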

Cheers,
  John

On Fri, Feb 15, 2008 at 02:48:45PM -0800, Dong Xu wrote:
> Hi,
>
> We have large trajectory (dcd) files (>60GB) and are planning to
> purchase a visualization computer that comes with 32-64GB RAM. Before
> making this purchase, our concern is whether memory size is the only
> limiting factor for loading large trajectory files to VMD? We'd like
> to know if there are other hardware/software requirements in play
> so that we can come to the correct purchasing specifications.
>
> Thanks,
>
> Dong Xu

-- 
NIH Resource for Macromolecular Modeling and Bioinformatics
Beckman Institute for Advanced Science and Technology
University of Illinois, 405 N. Mathews Ave, Urbana, IL 61801
Email: johns_at_ks.uiuc.edu                 Phone: 217-244-3349
  WWW: http://www.ks.uiuc.edu/~johns/      Fax: 217-244-6078