Re: performance loss on GPUs

From: Axel Kohlmeyer (akohlmey_at_gmail.com)
Date: Wed Nov 28 2012 - 02:21:14 CST

On Wed, Nov 28, 2012 at 8:13 AM, Thomas Evangelidis <tevang3_at_gmail.com> wrote:
> Greetings,
>
> Is it normal to see a performance loss on GPUs after a few hours of
> simulation? I am running an implicit solvent simulation on 2 Tesla M2070 GPUs and
> 2 CPUs. At the beginning I got 0.0126392 s/step, but after 9 hours I get
> 0.0248662 s/step. Is this normal?

There are two possible explanations:

- The system becomes more disordered
  over time, so memory accesses
  become more scattered and caching
  is less efficient. Many MD programs now
  have an option to regularly sort atoms
  according to their domains (or patches,
  in NAMD lingo). The most extreme effect
  has been reported for HOOMD, which runs
  on a single GPU and sorts atoms along
  a space-filling Hilbert curve. This is
  more likely for a dense system, i.e.
  one with explicit solvent (see the
  sorting sketch after this list).

- The average number of neighbors is changing,
  e.g. when the system shrinks or aggregates.
  More neighbors within the cutoff means more
  pair work to compute and hence slower execution.
  This is more likely for an implicit solvent
  system, but you are in the best position to
  tell (see the density check after this list).

ciao,
    axel.

>
> thanks,
> Thomas
>
>
> --
>
> ======================================================================
>
> Thomas Evangelidis
>
> PhD student
>
> University of Athens
> Faculty of Pharmacy
> Department of Pharmaceutical Chemistry
> Panepistimioupoli-Zografou
> 157 71 Athens
> GREECE
>
> email: tevang_at_pharm.uoa.gr
>
> tevang3_at_gmail.com
>
>
> website: https://sites.google.com/site/thomasevangelidishomepage/
>
>
>

--
Dr. Axel Kohlmeyer  akohlmey_at_gmail.com  http://goo.gl/1wk0
International Centre for Theoretical Physics, Trieste, Italy.
