Please read: fatal error NAMD 2.7b1 for Win32: Memory allocation failed on processor > 0

From: Mariana Graterol (marianagraterol_at_gmail.com)
Date: Sun Aug 26 2012 - 14:07:17 CDT

Hi NAMD support team:

I work with NAMD 2.7b1 for Win32 on a personal computer with an AMD
FX(tm)-6100 six-core processor and 8 GB of RAM.

I was doing an equilibration run of my protein system, with PBC, in the NPT
ensemble. After the pressure, total energy, temperature, etc. had reached
constant values, I wanted to switch to an NVT ensemble, but I cannot start
the calculation from the restart files: it aborts with a fatal "memory
allocation failed" error on processors other than 0 (the log is copied at
the end of this post).

I checked the output files of my NPT run. Since the volume is not held
constant, the cell size changes as expected, but it grows from approximately
120 x 120 x 120 Å to about 1100 x 1100 x 1100 Å, which is enormous for the
PME calculation, as reflected in the size of the grid that NAMD has to build
for it.
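
Doing a rough back-of-the-envelope estimate (I am assuming about 8 bytes per
PME grid point, which may not be exactly how NAMD stores the grid, and it
keeps extra work arrays on top of this), a single copy of the grid reported
in the log below would already be larger than my 8 GB of RAM:

# rough estimate only: grid dimensions taken from the NAMD log below;
# 8 bytes per grid point is an assumption on my part
set nx 1024
set ny 1440
set nz 1440
set bytes [expr {double($nx) * $ny * $nz * 8}]
puts [format "one PME grid copy: %.1f GiB" [expr {$bytes / pow(1024, 3)}]]
# prints roughly 15.8 GiB, well above 8 GB of RAM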

How can I fix this problem? Or how can I restart the calculation from my new
conditions (constant pressure reached, but the cell volume no longer constant)?
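
In case it makes my question clearer, this is roughly the kind of NVT restart
section I am trying to write, with the cell set by hand instead of read from
the .xsc of the NPT run (the file names and cell values below are placeholders,
not my real ones):

# placeholders only: my real file names and cell values differ
structure          nombre.psf
coordinates        nombre.pdb
# restart coordinates and velocities from the NPT run
bincoordinates     nombre.coor
binvelocities      nombre.vel
# question: should I keep reading the old extended system file,
# which now carries the huge ~1100 A cell?
# extendedSystem   nombre.xsc
cellBasisVector1   120.0   0.0   0.0
cellBasisVector2     0.0 120.0   0.0
cellBasisVector3     0.0   0.0 120.0
cellOrigin           0.0   0.0   0.0
# constant volume: pressure control switched off for NVT
langevinPiston     off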

Thank you to anyone who reads this message...

Mariana

***************************************************************

Info: Based on Charm++/Converse 60100 for multicore-win32

Info: Built Mon Mar 23 10:43:08 CDT 2009 by jcphill on honor

Info: Running on 6 processors.

Info: Charm++/Converse parallel runtime startup completed at 0 s

Info: 0 MB of memory in use based on nothing

***TCL: Suspending until startup complete.

Info: SIMULATION PARAMETERS:

Info: TIMESTEP 1

Info: NUMBER OF STEPS 400000

Info: STEPS PER CYCLE 20

Info: PERIODIC CELL BASIS 1 1015.77 0 0

Info: PERIODIC CELL BASIS 2 0 1432.08 0

Info: PERIODIC CELL BASIS 3 0 0 1354.43

Info: PERIODIC CELL CENTER 1.5e-005 -3.8e-005 -4.6e-005

Info: WRAPPING WATERS AROUND PERIODIC BOUNDARIES ON OUTPUT.

Info: LOAD BALANCE STRATEGY New Load Balancers -- ASB

Info: LDB PERIOD 4000 steps

Info: FIRST LDB TIMESTEP 100

Info: LAST LDB TIMESTEP -1

Info: LDB BACKGROUND SCALING 1

Info: HOM BACKGROUND SCALING 1

Info: PME BACKGROUND SCALING 1

Info: MAX SELF PARTITIONS 20

Info: MAX PAIR PARTITIONS 8

Info: SELF PARTITION ATOMS 154

Info: SELF2 PARTITION ATOMS 154

Info: PAIR PARTITION ATOMS 318

Info: PAIR2 PARTITION ATOMS 637

Info: MIN ATOMS PER PATCH 100

***

Info: SWITCHING ON 8

Info: SWITCHING OFF 12

Info: PAIRLIST DISTANCE 12

Info: PAIRLIST SHRINK RATE 0.01

Info: PAIRLIST GROW RATE 0.01

Info: PAIRLIST TRIGGER 0.3

Info: PAIRLISTS PER CYCLE 2

Info: PAIRLISTS ENABLED

Info: MARGIN 0

Info: HYDROGEN GROUP CUTOFF 2.5

Info: PATCH DIMENSION 14.5

Info: ENERGY OUTPUT STEPS 1000

Info: CROSSTERM ENERGY INCLUDED IN DIHEDRAL

Info: TIMING OUTPUT STEPS 10000

Info: HARMONIC CONSTRAINTS ACTIVE

Info: HARMONIC CONS EXP 2

Info: INTERACTIVE MD ACTIVE

Info: INTERACTIVE MD PORT 3000

Info: INTERACTIVE MD FREQ 20

Info: LANGEVIN DYNAMICS ACTIVE

Info: LANGEVIN TEMPERATURE 300

Info: LANGEVIN DAMPING COEFFICIENT IS 5 INVERSE PS

Info: LANGEVIN DYNAMICS NOT APPLIED TO HYDROGENS

Info: PARTICLE MESH EWALD (PME) ACTIVE

Info: PME TOLERANCE 1e-006

Info: PME EWALD COEFFICIENT 0.257952

Info: PME INTERPOLATION ORDER 4

Info: PME GRID DIMENSIONS 1024 1440 1440

Info: PME MAXIMUM GRID SPACING 1

Info: Attempting to read FFTW data from FFTW_NAMD_2.7b1_Win32.txt

Info: Optimizing 6 FFT steps. 1... 2... 3... 4... 5... 6... Done.

Info: Writing FFTW data to FFTW_NAMD_2.7b1_Win32.txt

Info: FULL ELECTROSTATIC EVALUATION FREQUENCY 4

Info: USING VERLET I (r-RESPA) MTS SCHEME.

Info: C1 SPLITTING OF LONG RANGE ELECTROSTATICS

Info: PLACING ATOMS IN PATCHES BY HYDROGEN GROUPS

Info: RANDOM NUMBER SEED 12345

Info: USE HYDROGEN BONDS? NO

***

Info: SUMMARY OF PARAMETERS:

Info: 1051 BONDS

Info: 4379 ANGLES

Info: 612 DIHEDRAL

Info: 399 IMPROPER

Info: 0 CROSSTERM

Info: 108 VDW

Info: 14 VDW_PAIRS

Warning: VDW TYPE NAME NP MATCHES PARAMETER TYPE NAME N*

Warning: VDW TYPE NAME NC MATCHES PARAMETER TYPE NAME N*

Warning: VDW TYPE NAME ST MATCHES PARAMETER TYPE NAME S*

Warning: Residue 1 out of order in segment MOL, lookup for additional
residues in this segment disabled.

***

Info: TIME FOR READING PSF FILE: 2.698

Warning: Man, tiny Elvis, that number is huge!

Warning: We don't know how X-Plor represents over Z999 residues

Warning: And you just tried - so we'll fake it as -55000

Warning: This is reversible, but only inside this program.

Info: TIME FOR READING PDB FILE: 0.437

Info:

Info: Reading from binary file C:\nombre.coor

Info: ****************************

Info: STRUCTURE SUMMARY:

Info: 127011 ATOMS

Info: 89678 BONDS

Info: 64131 ANGLES

Info: 38854 DIHEDRALS

Info: 7038 IMPROPERS

Info: 0 CROSSTERMS

Info: 0 EXCLUSIONS

Info: 4742 CONSTRAINTS

Info: 381033 DEGREES OF FREEDOM

Info: 44771 HYDROGEN GROUPS

Info: TOTAL MASS = 779157 amu

Info: TOTAL CHARGE = -0.000231652 e

Info: *****************************

Info:

Info: Entering startup at 483.71 s, 0 MB of memory in use

Info: Startup phase 0 took 0 s, 0 MB of memory in use

Info: Startup phase 1 took 0.171 s, 0 MB of memory in use

Info: Startup phase 2 took 0 s, 0 MB of memory in use

Info: PATCH GRID IS 8 (PERIODIC) BY 12 (PERIODIC) BY 11 (PERIODIC)

Info: PATCH GRID IS 1-AWAY BY 1-AWAY BY 1-AWAY

Info: nombre.vel

Info: REMOVING COM VELOCITY 0.016296 -0.00786343 -0.00547567

Info: LARGEST PATCH (523) HAS 5267 ATOMS

Info: CREATING 21412 COMPUTE OBJECTS

Info: Startup phase 3 took 0.0940001 s, 0 MB of memory in use

Info: PME using 6 and 6 processors for FFT and reciprocal sum.

Info: PME GRID LOCATIONS: 0 1 2 3 4 5

Info: PME TRANS LOCATIONS: 0 1 2 3 4 5

FATAL ERROR: Memory allocation failed on processor 4.

This application has requested the Runtime to terminate it in an unusual
way.

Please contact the application's support team for more information.

FATAL ERROR: Memory allocation failed on processor 5.

Program finished.

FATAL ERROR: Memory allocation failed on processor 2.

------------- Processor 4 Exiting: Called CmiAbort ------------

Reason: FATAL ERROR: Memory allocation failed on processor 4.

-- 
* mari *
