Simulation error in MPI parallel but not in one processor
#37880 05/13/20 11:30 AM
Joined: Nov 2019
Posts: 35
Forum Member
Hello CHARMM users,

An error, "In parallel lonepairs must be within group boundary!", occurred when running a Drude simulation. The error appeared immediately after the DYNA VV2 command. I think it is an installation problem rather than a setup problem, because if I run on one processor on a cluster, or run with an executable built without the M option on my desktop, then it is error free. I have tried both ./install.com gnu M and ./install.com em64t M xxlarge, and it makes no difference. Could this be an MPI issue? The MPI modules I have are mpi/openmpi/2.0.0/intel and mpi/openmpi/2.0.0/gnu. Compiling is error free, by the way. What could be a possible solution?

I am copying part of my input here for reference.

crystal define Monoclinic 21.678 26.698 9.908 90 91.31 90
crystal build noper 0 cutoff 30
image BYSEgments xcen 0.0 ycen 0.0 zcen 0.0 select all end
update inbfrq -1 imgfrq -1 ihbfrq 0 -
ewald pmewald kappa 0.34 fftx 32 ffty 32 fftz 32 order 6 lrc -
@eatom @etrun @vatom @vtrun cutnb @cutnb ctonnb @ctonnb ctofnb @ctofnb -
cutim @cutim
ener

! minimization
cons harm force 100000. sele .not. type D* end
!mini ABNR nstep 100 nprint 20
mini SD nstep 500 nprint 20

cons harm force 0.0 sele all end
mini SD nstep 1000 nprint 20
mini ABNR nstep 1000 nprint 20


SHAKE bonh param tolerance 1.0e-9 -
nofast -
select ( .not. (type D*)) end -
select ( .not. (type D*)) end noreset

drudehardwall l_wall 0.2


! Set up temperature control for the NPT simulation
bomlev -1
TPCONTROL NTHER 2 CMDAMP 10.0 NSTEP 20 -
THER 1 TAU 0.1 TREF 50 SELECT .NOT. TYPE D* END -
THER 2 TAU 0.005 TREF 1.00 SELECT TYPE D* END -
BARO BTAU 0.2 PREF 1.00 DSCY FULL

! open files for restart, trajectory and energies
open write FORMatted unit 93 name lactose_8cell_heating.50.res
open write unform unit 92 name lactose__8cell_heating.50.dcd
! ihtfrq 10000 : the step frequency for heating the molecule in increments of TEMINC degrees in the heating portion of a dynamics run.

DYNAMICS vv2 start nstep 5000 timestp 0.0005 -
ntrfrq 2000 iprfrq -1 -
nprint 1000 iseed @seed -
iasvel 1 firstt 50 finalt 50 -
inbfrq -1 imgfrq -1 ihbfrq 0 ilbfrq 0 -
iunread -1 -
iunwrite 93 KUNIT 30 -
iuncrd 92 nsavcrd 2000

Re: Simulation error in MPI parallel but not in one processor
Allen_123 #37881 05/13/20 05:42 PM
rmv
Forum Member
Joined: Sep 2003
Posts: 8,485
What is the CHARMM version? One possible solution might be an upgrade; the latest release is c44b2.

We have seen some issues that depend on the MPI implementation (OpenMPI, MVAPICH2, Intel MPI) and its version number. Trying an alternative MPI library may be useful.

I have limited experience with Drude simulations, and do not have much insight on this issue.


Rick Venable
computational chemist

Re: Simulation error in MPI parallel but not in one processor
Allen_123 #37882 05/14/20 09:24 AM
lennart
Forum Member
Joined: Sep 2003
Posts: 4,784
Likes: 2
In lactose there are lone pairs (and Drudes) on atoms in the linkage between the two residues. These atoms then belong to different groups, and this is not handled well in parallel, probably because the atoms may end up on different processors.

I would suggest that you try making a single RESI LACT in the RTF, and make sure that all atoms required in the definition of a lone pair are in the same group, as defined by the GROUP statements in the RTF. DO NOT MAKE THE WHOLE RESIDUE INTO A SINGLE GROUP. Groups should be no larger than 12 Å across.
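As a rough sketch of what the merged residue might look like in the RTF (all atom names, atom types, charges, and lone-pair geometry below are placeholders for illustration, not real lactose Drude parameters), the point is that the glycosidic oxygen, its lone-pair sites, and the atoms used to define them all sit inside one GROUP:

```
RESI LACT         0.00      ! merged lactose residue (sketch; placeholder values)
...
GROUP                       ! one group spanning the glycosidic linkage
ATOM C1'  CD2O3A  x.xx      ! linkage carbon, first ring (placeholder type/charge)
ATOM O1'  OD30A   x.xx      ! glycosidic oxygen carrying the lone pairs
ATOM LP1A LPD     x.xx      ! lone pair on O1' -- same group as its host atoms
ATOM LP1B LPD     x.xx
ATOM C4   CD2O3A  x.xx      ! linkage carbon, second ring
...
! lone-pair definitions reference only atoms inside the group above
LONEPAIR relative LP1A O1' C1' C4  distance 0.35 angle 110.0 dihe  91.0
LONEPAIR relative LP1B O1' C1' C4  distance 0.35 angle 110.0 dihe 265.0
```

With the linkage atoms collected this way, every atom a LONEPAIR statement needs is inside one group boundary, which is the condition the parallel code checks.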


Lennart Nilsson
Karolinska Institutet
Stockholm, Sweden

Moderated by  lennart, rmv 

Powered by UBB.threads™ PHP Forum Software 7.7.4
(Release build 20200307)