
valdes-tresanco-ms / gmx_mmpbsa

gmx_MMPBSA is a new tool based on AMBER's MMPBSA.py aiming to perform end-state free energy calculations with GROMACS files.

Home Page: https://valdes-tresanco-ms.github.io/gmx_MMPBSA/

License: GNU General Public License v3.0

Languages: Python 99.51%, Shell 0.49%
Topics: gmx, mmpbsa, mmgbsa, gromacs, ambertools, energy-calculations, gmx-mmpbsa, gmx-mmgbsa

gmx_mmpbsa's Introduction

Welcome to gmx_MMPBSA!

gmx_MMPBSA is a new tool based on AMBER's MMPBSA.py aiming to perform end-state free energy calculations with GROMACS files. It works with all GROMACS versions along with AmberTools >= 20.

Please see the documentation at https://valdes-tresanco-ms.github.io/gmx_MMPBSA/
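
For orientation, a minimal sketch of a typical run is shown below. The parameter values in the input file are illustrative assumptions only, and the command line mirrors the form used in the issues further down this page:

# Write a minimal input file (illustrative values, not recommendations)
cat > mmpbsa.in <<'EOF'
&general
  startframe=1, endframe=10, interval=1,
/
&gb
  igb=5, saltcon=0.15,
/
EOF

# Run gmx_MMPBSA with the complex run-input file, an index file, the
# receptor/ligand group numbers, and a PBC-corrected trajectory
gmx_MMPBSA -O -i mmpbsa.in -cs md.tpr -ci index.ndx -cg 1 15 -ct md.xtc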

Cite us

Valdés-Tresanco, M.S., Valdés-Tresanco, M.E., Valiente, P.A. and Moreno, E. gmx_MMPBSA: A New Tool to Perform End-State Free Energy Calculations with GROMACS. Journal of Chemical Theory and Computation, 2021, 17 (10), 6281-6291. https://pubs.acs.org/doi/10.1021/acs.jctc.1c00645

Please also consider citing MMPBSA.py's paper:

Bill R. Miller, T. Dwight McGee, Jason M. Swails, Nadine Homeyer, Holger Gohlke, and Adrian E. Roitberg. MMPBSA.py: An Efficient Program for End-State Free Energy Calculations. Journal of Chemical Theory and Computation, 2012, 8 (9), 3314-3321. https://pubs.acs.org/doi/10.1021/ct300418h

Please visit the Cite gmx_MMPBSA page for more information on how to cite gmx_MMPBSA and the programs/methods implemented in it.


Authors:


Acknowledgements:

  • First of all, to the Amber and GROMACS developers. Without their incredible hard work, gmx_MMPBSA would not exist.
  • Jason Swails (Amber developer and ParmEd principal developer) for his continuous support on ParmEd issues.
  • Dr. Hymavathi Veeravarapu for helping with the introductory video for gmx_MMPBSA.
  • To JetBrains for the Open Source license for their programs.
  • To the Sourcery team for supporting us with the Pro version.
  • To all researchers who help improve gmx_MMPBSA with comments, feedback, and bug reports.

gmx_mmpbsa's People

Contributors

ale94mleon, landersen1, marioernestovaldes, valdes-tresanco-ms

gmx_mmpbsa's Issues

It shows a Warning... will it create some issue?

I hope I have installed it correctly, but it returns the following WARNING:

$:amber.python -m pip install gmx_MMPBSA
Collecting gmx_MMPBSA
Downloading gmx_MMPBSA-1.0.0-py3-none-any.whl (135 kB)
|████████████████████████████████| 135 kB 962 kB/s
Installing collected packages: gmx-MMPBSA
WARNING: The scripts gmx_MMPBSA and gmx_MMPBSA_gui are installed in '/home/debanjan/Amber/amber20/miniconda/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed gmx-MMPBSA-1.0.0

When I call it from the install directory '/home/debanjan/Amber/amber20/miniconda/bin', it returns the following error:
(base) debanjan@debanjan:~/Amber/amber20/miniconda/bin$ gmx
gmx            gmx_d          gmx_MMPBSA     gmx_MMPBSA_gui
(base) debanjan@debanjan:~/Amber/amber20/miniconda/bin$ gmx_MMPBSA_gui
Traceback (most recent call last):
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 24, in
from GMXMMPBSA import main
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/main.py", line 44, in
from GMXMMPBSA.createinput import create_inputs
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/createinput.py", line 43, in
from parmed.amber.mdin import Mdin
ModuleNotFoundError: No module named 'parmed'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "./gmx_MMPBSA_gui", line 5, in
from GMXMMPBSA.app import gmxmmpbsa_gui
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 30, in
raise ImportError('Could not import Amber Python modules. Please make sure '
ImportError: Could not import Amber Python modules. Please make sure you have sourced /home/debanjan/Amber/amber20/amber.sh (if you are using sh/ksh/bash/zsh) or /home/debanjan/Amber/amber20/amber.csh (if you are using csh/tcsh)
(base) debanjan@debanjan:~/Amber/amber20/miniconda/bin$ gmx_MMPBSA
Traceback (most recent call last):
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 24, in
from GMXMMPBSA import main
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/main.py", line 44, in
from GMXMMPBSA.createinput import create_inputs
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/createinput.py", line 43, in
from parmed.amber.mdin import Mdin
ModuleNotFoundError: No module named 'parmed'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "./gmx_MMPBSA", line 5, in
from GMXMMPBSA.app import gmxmmpbsa
File "/home/debanjan/Amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 30, in
raise ImportError('Could not import Amber Python modules. Please make sure '
ImportError: Could not import Amber Python modules. Please make sure you have sourced /home/debanjan/Amber/amber20/amber.sh (if you are using sh/ksh/bash/zsh) or /home/debanjan/Amber/amber20/amber.csh (if you are using csh/tcsh)
(base) debanjan@debanjan:~/Amber/amber20/miniconda/bin$

However, parmed is installed:
(base) debanjan@debanjan:~/Amber/amber20/miniconda/bin$ parm
parmcal   parmchk2  parmed
(base) debanjan@debanjan:~/Amber/amber20/miniconda/bin$ whereis parmed
parmed: /usr/local/bin/parmed /home/debanjan/Amber/amber20/bin/parmed
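
A minimal sketch of the environment setup that both the error message and the pip warning above point to, assuming the Amber paths from this log:

# Source the Amber environment so its Python packages (ParmEd, etc.) can be imported
source /home/debanjan/Amber/amber20/amber.sh
# Put the directory where pip placed the scripts on PATH
export PATH=/home/debanjan/Amber/amber20/miniconda/bin:$PATH
gmx_MMPBSA --help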

trjconv failed when querying md.tpr

Hi,
I am trying to calculate the MM/PBSA binding free energy between a protein and DNA. I use the following command:
gmx_MMPBSA -O -i mmpbsa.in -cs md.tpr -ci index.ndx -cg 1 15 -ct md.xtc
However, it gives me this error, and the job stops after a few minutes:
MMPBSA_Error: /cvmfs/soft.computecanada.ca/easybuild/software/2020/avx2/MPI/gcc9/openmpi4/gromacs/2020.4/bin/gmx trjconv failed when querying md.tpr
Exiting. All files have been retained.
I used GROMACS 2020, and the PBC was removed from the trajectory.
Any help would be appreciated.
Thanks!
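
A minimal sketch of two sanity checks that may help localize this kind of failure, assuming the file names from the command above (the actual cause is not established here):

# Confirm that the gmx binary found by gmx_MMPBSA can read the run-input file
gmx dump -s md.tpr | head
# If in doubt, regenerate a PBC-corrected trajectory (output group choices are assumptions)
gmx trjconv -s md.tpr -f md.xtc -o md_noPBC.xtc -pbc mol -ur compact -center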

leap failed when querying _GMXMMPBSA_leap.

PLEASE PAY ATTENTION!
It worked well with v1.2.0, until I updated gmx_MMPBSA with:
pip install gmx-MMPBSA --upgrade

protein_forcefield="oldff/leaprc.ff99SBildn"
gmx_MMPBSA v1.3.1 based on MMPBSA version 16.0 and AmberTools 20
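
For reference, a minimal sketch of rolling back to the release that worked, assuming the same Python environment used for the upgrade:

pip install gmx-MMPBSA==1.2.0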

gmx_MMPBSA.log:

[INFO   ] Started
[INFO   ] Loading and checking parameter files for compatibility...

[INFO   ] Checking external programs...
[INFO   ] cpptraj found! Using /home/zhangxin/anaconda3/envs/Bio/bin/cpptraj
[INFO   ] tleap found! Using /home/zhangxin/anaconda3/envs/Bio/bin/tleap
[INFO   ] parmchk2 found! Using /home/zhangxin/anaconda3/envs/Bio/bin/parmchk2
[INFO   ] mmpbsa_py_energy found! Using /home/zhangxin/anaconda3/envs/Bio/bin/mmpbsa_py_energy
[INFO   ] Using GROMACS version > 5.x.x!
[INFO   ] gmx found! Using /home/zhangxin/env/gmx2020.4/bin/gmx
[INFO   ] Checking external programs...Done.

[INFO   ] Building AMBER Topologies from GROMACS files...
[INFO   ] Checking if supported force fields exists in Amber data...
[INFO   ] Get PDB files from GROMACS structures files...
[INFO   ] Making gmx_MMPBSA index for complex...
                     :-) GROMACS - gmx make_ndx, 2020.4 (-:

                            GROMACS is written by:
     Emile Apol      Rossen Apostolov      Paul Bauer     Herman J.C. Berendsen
    Par Bjelkmar      Christian Blau   Viacheslav Bolnykh     Kevin Boyd    
 Aldert van Buuren   Rudi van Drunen     Anton Feenstra       Alan Gray     
  Gerrit Groenhof     Anca Hamuraru    Vincent Hindriksen  M. Eric Irrgang  
  Aleksei Iupinov   Christoph Junghans     Joe Jordan     Dimitrios Karkoulis
    Peter Kasson        Jiri Kraus      Carsten Kutzner      Per Larsson    
  Justin A. Lemkul    Viveca Lindahl    Magnus Lundborg     Erik Marklund   
    Pascal Merz     Pieter Meulenhoff    Teemu Murtola       Szilard Pall   
    Sander Pronk      Roland Schulz      Michael Shirts    Alexey Shvetsov  
   Alfons Sijbers     Peter Tieleman      Jon Vincent      Teemu Virolainen 
 Christian Wennberg    Maarten Wolf      Artem Zhmurov   
                           and the project leaders:
        Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel

Copyright (c) 1991-2000, University of Groningen, The Netherlands.
Copyright (c) 2001-2019, The GROMACS development team at
Uppsala University, Stockholm University and
the Royal Institute of Technology, Sweden.
check out http://www.gromacs.org for more information.

GROMACS is free software; you can redistribute it and/or modify it
under the terms of the GNU Lesser General Public License
as published by the Free Software Foundation; either version 2.1
of the License, or (at your option) any later version.

GROMACS:      gmx make_ndx, version 2020.4
Executable:   /home/zhangxin/env/gmx2020.4/bin/gmx
Data prefix:  /home/zhangxin/env/gmx2020.4
Working dir:  /home/zhangxin/ACS/gmx/interaction/Nb6/pbsa
Command line:
  gmx make_ndx -n index.ndx -o _GMXMMPBSA_COM_index.ndx


GROMACS reminds you: "Chance favors the prepared mind." (Louis Pasteur)

Going to read 1 old index file(s)
Counted atom numbers up to 350281 in index file

  0 System              : 350281 atoms
  1 Protein             : 10156 atoms
  2 Protein-H           :  5163 atoms
  3 C-alpha             :   660 atoms
  4 Backbone            :  1980 atoms
  5 MainChain           :  2643 atoms
  6 MainChain+Cb        :  3250 atoms
  7 MainChain+H         :  3272 atoms
  8 SideChain           :  6884 atoms
  9 SideChain-H         :  2520 atoms
 10 Prot-Masses         : 10156 atoms
 11 non-Protein         : 340125 atoms
 12 Water               : 340116 atoms
 13 SOL                 : 340116 atoms
 14 non-Water           : 10165 atoms
 15 Ion                 :     9 atoms
 16 CL                  :     9 atoms
 17 Water_and_ions      : 340125 atoms
 18 receptor            :  2102 atoms
 19 com                 : 12136 atoms
 20 ligand              : 10034 atoms

 nr : group      '!': not  'name' nr name   'splitch' nr    Enter: list groups
 'a': atom       '&': and  'del' nr         'splitres' nr   'l': list residues
 't': atom type  '|': or   'keep' nr        'splitat' nr    'h': help
 'r': residue              'res' nr         'chain' char
 "name": group             'case': case sensitive           'q': save and quit
 'ri': residue index

> 

> 

> 
Copied index group 18 'GMXMMPBSA_REC'
Copied index group 20 'GMXMMPBSA_LIG'
Merged two groups with OR: 2102 10034 -> 12136

 21 GMXMMPBSA_REC_GMXMMPBSA_LIG: 12136 atoms

> 
[INFO   ] Normal Complex: Saving group 18_20 in _GMXMMPBSA_COM_index.ndx file as _GMXMMPBSA_COM.pdb
                     :-) GROMACS - gmx editconf, 2020.4 (-:


GROMACS:      gmx editconf, version 2020.4
Executable:   /home/zhangxin/env/gmx2020.4/bin/gmx
Data prefix:  /home/zhangxin/env/gmx2020.4
Working dir:  /home/zhangxin/ACS/gmx/interaction/Nb6/pbsa
Command line:
  gmx editconf -f md_0_1.tpr -o _GMXMMPBSA_COM.pdb -n _GMXMMPBSA_COM_index.ndx

Reading file md_0_1.tpr, VERSION 2020.4 (single precision)
Reading file md_0_1.tpr, VERSION 2020.4 (single precision)

Select a group for output:
Group     0 (         System) has 350281 elements
Group     1 (        Protein) has 10156 elements
Group     2 (      Protein-H) has  5163 elements
Group     3 (        C-alpha) has   660 elements
Group     4 (       Backbone) has  1980 elements
Group     5 (      MainChain) has  2643 elements
Group     6 (   MainChain+Cb) has  3250 elements
Group     7 (    MainChain+H) has  3272 elements
Group     8 (      SideChain) has  6884 elements
Group     9 (    SideChain-H) has  2520 elements
Group    10 (    Prot-Masses) has 10156 elements
Group    11 (    non-Protein) has 340125 elements
Group    12 (          Water) has 340116 elements
Group    13 (            SOL) has 340116 elements
Group    14 (      non-Water) has 10165 elements
Group    15 (            Ion) has     9 elements
Group    16 (             CL) has     9 elements
Group    17 ( Water_and_ions) has 340125 elements
Group    18 (  GMXMMPBSA_REC) has  2102 elements
Group    19 (            com) has 12136 elements
Group    20 (  GMXMMPBSA_LIG) has 10034 elements
Group    21 (GMXMMPBSA_REC_GMXMMPBSA_LIG) has 12136 elements
Select a group: 
GROMACS reminds you: "Contemplating answers that could break my bonds." (Peter Hammill)

Note that major changes are planned in future for editconf, to improve usability and utility.
Read 350281 atoms
Volume: 3485.93 nm^3, corresponds to roughly 1568600 electrons
Velocities found
Selected 21: 'GMXMMPBSA_REC_GMXMMPBSA_LIG'
[INFO   ] No receptor structure file was defined. Using ST approach...
[INFO   ] Using receptor structure from complex to generate AMBER topology
[INFO   ] Normal Complex: Saving group 18 in _GMXMMPBSA_COM_index.ndx file as _GMXMMPBSA_REC.pdb
                     :-) GROMACS - gmx editconf, 2020.4 (-:


GROMACS:      gmx editconf, version 2020.4
Executable:   /home/zhangxin/env/gmx2020.4/bin/gmx
Data prefix:  /home/zhangxin/env/gmx2020.4
Working dir:  /home/zhangxin/ACS/gmx/interaction/Nb6/pbsa
Command line:
  gmx editconf -f md_0_1.tpr -o _GMXMMPBSA_REC.pdb -n _GMXMMPBSA_COM_index.ndx

Reading file md_0_1.tpr, VERSION 2020.4 (single precision)
Reading file md_0_1.tpr, VERSION 2020.4 (single precision)

Select a group for output:
Group     0 (         System) has 350281 elements
Group     1 (        Protein) has 10156 elements
Group     2 (      Protein-H) has  5163 elements
Group     3 (        C-alpha) has   660 elements
Group     4 (       Backbone) has  1980 elements
Group     5 (      MainChain) has  2643 elements
Group     6 (   MainChain+Cb) has  3250 elements
Group     7 (    MainChain+H) has  3272 elements
Group     8 (      SideChain) has  6884 elements
Group     9 (    SideChain-H) has  2520 elements
Group    10 (    Prot-Masses) has 10156 elements
Group    11 (    non-Protein) has 340125 elements
Group    12 (          Water) has 340116 elements
Group    13 (            SOL) has 340116 elements
Group    14 (      non-Water) has 10165 elements
Group    15 (            Ion) has     9 elements
Group    16 (             CL) has     9 elements
Group    17 ( Water_and_ions) has 340125 elements
Group    18 (  GMXMMPBSA_REC) has  2102 elements
Group    19 (            com) has 12136 elements
Group    20 (  GMXMMPBSA_LIG) has 10034 elements
Group    21 (GMXMMPBSA_REC_GMXMMPBSA_LIG) has 12136 elements
Select a group: 
GROMACS reminds you: "You could give Aristotle a tutorial. And you could thrill him to the core of his being. Such is the privilege of living after Newton, Darwin, Einstein, Planck, Watson, Crick and their colleagues." (Richard Dawkins)

Note that major changes are planned in future for editconf, to improve usability and utility.
Read 350281 atoms
Volume: 3485.93 nm^3, corresponds to roughly 1568600 electrons
Velocities found
Selected 18: 'GMXMMPBSA_REC'
[INFO   ] No ligand structure file was defined. Using ST approach...
[INFO   ] Using ligand structure from complex to generate AMBER topology
[INFO   ] Normal ligand: Saving group 20 in _GMXMMPBSA_COM_index.ndx file as _GMXMMPBSA_LIG.pdb
                     :-) GROMACS - gmx editconf, 2020.4 (-:


GROMACS:      gmx editconf, version 2020.4
Executable:   /home/zhangxin/env/gmx2020.4/bin/gmx
Data prefix:  /home/zhangxin/env/gmx2020.4
Working dir:  /home/zhangxin/ACS/gmx/interaction/Nb6/pbsa
Command line:
  gmx editconf -f md_0_1.tpr -o _GMXMMPBSA_LIG.pdb -n _GMXMMPBSA_COM_index.ndx

Reading file md_0_1.tpr, VERSION 2020.4 (single precision)
Reading file md_0_1.tpr, VERSION 2020.4 (single precision)

Select a group for output:
Group     0 (         System) has 350281 elements
Group     1 (        Protein) has 10156 elements
Group     2 (      Protein-H) has  5163 elements
Group     3 (        C-alpha) has   660 elements
Group     4 (       Backbone) has  1980 elements
Group     5 (      MainChain) has  2643 elements
Group     6 (   MainChain+Cb) has  3250 elements
Group     7 (    MainChain+H) has  3272 elements
Group     8 (      SideChain) has  6884 elements
Group     9 (    SideChain-H) has  2520 elements
Group    10 (    Prot-Masses) has 10156 elements
Group    11 (    non-Protein) has 340125 elements
Group    12 (          Water) has 340116 elements
Group    13 (            SOL) has 340116 elements
Group    14 (      non-Water) has 10165 elements
Group    15 (            Ion) has     9 elements
Group    16 (             CL) has     9 elements
Group    17 ( Water_and_ions) has 340125 elements
Group    18 (  GMXMMPBSA_REC) has  2102 elements
Group    19 (            com) has 12136 elements
Group    20 (  GMXMMPBSA_LIG) has 10034 elements
Group    21 (GMXMMPBSA_REC_GMXMMPBSA_LIG) has 12136 elements
Select a group: 
GROMACS reminds you: "Still I had a lurking question. Would it not be better if one could really 'see' whether molecules as complicated as the sterols, or strychnine were just as experiment suggested?" (Dorothy Hodgkin)

Note that major changes are planned in future for editconf, to improve usability and utility.
Read 350281 atoms
Volume: 3485.93 nm^3, corresponds to roughly 1568600 electrons
Velocities found
Selected 20: 'GMXMMPBSA_LIG'
[INFO   ] Generating AMBER Compatible PDB Files...
[INFO   ] Building Tleap input files...
-I: Adding /home/zhangxin/anaconda3/envs/Bio/dat/leap/prep to search path.
-I: Adding /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib to search path.
-I: Adding /home/zhangxin/anaconda3/envs/Bio/dat/leap/parm to search path.
-I: Adding /home/zhangxin/anaconda3/envs/Bio/dat/leap/cmd to search path.
-f: Source _GMXMMPBSA_leap.in.

Welcome to LEaP!
(no leaprc in search path)
Sourcing: ./_GMXMMPBSA_leap.in
----- Source: /home/zhangxin/anaconda3/envs/Bio/dat/leap/cmd/oldff/leaprc.ff99SBildn
----- Source of /home/zhangxin/anaconda3/envs/Bio/dat/leap/cmd/oldff/leaprc.ff99SBildn done
Log file: ./leap.log
Loading parameters: /home/zhangxin/anaconda3/envs/Bio/dat/leap/parm/parm99.dat
Reading title:
PARM99 for DNA,RNA,AA, organic molecules, TIP3P wat. Polariz.& LP incl.02/04/99
Loading parameters: /home/zhangxin/anaconda3/envs/Bio/dat/leap/parm/frcmod.ff99SBildn
Reading force field modification type file (frcmod)
Reading title:
Modification/update of parm99.dat (Hornak & Simmerling) + ILDN corrections
Loading library: /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib/all_nucleic94.lib
Loading library: /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib/all_amino94ildn.lib
Loading library: /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib/all_aminoct94ildn.lib
Loading library: /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib/all_aminont94ildn.lib
Loading library: /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib/ions94.lib
Loading library: /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib/solvents.lib
----- Source: /home/zhangxin/anaconda3/envs/Bio/dat/leap/cmd/leaprc.gaff
----- Source of /home/zhangxin/anaconda3/envs/Bio/dat/leap/cmd/leaprc.gaff done
Log file: ./leap.log
Loading parameters: /home/zhangxin/anaconda3/envs/Bio/dat/leap/parm/gaff.dat
Reading title:
AMBER General Force Field for organic molecules (Version 1.81, May 2017)
Loading library: /home/zhangxin/anaconda3/envs/Bio/dat/leap/lib/atomic_ions.lib
Loading parameters: /home/zhangxin/anaconda3/envs/Bio/dat/leap/parm/frcmod.ions234lm_126_tip3p
Reading force field modification type file (frcmod)
Reading title:
Li/Merz ion parameters of divalent to tetravalent ions for TIP3P water model (12-6 normal usage set)
Using H(N)-modified Bondi radii
Loading PDB file: ./_GMXMMPBSA_REC_F1.pdb
  Added missing heavy atom: .R<CGLN 273>.A<OXT 18>
  total atoms in file: 16
  Leap added 20 missing atoms according to residue templates:
       1 Heavy
       19 H / lone pairs
Loading PDB file: ./_GMXMMPBSA_REC_F2.pdb
  Added missing heavy atom: .R<CLEU 274>.A<OXT 20>
  total atoms in file: 8
  Leap added 12 missing atoms according to residue templates:
       1 Heavy
       11 H / lone pairs
Loading PDB file: ./_GMXMMPBSA_REC_F3.pdb
  Added missing heavy atom: .R<CSER 277>.A<OXT 12>
  total atoms in file: 6
  Leap added 6 missing atoms according to residue templates:
       1 Heavy
       5 H / lone pairs
Loading PDB file: ./_GMXMMPBSA_REC_F4.pdb
  Added missing heavy atom: .R<CGLN 283>.A<OXT 18>
  total atoms in file: 9
  Leap added 9 missing atoms according to residue templates:
       1 Heavy
       8 H / lone pairs
Loading PDB file: ./_GMXMMPBSA_REC_F5.pdb
  Added missing heavy atom: .R<CSER 295>.A<OXT 12>
  total atoms in file: 6
  Leap added 6 missing atoms according to residue templates:
       1 Heavy
       5 H / lone pairs
Loading PDB file: ./_GMXMMPBSA_REC_F6.pdb
  Added missing heavy atom: .R<CGLY 320>.A<OXT 8>
  total atoms in file: 9
  Leap added 11 missing atoms according to residue templates:
       1 Heavy
       10 H / lone pairs
Loading PDB file: ./_GMXMMPBSA_REC_F7.pdb
  Added missing heavy atom: .R<CALA 368>.A<OXT 11>
  total atoms in file: 5
  Leap added 6 missing atoms according to residue templates:
       1 Heavy
       5 H / lone pairs
Loading PDB file: ./_GMXMMPBSA_REC_F8.pdb

/home/zhangxin/anaconda3/envs/Bio/bin/teLeap: Warning!
Unknown residue: SOL   number: 0   type: Terminal/last
..relaxing end constraints to try for a dbase match

/home/zhangxin/anaconda3/envs/Bio/bin/teLeap: Warning!
  -no luck
Creating new UNIT for residue: SOL sequence: 347
Created a new atom named: OW within residue: .R<SOL 347>
  total atoms in file: 1
  The file contained 1 atoms not in residue templates
Loading PDB file: ./_GMXMMPBSA_REC_F9.pdb

/home/zhangxin/anaconda3/envs/Bio/bin/teLeap: Fatal Error!
	No atoms!

Exiting LEaP: Errors = 1; Warnings = 2; Notes = 0.
[ERROR  ] /home/zhangxin/anaconda3/envs/Bio/bin/tleap failed when querying _GMXMMPBSA_leap.in
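
The tleap failure above appears while loading receptor fragments that contain a stray solvent (SOL) atom and an empty fragment. Below is a minimal sketch of rebuilding the receptor group so it contains only the intended molecules; the group numbers and names are assumptions taken from this log and must be adapted:

gmx make_ndx -f md_0_1.tpr -n index.ndx -o index.ndx
# at the interactive make_ndx prompt:
#   18 & 1                 (keep only Protein atoms of the current receptor group)
#   name 22 receptor_clean
#   q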

unable to install

I installed AmberTools and it is working nicely. During installation of the gmx_MMPBSA tool I encountered the following error:

(base) kimlab@kimlab:~/Desktop$ amber.python -m pip install gmx_MMPBSA
WARNING: Value for scheme.headers does not match. Please report this to pypa/pip#9617
distutils: /home/kimlab/Downloads/amber20/miniconda/include/python3.8/UNKNOWN
sysconfig: /home/kimlab/Downloads/amber20/miniconda/include/python3.8
WARNING: Additional context:
user = False
home = None
root = None
prefix = None
Collecting gmx_MMPBSA
Using cached gmx_MMPBSA-1.4.1-py3-none-any.whl (566 kB)
Collecting mpi4py>=3.0.3
Using cached mpi4py-3.0.3.tar.gz (1.4 MB)
Requirement already satisfied: seaborn>=0.11.1 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from gmx_MMPBSA) (0.11.1)
Requirement already satisfied: pandas>=1.2.2 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from gmx_MMPBSA) (1.2.4)
Requirement already satisfied: pytz>=2017.3 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from pandas>=1.2.2->gmx_MMPBSA) (2021.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from pandas>=1.2.2->gmx_MMPBSA) (2.8.1)
Requirement already satisfied: numpy>=1.16.5 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from pandas>=1.2.2->gmx_MMPBSA) (1.20.1)
Requirement already satisfied: six>=1.5 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas>=1.2.2->gmx_MMPBSA) (1.15.0)
Requirement already satisfied: scipy>=1.0 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from seaborn>=0.11.1->gmx_MMPBSA) (1.6.2)
Requirement already satisfied: matplotlib>=2.2 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from seaborn>=0.11.1->gmx_MMPBSA) (3.4.1)
Requirement already satisfied: pillow>=6.2.0 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from matplotlib>=2.2->seaborn>=0.11.1->gmx_MMPBSA) (8.2.0)
Requirement already satisfied: pyparsing>=2.2.1 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from matplotlib>=2.2->seaborn>=0.11.1->gmx_MMPBSA) (2.4.7)
Requirement already satisfied: kiwisolver>=1.0.1 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from matplotlib>=2.2->seaborn>=0.11.1->gmx_MMPBSA) (1.3.1)
Requirement already satisfied: cycler>=0.10 in /home/kimlab/Downloads/amber20/miniconda/lib/python3.8/site-packages (from matplotlib>=2.2->seaborn>=0.11.1->gmx_MMPBSA) (0.10.0)
Building wheels for collected packages: mpi4py
Building wheel for mpi4py (setup.py) ... error
ERROR: Command errored out with exit status 1:
command: /home/kimlab/Downloads/amber20/bin/amber.python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/setup.py'"'"'; file='"'"'/tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-fhpwgooj
cwd: /tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/
Complete output (126 lines):
running bdist_wheel
running build
running build_src
running build_py
creating build
creating build/lib.linux-x86_64-3.8
creating build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/__init__.py -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/run.py -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/bench.py -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/__main__.py -> build/lib.linux-x86_64-3.8/mpi4py
creating build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/_base.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/__init__.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/server.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/_lib.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/aplus.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/pool.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/__main__.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/__init__.pxd -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/libmpi.pxd -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/MPI.pxd -> build/lib.linux-x86_64-3.8/mpi4py
creating build/lib.linux-x86_64-3.8/mpi4py/include
creating build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.MPI_api.h -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.h -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.MPI.h -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.i -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi.pxi -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
running build_clib
MPI configuration: [mpi] from 'mpi.cfg'
checking for library 'lmpe' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -llmpe -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -llmpe
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'mpe' dylib library
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/src
creating build/temp.linux-x86_64-3.8/src/lib-pmpi
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/mpe.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/mpe.o
creating build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/mpe.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libmpe.so
checking for library 'vt-mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt-mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt-mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
checking for library 'vt.mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt.mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt.mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'vt' dylib library
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/vt.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/vt.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/vt.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libvt.so
checking for library 'vt-mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt-mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt-mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
checking for library 'vt.mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt.mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt.mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'vt-mpi' dylib library
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/vt-mpi.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-mpi.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-mpi.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libvt-mpi.so
checking for library 'vt-hyb' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt-hyb -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt-hyb
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
checking for library 'vt.ompi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt.ompi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt.ompi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'vt-hyb' dylib library
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/vt-hyb.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-hyb.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-hyb.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libvt-hyb.so
running build_ext
MPI configuration: [mpi] from 'mpi.cfg'
checking for dlopen() availability ...
checking for header 'dlfcn.h' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
success!
removing: _configtest.c _configtest.o
success!
checking for library 'dl' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -Lbuild/temp.linux-x86_64-3.8 -ldl -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
checking for function 'dlopen' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -Lbuild/temp.linux-x86_64-3.8 -ldl -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
building 'mpi4py.dl' extension
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -DHAVE_DLFCN_H=1 -DHAVE_DLOPEN=1 -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c src/dynload.c -o build/temp.linux-x86_64-3.8/src/dynload.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-3.8/src/dynload.o -Lbuild/temp.linux-x86_64-3.8 -ldl -o build/lib.linux-x86_64-3.8/mpi4py/dl.cpython-38-x86_64-linux-gnu.so
checking for MPI compile and link ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
_configtest.c:2:10: fatal error: mpi.h: No such file or directory
#include <mpi.h>
^~~~~~~
compilation terminated.
failure.
removing: _configtest.c _configtest.o
error: Cannot compile MPI programs. Check your configuration!!!

ERROR: Failed building wheel for mpi4py
Running setup.py clean for mpi4py
Failed to build mpi4py
Installing collected packages: mpi4py, gmx-MMPBSA
Running setup.py install for mpi4py ... error
ERROR: Command errored out with exit status 1:
command: /home/kimlab/Downloads/amber20/bin/amber.python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/setup.py'"'"'; file='"'"'/tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record /tmp/pip-record-l0vfp8zd/install-record.txt --single-version-externally-managed --compile --install-headers /home/kimlab/Downloads/amber20/miniconda/include/python3.8/mpi4py
cwd: /tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/
Complete output (126 lines):
running install
running build
running build_src
running build_py
creating build
creating build/lib.linux-x86_64-3.8
creating build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/__init__.py -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/run.py -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/bench.py -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/__main__.py -> build/lib.linux-x86_64-3.8/mpi4py
creating build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/_base.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/__init__.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/server.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/_lib.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/aplus.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/pool.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/futures/__main__.py -> build/lib.linux-x86_64-3.8/mpi4py/futures
copying src/mpi4py/__init__.pxd -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/libmpi.pxd -> build/lib.linux-x86_64-3.8/mpi4py
copying src/mpi4py/MPI.pxd -> build/lib.linux-x86_64-3.8/mpi4py
creating build/lib.linux-x86_64-3.8/mpi4py/include
creating build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.MPI_api.h -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.h -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.MPI.h -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.i -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi.pxi -> build/lib.linux-x86_64-3.8/mpi4py/include/mpi4py
running build_clib
MPI configuration: [mpi] from 'mpi.cfg'
checking for library 'lmpe' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -llmpe -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -llmpe
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'mpe' dylib library
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/src
creating build/temp.linux-x86_64-3.8/src/lib-pmpi
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/mpe.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/mpe.o
creating build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/mpe.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libmpe.so
checking for library 'vt-mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt-mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt-mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
checking for library 'vt.mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt.mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt.mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'vt' dylib library
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/vt.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/vt.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/vt.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libvt.so
checking for library 'vt-mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt-mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt-mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
checking for library 'vt.mpi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt.mpi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt.mpi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'vt-mpi' dylib library
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/vt-mpi.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-mpi.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-mpi.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libvt-mpi.so
checking for library 'vt-hyb' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt-hyb -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt-hyb
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
checking for library 'vt.ompi' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -lvt.ompi -o _configtest
/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat/ld: cannot find -lvt.ompi
collect2: error: ld returned 1 exit status
failure.
removing: _configtest.c _configtest.o
building 'vt-hyb' dylib library
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -c src/lib-pmpi/vt-hyb.c -o build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-hyb.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ -Wl,--no-as-needed build/temp.linux-x86_64-3.8/src/lib-pmpi/vt-hyb.o -o build/lib.linux-x86_64-3.8/mpi4py/lib-pmpi/libvt-hyb.so
running build_ext
MPI configuration: [mpi] from 'mpi.cfg'
checking for dlopen() availability ...
checking for header 'dlfcn.h' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
success!
removing: _configtest.c _configtest.o
success!
checking for library 'dl' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -Lbuild/temp.linux-x86_64-3.8 -ldl -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
checking for function 'dlopen' ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ _configtest.o -Lbuild/temp.linux-x86_64-3.8 -ldl -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
building 'mpi4py.dl' extension
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -DHAVE_DLFCN_H=1 -DHAVE_DLOPEN=1 -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c src/dynload.c -o build/temp.linux-x86_64-3.8/src/dynload.o
gcc -pthread -shared -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -L/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,-rpath=/home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/lib -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-3.8/src/dynload.o -Lbuild/temp.linux-x86_64-3.8 -ldl -o build/lib.linux-x86_64-3.8/mpi4py/dl.cpython-38-x86_64-linux-gnu.so
checking for MPI compile and link ...
gcc -pthread -B /home/kimlab/Downloads/amber20_src/build/CMakeFiles/miniconda/install/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/kimlab/Downloads/amber20/miniconda/include/python3.8 -c _configtest.c -o _configtest.o
_configtest.c:2:10: fatal error: mpi.h: No such file or directory
#include <mpi.h>
^~~~~~~
compilation terminated.
failure.
removing: _configtest.c _configtest.o
error: Cannot compile MPI programs. Check your configuration!!!
----------------------------------------
ERROR: Command errored out with exit status 1: /home/kimlab/Downloads/amber20/bin/amber.python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/setup.py'"'"'; file='"'"'/tmp/pip-install-30ys9i7_/mpi4py_0b0ebb9a6b924999bb908103e40053b8/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record /tmp/pip-record-l0vfp8zd/install-record.txt --single-version-externally-managed --compile --install-headers /home/kimlab/Downloads/amber20/miniconda/include/python3.8/mpi4py Check the logs for full command output.
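
The wheel build fails at 'fatal error: mpi.h: No such file or directory', i.e. no MPI development headers are visible to the compiler. A minimal sketch of one way to provide them before retrying; the package names are Debian/Ubuntu assumptions:

sudo apt-get install openmpi-bin libopenmpi-dev
amber.python -m pip install mpi4py
amber.python -m pip install gmx_MMPBSA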

# atoms in XTC file does not match # atoms in parm COM.prmtop

I was following the Protein_DNA_RNA_ion_ligand tutorial for the MMPBSA calculation.
My system contains protein, RNA and DNA. I considered the RNA and protein as the receptor and the DNA as the ligand, and made the files accordingly.
Command used:
gmx_MMPBSA MPI -O -i mmgbsa.in -cs sys_4oo8.pdb -ci md_prot_rna.ndx -cg 22 3 -ct md_fit.xtc

Error encountered:
Preparing trajectories for simulation...
Error: # atoms in XTC file (26235) does not match # atoms in parm COM.prmtop (26264)
Error: Could not set up 'COM_traj_0.xtc' for reading.

Could you please guide me on where I am going wrong? Thanks in advance.
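For what it is worth, a quick way to narrow down this kind of mismatch is to count the atoms in the two index groups passed to -cg and compare their sum with the atom count cpptraj reports for the trajectory (26235 here). The helper below is a hedged, stand-alone diagnostic, not part of gmx_MMPBSA; the group name is a placeholder, since the groups on the command line are given by number.

    # Count the atoms listed under a named group in a GROMACS .ndx file.
    # Hypothetical helper for diagnosis only; the group name is a placeholder.
    def count_group_atoms(ndx_file, group_name):
        atoms, current = 0, None
        with open(ndx_file) as handle:
            for line in handle:
                line = line.strip()
                if line.startswith("["):
                    current = line.strip("[] ")
                elif current == group_name and line:
                    atoms += len(line.split())
        return atoms

    print(count_group_atoms("md_prot_rna.ndx", "RNA_Protein"))  # placeholder group name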

unable to execute after installation

Describe the issue
I installed gmx_MMPBSA with amber.python and also installed the other dependencies as described on the program website. All paths are set as instructed.

A clear and concise description of what the issue is.

If you will report an error, please complete this form
Installing collected packages: seaborn, mpi4py, gmx-MMPBSA
WARNING: The scripts gmx_MMPBSA, gmx_MMPBSA_ana and gmx_MMPBSA_test are installed in '/home/ssinha/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed gmx-MMPBSA-1.4.3 mpi4py-3.0.3 seaborn-0.11.1

(base) ssinha@susi:~/Desktop$ gmx_MMPBSA
[INFO ] Starting
[INFO ] Command-line
gmx_MMPBSA

InputError: No input file was provided!
Enter gmx_MMPBSA --help for help
(base) ssinha@susi:~/Desktop$ gmx_MMPBSA
gmx_MMPBSA gmx_MMPBSA_ana gmx_MMPBSA_test
(base) ssinha@susi:~/Desktop$ gmx_MMPBSA
gmx_MMPBSA gmx_MMPBSA_ana gmx_MMPBSA_test
(base) ssinha@susi:~/Desktop$ gmx_MMPBSA_test
[INFO ] Cloning gmx_MMPBSA repository in gmx_MMPBSA_test
Cloning into 'gmx_MMPBSA_test'...
remote: Enumerating objects: 6224, done.
remote: Counting objects: 100% (1581/1581), done.
remote: Compressing objects: 100% (527/527), done.
remote: Total 6224 (delta 1068), reused 1381 (delta 910), pack-reused 4643
Receiving objects: 100% (6224/6224), 255.69 MiB | 7.44 MiB/s, done.
Resolving deltas: 100% (4390/4390), done.
Updating files: 100% (364/364), done.
[INFO ] Cloning gmx_MMPBSA repository...Done.
[INFO ] Example STATE

[INFO ] Protein-Ligand (Single trajectory approximation) RUNNING
[INFO ] Protein-Protein RUNNING
[INFO ] Protein-Ligand (Single trajectory approximation) [ 1/12] DONE
[INFO ] Protein-DNA RUNNING
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
File "/home/ssinha/amber20/miniconda/lib/python3.8/multiprocessing/pool.py", line 125, in worker
result = (True, func(*args, **kwds))
File "/home/ssinha/.local/lib/python3.8/site-packages/GMXMMPBSA/tester.py", line 29, in calculatestar
return run_process(*args)
File "/home/ssinha/.local/lib/python3.8/site-packages/GMXMMPBSA/tester.py", line 33, in run_process
os.chdir(system[0])
FileNotFoundError: [Errno 2] No such file or directory: 'gmx_MMPBSA_test/docs/examples/Protein_protein'
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/ssinha/.local/bin/gmx_MMPBSA_test", line 8, in
sys.exit(gmxmmpbsa_test())
File "/home/ssinha/.local/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 146, in gmxmmpbsa_test
run_test(parser)
File "/home/ssinha/.local/lib/python3.8/site-packages/GMXMMPBSA/tester.py", line 132, in run_test
for x in imap_unordered_it:
File "/home/ssinha/amber20/miniconda/lib/python3.8/multiprocessing/pool.py", line 868, in next
raise value
FileNotFoundError: [Errno 2] No such file or directory: 'gmx_MMPBSA_test/docs/examples/Protein_protein'
To Reproduce
Steps to reproduce the behavior:

  1. Which system type are you running? (Protein-Protein, Protein-Ligand, etc)
  2. The command-line are you using.

Please help.

tleap fail

I reinstalled Amber20 with miniconda3 and Python 3, but I have encountered another problem. Below is the gmx_MMPBSA.log:

[INFO   ] Started
[INFO   ] Loading and checking parameter files for compatibility...

[INFO   ] Checking external programs...
[INFO   ] cpptraj found! Using /opt/AMBER/amber20/bin/cpptraj
[INFO   ] tleap found! Using /opt/AMBER/amber20/bin/tleap
[INFO   ] parmchk2 found! Using /opt/AMBER/amber20/bin/parmchk2
[INFO   ] mmpbsa_py_energy found! Using /opt/AMBER/amber20/bin/mmpbsa_py_energy
[INFO   ] mmpbsa_py_nabnmode found! Using /opt/AMBER/amber20/bin/mmpbsa_py_nabnmode
[INFO   ] Using GROMACS version > 5.x.x!
[INFO   ] gmx_mpi found! Using /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi
[INFO   ] Checking external programs...Done.

[INFO   ] Building AMBER Topologies from GROMACS files...
[INFO   ] Checking if supported force fields exists in Amber data...
[INFO   ] Get PDB files from GROMACS structures files...
[INFO   ] Making gmx_MMPBSA index for complex...
                     :-) GROMACS - gmx make_ndx, 2020.2 (-:

                            GROMACS is written by:
     Emile Apol      Rossen Apostolov      Paul Bauer     Herman J.C. Berendsen
    Par Bjelkmar      Christian Blau   Viacheslav Bolnykh     Kevin Boyd    
 Aldert van Buuren   Rudi van Drunen     Anton Feenstra       Alan Gray     
  Gerrit Groenhof     Anca Hamuraru    Vincent Hindriksen  M. Eric Irrgang  
  Aleksei Iupinov   Christoph Junghans     Joe Jordan     Dimitrios Karkoulis
    Peter Kasson        Jiri Kraus      Carsten Kutzner      Per Larsson    
  Justin A. Lemkul    Viveca Lindahl    Magnus Lundborg     Erik Marklund   
    Pascal Merz     Pieter Meulenhoff    Teemu Murtola       Szilard Pall   
    Sander Pronk      Roland Schulz      Michael Shirts    Alexey Shvetsov  
   Alfons Sijbers     Peter Tieleman      Jon Vincent      Teemu Virolainen 
 Christian Wennberg    Maarten Wolf      Artem Zhmurov   
                           and the project leaders:
        Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel

Copyright (c) 1991-2000, University of Groningen, The Netherlands.
Copyright (c) 2001-2019, The GROMACS development team at
Uppsala University, Stockholm University and
the Royal Institute of Technology, Sweden.
check out http://www.gromacs.org for more information.

GROMACS is free software; you can redistribute it and/or modify it
under the terms of the GNU Lesser General Public License
as published by the Free Software Foundation; either version 2.1
of the License, or (at your option) any later version.

GROMACS:      gmx make_ndx, version 2020.2
Executable:   /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi
Data prefix:  /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2
Working dir:  /home/byun/PROject/CORONAMD/AMBER/CHARMM36-ff/X0072/amber/gmx_MMPBSA
Command line:
  gmx_mpi make_ndx -n index.ndx -o _GMXMMPBSA_COM_index.ndx


GROMACS reminds you: "Inventions have long since reached their limit, and I see no hope for further development." (Julius Sextus Frontinus, 1st century A.D.)

Going to read 1 old index file(s)
Counted atom numbers up to 90160 in index file

  0 System              : 90160 atoms
  1 Protein             :  4645 atoms
  2 Protein-H           :  2348 atoms
  3 C-alpha             :   304 atoms
  4 Backbone            :   912 atoms
  5 MainChain           :  1217 atoms
  6 MainChain+Cb        :  1495 atoms
  7 MainChain+H         :  1507 atoms
  8 SideChain           :  3138 atoms
  9 SideChain-H         :  1131 atoms
 10 Prot-Masses         :  4645 atoms
 11 non-Protein         : 85515 atoms
 12 Other               :    30 atoms
 13 LIG                 :    26 atoms
 14 POT                 :     4 atoms
 15 Water               : 85485 atoms
 16 SOL                 : 85485 atoms
 17 non-Water           :  4675 atoms
 18 Protein_LIG         :  4671 atoms
 19 Water_and_ions      : 85489 atoms

 nr : group      '!': not  'name' nr name   'splitch' nr    Enter: list groups
 'a': atom       '&': and  'del' nr         'splitres' nr   'l': list residues
 't': atom type  '|': or   'keep' nr        'splitat' nr    'h': help
 'r': residue              'res' nr         'chain' char
 "name": group             'case': case sensitive           'q': save and quit
 'ri': residue index

> 

> 

> 
Copied index group 1 'GMXMMPBSA_REC'
Copied index group 13 'GMXMMPBSA_LIG'
Merged two groups with OR: 4645 26 -> 4671

 20 GMXMMPBSA_REC_GMXMMPBSA_LIG:  4671 atoms

> 
[INFO   ] Normal Complex: Saving group 1_13 in _GMXMMPBSA_COM_index.ndx file as _GMXMMPBSA_COM.pdb
                     :-) GROMACS - gmx editconf, 2020.2 (-:


GROMACS:      gmx editconf, version 2020.2
Executable:   /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi
Data prefix:  /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2
Working dir:  /home/byun/PROject/CORONAMD/AMBER/CHARMM36-ff/X0072/amber/gmx_MMPBSA
Command line:
  gmx_mpi editconf -f simulation.tpr -o _GMXMMPBSA_COM.pdb -n _GMXMMPBSA_COM_index.ndx

Reading file simulation.tpr, VERSION 2020.2 (single precision)
Reading file simulation.tpr, VERSION 2020.2 (single precision)

Select a group for output:
Group     0 (         System) has 90160 elements
Group     1 (  GMXMMPBSA_REC) has  4645 elements
Group     2 (      Protein-H) has  2348 elements
Group     3 (        C-alpha) has   304 elements
Group     4 (       Backbone) has   912 elements
Group     5 (      MainChain) has  1217 elements
Group     6 (   MainChain+Cb) has  1495 elements
Group     7 (    MainChain+H) has  1507 elements
Group     8 (      SideChain) has  3138 elements
Group     9 (    SideChain-H) has  1131 elements
Group    10 (    Prot-Masses) has  4645 elements
Group    11 (    non-Protein) has 85515 elements
Group    12 (          Other) has    30 elements
Group    13 (  GMXMMPBSA_LIG) has    26 elements
Group    14 (            POT) has     4 elements
Group    15 (          Water) has 85485 elements
Group    16 (            SOL) has 85485 elements
Group    17 (      non-Water) has  4675 elements
Group    18 (    Protein_LIG) has  4671 elements
Group    19 ( Water_and_ions) has 85489 elements
Group    20 (GMXMMPBSA_REC_GMXMMPBSA_LIG) has  4671 elements
Select a group: 
GROMACS reminds you: "Shake Yourself" (YES)

Note that major changes are planned in future for editconf, to improve usability and utility.
Read 90160 atoms
Volume: 891.722 nm^3, corresponds to roughly 401200 electrons
No velocities found
Selected 20: 'GMXMMPBSA_REC_GMXMMPBSA_LIG'
[INFO   ] No receptor structure file was defined. Using ST approach...
[INFO   ] Using receptor structure from complex to generate AMBER topology
[INFO   ] Normal Complex: Saving group 1 in _GMXMMPBSA_COM_index.ndx file as _GMXMMPBSA_REC.pdb
                     :-) GROMACS - gmx editconf, 2020.2 (-:


GROMACS:      gmx editconf, version 2020.2
Executable:   /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi
Data prefix:  /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2
Working dir:  /home/byun/PROject/CORONAMD/AMBER/CHARMM36-ff/X0072/amber/gmx_MMPBSA
Command line:
  gmx_mpi editconf -f simulation.tpr -o _GMXMMPBSA_REC.pdb -n _GMXMMPBSA_COM_index.ndx

Reading file simulation.tpr, VERSION 2020.2 (single precision)
Reading file simulation.tpr, VERSION 2020.2 (single precision)

Select a group for output:
Group     0 (         System) has 90160 elements
Group     1 (  GMXMMPBSA_REC) has  4645 elements
Group     2 (      Protein-H) has  2348 elements
Group     3 (        C-alpha) has   304 elements
Group     4 (       Backbone) has   912 elements
Group     5 (      MainChain) has  1217 elements
Group     6 (   MainChain+Cb) has  1495 elements
Group     7 (    MainChain+H) has  1507 elements
Group     8 (      SideChain) has  3138 elements
Group     9 (    SideChain-H) has  1131 elements
Group    10 (    Prot-Masses) has  4645 elements
Group    11 (    non-Protein) has 85515 elements
Group    12 (          Other) has    30 elements
Group    13 (  GMXMMPBSA_LIG) has    26 elements
Group    14 (            POT) has     4 elements
Group    15 (          Water) has 85485 elements
Group    16 (            SOL) has 85485 elements
Group    17 (      non-Water) has  4675 elements
Group    18 (    Protein_LIG) has  4671 elements
Group    19 ( Water_and_ions) has 85489 elements
Group    20 (GMXMMPBSA_REC_GMXMMPBSA_LIG) has  4671 elements
Select a group: 
GROMACS reminds you: "The Path Of the Righteous Man is Beset On All Sides With the Iniquities Of the Selfish and the Tyranny Of Evil Men." (Pulp Fiction)

Note that major changes are planned in future for editconf, to improve usability and utility.
Read 90160 atoms
Volume: 891.722 nm^3, corresponds to roughly 401200 electrons
No velocities found
Selected 1: 'GMXMMPBSA_REC'
[INFO   ] No ligand structure file was defined. Using ST approach...
[INFO   ] Using ligand structure from complex to generate AMBER topology
[INFO   ] Normal ligand: Saving group 13 in _GMXMMPBSA_COM_index.ndx file as _GMXMMPBSA_LIG.pdb
                     :-) GROMACS - gmx editconf, 2020.2 (-:


GROMACS:      gmx editconf, version 2020.2
Executable:   /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi
Data prefix:  /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2
Working dir:  /home/byun/PROject/CORONAMD/AMBER/CHARMM36-ff/X0072/amber/gmx_MMPBSA
Command line:
  gmx_mpi editconf -f simulation.tpr -o _GMXMMPBSA_LIG.pdb -n _GMXMMPBSA_COM_index.ndx

Reading file simulation.tpr, VERSION 2020.2 (single precision)
Reading file simulation.tpr, VERSION 2020.2 (single precision)

Select a group for output:
Group     0 (         System) has 90160 elements
Group     1 (  GMXMMPBSA_REC) has  4645 elements
Group     2 (      Protein-H) has  2348 elements
Group     3 (        C-alpha) has   304 elements
Group     4 (       Backbone) has   912 elements
Group     5 (      MainChain) has  1217 elements
Group     6 (   MainChain+Cb) has  1495 elements
Group     7 (    MainChain+H) has  1507 elements
Group     8 (      SideChain) has  3138 elements
Group     9 (    SideChain-H) has  1131 elements
Group    10 (    Prot-Masses) has  4645 elements
Group    11 (    non-Protein) has 85515 elements
Group    12 (          Other) has    30 elements
Group    13 (  GMXMMPBSA_LIG) has    26 elements
Group    14 (            POT) has     4 elements
Group    15 (          Water) has 85485 elements
Group    16 (            SOL) has 85485 elements
Group    17 (      non-Water) has  4675 elements
Group    18 (    Protein_LIG) has  4671 elements
Group    19 ( Water_and_ions) has 85489 elements
Group    20 (GMXMMPBSA_REC_GMXMMPBSA_LIG) has  4671 elements
Select a group: 
GROMACS reminds you: "Martin [Karplus] had a green laser, Arieh [Warshel] had a red laser, I have a *blue* laser" (Michael Levitt, Nobel lecture 2013)

Note that major changes are planned in future for editconf, to improve usability and utility.
Read 90160 atoms
Volume: 891.722 nm^3, corresponds to roughly 401200 electrons
No velocities found
Selected 13: 'GMXMMPBSA_LIG'
[INFO   ] Generating AMBER Compatible PDB Files...
[INFO   ] Building Tleap input files...
-I: Adding /opt/AMBER/amber20/dat/leap/prep to search path.
-I: Adding /opt/AMBER/amber20/dat/leap/lib to search path.
-I: Adding /opt/AMBER/amber20/dat/leap/parm to search path.
-I: Adding /opt/AMBER/amber20/dat/leap/cmd to search path.
-f: Source _GMXMMPBSA_leap.in.

Welcome to LEaP!
(no leaprc in search path)
Sourcing: ./_GMXMMPBSA_leap.in
----- Source: /opt/AMBER/amber20/dat/leap/cmd/oldff/leaprc.ff99SB
----- Source of /opt/AMBER/amber20/dat/leap/cmd/oldff/leaprc.ff99SB done
Log file: ./leap.log
Loading parameters: /opt/AMBER/amber20/dat/leap/parm/parm99.dat
Reading title:
PARM99 for DNA,RNA,AA, organic molecules, TIP3P wat. Polariz.& LP incl.02/04/99
Loading parameters: /opt/AMBER/amber20/dat/leap/parm/frcmod.ff99SB
Reading force field modification type file (frcmod)
Reading title:
Modification/update of parm99.dat (Hornak & Simmerling)
Loading library: /opt/AMBER/amber20/dat/leap/lib/all_nucleic94.lib
Loading library: /opt/AMBER/amber20/dat/leap/lib/all_amino94.lib
Loading library: /opt/AMBER/amber20/dat/leap/lib/all_aminoct94.lib
Loading library: /opt/AMBER/amber20/dat/leap/lib/all_aminont94.lib
Loading library: /opt/AMBER/amber20/dat/leap/lib/ions94.lib
Loading library: /opt/AMBER/amber20/dat/leap/lib/solvents.lib
----- Source: /opt/AMBER/amber20/dat/leap/cmd/leaprc.gaff
----- Source of /opt/AMBER/amber20/dat/leap/cmd/leaprc.gaff done
Log file: ./leap.log
Loading parameters: /opt/AMBER/amber20/dat/leap/parm/gaff.dat
Reading title:
AMBER General Force Field for organic molecules (Version 1.81, May 2017)
Loading library: /opt/AMBER/amber20/dat/leap/lib/atomic_ions.lib
Loading parameters: /opt/AMBER/amber20/dat/leap/parm/frcmod.ions234lm_126_tip3p
Reading force field modification type file (frcmod)
Reading title:
Li/Merz ion parameters of divalent to tetravalent ions for TIP3P water model (12-6 normal usage set)
Using H(N)-modified Bondi radii
Loading PDB file: ./_GMXMMPBSA_REC_F1.pdb

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: HSD   number: 40   type: Nonterminal

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: HSD   number: 63   type: Nonterminal

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: HSD   number: 79   type: Nonterminal

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: HSD   number: 162   type: Nonterminal

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: HSD   number: 163   type: Nonterminal

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: HSD   number: 171   type: Nonterminal

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: HSD   number: 245   type: Nonterminal
Creating new UNIT for residue: HSD sequence: 41

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue () missing connect0 atom.
Created a new atom named: N within residue: .R<HSD 41>
Created a new atom named: CA within residue: .R<HSD 41>
Created a new atom named: CB within residue: .R<HSD 41>
Created a new atom named: ND1 within residue: .R<HSD 41>
Created a new atom named: CG within residue: .R<HSD 41>
Created a new atom named: CE1 within residue: .R<HSD 41>
Created a new atom named: NE2 within residue: .R<HSD 41>
Created a new atom named: CD2 within residue: .R<HSD 41>
Created a new atom named: C within residue: .R<HSD 41>
Created a new atom named: O within residue: .R<HSD 41>

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue (default_name) missing connect1 atom.
Creating new UNIT for residue: HSD sequence: 64

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue () missing connect0 atom.
Created a new atom named: N within residue: .R<HSD 64>
Created a new atom named: CA within residue: .R<HSD 64>
Created a new atom named: CB within residue: .R<HSD 64>
Created a new atom named: ND1 within residue: .R<HSD 64>
Created a new atom named: CG within residue: .R<HSD 64>
Created a new atom named: CE1 within residue: .R<HSD 64>
Created a new atom named: NE2 within residue: .R<HSD 64>
Created a new atom named: CD2 within residue: .R<HSD 64>
Created a new atom named: C within residue: .R<HSD 64>
Created a new atom named: O within residue: .R<HSD 64>

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue (default_name) missing connect1 atom.
Creating new UNIT for residue: HSD sequence: 80

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue () missing connect0 atom.
Created a new atom named: N within residue: .R<HSD 80>
Created a new atom named: CA within residue: .R<HSD 80>
Created a new atom named: CB within residue: .R<HSD 80>
Created a new atom named: ND1 within residue: .R<HSD 80>
Created a new atom named: CG within residue: .R<HSD 80>
Created a new atom named: CE1 within residue: .R<HSD 80>
Created a new atom named: NE2 within residue: .R<HSD 80>
Created a new atom named: CD2 within residue: .R<HSD 80>
Created a new atom named: C within residue: .R<HSD 80>
Created a new atom named: O within residue: .R<HSD 80>

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue (default_name) missing connect1 atom.
Creating new UNIT for residue: HSD sequence: 163

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue () missing connect0 atom.
Created a new atom named: N within residue: .R<HSD 163>
Created a new atom named: CA within residue: .R<HSD 163>
Created a new atom named: CB within residue: .R<HSD 163>
Created a new atom named: ND1 within residue: .R<HSD 163>
Created a new atom named: CG within residue: .R<HSD 163>
Created a new atom named: CE1 within residue: .R<HSD 163>
Created a new atom named: NE2 within residue: .R<HSD 163>
Created a new atom named: CD2 within residue: .R<HSD 163>
Created a new atom named: C within residue: .R<HSD 163>
Created a new atom named: O within residue: .R<HSD 163>
Creating new UNIT for residue: HSD sequence: 164
Created a new atom named: N within residue: .R<HSD 164>
Created a new atom named: CA within residue: .R<HSD 164>
Created a new atom named: CB within residue: .R<HSD 164>
Created a new atom named: ND1 within residue: .R<HSD 164>
Created a new atom named: CG within residue: .R<HSD 164>
Created a new atom named: CE1 within residue: .R<HSD 164>
Created a new atom named: NE2 within residue: .R<HSD 164>
Created a new atom named: CD2 within residue: .R<HSD 164>
Created a new atom named: C within residue: .R<HSD 164>
Created a new atom named: O within residue: .R<HSD 164>

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue (default_name) missing connect1 atom.
Creating new UNIT for residue: HSD sequence: 172

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue () missing connect0 atom.
Created a new atom named: N within residue: .R<HSD 172>
Created a new atom named: CA within residue: .R<HSD 172>
Created a new atom named: CB within residue: .R<HSD 172>
Created a new atom named: ND1 within residue: .R<HSD 172>
Created a new atom named: CG within residue: .R<HSD 172>
Created a new atom named: CE1 within residue: .R<HSD 172>
Created a new atom named: NE2 within residue: .R<HSD 172>
Created a new atom named: CD2 within residue: .R<HSD 172>
Created a new atom named: C within residue: .R<HSD 172>
Created a new atom named: O within residue: .R<HSD 172>

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue (default_name) missing connect1 atom.
Creating new UNIT for residue: HSD sequence: 246

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue () missing connect0 atom.
Created a new atom named: N within residue: .R<HSD 246>
Created a new atom named: CA within residue: .R<HSD 246>
Created a new atom named: CB within residue: .R<HSD 246>
Created a new atom named: ND1 within residue: .R<HSD 246>
Created a new atom named: CG within residue: .R<HSD 246>
Created a new atom named: CE1 within residue: .R<HSD 246>
Created a new atom named: NE2 within residue: .R<HSD 246>
Created a new atom named: CD2 within residue: .R<HSD 246>
Created a new atom named: C within residue: .R<HSD 246>
Created a new atom named: O within residue: .R<HSD 246>

/opt/AMBER/amber20/bin/teLeap: Warning!
One sided connection. Residue (default_name) missing connect1 atom.
Created a new atom named: OT1 within residue: .R<CTHR 304>
Created a new atom named: OT2 within residue: .R<CTHR 304>
  Added missing heavy atom: .R<CTHR 304>.A<O 14>
  Added missing heavy atom: .R<CTHR 304>.A<OXT 15>
  total atoms in file: 2348
  Leap added 2250 missing atoms according to residue templates:
       2 Heavy
       2248 H / lone pairs
  The file contained 72 atoms not in residue templates
Loading PDB file: ./_GMXMMPBSA_LIG_F1.pdb
 (starting new molecule for chain B)

/opt/AMBER/amber20/bin/teLeap: Warning!
Unknown residue: LIG   number: 1   type: Terminal/last
..relaxing end constraints to try for a dbase match

/opt/AMBER/amber20/bin/teLeap: Warning!
  -no luck
Creating new UNIT for residue: LIG sequence: 306
Created a new atom named: C1 within residue: .R<LIG 306>
Created a new atom named: C2 within residue: .R<LIG 306>
Created a new atom named: C3 within residue: .R<LIG 306>
Created a new atom named: C4 within residue: .R<LIG 306>
Created a new atom named: C5 within residue: .R<LIG 306>
Created a new atom named: C6 within residue: .R<LIG 306>
Created a new atom named: C7 within residue: .R<LIG 306>
Created a new atom named: C8 within residue: .R<LIG 306>
Created a new atom named: C9 within residue: .R<LIG 306>
Created a new atom named: N within residue: .R<LIG 306>
Created a new atom named: O1 within residue: .R<LIG 306>
Created a new atom named: O2 within residue: .R<LIG 306>
Created a new atom named: S within residue: .R<LIG 306>
  total atoms in file: 14
  Leap added 2 missing atoms according to residue templates:
       2 H / lone pairs
  The file contained 13 atoms not in residue templates
Checking Unit.
FATAL:  Atom .R<LIG 306>.A<S 13> does not have a type.
FATAL:  Atom .R<LIG 306>.A<O2 12> does not have a type.
FATAL:  Atom .R<LIG 306>.A<O1 11> does not have a type.
FATAL:  Atom .R<LIG 306>.A<N 10> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C9 9> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C8 8> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C7 7> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C6 6> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C5 5> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C4 4> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C3 3> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C2 2> does not have a type.
FATAL:  Atom .R<LIG 306>.A<C1 1> does not have a type.

/opt/AMBER/amber20/bin/teLeap: Fatal Error!
Failed to generate parameters

Exiting LEaP: Errors = 1; Warnings = 21; Notes = 0.
[ERROR  ] /opt/AMBER/amber20/bin/tleap failed when querying _GMXMMPBSA_leap.in

If you (@Valdes-Tresanco-MS) prefer, I can open a separate issue, since I think it is better to track this problem separately.
P.S. Since I am not a native English speaker, please let me know if any of my phrasing is unclear. :)

Originally posted by @Byun-jinyoung in #21 (comment)

Possible inconsistent structure generated with editconf if the tpr has PBC

Can the structure in the PDB generated with editconf be inconsistent if the tpr file has PBC?
If it has PBC, can it be removed with the editconf -pbc option?
We need more testing.
If we use trjconv to avoid this problem, we lose the chain IDs in GROMACS 2020.x. Does the assign_chainID variable solve that? It should be assign_chainID = 1 by default.

Improve gmx_MMPBSA_test

  • Add a flag to control whether the folder where the examples repository was cloned gets overwritten.
    It would be something like gmx_MMPBSA_test [-nr / -noreuse] (default = True);
    -nr will store a 'store_false' action (see the argparse sketch after this list).

  • Add a flag to delete the test folder once finished

  • Add the use of MPI for the examples if possible
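A minimal argparse sketch of the first flag above, assuming the option names proposed there; this is illustrative wiring only, not the shipped gmx_MMPBSA_test command line.

    # Hypothetical sketch: '-nr'/'-noreuse' stores False into 'reuse' (default True),
    # so an existing examples folder is reused unless the user asks for a fresh clone.
    import argparse

    parser = argparse.ArgumentParser(prog="gmx_MMPBSA_test")
    parser.add_argument("-nr", "-noreuse", dest="reuse", action="store_false",
                        default=True,
                        help="do not reuse an existing examples folder; clone it again")
    args = parser.parse_args()
    print(args.reuse)  # True unless -nr/-noreuse was given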

OverflowError on IE calculation

Describe the issue
I am having an overflow error when trying to use Interaction Entropy

To Reproduce
Steps to reproduce the behavior:

  1. Which system type are you running?
    Protein-DNA-RNA-Ions-Ligand
  2. The command-line are you using.
    mpirun -np 14 --allow-run-as-root gmx_MMPBSA MPI -O -i mmpbsa.in -cs md_0_1_rstw.tpr -ci md_0_1_rstw.ndx -cg 1 12 -ct md_0_1_mmpbsa.xtc -nogui

Additional context
The input file:

&general
verbose=2, interval=10, interaction_entropy=1, ie_segment=100, temperature=303.15
/
&gb
saltcon=0.1
/

The error:

Preparing trajectories for simulation...
201 frames were processed by cpptraj for use in calculation.

Running calculations on normal system...

Beginning GB calculations with /root/miniconda3/bin/mmpbsa_py_energy
  calculating complex contribution...
  calculating receptor contribution...
  calculating ligand contribution...
  File "/root/miniconda3/bin/gmx_MMPBSA", line 8, in <module>
    sys.exit(gmxmmpbsa())
  File "/root/miniconda3/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 107, in gmxmmpbsa
    app.parse_output_files()
  File "/root/miniconda3/lib/python3.8/site-packages/GMXMMPBSA/main.py", line 1068, in parse_output_files
    ie = InteractionEntropyCalc(edata, self, self.pre + 'iteraction_entropy.dat')
  File "/root/miniconda3/lib/python3.8/site-packages/GMXMMPBSA/calculation.py", line 566, in __init__
    self._calculate()
  File "/root/miniconda3/lib/python3.8/site-packages/GMXMMPBSA/calculation.py", line 583, in _calculate
    eceint = math.exp(deint / (k * temp))
OverflowError: math range error
Error occured on rank 0.
Exiting. All files have been retained.
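For context, the interaction entropy term averages exp(interaction-energy fluctuation / kT) over the frames, and math.exp overflows once those fluctuations are large compared with kT, which is exactly what the traceback above shows. Computing the average in log space avoids the overflow. The sketch below only illustrates that idea; it is not the gmx_MMPBSA implementation, and the function name is hypothetical.

    # Hedged sketch of a numerically stable interaction-entropy average (illustration
    # only, not the gmx_MMPBSA code).
    import numpy as np

    def interaction_entropy(int_energies, temperature=303.15):
        """Return -T*dS_IE in kcal/mol from per-frame interaction energies (kcal/mol)."""
        k = 0.001987204259                      # Boltzmann constant, kcal/(mol*K)
        e = np.asarray(int_energies, dtype=float)
        x = (e - e.mean()) / (k * temperature)  # beta * (dE_int - <dE_int>)
        m = x.max()                             # log-sum-exp shift avoids exp() overflow
        log_mean_exp = m + np.log(np.exp(x - m).mean())
        return k * temperature * log_mean_exp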

suddenly stopped working

ram@ram:~/Desktop/gmx_trial/output$ gmx_MMPBSA
Traceback (most recent call last):
File "/home/ram/.local/bin/gmx_MMPBSA", line 7, in
from GMXMMPBSA.app import gmxmmpbsa
File "/home/ram/.local/lib/python2.7/site-packages/GMXMMPBSA/app.py", line 145
sys.exit(app.exec())
^
SyntaxError: invalid syntax
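This SyntaxError comes from the interpreter rather than from gmx_MMPBSA itself: the traceback shows app.py being loaded from a python2.7 site-packages tree, and in Python 2 exec is a reserved keyword, so app.exec() cannot even be parsed. gmx_MMPBSA needs Python 3. A minimal guard illustrating the requirement (a sketch, not the shipped code; the exact minimum version is an assumption):

    # Hypothetical version guard; gmx_MMPBSA must run under a Python 3 interpreter.
    import sys

    if sys.version_info < (3, 0):
        sys.exit("gmx_MMPBSA requires Python 3; found %s" % sys.version.split()[0])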

gmx_MMPBSA does not recognize variable Temperature in .in file

Greetings,

I'm running a single-trajectory protein-ligand GBSA free energy calculation with IE, using a temperature of 310.15 K for the IE.

When I run the software, the following message appears:

[INFO ] Started
InputError: Unknown variable temperature in &general
Enter gmx_MMPBSA --help for help

Here is the command line:

gmx_MMPBSA -O -i mmpbsa.in -cs md_0_10.tpr -ci index_mmpbsa.ndx -cg 1 13 -ct md_0_10_mmpbsa.xtc -lm MOL.mol2 -eo output

Here is the log file output
**[INFO ] Started**

Here is the .in file input

Sample input file for entropy calculations (IE)
This input file is meant to show only that gmx_MMPBSA works. Although,
we tried to use the input files as recommended in the Amber manual,
some parameters have been changed to perform more expensive calculations
in a reasonable amount of time. Feel free to change the parameters according to what is better for your system.

&general
#
startframe=3000, endframe=5000, verbose=2, interval=4,
protein_forcefield="oldff/leaprc.ff99SB",

#entropy variable control whether to perform a quasi-harmonic entropy (QH)
# approximation or the Interaction Entropy approximation
# (https://pubs.acs.org/doi/abs/10.1021/jacs.6b02682) 
entropy=2, entropy_seg=25, temperature=310.15
/

&gb
igb=2, saltcon=0.01,
/

Thank you

Calculate the p-value in the correlation analysis

Improve the calculation of correlation coefficients

Currently, a basic function uses pandas tools to calculate the Pearson and Spearman coefficients. However, pandas does not calculate the p-value, so we will use scipy for it and for future calculations (see the sketch below).
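A minimal sketch of the planned change, assuming per-system data lives in a pandas DataFrame with 'experimental' and 'calculated' columns (the column names are placeholders):

    # scipy.stats returns both the coefficient and its p-value.
    import pandas as pd
    from scipy import stats

    def correlation_with_pvalue(df: pd.DataFrame):
        pearson_r, pearson_p = stats.pearsonr(df["experimental"], df["calculated"])
        spearman_r, spearman_p = stats.spearmanr(df["experimental"], df["calculated"])
        return {"pearson": (pearson_r, pearson_p),
                "spearman": (spearman_r, spearman_p)}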

Installation issue

Describe the issue
gmx_MMPBSA installation issue

If you will report an error, please complete this form
To Reproduce

$source home/byun/apps/AMBER/amber20_src/amber.sh
$conda activate /home/byun/apps/AMBER/amber20_src/miniconda
$which amber.python 
   > ~/apps/AMBER/amber20_src/bin/amber.python
$ amber.python -m pip install gmx_MMPBSA

When I ran the lines above, I encountered an installation issue. Below is the error message.

Additional context

DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
Collecting gmx_MMPBSA
  Using cached gmx_MMPBSA-1.3.1.tar.gz (510 kB)
    ERROR: Command errored out with exit status 1:
     command: /home/byun/apps/AMBER/amber20_src/bin/amber.python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-hmLq4m/gmx-mmpbsa/setup.py'"'"'; __file__='"'"'/tmp/pip-install-hmLq4m/gmx-mmpbsa/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-7_05IR
         cwd: /tmp/pip-install-hmLq4m/gmx-mmpbsa/
    Complete output (8 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-hmLq4m/gmx-mmpbsa/setup.py", line 20, in <module>
        import versioneer
      File "versioneer.py", line 1739
        file=sys.stderr)
            ^
    SyntaxError: invalid syntax
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

getPDBfromTpr error

The following error occurred:

raise MMPBSA_Error('%s failed when querying %s' % (' '.join(trjconv), self.FILES.complex_tpr))

The command I used is following:

gmx_MMPBSA -O -i mmpbsa.in -cs md_0_10.tpr -ci index.ndx -cg 1 14 -ct md_0_10_noPBC.xtc -lm 6DO_bcc_gaff.mol2

The error in the make_top.log file is following:

Fatal error:
Index[76] 13524 is larger than the number of atoms in the
trajectory file (13523). There is a mismatch in the contents
of your -f, -s and/or -n files.

Thanks!
make_top.log

MMPBSA_Error: No info files found!

Hi Mario, after the installation I ran gmx_MMPBSA_test to test all the examples and encountered the errors below; however, I cannot identify which installation step went wrong.

(base) juliana@juliana-VB:~$ gmx_MMPBSA_test -f /home/juliana/Documents -t all -n 10
[INFO ] Cloning gmx_MMPBSA repository in /home/juliana/Documents/gmx_MMPBSA_test
Cloning into '/home/juliana/Documents/gmx_MMPBSA_test'...
remote: Enumerating objects: 5030, done.
remote: Counting objects: 100% (383/383), done.
remote: Compressing objects: 100% (152/152), done.
remote: Total 5030 (delta 234), reused 336 (delta 212), pack-reused 4647
Receiving objects: 100% (5030/5030), 245.85 MiB | 1.74 MiB/s, done.
Resolving deltas: 100% (3559/3559), done.
Updating files: 100% (326/326), done.
[INFO ] Cloning gmx_MMPBSA repository...Done.
[WARNING] Using all processors
[INFO ] Example STATE

[INFO ] Protein-Ligand (Single trajectory approximation) RUNNING
[INFO ] Protein-Protein RUNNING
[ERROR ] Protein-Ligand (Single trajectory approximation) [ 1/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_ligand/ST/prot_lig_st.log)
[INFO ] Protein-DNA RUNNING
[ERROR ] Protein-Protein [ 2/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_protein/prot_prot.log)
[INFO ] Protein-Membrane RUNNING
[ERROR ] Protein-DNA [ 3/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_DNA/prot_dna.log)
[INFO ] Protein-Glycan RUNNING
[ERROR ] Protein-Membrane [ 4/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_membrane/memb_prot.log)
[INFO ] Metalloprotein-Peptide RUNNING
[ERROR ] Protein-Glycan [ 5/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_glycan/prot_glycan.log)
[INFO ] Protein-DNA-RNA-IONs-Ligand RUNNING
[ERROR ] Metalloprotein-Peptide [ 6/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Metalloprotein_peptide/metalloprot_pep.log)
[INFO ] Protein-Ligand (CHARMM force field) RUNNING
[ERROR ] Protein-DNA-RNA-IONs-Ligand [ 7/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_DNA_RNA_Ion_ligand/prot_dna_rna_ions_lig.log)
[INFO ] Protein-ligand complex in membrane with CHARMMff RUNNING
[ERROR ] Protein-Ligand (CHARMM force field) [ 8/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_ligand_CHARMMff/prot_lig_charmm.log)
[INFO ] Alanine Scanning RUNNING
[ERROR ] Protein-ligand complex in membrane with CHARMMff [ 9/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_membrane_CHARMMff/memb_charmm.log)
[INFO ] Stability calculation RUNNING
[ERROR ] Alanine Scanning [10/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Alanine_scanning/ala_scan.log)
[INFO ] Decomposition Analysis RUNNING
[ERROR ] Stability calculation [11/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Stability/stability.log)
[INFO ] Protein-Ligand (Multiple trajectory approximation) RUNNING
[ERROR ] Decomposition Analysis [12/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Decomposition_analysis/decomp.log)
[INFO ] Interaction Entropy approximation RUNNING
[ERROR ] Protein-Ligand (Multiple trajectory approximation) [13/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Protein_ligand/MT/prot_lig_mt.log)
[INFO ] Entropy calculation using Normal Mode approximation RUNNING
[ERROR ] Interaction Entropy approximation [14/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Entropy_calculations/Interaction_Entropy/ie.log)
[INFO ] Calculations using 3D-RISM approximation RUNNING
[ERROR ] Entropy calculation using Normal Mode approximation [15/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/Entropy_calculations/nmode/nmode.log)
[ERROR ] Calculations using 3D-RISM approximation [16/16] ERROR
Please, check the test log
(/home/juliana/Documents/gmx_MMPBSA_test/docs/examples/3D-RISM/3drism.log)

[INFO ] Opening gmx_MMPBSA_ana...
ERROR:root:MMPBSA_Error No info files found!.
Check the gmx_MMPBSA.log file to report the problem.
Traceback (most recent call last):
File "/home/juliana/Downloads/AmberTools20/amber20/miniconda/bin/gmx_MMPBSA_ana", line 8, in
sys.exit(gmxmmpbsa_ana())
File "/home/juliana/.local/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 126, in gmxmmpbsa_ana
ifiles = get_files(parser)
File "/home/juliana/.local/lib/python3.8/site-packages/GMXMMPBSA/analyzer/utils.py", line 162, in get_files
GMXMMPBSA_ERROR('No info files found!')
File "/home/juliana/.local/lib/python3.8/site-packages/GMXMMPBSA/exceptions.py", line 169, in init
raise exc(msg + '. Check the gmx_MMPBSA.log file to report the problem.')
GMXMMPBSA.exceptions.MMPBSA_Error: No info files found!. Check the gmx_MMPBSA.log file to report the problem.

PDBError: Coordinate mismatch in model 1

Hi,

I am running a heterodimeric membrane protein (2 chains) with a ligand. I have tried with both a *.gro and a *.tpr file as input.
I installed AmberTools20 with conda and then did pip install gmx_MMPBSA in the conda environment. This is after I realised that Amber18 package does not work with this tool (python3 issue raised before here)
I am using this command:

gmx_MMPBSA -i MMPBSA_input -cs ../pr_200ns.tpr -ci index.ndx -cg 1 13 -ct ../pr_200ns_whole_nojump_center_rotxy-transxy.xtc --overwrite

Can you provide more info as to what is wrong with my files? The error message I am receiving is not very enlightening as to what the problem is...

OUTPUT:

Loading and checking parameter files for compatibility...

cpptraj found! Using /sansom/s152/bras4549/anaconda3/envs/AmberTools20/bin/cpptraj
gmx found! Using /sbcb/packages/opt/Linux_x86_64/gromacs/2020.3_GCC6.2_CUDA10.1.AVX2/bin/gmx
tleap found! Using /sansom/s152/bras4549/anaconda3/envs/AmberTools20/bin/tleap
parmchk2 found! Using /sansom/s152/bras4549/anaconda3/envs/AmberTools20/bin/parmchk2
sander found! Using /sansom/s152/bras4549/anaconda3/envs/AmberTools20/bin/sander
Normal Complex: Save group 1_13 in index.ndx (gromacs index) file as _GMXMMPBSA_COM.pdb
Clear normal complex trajectories...
Using receptor structure from complex to make amber topology
Normal Complex: Save group 1 in _GMXMMPBSA_COM_index.ndx (gromacs index) file as _GMXMMPBSA_REC.pdb
Using ligand structure from complex to make amber topology
Save group 13 in _GMXMMPBSA_COM_index.ndx (gromacs index) file as _GMXMMPBSA_LIG.pdb
File "/sansom/s152/bras4549/anaconda3/bin/gmx_MMPBSA", line 8, in
sys.exit(gmxmmpbsa())
File "/sansom/s152/bras4549/anaconda3/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 92, in gmxmmpbsa
app.loadcheck_prmtops()
File "/sansom/s152/bras4549/anaconda3/lib/python3.8/site-packages/GMXMMPBSA/main.py", line 590, in loadcheck_prmtops
maketop = CheckMakeTop(FILES, INPUT, self.external_progs)
File "/sansom/s152/bras4549/anaconda3/lib/python3.8/site-packages/GMXMMPBSA/make_top.py", line 99, in init
self.checkPDB()
File "/sansom/s152/bras4549/anaconda3/lib/python3.8/site-packages/GMXMMPBSA/make_top.py", line 294, in checkPDB
self.complex_str = parmed.read_PDB(self.complex_pdb) # can always be initialized
File "/sansom/s152/bras4549/anaconda3/lib/python3.8/site-packages/parmed/formats/pdb.py", line 376, in parse
inst._parse_open_file(fileobj)
File "/sansom/s152/bras4549/anaconda3/lib/python3.8/site-packages/parmed/formats/pdb.py", line 430, in _parse_open_file
method_dispatchrec
File "/sansom/s152/bras4549/anaconda3/lib/python3.8/site-packages/parmed/formats/pdb.py", line 703, in _end_model
raise PDBError('Coordinate mismatch in model %d' % self._current_model_number)
PDBError: Coordinate mismatch in model 1
Exiting. All files have been retained.

Error about ligand parameterization using ACPYPE (I guess...)

Greetings,

I'm working with my own Protein-Ligand system. Ligand was parameterized using ACPYPE (ACPYPE - AnteChamber PYthon Parser interfacE) [Da Silva, A. W. S., & Vranken, W. F. (2012). ACPYPE-Antechamber python parser interface. BMC research notes, 5(1), 1-8.].

Running a gmx_MMPBSA calculation using:

gmx_MMPBSA -O -i mmpbsa.in -cs md_0_10.tpr -ci index.ndx -cg 1 13 -ct md_0_10_center.xtc -lm oxml.mol2 -eo output

I got this error:

[INFO ] Loading and checking parameter files for compatibility...

Preparing trajectories for simulation...
Error: # atoms in XTC file (14027) does not match # atoms in parm COM.prmtop (14020)
Error: Could not set up 'COM_traj_0.xtc' for reading.
Error: Could not set up input trajectory 'COM_traj_0.xtc'.
Error: Error(s) occurred during execution.
file "/home/XXX/anaconda3/envs/AmberTools20/bin/gmx_MMPBSA", line 8, in

Inspecting gmx_MMPBSA.log, I found this error:

Loading Mol2 file: ./oxml.mol2
Reading MOLECULE named MOL
Checking 'LIG1'....
Checking parameters for unit 'LIG1'.
Checking for bond parameters.
Checking for angle parameters.

/home/XXXX/anaconda3/envs/AmberTools20/bin/teLeap: Error!
Could not find angle parameter: n2 - ce - oh

/home/XXXX/anaconda3/envs/AmberTools20/bin/teLeap: Warning!
There are missing parameters.
Unit is OK.

Checking leap.log, I found that only the ff99SB (Hornak & Simmerling) and gaff force fields are invoked (without taking into account ions, solvent and nucleic acids), i.e.:

Loading parameters: /home/XXXX/anaconda3/envs/AmberTools20/dat/leap/parm/gaff.dat

Inspecting gaff.dat, I found that this angle parameter indeed does not exist there (nevertheless, I was able to generate the ligand topology, and that topology is being used as oxml.mol2 in the gmx_MMPBSA command). But checking the ACPYPE source code I found:

 parser.add_argument(
    "-a",
    "--atom_type",
    choices=["gaff", "amber", "gaff2", "amber2"],
    action="store",
    default="gaff",
    dest="atom_type",
    help="atom type, can be gaff, gaff2, amber (AMBER14SB) or amber2 (AMBER14SB + GAFF2), default is gaff",
)

So apparently more force fields are being used to parameterize molecules, and that is why ACPYPE was able to create the angle parameter. Indeed, according to the leap.log file generated by ACPYPE:

Building improper torsion parameters.
--Impropers:
1 C14 - C16 - C15 - C20
1 C4 - C9 - C8 - C13
1 H11 - C8 - C13 - C12
1 H12 - C11 - C12 - C13
1 H13 - C18 - C19 - C20
1 H19 - C16 - C17 - C18
1 H1 - C8 - C9 - C10
1 H21 - C10 - C11 - C12
1 H22 - C17 - C18 - C19
1 H24 - C15 - C20 - C19
1 H3 - C9 - C10 - C11
1 H6 - C15 - C16 - C17
1 N1 - C15 - C14 - O4
total 13 improper torsions applied
Building H-Bond parameters.

The parameters were built using the force fields mentioned above.

Since ACPYPE is commonly used with GROMACS to generate topologies in a reliable way, would it be possible to include these topologies by default in the tleap command of gmx_MMPBSA? Or maybe the option is to modify gaff.dat manually to include that angle (with the caveat that gaff2 and the other force fields are not being invoked by gmx_MMPBSA, so there is a possibility of new conflicts, at least according to my gmx_MMPBSA.log).

Thank you

P.S.: I attached my gmx_MMPBSA.log, and I want to clarify that I was able to use gmx_MMPBSA on another protein-ligand ST system without problems. Comparing both .log files, the only difference is the ligand topology given by the mol2 file (certainly, the ligands are different, and that is why I think the problem is what I wrote above, since the ligand without problems does not have this n2 - ce - oh feature). Sorry for the long post. Have a nice day.

gmx_MMPBSA.log

gmx_mpi make_ndx falied when querying index file

Describe the issue
I've tried to calculate Interaction entropy with gmx_MMPBSA, but I got the error below

If you will report an error, please complete this form

To Reproduce
Steps to reproduce the behavior:

  1. Which system type are you running? Protein-ligand system
  2. The command-line are you using.
    mpirun -np 4 gmx_MMPBSA MPI -O -i gmx_MMPBSA_ie.in -cs amber2gmx.pdb -ci index.ndx -cg 1 13 -ct nosolv_Nopbc_prod.xtc -cp amber2gmx.top

Additional context

[INFO   ] Started
[INFO   ] Loading and checking parameter files for compatibility...

[INFO   ] Building AMBER Topologies from GROMACS files...
[INFO   ] Checking if supported force fields exists in Amber data...
[INFO   ] Get PDB files from GROMACS structures files...
[INFO   ] Making gmx_MMPBSA index for complex...
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** An error occurred in MPI_Init_thread
***    and potentially your MPI job)
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[cpu5:239906] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[cpu5:239905] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[cpu5:239907] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[ERROR  ] /home/byun/apps/GROMACS/gromacs_cpu-2020.2_gcc-7.5.0_openmpi-4.0.2/bin/gmx_mpi make_ndx failed when querying index.ndx
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[cpu5:239908] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!

Output and temporary files

If the program ends in error, the clean trajectories are not removed. If they already exist on the next run, detect and reuse them so that the computation time is reduced (see the sketch after the list below).

  • Prevented deleting initial temporary files
  • Reuse intermediary files
  • Rename the output and temporary files
  • Delete all intermediate files when cleaning
  • Make a more informative log file
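A minimal sketch of the reuse idea; the file handling and function signature are placeholders, not the gmx_MMPBSA internals.

    # Hypothetical helper: reuse an intermediate file left by a previous (failed) run
    # instead of rebuilding it from the original inputs.
    import os

    def get_or_build(path, build_func):
        if os.path.exists(path) and os.path.getsize(path) > 0:
            return path               # previous clean trajectory found: reuse it
        return build_func(path)       # otherwise rebuild it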

Question: Is the GROMACS united-atom force field supported?

Dear gmx_MMPBSA developers.
Hello. Because I want to use gmx_MMPBSA, I took a look at the tutorial examples. There is an option for a ligand, the -lm option, which requires the ligand mol2 file from Antechamber.
But I wondered whether gmx_MMPBSA supports the GROMACS united-atom force fields, because when I used ParmEd to convert the GROMACS force field into AMBER format, it did not work properly... (NOTE: I'm not an expert on ParmEd.)

Thank you in advance for your reply. :)

charmm36 force field

Hello,

I would like to know if gmx_MMPBSA is compatible with the CHARMM36 force field.
Thanks...

Best regards

Error while running gmx_MMPBSA command.

Hello,
After installing gmx_MMPBSA, I'm getting an error while running the code.

~/Downloads/amber20_src/bin$ gmx_MMPBSA
Traceback (most recent call last):
File "/home/user/Downloads/amber20_src/miniconda/bin/gmx_MMPBSA", line 5, in
from GMXMMPBSA.app import gmxmmpbsa
File "/home/user/Downloads/amber20_src/miniconda/lib/python2.7/site-packages/GMXMMPBSA/app.py", line 122
sys.exit(app.exec())
^
SyntaxError: invalid syntax

Fatal error: reading tpx file (step7.tpr) version 122 with version 116 program

Describe the issue
I have done 50 ns of MD simulation in GROMACS for a membrane protein-ligand complex. The system was built in CHARMM-GUI. The simulation was done with GROMACS version 2021.1. When I tried to run gmx_MMPBSA, it gave the following error.

Fatal error:
reading tpx file (step7.tpr) version 122 with version 116 program

Please help me solve this issue.

If you will report an error, please complete this form
To Reproduce
Steps to reproduce the behavior:

  1. Which system type are you running? (Protein-Protein, Protein-Ligand, etc)
  2. The command-line are you using.

Additional context
The gmx_MMPBSA.log file

Create a conda package

Create a conda package
Packages to be installed:

  • AmberTools >= 20 (if not installed by compilation)
  • mpi4py >= 3.0.3 (PyPi == conda) --> conda is preferred to avoid the incompatibility with the compilers package
  • pandas >= 1.2.4 (PyPi == conda)
  • matplotlib >= 3.4.2 (PyPi == conda)
  • scipy >= 1.6.1 (PyPi = 1.7.0 | conda = 1.6.3)
  • seaborn >= 0.11.1 (PyPi == conda)
  • PyQt5 >= 5.15.4 (PyPi = 5.15.4 | conda = 5.9.2)

Error when trajectory does not start at time = 0

The PDB file is extracted using trjconv at time = 0. Would it be better to use editconf to generate the PDB file? editconf has the disadvantage that it uses the tpr structure instead of the initial structure of the trajectory. In most cases these structures should match (see the sketch below).
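One possible direction, sketched here with an illustrative subprocess wrapper rather than the gmx_MMPBSA code: ask trjconv to dump the first frame actually present in the trajectory instead of assuming it starts at t = 0, keeping editconf as the fallback when the tpr structure is acceptable.

    # Hypothetical wrapper: dump the frame at 'start_time' (the trajectory's first
    # time stamp) to a PDB; the group fed on stdin ('0' = System) is an assumption.
    import subprocess

    def dump_first_frame(gmx, tpr, xtc, ndx, out_pdb, start_time):
        cmd = [gmx, "trjconv", "-s", tpr, "-f", xtc, "-n", ndx,
               "-o", out_pdb, "-dump", str(start_time)]
        subprocess.run(cmd, input=b"0\n", check=True)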

Cannot use MPI with -np higher than 2

Describe the issue
Cannot use MPI with -np higher than 2

If you will report an error, please complete this form
To Reproduce
Steps to reproduce the behavior:

  1. Which system type are you running?
    Protein-DNA-RNA-Ions-Ligand
  2. The command-line are you using.
    mpirun -np 3 gmx_MMPBSA MPI -O -i mmpbsa.in -cs md_0_1_rstw.tpr -ci md_0_1_rstw.ndx -cg 1 12 -ct md_0_1_mmpbsa.xtc
Additional context
Every time I try to run an MPI job with more than 2 processes, an error appears on stderr/stdout, despite no error showing in gmx_MMPBSA.log.
    stdout.txt
    gmx_MMPBSA.log

trjconv failed when querying md.tpr

This is in reference to issue #15 .
I am also facing the same issue. gmx_MMPBSA worked perfectly fine with the files in the Protein-DNA example but failed when running on my files.

Command used: gmx_MMPBSA -O -i mmpbsa_onlygbsa.in -cs md_0-900ns-1000ns_nodt_nocpmact.tpr -ci index_prot.ndx -cg 1 12 -ct md_0-900ns-1000ns_nodt_nocpmact.xtc

Error reported: File "/home/gayatrip/miniconda3/envs/amber/lib/python3.9/site-packages/GMXMMPBSA/exceptions.py", line 169, in __init__
raise exc(msg + '. Check the gmx_MMPBSA.log file to report the problem.')
MMPBSA_Error: /home/gayatrip/gromacs-2020/build/bin/gmx trjconv failed when querying md_0-900ns-1000ns_nodt_nocpmact.tpr. Check the gmx_MMPBSA.log file to report the problem.

Could you please tell me where I am going wrong?

Thanks in advance

Check the reference structure

We must review the reference structure and its matching with the complex in more depth before assigning the chain IDs.

Unable to execute

Describe the issue
Using amber.python I installed this program. I tried to export the path using ~/.bashrc, e.g.:

export gmx_MMPBSA=path/to/it

After calling it, it returned "command not found" (after sourcing .bashrc).
Instead of export I used alias gmx_MMPBSA=path/to/it, and now it is working.
But when I tried to execute it using the given example files, it returned the following error:
[INFO ] Starting
[INFO ] Command-line
gmx_MMPBSA -O -i mmpbsa.in -cs com.tpr -ci index.ndx -cg 1 13 -ct com_traj.xtc

[WARNING] protein_forcefield and ligand_forcefield variables are deprecate since version 1.4.1 and will be remove in the next version. Please, use forcefield instead.
[WARNING] entropy_seg variable is deprecate since version 1.4.2 and will be remove in v1.5.0. Please, use ie_segment instead.
[INFO ] Checking external programs...
[INFO ] cpptraj found! Using /home/pharmacoinformatics/amber/amber20/bin/cpptraj
[INFO ] tleap found! Using /home/pharmacoinformatics/amber/amber20/bin/tleap
[INFO ] parmchk2 found! Using /home/pharmacoinformatics/amber/amber20/bin/parmchk2
[INFO ] mmpbsa_py_energy found! Using /home/pharmacoinformatics/amber/amber20/bin/mmpbsa_py_energy
[INFO ] Using GROMACS version 4.x.x!
[ERROR ] MMPBSA_Error Could not find necessary program [ GROMACS ].
Check the gmx_MMPBSA.log file to report the problem.

I am using CentOS, and GROMACS is not installed.

If you are reporting an error, please complete this form
To Reproduce
Steps to reproduce the behavior:

  1. Which system type are you running? (Protein-Protein, Protein-Ligand, etc)
  2. The command line you are using.

Additional context

[pharmacoinformatics@localhost MT]$ export gmx_MMPBSA=/home/pharmacoinformatics/amber/amber20/miniconda/bin/gmx_MMPBSA
[pharmacoinformatics@localhost MT]$ gmx_MMPBSA -O -i mmpbsa.in -cs com.tpr -ci index.ndx -cg 1 13 -ct com_traj.xtc
bash: gmx_MMPBSA: command not found...

[pharmacoinformatics@localhost MT]$ alias gmx_MMPBSA=/home/pharmacoinformatics/amber/amber20/miniconda/bin/gmx_MMPBSA
[pharmacoinformatics@localhost MT]$ gmx_MMPBSA -O -i mmpbsa.in -cs com.tpr -ci index.ndx -cg 1 13 -ct com_traj.xtc
[INFO ] Starting
[INFO ] Command-line
gmx_MMPBSA -O -i mmpbsa.in -cs com.tpr -ci index.ndx -cg 1 13 -ct com_traj.xtc

[WARNING] protein_forcefield and ligand_forcefield variables are deprecate since version 1.4.1 and will be remove in the next version. Please, use forcefield instead.
[WARNING] entropy_seg variable is deprecate since version 1.4.2 and will be remove in v1.5.0. Please, use ie_segment instead.
[INFO ] Checking external programs...
[INFO ] cpptraj found! Using /home/pharmacoinformatics/amber/amber20/bin/cpptraj
[INFO ] tleap found! Using /home/pharmacoinformatics/amber/amber20/bin/tleap
[INFO ] parmchk2 found! Using /home/pharmacoinformatics/amber/amber20/bin/parmchk2
[INFO ] mmpbsa_py_energy found! Using /home/pharmacoinformatics/amber/amber20/bin/mmpbsa_py_energy
[INFO ] Using GROMACS version 4.x.x!
[ERROR ] MMPBSA_Error Could not find necessary program [ GROMACS ].
Check the gmx_MMPBSA.log file to report the problem.
File "/home/pharmacoinformatics/amber/amber20/miniconda/bin/gmx_MMPBSA", line 8, in
sys.exit(gmxmmpbsa())
File "/home/pharmacoinformatics/amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/app.py", line 95, in gmxmmpbsa
app.make_prmtops()
File "/home/pharmacoinformatics/amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/main.py", line 566, in make_prmtops
external_progs = find_progs(self.INPUT, self.mpi_size)
File "/home/pharmacoinformatics/amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/findprogs.py", line 105, in find_progs
GMXMMPBSA_ERROR('Could not find necessary program [ GROMACS ]')
File "/home/pharmacoinformatics/amber/amber20/miniconda/lib/python3.8/site-packages/GMXMMPBSA/exceptions.py", line 169, in init
raise exc(msg + '. Check the gmx_MMPBSA.log file to report the problem.')
MMPBSA_Error: Could not find necessary program [ GROMACS ]. Check the gmx_MMPBSA.log file to report the problem.
Exiting. All files have been retained.
[pharmacoinformatics@localhost MT]$
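
For what it is worth, two separate problems show up in this report: export gmx_MMPBSA=... only sets an environment variable (it does not make the script callable), and no GROMACS installation is visible. A hedged sketch of the usual fix, with the miniconda path taken from the output above and the conda channel being an assumption:

# add the directory that contains the gmx_MMPBSA script to PATH (e.g. in ~/.bashrc)
export PATH=/home/pharmacoinformatics/amber/amber20/miniconda/bin:$PATH

# gmx_MMPBSA also needs a working GROMACS; one option is a conda package
conda install -c conda-forge gromacs   # channel/package availability is an assumption

# confirm that gmx is now found
which gmx && gmx --version | head -n 1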

ModuleNotFoundError: No module named 'parmed'

Hi,

I have followed the previous issue about this error; however, I am still unable to solve this problem.

In my ~/.bashrc, I have put the following lines:

source /home/user/Coral/software/amber20_src/amber.sh
export PYTHONPATH="/home/user/Coral/software/amber20_src/miniconda/bin:$PATH"

and when I echo $PYTHONPATH:

(myvenv) [login-0-0 gmx_MMPBSA-1.1.1]$ echo $PYTHONPATH
/home/user/Coral/software/amber20_src/lib/python3.8/site-packages

Can you give me a hand?

Best,

Ben
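
A quick way to narrow this down is to check whether the interpreter that runs gmx_MMPBSA can import ParmEd at all; a minimal sketch, with the site-packages path taken from the echo output above and the pip step being an assumption:

# PYTHONPATH should point at a site-packages directory, not at miniconda/bin
export PYTHONPATH=/home/user/Coral/software/amber20_src/lib/python3.8/site-packages:$PYTHONPATH

# check that the Amber interpreter can import parmed
amber.python -c "import parmed; print(parmed.__version__)"

# if it cannot, ParmEd can be installed into that interpreter from PyPI
amber.python -m pip install parmed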

[ERROR ] You did not specify any type of calculation!

While following the Protein-DNA binding free energy calculation example, I got this error after executing the command

gmx_MMPBSA -i mmpbsa.in -cs com.tpr -ci index.ndx -cg 1 12 -ct com_traj.xtc

"[INFO ] Started
[ERROR ] You did not specify any type of calculation!
"
Can you please guide me on where I am going wrong?
Thanks
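
This error is normally raised when the input file contains no calculation namelist (&gb, &pb, ...). A minimal mmpbsa.in sketch for a GB run, assuming the standard gmx_MMPBSA/MMPBSA.py input format; the frame range, igb, and salt concentration are only example values:

Sample input file for a GB calculation (example values only)
&general
  startframe=1, endframe=10, interval=1,
/
&gb
  igb=5, saltcon=0.150,
/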
