Here is the problem. In September 2021, I installed the MDAnalysis (https://www.mdanalysis.org/) software, which can be used to analyze trajectories from applications such as OpenMM, CHARMM, GROMACS, etc.
The CMake procedure used an OpenMM library at /usr/lib/x86_64-linux-gnu/libOpenMM.so, placed there by the previous MDAnalysis install procedure. That is the library the CMake procedure used, even though all the needed libraries were defined before running CMake.
The way I found the problem was by looking at the files generated after running ../configure. Looking them over, I was able to discern that the incorrect plugins file was assigned by cmake.
As far as the OpenMM/CHARMM compatibility issue goes, I don't think there should be much of a problem, although the two applications continue to blend over time. By that I mean that new options are added to CHARMM periodically, at least yearly, and there are continuing improvements and additions to OpenMM as well. It's possible that there might be a slight incompatibility for a brief period, but that will correct itself. The bread-and-butter issue for me, though, is the ability to perform very fast MD simulations using GPU hardware, and that won't change from one OpenMM version to another.
I did want to discuss conda/miniconda. On Debian Linux, conda isn't available as a system package, so I first installed Miniconda, which creates an anaconda3 directory. From there, I installed OpenMM, which also resides in the anaconda3 directory. The Miniconda installation also places code in my .bashrc file (I don't like that). I think the Miniconda installation diverts the normal execution of python commands, i.e., it uses the python code in the anaconda3 directory, but I can't be sure.
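One way to check whether the shell is actually running the Miniconda python (a minimal diagnostic sketch using only the standard library; run it with the same python command you normally type):

```python
# Show which Python interpreter is actually being run, and where it lives.
# If Miniconda's .bashrc changes are active, these paths will point into
# the anaconda3 directory.
import shutil
import sys

print("interpreter:", sys.executable)             # the binary executing this script
print("prefix:     ", sys.prefix)                 # the installation it belongs to
print("python on PATH:", shutil.which("python"))  # what the shell would launch
```

If the first two lines print paths inside anaconda3, the Miniconda interpreter has indeed taken over from the system python.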
So, to test compatibility, I suppose one could rename the anaconda3 directory to preserve it, and then repeat the process (install Miniconda, then OpenMM), creating a whole new OpenMM installation.
While I have used the OpenMM interface in the past, thanks to some help from a very talented post-doc I've moved on to using OpenMM entirely via the Python bindings, using conda environments and a custom package that allows the use of a dyn.py script that bears some resemblance to typical dyn.inp charmm scripts I've used to run MD. The biggest reason is several newer features supported in OpenMM that do not appear to be supported in the charmm interface, at least according to the documentation for the latest code. Those features are:
the charmm Drude polarizable force field
the LJPME code for long range VDW forces
the Nosé-Hoover thermostat
These features are important for the work I've been doing, but the Fortran implementation in charmm for the first two listed above is exceedingly slow, especially the use of the Drude force field. Also, with help from the post-doc, I've worked out a simple means of including a boundary plane restraint, comparable to what one might do with MMFP if it actually worked with domdec (it doesn't).
OpenMM can read native charmm topology and parameter files, PSF files, and coordinate files on its own; the DCD files produced can be analyzed with charmm. So other than the actual MD engine, my workflow for most projects has not substantially changed, except for the use of Python to run simulations. I am by no means a Python maven, but have learned to accept its usefulness and flexibility.
Thanks so much for the very important/useful information. I like!!
Rick, would you consider posting an example of the dyn.py file mentioned in your immediate post into the scripts category? All of the scripts written by you and Lennart have been so helpful not only to me but to many, many others.
Thanks so much.
P.S. I forgive you for your fractured loyalty to CHARMM.
Well, I continue to use CHARMM for model building, equilibration, analysis, visualization, and simulations that require pressure calculations using the atomic virial, something many MD programs lack. I believe GROMACS and LAMMPS offer such pressure calculations, but I'm not aware of others; OpenMM, NAMD, and AMBER do not. Plus, I continue to contribute small code fixes and enhancements, since I'm quite comfortable working in Fortran.
I'm not sure the Script Archive forum is the best place for OpenMM python examples, though; I may need to consider some other venue.
global/u2/a/angelor
/global/homes/a/angelor/openmm
/usr/lib/python36.zip
/usr/lib64/python3.6
/usr/lib64/python3.6/lib-dynload
/usr/lib64/python3.6/site-packages
/usr/lib64/python3.6/_import_failed
/usr/lib/python3.6/site-packages
/global/homes/a/angelor/openmm
Traceback (most recent call last):
  File "test2.py", line 5, in <module>
    from openmm.app import *
  File "/global/u2/a/angelor/openmm/__init__.py", line 19, in <module>
    from openmm.openmm import *
  File "/global/u2/a/angelor/openmm/openmm.py", line 13, in <module>
    from . import _openmm
ImportError: cannot import name '_openmm'
The openmm directory is in my home directory and also in the python path, as seen from the output above:
[Sat Jan 15 10:20:24 angelor@cori04:~ ] $ ls openmm
__init__.py  __pycache__/  _openmm.cpython-39-x86_64-linux-gnu.so*  amd.py
app/  mtsintegrator.py  openmm.py  testInstallation.py  unit/  version.py
[Sat Jan 15 10:20:29 angelor@cori04:~ ] $ env | grep PYTHONPATH
PYTHONPATH=/global/homes/a/angelor/openmm:/global/common/cori/software/nwchem/6.6/contrib/python
Why can't the python interpreter find these files? I have had so many problems understanding Python. It's like there is this big gap between two extremes: I can compile code and I know Linux system stuff, and I can run simulations and analyze results, but I don't know what to do with the Python gap in the middle! DUH!
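One detail worth checking, going by the output above: the compiled module is tagged _openmm.cpython-39-x86_64-linux-gnu.so, while the sys.path entries point at Python 3.6, and CPython will only load extension modules built for the running interpreter version. The tag the current interpreter expects can be printed with the standard library (a generic diagnostic, nothing OpenMM-specific):

```python
# Print the extension-module suffix this interpreter will accept; a compiled
# module like _openmm.cpython-39-x86_64-linux-gnu.so only imports when its
# tag matches this suffix.
import sys
import sysconfig

ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")
print("Python", sys.version.split()[0], "expects extensions ending in", ext_suffix)
```

If the printed suffix says cpython-36 while the .so in ~/openmm says cpython-39, the two simply don't match, whatever PYTHONPATH says.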
If you can help, I would really appreciate it. Otherwise, delete the python component.