Comments (6)
This has nothing to do with Open MPI; it points to a problem with the compilers on your machine. Here is the relevant snippet from your config.log:
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
ld: skipping incompatible /usr/lib/x86_64-linux-gnu/libm.so when searching for -lm
ld: skipping incompatible /usr/lib/x86_64-linux-gnu/libm.a when searching for -lm
ld: cannot find -lm: No such file or directory
ld: skipping incompatible /usr/lib/x86_64-linux-gnu/libm.so when searching for -lm
ld: skipping incompatible /usr/lib/gcc/x86_64-linux-gnu/11//libstdc++.so when searching for -lstdc++
ld: skipping incompatible /usr/lib/gcc/x86_64-linux-gnu/11//libstdc++.a when searching for -lstdc++
ld: cannot find -lstdc++: No such file or directory
ld: skipping incompatible /usr/lib/gcc/x86_64-linux-gnu/11//libstdc++.so when searching for -lstdc++
ld: skipping incompatible /usr/lib/gcc/x86_64-linux-gnu/11//libgcc.a when searching for -lgcc
ld: cannot find -lgcc: No such file or directory
ld: skipping incompatible /usr/lib/gcc/x86_64-linux-gnu/11//libgcc.a when searching for -lgcc
ld: cannot find -lgcc: No such file or directory
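For readers hitting the same log: "skipping incompatible" means the candidate library's ELF class (32-bit vs 64-bit) does not match the link target. A small probe of the ELF header can confirm which class a library is; the path below is taken from the log above and may differ on your system.

```shell
# Byte 5 (offset 4) of an ELF header is EI_CLASS: 1 = 32-bit, 2 = 64-bit.
# On a typical 64-bit Ubuntu box libm is 64-bit, which is why a link that
# targets 32-bit skips it as "incompatible".
lib=/usr/lib/x86_64-linux-gnu/libm.so.6
if [ -e "$lib" ]; then
  cls=$(od -An -tu1 -j4 -N1 "$lib" | tr -d ' ')
  if [ "$cls" = "2" ]; then
    echo "64-bit ELF: $lib"
  else
    echo "32-bit ELF: $lib"
  fi
else
  echo "library not found at $lib"
fi
```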
But, even so, I still don't know how to solve this problem.
First, you should not load the mpi/2021.10.0 module, and you should not set MPICC=mpiicc MPIFC=mpiifort.
It looks like your compiler setup is busted. Anyway, can you compile and link a trivial hello world program with icc? If not, can you at least compile a trivial hello world program with icc -c? If so, what does file <object_file> say?
First, you should not load the mpi/2021.10.0 module
Thank you for pointing this out. When compiling Open MPI, I should not load the Intel MPI environment. Open MPI and Intel MPI are two different MPI implementations, each with its own libraries and runtime environment. Loading both implementations simultaneously can cause conflicts in environment variables and library paths, leading to errors at compile time and at runtime.
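The point above can be turned into a quick sanity check before running Open MPI's configure. This is a sketch, not an official procedure; the variable names are taken from the oneAPI modulefile shown later in this thread.

```shell
# Warn if an Intel MPI environment is still loaded before configuring Open MPI.
clean=yes
for v in I_MPI_ROOT MPICC MPIFC; do
  val=$(printenv "$v" || true)
  if [ -n "$val" ]; then
    echo "warning: $v=$val -- unload or unset it before running configure"
    clean=no
  fi
done
echo "environment clean: $clean"
```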
and you should not set MPICC=mpiicc MPIFC=mpiifort.
Again, I did have a misunderstanding here. I only need to set those variables when I plan to use Intel MPI and the Intel compilers to build other applications. But in my case, the issue discussed here was not triggered by those inappropriate settings.
It looks like your compiler setup is busted. Anyway, can you compile and link a trivial hello world program with icc? If not, can you at least compile a trivial hello world program with icc -c?
Both failed, see below:
werner@MZ73-LM1-000:~$ module load compiler/2023.1.0 mkl32/2023.2.0
werner@MZ73-LM1-000:~$ cat hello.c
#include <stdio.h>
int main() {
printf("Hello, World!\n");
return 0;
}
werner@MZ73-LM1-000:~$ icc -o hello hello.c
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
In file included from /usr/include/features.h(510),
from /usr/include/x86_64-linux-gnu/bits/libc-header-start.h(33),
from /usr/include/stdio.h(27),
from hello.c(1):
/usr/include/x86_64-linux-gnu/gnu/stubs.h(7): catastrophic error: cannot open source file "/usr/include/x86_64-linux-gnu/gnu/stubs.h"
# include <gnu/stubs-32.h>
^
compilation aborted for hello.c (code 4)
werner@MZ73-LM1-000:~$ icc -c hello.c
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
In file included from /usr/include/features.h(510),
from /usr/include/x86_64-linux-gnu/bits/libc-header-start.h(33),
from /usr/include/stdio.h(27),
from hello.c(1):
/usr/include/x86_64-linux-gnu/gnu/stubs.h(7): catastrophic error: cannot open source file "/usr/include/x86_64-linux-gnu/gnu/stubs.h"
# include <gnu/stubs-32.h>
^
compilation aborted for hello.c (code 4)
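A hedged aside on the error itself: `gnu/stubs.h` dispatches to `gnu/stubs-32.h` when the compiler targets 32-bit, and that header exists only when 32-bit glibc development files are installed. A quick probe:

```shell
# Probe for the 32-bit glibc stubs header whose absence produces the error
# above. Whether it is present depends on distro packaging (e.g. gcc-multilib
# on Debian/Ubuntu); in this thread the real fix was a cleaner module set,
# not installing 32-bit headers.
if ls /usr/include/*/gnu/stubs-32.h /usr/include/gnu/stubs-32.h >/dev/null 2>&1; then
  echo "32-bit stubs header present"
else
  echo "32-bit stubs header missing"
fi
```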
Got it. The problem is triggered by the incomplete or erroneous module file settings shipped with Intel oneAPI.
Using the following module file that I created myself, everything runs smoothly:
werner@MZ73-LM1-000:~$ module load oneapi/2023.2.0
werner@MZ73-LM1-000:~$ cat hello.c
#include <stdio.h>
int main() {
printf("Hello, World!\n");
return 0;
}
werner@MZ73-LM1-000:~$ icc -o hello hello.c
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
werner@MZ73-LM1-000:~$ ./hello
Hello, World!
werner@MZ73-LM1-000:~$ icc -c hello.c
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
werner@MZ73-LM1-000:~$ file hello.o
hello.o: ELF 64-bit LSB relocatable, x86-64, version 1 (GNU/Linux), not stripped
werner@MZ73-LM1-000:~$ cd ~/Public/repo/github.com/open-mpi/ompi.git/build
werner@MZ73-LM1-000:~/Public/repo/github.com/open-mpi/ompi.git/build$ CC=icc CXX=icpc FC=ifort ../configure
But my module file above also loads Intel MPI, so it would be better to create a more granular module file that does not load Intel MPI.
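A compiler-only modulefile along those lines might look like the sketch below. The paths are copied from the oneAPI 2023.2.0 layout quoted elsewhere in this thread; treat it as a starting point under those assumptions, not a complete replacement for Intel's own files.

```tcl
#%Module
# Compiler-only sketch: exposes icc/icpc/ifort from oneAPI 2023.2.0 without
# touching the mpi/2021.10.0 tree, so Open MPI's own wrappers are not shadowed.
# Paths assumed from this thread's install under /opt/intel/oneapi; adjust.
set root /opt/intel/oneapi/2023.2.0
setenv CMPLR_ROOT $root/compiler/2023.2.0
prepend-path PATH $root/compiler/2023.2.0/linux/bin/intel64
prepend-path PATH $root/compiler/2023.2.0/linux/bin
prepend-path LD_LIBRARY_PATH $root/compiler/2023.2.0/linux/compiler/lib/intel64_lin
prepend-path LD_LIBRARY_PATH $root/compiler/2023.2.0/linux/lib
```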
Below are the contents of the related module files:
werner@MZ73-LM1-000:~/Public/repo/github.com/open-mpi/ompi.git/build$ module show compiler/2023.1.0 mkl32/2023.2.0
------------------------------------------------------------------------------------------------------------------------------------------------------------------
/opt/intel/oneapi/2023.1.0/modulefiles/compiler/2023.1.0:
------------------------------------------------------------------------------------------------------------------------------------------------------------------
conflict("compiler32")
conflict("compiler")
whatis("Configure for use with Intel 64-bit compiler(s).")
unsetenv("INTEL_TARGET_ARCH_IA32","")
setenv("CMPLR_ROOT","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0")
load("tbb")
load("compiler-rt")
try_load("oclfpga")
LmodMsgRaw(" Load "debugger" to debug DPC++ applications with the gdb-oneapi debugger.
")
LmodMsgRaw(" Load "dpl" for additional DPC++ APIs: https://github.com/oneapi-src/oneDPL
")
append_path("OCL_ICD_FILENAMES","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0/linux/lib/x64/libintelocl.so")
prepend_path("PATH","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0/linux/bin")
prepend_path("PATH","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0/linux/bin/intel64")
append_path("MANPATH","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0/documentation/en/man/common")
prepend_path("CMAKE_PREFIX_PATH","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0/linux/IntelDPCPP")
prepend_path("NLSPATH","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0/linux/compiler/lib/intel64_lin/locale/%l_%t/%N")
prepend_path("DIAGUTIL_PATH","/opt/intel/oneapi/2023.1.0/compiler/2023.1.0/sys_check/sys_check.sh")
------------------------------------------------------------------------------------------------------------------------------------------------------------------
/opt/intel/oneapi/2023.2.0/modulefiles/mkl32/2023.2.0:
------------------------------------------------------------------------------------------------------------------------------------------------------------------
conflict("mkl32")
conflict("mkl")
whatis("Intel(R) oneAPI Math Kernel Library (oneMKL) IA-32 architecture")
load("tbb32")
load("compiler-rt32")
setenv("MKLROOT","/opt/intel/oneapi/2023.2.0/mkl/2023.2.0")
prepend_path("LD_LIBRARY_PATH","/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/ia32")
prepend_path("LIBRARY_PATH","/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/ia32")
prepend_path("CPATH","/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/include")
prepend_path("PKG_CONFIG_PATH","/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/pkgconfig")
prepend_path("NLSPATH","/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/ia32/locale/%l_%t/%N")
werner@MZ73-LM1-000:~/Public/repo/github.com/open-mpi/ompi.git/build$ cat /home/werner/Public/repo/github.com/TACC/modulefiles/toolchains/oneapi/2023.2.0
#%Module
setenv ACL_BOARD_VENDOR_PATH {/opt/Intel/OpenCLFPGA/oneAPI/Boards}
setenv ADVISOR_2023_DIR {/opt/intel/oneapi/2023.2.0/advisor/2023.2.0}
setenv APM {/opt/intel/oneapi/2023.2.0/advisor/2023.2.0/perfmodels}
setenv CCL_CONFIGURATION {cpu_gpu_dpcpp}
setenv CCL_ROOT {/opt/intel/oneapi/2023.2.0/ccl/2021.10.0}
setenv CLASSPATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0//lib/mpi.jar:/opt/intel/oneapi/2023.2.0/dal/2023.2.0/lib/onedal.jar}
setenv CMAKE_PREFIX_PATH {/opt/intel/oneapi/2023.2.0/tbb/2021.10.0/env/..:/opt/intel/oneapi/2023.2.0/ipp/2021.9.0/lib/cmake/ipp:/opt/intel/oneapi/2023.2.0/dnnl/2023.2.0/cpu_dpcpp_gpu_dpcpp/../lib/cmake:/opt/intel/oneapi/2023.2.0/dal/2023.2.0:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/IntelDPCPP:/opt/intel/oneapi/2023.2.0/ccl/2021.10.0/lib/cmake/oneCCL}
setenv CMPLR_ROOT {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0}
setenv CPATH {/opt/intel/oneapi/2023.2.0/tbb/2021.10.0/env/../include:/opt/intel/oneapi/2023.2.0/mpi/2021.10.0//include:/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/include:/opt/intel/oneapi/2023.2.0/ippcp/2021.8.0/include:/opt/intel/oneapi/2023.2.0/ipp/2021.9.0/include:/opt/intel/oneapi/2023.2.0/dpl/2022.2.0/linux/include:/opt/intel/oneapi/2023.2.0/dpcpp-ct/2023.2.0/include:/opt/intel/oneapi/2023.2.0/dnnl/2023.2.0/cpu_dpcpp_gpu_dpcpp/include:/opt/intel/oneapi/2023.2.0/dev-utilities/2021.10.0/include:/opt/intel/oneapi/2023.2.0/dal/2023.2.0/include:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib/oclfpga/include:/opt/intel/oneapi/2023.2.0/ccl/2021.10.0/include/cpu_gpu_dpcpp}
setenv DAALROOT {/opt/intel/oneapi/2023.2.0/dal/2023.2.0}
setenv DALROOT {/opt/intel/oneapi/2023.2.0/dal/2023.2.0}
setenv DAL_MAJOR_BINARY {1}
setenv DAL_MINOR_BINARY {1}
setenv DIAGUTIL_PATH {/opt/intel/oneapi/2023.2.0/vtune/2023.2.0/sys_check/vtune_sys_check.py:/opt/intel/oneapi/2023.2.0/dpcpp-ct/2023.2.0/sys_check/sys_check.sh:/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/sys_check/debugger_sys_check.py:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/sys_check/sys_check.sh:/opt/intel/oneapi/2023.2.0/advisor/2023.2.0/sys_check/advisor_sys_check.py}
setenv DNNLROOT {/opt/intel/oneapi/2023.2.0/dnnl/2023.2.0/cpu_dpcpp_gpu_dpcpp}
setenv DPCT_BUNDLE_ROOT {/opt/intel/oneapi/2023.2.0/dpcpp-ct/2023.2.0}
setenv DPL_ROOT {/opt/intel/oneapi/2023.2.0/dpl/2022.2.0}
setenv FI_PROVIDER_PATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0//libfabric/lib/prov:/usr/lib/x86_64-linux-gnu/libfabric}
setenv FPGA_VARS_DIR {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib/oclfpga}
setenv GDB_INFO {/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/documentation/info/}
setenv INFOPATH {/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/gdb/intel64/lib}
setenv INSPECTOR_2023_DIR {/opt/intel/oneapi/2023.2.0/inspector/2023.2.0}
setenv INTELFPGAOCLSDKROOT {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib/oclfpga}
setenv INTEL_LICENSE_FILE {/opt/intel/licenses:/home/werner/intel/licenses}
setenv IPPCP_TARGET_ARCH {intel64}
setenv IPPCRYPTOROOT {/opt/intel/oneapi/2023.2.0/ippcp/2021.8.0}
setenv IPPROOT {/opt/intel/oneapi/2023.2.0/ipp/2021.9.0}
setenv IPP_TARGET_ARCH {intel64}
setenv I_MPI_ROOT {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/ccl/2021.10.0/lib/cpu_gpu_dpcpp}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/compiler/lib/intel64_lin}
prepend-path LD_LIBRARY_PATH {/home/werner/Public/repo/github.com/SCM-NV/ftl.git/install/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/ccl/2021.10.0/lib/cpu_gpu_dpcpp}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/compiler/lib/intel64_lin}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib/oclfpga/host/linux64/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib/x64}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/dal/2023.2.0/lib/intel64}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/dep/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/libipt/intel64/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/gdb/intel64/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/dnnl/2023.2.0/cpu_dpcpp_gpu_dpcpp/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/ipp/2021.9.0/lib/intel64}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/ippcp/2021.8.0/lib/intel64}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/itac/2021.10.0/slib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/intel64}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0/lib/release}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0/libfabric/lib}
prepend-path LD_LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/tbb/2021.10.0/env/../lib/intel64/gcc4.8}
setenv LIBRARY_PATH {/opt/intel/oneapi/2023.2.0/tbb/2021.10.0/env/../lib/intel64/gcc4.8:/opt/intel/oneapi/2023.2.0/mpi/2021.10.0//libfabric/lib:/opt/intel/oneapi/2023.2.0/mpi/2021.10.0//lib/release:/opt/intel/oneapi/2023.2.0/mpi/2021.10.0//lib:/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/intel64:/opt/intel/oneapi/2023.2.0/ippcp/2021.8.0/lib/intel64:/opt/intel/oneapi/2023.2.0/ipp/2021.9.0/lib/intel64:/opt/intel/oneapi/2023.2.0/dnnl/2023.2.0/cpu_dpcpp_gpu_dpcpp/lib:/opt/intel/oneapi/2023.2.0/dal/2023.2.0/lib/intel64:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/compiler/lib/intel64_lin:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib:/opt/intel/oneapi/2023.2.0/ccl/2021.10.0/lib/cpu_gpu_dpcpp}
setenv MANPATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0/man:/opt/intel/oneapi/2023.2.0/itac/2021.10.0/man:/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/documentation/man:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/documentation/en/man/common}
setenv MKLROOT {/opt/intel/oneapi/2023.2.0/mkl/2023.2.0}
setenv NLSPATH {/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/intel64/locale/%l_%t/%N:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/compiler/lib/intel64_lin/locale/%l_%t/%N}
setenv OCL_ICD_FILENAMES {libintelocl_emu.so:libalteracl.so:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib/x64/libintelocl.so}
setenv ONEAPI_ROOT {/opt/intel/oneapi/2023.2.0}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/advisor/2023.2.0/bin64}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/bin/intel64}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/linux/lib/oclfpga/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/debugger/2023.2.0/gdb/intel64/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/dev-utilities/2021.10.0/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/dpcpp-ct/2023.2.0/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/inspector/2023.2.0/bin64}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/itac/2021.10.0/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/bin/intel64}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/mpi/2021.10.0/libfabric/bin}
prepend-path PATH {/opt/intel/oneapi/2023.2.0/vtune/2023.2.0/bin64}
setenv PKG_CONFIG_PATH {/opt/intel/oneapi/2023.2.0/vtune/2023.2.0/include/pkgconfig/lib64:/opt/intel/oneapi/2023.2.0/tbb/2021.10.0/env/../lib/pkgconfig:/opt/intel/oneapi/2023.2.0/mpi/2021.10.0/lib/pkgconfig:/opt/intel/oneapi/2023.2.0/mkl/2023.2.0/lib/pkgconfig:/opt/intel/oneapi/2023.2.0/ippcp/2021.8.0/lib/pkgconfig:/opt/intel/oneapi/2023.2.0/inspector/2023.2.0/include/pkgconfig/lib64:/opt/intel/oneapi/2023.2.0/dpl/2022.2.0/lib/pkgconfig:/opt/intel/oneapi/2023.2.0/dnnl/2023.2.0/cpu_dpcpp_gpu_dpcpp/../lib/pkgconfig:/opt/intel/oneapi/2023.2.0/dal/2023.2.0/lib/pkgconfig:/opt/intel/oneapi/2023.2.0/compiler/2023.2.0/lib/pkgconfig:/opt/intel/oneapi/2023.2.0/ccl/2021.10.0/lib/pkgconfig:/opt/intel/oneapi/2023.2.0/advisor/2023.2.0/include/pkgconfig/lib64}
setenv SETVARS_COMPLETED {1}
setenv TBBROOT {/opt/intel/oneapi/2023.2.0/tbb/2021.10.0/env/..}
setenv VTUNE_PROFILER_2023_DIR {/opt/intel/oneapi/2023.2.0/vtune/2023.2.0}
setenv VTUNE_PROFILER_DIR {/opt/intel/oneapi/2023.2.0/vtune/2023.2.0}
setenv VT_ADD_LIBS {-ldwarf -lelf -lvtunwind -lm -lpthread}
setenv VT_LIB_DIR {/opt/intel/oneapi/2023.2.0/itac/2021.10.0/lib}
setenv VT_MPI {impi4}
setenv VT_ROOT {/opt/intel/oneapi/2023.2.0/itac/2021.10.0}
setenv VT_SLIB_DIR {/opt/intel/oneapi/2023.2.0/itac/2021.10.0/slib}
Looks like the original reporting user has resolved the compiler issue in their environment. Closing the issue.