[it-c05b10:00466] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00467] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00465] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00482] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00468] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00473] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00476] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00481] mca_base_component_repository_open: unable to open mca_patcher_overwrite: /nfs/apps/test/openmpi/lib/openmpi/mca_patcher_overwrite.so: undefined symbol: mca_patcher_base_patch_t_class (ignored)
[it-c05b10:00466] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00467] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00468] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00466] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00476] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00467] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00468] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00473] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00482] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00467] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00476] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00466] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00473] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00468] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00465] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00481] mca_base_component_repository_open: unable to open mca_shmem_mmap: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00482] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00476] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00473] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00482] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00481] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00465] mca_base_component_repository_open: unable to open mca_shmem_posix: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_posix.so: undefined symbol: opal_shmem_base_framework (ignored)
[it-c05b10:00481] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
[it-c05b10:00465] mca_base_component_repository_open: unable to open mca_shmem_sysv: /nfs/apps/test/openmpi/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help (ignored)
--------------------------------------------------------------------------
It looks like opal_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[it-c05b10:466] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[it-c05b10:467] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[it-c05b10:473] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[it-c05b10:476] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[it-c05b10:482] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
-------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [[30092,1],1]
  Exit code:    1
--------------------------------------------------------------------------
------------------------------------------------------------
Sender: LSF System
Subject: Job 1703425: Exited

Job was submitted from host by user in cluster .
Job was executed on host(s) , in queue , as user in cluster .
was used as the home directory.
was used as the working directory.
Started at Fri Apr 7 09:04:14 2017
Results reported at Fri Apr 7 09:04:18 2017

Your job looked like:
------------------------------------------------------------
# LSBATCH: User input
/nfs/apps/test/openmpi/bin/mpirun -n 8 /nfs/apps/test/maker/bin/maker
------------------------------------------------------------

Exited with exit code 1.

Resource usage summary:

    CPU time :      5.99 sec.
    Max Memory :    3 MB
    Max Swap :      37 MB

    Max Processes : 1
    Max Threads :   1

The output (if any) is above this job summary.