Regression
projectroot.teshsuite.smpi.tesh_smpi_gh_139_thread (from CTest)
Failing for the past 1 build (since the last success)
Stacktrace
[Tesh/INFO] change directory to /builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/build/teshsuite/smpi/gh-139
Ignore all cruft seen on SimGrid's continuous integration servers
Test suite '/builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/teshsuite/smpi/gh-139/gh-139.tesh'
[Tesh/INFO] setenv VALGRIND_NO_LEAK_CHECK=--leak-check=no --show-leak-kinds=none
[Tesh/INFO] setenv VALGRIND_NO_TRACE_CHILDREN=--trace-children=no
[Tesh/INFO] setenv platfdir=/builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/examples/platforms
[Tesh/INFO] setenv bindir=/builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/build/teshsuite/smpi/gh-139
[gh-139.tesh:2] /builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/tools/cmake/scripts/my_valgrind.pl --trace-children=yes --trace-children-skip=/usr/bin/*,/bin/* --leak-check=full --show-reachable=yes --track-origins=no --read-var-info=no --num-callers=20 --suppressions=/builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/tools/simgrid.supp --xml=yes --xml-file=memcheck_test_%p.memcheck --child-silent-after-fork=yes /builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/build/teshsuite/smpi/gh-139/../../../smpi_script/bin/smpirun -np 2 -platform /builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/examples/platforms/small_platform.xml -hostfile ../hostfile /builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/build/teshsuite/smpi/gh-139/gh-139 --cfg=smpi/simulate-computation:no --log=smpi_config.thres:warning --log=xbt_cfg.thres:warning --cfg=contexts/factory:thread
--- expected
+++ obtained
@@ -1,14 +1,2 @@
-[Jupiter:1:(2) 0.000000] [smpi_test/INFO] I'm 1/2
-[Jupiter:1:(2) 2.000000] [smpi_test/INFO] finally 42
-[Jupiter:wait recv:(4) 0.000000] [smpi_test/INFO] 1 has MPI rank 1 and global variable rank 1
-[Jupiter:wait recv:(4) 0.000000] [smpi_test/INFO] 1 waiting request
-[Jupiter:wait recv:(4) 0.000000] [smpi_test/INFO] new thread has parameter rank 1 and global variable rank 1
-[Jupiter:wait recv:(4) 0.002948] [smpi_test/INFO] 1 request done, return MPI_SUCCESS
-[Jupiter:wait recv:(4) 0.002948] [smpi_test/INFO] 1 still has MPI rank 1 and global variable 1
-[Tremblay:0:(1) 0.000000] [smpi_test/INFO] I'm 0/2
-[Tremblay:0:(1) 1.000000] [smpi_test/INFO] finally 42
-[Tremblay:wait send:(3) 0.000000] [smpi_test/INFO] 0 has MPI rank 0 and global variable rank 0
-[Tremblay:wait send:(3) 0.000000] [smpi_test/INFO] 0 request done, return MPI_SUCCESS
-[Tremblay:wait send:(3) 0.000000] [smpi_test/INFO] 0 still has MPI rank 0 and global variable 0
-[Tremblay:wait send:(3) 0.000000] [smpi_test/INFO] 0 waiting request
-[Tremblay:wait send:(3) 0.000000] [smpi_test/INFO] new thread has parameter rank 0 and global variable rank 0
+/builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/build/lib/simgrid/smpimain: error while loading shared libraries: libns3-csma.so.40: cannot open shared object file: No such file or directory
+Execution failed with code 127.
Output of <gh-139.tesh:2> mismatch:
Unsorted observed output:
  /builds/workspace/SimGrid-DynamicAnalysis-Valgrind/label/DynamicAnalysis/build/lib/simgrid/smpimain: error while loading shared libraries: libns3-csma.so.40: cannot open shared object file: No such file or directory
  Execution failed with code 127.
Test suite `gh-139.tesh': NOK (<gh-139.tesh:2> output mismatch)
In addition, <gh-139.tesh:2> returned code 127.
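Note on the failure mode: the obtained output is a dynamic-loader error rather than a test-logic regression. `smpimain` cannot resolve `libns3-csma.so.40`, so the binary never starts, and tesh sees exit code 127, the conventional "command could not be found/run" status. A minimal sketch of that exit-code convention (the `/nonexistent/smpimain` path below is hypothetical, not from this build):

```shell
# Attempting to run a binary that cannot be loaded yields status 127,
# the same code tesh reported for smpimain above.
sh -c '/nonexistent/smpimain' 2>/dev/null
echo "exit code: $?"
```

A missing-library case like this one is typically diagnosed with `ldd` on the failing binary (looking for "not found" entries) and fixed by rebuilding against the installed ns-3 version or extending `LD_LIBRARY_PATH` on the CI worker.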