Tue Mar 9 15:37:24 PST 2010
Job id = 11005
dawndev5
PARTITION  AVAIL  TIMELIMIT  NODES  STATE  BP_LIST
pdebug*       up    2:00:00     16    err  dawndev000
pdebug*       up    2:00:00     16  alloc  dawndev000
pdebug*       up    2:00:00    992   idle  dawndev[000x001]
JOBID  PARTITION  NAME      USER    ST  TIME  NODES  BP_LIST(REASON)
11005  pdebug     moab.job  blaise  R   0:03     16  dawndev000[8]
/g/g0/blaise/bgp
FE_MPI (Info) : initialize() - using jobname '' provided by scheduler interface
FE_MPI (Info) : Invoking mpirun backend
FE_MPI (Info) : connectToServer() - Handshake successful
BRIDGE (Info) : rm_set_serial() - The machine serial number (alias) is BGP
FE_MPI (Info) : Preparing partition
BE_MPI (Info) : Examining specified partition
BE_MPI (Info) : Checking partition RMP05Ma130421237 initial state ...
BE_MPI (Info) : Partition RMP05Ma130421237 initial state = READY ('I')
BE_MPI (Info) : Checking partition owner...
BE_MPI (Info) : partition RMP05Ma130421237 owner is 'slurm'
BE_MPI (Info) : Partition owner (slurm) differs from the current user (blaise)
BE_MPI (Info) : Checking if the user is in the partition users list...
BE_MPI (Info) : Current user is in the partition user list
BE_MPI (Info) : Done preparing partition
FE_MPI (Info) : Adding job
BE_MPI (Info) : Adding job to database...
FE_MPI (Info) : Job added with the following id: 204440
FE_MPI (Info) : Starting job 204440
FE_MPI (Info) : Waiting for job to terminate
BE_MPI (Info) : IO - Threads initialized
BE_MPI (Info) : I/O input runner thread terminated
MASTER: Number of MPI tasks is: 16
Thread 3 of task 3 in parallel region
Thread 3 of task 0 in parallel region
Thread 3 of task 11 in parallel region
Thread 3 of task 15 in parallel region
Thread 3 of task 7 in parallel region
Thread 3 of task 6 in parallel region
Thread 0 of task 3 in parallel region
Thread 3 of task 2 in parallel region
Thread 0 of task 0 in parallel region
Thread 3 of task 9 in parallel region
Thread 0 of task 11 in parallel region
Thread 3 of task 14 in parallel region
Thread 0 of task 15 in parallel region
Thread 0 of task 7 in parallel region
Thread 0 of task 6 in parallel region
Thread 2 of task 3 in parallel region
Thread 0 of task 2 in parallel region
Thread 3 of task 1 in parallel region
Thread 2 of task 0 in parallel region
Thread 0 of task 9 in parallel region
Thread 3 of task 8 in parallel region
Thread 2 of task 11 in parallel region
Thread 0 of task 14 in parallel region
Thread 3 of task 5 in parallel region
Thread 2 of task 15 in parallel region
Thread 2 of task 7 in parallel region
Thread 2 of task 6 in parallel region
Thread 1 of task 3 in parallel region
Thread 2 of task 2 in parallel region
Thread 3 of task 13 in parallel region
Thread 0 of task 1 in parallel region
Thread 1 of task 0 in parallel region
Thread 2 of task 9 in parallel region
Thread 0 of task 8 in parallel region
Thread 1 of task 11 in parallel region
Thread 3 of task 12 in parallel region
Thread 2 of task 14 in parallel region
Thread 0 of task 5 in parallel region
Thread 1 of task 15 in parallel region
Thread 1 of task 7 in parallel region
Thread 1 of task 6 in parallel region
MPI task 3 node sum = 328350.000000 running on Rank 3 of 16 <3,0,0,0> R00-M0-N04-J31
Thread 1 of task 2 in parallel region
Thread 0 of task 13 in parallel region
Thread 2 of task 1 in parallel region
MPI task 0 node sum = 328350.000000 running on Rank 0 of 16 <0,0,0,0> R00-M0-N04-J23
Thread 1 of task 9 in parallel region
Thread 2 of task 8 in parallel region
MPI task 11 node sum = 328350.000000 running on Rank 11 of 16 <3,0,1,0> R00-M0-N04-J35
Thread 0 of task 12 in parallel region
Thread 1 of task 14 in parallel region
Thread 2 of task 5 in parallel region
MPI task 15 node sum = 328350.000000 running on Rank 15 of 16 <3,1,1,0> R00-M0-N04-J34
MPI task 7 node sum = 328350.000000 running on Rank 7 of 16 <3,1,0,0> R00-M0-N04-J30
MPI task 6 node sum = 328350.000000 running on Rank 6 of 16 <2,1,0,0> R00-M0-N04-J13
MPI task 2 node sum = 328350.000000 running on Rank 2 of 16 <2,0,0,0> R00-M0-N04-J12
Thread 2 of task 13 in parallel region
Thread 1 of task 1 in parallel region
MPI task 9 node sum = 328350.000000 running on Rank 9 of 16 <1,0,1,0> R00-M0-N04-J08
Thread 1 of task 8 in parallel region
Thread 2 of task 12 in parallel region
MPI task 14 node sum = 328350.000000 running on Rank 14 of 16 <2,1,1,0> R00-M0-N04-J17
Thread 1 of task 5 in parallel region
Thread 1 of task 13 in parallel region
MPI task 1 node sum = 328350.000000 running on Rank 1 of 16 <1,0,0,0> R00-M0-N04-J04
MPI task 8 node sum = 328350.000000 running on Rank 8 of 16 <0,0,1,0> R00-M0-N04-J27
Thread 1 of task 12 in parallel region
MPI task 5 node sum = 328350.000000 running on Rank 5 of 16 <1,1,0,0> R00-M0-N04-J05
MPI task 13 node sum = 328350.000000 running on Rank 13 of 16 <1,1,1,0> R00-M0-N04-J09
MPI task 12 node sum = 328350.000000 running on Rank 12 of 16 <0,1,1,0> R00-M0-N04-J26
Thread 3 of task 10 in parallel region
Thread 3 of task 4 in parallel region
Thread 0 of task 10 in parallel region
Thread 0 of task 4 in parallel region
Thread 2 of task 10 in parallel region
Thread 2 of task 4 in parallel region
Thread 1 of task 10 in parallel region
Thread 1 of task 4 in parallel region
MPI task 10 node sum = 328350.000000 running on Rank 10 of 16 <2,0,1,0> R00-M0-N04-J16
MPI task 4 node sum = 328350.000000 running on Rank 4 of 16 <0,1,0,0> R00-M0-N04-J22
*** Global sum = 459422.000000 ***
Done.
BE_MPI (Info) : I/O output runner thread terminated
BE_MPI (Info) : Job 204440 switched to state TERMINATED ('T')
BE_MPI (Info) : Job successfully terminated - TERMINATED ('T')
FE_MPI (Info) : Job terminated normally
FE_MPI (Info) : exit status = (0)
BE_MPI (Info) : Starting cleanup sequence
BE_MPI (Info) : cleanupDatabase() - job already terminated / hasn't been added
BE_MPI (Info) : cleanupDatabase() - Specified "nofree" option - leaving partition as is
FE_MPI (Info) : == FE completed ==
FE_MPI (Info) : == Exit status: 0 ==
Done
Tue Mar 9 15:37:31 PST 2010