Hi Carson,

It looks like there may be a locking issue with the datastore index log in MAKER 2.25/openmpi 1.4.5. I noticed this when running 8 MPI maker instances, each with 32 nodes.
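For reference, each instance is launched in its own working directory (with its own control files) along these lines; the directory name and the exact mpiexec flags below are placeholders for my actual scheduler setup, and I can send the real command lines if useful:

# one of the eight concurrent runs; ctl files live in the working directory
cd run1/
mpiexec -n 32 maker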
Examples from the log:

scaffold1001.1 genome_datastore/93/A6/scaffold1001.1/ FINISHED
scaffold1002.1 genome_datastore/72/43/scaffold1002.1/ FINISHED
scaffold1003.1 genome_datastore/B8/05/scaffold1003.1/ FINISHED

...

scaffold10085.1 genome_datastore/1C/7E/scaffold10085.1/ FINISHED
scaffold8265.1 genome_datastore/01/E4/scaffold8265.1/ FINISHED
D
scaffold8295.1 genome_datastore/63/13/scaffold8295.1/ FINISHED

...

scaffold8351.1 genome_datastore/27/52/scaffold8351.1/ FINISHED
scaffold8343.1 genome_datastore/BF/31/scaffold8343.1/ FINISHED
scaffold10167.1 genome_datastore/0B/9A/scaffold10167.1/ FINISHEscaffold10170.1 genome_datastore/F4/FF/scaffold10170.1/ FINISHED
scaffold10209.1 genome_datastore/2D/AA/scaffold10209.1/ FINISHEscaffold10072.1 genome_datastore/E0/A5/scaffold10072.1/ FINISHED
scaffold10113.1 genome_datastore/00/23/scaffold10113.1/ FINISHED

I see this even when running a single MPI instance on 32 nodes, when no actual processing is required apart from marking the scaffolds FINISHED. Comparing the result to a single non-MPI maker instance running on the same completed hierarchy reveals that many entries aren't being written to the log at all when running under MPI. The single-process instance runs just fine, generating a complete log that can be used for the downstream scripts.
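For the comparison I'm simply sorting the two master datastore index logs and diffing them, roughly as follows (the run directory names are placeholders for my actual layout):

# entries present in the complete serial log but missing from the MPI log
sort mpi_run/genome.maker.output/genome_master_datastore_index.log > mpi.sorted
sort serial_run/genome.maker.output/genome_master_datastore_index.log > serial.sorted
comm -13 mpi.sorted serial.sorted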
<div><font face="arial, helvetica, sans-serif"><br></font></div><div style="font-family:'courier new',monospace"><span style="font-family:arial,helvetica,sans-serif">Between runs, I execute a</span></div></div><div>
<font face="arial, helvetica, sans-serif"><br></font></div><div><font face="arial, helvetica, sans-serif">find genome.maker.output/ -name .NFSLock* -type f -print0 | xargs -0 rm &</font></div><div><font face="arial, helvetica, sans-serif"><br>
</font></div><div><font face="arial, helvetica, sans-serif">to be sure lingering lock files from badly exiting processes weren't interfering.</font></div><div><font face="arial, helvetica, sans-serif"><br></font></div>
<div><font face="arial, helvetica, sans-serif">This looks like the sort of thing that may be difficult to track down, and there's a clear workaround, but I'm happy to provide more information if you'd like to debug it.</font></div>
Thanks,
Evan