<br><br><div class="gmail_quote">---------- Forwarded message ----------<br>From: <b class="gmail_sendername">Yunfei Guo</b> <span dir="ltr"><<a href="mailto:guoyunfei1989@gmail.com">guoyunfei1989@gmail.com</a>></span><br>
Date: Thu, Jul 26, 2012 at 8:10 AM<br>Subject: Re: [maker-devel] ERROR: MPI_Recv(186), dequeue_and_set_error(596)<br>To: Carson Holt <<a href="mailto:carsonhh@gmail.com">carsonhh@gmail.com</a>><br><br><br><div>Hi Carson, same error occurred again. What should I do to check if it was caused by the same node? Also, if I ran maker on a single node instead of two nodes, will the same error appear again? Thank you.</div>
#-------------------------------#
SIGCHLD handler "DEFAULT" not defined.
Fatal error in MPI_Recv: Other MPI error, error stack:
MPI_Recv(186).............: MPI_Recv(buf=0x7fff1c3dd3b0, count=2, MPI_INT, src=MPI_ANY_SOURCE, tag=1111, MPI_COMM_WORLD, status=0x7fff1c3dd390) failed
dequeue_and_set_error(596): Communication error with rank 21
running exonerate search.
#--------- command -------------#
Widget::exonerate::protein2genome:
/home/username/usr/bin/exonerate -q /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/81/43/scaffold5780//theVoid.scaffold5780/tr%7CG3N4L5%7CG3N4L5_GASAC.for.6527-8832.2.fasta -t /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/81/43/scaffold5780//theVoid.scaffold5780/scaffold5780.6527-8832.2.fasta -Q protein -T dna -m protein2genome --softmasktarget --percent 20 --showcigar > /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/81/43/scaffold5780//theVoid.scaffold5780/scaffold5780.6527-8832.tr%7CG3N4L5%7CG3N4L5_GASAC.p_exonerate.2
#-------------------------------#
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
...

Yunfei
</font></span><div class="HOEnZb"><div class="h5"><br><div class="gmail_quote">On Wed, Jul 25, 2012 at 1:43 PM, Yunfei Guo <span dir="ltr"><<a href="mailto:guoyunfei1989@gmail.com" target="_blank">guoyunfei1989@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Thanks, Carson. Actually I already set clean_try=1.<div><div><br><br><div class="gmail_quote">
On Wed, Jul 25, 2012 at 1:34 PM, Carson Holt <span dir="ltr"><<a href="mailto:carsonhh@gmail.com" target="_blank">carsonhh@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div style="font-size:14px;font-family:Calibri,sans-serif;word-wrap:break-word"><div>That second error from 2.25, Seems to be thrown by Perl's Storable module. There may be some weird partial serialization of the data that occurred on the first failure and is now causing failures on retry. You can set clean_try=1 in MAKER, to let it wipe out data for a failed contig before retrying. That can sometimes help get around weird hard failures.</div>
Thanks,
Carson

From: Yunfei Guo <guoyunfei1989@gmail.com>
Date: Wednesday, 25 July, 2012 4:26 PM
To: Carson Holt <carsonhh@gmail.com>
Subject: Re: [maker-devel] ERROR: MPI_Recv(186), dequeue_and_set_error(596)
Thank you, Carson. I'm rerunning maker2.26 now. I just tried maker2.25 as well, and it failed this time with the similar errors below. I guess it might be caused by the cluster (or a node) itself, as you said, because we just added a few nodes and more memory. I'll ask the admin to see whether he can explain this.
#-------------------------------#
Thread 1 terminated abnormally:
------------- EXCEPTION: Bio::Root::Exception -------------
MSG: no data for midline Sequence with id BL_ORD_ID:126195 no longer exists in database...alignment skipped
STACK: Error::throw
STACK: Bio::Root::Root::throw /home/yunfeiguo/perl5/lib/perl5/Bio/Root/Root.pm:472
STACK: Bio::SearchIO::blast::next_result /home/yunfeiguo/perl5/lib/perl5/Bio/SearchIO/blast.pm:1888
STACK: Widget::tblastx::keepers /home/yunfeiguo/Downloads/maker/bin/../lib/Widget/tblastx.pm:114
STACK: Widget::tblastx::parse /home/yunfeiguo/Downloads/maker/bin/../lib/Widget/tblastx.pm:95
STACK: GI::tblastx_as_chunks /home/yunfeiguo/Downloads/maker/bin/../lib/GI.pm:2612
STACK: Process::MpiChunk::_go /home/yunfeiguo/Downloads/maker/bin/../lib/Process/MpiChunk.pm:1829
STACK: Process::MpiChunk::run /home/yunfeiguo/Downloads/maker/bin/../lib/Process/MpiChunk.pm:331
STACK: main::node_thread /home/yunfeiguo/Downloads/maker/bin/maker:1308
STACK: threads::new /home/yunfeiguo/perl5/lib/perl5/x86_64-linux-thread-multi/forks.pm:799
STACK: /home/yunfeiguo/Downloads/maker/bin/maker:804
-----------------------------------------------------------
Cannot restore overloading on HASH(0x1b7f9b60) (package Bio::Root::Exception) (even after a "require Bio::Root::Exception;") at /home/yunfeiguo/perl5/lib/perl5/x86_64-linux-thread-multi/Storable.pm line 416, at /home/yunfeiguo/perl5/lib/perl5/x86_64-linux-thread-multi/forks.pm line 2256.
Compilation failed in require at /home/yunfeiguo/Downloads/maker/bin/maker line 11.
BEGIN failed--compilation aborted at /home/yunfeiguo/Downloads/maker/bin/maker line 11.
Perl exited with active threads:
<div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 running and detached</div>
<div>deleted:0 hits</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff05c78630, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff05c78610) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff1414fb00, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff1414fae0) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff25d86c00, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff25d86be0) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>cleaning tblastx...</div><div>cleaning clusters....</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff1b71f1f0, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff1b71f1d0) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>
Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 running and detached</div>
<div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fffc99d29c0, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fffc99d29a0) failed</div><div>dequeue_and_set_error(596): Communication error with rank 0</div>
<div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 running and detached</div>
<div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fffc4aaf720, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fffc4aaf700) failed</div><div>dequeue_and_set_error(596): Communication error with rank 0</div>
<div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 running and detached</div>
<div>in cluster::shadow_cluster...</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff317862a0, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff31786280) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>...finished clustering.</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff8abb8e50, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff8abb8e30) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff1d1ff180, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff1d1ff160) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fff4d865850, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fff4d865830) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fffbec98150, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fffbec98130) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Fatal error in MPI_Recv: Other MPI error, error stack:</div><div>MPI_Recv(186).............: MPI_Recv(buf=0x7fffa4ead990, count=2, MPI_INT, src=0, tag=5555, MPI_COMM_WORLD, status=0x7fffa4ead970) failed</div>
<div>dequeue_and_set_error(596): Communication error with rank 0</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><div>Perl exited with active threads:</div><div><span style="white-space:pre-wrap"> </span>1 running and unjoined</div><div><span style="white-space:pre-wrap"> </span>0 finished and unjoined</div>
<div><span style="white-space:pre-wrap"> </span>0 running and detached</div><br><div class="gmail_quote">On Wed, Jul 25, 2012 at 12:46 PM, Carson Holt <span dir="ltr"><<a href="mailto:carsonhh@gmail.com" target="_blank">carsonhh@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div style="font-size:14px;font-family:Calibri,sans-serif;word-wrap:break-word"><div>MPI is notorious for unexplicable communication errors, so first I would suggest just restarting and seeing if it happens again (MAKER will pick up where it left off on restart, so no need to alter settings or files).</div>
<div><br></div><div>If it happens again, we can look into it, but no component of the MPI communication framework changed between 2.25 and 2.26 (100% identical), so my first instinct is that this was just what the message said, a"Communication error with rank 18". If it happens again I can try and add some extra messages so we can see the hostname of rank 18. That way we can identify if it's constantly a specific node on your cluster.</div>
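In the meantime, a quick sanity check you can run yourself is to launch one trivial process per slot and print each rank's hostname. This is only a sketch and assumes MPICH's process manager exports PMI_RANK to the launched processes (check mpiexec -help if unsure):

# run under the same allocation as the failing MAKER job so the
# rank-to-host mapping matches; rank 18 or 21 then points at a host
mpiexec -n 24 sh -c 'echo "rank ${PMI_RANK:-?} on $(hostname)"' | sort -n -k2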
Let me know if you see it again.

Thanks,
Carson

From: Yunfei Guo <guoyunfei1989@gmail.com>
Date: Wednesday, 25 July, 2012 3:15 PM
To: <maker-devel@yandell-lab.org>
Subject: [maker-devel] ERROR: MPI_Recv(186), dequeue_and_set_error(596)
Hi everyone,

I ran maker2.25 without a problem, but with maker2.26 I encountered the error below after running for ~8 hr on 2 nodes with 24 cpus. Do you have any idea what's going on here? Some contigs did get finished, so maybe this is not a big problem. My mpich2 version is 1.4.1p1, and the job scheduling system is SGE. Thanks!
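For context, here is a rough sketch of how an MPI MAKER run like this might be submitted under SGE. The parallel environment name "mpich" and the job name are placeholders for whatever the cluster actually defines; the maker path is the one that appears in the logs below:

#!/bin/bash
#$ -N maker_nigro    # hypothetical job name
#$ -cwd
#$ -pe mpich 24      # hypothetical PE name; 24 slots across the 2 nodes
# MAKER's MPI mode is started by launching one maker process per slot
mpiexec -n 24 /home/yunfeiguo/Downloads/maker/bin/maker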
running blast search.
#--------- command -------------#
Widget::blastx:
/home/yunfeiguo/Downloads/maker/bin/../exe/blast/bin/blastx -db /tmp/6480.1.all.q/maker_PQOTIq/concatPro%2Etxt.mpi.10.4 -query /tmp/6480.1.all.q/maker_PQOTIq/rank3/scaffold2602.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-06 -dbsize 300 -searchsp 500000000 -num_threads 1 -seg yes -soft_masking true -lcase_masking -show_gis -out /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/7A/37/scaffold2602//theVoid.scaffold2602/scaffold2602.0.concatPro%2Etxt.blastx.temp_dir/concatPro%2Etxt.mpi.10.4.blastx
#-------------------------------#
deleted:-1 hits
SIGCHLD handler "DEFAULT" not defined.
SIGCHLD handler "DEFAULT" not defined.
running exonerate search.
#--------- command -------------#
Widget::exonerate::protein2genome:
/home/username/usr/bin/exonerate -q /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/sp%7CQ8N8A2%7CANR44_HUMAN.for.1-3712.8.fasta -t /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/scaffold2590.1-3712.8.fasta -Q protein -T dna -m protein2genome --softmasktarget --percent 20 --showcigar > /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/scaffold2590.1-3712.sp%7CQ8N8A2%7CANR44_HUMAN.p_exonerate.8
#-------------------------------#
Fatal error in MPI_Recv: Other MPI error, error stack:
MPI_Recv(186).............: MPI_Recv(buf=0x7fffa3a2e760, count=2, MPI_INT, src=MPI_ANY_SOURCE, tag=1111, MPI_COMM_WORLD, status=0x7fffa3a2e740) failed
dequeue_and_set_error(596): Communication error with rank 18
running blast search.
#--------- command -------------#
Widget::blastx:
/home/yunfeiguo/Downloads/maker/bin/../exe/blast/bin/blastx -db /tmp/6480.1.all.q/maker_PQOTIq/concatPro%2Etxt.mpi.10.8 -query /tmp/6480.1.all.q/maker_PQOTIq/rank11/scaffold2575.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-06 -dbsize 300 -searchsp 500000000 -num_threads 1 -seg yes -soft_masking true -lcase_masking -show_gis -out /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F0/AE/scaffold2575//theVoid.scaffold2575/scaffold2575.0.concatPro%2Etxt.blastx.temp_dir/concatPro%2Etxt.mpi.10.8.blastx
#-------------------------------#
running blast search.
#--------- command -------------#
Widget::tblastx:
/home/yunfeiguo/Downloads/maker/bin/../exe/blast/bin/tblastx -db /tmp/6480.1.all.q/maker_PQOTIq/AllSebESTs_plus_Rubri%2Efasta.mpi.10.1 -query /tmp/6480.1.all.q/maker_PQOTIq/rank7/scaffold2620.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -seg yes -soft_masking true -show_gis -out /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/6B/FB/scaffold2620//theVoid.scaffold2620/scaffold2620.0.AllSebESTs_plus_Rubri%2Efasta.tblastx.temp_dir/AllSebESTs_plus_Rubri%2Efasta.mpi.10.1.tblastx
#-------------------------------#
running exonerate search.
#--------- command -------------#
Widget::exonerate::protein2genome:
/home/username/usr/bin/exonerate -q /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/sp%7CQ8NB46%7CANR52_HUMAN.for.1-3712.8.fasta -t /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/scaffold2590.1-3712.8.fasta -Q protein -T dna -m protein2genome --softmasktarget --percent 20 --showcigar > /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/scaffold2590.1-3712.sp%7CQ8NB46%7CANR52_HUMAN.p_exonerate.8
#-------------------------------#
cleaning blastx...
in cluster::shadow_cluster...
...finished clustering.
cleaning clusters....
total clusters:1 now processing 0
 ...processing 0 of 2
deleted:0 hits
 ...processing 1 of 2
running blast search.
#--------- command -------------#
Widget::tblastx:
/home/yunfeiguo/Downloads/maker/bin/../exe/blast/bin/tblastx -db /tmp/6480.1.all.q/maker_PQOTIq/AllSebESTs_plus_Rubri%2Efasta.mpi.10.6 -query /tmp/6480.1.all.q/maker_PQOTIq/rank9/scaffold2615.0 -num_alignments 10000 -num_descriptions 10000 -evalue 1e-10 -dbsize 1000 -searchsp 500000000 -num_threads 1 -lcase_masking -seg yes -soft_masking true -show_gis -out /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/E2/6E/scaffold2615//theVoid.scaffold2615/scaffold2615.0.AllSebESTs_plus_Rubri%2Efasta.tblastx.temp_dir/AllSebESTs_plus_Rubri%2Efasta.mpi.10.6.tblastx
#-------------------------------#
deleted:0 hits
running exonerate search.
#--------- command -------------#
Widget::exonerate::protein2genome:
/home/username/usr/bin/exonerate -q /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/tr%7CE7F7S0%7CE7F7S0_DANRE.for.1-3712.9.fasta -t /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/scaffold2590.1-3712.9.fasta -Q protein -T dna -m protein2genome --softmasktarget --percent 20 --showcigar > /home/yunfeiguo/projects/fish/Nigro/run/dir_Nigro-53k00/Nigro-53k_part.maker.output/Nigro-53k_part_datastore/F9/9B/scaffold2590//theVoid.scaffold2590/scaffold2590.1-3712.tr%7CE7F7S0%7CE7F7S0_DANRE.p_exonerate.9
#-------------------------------#
deleted:0 hits
cleaning blastx...
cleaning clusters....
total clusters:1 now processing 0
cleaning clusters....
total clusters:1 now processing 0
deleted:-1 hits
deleted:-1 hits
deleted:-6 hits
deleted:-3 hits
deleted:-2 hits
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached
Perl exited with active threads:
    1 running and unjoined
    0 finished and unjoined
    0 running and detached

Yunfei
_______________________________________________
maker-devel mailing list
maker-devel@box290.bluehost.com
http://box290.bluehost.com/mailman/listinfo/maker-devel_yandell-lab.org