Console Output

Skipping 20 KB..
- build a forest with a single item and retrieve data
- build a forest with data from a single contig and retrieve data
- build a forest with data from multiple contigs and retrieve data
2020-02-11 14:05:44 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-02-11 14:05:45 WARN  Utils:66 - Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
2020-02-11 14:05:45 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
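
[Editor's note: the two Utils warnings above mean Spark guessed a non-loopback interface on its own. A minimal sketch of pinning the address at context creation; the address is copied from the warning, and spark.driver.bindAddress assumes Spark 2.1+. Exporting SPARK_LOCAL_IP in the launching shell, as the warning suggests, is the equivalent fix outside the code.]

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: bind the driver explicitly instead of relying on
    // hostname resolution (address taken from the warning above).
    val conf = new SparkConf()
      .setMaster("local[4]")
      .setAppName("bind-address-sketch")
      .set("spark.driver.bindAddress", "192.168.10.31")
    val sc = new SparkContext(conf)
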
- build a forest out of data on a single contig and retrieve data
- run a join between data on a single contig
HardLimiterSuite:
- add a read to an empty buffer
- add a read to a non-empty buffer, without moving forward
- add a read to a non-empty buffer, and move forward
- trying to add a read to a full buffer, without moving forward, drops the read
- add a read to a full buffer, while moving forward and keeping buffer full
- add a read to a full buffer, while moving forward and emptying buffer
- adding an out of order read should fire an assert
- adding a read that is on the wrong contig should fire an assert
- apply hard limiting to an iterator that is wholly under the coverage limit
- apply hard limiting to an iterator that is partially under the coverage limit
- apply hard limiting to an iterator that is wholly over the coverage limit
- apply hard limiting on a file that is wholly under the coverage limit
- apply hard limiting on a file with sections over the coverage limit
VariantSummarySuite:
- create from genotype without strand bias components
- create from genotype with strand bias components
- invalid strand bias causes exception
- merge two fully populated summaries
- merge two partially populated summaries
- populating an annotation should carry old fields
RewriteHetsSuite:
- should rewrite a bad het snp
- should not rewrite het snp if snp filtering is disabled
- should rewrite a bad het indel
- should not rewrite het indel if indel filtering is disabled
- don't rewrite good het calls
- don't rewrite homozygous calls
- rewrite a het call as a hom alt snp
- processing a valid call should not change the call
- if processing is disabled, don't rewrite bad calls
- process a bad het snp call
- process a bad het indel call
- disable processing for a whole rdd
- process a whole rdd
RealignerSuite:
- realignment candidate code needs at least one block
- read is not a realignment candidate if it is canonical
- read is not a realignment candidate if it is canonical and clipped
- read is a realignment candidate if there is at least one non-canonical block
- realign an indel that is not left normalized
- realign a mnp expressed as a complex indel
- realign two snps expressed as a complex indel
- align sequence with a complex deletion
- realign a read with a complex deletion
- realign a read with a snp and deletion separated by a flank
- realigning a repetitive read will fire an assert
- realign a set of reads around an insert
- realign a set of reads around a deletion
2020-02-11 14:05:54 WARN  Realigner:101 - Realigning A_READ failed with exception java.lang.AssertionError: assertion failed: Input sequence contains a repeat..
- realigning a read with a repeat will return the original read
- one sample read should fail due to a repeat, all others should realign
HardFilterGenotypesSuite:
- filter out reference calls
- filter out low quality calls
- filter out genotypes for emission
- filter out genotypes with a low quality per depth
- filter out genotypes with a low depth
- filter out genotypes with a high depth
- filter out genotypes with a low RMS mapping quality
- filter out genotypes with a high strand bias
- update genotype where no filters were applied
- update genotype where filters were applied and passed
- update genotype where filters were applied and failed
- discard a ref genotype call
- keep a ref genotype call
- discard a genotype whose quality is too low
- build filters and apply to snp
- build filters and apply to indel
- test adding filters
- filter out genotypes with a low allelic fraction
- filter out genotypes with a high allelic fraction
TrioCallerSuite:
- cannot have a sample with no record groups
- cannot have a sample with discordant sample ids
- extract id from a single read group
- extract id from multiple read groups
- filter an empty site
- filter a site with only ref calls
- keep a site with a non-ref call
- fill in no-calls for site with missing parents
- pass through site with odd copy number
- confirm call at site where proband and parents are consistent and phase
- confirm call at site where proband and parents are consistent but cannot phase
- invalidate call at site where proband and parents are inconsistent
- end-to-end trio call test
BlockSuite:
- folding over a match block returns a match operator
- an unknown block must have mismatching input sequences
- folding over an unknown block returns our function's result
AlignerSuite:
- aligning a repetitive sequence will fire an assert
- align a minimally flanked sequence with a snp
- align a minimally flanked sequence with a 3 bp mnp
- align a minimally flanked sequence with 2 snps separated by 1bp
- align a minimally flanked sequence with 2 snps separated by 3bp
- align a minimally flanked sequence with a simple insert
- align a minimally flanked sequence with a complex insert
- align a minimally flanked sequence with a simple deletion
- align a minimally flanked sequence that contains a discordant k-mer pair
- align a minimally flanked sequence with a complex deletion
- align a minimally flanked sequence with 2 snps separated by two matching k-mers
- align a minimally flanked sequence with a snp and an indel separated by one matching k-mer
- zip and trim short insert
- zip and trim short deletion
- cut up a sequence that is longer than the k-mer length
- cutting up a sequence that is shorter than the k-mer length yields an empty map
- cutting up a repeated sequence throws an assert
- get no indices if we have no intersection
- get correct index for a single intersection
- get correct indices for two k-mers in a row
- get correct indices for two k-mers separated by a snp
- get correct indices for two k-mers separated by an indel
- get correct indices for two k-mers whose positions are flipped
- fire assert when cutting up repetitive reads
- fire assert when checking negative index pair
- a set of a single index pair is concordant
- a set with a pair of index pairs is concordant
- a set with multiple good index pairs is concordant
- a set with a pair of swapped index pairs is discordant
- a set with a pair of both con/discordant index pairs is discordant
- making blocks from no indices returns a single unknown block
- make blocks from a single match between two snps
- make blocks from three matches between two snps
- make blocks from three matches between two indels, opposite events
- make blocks from three matches between two indels, same events
- make blocks from matches between snp/indel/snp
BiallelicGenotyperSuite:
- properly handle haploid genotype state
- properly handle diploid genotype state with het call
- properly handle triploid genotype state with hom alt call
- scoring read that overlaps no variants should return empty observations in variant only mode
- scoring read that overlaps no variants should return empty observations
2020-02-11 14:06:03 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (4472 KB). The maximum recommended task size is 100 KB.
2020-02-11 14:06:04 WARN  Utils:66 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
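
[Editor's note: the two warnings above recur throughout this run. A sketch under two assumptions: the oversized tasks come from a large object captured in a task closure, which sc.broadcast avoids serializing into every task, and the plan-string truncation is widened via the standard Spark 2.x knob spark.debug.maxToStringFields. The lookup table below is hypothetical.]

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local[4]")
      .setAppName("task-size-sketch")
      .set("spark.debug.maxToStringFields", "200") // widen plan-string truncation
    val sc = new SparkContext(conf)

    // Hypothetical large reference table: broadcasting ships it to each
    // executor once instead of embedding it in every task closure.
    val reference: Map[String, String] = Map("1" -> "ACGTACGT")
    val refBc = sc.broadcast(reference)
    val bases = sc.parallelize(Seq("1")).map(contig => refBc.value.getOrElse(contig, ""))
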
- score snps in a read overlapping a copy number dup boundary
2020-02-11 14:06:28 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (4060 KB). The maximum recommended task size is 100 KB.
- score snps in a read overlapping a copy number del boundary
- score snp in a read with no evidence of the snp
- score snp in a read with evidence of the snp
- score snp in a read with evidence of the snp, and non-variant bases
- build genotype for het snp
- force call possible STR/indel !!! IGNORED !!!
- log space factorial
- fisher test for strand bias
2020-02-11 14:06:51 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:07:00 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
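
[Editor's note: the BiallelicGenotyper warning above asks for exactly this. A minimal sketch, assuming a live SparkContext sc and a hypothetical input path; caching the input RDD before repeated genotyping passes keeps Spark from recomputing it per stage.]

    import org.apache.spark.storage.StorageLevel

    val reads = sc.textFile("reads.txt")     // hypothetical input
    reads.persist(StorageLevel.MEMORY_ONLY)  // equivalently, reads.cache()
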
- discover and call simple SNP
2020-02-11 14:08:27 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:08:36 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- discover and call simple SNP and score all sites
2020-02-11 14:10:05 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:10:14 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- discover and call short indel
2020-02-11 14:11:48 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:11:57 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
2020-02-11 14:12:13 ERROR BiallelicGenotyper:387 - Processing read H06JUADXX130110:1:1109:10925:52628 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2020-02-11 14:12:13 ERROR BiallelicGenotyper:387 - Processing read H06JUADXX130110:1:1116:7369:15293 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2020-02-11 14:12:13 ERROR BiallelicGenotyper:387 - Processing read H06HDADXX130110:2:1115:12347:40533 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2020-02-11 14:12:13 ERROR BiallelicGenotyper:387 - Processing read H06HDADXX130110:1:2110:7844:95190 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2020-02-11 14:12:13 ERROR BiallelicGenotyper:387 - Processing read H06HDADXX130110:1:2203:13041:33390 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
- discover and call het and hom snps
- score a single read covering a deletion
2020-02-11 14:13:35 WARN  TaskSetManager:66 - Stage 7 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- discover and force call hom alt deletion
2020-02-11 14:15:12 WARN  TaskSetManager:66 - Stage 7 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt AGCCAGTGGACGCCGACCT->A deletion at 1/875159
2020-02-11 14:16:41 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:16:49 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt TACACACACACACACACACACACACACACAC->T deletion at 1/1777263
2020-02-11 14:18:14 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:18:24 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt CAG->C deletion at 1/1067596
2020-02-11 14:19:52 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:20:01 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt C->G snp at 1/877715
2020-02-11 14:21:28 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:21:36 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt ACAG->A deletion at 1/886049
2020-02-11 14:23:06 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:23:13 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt GA->CC mnp at 1/889158–9
2020-02-11 14:24:39 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:24:48 WARN  TaskSetManager:66 - Stage 14 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt C->CCCCT insertion at 1/866511
2020-02-11 14:24:54 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:25:02 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het ATG->A deletion at 1/905130
2020-02-11 14:26:29 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:26:37 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het ATG->A deletion at 1/905130 while scoring all sites
2020-02-11 14:28:02 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:28:09 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het AG->A deletion at 1/907170
2020-02-11 14:29:35 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:29:44 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het T->G snp at 1/240898
2020-02-11 14:31:10 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:31:17 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- make het alt calls at biallelic snp locus
2020-02-11 14:32:41 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:32:49 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt T->TAAA insertion at 1/4120185
2020-02-11 14:34:15 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2020-02-11 14:34:23 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
Exception in thread "block-manager-slave-async-thread-pool-0" java.lang.OutOfMemoryError: GC overhead limit exceeded
2020-02-11 14:35:31 ERROR Utils:91 - uncaught error in thread Spark Context Cleaner, stopping SparkContext
java.lang.OutOfMemoryError: GC overhead limit exceeded
2020-02-11 14:35:31 ERROR Executor:91 - Exception in task 2.0 in stage 7.0 (TID 411)
java.lang.OutOfMemoryError: Java heap space
2020-02-11 14:35:31 ERROR Executor:91 - Exception in task 0.0 in stage 7.0 (TID 409)
java.lang.IllegalStateException: unread block data
	at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2783)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1605)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2020-02-11 14:35:31 ERROR Executor:91 - Exception in task 3.0 in stage 7.0 (TID 412)
java.lang.IllegalStateException: unread block data
	at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2783)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1605)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2020-02-11 14:35:31 ERROR Executor:91 - Exception in task 1.0 in stage 7.0 (TID 410)
java.lang.IllegalStateException: unread block data
	at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2783)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1605)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2020-02-11 14:35:31 ERROR Utils:91 - Uncaught exception in thread driver-heartbeater
java.lang.OutOfMemoryError: GC overhead limit exceeded
2020-02-11 14:35:31 ERROR Utils:91 - throw uncaught fatal error in thread Spark Context Cleaner
java.lang.OutOfMemoryError: GC overhead limit exceeded
Exception in thread "Spark Context Cleaner" java.lang.OutOfMemoryError: GC overhead limit exceeded
2020-02-11 14:35:31 ERROR LiveListenerBus:70 - SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@6be02dc3)
2020-02-11 14:35:31 ERROR LiveListenerBus:70 - SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(3,1581460531848,JobFailed(org.apache.spark.SparkException: Job 3 cancelled because SparkContext was shut down))
2020-02-11 14:35:31 ERROR SparkUncaughtExceptionHandler:91 - Uncaught exception in thread Thread[Executor task launch worker for task 411,5,main]
java.lang.OutOfMemoryError: Java heap space
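
[Editor's note: the GC-overhead and heap-space errors above are the test JVM exhausting memory; the unread-block-data and context-shutdown failures are downstream symptoms. A quick sanity check of the heap ceiling, as a sketch rather than this project's code:]

    // Print the JVM's maximum heap; if it is small relative to the ~10 MB
    // tasks and cached reads seen above, raise -Xmx for the forked test JVM
    // (for this build, presumably via the scalatest-maven-plugin argLine).
    val maxHeapMb = Runtime.getRuntime.maxMemory / (1024L * 1024L)
    println(s"test JVM max heap: $maxHeapMb MB")
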
2020-02-11 14:35:31 WARN  SparkContext:87 - Multiple running SparkContexts detected in the same JVM!
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:76)
org.bdgenomics.utils.misc.SparkFunSuite$class.setupSparkContext(SparkFunSuite.scala:56)
org.bdgenomics.avocado.genotyping.BiallelicGenotyperSuite.setupSparkContext(BiallelicGenotyperSuite.scala:46)
org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply$mcV$sp(SparkFunSuite.scala:99)
org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply(SparkFunSuite.scala:98)
org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply(SparkFunSuite.scala:98)
org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
org.scalatest.Transformer.apply(Transformer.scala:22)
org.scalatest.Transformer.apply(Transformer.scala:20)
org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
org.scalatest.Suite$class.withFixture(Suite.scala:1122)
org.scalatest.FunSuite.withFixture(FunSuite.scala:1555)
org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
org.bdgenomics.avocado.genotyping.BiallelicGenotyperSuite.org$scalatest$BeforeAndAfter$$super$runTest(BiallelicGenotyperSuite.scala:46)
	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2472)
	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2468)
	at scala.Option.foreach(Option.scala:256)
	at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2468)
	at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2557)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
	at org.bdgenomics.utils.misc.SparkFunSuite$class.setupSparkContext(SparkFunSuite.scala:56)
	at org.bdgenomics.avocado.genotyping.ObserverSuite.setupSparkContext(ObserverSuite.scala:26)
	at org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply$mcV$sp(SparkFunSuite.scala:99)
	at org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply(SparkFunSuite.scala:98)
	at org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply(SparkFunSuite.scala:98)
	at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
	at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
	at org.scalatest.FunSuite.withFixture(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
	at org.bdgenomics.avocado.genotyping.ObserverSuite.org$scalatest$BeforeAndAfter$$super$runTest(ObserverSuite.scala:26)
	at org.scalatest.BeforeAndAfter$class.runTest(BeforeAndAfter.scala:200)
	at org.bdgenomics.avocado.genotyping.ObserverSuite.runTest(ObserverSuite.scala:26)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
	at org.scalatest.Suite$class.run(Suite.scala:1424)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
	at org.bdgenomics.avocado.genotyping.ObserverSuite.org$scalatest$BeforeAndAfter$$super$run(ObserverSuite.scala:26)
	at org.scalatest.BeforeAndAfter$class.run(BeforeAndAfter.scala:241)
	at org.bdgenomics.avocado.genotyping.ObserverSuite.run(ObserverSuite.scala:26)
	at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492)
	at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528)
	at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1526)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
	at org.scalatest.Suite$class.runNestedSuites(Suite.scala:1526)
	at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:29)
	at org.scalatest.Suite$class.run(Suite.scala:1421)
	at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:29)
	at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
	at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
	at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
	at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
	at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
	at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
	at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
	at org.scalatest.tools.Runner$.main(Runner.scala:860)
	at org.scalatest.tools.Runner.main(Runner.scala)
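
[Editor's note: the SparkException above names its own escape hatch. A sketch only, assuming Spark 2.x (spark.driver.allowMultipleContexts was removed in Spark 3.0); here the second context is a symptom of the OOM leaving the previous SparkContext registered, so the durable fix is stopping it (sc.stop()) between suites rather than allowing overlap.]

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.driver.allowMultipleContexts", "true") // workaround, not a fix
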
2020-02-11 14:35:31 ERROR Inbox:91 - Ignoring error
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.executor.Executor$TaskRunner@225b5aeb rejected from java.util.concurrent.ThreadPoolExecutor@44f12a97[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 412]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at org.apache.spark.executor.Executor.launchTask(Executor.scala:173)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:87)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:85)
	at scala.collection.Iterator$class.foreach(Iterator.scala:743)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at org.apache.spark.scheduler.local.LocalEndpoint.reviveOffers(LocalSchedulerBackend.scala:85)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalSchedulerBackend.scala:70)
	at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
	at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
	at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
	at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
- call het alt TTATA,TTA->T insertion at 1/5274547 *** FAILED ***
  org.apache.spark.SparkException: Job 3 cancelled because SparkContext was shut down
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:820)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
  at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
  at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:818)
  at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1732)
  at org.apache.spark.util.EventLoop.stop(EventLoop.scala:83)
  at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1651)
  at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1921)
  at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1317)
  at org.apache.spark.SparkContext.stop(SparkContext.scala:1920)
  at org.apache.spark.SparkContext$$anon$3.run(SparkContext.scala:1865)
  at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2062)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
  at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
  at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
  at org.bdgenomics.avocado.genotyping.BiallelicGenotyperSuite$$anonfun$39.apply$mcV$sp(BiallelicGenotyperSuite.scala:898)
  at org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply$mcV$sp(SparkFunSuite.scala:102)
  at org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply(SparkFunSuite.scala:98)
  at org.bdgenomics.utils.misc.SparkFunSuite$$anonfun$sparkTest$1.apply(SparkFunSuite.scala:98)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  at org.scalatest.Transformer.apply(Transformer.scala:20)
  at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
  at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
  at org.scalatest.FunSuite.withFixture(FunSuite.scala:1555)
  at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
  at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
  at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
  at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
  at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
  at org.bdgenomics.avocado.genotyping.BiallelicGenotyperSuite.org$scalatest$BeforeAndAfter$$super$runTest(BiallelicGenotyperSuite.scala:46)
  at org.scalatest.BeforeAndAfter$class.runTest(BeforeAndAfter.scala:200)
  at org.bdgenomics.avocado.genotyping.BiallelicGenotyperSuite.runTest(BiallelicGenotyperSuite.scala:46)
  at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
  at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
  at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
  at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
  at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
  at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
  at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
  at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
  at org.scalatest.Suite$class.run(Suite.scala:1424)
  at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
  at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
  at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
  at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
  at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
  at org.bdgenomics.avocado.genotyping.BiallelicGenotyperSuite.org$scalatest$BeforeAndAfter$$super$run(BiallelicGenotyperSuite.scala:46)
  at org.scalatest.BeforeAndAfter$class.run(BeforeAndAfter.scala:241)
  at org.bdgenomics.avocado.genotyping.BiallelicGenotyperSuite.run(BiallelicGenotyperSuite.scala:46)
  at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492)
  at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528)
  at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1526)
  at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
  at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
  at org.scalatest.Suite$class.runNestedSuites(Suite.scala:1526)
  at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:29)
  at org.scalatest.Suite$class.run(Suite.scala:1421)
  at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:29)
  at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
  at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
  at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
  at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
  at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
  at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
  at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
  at org.scalatest.tools.Runner$.main(Runner.scala:860)
  at org.scalatest.tools.Runner.main(Runner.scala)
2020-02-11 14:35:31 ERROR Inbox:91 - Ignoring error
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.executor.Executor$TaskRunner@386dc5d8 rejected from java.util.concurrent.ThreadPoolExecutor@44f12a97[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 412]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at org.apache.spark.executor.Executor.launchTask(Executor.scala:173)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:87)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:85)
	at scala.collection.Iterator$class.foreach(Iterator.scala:743)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at org.apache.spark.scheduler.local.LocalEndpoint.reviveOffers(LocalSchedulerBackend.scala:85)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalSchedulerBackend.scala:70)
	at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
	at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
	at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
	at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2020-02-11 14:35:31 ERROR Inbox:91 - Ignoring error
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.executor.Executor$TaskRunner@2f293c97 rejected from java.util.concurrent.ThreadPoolExecutor@44f12a97[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 412]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at org.apache.spark.executor.Executor.launchTask(Executor.scala:173)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:87)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:85)
	at scala.collection.Iterator$class.foreach(Iterator.scala:743)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at org.apache.spark.scheduler.local.LocalEndpoint.reviveOffers(LocalSchedulerBackend.scala:85)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalSchedulerBackend.scala:70)
	at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
	at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
	at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
	at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2020-02-11 14:35:31 ERROR Inbox:91 - Ignoring error
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.executor.Executor$TaskRunner@198c2e26 rejected from java.util.concurrent.ThreadPoolExecutor@44f12a97[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 412]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at org.apache.spark.executor.Executor.launchTask(Executor.scala:173)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:87)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalSchedulerBackend.scala:85)
	at scala.collection.Iterator$class.foreach(Iterator.scala:743)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at org.apache.spark.scheduler.local.LocalEndpoint.reviveOffers(LocalSchedulerBackend.scala:85)
	at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalSchedulerBackend.scala:70)
	at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
	at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
	at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
	at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping avocado: A Variant Caller, Distributed
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] avocado: A Variant Caller, Distributed ............. SUCCESS [  6.020 s]
[INFO] avocado-core: Core variant calling algorithms ...... FAILURE [30:43 min]
[INFO] avocado-cli: Command line interface for a distributed variant caller SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 30:49 min
[INFO] Finished at: 2020-02-11T14:35:32-08:00
[INFO] Final Memory: 32M/1137M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project avocado-core_2.11: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :avocado-core_2.11
Build step 'Execute shell' marked build as failure
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Sending e-mails to: fnothaft@berkeley.edu
Finished: FAILURE