Failed: Console Output

[Skipping 219 KB of log output]
- keep a ref genotype call
- discard a genotype whose quality is too low
- build filters and apply to snp
- build filters and apply to indel
- test adding filters
- filter out genotypes with a low allelic fraction
- filter out genotypes with a high allelic fraction
TrioCallerSuite:
- cannot have a sample with no record groups
- cannot have a sample with discordant sample ids
- extract id from a single read group
- extract id from multiple read groups
- filter an empty site
- filter a site with only ref calls
- keep a site with a non-ref call
- fill in no-calls for site with missing parents
- pass through site with odd copy number
- confirm call at site where proband and parents are consistent and can phase
- confirm call at site where proband and parents are consistent but cannot phase
- invalidate call at site where proband and parents are inconsistent
- end-to-end trio call test
BlockSuite:
- folding over a match block returns a match operator
- an unknown block must have mismatching input sequences
- folding over an unknown block returns our function's result
AlignerSuite:
- aligning a repetitive sequence will fire an assert
- align a minimally flanked sequence with a snp
- align a minimally flanked sequence with a 3 bp mnp
- align a minimally flanked sequence with 2 snps separated by 1bp
- align a minimally flanked sequence with 2 snps separated by 3bp
- align a minimally flanked sequence with a simple insert
- align a minimally flanked sequence with a complex insert
- align a minimally flanked sequence with a simple deletion
- align a minimally flanked sequence that contains a discordant k-mer pair
- align a minimally flanked sequence with a complex deletion
- align a minimally flanked sequence with 2 snps separated by two matching k-mers
- align a minimally flanked sequence with a snp and an indel separated by one matching k-mer
- zip and trim short insert
- zip and trim short deletion
- cut up a sequence that is longer than the k-mer length
- cutting up a sequence that is shorter than the k-mer length yields an empty map
- cutting up a repeated sequence throws an assert
- get no indices if we have no intersection
- get correct index for a single intersection
- get correct indices for two k-mers in a row
- get correct indices for two k-mers separated by a snp
- get correct indices for two k-mers separated by an indel
- get correct indices for two k-mers whose positions are flipped
- fire assert when cutting up repetitive reads
- fire assert when checking negative index pair
- a set of a single index pair is concordant
- a set with a pair of index pairs is concordant
- a set with multiple good index pairs is concordant
- a set with a pair of swapped index pairs is discordant
- a set with a pair of both con/discordant index pairs is discordant
- making blocks from no indices returns a single unknown block
- make blocks from a single match between two snps
- make blocks from three matches between two snps
- make blocks from three matches between two indels, opposite events
- make blocks from three matches between two indels, same events
- make blocks from matches between snp/indel/snp
BiallelicGenotyperSuite:
- properly handle haploid genotype state
- properly handle diploid genotype state with het call
- properly handle triploid genotype state with hom alt call
- scoring read that overlaps no variants should return empty observations in variant only mode
- scoring read that overlaps no variants should return empty observations
2019-09-27 09:37:02 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (4472 KB). The maximum recommended task size is 100 KB.
2019-09-27 09:37:03 WARN  Utils:66 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
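The truncated-plan warning above is tunable rather than a problem: the limit is the `spark.debug.maxToStringFields` property named in the message. Despite the message's mention of `SparkEnv.conf`, it is an ordinary Spark property; a sketch of raising it (the value shown is illustrative):

```properties
# spark-defaults.conf (or any SparkConf source) -- illustrative value.
# Raise the field limit Spark uses when stringifying query plans for logs:
spark.debug.maxToStringFields   100
```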
- score snps in a read overlapping a copy number dup boundary
2019-09-27 09:37:20 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (4060 KB). The maximum recommended task size is 100 KB.
- score snps in a read overlapping a copy number del boundary
- score snp in a read with no evidence of the snp
- score snp in a read with evidence of the snp
- score snp in a read with evidence of the snp, and non-variant bases
- build genotype for het snp
- force call possible STR/indel !!! IGNORED !!!
- log space factorial
- fisher test for strand bias
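The strand-bias test above exercises a Fisher exact test. avocado's own implementation is not visible in this log, but the computation it tests can be sketched as a hypergeometric tail over the 2x2 ref/alt-by-strand table; the function names and the phred-scaling convention here are illustrative, not avocado's API:

```python
from math import comb, log10

def hypergeom_pmf(k, K, n, N):
    """P(X = k) for a hypergeometric draw: k ref reads among the n
    forward-strand reads, when K of all N reads are ref."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def fisher_right_tail(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]
    (rows = ref/alt allele, cols = forward/reverse strand)."""
    K, n, N = a + b, a + c, a + b + c + d
    return sum(hypergeom_pmf(k, K, n, N) for k in range(a, min(K, n) + 1))

def phred_scaled(p):
    """Phred-scale a p-value, as strand-bias annotations commonly are."""
    return -10.0 * log10(p)
```

Variant callers commonly report this phred-scaled tail probability as the `FS` annotation.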
2019-09-27 09:37:37 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:37:43 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- discover and call simple SNP
2019-09-27 09:38:50 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:38:56 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- discover and call simple SNP and score all sites
2019-09-27 09:40:05 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:40:14 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- discover and call short indel
2019-09-27 09:41:26 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:41:36 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
2019-09-27 09:41:51 ERROR BiallelicGenotyper:387 - Processing read H06JUADXX130110:1:1109:10925:52628 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2019-09-27 09:41:51 ERROR BiallelicGenotyper:387 - Processing read H06JUADXX130110:1:1116:7369:15293 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2019-09-27 09:41:51 ERROR BiallelicGenotyper:387 - Processing read H06HDADXX130110:2:1115:12347:40533 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2019-09-27 09:41:51 ERROR BiallelicGenotyper:387 - Processing read H06HDADXX130110:1:2110:7844:95190 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
2019-09-27 09:41:51 ERROR BiallelicGenotyper:387 - Processing read H06HDADXX130110:1:2203:13041:33390 failed with exception java.lang.StringIndexOutOfBoundsException: String index out of range: 0. Skipping...
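Each of the skipped reads above fails with `StringIndexOutOfBoundsException: String index out of range: 0`, i.e. something indexed position 0 of an empty string (the JVM's `"".charAt(0)`). The guard pattern is simple; a hypothetical Python analogue (`first_base` is not an avocado function):

```python
def first_base(sequence: str) -> str:
    """Return the first base of a read sequence, or 'N' for an empty
    sequence. Indexing an empty string (sequence[0]) raises IndexError,
    the Python analogue of the JVM StringIndexOutOfBoundsException that
    caused the skipped reads above."""
    return sequence[0] if sequence else "N"
```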
- discover and call het and hom snps
- score a single read covering a deletion
2019-09-27 09:43:00 WARN  TaskSetManager:66 - Stage 7 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- discover and force call hom alt deletion
2019-09-27 09:44:18 WARN  TaskSetManager:66 - Stage 7 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt AGCCAGTGGACGCCGACCT->A deletion at 1/875159
2019-09-27 09:45:26 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:45:34 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt TACACACACACACACACACACACACACACAC->T deletion at 1/1777263
2019-09-27 09:46:42 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:46:50 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt CAG->C deletion at 1/1067596
2019-09-27 09:47:59 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:48:07 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt C->G snp at 1/877715
2019-09-27 09:49:13 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:49:21 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt ACAG->A deletion at 1/886049
2019-09-27 09:50:30 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:50:37 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt GA->CC mnp at 1/889158–9
2019-09-27 09:51:45 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:51:53 WARN  TaskSetManager:66 - Stage 14 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt C->CCCCT insertion at 1/866511
2019-09-27 09:51:58 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:52:05 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het ATG->A deletion at 1/905130
2019-09-27 09:53:13 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:53:21 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het ATG->A deletion at 1/905130 while scoring all sites
2019-09-27 09:54:31 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:54:38 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het AG->A deletion at 1/907170
2019-09-27 09:55:45 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:55:54 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het T->G snp at 1/240898
2019-09-27 09:57:03 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:57:11 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- make het alt calls at biallelic snp locus
2019-09-27 09:58:18 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:58:26 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call hom alt T->TAAA insertion at 1/4120185
2019-09-27 09:59:40 WARN  BiallelicGenotyper:170 - Input RDD is not persisted. Performance may be degraded.
2019-09-27 09:59:48 WARN  TaskSetManager:66 - Stage 5 contains a task of very large size (10813 KB). The maximum recommended task size is 100 KB.
- call het alt TTATA,TTA->T insertion at 1/5274547
ObserverSuite:
- a fully clipped read will not generate any observations
- generate observations for a sequence match under diploid model
- generate observations for a read with an insertion under diploid model
- generate observations for a read with a deletion under diploid model
DiscoveredVariantSuite:
- round trip conversion to/from variant
SquareOffReferenceModelSuite:
- don't trim a snp
- trim a mnp
- trim an insertion
- don't trim a deletion
- extract variants finds sites with a called alt
- find genotype if variant is present
- don't find genotype if variant is not present
- excise a genotype from a reference block
- square off a site with data from multiple samples
DiscoverVariantsSuite:
- no variants in unaligned read
- no variants in rdd with unaligned read
- no variants in read that is a perfect sequence match
- no variants in rdd with sequence match reads
- find snp in read with a 1bp sequence mismatch
- find one snp in reads with 1bp sequence mismatch
- find insertion in read
- find insertion in reads
- find deletion in read
- find deletion in reads
- find variants in alignment record rdd
- break TT->CA mnp into two snps
ObservationOperatorSuite:
- zero operators are empty
- non-zero operators are non-empty
- cannot build mismatch with wrong ref length
- collapsing a non repeated set of operators should eliminate 0 ops
- collapsing a repeated set of operators with mixed match/mismatch
- collapse a set of operators with repeats
- collapse a set of operators with repeats and clips
- make a cigar and md tag from a single sequence match
- make a cigar and md tag from a single sequence mismatch
- make a cigar and md tag from a single multi-base sequence match
- make a cigar and md tag from a single deletion
- make a cigar and md tag from a single insertion
- make a cigar for a match followed by a deletion
- make a cigar for an insertion flanked by matches
- make a cigar for a match followed by a mismatch
- make a cigar for a multi-base mismatch flanked by matches
- make a cigar for a match after a clip
- make a cigar for a mismatch after a clip
- extract reference from a single snp
- extract reference from a single deletion
- extract reference from a single insertion
- extract reference from a soft clipped sequence
- extract reference from a hard clipped sequence
- extract reference from a match flanked deletion
- extract reference from a match flanked insertion
- read must be mapped to extract alignment operators
- extracting alignment operators will fail if cigar is unset
- extracting alignment operators will fail if cigar is *
- extracting alignment operators will fail if MD tag is unset
- extract alignment operators from a perfect read
- extract alignment operators from a read with a single mismatch
- extract alignment operators from a read with a single deletion
- extract alignment operators from a read with a single insertion
LogUtilsSuite:
- test our nifty log summer
- can we compute the sum of logs correctly?
- can we compute the additive inverse of logs correctly?
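The LogUtilsSuite tests above check summing and inverting probabilities held in log space. avocado's `LogUtils` itself is not shown in this log; the standard numerically stable technique it is testing is the log-sum-exp trick, sketched here:

```python
from math import exp, log

def log_sum(log_a: float, log_b: float) -> float:
    """Compute log(exp(log_a) + exp(log_b)) without overflow by
    factoring out the larger term (the "log-sum-exp" trick)."""
    hi, lo = (log_a, log_b) if log_a >= log_b else (log_b, log_a)
    return hi + log(1.0 + exp(lo - hi))

def log_diff(log_a: float, log_b: float) -> float:
    """log(exp(log_a) - exp(log_b)), requiring log_a >= log_b.
    log_diff(0.0, log_p) gives log(1 - p), the additive inverse
    of a log probability."""
    return log_a + log(1.0 - exp(log_b - log_a))
```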
ObservationSuite:
- cannot create an observation with empty likelihoods
- cannot create an observation with 1-length likelihoods
- cannot create an observation with mismatching likelihood lengths
- forward strand must be >= 0
- forward strand cannot exceed coverage
- square map-q must be >= 0
- coverage is strictly positive
- invert an observation
- null an observation
RealignmentBlockSuite:
- folding over a clip returns the clip operator, soft clip
- folding over a clip returns the clip operator, hard clip
- folding over a canonical block returns the original alignment
- violate an invariant of the fold function, part 1
- violate an invariant of the fold function, part 2
- apply the fold function on a realignable block
- having a clip in the middle of a read is illegal
- can't have two soft clips back to back
- a read that is an exact sequence match is canonical
- hard clip before soft clip is ok at start of read
- hard clip after soft clip is ok at end of read
- a read with a single snp is canonical
- a read containing an indel with exact flanks is wholly realignable
- a read containing an indel with exact flanks is wholly realignable, with soft clipped bases
- a read containing an indel with longer flanks can be split into multiple blocks
- a read containing an indel with longer flanks on both sides can be split into multiple blocks
- properly handle a read that starts with a long soft clip
JointAnnotatorCallerSuite:
- discard reference site
- calculate MAF for all called genotypes
- calculate MAF ignoring uncalled genotypes
- roll up variant annotations from a single genotype
- roll up variant annotations across multiple genotypes
- recalling genotypes is a no-op for no calls and complex hets
- recall a genotype so that the state changes
- allele frequency being outside of (0.0, 1.0) just computes posteriors
- compute variant quality from a single genotype
- compute variant quality from multiple genotypes
CopyNumberMapSuite:
- create an empty map
- create a map with only diploid features
- create a map with a mix of features
PrefilterReadsSuite:
- filter on read uniqueness
- filter unmapped reads
- filter autosomal chromosomes with grc names
- filter sex chromosomes with grc names
- filter mitochondrial chromosome with grc names
- filter autosomal chromosomes with hg names
- filter sex chromosomes with hg names
- filter mitochondrial chromosome with hg names
- filter autosomal chromosomes from generator
- filter autosomal + sex chromosomes from generator
- filter all chromosomes from generator
- update a read whose mate is mapped to a filtered contig
- filter reads mapped to autosomal chromosomes from generator
- filter reads mapped to autosomal + sex chromosomes from generator
- filter reads mapped to all chromosomes from generator
- filter reads uniquely mapped to autosomal chromosomes from generator
- filter reads uniquely mapped to autosomal + sex chromosomes from generator
- filter reads uniquely mapped to all chromosomes from generator
- filter rdd of reads mapped to autosomal chromosomes from generator
- filter rdd of reads mapped to autosomal + sex chromosomes from generator
- filter rdd of reads mapped to all chromosomes from generator
- filter rdd of reads uniquely mapped to autosomal chromosomes from generator
- filter rdd of reads uniquely mapped to autosomal + sex chromosomes from generator
- filter rdd of reads uniquely mapped to all chromosomes from generator
Run completed in 24 minutes, 20 seconds.
Total number of tests run: 282
Suites: completed 23, aborted 0
Tests: succeeded 282, failed 0, canceled 0, ignored 1, pending 0
All tests passed.
[INFO] 
[INFO] <<< scoverage-maven-plugin:1.1.1:report (default-cli) < [scoverage]test @ avocado-core_2.11 <<<
[INFO] 
[INFO] --- scoverage-maven-plugin:1.1.1:report (default-cli) @ avocado-core_2.11 ---
[INFO] [scoverage] Generating cobertura XML report...
[INFO] [scoverage] Generating scoverage XML report...
[INFO] [scoverage] Generating scoverage HTML report...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building avocado-cli: Command line interface for a distributed variant caller 0.1.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ avocado-cli_2.11 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.1:revision (default) @ avocado-cli_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.10:add-source (add-source) @ avocado-cli_2.11 ---
[INFO] Source directory: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/main/scala added.
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ avocado-cli_2.11 ---
[INFO] Copying files with filtering to temporary directory.
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copied 1 file to output directory: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/generated-sources/java-templates
[INFO] Source directory: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ avocado-cli_2.11 ---
[INFO] Modified 0 of 9 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ avocado-cli_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ avocado-cli_2.11 ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-metrics-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-cli-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-serialization-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-io-spark2_2.11:0.2.13 requires scala version: 2.11.8
[WARNING] Multiple versions of scala libraries detected!
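The Scala-version warnings come from `utils-io-spark2_2.11:0.2.13` depending on scala-library 2.11.8 while the rest of the build targets 2.11.4. Within the 2.11.x line this is binary compatible and benign, but the warning can be silenced by making the resolved version explicit; a hypothetical `pom.xml` fragment:

```xml
<!-- pom.xml sketch: pin the transitive scala-library version so a
     single, explicit version is resolved for the whole build. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```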
[INFO] /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/main/scala:-1: info: compiling
[INFO] /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/generated-sources/java-templates:-1: info: compiling
[INFO] Compiling 8 source files to /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scala-2.11.4/classes at 1569603674155
[INFO] prepare-compile in 0 s
[INFO] compile in 4 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ avocado-cli_2.11 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scala-2.11.4/classes
[INFO] 
[INFO] --- build-helper-maven-plugin:1.10:add-test-source (add-test-source) @ avocado-cli_2.11 ---
[INFO] Test Source directory: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ avocado-cli_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ avocado-cli_2.11 ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-metrics-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-cli-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-serialization-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-io-spark2_2.11:0.2.13 requires scala version: 2.11.8
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/test/scala:-1: info: compiling
[INFO] Compiling 2 source files to /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scala-2.11.4/test-classes at 1569603679036
[INFO] prepare-compile in 0 s
[INFO] compile in 3 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:testCompile (default-testCompile) @ avocado-cli_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.7:test (default-test) @ avocado-cli_2.11 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ avocado-cli_2.11 ---
Discovery starting.
Discovery completed in 120 milliseconds.
Run starting. Expected test count is: 2
ReassembleSuite:
- k-mer length must be positive
MergeDiscoveredSuite:
2019-09-27 10:01:23 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-09-27 10:01:23 WARN  Utils:66 - Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
2019-09-27 10:01:23 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-09-27 10:01:28 WARN  DatasetBoundVariantRDD:154 - Saving directly as Parquet from SQL. Options other than compression codec are ignored.
2019-09-27 10:01:28 WARN  Utils:66 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
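The three SLF4J lines above mean no SLF4J binding is on the test classpath, so logging falls back to the no-op implementation and output is silently dropped. Adding any single binding at test scope fixes it; a hypothetical `pom.xml` fragment (the version shown is an assumption):

```xml
<!-- pom.xml sketch: supply an SLF4J binding for the test classpath so
     StaticLoggerBinder resolves and log output is no longer discarded. -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.25</version>
  <scope>test</scope>
</dependency>
```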
- merge variants discovered from two samples
Run completed in 10 seconds, 890 milliseconds.
Total number of tests run: 2
Suites: completed 3, aborted 0
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO] 
[INFO] >>> scoverage-maven-plugin:1.1.1:report (default-cli) > [scoverage]test @ avocado-cli_2.11 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ avocado-cli_2.11 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.1:revision (default) @ avocado-cli_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.10:add-source (add-source) @ avocado-cli_2.11 ---
[INFO] Source directory: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/main/scala added.
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ avocado-cli_2.11 ---
[INFO] Copying files with filtering to temporary directory.
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] No files need to be copied to output directory. Up to date: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/generated-sources/java-templates
[INFO] Source directory: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ avocado-cli_2.11 ---
[INFO] Modified 0 of 9 .scala files
[INFO] 
[INFO] --- scoverage-maven-plugin:1.1.1:pre-compile (default-cli) @ avocado-cli_2.11 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ avocado-cli_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ avocado-cli_2.11 ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-metrics-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-cli-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-serialization-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-io-spark2_2.11:0.2.13 requires scala version: 2.11.8
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/main/scala:-1: info: compiling
[INFO] /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/generated-sources/java-templates:-1: info: compiling
[INFO] /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/generated-sources/annotations:-1: info: compiling
[INFO] Compiling 8 source files to /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scala-2.11.4/scoverage-classes at 1569603694377
[INFO] [info] Cleaning datadir [/home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scoverage-data]
[INFO] [info] Beginning coverage instrumentation
[INFO] [info] Instrumentation completed [457 statements]
[INFO] [info] Wrote instrumentation file [/home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scoverage-data/scoverage.coverage.xml]
[INFO] [info] Will write measurement data to [/home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scoverage-data]
[INFO] prepare-compile in 0 s
[INFO] compile in 5 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ avocado-cli_2.11 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/target/scala-2.11.4/scoverage-classes
[INFO] 
[INFO] --- scoverage-maven-plugin:1.1.1:post-compile (default-cli) @ avocado-cli_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.10:add-test-source (add-test-source) @ avocado-cli_2.11 ---
[INFO] Test Source directory: /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/avocado-cli/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ avocado-cli_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ avocado-cli_2.11 ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-metrics-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-cli-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-misc-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-serialization-spark2_2.11:0.2.11 requires scala version: 2.11.4
[WARNING]  org.bdgenomics.utils:utils-io-spark2_2.11:0.2.13 requires scala version: 2.11.8
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:testCompile (default-testCompile) @ avocado-cli_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.7:test (default-test) @ avocado-cli_2.11 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ avocado-cli_2.11 ---
Discovery starting.
Discovery completed in 116 milliseconds.
Run starting. Expected test count is: 2
ReassembleSuite:
- k-mer length must be positive
MergeDiscoveredSuite:
2019-09-27 10:01:40 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-09-27 10:01:41 WARN  Utils:66 - Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
2019-09-27 10:01:41 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-09-27 10:01:46 WARN  DatasetBoundVariantRDD:154 - Saving directly as Parquet from SQL. Options other than compression codec are ignored.
2019-09-27 10:01:46 WARN  Utils:66 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
- merge variants discovered from two samples
Run completed in 11 seconds, 552 milliseconds.
Total number of tests run: 2
Suites: completed 3, aborted 0
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO] 
[INFO] <<< scoverage-maven-plugin:1.1.1:report (default-cli) < [scoverage]test @ avocado-cli_2.11 <<<
[INFO] 
[INFO] --- scoverage-maven-plugin:1.1.1:report (default-cli) @ avocado-cli_2.11 ---
[INFO] [scoverage] Generating cobertura XML report...
[INFO] [scoverage] Generating scoverage XML report...
[INFO] [scoverage] Generating scoverage HTML report...
[INFO] [scoverage] Generating aggregated cobertura XML report...
[INFO] [scoverage] Generating aggregated scoverage XML report...
[INFO] [scoverage] Generating aggregated scoverage HTML report...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building avocado: A Variant Caller, Distributed 0.1.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- coveralls-maven-plugin:3.0.1:report (default-cli) @ avocado-parent_2.11 ---
[INFO] Starting Coveralls job for jenkins (5079 / https://amplab.cs.berkeley.edu/jenkins/job/avocado/HADOOP_VERSION=2.6.0,SCALAVER=2.11,SPARK_VERSION=2.2.0,label=ubuntu/5079/)
[INFO] Using repository token <secret>
[INFO] Git commit 4cea777 in master
[INFO] Writing Coveralls data to /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/target/coveralls.json...
[INFO] Processing coverage report from /home/jenkins/workspace/avocado/HADOOP_VERSION/2.6.0/SCALAVER/2.11/SPARK_VERSION/2.2.0/label/ubuntu/target/cobertura.xml
[INFO] Successfully wrote Coveralls data in 124ms
[INFO] Gathered code coverage metrics for 45 source files with 10503 lines of code:
[INFO] - 1687 relevant lines
[INFO] - 1346 covered lines
[INFO] - 341 missed lines
[INFO] Submitting Coveralls data to API
[ERROR] Submission failed in 1511ms while processing data
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] avocado: A Variant Caller, Distributed ............. FAILURE [  1.974 s]
[INFO] avocado-core: Core variant calling algorithms ...... SUCCESS [48:45 min]
[INFO] avocado-cli: Command line interface for a distributed variant caller SUCCESS [ 41.010 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 49:32 min
[INFO] Finished at: 2019-09-27T10:01:55-07:00
[INFO] Final Memory: 71M/1390M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.eluder.coveralls:coveralls-maven-plugin:3.0.1:report (default-cli) on project avocado-parent_2.11: Processing of input or output data failed: Report submission to Coveralls API failed with HTTP status 500: Internal Server Error (Build processing error.) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Recording test results
Sending e-mails to: fnothaft@berkeley.edu
Finished: FAILURE