Failed

Console Output

Skipping 2,474 KB..
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO] 
[INFO] --- maven-jar-plugin:3.1.2:jar (default-jar) @ adam-core-spark2_2.11 ---
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/adam-core-spark2_2.11-0.27.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ adam-core-spark2_2.11 ---
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/adam-core-spark2_2.11-0.27.0-SNAPSHOT-sources.jar
[INFO] 
[INFO] --- maven-javadoc-plugin:3.1.0:jar (attach-javadoc) @ adam-core-spark2_2.11 ---
[ERROR] Error fetching link: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-codegen/target/apidocs. Ignored it.
[INFO] 
Loading source files for package org.bdgenomics.adam.io...
Loading source files for package org.apache.parquet.avro...
Constructing Javadoc information...
Standard Doclet version 1.8.0_191
Building tree for all the packages and classes...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/FastqInputFormat.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/FastqRecordReader.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/InterleavedFastqInputFormat.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/SingleFastqInputFormat.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/apache/parquet/avro/AvroSchemaConverter.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/overview-frame.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/apache/parquet/avro/package-frame.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/apache/parquet/avro/package-summary.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/apache/parquet/avro/package-tree.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/package-frame.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/package-summary.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/package-tree.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/constant-values.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/class-use/FastqRecordReader.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/class-use/FastqInputFormat.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/class-use/InterleavedFastqInputFormat.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/class-use/SingleFastqInputFormat.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/apache/parquet/avro/class-use/AvroSchemaConverter.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/apache/parquet/avro/package-use.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/org/bdgenomics/adam/io/package-use.html...
Building index for all the packages and classes...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/overview-tree.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/index-all.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/deprecated-list.html...
Building index for all classes...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/allclasses-frame.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/allclasses-noframe.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/index.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/overview-summary.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/apidocs/help-doc.html...
6 warnings
[WARNING] Javadoc Warnings
[WARNING] /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/java/org/bdgenomics/adam/io/FastqRecordReader.java:235: warning: no @param for codec
[WARNING] protected final int positionAtFirstRecord(final FSDataInputStream stream,
[WARNING] ^
[WARNING] /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/java/org/bdgenomics/adam/io/FastqRecordReader.java:235: warning: no @return
[WARNING] protected final int positionAtFirstRecord(final FSDataInputStream stream,
[WARNING] ^
[WARNING] /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/java/org/bdgenomics/adam/io/FastqRecordReader.java:235: warning: no @throws for java.io.IOException
[WARNING] protected final int positionAtFirstRecord(final FSDataInputStream stream,
[WARNING] ^
[WARNING] /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/java/org/bdgenomics/adam/io/FastqRecordReader.java:389: warning: no @throws for java.io.IOException
[WARNING] protected final boolean lowLevelFastqRead(final Text readName, final Text value)
[WARNING] ^
[WARNING] /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/java/org/bdgenomics/adam/io/FastqRecordReader.java:431: warning: no @throws for java.io.IOException
[WARNING] abstract protected boolean next(Text value) throws IOException;
[WARNING] ^
[WARNING] /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/java/org/bdgenomics/adam/io/FastqRecordReader.java:154: warning: no @throws for java.io.IOException
[WARNING] protected FastqRecordReader(final Configuration conf,
[WARNING] ^
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/adam-core-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-core-spark2_2.11 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-core-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-core-spark2_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-core-spark2_2.11 ---
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala added.
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/generated-sources/src/main/scala added.
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:java (generate-scala-products) @ adam-core-spark2_2.11 ---
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:java (generate-scala-projection-fields) @ adam-core-spark2_2.11 ---
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-core-spark2_2.11 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-core-spark2_2.11 ---
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/converters/FastaConverter.scala:230: warning: discarding unmoored doc comment
    /**
    ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/read/AlignmentRecordDataset.scala:803: warning: Octal escape literals are deprecated, use \u0001 instead.
      binaryCodec.writeBytes("BAM\001".getBytes())
                                 ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rich/RichCigar.scala:43: warning: discarding unmoored doc comment
    /**
    ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicDataset.scala:3099: warning: no valid targets for annotation on value uTag - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @getter)
  @transient val uTag: TypeTag[U]
   ^
warning: there were 5 feature warnings; re-run with -feature for details
model contains 276 documentable templates
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/util/AttributeUtils.scala:70: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/projections/Projection.scala:45: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/contig/NucleotideContigFragmentDataset.scala:365: warning: Could not find any member to link for "UnsupportedOperationException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/ADAMContext.scala:1339: warning: Could not find any member to link for "FileNotFoundException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/ADAMContext.scala:1368: warning: Could not find any member to link for "FileNotFoundException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/ADAMContext.scala:1388: warning: Could not find any member to link for "FileNotFoundException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicPartitioners.scala:75: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicPartitioners.scala:178: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/ReferenceRegion.scala:456: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/ADAMContext.scala:3085: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/ReferenceRegion.scala:411: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicDataset.scala:755: warning: Variable x undefined in comment for method pipe in trait GenomicDataset
   * Files are substituted in to the command with a $x syntax. E.g., to invoke
                                                     ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicDataset.scala:756: warning: Variable 0 undefined in comment for method pipe in trait GenomicDataset
   * a command that uses the first file from the files Seq, use $0. To access
                                                                 ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicDataset.scala:757: warning: Variable root undefined in comment for method pipe in trait GenomicDataset
   * the path to the directory where the files are copied, use $root.
                                                                ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/ReadGroupDictionary.scala:47: warning: Could not find any member to link for "IllegalArgumentException".
/**
^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/Alphabet.scala:44: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/ReferenceRegion.scala:258: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/ReferenceRegion.scala:242: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/ReferenceRegion.scala:426: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala/org/bdgenomics/adam/models/ReferenceRegion.scala:336: warning: Could not find any member to link for "IllegalArgumentException".
  /**
  ^
25 warnings found
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/adam-core-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar
[INFO] 
[INFO] --- maven-jar-plugin:3.1.2:test-jar (default) @ adam-core-spark2_2.11 ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: APIs for Java, Python 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis-spark2_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-apis-spark2_2.11 ---
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis-spark2_2.11 ---
[INFO] Modified 0 of 5 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-apis-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-apis-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-apis-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-apis-spark2_2.11 ---
[INFO] Test Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-apis-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-apis-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-apis-spark2_2.11 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 7 source files to /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/target/2.11.12/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-apis-spark2_2.11 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ adam-apis-spark2_2.11 ---
Discovery starting.
Discovery completed in 123 milliseconds.
Run starting. Expected test count is: 0
Run completed in 130 milliseconds.
Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO] 
[INFO] --- maven-jar-plugin:3.1.2:jar (default-jar) @ adam-apis-spark2_2.11 ---
[INFO] 
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ adam-apis-spark2_2.11 ---
[INFO] 
[INFO] --- maven-javadoc-plugin:3.1.0:jar (attach-javadoc) @ adam-apis-spark2_2.11 ---
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/target/adam-apis-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-apis-spark2_2.11 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis-spark2_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-apis-spark2_2.11 ---
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/src/main/scala added.
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-apis-spark2_2.11 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-apis-spark2_2.11 ---
warning: there were two feature warnings; re-run with -feature for details
model contains 124 documentable templates
one warning found
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/target/adam-apis-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar
[INFO] 
[INFO] --- maven-jar-plugin:3.1.2:test-jar (default) @ adam-apis-spark2_2.11 ---
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/target/adam-apis-spark2_2.11-0.27.0-SNAPSHOT-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: CLI 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:timestamp-property (timestamp-property) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ adam-cli-spark2_2.11 ---
[INFO] Coping files with filtering to temporary directory.
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] No files needs to be copied to output directory. Up to date: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/generated-sources/java-templates
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-cli-spark2_2.11 ---
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli-spark2_2.11 ---
[INFO] Modified 0 of 31 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-cli-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-cli-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-cli-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-cli-spark2_2.11 ---
[INFO] Test Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-cli-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 15 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-cli-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-cli-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-cli-spark2_2.11 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ adam-cli-spark2_2.11 ---
Discovery starting.
Discovery completed in 150 milliseconds.
Run starting. Expected test count is: 0
Run completed in 155 milliseconds.
Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO] 
[INFO] --- maven-jar-plugin:3.1.2:jar (default-jar) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ adam-cli-spark2_2.11 ---
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/adam-cli-spark2_2.11-0.27.0-SNAPSHOT-sources.jar
[INFO] 
[INFO] --- maven-javadoc-plugin:3.1.0:jar (attach-javadoc) @ adam-cli-spark2_2.11 ---
[ERROR] Error fetching link: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-apis/target/apidocs. Ignored it.
[INFO] 
Loading source files for package org.bdgenomics.adam.cli...
Constructing Javadoc information...
Standard Doclet version 1.8.0_191
Building tree for all the packages and classes...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/org/bdgenomics/adam/cli/About.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/org/bdgenomics/adam/cli/package-frame.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/org/bdgenomics/adam/cli/package-summary.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/org/bdgenomics/adam/cli/package-tree.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/constant-values.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/org/bdgenomics/adam/cli/class-use/About.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/org/bdgenomics/adam/cli/package-use.html...
Building index for all the packages and classes...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/overview-tree.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/index-all.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/deprecated-list.html...
Building index for all classes...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/allclasses-frame.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/allclasses-noframe.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/index.html...
Generating /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/apidocs/help-doc.html...
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/adam-cli-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-cli-spark2_2.11 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:timestamp-property (timestamp-property) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ adam-cli-spark2_2.11 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ adam-cli-spark2_2.11 ---
[INFO] Coping files with filtering to temporary directory.
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] No files needs to be copied to output directory. Up to date: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/generated-sources/java-templates
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-cli-spark2_2.11 ---
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/src/main/scala added.
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-cli-spark2_2.11 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-cli-spark2_2.11 ---
model contains 55 documentable templates
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/adam-cli-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Assembly 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ adam-assembly-spark2_2.11 ---
[INFO] Request to add '/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/java-templates' folder. Not added since it does not exist.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-assembly-spark2_2.11 ---
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-assembly-spark2_2.11 ---
[INFO] Modified 0 of 1 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-assembly-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-assembly-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-assembly-spark2_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-assembly-spark2_2.11 ---
[INFO] Test Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-assembly-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-assembly-spark2_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-assembly-spark2_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-assembly-spark2_2.11 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:3.1.2:jar (default-jar) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- maven-javadoc-plugin:3.1.0:jar (attach-javadoc) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-assembly-spark2_2.11 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ adam-assembly-spark2_2.11 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ adam-assembly-spark2_2.11 ---
[INFO] Request to add '/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/java-templates' folder. Not added since it does not exist.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-assembly-spark2_2.11 ---
[INFO] Source directory: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/scala added.
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-assembly-spark2_2.11 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-assembly-spark2_2.11 ---
model contains 6 documentable templates
[INFO] Building jar: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar
[INFO] 
[INFO] --- maven-shade-plugin:3.2.0:shade (default) @ adam-assembly-spark2_2.11 ---
[INFO] Including org.bdgenomics.adam:adam-cli-spark2_2.11:jar:0.27.0-SNAPSHOT in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-misc-spark2_2.11:jar:0.2.15 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-io-spark2_2.11:jar:0.2.15 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient:jar:4.5.7 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpcore:jar:4.4.11 in the shaded jar.
[INFO] Including commons-logging:commons-logging:jar:1.2 in the shaded jar.
[INFO] Including commons-codec:commons-codec:jar:1.11 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-cli-spark2_2.11:jar:0.2.15 in the shaded jar.
[INFO] Including org.clapper:grizzled-slf4j_2.11:jar:1.3.3 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-avro:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-column:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-common:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-encoding:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-hadoop:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-jackson:jar:1.10.1 in the shaded jar.
[INFO] Including commons-pool:commons-pool:jar:1.6 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-format:jar:2.4.0 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-metrics-spark2_2.11:jar:0.2.15 in the shaded jar.
[INFO] Including com.netflix.servo:servo-core:jar:0.12.25 in the shaded jar.
[INFO] Including com.netflix.spectator:spectator-api:jar:0.67.0 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.25 in the shaded jar.
[INFO] Including org.bdgenomics.bdg-formats:bdg-formats:jar:0.12.0 in the shaded jar.
[INFO] Including org.apache.avro:avro:jar:1.8.2 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-core-asl:jar:1.9.13 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13 in the shaded jar.
[INFO] Including com.thoughtworks.paranamer:paranamer:jar:2.7 in the shaded jar.
[INFO] Including org.xerial.snappy:snappy-java:jar:1.1.1.3 in the shaded jar.
[INFO] Including org.apache.commons:commons-compress:jar:1.8.1 in the shaded jar.
[INFO] Including org.tukaani:xz:jar:1.5 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-core-spark2_2.11:jar:0.27.0-SNAPSHOT in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-intervalrdd-spark2_2.11:jar:0.2.15 in the shaded jar.
[INFO] Including com.esotericsoftware.kryo:kryo:jar:2.24.0 in the shaded jar.
[INFO] Including com.esotericsoftware.minlog:minlog:jar:1.2 in the shaded jar.
[INFO] Including org.objenesis:objenesis:jar:2.1 in the shaded jar.
[INFO] Including commons-io:commons-io:jar:2.6 in the shaded jar.
[INFO] Including it.unimi.dsi:fastutil:jar:6.6.5 in the shaded jar.
[INFO] Including org.seqdoop:hadoop-bam:jar:7.9.2 in the shaded jar.
[INFO] Including com.github.jsr203hadoop:jsr203hadoop:jar:1.0.3 in the shaded jar.
[INFO] Including com.github.samtools:htsjdk:jar:2.18.2 in the shaded jar.
[INFO] Including org.apache.commons:commons-jexl:jar:2.1.1 in the shaded jar.
[INFO] Including gov.nih.nlm.ncbi:ngs-java:jar:2.9.0 in the shaded jar.
[INFO] Including com.google.guava:guava:jar:27.0-jre in the shaded jar.
[INFO] Including com.google.guava:failureaccess:jar:1.0 in the shaded jar.
[INFO] Including com.google.guava:listenablefuture:jar:9999.0-empty-to-avoid-conflict-with-guava in the shaded jar.
[INFO] Including org.checkerframework:checker-qual:jar:2.5.2 in the shaded jar.
[INFO] Including com.google.errorprone:error_prone_annotations:jar:2.2.0 in the shaded jar.
[INFO] Including com.google.j2objc:j2objc-annotations:jar:1.1 in the shaded jar.
[INFO] Including org.codehaus.mojo:animal-sniffer-annotations:jar:1.17 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-codegen-spark2_2.11:jar:0.27.0-SNAPSHOT in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-apis-spark2_2.11:jar:0.27.0-SNAPSHOT in the shaded jar.
[INFO] Including args4j:args4j:jar:2.33 in the shaded jar.
[INFO] Including net.codingwell:scala-guice_2.11:jar:4.2.1 in the shaded jar.
[INFO] Including com.google.inject:guice:jar:4.2.0 in the shaded jar.
[INFO] Including javax.inject:javax.inject:jar:1 in the shaded jar.
[INFO] Including aopalliance:aopalliance:jar:1.0 in the shaded jar.
[INFO] Including org.scala-lang:scala-reflect:jar:2.11.12 in the shaded jar.
[INFO] Including com.google.code.findbugs:jsr305:jar:1.3.9 in the shaded jar.
[WARNING] WORKAROUND:  refusing to add class org/apache/parquet/avro/AvroSchemaConverter$2.class from jar /home/jenkins/.m2/repository/org/apache/parquet/parquet-avro/1.10.1/parquet-avro-1.10.1.jar
[WARNING] WORKAROUND:  refusing to add class org/apache/parquet/avro/AvroSchemaConverter.class from jar /home/jenkins/.m2/repository/org/apache/parquet/parquet-avro/1.10.1/parquet-avro-1.10.1.jar
[WARNING] WORKAROUND:  refusing to add class org/apache/parquet/avro/AvroSchemaConverter$1.class from jar /home/jenkins/.m2/repository/org/apache/parquet/parquet-avro/1.10.1/parquet-avro-1.10.1.jar
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, checker-qual-2.5.2.jar define 302 overlapping classes: 
[WARNING]   - org.checkerframework.checker.formatter.FormatUtil
[WARNING]   - org.checkerframework.checker.units.qual.MixedUnits
[WARNING]   - org.checkerframework.checker.regex.qual.PolyRegex
[WARNING]   - org.checkerframework.checker.units.qual.PolyUnit
[WARNING]   - org.checkerframework.checker.formatter.FormatUtil$IllegalFormatConversionCategoryException
[WARNING]   - org.checkerframework.framework.qual.Unqualified
[WARNING]   - org.checkerframework.checker.units.qual.C
[WARNING]   - org.checkerframework.common.reflection.qual.UnknownMethod
[WARNING]   - org.checkerframework.framework.qual.EnsuresQualifierIf
[WARNING]   - org.checkerframework.checker.signedness.SignednessUtil
[WARNING]   - 292 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, guava-27.0-jre.jar, failureaccess-1.0.jar define 2 overlapping classes: 
[WARNING]   - com.google.common.util.concurrent.internal.InternalFutureFailureAccess
[WARNING]   - com.google.common.util.concurrent.internal.InternalFutures
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, utils-intervalrdd-spark2_2.11-0.2.15.jar define 51 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.interval.array.IntervalArray$
[WARNING]   - org.bdgenomics.utils.interval.rdd.IntervalRDD$$anonfun$collect$2
[WARNING]   - org.bdgenomics.utils.interval.array.IntervalArray$$anonfun$4
[WARNING]   - org.bdgenomics.utils.interval.array.Interval$class
[WARNING]   - org.bdgenomics.utils.interval.array.IntervalArray$$anonfun$get$1
[WARNING]   - org.bdgenomics.utils.interval.rdd.IntervalRDD$$anonfun$4
[WARNING]   - org.bdgenomics.utils.interval.array.IntervalArray$$anonfun$1
[WARNING]   - org.bdgenomics.utils.interval.rdd.IntervalRDD$
[WARNING]   - org.bdgenomics.utils.interval.rdd.IntervalRDD$$anonfun$filter$1
[WARNING]   - org.bdgenomics.utils.interval.array.IntervalArray$$anonfun$mapValues$1
[WARNING]   - 41 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, commons-logging-1.2.jar define 28 overlapping classes: 
[WARNING]   - org.apache.commons.logging.LogSource
[WARNING]   - org.apache.commons.logging.impl.ServletContextCleaner
[WARNING]   - org.apache.commons.logging.Log
[WARNING]   - org.apache.commons.logging.LogFactory$3
[WARNING]   - org.apache.commons.logging.impl.LogFactoryImpl$2
[WARNING]   - org.apache.commons.logging.impl.LogKitLogger
[WARNING]   - org.apache.commons.logging.impl.Jdk14Logger
[WARNING]   - org.apache.commons.logging.LogConfigurationException
[WARNING]   - org.apache.commons.logging.impl.WeakHashtable$Referenced
[WARNING]   - org.apache.commons.logging.impl.WeakHashtable$WeakKey
[WARNING]   - 18 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, utils-metrics-spark2_2.11-0.2.15.jar define 407 overlapping classes: 
[WARNING]   - org.apache.spark.rdd.InstrumentedPairRDDFunctions$$anonfun$reduceByKey$2$$anonfun$apply$33
[WARNING]   - org.bdgenomics.utils.instrumentation.SimpleMonitorValueExtractor
[WARNING]   - org.apache.spark.rdd.InstrumentedPairRDDFunctions$$anonfun$fullOuterJoin$2
[WARNING]   - org.bdgenomics.utils.instrumentation.MetricsListener$$anonfun$onTaskEnd$2
[WARNING]   - org.bdgenomics.utils.metrics.Histogram$$anonfun$countSubset$1
[WARNING]   - org.bdgenomics.utils.instrumentation.Alignment$
[WARNING]   - org.apache.spark.rdd.InstrumentedPairRDDFunctions$$anonfun$saveAsHadoopFile$3
[WARNING]   - org.bdgenomics.utils.instrumentation.Metrics$TreeNode$$anonfun$6
[WARNING]   - org.apache.spark.rdd.InstrumentedRDD$$anonfun$reduce$1$$anonfun$apply$35
[WARNING]   - org.apache.spark.rdd.InstrumentedRDD$$anonfun$aggregate$1$$anonfun$apply$39$$anonfun$apply$40
[WARNING]   - 397 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, adam-cli-spark2_2.11-0.27.0-SNAPSHOT.jar define 157 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.cli.PrintADAMArgs
[WARNING]   - org.bdgenomics.adam.cli.Reads2Coverage$$anonfun$1$$anonfun$apply$1
[WARNING]   - org.bdgenomics.adam.cli.TransformVariants$$anonfun$maybeCoalesce$2
[WARNING]   - org.bdgenomics.adam.cli.FlagStat$
[WARNING]   - org.bdgenomics.adam.cli.TransformAlignments$$anonfun$7
[WARNING]   - org.bdgenomics.adam.cli.TransformGenotypes$$anonfun$run$2
[WARNING]   - org.bdgenomics.adam.cli.TransformVariants$
[WARNING]   - org.bdgenomics.adam.cli.View$$anonfun$getFilter$1$1
[WARNING]   - org.bdgenomics.adam.cli.TransformAlignments$$anonfun$maybeCoalesce$2
[WARNING]   - org.bdgenomics.adam.cli.TransformAlignments$$anonfun$4
[WARNING]   - 147 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, parquet-jackson-1.10.1.jar define 623 overlapping classes: 
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.InjectableValues$Std
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.introspect.POJOPropertyBuilder$Node
[WARNING]   - shaded.parquet.org.codehaus.jackson.util.TokenBuffer$1
[WARNING]   - shaded.parquet.org.codehaus.jackson.type.TypeReference
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.deser.std.FromStringDeserializer$LocaleDeserializer
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.ser.BasicSerializerFactory
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.deser.StdKeyDeserializer
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.deser.std.StdKeyDeserializer$EnumKD
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.ser.std.InetAddressSerializer
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.jsontype.impl.StdTypeResolverBuilder
[WARNING]   - 613 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, guava-27.0-jre.jar define 1955 overlapping classes: 
[WARNING]   - com.google.common.collect.CompactHashMap$Itr
[WARNING]   - com.google.common.collect.ImmutableMapValues$1
[WARNING]   - com.google.common.util.concurrent.AbstractService$5
[WARNING]   - com.google.common.io.LineProcessor
[WARNING]   - com.google.common.io.BaseEncoding$StandardBaseEncoding$2
[WARNING]   - com.google.common.io.ByteProcessor
[WARNING]   - com.google.common.math.package-info
[WARNING]   - com.google.common.util.concurrent.SimpleTimeLimiter
[WARNING]   - com.google.common.cache.AbstractCache$StatsCounter
[WARNING]   - com.google.common.util.concurrent.CycleDetectingLockFactory$Policies
[WARNING]   - 1945 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, animal-sniffer-annotations-1.17.jar define 1 overlapping classes: 
[WARNING]   - org.codehaus.mojo.animal_sniffer.IgnoreJRERequirement
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, commons-codec-1.11.jar define 96 overlapping classes: 
[WARNING]   - org.apache.commons.codec.language.Nysiis
[WARNING]   - org.apache.commons.codec.language.bm.Rule$1
[WARNING]   - org.apache.commons.codec.language.bm.Rule$RPattern
[WARNING]   - org.apache.commons.codec.language.ColognePhonetic$CologneInputBuffer
[WARNING]   - org.apache.commons.codec.digest.HmacUtils
[WARNING]   - org.apache.commons.codec.language.bm.BeiderMorseEncoder
[WARNING]   - org.apache.commons.codec.digest.UnixCrypt
[WARNING]   - org.apache.commons.codec.language.Soundex
[WARNING]   - org.apache.commons.codec.cli.Digest
[WARNING]   - org.apache.commons.codec.binary.BinaryCodec
[WARNING]   - 86 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, grizzled-slf4j_2.11-1.3.3.jar define 4 overlapping classes: 
[WARNING]   - grizzled.slf4j.Logger$
[WARNING]   - grizzled.slf4j.Logging
[WARNING]   - grizzled.slf4j.Logging$class
[WARNING]   - grizzled.slf4j.Logger
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, parquet-common-1.10.1.jar define 83 overlapping classes: 
[WARNING]   - org.apache.parquet.SemanticVersion$SemanticVersionParseException
[WARNING]   - org.apache.parquet.bytes.SingleBufferInputStream
[WARNING]   - org.apache.parquet.Ints
[WARNING]   - org.apache.parquet.Version
[WARNING]   - org.apache.parquet.SemanticVersion$NumberOrString
[WARNING]   - org.apache.parquet.glob.GlobNode$Atom
[WARNING]   - org.apache.parquet.util.DynMethods$Builder
[WARNING]   - org.apache.parquet.bytes.BytesInput$EmptyBytesInput
[WARNING]   - org.apache.parquet.Exceptions
[WARNING]   - org.apache.parquet.bytes.MultiBufferInputStream$ConcatIterator
[WARNING]   - 73 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, paranamer-2.7.jar define 21 overlapping classes: 
[WARNING]   - com.thoughtworks.paranamer.PositionalParanamer
[WARNING]   - com.thoughtworks.paranamer.JavadocParanamer
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer$Type
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer$1
[WARNING]   - com.thoughtworks.paranamer.JavadocParanamer$DirJavadocProvider
[WARNING]   - com.thoughtworks.paranamer.AnnotationParanamer$Jsr330Helper
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer$TypeCollector
[WARNING]   - com.thoughtworks.paranamer.AnnotationParanamer
[WARNING]   - com.thoughtworks.paranamer.NullParanamer
[WARNING]   - 11 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, parquet-format-2.4.0.jar define 461 overlapping classes: 
[WARNING]   - shaded.parquet.org.apache.thrift.transport.TSimpleFileTransport
[WARNING]   - shaded.parquet.org.apache.thrift.transport.TServerSocket
[WARNING]   - shaded.parquet.org.apache.thrift.transport.TFileTransport$TruncableBufferedInputStream
[WARNING]   - shaded.parquet.org.apache.thrift.TFieldIdEnum
[WARNING]   - org.apache.parquet.format.SchemaElement$SchemaElementStandardScheme
[WARNING]   - shaded.parquet.org.apache.thrift.server.AbstractNonblockingServer$AbstractSelectThread
[WARNING]   - shaded.parquet.org.apache.thrift.TEnumHelper
[WARNING]   - org.apache.parquet.format.JsonType$JsonTypeStandardScheme
[WARNING]   - org.apache.parquet.format.DictionaryPageHeader$DictionaryPageHeaderTupleScheme
[WARNING]   - org.apache.parquet.format.UUIDType$1
[WARNING]   - 451 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, xz-1.5.jar define 105 overlapping classes: 
[WARNING]   - org.tukaani.xz.lzma.LZMADecoder$LengthDecoder
[WARNING]   - org.tukaani.xz.index.IndexDecoder
[WARNING]   - org.tukaani.xz.lzma.LZMADecoder
[WARNING]   - org.tukaani.xz.lzma.LZMAEncoderFast
[WARNING]   - org.tukaani.xz.lzma.LZMAEncoder$LengthEncoder
[WARNING]   - org.tukaani.xz.BlockOutputStream
[WARNING]   - org.tukaani.xz.simple.SimpleFilter
[WARNING]   - org.tukaani.xz.rangecoder.RangeCoder
[WARNING]   - org.tukaani.xz.XZOutputStream
[WARNING]   - org.tukaani.xz.UncompressedLZMA2OutputStream
[WARNING]   - 95 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, snappy-java-1.1.1.3.jar define 19 overlapping classes: 
[WARNING]   - org.xerial.snappy.SnappyLoader
[WARNING]   - org.xerial.snappy.SnappyFramedInputStream$FrameMetaData
[WARNING]   - org.xerial.snappy.SnappyFramedInputStream
[WARNING]   - org.xerial.snappy.SnappyOutputStream
[WARNING]   - org.xerial.snappy.SnappyErrorCode
[WARNING]   - org.xerial.snappy.SnappyBundleActivator
[WARNING]   - org.xerial.snappy.SnappyFramedOutputStream
[WARNING]   - org.xerial.snappy.BufferRecycler
[WARNING]   - org.xerial.snappy.SnappyError
[WARNING]   - org.xerial.snappy.SnappyFramedInputStream$FrameAction
[WARNING]   - 9 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, aopalliance-1.0.jar define 9 overlapping classes: 
[WARNING]   - org.aopalliance.intercept.ConstructorInterceptor
[WARNING]   - org.aopalliance.intercept.MethodInvocation
[WARNING]   - org.aopalliance.intercept.MethodInterceptor
[WARNING]   - org.aopalliance.intercept.Invocation
[WARNING]   - org.aopalliance.aop.AspectException
[WARNING]   - org.aopalliance.intercept.Interceptor
[WARNING]   - org.aopalliance.intercept.Joinpoint
[WARNING]   - org.aopalliance.aop.Advice
[WARNING]   - org.aopalliance.intercept.ConstructorInvocation
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, javax.inject-1.jar define 6 overlapping classes: 
[WARNING]   - javax.inject.Inject
[WARNING]   - javax.inject.Singleton
[WARNING]   - javax.inject.Scope
[WARNING]   - javax.inject.Named
[WARNING]   - javax.inject.Provider
[WARNING]   - javax.inject.Qualifier
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, bdg-formats-0.12.0.jar define 61 overlapping classes: 
[WARNING]   - org.bdgenomics.formats.avro.VariantCallingAnnotations$Builder
[WARNING]   - org.bdgenomics.formats.avro.VariantAnnotation$Builder
[WARNING]   - org.bdgenomics.formats.avro.Reference$1
[WARNING]   - org.bdgenomics.formats.avro.VariantAnnotationMessage
[WARNING]   - org.bdgenomics.formats.avro.OntologyTerm$1
[WARNING]   - org.bdgenomics.formats.avro.Feature$1
[WARNING]   - org.bdgenomics.formats.avro.AlignmentRecord
[WARNING]   - org.bdgenomics.formats.avro.Sequence$1
[WARNING]   - org.bdgenomics.formats.avro.ReadGroup$Builder
[WARNING]   - org.bdgenomics.formats.avro.ReadGroup
[WARNING]   - 51 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, slf4j-api-1.7.25.jar define 34 overlapping classes: 
[WARNING]   - org.slf4j.helpers.SubstituteLogger
[WARNING]   - org.slf4j.helpers.NamedLoggerBase
[WARNING]   - org.slf4j.helpers.NOPMDCAdapter
[WARNING]   - org.slf4j.MarkerFactory
[WARNING]   - org.slf4j.spi.LoggerFactoryBinder
[WARNING]   - org.slf4j.helpers.BasicMarker
[WARNING]   - org.slf4j.MDC$MDCCloseable
[WARNING]   - org.slf4j.spi.LocationAwareLogger
[WARNING]   - org.slf4j.helpers.MessageFormatter
[WARNING]   - org.slf4j.helpers.Util$ClassContextSecurityManager
[WARNING]   - 24 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, scala-reflect-2.11.12.jar define 2829 overlapping classes: 
[WARNING]   - scala.reflect.runtime.ReflectionUtils
[WARNING]   - scala.reflect.internal.Scopes$LookupInaccessible$
[WARNING]   - scala.reflect.internal.Types$LazyType
[WARNING]   - scala.reflect.internal.Definitions$DefinitionsClass$$anonfun$newT1NullaryMethod$1
[WARNING]   - scala.reflect.internal.SymbolPairs$Cursor
[WARNING]   - scala.reflect.internal.Types$StaticallyAnnotatedType$
[WARNING]   - scala.reflect.runtime.SynchronizedOps$SynchronizedScope$$anonfun$isEmpty$1
[WARNING]   - scala.reflect.internal.Kinds$TypeConKind$$anonfun$buildState$3
[WARNING]   - scala.reflect.api.StandardLiftables$StandardUnliftableInstances$$anonfun$unliftTuple18$1
[WARNING]   - scala.reflect.runtime.SynchronizedSymbols$SynchronizedClassSymbol
[WARNING]   - 2819 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, htsjdk-2.18.2.jar define 978 overlapping classes: 
[WARNING]   - htsjdk.samtools.cram.ref.ReferenceSource
[WARNING]   - htsjdk.samtools.cram.compression.ExternalCompressor$3
[WARNING]   - htsjdk.samtools.HighAccuracyDownsamplingIterator
[WARNING]   - htsjdk.samtools.util.zip.DeflaterFactory
[WARNING]   - htsjdk.samtools.filter.DuplicateReadFilter
[WARNING]   - htsjdk.samtools.cram.encoding.core.huffmanUtils.HuffmanCode$1
[WARNING]   - htsjdk.samtools.cram.encoding.core.SubexponentialIntegerEncoding
[WARNING]   - htsjdk.variant.vcf.VCFEncoder
[WARNING]   - htsjdk.samtools.util.CloserUtil
[WARNING]   - htsjdk.tribble.TribbleException$FeatureFileDoesntExist
[WARNING]   - 968 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, jsr305-1.3.9.jar define 35 overlapping classes: 
[WARNING]   - javax.annotation.RegEx
[WARNING]   - javax.annotation.concurrent.Immutable
[WARNING]   - javax.annotation.meta.TypeQualifierDefault
[WARNING]   - javax.annotation.meta.TypeQualifier
[WARNING]   - javax.annotation.Syntax
[WARNING]   - javax.annotation.CheckForNull
[WARNING]   - javax.annotation.Nonnull
[WARNING]   - javax.annotation.CheckReturnValue
[WARNING]   - javax.annotation.meta.TypeQualifierNickname
[WARNING]   - javax.annotation.MatchesPattern
[WARNING]   - 25 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, parquet-encoding-1.10.1.jar define 305 overlapping classes: 
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongBE$Packer25
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingBE$Packer14
[WARNING]   - org.apache.parquet.column.values.bitpacking.BytePackerForLong
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingLE$Packer10
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongLE$Packer54
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongLE$Packer41
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingLE$Packer30
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingLE$Packer23
[WARNING]   - org.apache.parquet.column.values.bitpacking.LemireBitPackingLE$Packer19
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongLE$Packer21
[WARNING]   - 295 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, hadoop-bam-7.9.2.jar define 115 overlapping classes: 
[WARNING]   - org.seqdoop.hadoop_bam.BAMSplitGuesser
[WARNING]   - org.seqdoop.hadoop_bam.util.SAMHeaderReader
[WARNING]   - org.seqdoop.hadoop_bam.QseqInputFormat
[WARNING]   - org.seqdoop.hadoop_bam.KeyIgnoringBCFRecordWriter
[WARNING]   - org.seqdoop.hadoop_bam.FastaInputFormat$1
[WARNING]   - org.seqdoop.hadoop_bam.util.SAMOutputPreparer$1
[WARNING]   - org.seqdoop.hadoop_bam.QseqOutputFormat$QseqRecordWriter
[WARNING]   - org.seqdoop.hadoop_bam.FastaInputFormat
[WARNING]   - org.seqdoop.hadoop_bam.LineReader
[WARNING]   - org.seqdoop.hadoop_bam.util.BGZFCodec
[WARNING]   - 105 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, args4j-2.33.jar define 74 overlapping classes: 
[WARNING]   - org.kohsuke.args4j.spi.DoubleOptionHandler
[WARNING]   - org.kohsuke.args4j.spi.MethodSetter
[WARNING]   - org.kohsuke.args4j.spi.MacAddressOptionHandler
[WARNING]   - org.kohsuke.args4j.spi.StringArrayOptionHandler
[WARNING]   - org.kohsuke.args4j.spi.SubCommand
[WARNING]   - org.kohsuke.args4j.spi.PatternOptionHandler
[WARNING]   - org.kohsuke.args4j.ParserProperties$1
[WARNING]   - org.kohsuke.args4j.OptionHandlerFilter$2
[WARNING]   - org.kohsuke.args4j.spi.MultiFileOptionHandler
[WARNING]   - org.kohsuke.args4j.OptionHandlerRegistry$DefaultConstructorHandlerFactory
[WARNING]   - 64 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, spectator-api-0.67.0.jar define 97 overlapping classes: 
[WARNING]   - com.netflix.spectator.api.patterns.PolledMeter$ValueState
[WARNING]   - com.netflix.spectator.api.patterns.PolledMeter
[WARNING]   - com.netflix.spectator.api.NoopId
[WARNING]   - com.netflix.spectator.impl.Scheduler$Options
[WARNING]   - com.netflix.spectator.api.AbstractTimer
[WARNING]   - com.netflix.spectator.impl.AsciiSet
[WARNING]   - com.netflix.spectator.api.Counter
[WARNING]   - com.netflix.spectator.api.DefaultRegistry
[WARNING]   - com.netflix.spectator.api.SwapGauge
[WARNING]   - com.netflix.spectator.api.CompositeCounter
[WARNING]   - 87 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, error_prone_annotations-2.2.0.jar define 22 overlapping classes: 
[WARNING]   - com.google.errorprone.annotations.NoAllocation
[WARNING]   - com.google.errorprone.annotations.Var
[WARNING]   - com.google.errorprone.annotations.IncompatibleModifiers
[WARNING]   - com.google.errorprone.annotations.CompatibleWith
[WARNING]   - com.google.errorprone.annotations.concurrent.LockMethod
[WARNING]   - com.google.errorprone.annotations.FormatString
[WARNING]   - com.google.errorprone.annotations.DoNotCall
[WARNING]   - com.google.errorprone.annotations.Immutable
[WARNING]   - com.google.errorprone.annotations.RestrictedApi
[WARNING]   - com.google.errorprone.annotations.ForOverride
[WARNING]   - 12 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, commons-pool-1.6.jar define 55 overlapping classes: 
[WARNING]   - org.apache.commons.pool.PoolUtils$PoolableObjectFactoryAdaptor
[WARNING]   - org.apache.commons.pool.impl.GenericObjectPool$1
[WARNING]   - org.apache.commons.pool.impl.GenericObjectPool$Latch
[WARNING]   - org.apache.commons.pool.PoolUtils$ErodingFactor
[WARNING]   - org.apache.commons.pool.BasePoolableObjectFactory
[WARNING]   - org.apache.commons.pool.PoolUtils$KeyedPoolableObjectFactoryAdaptor
[WARNING]   - org.apache.commons.pool.impl.EvictionTimer$PrivilegedGetTccl
[WARNING]   - org.apache.commons.pool.impl.StackKeyedObjectPool
[WARNING]   - org.apache.commons.pool.BaseKeyedPoolableObjectFactory
[WARNING]   - org.apache.commons.pool.impl.GenericKeyedObjectPool$ObjectQueue
[WARNING]   - 45 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, parquet-column-1.10.1.jar define 792 overlapping classes: 
[WARNING]   - org.apache.parquet.it.unimi.dsi.fastutil.longs.LongComparator
[WARNING]   - org.apache.parquet.column.values.dictionary.DictionaryValuesWriter$PlainIntegerDictionaryValuesWriter
[WARNING]   - org.apache.parquet.io.PrimitiveColumnIO
[WARNING]   - org.apache.parquet.io.api.Binary$ByteBufferBackedBinary
[WARNING]   - org.apache.parquet.it.unimi.dsi.fastutil.doubles.DoubleSortedSet
[WARNING]   - org.apache.parquet.io.BaseRecordReader
[WARNING]   - org.apache.parquet.column.ParquetProperties$1
[WARNING]   - org.apache.parquet.column.UnknownColumnException
[WARNING]   - org.apache.parquet.filter.ColumnPredicates$12
[WARNING]   - org.apache.parquet.schema.Types$BaseMapBuilder$ListValueBuilder
[WARNING]   - 782 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, adam-core-spark2_2.11-0.27.0-SNAPSHOT.jar define 2800 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.rdd.feature.FeatureDataset$$anonfun$filterToTranscript$2$$anonfun$apply$9$$anonfun$apply$10
[WARNING]   - org.bdgenomics.adam.converters.VariantContextConverter$$anonfun$74$$anonfun$apply$77
[WARNING]   - org.bdgenomics.adam.rdd.read.AlignmentRecordDataset$$typecreator9$1
[WARNING]   - org.bdgenomics.adam.sql.VariantCallingAnnotations$$anonfun$toAvro$58
[WARNING]   - org.bdgenomics.adam.rdd.feature.DatasetBoundFeatureDataset$$anonfun$filterByAttribute$1
[WARNING]   - org.bdgenomics.adam.converters.VariantContextConverter$$anonfun$63
[WARNING]   - org.bdgenomics.adam.io.InterleavedFastqInputFormat
[WARNING]   - org.bdgenomics.adam.util.FileMerger$$anonfun$mergeFilesAcrossFilesystems$4
[WARNING]   - org.bdgenomics.adam.rdd.fragment.DatasetBoundFragmentDataset
[WARNING]   - org.bdgenomics.adam.sql.VariantAnnotation$$anonfun$fromAvro$80
[WARNING]   - 2790 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, utils-io-spark2_2.11-0.2.15.jar define 22 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.io.HTTPRangedByteAccess$$anonfun$3
[WARNING]   - org.bdgenomics.utils.io.FileLocator
[WARNING]   - org.bdgenomics.utils.io.ByteAccess
[WARNING]   - org.bdgenomics.utils.io.HTTPRangedByteAccess
[WARNING]   - org.bdgenomics.utils.io.HTTPRangedByteAccess$$anonfun$readByteStream$1
[WARNING]   - org.bdgenomics.utils.io.HTTPRangedByteAccess$$anonfun$1
[WARNING]   - org.bdgenomics.utils.io.ByteAccess$$anonfun$readFully$2
[WARNING]   - org.bdgenomics.utils.io.HTTPRangedByteAccess$$anonfun$2
[WARNING]   - org.bdgenomics.utils.io.LocalFileByteAccess
[WARNING]   - org.bdgenomics.utils.io.HTTPFileLocator$
[WARNING]   - 12 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, jackson-mapper-asl-1.9.13.jar define 502 overlapping classes: 
[WARNING]   - org.codehaus.jackson.map.ext.DOMSerializer
[WARNING]   - org.codehaus.jackson.node.POJONode
[WARNING]   - org.codehaus.jackson.map.ser.StdSerializers$UtilDateSerializer
[WARNING]   - org.codehaus.jackson.map.deser.std.JsonNodeDeserializer$ArrayDeserializer
[WARNING]   - org.codehaus.jackson.map.ext.JodaDeserializers$LocalDateDeserializer
[WARNING]   - org.codehaus.jackson.map.deser.std.PrimitiveArrayDeserializers$StringDeser
[WARNING]   - org.codehaus.jackson.map.util.Comparators$1
[WARNING]   - org.codehaus.jackson.map.util.StdDateFormat
[WARNING]   - org.codehaus.jackson.map.KeyDeserializer
[WARNING]   - org.codehaus.jackson.map.MapperConfig$Impl
[WARNING]   - 492 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, commons-jexl-2.1.1.jar define 178 overlapping classes: 
[WARNING]   - org.apache.commons.jexl2.internal.AbstractExecutor$Get
[WARNING]   - org.apache.commons.jexl2.introspection.JexlPropertyGet
[WARNING]   - org.apache.commons.jexl2.parser.StringParser
[WARNING]   - org.apache.commons.jexl2.parser.ASTBitwiseOrNode
[WARNING]   - org.apache.commons.jexl2.internal.introspection.MethodKey$1
[WARNING]   - org.apache.commons.jexl2.Main
[WARNING]   - org.apache.commons.jexl2.parser.ASTForeachStatement
[WARNING]   - org.apache.commons.jexl2.introspection.Sandbox
[WARNING]   - org.apache.commons.jexl2.internal.introspection.ClassMap
[WARNING]   - org.apache.commons.jexl2.parser.ASTFunctionNode
[WARNING]   - 168 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, jackson-core-asl-1.9.13.jar define 121 overlapping classes: 
[WARNING]   - org.codehaus.jackson.annotate.JsonManagedReference
[WARNING]   - org.codehaus.jackson.util.DefaultPrettyPrinter$FixedSpaceIndenter
[WARNING]   - org.codehaus.jackson.JsonGenerationException
[WARNING]   - org.codehaus.jackson.util.BufferRecycler$CharBufferType
[WARNING]   - org.codehaus.jackson.io.UTF32Reader
[WARNING]   - org.codehaus.jackson.sym.Name1
[WARNING]   - org.codehaus.jackson.util.MinimalPrettyPrinter
[WARNING]   - org.codehaus.jackson.impl.JsonParserBase
[WARNING]   - org.codehaus.jackson.sym.CharsToNameCanonicalizer$Bucket
[WARNING]   - org.codehaus.jackson.annotate.JsonValue
[WARNING]   - 111 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, ngs-java-2.9.0.jar define 73 overlapping classes: 
[WARNING]   - ngs.itf.ReadItf
[WARNING]   - gov.nih.nlm.ncbi.ngs.error.cause.JvmErrorCause
[WARNING]   - ngs.ReferenceIterator
[WARNING]   - gov.nih.nlm.ncbi.ngs.Manager$1
[WARNING]   - gov.nih.nlm.ncbi.ngs.LMProperties
[WARNING]   - gov.nih.nlm.ncbi.ngs.error.LibraryLoadError
[WARNING]   - gov.nih.nlm.ncbi.ngs.LibDependencies
[WARNING]   - gov.nih.nlm.ncbi.ngs.error.cause.ConnectionProblemCause
[WARNING]   - ngs.itf.PileupEventItf
[WARNING]   - gov.nih.nlm.ncbi.ngs.LibManager$Location
[WARNING]   - 63 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, minlog-1.2.jar define 2 overlapping classes: 
[WARNING]   - com.esotericsoftware.minlog.Log
[WARNING]   - com.esotericsoftware.minlog.Log$Logger
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, httpcore-4.4.11.jar define 252 overlapping classes: 
[WARNING]   - org.apache.http.protocol.HttpRequestHandler
[WARNING]   - org.apache.http.impl.io.ChunkedOutputStream
[WARNING]   - org.apache.http.protocol.ChainBuilder
[WARNING]   - org.apache.http.impl.entity.DisallowIdentityContentLengthStrategy
[WARNING]   - org.apache.http.impl.ConnSupport
[WARNING]   - org.apache.http.impl.io.DefaultHttpResponseParserFactory
[WARNING]   - org.apache.http.HttpClientConnection
[WARNING]   - org.apache.http.NameValuePair
[WARNING]   - org.apache.http.protocol.HttpExpectationVerifier
[WARNING]   - org.apache.http.impl.io.AbstractMessageWriter
[WARNING]   - 242 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, servo-core-0.12.25.jar define 172 overlapping classes: 
[WARNING]   - com.netflix.servo.util.Objects
[WARNING]   - com.netflix.servo.util.Clock
[WARNING]   - com.netflix.servo.monitor.MaxGauge
[WARNING]   - com.netflix.servo.publish.MonitorRegistryMetricPoller$MonitorValueCallable
[WARNING]   - com.netflix.servo.publish.LocalJmxConnector
[WARNING]   - com.netflix.servo.util.ExpiringCache$Entry
[WARNING]   - com.netflix.servo.util.ThreadCpuStats$CpuUsage
[WARNING]   - com.netflix.servo.tag.SmallTagMap$Builder
[WARNING]   - com.netflix.servo.util.Reflection
[WARNING]   - com.netflix.servo.stats.StatsConfig
[WARNING]   - 162 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, fastutil-6.6.5.jar define 10700 overlapping classes: 
[WARNING]   - it.unimi.dsi.fastutil.doubles.Double2IntRBTreeMap$Submap$KeySet
[WARNING]   - it.unimi.dsi.fastutil.longs.Long2CharAVLTreeMap$2$1
[WARNING]   - it.unimi.dsi.fastutil.bytes.Byte2ObjectLinkedOpenHashMap$EntryIterator
[WARNING]   - it.unimi.dsi.fastutil.ints.Int2ReferenceRBTreeMap$Submap
[WARNING]   - it.unimi.dsi.fastutil.shorts.Short2FloatOpenCustomHashMap$KeySet
[WARNING]   - it.unimi.dsi.fastutil.bytes.Byte2BooleanRBTreeMap$Submap$1
[WARNING]   - it.unimi.dsi.fastutil.floats.AbstractFloat2ShortSortedMap$ValuesCollection
[WARNING]   - it.unimi.dsi.fastutil.longs.Long2ReferenceRBTreeMap$Submap$2
[WARNING]   - it.unimi.dsi.fastutil.chars.AbstractChar2LongSortedMap$ValuesIterator
[WARNING]   - it.unimi.dsi.fastutil.doubles.DoubleHeaps
[WARNING]   - 10690 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, httpclient-4.5.7.jar define 467 overlapping classes: 
[WARNING]   - org.apache.http.impl.cookie.RFC2109Spec
[WARNING]   - org.apache.http.impl.execchain.MainClientExec
[WARNING]   - org.apache.http.conn.routing.RouteInfo$TunnelType
[WARNING]   - org.apache.http.client.methods.HttpGet
[WARNING]   - org.apache.http.impl.cookie.BrowserCompatSpecFactory
[WARNING]   - org.apache.http.impl.client.HttpAuthenticator
[WARNING]   - org.apache.http.conn.ManagedClientConnection
[WARNING]   - org.apache.http.client.protocol.RequestAuthCache
[WARNING]   - org.apache.http.conn.params.ConnConnectionParamBean
[WARNING]   - org.apache.http.impl.client.IdleConnectionEvictor
[WARNING]   - 457 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, kryo-2.24.0.jar define 193 overlapping classes: 
[WARNING]   - com.esotericsoftware.kryo.serializers.BeanSerializer$1
[WARNING]   - com.esotericsoftware.kryo.Registration
[WARNING]   - com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.Handler
[WARNING]   - com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ByteVector
[WARNING]   - com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.FieldVisitor
[WARNING]   - com.esotericsoftware.kryo.util.IntMap$Values
[WARNING]   - com.esotericsoftware.kryo.serializers.DefaultSerializers$IntSerializer
[WARNING]   - com.esotericsoftware.kryo.serializers.FieldSerializerUnsafeUtilImpl
[WARNING]   - com.esotericsoftware.kryo.serializers.JavaSerializer
[WARNING]   - com.esotericsoftware.kryo.serializers.ObjectField$ObjectIntField
[WARNING]   - 183 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, utils-cli-spark2_2.11-0.2.15.jar define 23 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.cli.Args4j
[WARNING]   - org.bdgenomics.utils.cli.BDGSparkCommand$$anonfun$printMetrics$4
[WARNING]   - org.bdgenomics.utils.cli.ParquetArgs
[WARNING]   - org.bdgenomics.utils.cli.BDGSparkCommand$$anonfun$printMetrics$3
[WARNING]   - org.bdgenomics.utils.cli.BDGSparkCommand$class
[WARNING]   - org.bdgenomics.utils.cli.BDGSparkCommand$$anonfun$printMetrics$2
[WARNING]   - org.bdgenomics.utils.cli.BDGSparkCommand$$anonfun$run$1
[WARNING]   - org.bdgenomics.utils.cli.ParquetArgs$class
[WARNING]   - org.bdgenomics.utils.cli.BDGSparkCommand$$anonfun$printMetrics$1
[WARNING]   - org.bdgenomics.utils.cli.SaveArgs
[WARNING]   - 13 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, objenesis-2.1.jar define 37 overlapping classes: 
[WARNING]   - org.objenesis.ObjenesisBase
[WARNING]   - org.objenesis.instantiator.gcj.GCJInstantiator
[WARNING]   - org.objenesis.strategy.SingleInstantiatorStrategy
[WARNING]   - org.objenesis.ObjenesisHelper
[WARNING]   - org.objenesis.instantiator.sun.SunReflectionFactoryHelper
[WARNING]   - org.objenesis.instantiator.jrockit.JRockitLegacyInstantiator
[WARNING]   - org.objenesis.instantiator.sun.SunReflectionFactoryInstantiator
[WARNING]   - org.objenesis.instantiator.basic.NullInstantiator
[WARNING]   - org.objenesis.instantiator.android.Android17Instantiator
[WARNING]   - org.objenesis.instantiator.ObjectInstantiator
[WARNING]   - 27 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, adam-codegen-spark2_2.11-0.27.0-SNAPSHOT.jar define 18 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProjectionEnums$
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProjectionEnums
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct$$anonfun$getters$1
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct$$anonfun$setters$1
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct$$anonfun$1
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct$$anonfun$apply$1
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct$
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct$$anonfun$fields$1
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProjectionEnums$$anonfun$fields$1
[WARNING]   - 8 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, avro-1.8.2.jar define 1172 overlapping classes: 
[WARNING]   - org.apache.avro.message.SchemaStore
[WARNING]   - avro.shaded.com.google.common.collect.SingletonImmutableList
[WARNING]   - org.apache.avro.io.EncoderFactory$DefaultEncoderFactory
[WARNING]   - avro.shaded.com.google.common.collect.Iterables$15
[WARNING]   - org.apache.avro.GuavaClasses
[WARNING]   - avro.shaded.com.google.common.collect.Sets$PowerSet$1$1
[WARNING]   - avro.shaded.com.google.common.collect.RegularImmutableMap
[WARNING]   - org.apache.avro.generic.GenericDatumReader$2
[WARNING]   - avro.shaded.com.google.common.collect.Synchronized$SynchronizedSortedSet
[WARNING]   - org.apache.avro.file.BZip2Codec
[WARNING]   - 1162 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, jsr203hadoop-1.0.3.jar define 27 overlapping classes: 
[WARNING]   - hdfs.jsr203.HadoopDirectoryStream
[WARNING]   - hdfs.jsr203.HadoopPath
[WARNING]   - hdfs.jsr203.HadoopFileSystem$1
[WARNING]   - hdfs.jsr203.HadoopFileOwnerAttributeView
[WARNING]   - hdfs.jsr203.HadoopUserPrincipal
[WARNING]   - hdfs.jsr203.IAttributeReader
[WARNING]   - hdfs.jsr203.HadoopPath$1
[WARNING]   - hdfs.jsr203.HadoopBasicFileAttributes
[WARNING]   - hdfs.jsr203.HadoopDirectoryStream$1
[WARNING]   - hdfs.jsr203.package-info
[WARNING]   - 17 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, commons-io-2.6.jar define 127 overlapping classes: 
[WARNING]   - org.apache.commons.io.FileCleaningTracker
[WARNING]   - org.apache.commons.io.comparator.SizeFileComparator
[WARNING]   - org.apache.commons.io.input.CloseShieldInputStream
[WARNING]   - org.apache.commons.io.ByteOrderParser
[WARNING]   - org.apache.commons.io.filefilter.EmptyFileFilter
[WARNING]   - org.apache.commons.io.monitor.FileEntry
[WARNING]   - org.apache.commons.io.output.ThresholdingOutputStream
[WARNING]   - org.apache.commons.io.input.TailerListener
[WARNING]   - org.apache.commons.io.IOExceptionWithCause
[WARNING]   - org.apache.commons.io.filefilter.NotFileFilter
[WARNING]   - 117 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, scala-guice_2.11-4.2.1.jar define 95 overlapping classes: 
[WARNING]   - net.codingwell.scalaguice.ScalaModule$
[WARNING]   - net.codingwell.scalaguice.ScalaModule$ScalaScopedBindingBuilder
[WARNING]   - net.codingwell.scalaguice.InjectorExtensions$ScalaInjector$$typecreator2$1
[WARNING]   - net.codingwell.scalaguice.ScalaModule$ScalaLinkedBindingBuilder$$anon$2$$typecreator1$1
[WARNING]   - net.codingwell.scalaguice.binder.ScopedBindingBuilderProxy
[WARNING]   - net.codingwell.scalaguice.InternalModule$BindingBuilder$$typecreator1$2
[WARNING]   - net.codingwell.scalaguice.ScalaModule$$anonfun$filterTrace$2
[WARNING]   - net.codingwell.scalaguice.ScalaPrivateModule$ElementBuilder
[WARNING]   - net.codingwell.scalaguice.binder.LinkedBindingBuilderProxy$class
[WARNING]   - net.codingwell.scalaguice.ScalaModule$ScalaAnnotatedBindingBuilder
[WARNING]   - 85 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, commons-compress-1.8.1.jar define 191 overlapping classes: 
[WARNING]   - org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry
[WARNING]   - org.apache.commons.compress.archivers.dump.ShortFileException
[WARNING]   - org.apache.commons.compress.utils.CountingInputStream
[WARNING]   - org.apache.commons.compress.compressors.bzip2.CRC
[WARNING]   - org.apache.commons.compress.compressors.bzip2.BZip2CompressorOutputStream
[WARNING]   - org.apache.commons.compress.archivers.dump.DumpArchiveEntry
[WARNING]   - org.apache.commons.compress.changes.ChangeSetPerformer$ArchiveEntryIterator
[WARNING]   - org.apache.commons.compress.compressors.bzip2.BlockSort
[WARNING]   - org.apache.commons.compress.archivers.tar.TarArchiveEntry
[WARNING]   - org.apache.commons.compress.archivers.dump.UnsupportedCompressionAlgorithmException
[WARNING]   - 181 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, j2objc-annotations-1.1.jar define 12 overlapping classes: 
[WARNING]   - com.google.j2objc.annotations.Property
[WARNING]   - com.google.j2objc.annotations.RetainedWith
[WARNING]   - com.google.j2objc.annotations.RetainedLocalRef
[WARNING]   - com.google.j2objc.annotations.J2ObjCIncompatible
[WARNING]   - com.google.j2objc.annotations.AutoreleasePool
[WARNING]   - com.google.j2objc.annotations.LoopTranslation$LoopStyle
[WARNING]   - com.google.j2objc.annotations.ReflectionSupport$Level
[WARNING]   - com.google.j2objc.annotations.ReflectionSupport
[WARNING]   - com.google.j2objc.annotations.WeakOuter
[WARNING]   - com.google.j2objc.annotations.Weak
[WARNING]   - 2 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, adam-apis-spark2_2.11-0.27.0-SNAPSHOT.jar define 133 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.api.java.FeaturesToFragmentsConverter
[WARNING]   - org.bdgenomics.adam.api.java.GenotypesToCoverageConverter
[WARNING]   - org.bdgenomics.adam.api.java.GenotypesToFragmentsConverter
[WARNING]   - org.bdgenomics.adam.api.java.ToVariantDatasetConversion$class
[WARNING]   - org.bdgenomics.adam.api.java.ToCoverageDatasetConversion$class
[WARNING]   - org.bdgenomics.adam.api.java.FeaturesToVariantsDatasetConverter
[WARNING]   - org.bdgenomics.adam.api.java.ToVariantDatasetConversion$$typecreator7$1
[WARNING]   - org.bdgenomics.adam.api.java.VariantContextsToContigsConverter
[WARNING]   - org.bdgenomics.adam.api.java.VariantsToContigsDatasetConverter
[WARNING]   - org.bdgenomics.adam.api.java.VariantsToVariantsConverter
[WARNING]   - 123 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, parquet-hadoop-1.10.1.jar define 162 overlapping classes: 
[WARNING]   - org.apache.parquet.hadoop.mapred.DeprecatedParquetOutputFormat
[WARNING]   - org.apache.parquet.hadoop.api.WriteSupport$WriteContext
[WARNING]   - org.apache.parquet.format.converter.ParquetMetadataConverter$RangeMetadataFilter
[WARNING]   - org.apache.parquet.hadoop.ColumnChunkPageReadStore$ColumnChunkPageReader$1
[WARNING]   - org.apache.parquet.format.converter.ParquetMetadataConverter$2
[WARNING]   - org.apache.parquet.hadoop.metadata.ColumnChunkMetaData
[WARNING]   - org.apache.parquet.hadoop.api.WriteSupport$FinalizedWriteContext
[WARNING]   - org.apache.parquet.hadoop.util.HadoopPositionOutputStream
[WARNING]   - org.apache.parquet.HadoopReadOptions$1
[WARNING]   - org.apache.parquet.format.converter.ParquetMetadataConverter$NoFilter
[WARNING]   - 152 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, guice-4.2.0.jar define 573 overlapping classes: 
[WARNING]   - com.google.inject.Scope
[WARNING]   - com.google.inject.Binding
[WARNING]   - com.google.inject.internal.cglib.core.$EmitUtils$3
[WARNING]   - com.google.inject.spi.TypeConverter
[WARNING]   - com.google.inject.internal.ConstructionProxy
[WARNING]   - com.google.inject.spi.InjectionPoint
[WARNING]   - com.google.inject.spi.StaticInjectionRequest
[WARNING]   - com.google.inject.internal.cglib.proxy.$FixedValueGenerator
[WARNING]   - com.google.inject.internal.cglib.proxy.$DispatcherGenerator
[WARNING]   - com.google.inject.spi.Elements$ElementsAsModule
[WARNING]   - 563 more...
[WARNING] adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, utils-misc-spark2_2.11-0.2.15.jar define 11 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.misc.MathUtils
[WARNING]   - org.bdgenomics.utils.misc.Logging$
[WARNING]   - org.bdgenomics.utils.misc.MathUtils$$anonfun$scalarArrayMultiply$1
[WARNING]   - org.bdgenomics.utils.misc.Logging$class
[WARNING]   - org.bdgenomics.utils.misc.MathUtils$$anonfun$aggregateArray$1
[WARNING]   - org.bdgenomics.utils.misc.Logging
[WARNING]   - org.bdgenomics.utils.misc.HadoopUtil$
[WARNING]   - org.bdgenomics.utils.misc.Logging$$anonfun$getContextOrClassLoader$1
[WARNING]   - org.bdgenomics.utils.misc.MathUtils$$anonfun$softmax$1
[WARNING]   - org.bdgenomics.utils.misc.MathUtils$
[WARNING]   - 1 more...
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try to manually exclude artifacts based on
[WARNING] mvn dependency:tree -Ddetail=true and the above output.
[WARNING] See http://maven.apache.org/plugins/maven-shade-plugin/
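The warning above suggests manually excluding artifacts whose classes already ship in the uber jar. As a minimal sketch (not part of this build's actual pom.xml), such an exclusion could be added to the assembly module's maven-shade-plugin configuration via an `artifactSet`; the coordinates below use `it.unimi.dsi:fastutil` purely as an example taken from the overlap list above, and the surrounding plugin setup is assumed:

```xml
<!-- Hypothetical sketch only: exclude one overlapping dependency from the shaded jar.
     The groupId:artifactId value comes from the overlap warnings above; the rest of
     the maven-shade-plugin configuration in the real pom.xml is assumed. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <!-- e.g. drop fastutil, whose 10700 classes are already present -->
        <exclude>it.unimi.dsi:fastutil</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>
```

Whether an exclusion is safe depends on which copy of the classes the shaded jar should keep, which is why the warning recommends consulting `mvn dependency:tree -Ddetail=true` first.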
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT-shaded.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Python APIs 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-python-spark2_2.11 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-python-spark2_2.11 ---
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-python-spark2_2.11 ---
[INFO] Modified 0 of 0 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-python-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-python/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-python-spark2_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:exec (dev-python) @ adam-python-spark2_2.11 ---
pip install -e .
Obtaining file:///tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-python
Requirement already satisfied: pyspark>=1.6.0 in /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/python (from bdgenomics.adam==0.26.0a0) (2.4.3)
Collecting py4j==0.10.7 (from pyspark>=1.6.0->bdgenomics.adam==0.26.0a0)
  Using cached https://files.pythonhosted.org/packages/e3/53/c737818eb9a7dc32a7cd4f1396e787bd94200c3997c72c1dbe028587bd76/py4j-0.10.7-py2.py3-none-any.whl
Installing collected packages: bdgenomics.adam, py4j
  Found existing installation: bdgenomics.adam 0.26.0a0
    Can't uninstall 'bdgenomics.adam'. No files were found to uninstall.
  Running setup.py develop for bdgenomics.adam
Successfully installed bdgenomics.adam py4j-0.10.7
python setup.py bdist_egg
running bdist_egg
running egg_info
writing bdgenomics.adam.egg-info/PKG-INFO
writing dependency_links to bdgenomics.adam.egg-info/dependency_links.txt
writing requirements to bdgenomics.adam.egg-info/requires.txt
writing top-level names to bdgenomics.adam.egg-info/top_level.txt
Could not import pypandoc - required to package bdgenomics.adam
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'deps/bin/__init__.py' not found (or not a regular file)
reading manifest file 'bdgenomics.adam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'bdgenomics.adam.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib
creating build/lib/bdgenomics
copying bdgenomics/__init__.py -> build/lib/bdgenomics
creating build/lib/bdgenomics/adam
copying bdgenomics/adam/models.py -> build/lib/bdgenomics/adam
copying bdgenomics/adam/rdd.py -> build/lib/bdgenomics/adam
copying bdgenomics/adam/stringency.py -> build/lib/bdgenomics/adam
copying bdgenomics/adam/adamContext.py -> build/lib/bdgenomics/adam
copying bdgenomics/adam/find_adam_home.py -> build/lib/bdgenomics/adam
copying bdgenomics/adam/__init__.py -> build/lib/bdgenomics/adam
creating build/lib/bdgenomics/adam/test
copying bdgenomics/adam/test/coverageDataset_test.py -> build/lib/bdgenomics/adam/test
copying bdgenomics/adam/test/adamContext_test.py -> build/lib/bdgenomics/adam/test
copying bdgenomics/adam/test/variantDataset_test.py -> build/lib/bdgenomics/adam/test
copying bdgenomics/adam/test/genotypeDataset_test.py -> build/lib/bdgenomics/adam/test
copying bdgenomics/adam/test/featureDataset_test.py -> build/lib/bdgenomics/adam/test
copying bdgenomics/adam/test/alignmentRecordDataset_test.py -> build/lib/bdgenomics/adam/test
copying bdgenomics/adam/test/__init__.py -> build/lib/bdgenomics/adam/test
creating build/lib/bdgenomics/adam/jars
copying deps/jars/adam.jar -> build/lib/bdgenomics/adam/jars
creating build/lib/bdgenomics/adam/bin
copying deps/bin/adam-shell -> build/lib/bdgenomics/adam/bin
copying deps/bin/adam-submit -> build/lib/bdgenomics/adam/bin
copying deps/bin/adamR -> build/lib/bdgenomics/adam/bin
copying deps/bin/find-adam-assembly.sh -> build/lib/bdgenomics/adam/bin
copying deps/bin/find-adam-egg.sh -> build/lib/bdgenomics/adam/bin
copying deps/bin/find-adam-home -> build/lib/bdgenomics/adam/bin
copying deps/bin/find-spark.sh -> build/lib/bdgenomics/adam/bin
copying deps/bin/pyadam -> build/lib/bdgenomics/adam/bin
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/bdgenomics
creating build/bdist.linux-x86_64/egg/bdgenomics/adam
copying build/lib/bdgenomics/adam/models.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam
creating build/bdist.linux-x86_64/egg/bdgenomics/adam/jars
copying build/lib/bdgenomics/adam/jars/adam.jar -> build/bdist.linux-x86_64/egg/bdgenomics/adam/jars
copying build/lib/bdgenomics/adam/rdd.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam
creating build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/test/coverageDataset_test.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/test/adamContext_test.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/test/variantDataset_test.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/test/genotypeDataset_test.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/test/featureDataset_test.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/test/alignmentRecordDataset_test.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/test/__init__.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam/test
copying build/lib/bdgenomics/adam/stringency.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam
creating build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/adam-shell -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/find-adam-home -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/find-spark.sh -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/find-adam-egg.sh -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/adam-submit -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/pyadam -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/adamR -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/bin/find-adam-assembly.sh -> build/bdist.linux-x86_64/egg/bdgenomics/adam/bin
copying build/lib/bdgenomics/adam/adamContext.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam
copying build/lib/bdgenomics/adam/find_adam_home.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam
copying build/lib/bdgenomics/adam/__init__.py -> build/bdist.linux-x86_64/egg/bdgenomics/adam
copying build/lib/bdgenomics/__init__.py -> build/bdist.linux-x86_64/egg/bdgenomics
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/models.py to models.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/rdd.py to rdd.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/test/coverageDataset_test.py to coverageDataset_test.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/test/adamContext_test.py to adamContext_test.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/test/variantDataset_test.py to variantDataset_test.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/test/genotypeDataset_test.py to genotypeDataset_test.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/test/featureDataset_test.py to featureDataset_test.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/test/alignmentRecordDataset_test.py to alignmentRecordDataset_test.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/test/__init__.py to __init__.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/stringency.py to stringency.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/adamContext.py to adamContext.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/find_adam_home.py to find_adam_home.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/adam/__init__.py to __init__.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/bdgenomics/__init__.py to __init__.cpython-36.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
installing scripts to build/bdist.linux-x86_64/egg/EGG-INFO/scripts
running install_scripts
running build_scripts
creating build/scripts-3.6
copying deps/bin/adam-shell -> build/scripts-3.6
copying deps/bin/find-adam-home -> build/scripts-3.6
copying deps/bin/find-spark.sh -> build/scripts-3.6
copying deps/bin/find-adam-egg.sh -> build/scripts-3.6
copying deps/bin/adam-submit -> build/scripts-3.6
copying deps/bin/pyadam -> build/scripts-3.6
copying deps/bin/adamR -> build/scripts-3.6
copying deps/bin/find-adam-assembly.sh -> build/scripts-3.6
copying and adjusting bdgenomics/adam/find_adam_home.py -> build/scripts-3.6
changing mode of build/scripts-3.6/find_adam_home.py from 664 to 775
creating build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/adam-shell -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/find-adam-home -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/find-spark.sh -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/find-adam-egg.sh -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/adam-submit -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/pyadam -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/find_adam_home.py -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/adamR -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
copying build/scripts-3.6/find-adam-assembly.sh -> build/bdist.linux-x86_64/egg/EGG-INFO/scripts
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/adam-shell to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/find-adam-home to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/find-spark.sh to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/find-adam-egg.sh to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/adam-submit to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/pyadam to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/find_adam_home.py to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/adamR to 775
changing mode of build/bdist.linux-x86_64/egg/EGG-INFO/scripts/find-adam-assembly.sh to 775
copying bdgenomics.adam.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying bdgenomics.adam.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying bdgenomics.adam.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying bdgenomics.adam.egg-info/requires.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying bdgenomics.adam.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
bdgenomics.__pycache__.__init__.cpython-36: module references __path__
bdgenomics.adam.__pycache__.find_adam_home.cpython-36: module references __file__
creating dist
creating 'dist/bdgenomics.adam-0.26.0a0-py3.6.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-python-spark2_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-python-spark2_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-python/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-python-spark2_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:exec (test-python) @ adam-python-spark2_2.11 ---
mkdir -p target
python -m pytest -vv --junitxml target/pytest-reports/tests.xml bdgenomics
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-3.9.1, py-1.8.0, pluggy-0.11.0 -- /home/jenkins/anaconda2/envs/adam-build-af477924-7730-4f36-af57-2d1488172baa/bin/python
cachedir: .pytest_cache
rootdir: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-python, inifile:
collecting ... collected 62 items

bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_alignments PASSED [  1%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_bed PASSED [  3%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_contig_fragments PASSED [  4%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_coverage PASSED [  6%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_genotypes PASSED [  8%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_gtf PASSED [  9%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_indexed_bam PASSED [ 11%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_interval_list PASSED [ 12%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_narrowPeak PASSED [ 14%]
bdgenomics/adam/test/adamContext_test.py::ADAMContextTest::test_load_variants PASSED [ 16%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_broadcast_inner_join PASSED [ 17%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_broadcast_right_outer_join PASSED [ 19%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_caching PASSED [ 20%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_count_kmers PASSED [ 22%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_filterByOverlappingRegion PASSED [ 24%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_filterByOverlappingRegions PASSED [ 25%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_load_indexed_bam PASSED [ 27%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_persisting PASSED [ 29%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_pipe_as_sam PASSED [ 30%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_realignIndels_known_indels PASSED [ 32%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_realignIndels_reads PASSED [ 33%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_save_as_bam PASSED [ 35%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_save_sorted_sam PASSED [ 37%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_save_unordered_sam PASSED [ 38%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_shuffle_full_outer_join PASSED [ 40%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_shuffle_inner_join PASSED [ 41%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_shuffle_inner_join_groupBy_left PASSED [ 43%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_shuffle_left_outer_join PASSED [ 45%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_shuffle_right_outer_join PASSED [ 46%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_shuffle_right_outer_join_groupBy_left PASSED [ 48%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_to_coverage PASSED [ 50%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_to_fragments PASSED [ 51%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_transform PASSED [ 53%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_transmute_to_coverage PASSED [ 54%]
bdgenomics/adam/test/alignmentRecordDataset_test.py::AlignmentRecordDatasetTest::test_union PASSED [ 56%]
bdgenomics/adam/test/coverageDataset_test.py::CoverageDatasetTest::test_aggregatedCoverage PASSED [ 58%]
bdgenomics/adam/test/coverageDataset_test.py::CoverageDatasetTest::test_collapse PASSED [ 59%]
bdgenomics/adam/test/coverageDataset_test.py::CoverageDatasetTest::test_flatten PASSED [ 61%]
bdgenomics/adam/test/coverageDataset_test.py::CoverageDatasetTest::test_save PASSED [ 62%]
bdgenomics/adam/test/coverageDataset_test.py::CoverageDatasetTest::test_toFeatures PASSED [ 64%]
bdgenomics/adam/test/featureDataset_test.py::FeatureDatasetTest::test_round_trip_bed PASSED [ 66%]
bdgenomics/adam/test/featureDataset_test.py::FeatureDatasetTest::test_round_trip_gtf PASSED [ 67%]
bdgenomics/adam/test/featureDataset_test.py::FeatureDatasetTest::test_round_trip_interval_list PASSED [ 69%]
bdgenomics/adam/test/featureDataset_test.py::FeatureDatasetTest::test_round_trip_narrowPeak PASSED [ 70%]
bdgenomics/adam/test/featureDataset_test.py::FeatureDatasetTest::test_transform PASSED [ 72%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_to_variants PASSED [ 74%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_transform PASSED [ 75%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_filter PASSED [ 77%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_format_all_array PASSED [ 79%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_format_alts_array PASSED [ 80%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_format_array PASSED [ 82%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_format_genotype_array PASSED [ 83%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_format_scalar PASSED [ 85%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_info_all_array PASSED [ 87%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_info_alts_array PASSED [ 88%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_info_array PASSED [ 90%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_add_info_scalar PASSED [ 91%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_round_trip PASSED [ 93%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_sort PASSED [ 95%]
bdgenomics/adam/test/genotypeDataset_test.py::GenotypeDatasetTest::test_vcf_sort_lex FAILED [ 96%]
bdgenomics/adam/test/variantDataset_test.py::VariantDatasetTest::test_transform PASSED [ 98%]
bdgenomics/adam/test/variantDataset_test.py::VariantDatasetTest::test_vcf_round_trip PASSED [100%]

=================================== FAILURES ===================================
____________________ GenotypeDatasetTest.test_vcf_sort_lex _____________________

self = <bdgenomics.adam.test.genotypeDataset_test.GenotypeDatasetTest testMethod=test_vcf_sort_lex>

    def test_vcf_sort_lex(self):
    
        testFile = self.resourceFile("random.vcf")
        ac = ADAMContext(self.ss)
    
        genotypes = ac.loadGenotypes(testFile)
    
        tmpPath = self.tmpFile() + ".vcf"
        genotypes.toVariantContexts().sortLexicographically().saveAsVcf(tmpPath,
>                                                                       asSingleFile=True)

bdgenomics/adam/test/genotypeDataset_test.py:240: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
bdgenomics/adam/rdd.py:1650: in saveAsVcf
    _toJava(stringency, self.sc._jvm))
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py:1257: in __call__
    answer, self.gateway_client, self.target_id, self.name)
../spark-2.4.3-bin-hadoop2.7/python/pyspark/sql/utils.py:63: in deco
    return f(*a, **kw)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

answer = 'xro3044'
gateway_client = <py4j.java_gateway.GatewayClient object at 0x7f5a3b54e588>
target_id = 'o3042', name = 'saveAsVcf'

    def get_return_value(answer, gateway_client, target_id=None, name=None):
        """Converts an answer received from the Java gateway into a Python object.
    
        For example, string representation of integers are converted to Python
        integer, string representation of objects are converted to JavaObject
        instances, etc.
    
        :param answer: the string returned by the Java gateway
        :param gateway_client: the gateway client used to communicate with the Java
            Gateway. Only necessary if the answer is a reference (e.g., object,
            list, map)
        :param target_id: the name of the object from which the answer comes from
            (e.g., *object1* in `object1.hello()`). Optional.
        :param name: the name of the member from which the answer comes from
            (e.g., *hello* in `object1.hello()`). Optional.
        """
        if is_error(answer)[0]:
            if len(answer) > 1:
                type = answer[1]
                value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
                if answer[1] == REFERENCE_TYPE:
                    raise Py4JJavaError(
                        "An error occurred while calling {0}{1}{2}.\n".
>                       format(target_id, ".", name), value)
E                   py4j.protocol.Py4JJavaError: An error occurred while calling o3042.saveAsVcf.
E                   : org.apache.spark.SparkException: Job aborted.
E                   	at org.apache.spark.internal.io.SparkHadoopWriter$.write(SparkHadoopWriter.scala:100)
E                   	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1083)
E                   	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1081)
E                   	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1081)
E                   	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
E                   	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
E                   	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
E                   	at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1081)
E                   	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply$mcV$sp(PairRDDFunctions.scala:1000)
E                   	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:991)
E                   	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:991)
E                   	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
E                   	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
E                   	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
E                   	at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:991)
E                   	at org.bdgenomics.adam.rdd.variant.VariantContextDataset$$anonfun$saveAsVcf$1.apply(VariantContextDataset.scala:420)
E                   	at scala.Option.fold(Option.scala:158)
E                   	at org.apache.spark.rdd.Timer.time(Timer.scala:48)
E                   	at org.bdgenomics.adam.rdd.variant.VariantContextDataset.saveAsVcf(VariantContextDataset.scala:351)
E                   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
E                   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
E                   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
E                   	at java.lang.reflect.Method.invoke(Method.java:498)
E                   	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
E                   	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
E                   	at py4j.Gateway.invoke(Gateway.java:282)
E                   	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
E                   	at py4j.commands.CallCommand.execute(CallCommand.java:79)
E                   	at py4j.GatewayConnection.run(GatewayConnection.java:238)
E                   	at java.lang.Thread.run(Thread.java:748)
E                   Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: Connection from /192.168.10.28:43547 closed
E                   	at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
E                   	at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
E                   	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
E                   	at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
E                   	at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
E                   	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
E                   	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
E                   	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
E                   	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
E                   	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
E                   	at java.lang.Thread.run(Thread.java:748)
E                   
E                   Driver stacktrace:
E                   	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
E                   	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
E                   	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
E                   	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
E                   	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
E                   	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
E                   	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
E                   	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
E                   	at scala.Option.foreach(Option.scala:257)
E                   	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
E                   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
E                   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
E                   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
E                   	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
E                   	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
E                   	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
E                   	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
E                   	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2114)
E                   	at org.apache.spark.internal.io.SparkHadoopWriter$.write(SparkHadoopWriter.scala:78)
E                   	... 29 more
E                   Caused by: java.io.IOException: Connection from /192.168.10.28:43547 closed
E                   	at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
E                   	at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
E                   	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
E                   	at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
E                   	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
E                   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
E                   	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
E                   	at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
E                   	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
E                   	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
E                   	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
E                   	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
E                   	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
E                   	... 1 more

/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py:328: Py4JJavaError
----------------------------- Captured stderr call -----------------------------
2019-05-14 12:10:22 WARN  Utils:66 - Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
2019-05-14 12:10:22 WARN  Utils:66 - Service 'SparkUI' could not bind on port 4041. Attempting port 4042.

[Stage 0:>                                                          (0 + 1) / 1]2019-05-14 12:10:23 ERROR TransportRequestHandler:292 - Error sending result StreamResponse{streamId=/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, byteCount=44167556, body=FileSegmentManagedBuffer{file=/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar, offset=0, length=44167556}} to /192.168.10.28:35782; closing connection
io.netty.handler.codec.EncoderException: java.lang.OutOfMemoryError: Java heap space
	at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:106)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
	at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:302)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:801)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:814)
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:794)
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:831)
	at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:1041)
	at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:300)
	at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:288)
	at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:169)
	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:107)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.OutOfMemoryError: Java heap space
	at io.netty.util.internal.PlatformDependent.allocateUninitializedArray(PlatformDependent.java:200)
	at io.netty.buffer.PoolArena$HeapArena.newByteArray(PoolArena.java:676)
	at io.netty.buffer.PoolArena$HeapArena.newChunk(PoolArena.java:686)
	at io.netty.buffer.PoolArena.allocateNormal(PoolArena.java:244)
	at io.netty.buffer.PoolArena.allocate(PoolArena.java:214)
	at io.netty.buffer.PoolArena.allocate(PoolArena.java:146)
	at io.netty.buffer.PooledByteBufAllocator.newHeapBuffer(PooledByteBufAllocator.java:307)
	at io.netty.buffer.AbstractByteBufAllocator.heapBuffer(AbstractByteBufAllocator.java:166)
	at io.netty.buffer.AbstractByteBufAllocator.heapBuffer(AbstractByteBufAllocator.java:157)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:82)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:33)
	at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:88)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
	at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:302)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:801)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:814)
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:794)
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:831)
	at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:1041)
	at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:300)
	at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:288)
	at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:169)
	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:107)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
2019-05-14 12:10:23 ERROR TransportResponseHandler:144 - Still have 1 requests outstanding when connection from /192.168.10.28:43547 is closed
2019-05-14 12:10:23 ERROR Executor:91 - Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: Connection from /192.168.10.28:43547 closed
	at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
	at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
2019-05-14 12:10:23 WARN  TaskSetManager:66 - Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: Connection from /192.168.10.28:43547 closed
	at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
	at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)

2019-05-14 12:10:23 ERROR TaskSetManager:70 - Task 0 in stage 0.0 failed 1 times; aborting job
2019-05-14 12:10:23 ERROR SparkHadoopWriter:91 - Aborting job job_20190514121023_0009.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: Connection from /192.168.10.28:43547 closed
	at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
	at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2114)
	at org.apache.spark.internal.io.SparkHadoopWriter$.write(SparkHadoopWriter.scala:78)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1083)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1081)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1081)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1081)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply$mcV$sp(PairRDDFunctions.scala:1000)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:991)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:991)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
	at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:991)
	at org.bdgenomics.adam.rdd.variant.VariantContextDataset$$anonfun$saveAsVcf$1.apply(VariantContextDataset.scala:420)
	at scala.Option.fold(Option.scala:158)
	at org.apache.spark.rdd.Timer.time(Timer.scala:48)
	at org.bdgenomics.adam.rdd.variant.VariantContextDataset.saveAsVcf(VariantContextDataset.scala:351)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Connection from /192.168.10.28:43547 closed
	at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
	at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	... 1 more
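The root failure above is a dropped driver-to-executor connection (`java.io.IOException: Connection from /192.168.10.28:43547 closed`) during `saveAsVcf`, not a test assertion. One commonly tried mitigation for transient connection-closed errors in local runs (not a confirmed fix for this build) is to lengthen Spark's network timeouts; both keys below are standard Spark configuration settings, but the values are purely illustrative:

```
# spark-defaults.conf sketch — illustrative values only
spark.network.timeout            300s
spark.executor.heartbeatInterval 60s
```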
- generated xml file: /tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-python/target/pytest-reports/tests.xml -
=============================== warnings summary ===============================
/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py:2020: DeprecationWarning: invalid escape sequence \*


/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-python/bdgenomics/adam/test/alignmentRecordDataset_test.py:143: DeprecationWarning: Please use assertEqual instead.
  self.assertEquals(readsAsCoverage.toDF().count(), 5)

/tmp/adamTestyHk1zRx/deleteMePleaseThisIsNoLongerNeeded/adam-python/bdgenomics/adam/test/coverageDataset_test.py:59: DeprecationWarning: Please use assertEqual instead.
  self.assertEquals(features.toDF().count(), coverage.toDF().count())
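The two warnings above come from `unittest`'s legacy alias: `assertEquals` is deprecated in favor of `assertEqual`. A minimal sketch of the fix, using a hypothetical test class (the `count = 5` stand-in replaces the real `toDF().count()` calls from the ADAM tests):

```python
import unittest

class CountTest(unittest.TestCase):
    def test_count(self):
        count = 5  # stands in for readsAsCoverage.toDF().count()
        # Preferred spelling; the deprecated alias would be
        # self.assertEquals(count, 5), which emits DeprecationWarning.
        self.assertEqual(count, 5)

# Run the case programmatically so the example is self-contained.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(CountTest))
```

The rename is mechanical; both names compare with `==`, so no test behavior changes.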

source:2020: DeprecationWarning: invalid escape sequence \*
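The `invalid escape sequence \*` warnings (here and from py4j's `java_gateway.py` above) arise because `\*` is not a recognized string escape, so CPython warns at compile time; a raw string (or a doubled backslash) silences it. A small sketch, with illustrative snippets rather than the actual py4j source:

```python
import warnings

def warns_on_compile(src):
    """Return True if compiling `src` emits any warning."""
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        compile(src, "<snippet>", "exec")
    return len(caught) > 0

# Inner source text is: doc = "escape with \*"  -> warns
plain = 'doc = "escape with \\*"'
# Inner source text is: doc = r"escape with \*" -> raw string, no warning
raw = 'doc = r"escape with \\*"'
```

On Python 3.12+ the same condition is reported as `SyntaxWarning` rather than `DeprecationWarning`; the raw-string fix applies either way.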

-- Docs: https://docs.pytest.org/en/latest/warnings.html
=============== 1 failed, 61 passed, 5 warnings in 90.84 seconds ===============
Makefile:83: recipe for target 'test' failed
make: *** [test] Error 1
[ERROR] Command execution failed.
org.apache.commons.exec.ExecuteException: Process exited with an error: 2 (Exit value: 2)
	at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
	at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
	at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:764)
	at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:711)
	at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:289)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
	at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.11 .......................................... SUCCESS [  9.588 s]
[INFO] ADAM_2.11: Shader workaround ....................... SUCCESS [  4.042 s]
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils ........... SUCCESS [  3.579 s]
[INFO] ADAM_2.11: Core .................................... SUCCESS [01:11 min]
[INFO] ADAM_2.11: APIs for Java, Python ................... SUCCESS [  7.352 s]
[INFO] ADAM_2.11: CLI ..................................... SUCCESS [  9.622 s]
[INFO] ADAM_2.11: Assembly ................................ SUCCESS [ 16.204 s]
[INFO] ADAM_2.11: Python APIs ............................. FAILURE [01:34 min]
[INFO] ADAM_2.11: R APIs .................................. SKIPPED
[INFO] ADAM_2.11: Distribution ............................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:37 min
[INFO] Finished at: 2019-05-14T12:10:27-07:00
[INFO] Final Memory: 91M/1467M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.5.0:exec (test-python) on project adam-python-spark2_2.11: Command execution failed. Process exited with an error: 2 (Exit value: 2) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :adam-python-spark2_2.11
Build step 'Execute shell' marked build as failure
Recording test results
Publishing Scoverage XML and HTML report...
Setting commit status on GitHub for https://github.com/bigdatagenomics/adam/commit/3718c8d21eb71b332d4cb15a65a62054d27f210f
Finished: FAILURE