
Skipping 1,925 KB of earlier log output..
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-shade-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- plexus-component-metadata:1.5.5:generate-metadata (default) @ adam-shade-spark2_2.12 ---
[INFO] Discovered 1 component descriptor(s)
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-shade-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-shade/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-shade-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-shade-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-shade-spark2_2.12 ---
[INFO] No tests to run.
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:jar (default-jar) @ adam-shade-spark2_2.12 ---
[INFO] Building jar: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-shade/target/adam-shade-spark2_2.12-0.33.0-SNAPSHOT.jar
[INFO] 
[INFO] ------------< org.bdgenomics.adam:adam-codegen-spark2_2.12 >------------
[INFO] Building ADAM_2.12: Avro-to-Dataset codegen utils 0.33.0-SNAPSHOT  [3/8]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-codegen-spark2_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-codegen-spark2_2.12 ---
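The two enforcer executions above gate the build on environment checks before any sources are touched. A minimal sketch of what such an execution looks like in a pom.xml, with an illustrative rule (the version range here is an assumption, not ADAM's actual setting):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <version>1.0</version>
      <executions>
        <execution>
          <id>enforce-maven</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <!-- Illustrative rule: fail fast on an old Maven -->
              <requireMavenVersion>
                <version>[3.1.1,)</version>
              </requireMavenVersion>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>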
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-codegen-spark2_2.12 ---
[INFO] Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-codegen/src/main/scala added.
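build-helper's add-source goal is what lets Maven compile Scala at all: it registers src/main/scala as an extra compile root alongside the default src/main/java. A hedged sketch of the execution behind the line above (the phase and path are the conventional choices):

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <version>3.0.0</version>
      <executions>
        <execution>
          <id>add-source</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>add-source</goal>
          </goals>
          <configuration>
            <sources>
              <!-- Registered as an additional compile root -->
              <source>src/main/scala</source>
            </sources>
          </configuration>
        </execution>
      </executions>
    </plugin>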
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-codegen-spark2_2.12 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-codegen-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-codegen/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-codegen-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-codegen-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-codegen-spark2_2.12 ---
[INFO] Test Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-codegen/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-codegen-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-codegen/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-codegen-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-codegen-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-codegen-spark2_2.12 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ adam-codegen-spark2_2.12 ---
Discovery starting.
Discovery completed in 67 milliseconds.
Run starting. Expected test count is: 0
Run completed in 73 milliseconds.
Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
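Note the division of labor: surefire prints "Tests are skipped." while the scalatest-maven-plugin still performs discovery and finds an expected test count of 0 (no test classes were compiled in this run). That pattern usually comes from handing all testing to ScalaTest, roughly as sketched below; this is an assumption about the wiring, not ADAM's verbatim pom:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <configuration>
        <!-- Disable JUnit-style runs; ScalaTest owns the test phase -->
        <skipTests>true</skipTests>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest-maven-plugin</artifactId>
      <version>2.0.0</version>
      <executions>
        <execution>
          <id>test</id>
          <goals>
            <goal>test</goal>
          </goals>
        </execution>
      </executions>
    </plugin>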
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:jar (default-jar) @ adam-codegen-spark2_2.12 ---
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-core-spark2_2.12 >--------------
[INFO] Building ADAM_2.12: Core 0.33.0-SNAPSHOT                           [4/8]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-core-spark2_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-core-spark2_2.12 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-core-spark2_2.12 ---
[INFO] Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala added.
[INFO] Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/generated-sources/src/main/scala added.
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:java (generate-scala-products) @ adam-core-spark2_2.12 ---
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:java (generate-scala-projection-fields) @ adam-core-spark2_2.12 ---
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
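Both code-generation runs emit the SLF4J StaticLoggerBinder notice: the generator classes log through slf4j-api, but no binding is on the exec-maven-plugin classpath, so SLF4J drops to its no-op implementation and the messages are discarded (generation itself still succeeds). If that output were wanted, one hedged option is to add a binding to the plugin's own classpath; slf4j-simple is an illustrative choice:

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.5.0</version>
      <configuration>
        <!-- Make plugin-level dependencies visible to exec:java -->
        <includePluginDependencies>true</includePluginDependencies>
      </configuration>
      <dependencies>
        <!-- Illustrative binding so generator logging becomes visible -->
        <dependency>
          <groupId>org.slf4j</groupId>
          <artifactId>slf4j-simple</artifactId>
          <version>1.7.30</version>
        </dependency>
      </dependencies>
    </plugin>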
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core-spark2_2.12 ---
[INFO] Modified 2 of 204 .scala files
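Scalariform reformatted 2 of the 204 Scala sources in place here, which is why the format goal is ordered before scala-compile-first. A minimal sketch of binding it into the build (the preference shown is illustrative; ADAM's actual formatting rules may differ):

    <plugin>
      <groupId>org.scalariform</groupId>
      <artifactId>scalariform-maven-plugin</artifactId>
      <version>0.1.4</version>
      <executions>
        <execution>
          <goals>
            <goal>format</goal>
          </goals>
          <configuration>
            <!-- Illustrative formatting preference -->
            <alignParameters>true</alignParameters>
          </configuration>
        </execution>
      </executions>
    </plugin>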
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-core-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-core-spark2_2.12 ---
[INFO] /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/java:-1: info: compiling
[INFO] /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/main/scala:-1: info: compiling
[INFO] /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/generated-sources/src/main/scala:-1: info: compiling
[INFO] Compiling 140 source files to /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/2.12.8/classes at 1599679548985
[WARNING] warning: there were 21 deprecation warnings
[WARNING] warning: there were 31 deprecation warnings (since 0.21.0)
[WARNING] warning: there was one deprecation warning (since 1.0.6)
[WARNING] warning: there was one deprecation warning (since 2.11.0)
[WARNING] warning: there were 175 deprecation warnings (since 2.12.0)
[WARNING] warning: there was one deprecation warning (since 2.12.7)
[WARNING] warning: there were 230 deprecation warnings in total; re-run with -deprecation for details
[WARNING] warning: there were 5 feature warnings; re-run with -feature for details
[WARNING] 8 warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 26 s
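scalac is summarizing above: 230 deprecation warnings and 5 feature warnings were suppressed, and it suggests re-running with -deprecation / -feature for the details. With scala-maven-plugin those flags are passed through the <args> block; a minimal sketch (standard scalac options, not necessarily ADAM's configured set):

    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <configuration>
        <args>
          <!-- Expand per-warning details instead of one-line summaries -->
          <arg>-deprecation</arg>
          <arg>-feature</arg>
        </args>
      </configuration>
    </plugin>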
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-core-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-core-spark2_2.12 ---
[INFO] Test Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-core-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 152 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-core-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-core-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-core-spark2_2.12 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ adam-core-spark2_2.12 ---
Discovery starting.
Discovery completed in 631 milliseconds.
Run starting. Expected test count is: 0
Run completed in 639 milliseconds.
Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:jar (default-jar) @ adam-core-spark2_2.12 ---
[INFO] Building jar: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-core/target/adam-core-spark2_2.12-0.33.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:test-jar (default) @ adam-core-spark2_2.12 ---
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-apis-spark2_2.12 >--------------
[INFO] Building ADAM_2.12: APIs for Java, Python 0.33.0-SNAPSHOT          [5/8]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis-spark2_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis-spark2_2.12 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-apis-spark2_2.12 ---
[INFO] Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-apis/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis-spark2_2.12 ---
[INFO] Modified 0 of 5 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-apis-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-apis/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-apis-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-apis-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-apis-spark2_2.12 ---
[INFO] Test Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-apis/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-apis-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-apis-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-apis-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-apis-spark2_2.12 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ adam-apis-spark2_2.12 ---
Discovery starting.
Discovery completed in 154 milliseconds.
Run starting. Expected test count is: 0
Run completed in 160 milliseconds.
Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:jar (default-jar) @ adam-apis-spark2_2.12 ---
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:test-jar (default) @ adam-apis-spark2_2.12 ---
[INFO] 
[INFO] --------------< org.bdgenomics.adam:adam-cli-spark2_2.12 >--------------
[INFO] Building ADAM_2.12: CLI 0.33.0-SNAPSHOT                            [6/8]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli-spark2_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli-spark2_2.12 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:timestamp-property (timestamp-property) @ adam-cli-spark2_2.12 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ adam-cli-spark2_2.12 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ adam-cli-spark2_2.12 ---
[INFO] Copying files with filtering to temporary directory.
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] No files need to be copied to output directory. Up to date: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/generated-sources/java-templates
[INFO] Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-cli/target/generated-sources/java-templates added.
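This filter-sources execution runs Maven resource filtering over Java template sources, substituting build properties such as the timestamp and git commit id captured by the two plugins just above, then registers the filtered output as a source root. A hedged sketch of the execution; src/main/java-templates is the plugin's default input directory:

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>templating-maven-plugin</artifactId>
      <version>1.0.0</version>
      <executions>
        <execution>
          <id>filter-src</id>
          <goals>
            <!-- Reads src/main/java-templates, writes
                 target/generated-sources/java-templates -->
            <goal>filter-sources</goal>
          </goals>
        </execution>
      </executions>
    </plugin>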
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-cli-spark2_2.12 ---
[INFO] Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-cli/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli-spark2_2.12 ---
[INFO] Modified 0 of 29 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-cli-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-cli/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-cli-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-cli-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-cli-spark2_2.12 ---
[INFO] Test Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-cli/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-cli-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 15 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-cli-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-cli-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-cli-spark2_2.12 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ adam-cli-spark2_2.12 ---
Discovery starting.
Discovery completed in 138 milliseconds.
Run starting. Expected test count is: 0
Run completed in 146 milliseconds.
Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:jar (default-jar) @ adam-cli-spark2_2.12 ---
[INFO] 
[INFO] -----------< org.bdgenomics.adam:adam-assembly-spark2_2.12 >------------
[INFO] Building ADAM_2.12: Assembly 0.33.0-SNAPSHOT                       [7/8]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-assembly-spark2_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-assembly-spark2_2.12 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ adam-assembly-spark2_2.12 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0.0:filter-sources (filter-src) @ adam-assembly-spark2_2.12 ---
[INFO] Request to add '/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/java-templates' folder. Not added since it does not exist.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-source (add-source) @ adam-assembly-spark2_2.12 ---
[INFO] Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-assembly-spark2_2.12 ---
[INFO] Modified 0 of 1 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-assembly-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-assembly-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-assembly-spark2_2.12 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-test-source (add-test-source) @ adam-assembly-spark2_2.12 ---
[INFO] Test Source directory: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-assembly-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-assembly-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-assembly-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-assembly-spark2_2.12 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:jar (default-jar) @ adam-assembly-spark2_2.12 ---
[INFO] 
[INFO] --- maven-shade-plugin:3.2.0:shade (default) @ adam-assembly-spark2_2.12 ---
[INFO] Including org.bdgenomics.adam:adam-cli-spark2_2.12:jar:0.33.0-SNAPSHOT in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-misc-spark2_2.12:jar:0.3.0 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-io-spark2_2.12:jar:0.3.0 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient:jar:4.5.7 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpcore:jar:4.4.11 in the shaded jar.
[INFO] Including commons-logging:commons-logging:jar:1.2 in the shaded jar.
[INFO] Including commons-codec:commons-codec:jar:1.11 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-cli-spark2_2.12:jar:0.3.0 in the shaded jar.
[INFO] Including org.clapper:grizzled-slf4j_2.12:jar:1.3.4 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.30 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-avro:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-column:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-common:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-encoding:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-hadoop:jar:1.10.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-jackson:jar:1.10.1 in the shaded jar.
[INFO] Including commons-pool:commons-pool:jar:1.6 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-format:jar:2.4.0 in the shaded jar.
[INFO] Including org.bdgenomics.bdg-formats:bdg-formats:jar:0.15.0 in the shaded jar.
[INFO] Including org.apache.avro:avro:jar:1.8.2 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-core-asl:jar:1.9.13 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13 in the shaded jar.
[INFO] Including com.thoughtworks.paranamer:paranamer:jar:2.8 in the shaded jar.
[INFO] Including org.xerial.snappy:snappy-java:jar:1.1.1.3 in the shaded jar.
[INFO] Including org.apache.commons:commons-compress:jar:1.8.1 in the shaded jar.
[INFO] Including org.tukaani:xz:jar:1.5 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-core-spark2_2.12:jar:0.33.0-SNAPSHOT in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-intervalrdd-spark2_2.12:jar:0.3.0 in the shaded jar.
[INFO] Including com.esotericsoftware.kryo:kryo:jar:2.24.0 in the shaded jar.
[INFO] Including com.esotericsoftware.minlog:minlog:jar:1.2 in the shaded jar.
[INFO] Including org.objenesis:objenesis:jar:2.1 in the shaded jar.
[INFO] Including commons-io:commons-io:jar:2.6 in the shaded jar.
[INFO] Including it.unimi.dsi:fastutil:jar:6.6.5 in the shaded jar.
[INFO] Including org.seqdoop:hadoop-bam:jar:7.9.2 in the shaded jar.
[INFO] Including com.github.jsr203hadoop:jsr203hadoop:jar:1.0.3 in the shaded jar.
[INFO] Including com.github.samtools:htsjdk:jar:2.19.0 in the shaded jar.
[INFO] Including org.apache.commons:commons-jexl:jar:2.1.1 in the shaded jar.
[INFO] Including gov.nih.nlm.ncbi:ngs-java:jar:2.9.0 in the shaded jar.
[INFO] Including com.google.guava:guava:jar:27.0-jre in the shaded jar.
[INFO] Including com.google.guava:failureaccess:jar:1.0 in the shaded jar.
[INFO] Including com.google.guava:listenablefuture:jar:9999.0-empty-to-avoid-conflict-with-guava in the shaded jar.
[INFO] Including org.checkerframework:checker-qual:jar:2.5.2 in the shaded jar.
[INFO] Including com.google.errorprone:error_prone_annotations:jar:2.2.0 in the shaded jar.
[INFO] Including com.google.j2objc:j2objc-annotations:jar:1.1 in the shaded jar.
[INFO] Including org.codehaus.mojo:animal-sniffer-annotations:jar:1.17 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-codegen-spark2_2.12:jar:0.33.0-SNAPSHOT in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-apis-spark2_2.12:jar:0.33.0-SNAPSHOT in the shaded jar.
[INFO] Including args4j:args4j:jar:2.33 in the shaded jar.
[INFO] Including net.codingwell:scala-guice_2.12:jar:4.2.1 in the shaded jar.
[INFO] Including com.google.inject:guice:jar:4.2.0 in the shaded jar.
[INFO] Including javax.inject:javax.inject:jar:1 in the shaded jar.
[INFO] Including aopalliance:aopalliance:jar:1.0 in the shaded jar.
[INFO] Including org.scala-lang:scala-reflect:jar:2.12.6 in the shaded jar.
[INFO] Including com.google.code.findbugs:jsr305:jar:1.3.9 in the shaded jar.
[WARNING] WORKAROUND:  refusing to add class org/apache/parquet/avro/AvroSchemaConverter$2.class from jar /home/jenkins/.m2/repository/org/apache/parquet/parquet-avro/1.10.1/parquet-avro-1.10.1.jar
[WARNING] WORKAROUND:  refusing to add class org/apache/parquet/avro/AvroSchemaConverter.class from jar /home/jenkins/.m2/repository/org/apache/parquet/parquet-avro/1.10.1/parquet-avro-1.10.1.jar
[WARNING] WORKAROUND:  refusing to add class org/apache/parquet/avro/AvroSchemaConverter$1.class from jar /home/jenkins/.m2/repository/org/apache/parquet/parquet-avro/1.10.1/parquet-avro-1.10.1.jar
[WARNING] guice-4.2.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 573 overlapping classes: 
[WARNING]   - com.google.inject.Scope
[WARNING]   - com.google.inject.Binding
[WARNING]   - com.google.inject.internal.cglib.core.$EmitUtils$3
[WARNING]   - com.google.inject.spi.TypeConverter
[WARNING]   - com.google.inject.internal.ConstructionProxy
[WARNING]   - com.google.inject.spi.InjectionPoint
[WARNING]   - com.google.inject.internal.cglib.proxy.$FixedValueGenerator
[WARNING]   - com.google.inject.spi.StaticInjectionRequest
[WARNING]   - com.google.inject.internal.cglib.proxy.$DispatcherGenerator
[WARNING]   - com.google.inject.spi.Elements$ElementsAsModule
[WARNING]   - 563 more...
[WARNING] aopalliance-1.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 9 overlapping classes: 
[WARNING]   - org.aopalliance.intercept.ConstructorInterceptor
[WARNING]   - org.aopalliance.intercept.MethodInvocation
[WARNING]   - org.aopalliance.intercept.MethodInterceptor
[WARNING]   - org.aopalliance.intercept.Invocation
[WARNING]   - org.aopalliance.aop.AspectException
[WARNING]   - org.aopalliance.intercept.Interceptor
[WARNING]   - org.aopalliance.intercept.Joinpoint
[WARNING]   - org.aopalliance.aop.Advice
[WARNING]   - org.aopalliance.intercept.ConstructorInvocation
[WARNING] htsjdk-2.19.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 993 overlapping classes: 
[WARNING]   - htsjdk.samtools.cram.ref.ReferenceSource
[WARNING]   - htsjdk.samtools.cram.compression.ExternalCompressor$3
[WARNING]   - htsjdk.samtools.HighAccuracyDownsamplingIterator
[WARNING]   - htsjdk.samtools.util.zip.DeflaterFactory
[WARNING]   - htsjdk.samtools.filter.DuplicateReadFilter
[WARNING]   - htsjdk.samtools.cram.encoding.core.huffmanUtils.HuffmanCode$1
[WARNING]   - htsjdk.samtools.cram.encoding.core.SubexponentialIntegerEncoding
[WARNING]   - htsjdk.variant.vcf.VCFEncoder
[WARNING]   - htsjdk.samtools.util.CloserUtil
[WARNING]   - htsjdk.tribble.TribbleException$FeatureFileDoesntExist
[WARNING]   - 983 more...
[WARNING] utils-intervalrdd-spark2_2.12-0.3.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 15 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.interval.array.IntervalArray$
[WARNING]   - org.bdgenomics.utils.interval.rdd.IntervalPartition$
[WARNING]   - org.bdgenomics.utils.interval.array.ConcreteIntervalArray$
[WARNING]   - org.bdgenomics.utils.interval.array.Interval
[WARNING]   - org.bdgenomics.utils.interval.array.IntervalArray
[WARNING]   - org.bdgenomics.utils.interval.rdd.IntervalRDD$
[WARNING]   - org.bdgenomics.utils.interval.rdd.intervalrdd.package$
[WARNING]   - org.bdgenomics.utils.interval.rdd.IntervalPartition
[WARNING]   - org.bdgenomics.utils.interval.rdd.PartitionMerger
[WARNING]   - org.bdgenomics.utils.interval.rdd.intervalrdd.package
[WARNING]   - 5 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, fastutil-6.6.5.jar define 10700 overlapping classes: 
[WARNING]   - it.unimi.dsi.fastutil.doubles.Double2IntRBTreeMap$Submap$KeySet
[WARNING]   - it.unimi.dsi.fastutil.longs.Long2CharAVLTreeMap$2$1
[WARNING]   - it.unimi.dsi.fastutil.bytes.Byte2ObjectLinkedOpenHashMap$EntryIterator
[WARNING]   - it.unimi.dsi.fastutil.ints.Int2ReferenceRBTreeMap$Submap
[WARNING]   - it.unimi.dsi.fastutil.shorts.Short2FloatOpenCustomHashMap$KeySet
[WARNING]   - it.unimi.dsi.fastutil.bytes.Byte2BooleanRBTreeMap$Submap$1
[WARNING]   - it.unimi.dsi.fastutil.floats.AbstractFloat2ShortSortedMap$ValuesCollection
[WARNING]   - it.unimi.dsi.fastutil.longs.Long2ReferenceRBTreeMap$Submap$2
[WARNING]   - it.unimi.dsi.fastutil.chars.AbstractChar2LongSortedMap$ValuesIterator
[WARNING]   - it.unimi.dsi.fastutil.chars.AbstractChar2ObjectMap$1$1
[WARNING]   - 10690 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, commons-io-2.6.jar define 127 overlapping classes: 
[WARNING]   - org.apache.commons.io.FileCleaningTracker
[WARNING]   - org.apache.commons.io.comparator.SizeFileComparator
[WARNING]   - org.apache.commons.io.input.CloseShieldInputStream
[WARNING]   - org.apache.commons.io.ByteOrderParser
[WARNING]   - org.apache.commons.io.filefilter.EmptyFileFilter
[WARNING]   - org.apache.commons.io.monitor.FileEntry
[WARNING]   - org.apache.commons.io.output.ThresholdingOutputStream
[WARNING]   - org.apache.commons.io.input.TailerListener
[WARNING]   - org.apache.commons.io.IOExceptionWithCause
[WARNING]   - org.apache.commons.io.comparator.PathFileComparator
[WARNING]   - 117 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, httpclient-4.5.7.jar define 467 overlapping classes: 
[WARNING]   - org.apache.http.impl.cookie.RFC2109Spec
[WARNING]   - org.apache.http.impl.execchain.MainClientExec
[WARNING]   - org.apache.http.conn.routing.RouteInfo$TunnelType
[WARNING]   - org.apache.http.client.methods.HttpGet
[WARNING]   - org.apache.http.impl.cookie.BrowserCompatSpecFactory
[WARNING]   - org.apache.http.impl.client.HttpAuthenticator
[WARNING]   - org.apache.http.conn.ManagedClientConnection
[WARNING]   - org.apache.http.client.protocol.RequestAuthCache
[WARNING]   - org.apache.http.conn.params.ConnConnectionParamBean
[WARNING]   - org.apache.http.impl.client.IdleConnectionEvictor
[WARNING]   - 457 more...
[WARNING] guava-27.0-jre.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 1955 overlapping classes: 
[WARNING]   - com.google.common.collect.CompactHashMap$Itr
[WARNING]   - com.google.common.collect.ImmutableMapValues$1
[WARNING]   - com.google.common.io.LineProcessor
[WARNING]   - com.google.common.util.concurrent.AbstractService$5
[WARNING]   - com.google.common.io.BaseEncoding$StandardBaseEncoding$2
[WARNING]   - com.google.common.io.ByteProcessor
[WARNING]   - com.google.common.math.package-info
[WARNING]   - com.google.common.util.concurrent.SimpleTimeLimiter
[WARNING]   - com.google.common.cache.AbstractCache$StatsCounter
[WARNING]   - com.google.common.util.concurrent.CycleDetectingLockFactory$Policies
[WARNING]   - 1945 more...
[WARNING] ngs-java-2.9.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 73 overlapping classes: 
[WARNING]   - ngs.itf.ReadItf
[WARNING]   - gov.nih.nlm.ncbi.ngs.error.cause.JvmErrorCause
[WARNING]   - ngs.ReferenceIterator
[WARNING]   - gov.nih.nlm.ncbi.ngs.Manager$1
[WARNING]   - gov.nih.nlm.ncbi.ngs.LMProperties
[WARNING]   - gov.nih.nlm.ncbi.ngs.error.LibraryLoadError
[WARNING]   - gov.nih.nlm.ncbi.ngs.LibDependencies
[WARNING]   - gov.nih.nlm.ncbi.ngs.error.cause.ConnectionProblemCause
[WARNING]   - ngs.itf.PileupEventItf
[WARNING]   - gov.nih.nlm.ncbi.ngs.LibManager$Location
[WARNING]   - 63 more...
[WARNING] javax.inject-1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 6 overlapping classes: 
[WARNING]   - javax.inject.Inject
[WARNING]   - javax.inject.Singleton
[WARNING]   - javax.inject.Scope
[WARNING]   - javax.inject.Named
[WARNING]   - javax.inject.Provider
[WARNING]   - javax.inject.Qualifier
[WARNING] utils-cli-spark2_2.12-0.3.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 13 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.cli.Args4j
[WARNING]   - org.bdgenomics.utils.cli.ParquetArgs
[WARNING]   - org.bdgenomics.utils.cli.SaveArgs
[WARNING]   - org.bdgenomics.utils.cli.Args4jBase
[WARNING]   - org.bdgenomics.utils.cli.ParquetSaveArgs
[WARNING]   - org.bdgenomics.utils.cli.ParquetLoadSaveArgs
[WARNING]   - org.bdgenomics.utils.cli.BDGCommandCompanion
[WARNING]   - org.bdgenomics.utils.cli.ParquetRDDArgs
[WARNING]   - org.bdgenomics.utils.cli.BDGSparkCommand
[WARNING]   - org.bdgenomics.utils.cli.SaveFileArgs
[WARNING]   - 3 more...
[WARNING] jackson-mapper-asl-1.9.13.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 502 overlapping classes: 
[WARNING]   - org.codehaus.jackson.map.ext.DOMSerializer
[WARNING]   - org.codehaus.jackson.node.POJONode
[WARNING]   - org.codehaus.jackson.map.deser.std.JsonNodeDeserializer$ArrayDeserializer
[WARNING]   - org.codehaus.jackson.map.ser.StdSerializers$UtilDateSerializer
[WARNING]   - org.codehaus.jackson.map.ext.JodaDeserializers$LocalDateDeserializer
[WARNING]   - org.codehaus.jackson.map.deser.std.PrimitiveArrayDeserializers$StringDeser
[WARNING]   - org.codehaus.jackson.map.util.Comparators$1
[WARNING]   - org.codehaus.jackson.map.util.StdDateFormat
[WARNING]   - org.codehaus.jackson.map.KeyDeserializer
[WARNING]   - org.codehaus.jackson.map.MapperConfig$Impl
[WARNING]   - 492 more...
[WARNING] parquet-common-1.10.1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 83 overlapping classes: 
[WARNING]   - org.apache.parquet.SemanticVersion$SemanticVersionParseException
[WARNING]   - org.apache.parquet.Ints
[WARNING]   - org.apache.parquet.bytes.SingleBufferInputStream
[WARNING]   - org.apache.parquet.Version
[WARNING]   - org.apache.parquet.util.DynMethods$Builder
[WARNING]   - org.apache.parquet.SemanticVersion$NumberOrString
[WARNING]   - org.apache.parquet.glob.GlobNode$Atom
[WARNING]   - org.apache.parquet.bytes.BytesInput$EmptyBytesInput
[WARNING]   - org.apache.parquet.Exceptions
[WARNING]   - org.apache.parquet.bytes.MultiBufferInputStream$ConcatIterator
[WARNING]   - 73 more...
[WARNING] checker-qual-2.5.2.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 302 overlapping classes: 
[WARNING]   - org.checkerframework.checker.formatter.FormatUtil
[WARNING]   - org.checkerframework.checker.units.qual.PolyUnit
[WARNING]   - org.checkerframework.checker.units.qual.MixedUnits
[WARNING]   - org.checkerframework.checker.regex.qual.PolyRegex
[WARNING]   - org.checkerframework.checker.units.qual.C
[WARNING]   - org.checkerframework.checker.formatter.FormatUtil$IllegalFormatConversionCategoryException
[WARNING]   - org.checkerframework.framework.qual.Unqualified
[WARNING]   - org.checkerframework.common.reflection.qual.UnknownMethod
[WARNING]   - org.checkerframework.framework.qual.EnsuresQualifierIf
[WARNING]   - org.checkerframework.checker.signedness.SignednessUtil
[WARNING]   - 292 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, scala-guice_2.12-4.2.1.jar define 77 overlapping classes: 
[WARNING]   - net.codingwell.scalaguice.ScalaModule$
[WARNING]   - net.codingwell.scalaguice.ScalaModule$ScalaLinkedBindingBuilder$$anon$3$$typecreator5$1
[WARNING]   - net.codingwell.scalaguice.ScalaModule$ScalaScopedBindingBuilder
[WARNING]   - net.codingwell.scalaguice.binder.ScopedBindingBuilderProxy
[WARNING]   - net.codingwell.scalaguice.InjectorExtensions$ScalaInjector$$typecreator2$1
[WARNING]   - net.codingwell.scalaguice.ScalaPrivateModule$ElementBuilder
[WARNING]   - net.codingwell.scalaguice.ScalaModule$ScalaAnnotatedBindingBuilder
[WARNING]   - net.codingwell.scalaguice.InternalModule
[WARNING]   - net.codingwell.scalaguice.TypeConversions$ArrayType$
[WARNING]   - net.codingwell.scalaguice.ScalaOptionBinder
[WARNING]   - 67 more...
[WARNING] parquet-jackson-1.10.1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 623 overlapping classes: 
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.InjectableValues$Std
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.introspect.POJOPropertyBuilder$Node
[WARNING]   - shaded.parquet.org.codehaus.jackson.util.TokenBuffer$1
[WARNING]   - shaded.parquet.org.codehaus.jackson.type.TypeReference
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.deser.std.FromStringDeserializer$LocaleDeserializer
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.ser.BasicSerializerFactory
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.deser.StdKeyDeserializer
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.deser.std.StdKeyDeserializer$EnumKD
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.jsontype.impl.StdTypeResolverBuilder
[WARNING]   - shaded.parquet.org.codehaus.jackson.map.ser.std.InetAddressSerializer
[WARNING]   - 613 more...
[WARNING] parquet-format-2.4.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 461 overlapping classes: 
[WARNING]   - shaded.parquet.org.apache.thrift.transport.TServerSocket
[WARNING]   - shaded.parquet.org.apache.thrift.transport.TSimpleFileTransport
[WARNING]   - shaded.parquet.org.apache.thrift.transport.TFileTransport$TruncableBufferedInputStream
[WARNING]   - shaded.parquet.org.apache.thrift.TFieldIdEnum
[WARNING]   - org.apache.parquet.format.SchemaElement$SchemaElementStandardScheme
[WARNING]   - shaded.parquet.org.apache.thrift.server.AbstractNonblockingServer$AbstractSelectThread
[WARNING]   - shaded.parquet.org.apache.thrift.TEnumHelper
[WARNING]   - org.apache.parquet.format.JsonType$JsonTypeStandardScheme
[WARNING]   - org.apache.parquet.format.DictionaryPageHeader$DictionaryPageHeaderTupleScheme
[WARNING]   - org.apache.parquet.format.UUIDType$1
[WARNING]   - 451 more...
[WARNING] guava-27.0-jre.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, failureaccess-1.0.jar define 2 overlapping classes: 
[WARNING]   - com.google.common.util.concurrent.internal.InternalFutureFailureAccess
[WARNING]   - com.google.common.util.concurrent.internal.InternalFutures
[WARNING] adam-apis-spark2_2.12-0.33.0-SNAPSHOT.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 195 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.api.java.FeaturesToFragmentsConverter
[WARNING]   - org.bdgenomics.adam.api.java.FeaturesToVariantsDatasetConverter
[WARNING]   - org.bdgenomics.adam.api.java.AlignmentsToFeaturesConverter
[WARNING]   - org.bdgenomics.adam.api.java.FeaturesToReadsDatasetConverter
[WARNING]   - org.bdgenomics.adam.api.java.VariantsToVariantsConverter
[WARNING]   - org.bdgenomics.adam.api.java.CoverageToReadsDatasetConverter
[WARNING]   - org.bdgenomics.adam.api.java.SequencesToGenotypesConverter
[WARNING]   - org.bdgenomics.adam.api.java.ReadsToSlicesDatasetConverter
[WARNING]   - org.bdgenomics.adam.api.java.SequencesToFeaturesConverter
[WARNING]   - org.bdgenomics.adam.api.java.GenotypesToSlicesConverter
[WARNING]   - 185 more...
[WARNING] commons-codec-1.11.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 96 overlapping classes: 
[WARNING]   - org.apache.commons.codec.language.Nysiis
[WARNING]   - org.apache.commons.codec.language.bm.Rule$1
[WARNING]   - org.apache.commons.codec.language.bm.Rule$RPattern
[WARNING]   - org.apache.commons.codec.language.ColognePhonetic$CologneInputBuffer
[WARNING]   - org.apache.commons.codec.digest.HmacUtils
[WARNING]   - org.apache.commons.codec.language.bm.BeiderMorseEncoder
[WARNING]   - org.apache.commons.codec.digest.UnixCrypt
[WARNING]   - org.apache.commons.codec.language.Soundex
[WARNING]   - org.apache.commons.codec.cli.Digest
[WARNING]   - org.apache.commons.codec.binary.BinaryCodec
[WARNING]   - 86 more...
[WARNING] jsr305-1.3.9.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 35 overlapping classes: 
[WARNING]   - javax.annotation.RegEx
[WARNING]   - javax.annotation.concurrent.Immutable
[WARNING]   - javax.annotation.meta.TypeQualifierDefault
[WARNING]   - javax.annotation.meta.TypeQualifier
[WARNING]   - javax.annotation.Syntax
[WARNING]   - javax.annotation.Nonnull
[WARNING]   - javax.annotation.CheckReturnValue
[WARNING]   - javax.annotation.CheckForNull
[WARNING]   - javax.annotation.meta.TypeQualifierNickname
[WARNING]   - javax.annotation.MatchesPattern
[WARNING]   - 25 more...
[WARNING] scala-reflect-2.12.6.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 1372 overlapping classes: 
[WARNING]   - scala.reflect.runtime.ReflectionUtils
[WARNING]   - scala.reflect.internal.tpe.TypeMaps$CollectTypeCollector
[WARNING]   - scala.reflect.internal.Scopes$LookupInaccessible$
[WARNING]   - scala.reflect.internal.Types$LazyType
[WARNING]   - scala.reflect.internal.SymbolPairs$Cursor
[WARNING]   - scala.reflect.internal.Types$StaticallyAnnotatedType$
[WARNING]   - scala.reflect.internal.tpe.TypeMaps
[WARNING]   - scala.reflect.api.TypeTags$TypeTagImpl
[WARNING]   - scala.reflect.api.StandardLiftables$StandardUnliftableInstances$$anonfun$unliftTuple18$1
[WARNING]   - scala.reflect.runtime.SynchronizedSymbols$SynchronizedClassSymbol
[WARNING]   - 1362 more...
[WARNING] hadoop-bam-7.9.2.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 115 overlapping classes: 
[WARNING]   - org.seqdoop.hadoop_bam.BAMSplitGuesser
[WARNING]   - org.seqdoop.hadoop_bam.util.SAMHeaderReader
[WARNING]   - org.seqdoop.hadoop_bam.QseqInputFormat
[WARNING]   - org.seqdoop.hadoop_bam.KeyIgnoringBCFRecordWriter
[WARNING]   - org.seqdoop.hadoop_bam.FastaInputFormat$1
[WARNING]   - org.seqdoop.hadoop_bam.util.SAMOutputPreparer$1
[WARNING]   - org.seqdoop.hadoop_bam.QseqOutputFormat$QseqRecordWriter
[WARNING]   - org.seqdoop.hadoop_bam.FastaInputFormat
[WARNING]   - org.seqdoop.hadoop_bam.LineReader
[WARNING]   - org.seqdoop.hadoop_bam.util.BGZFCodec
[WARNING]   - 105 more...
[WARNING] args4j-2.33.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 74 overlapping classes: 
[WARNING]   - org.kohsuke.args4j.spi.DoubleOptionHandler
[WARNING]   - org.kohsuke.args4j.spi.MethodSetter
[WARNING]   - org.kohsuke.args4j.spi.MacAddressOptionHandler
[WARNING]   - org.kohsuke.args4j.spi.StringArrayOptionHandler
[WARNING]   - org.kohsuke.args4j.spi.SubCommand
[WARNING]   - org.kohsuke.args4j.spi.PatternOptionHandler
[WARNING]   - org.kohsuke.args4j.ParserProperties$1
[WARNING]   - org.kohsuke.args4j.OptionHandlerFilter$2
[WARNING]   - org.kohsuke.args4j.spi.MultiFileOptionHandler
[WARNING]   - org.kohsuke.args4j.OptionHandlerRegistry$DefaultConstructorHandlerFactory
[WARNING]   - 64 more...
[WARNING] utils-io-spark2_2.12-0.3.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 11 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.io.FileLocator
[WARNING]   - org.bdgenomics.utils.io.ByteAccess
[WARNING]   - org.bdgenomics.utils.io.HTTPRangedByteAccess
[WARNING]   - org.bdgenomics.utils.io.LocalFileByteAccess
[WARNING]   - org.bdgenomics.utils.io.HTTPFileLocator$
[WARNING]   - org.bdgenomics.utils.io.HTTPFileLocator
[WARNING]   - org.bdgenomics.utils.io.LocalFileLocator
[WARNING]   - org.bdgenomics.utils.io.FileLocator$
[WARNING]   - org.bdgenomics.utils.io.HTTPRangedByteAccess$
[WARNING]   - org.bdgenomics.utils.io.ByteArrayByteAccess
[WARNING]   - 1 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, commons-pool-1.6.jar define 55 overlapping classes: 
[WARNING]   - org.apache.commons.pool.PoolUtils$PoolableObjectFactoryAdaptor
[WARNING]   - org.apache.commons.pool.impl.GenericObjectPool$1
[WARNING]   - org.apache.commons.pool.impl.GenericObjectPool$Latch
[WARNING]   - org.apache.commons.pool.PoolUtils$ErodingFactor
[WARNING]   - org.apache.commons.pool.BasePoolableObjectFactory
[WARNING]   - org.apache.commons.pool.PoolUtils$KeyedPoolableObjectFactoryAdaptor
[WARNING]   - org.apache.commons.pool.impl.EvictionTimer$PrivilegedGetTccl
[WARNING]   - org.apache.commons.pool.impl.StackKeyedObjectPool
[WARNING]   - org.apache.commons.pool.BaseKeyedPoolableObjectFactory
[WARNING]   - org.apache.commons.pool.impl.GenericKeyedObjectPool$ObjectQueue
[WARNING]   - 45 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, kryo-2.24.0.jar define 193 overlapping classes: 
[WARNING]   - com.esotericsoftware.kryo.serializers.BeanSerializer$1
[WARNING]   - com.esotericsoftware.kryo.Registration
[WARNING]   - com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.Handler
[WARNING]   - com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ByteVector
[WARNING]   - com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.FieldVisitor
[WARNING]   - com.esotericsoftware.kryo.util.IntMap$Values
[WARNING]   - com.esotericsoftware.kryo.serializers.DefaultSerializers$IntSerializer
[WARNING]   - com.esotericsoftware.kryo.serializers.FieldSerializerUnsafeUtilImpl
[WARNING]   - com.esotericsoftware.kryo.serializers.JavaSerializer
[WARNING]   - com.esotericsoftware.kryo.serializers.ObjectField$ObjectIntField
[WARNING]   - 183 more...
[WARNING] xz-1.5.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 105 overlapping classes: 
[WARNING]   - org.tukaani.xz.lzma.LZMADecoder$LengthDecoder
[WARNING]   - org.tukaani.xz.index.IndexDecoder
[WARNING]   - org.tukaani.xz.lzma.LZMADecoder
[WARNING]   - org.tukaani.xz.lzma.LZMAEncoderFast
[WARNING]   - org.tukaani.xz.lzma.LZMAEncoder$LengthEncoder
[WARNING]   - org.tukaani.xz.BlockOutputStream
[WARNING]   - org.tukaani.xz.simple.SimpleFilter
[WARNING]   - org.tukaani.xz.rangecoder.RangeCoder
[WARNING]   - org.tukaani.xz.XZOutputStream
[WARNING]   - org.tukaani.xz.UncompressedLZMA2OutputStream
[WARNING]   - 95 more...
[WARNING] paranamer-2.8.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 21 overlapping classes: 
[WARNING]   - com.thoughtworks.paranamer.PositionalParanamer
[WARNING]   - com.thoughtworks.paranamer.JavadocParanamer
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer$Type
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer$1
[WARNING]   - com.thoughtworks.paranamer.JavadocParanamer$DirJavadocProvider
[WARNING]   - com.thoughtworks.paranamer.AnnotationParanamer$Jsr330Helper
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer$TypeCollector
[WARNING]   - com.thoughtworks.paranamer.AnnotationParanamer
[WARNING]   - com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader
[WARNING]   - 11 more...
[WARNING] slf4j-api-1.7.30.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 34 overlapping classes: 
[WARNING]   - org.slf4j.helpers.SubstituteLogger
[WARNING]   - org.slf4j.helpers.NamedLoggerBase
[WARNING]   - org.slf4j.helpers.NOPMDCAdapter
[WARNING]   - org.slf4j.MarkerFactory
[WARNING]   - org.slf4j.helpers.BasicMarker
[WARNING]   - org.slf4j.spi.LoggerFactoryBinder
[WARNING]   - org.slf4j.MDC$MDCCloseable
[WARNING]   - org.slf4j.spi.LocationAwareLogger
[WARNING]   - org.slf4j.helpers.MessageFormatter
[WARNING]   - org.slf4j.helpers.Util$ClassContextSecurityManager
[WARNING]   - 24 more...
[WARNING] utils-misc-spark2_2.12-0.3.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 2 overlapping classes: 
[WARNING]   - org.bdgenomics.utils.misc.MathUtils
[WARNING]   - org.bdgenomics.utils.misc.MathUtils$
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, parquet-encoding-1.10.1.jar define 305 overlapping classes: 
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongBE$Packer25
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingBE$Packer14
[WARNING]   - org.apache.parquet.column.values.bitpacking.BytePackerForLong
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingLE$Packer10
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongLE$Packer54
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongLE$Packer41
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingLE$Packer30
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingLE$Packer23
[WARNING]   - org.apache.parquet.column.values.bitpacking.LemireBitPackingLE$Packer19
[WARNING]   - org.apache.parquet.column.values.bitpacking.ByteBitPackingForLongLE$Packer21
[WARNING]   - 295 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, adam-codegen-spark2_2.12-0.33.0-SNAPSHOT.jar define 7 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProjectionEnums$
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProjectionEnums
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct
[WARNING]   - org.bdgenomics.adam.codegen.DumpSchemasToProduct$
[WARNING]   - org.bdgenomics.adam.codegen.Generator
[WARNING]   - org.bdgenomics.adam.codegen.ReflectSchema
[WARNING]   - org.bdgenomics.adam.codegen.ReflectSchema$
[WARNING] httpcore-4.4.11.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 252 overlapping classes: 
[WARNING]   - org.apache.http.protocol.HttpRequestHandler
[WARNING]   - org.apache.http.impl.io.ChunkedOutputStream
[WARNING]   - org.apache.http.protocol.ChainBuilder
[WARNING]   - org.apache.http.impl.entity.DisallowIdentityContentLengthStrategy
[WARNING]   - org.apache.http.impl.ConnSupport
[WARNING]   - org.apache.http.impl.io.DefaultHttpResponseParserFactory
[WARNING]   - org.apache.http.HttpClientConnection
[WARNING]   - org.apache.http.NameValuePair
[WARNING]   - org.apache.http.protocol.HttpExpectationVerifier
[WARNING]   - org.apache.http.impl.io.AbstractMessageWriter
[WARNING]   - 242 more...
[WARNING] animal-sniffer-annotations-1.17.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 1 overlapping classes: 
[WARNING]   - org.codehaus.mojo.animal_sniffer.IgnoreJRERequirement
[WARNING] parquet-column-1.10.1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 792 overlapping classes: 
[WARNING]   - org.apache.parquet.it.unimi.dsi.fastutil.longs.LongComparator
[WARNING]   - org.apache.parquet.column.values.dictionary.DictionaryValuesWriter$PlainIntegerDictionaryValuesWriter
[WARNING]   - org.apache.parquet.io.api.Binary$ByteBufferBackedBinary
[WARNING]   - org.apache.parquet.io.PrimitiveColumnIO
[WARNING]   - org.apache.parquet.it.unimi.dsi.fastutil.doubles.DoubleSortedSet
[WARNING]   - org.apache.parquet.io.BaseRecordReader
[WARNING]   - org.apache.parquet.column.ParquetProperties$1
[WARNING]   - org.apache.parquet.filter.ColumnPredicates$12
[WARNING]   - org.apache.parquet.column.UnknownColumnException
[WARNING]   - org.apache.parquet.schema.Types$BaseMapBuilder$ListValueBuilder
[WARNING]   - 782 more...
[WARNING] bdg-formats-0.15.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 58 overlapping classes: 
[WARNING]   - org.bdgenomics.formats.avro.VariantCallingAnnotations$Builder
[WARNING]   - org.bdgenomics.formats.avro.VariantAnnotation$Builder
[WARNING]   - org.bdgenomics.formats.avro.Reference$1
[WARNING]   - org.bdgenomics.formats.avro.VariantAnnotationMessage
[WARNING]   - org.bdgenomics.formats.avro.Alignment$Builder
[WARNING]   - org.bdgenomics.formats.avro.OntologyTerm$1
[WARNING]   - org.bdgenomics.formats.avro.Alignment$1
[WARNING]   - org.bdgenomics.formats.avro.Feature$1
[WARNING]   - org.bdgenomics.formats.avro.Sequence$1
[WARNING]   - org.bdgenomics.formats.avro.ReadGroup$Builder
[WARNING]   - 48 more...
[WARNING] commons-jexl-2.1.1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 178 overlapping classes: 
[WARNING]   - org.apache.commons.jexl2.internal.AbstractExecutor$Get
[WARNING]   - org.apache.commons.jexl2.introspection.JexlPropertyGet
[WARNING]   - org.apache.commons.jexl2.parser.StringParser
[WARNING]   - org.apache.commons.jexl2.parser.ASTBitwiseOrNode
[WARNING]   - org.apache.commons.jexl2.internal.introspection.MethodKey$1
[WARNING]   - org.apache.commons.jexl2.Main
[WARNING]   - org.apache.commons.jexl2.parser.ASTForeachStatement
[WARNING]   - org.apache.commons.jexl2.introspection.Sandbox
[WARNING]   - org.apache.commons.jexl2.internal.introspection.ClassMap
[WARNING]   - org.apache.commons.jexl2.parser.ASTFunctionNode
[WARNING]   - 168 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, objenesis-2.1.jar define 37 overlapping classes: 
[WARNING]   - org.objenesis.ObjenesisBase
[WARNING]   - org.objenesis.instantiator.gcj.GCJInstantiator
[WARNING]   - org.objenesis.strategy.SingleInstantiatorStrategy
[WARNING]   - org.objenesis.ObjenesisHelper
[WARNING]   - org.objenesis.instantiator.sun.SunReflectionFactoryHelper
[WARNING]   - org.objenesis.instantiator.jrockit.JRockitLegacyInstantiator
[WARNING]   - org.objenesis.instantiator.sun.SunReflectionFactoryInstantiator
[WARNING]   - org.objenesis.instantiator.basic.NullInstantiator
[WARNING]   - org.objenesis.instantiator.android.Android17Instantiator
[WARNING]   - org.objenesis.instantiator.ObjectInstantiator
[WARNING]   - 27 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, grizzled-slf4j_2.12-1.3.4.jar define 3 overlapping classes: 
[WARNING]   - grizzled.slf4j.Logger$
[WARNING]   - grizzled.slf4j.Logging
[WARNING]   - grizzled.slf4j.Logger
[WARNING] snappy-java-1.1.1.3.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 19 overlapping classes: 
[WARNING]   - org.xerial.snappy.SnappyLoader
[WARNING]   - org.xerial.snappy.SnappyFramedInputStream$FrameMetaData
[WARNING]   - org.xerial.snappy.SnappyFramedInputStream
[WARNING]   - org.xerial.snappy.SnappyOutputStream
[WARNING]   - org.xerial.snappy.SnappyErrorCode
[WARNING]   - org.xerial.snappy.SnappyFramedOutputStream
[WARNING]   - org.xerial.snappy.BufferRecycler
[WARNING]   - org.xerial.snappy.SnappyBundleActivator
[WARNING]   - org.xerial.snappy.SnappyError
[WARNING]   - org.xerial.snappy.SnappyFramedInputStream$FrameAction
[WARNING]   - 9 more...
[WARNING] commons-logging-1.2.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 28 overlapping classes: 
[WARNING]   - org.apache.commons.logging.LogSource
[WARNING]   - org.apache.commons.logging.impl.ServletContextCleaner
[WARNING]   - org.apache.commons.logging.Log
[WARNING]   - org.apache.commons.logging.LogFactory$3
[WARNING]   - org.apache.commons.logging.impl.LogFactoryImpl$2
[WARNING]   - org.apache.commons.logging.impl.LogKitLogger
[WARNING]   - org.apache.commons.logging.LogConfigurationException
[WARNING]   - org.apache.commons.logging.impl.Jdk14Logger
[WARNING]   - org.apache.commons.logging.impl.WeakHashtable$Referenced
[WARNING]   - org.apache.commons.logging.impl.WeakHashtable$WeakKey
[WARNING]   - 18 more...
[WARNING] parquet-hadoop-1.10.1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 162 overlapping classes: 
[WARNING]   - org.apache.parquet.hadoop.mapred.DeprecatedParquetOutputFormat
[WARNING]   - org.apache.parquet.format.converter.ParquetMetadataConverter$RangeMetadataFilter
[WARNING]   - org.apache.parquet.hadoop.api.WriteSupport$WriteContext
[WARNING]   - org.apache.parquet.hadoop.ColumnChunkPageReadStore$ColumnChunkPageReader$1
[WARNING]   - org.apache.parquet.format.converter.ParquetMetadataConverter$2
[WARNING]   - org.apache.parquet.hadoop.metadata.ColumnChunkMetaData
[WARNING]   - org.apache.parquet.hadoop.api.WriteSupport$FinalizedWriteContext
[WARNING]   - org.apache.parquet.hadoop.util.HadoopPositionOutputStream
[WARNING]   - org.apache.parquet.HadoopReadOptions$1
[WARNING]   - org.apache.parquet.format.converter.ParquetMetadataConverter$NoFilter
[WARNING]   - 152 more...
[WARNING] jackson-core-asl-1.9.13.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 121 overlapping classes: 
[WARNING]   - org.codehaus.jackson.annotate.JsonManagedReference
[WARNING]   - org.codehaus.jackson.util.DefaultPrettyPrinter$FixedSpaceIndenter
[WARNING]   - org.codehaus.jackson.JsonGenerationException
[WARNING]   - org.codehaus.jackson.util.BufferRecycler$CharBufferType
[WARNING]   - org.codehaus.jackson.io.UTF32Reader
[WARNING]   - org.codehaus.jackson.sym.Name1
[WARNING]   - org.codehaus.jackson.util.MinimalPrettyPrinter
[WARNING]   - org.codehaus.jackson.impl.JsonParserBase
[WARNING]   - org.codehaus.jackson.sym.CharsToNameCanonicalizer$Bucket
[WARNING]   - org.codehaus.jackson.annotate.JsonValue
[WARNING]   - 111 more...
[WARNING] adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar, adam-cli-spark2_2.12-0.33.0-SNAPSHOT.jar define 54 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.cli.PrintADAMArgs
[WARNING]   - org.bdgenomics.adam.cli.PrintADAM$
[WARNING]   - org.bdgenomics.adam.cli.FlagStat$
[WARNING]   - org.bdgenomics.adam.cli.View
[WARNING]   - org.bdgenomics.adam.cli.FlagStat
[WARNING]   - org.bdgenomics.adam.cli.TransformVariantsArgs
[WARNING]   - org.bdgenomics.adam.cli.TransformVariants$
[WARNING]   - org.bdgenomics.adam.cli.TransformFragments
[WARNING]   - org.bdgenomics.adam.cli.TransformSequences
[WARNING]   - org.bdgenomics.adam.cli.ADAMMain$
[WARNING]   - 44 more...
[WARNING] adam-core-spark2_2.12-0.33.0-SNAPSHOT.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 608 overlapping classes: 
[WARNING]   - org.bdgenomics.adam.io.InterleavedFastqInputFormat
[WARNING]   - org.bdgenomics.adam.rdd.fragment.DatasetBoundFragmentDataset
[WARNING]   - org.bdgenomics.adam.rdd.fragment.FragmentDataset$
[WARNING]   - org.bdgenomics.adam.models.ReferenceRegion
[WARNING]   - org.bdgenomics.adam.rdd.read.recalibration.RecalibrationTable
[WARNING]   - org.bdgenomics.adam.rdd.variant.DatasetBoundVariantContextDataset$
[WARNING]   - org.bdgenomics.adam.rdd.read.AnySAMOutFormatter
[WARNING]   - org.bdgenomics.adam.rdd.variant.GenotypeDataset$$typecreator1$1
[WARNING]   - org.bdgenomics.adam.rdd.read.recalibration.BaseQualityRecalibration
[WARNING]   - org.bdgenomics.adam.rdd.LocatableReferenceRegion
[WARNING]   - 598 more...
[WARNING] jsr203hadoop-1.0.3.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 27 overlapping classes: 
[WARNING]   - hdfs.jsr203.HadoopDirectoryStream
[WARNING]   - hdfs.jsr203.HadoopPath
[WARNING]   - hdfs.jsr203.HadoopFileSystem$1
[WARNING]   - hdfs.jsr203.HadoopFileOwnerAttributeView
[WARNING]   - hdfs.jsr203.HadoopUserPrincipal
[WARNING]   - hdfs.jsr203.IAttributeReader
[WARNING]   - hdfs.jsr203.HadoopPath$1
[WARNING]   - hdfs.jsr203.HadoopBasicFileAttributes
[WARNING]   - hdfs.jsr203.HadoopDirectoryStream$1
[WARNING]   - hdfs.jsr203.package-info
[WARNING]   - 17 more...
[WARNING] commons-compress-1.8.1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 191 overlapping classes: 
[WARNING]   - org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry
[WARNING]   - org.apache.commons.compress.archivers.dump.ShortFileException
[WARNING]   - org.apache.commons.compress.utils.CountingInputStream
[WARNING]   - org.apache.commons.compress.compressors.bzip2.CRC
[WARNING]   - org.apache.commons.compress.compressors.bzip2.BZip2CompressorOutputStream
[WARNING]   - org.apache.commons.compress.archivers.dump.DumpArchiveEntry
[WARNING]   - org.apache.commons.compress.changes.ChangeSetPerformer$ArchiveEntryIterator
[WARNING]   - org.apache.commons.compress.compressors.bzip2.BlockSort
[WARNING]   - org.apache.commons.compress.archivers.tar.TarArchiveEntry
[WARNING]   - org.apache.commons.compress.archivers.dump.UnsupportedCompressionAlgorithmException
[WARNING]   - 181 more...
[WARNING] j2objc-annotations-1.1.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 12 overlapping classes: 
[WARNING]   - com.google.j2objc.annotations.Property
[WARNING]   - com.google.j2objc.annotations.RetainedWith
[WARNING]   - com.google.j2objc.annotations.RetainedLocalRef
[WARNING]   - com.google.j2objc.annotations.J2ObjCIncompatible
[WARNING]   - com.google.j2objc.annotations.AutoreleasePool
[WARNING]   - com.google.j2objc.annotations.LoopTranslation$LoopStyle
[WARNING]   - com.google.j2objc.annotations.ReflectionSupport$Level
[WARNING]   - com.google.j2objc.annotations.ReflectionSupport
[WARNING]   - com.google.j2objc.annotations.Weak
[WARNING]   - com.google.j2objc.annotations.WeakOuter
[WARNING]   - 2 more...
[WARNING] avro-1.8.2.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 1172 overlapping classes: 
[WARNING]   - org.apache.avro.message.SchemaStore
[WARNING]   - avro.shaded.com.google.common.collect.SingletonImmutableList
[WARNING]   - org.apache.avro.io.EncoderFactory$DefaultEncoderFactory
[WARNING]   - avro.shaded.com.google.common.collect.Iterables$15
[WARNING]   - org.apache.avro.GuavaClasses
[WARNING]   - avro.shaded.com.google.common.collect.Sets$PowerSet$1$1
[WARNING]   - avro.shaded.com.google.common.collect.RegularImmutableMap
[WARNING]   - org.apache.avro.generic.GenericDatumReader$2
[WARNING]   - avro.shaded.com.google.common.collect.Synchronized$SynchronizedSortedSet
[WARNING]   - org.apache.avro.file.BZip2Codec
[WARNING]   - 1162 more...
[WARNING] minlog-1.2.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 2 overlapping classes: 
[WARNING]   - com.esotericsoftware.minlog.Log
[WARNING]   - com.esotericsoftware.minlog.Log$Logger
[WARNING] error_prone_annotations-2.2.0.jar, adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar define 22 overlapping classes: 
[WARNING]   - com.google.errorprone.annotations.Var
[WARNING]   - com.google.errorprone.annotations.NoAllocation
[WARNING]   - com.google.errorprone.annotations.IncompatibleModifiers
[WARNING]   - com.google.errorprone.annotations.CompatibleWith
[WARNING]   - com.google.errorprone.annotations.concurrent.LockMethod
[WARNING]   - com.google.errorprone.annotations.FormatString
[WARNING]   - com.google.errorprone.annotations.DoNotCall
[WARNING]   - com.google.errorprone.annotations.Immutable
[WARNING]   - com.google.errorprone.annotations.RestrictedApi
[WARNING]   - com.google.errorprone.annotations.CompileTimeConstant
[WARNING]   - 12 more...
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try to manually exclude artifacts based on
[WARNING] mvn dependency:tree -Ddetail=true and the above output.
[WARNING] See http://maven.apache.org/plugins/maven-shade-plugin/
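
# The overlapping classes flagged above are expected for an uber jar; if they
# ever needed manual resolution, the warning's own suggestion is the starting
# point. A minimal sketch (the exclude coordinates are illustrative, taken
# from the jackson-core-asl overlap listed above, and are not applied by this
# build):
#
#   mvn dependency:tree -Ddetail=true
#
# ...then exclude the offending artifact from the shaded jar via the
# maven-shade-plugin configuration, e.g.:
#   <artifactSet>
#     <excludes>
#       <exclude>org.codehaus.jackson:jackson-core-asl</exclude>
#     </excludes>
#   </artifactSet>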
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT-shaded.jar
[INFO] 
[INFO] ---------------< org.bdgenomics.adam:adam-r-spark2_2.12 >---------------
[INFO] Building ADAM_2.12: R APIs 0.33.0-SNAPSHOT                         [8/8]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-r-spark2_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-r-spark2_2.12 ---
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-r-spark2_2.12 ---
[INFO] Modified 0 of 0 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ adam-r-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-r/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-r-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:exec (doc-r) @ adam-r-spark2_2.12 ---

R version 3.6.3 (2020-02-29) -- "Holding the Windsock"
Copyright (C) 2020 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> library(devtools);devtools::document()
Loading required package: usethis
Updating bdgenomics.adam documentation
Loading bdgenomics.adam
Creating a new generic function for ‘pipe’ in package ‘bdgenomics.adam’
Creating a new generic function for ‘transform’ in package ‘bdgenomics.adam’
Creating a new generic function for ‘save’ in package ‘bdgenomics.adam’
Creating a new generic function for ‘sort’ in package ‘bdgenomics.adam’

Attaching package: ‘SparkR’

The following objects are masked from ‘package:stats’:

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from ‘package:base’:

    as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
    rank, rbind, sample, startsWith, subset, summary, transform, union

Writing NAMESPACE
Writing NAMESPACE
Writing ADAMContext.Rd
Writing GenomicDataset.Rd
Writing AlignmentDataset.Rd
Writing CoverageDataset.Rd
Writing FragmentDataset.Rd
Writing toVariantContexts.Rd
Writing toVariants.Rd
Writing SliceDataset.Rd
Writing VariantContextDataset.Rd
Writing createADAMContext.Rd
Writing loadAlignments-ADAMContext-character-method.Rd
Writing loadDnaSequences-ADAMContext-character-method.Rd
Writing loadProteinSequences-ADAMContext-character-method.Rd
Writing loadRnaSequences-ADAMContext-character-method.Rd
Writing loadSlices-ADAMContext-character-method.Rd
Writing loadFragments-ADAMContext-character-method.Rd
Writing loadFeatures-ADAMContext-character-method.Rd
Writing loadCoverage-ADAMContext-character-method.Rd
Writing loadGenotypes-ADAMContext-character-method.Rd
Writing loadVariants-ADAMContext-character-method.Rd
Writing FeatureDataset.Rd
Writing GenotypeDataset.Rd
Writing SequenceDataset.Rd
Writing VariantDataset.Rd
Writing pipe-GenomicDataset-ANY-character-character-character-method.Rd
Writing cache-GenomicDataset-method.Rd
Writing persist-GenomicDataset-character-method.Rd
Writing unpersist-GenomicDataset-method.Rd
Writing sort-GenomicDataset-method.Rd
Writing sortLexicographically-GenomicDataset-method.Rd
Writing toDF-GenomicDataset-method.Rd
Writing transform-GenomicDataset-function-method.Rd
Writing transmute-GenomicDataset-function-character-method.Rd
Writing broadcastRegionJoin-GenomicDataset-GenomicDataset-method.Rd
Writing rightOuterBroadcastRegionJoin-GenomicDataset-GenomicDataset-method.Rd
Writing broadcastRegionJoinAndGroupByRight-GenomicDataset-GenomicDataset-method.Rd
Writing rightOuterBroadcastRegionJoinAndGroupByRight-GenomicDataset-GenomicDataset-method.Rd
Writing shuffleRegionJoin-GenomicDataset-GenomicDataset-method.Rd
Writing rightOuterShuffleRegionJoin-GenomicDataset-GenomicDataset-method.Rd
Writing leftOuterShuffleRegionJoin-GenomicDataset-GenomicDataset-method.Rd
Writing leftOuterShuffleRegionJoinAndGroupByLeft-GenomicDataset-GenomicDataset-method.Rd
Writing fullOuterShuffleRegionJoin-GenomicDataset-GenomicDataset-method.Rd
Writing rightOuterShuffleRegionJoinAndGroupByLeft-GenomicDataset-GenomicDataset-method.Rd
Writing shuffleRegionJoinAndGroupByLeft-GenomicDataset-GenomicDataset-method.Rd
Writing toFragments-AlignmentDataset-method.Rd
Writing saveAsSam-AlignmentDataset-character-method.Rd
Writing toCoverage-AlignmentDataset-method.Rd
Writing save-AlignmentDataset-character-method.Rd
Writing countKmers-AlignmentDataset-numeric-method.Rd
Writing sortByReadName-AlignmentDataset-method.Rd
Writing sortByReferencePosition-AlignmentDataset-method.Rd
Writing sortByReferencePositionAndIndex-AlignmentDataset-method.Rd
Writing markDuplicates-AlignmentDataset-method.Rd
Writing recalibrateBaseQualities-AlignmentDataset-VariantDataset-character-method.Rd
Writing realignIndels-AlignmentDataset-method.Rd
Writing save-CoverageDataset-character-method.Rd
Writing collapse-CoverageDataset-method.Rd
Writing toFeatures-CoverageDataset-method.Rd
Writing coverage-CoverageDataset-method.Rd
Writing flatten-CoverageDataset-method.Rd
Writing save-FeatureDataset-character-method.Rd
Writing toCoverage-FeatureDataset-method.Rd
Writing toAlignments-FragmentDataset-method.Rd
Writing markDuplicates-FragmentDataset-method.Rd
Writing save-FragmentDataset-character-method.Rd
Writing saveAsParquet-GenotypeDataset-character-method.Rd
Writing toVariants-GenotypeDataset-method.Rd
Writing toVariantContexts-GenotypeDataset-method.Rd
Writing save-SequenceDataset-character-method.Rd
Writing save-SliceDataset-character-method.Rd
Writing flankAdjacentFragments-SliceDataset-numeric-method.Rd
Writing saveAsParquet-VariantDataset-character-method.Rd
Writing toVariantContexts-VariantDataset-method.Rd
Writing saveAsVcf-VariantContextDataset-character-method.Rd
Warning message:
roxygen2 requires Encoding: UTF-8 
> 
> 
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:exec (dev-r) @ adam-r-spark2_2.12 ---
* checking for file ‘bdgenomics.adam/DESCRIPTION’ ... OK
* preparing ‘bdgenomics.adam’:
* checking DESCRIPTION meta-information ... OK
* checking for LF line-endings in source and make files and shell scripts
* checking for empty or unneeded directories
* building ‘bdgenomics.adam_0.32.0.tar.gz’
Warning in utils::tar(filepath, pkgname, compression = compression, compression_level = 9L,  :
  storing paths of more than 100 bytes is not portable:
  ‘bdgenomics.adam/man/rightOuterBroadcastRegionJoinAndGroupByRight-GenomicDataset-GenomicDataset-method.Rd’
Warning in utils::tar(filepath, pkgname, compression = compression, compression_level = 9L,  :
  storing paths of more than 100 bytes is not portable:
  ‘bdgenomics.adam/man/rightOuterShuffleRegionJoinAndGroupByLeft-GenomicDataset-GenomicDataset-method.Rd’

[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:compile (default-compile) @ adam-r-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ adam-r-spark2_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-r/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-r-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- exec-maven-plugin:1.5.0:exec (test-r) @ adam-r-spark2_2.12 ---
* using log directory ‘/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-r/bdgenomics.adam.Rcheck’
* using R version 3.6.3 (2020-02-29)
* using platform: x86_64-pc-linux-gnu (64-bit)
* using session charset: UTF-8
* checking for file ‘bdgenomics.adam/DESCRIPTION’ ... OK
* checking extension type ... Package
* this is package ‘bdgenomics.adam’ version ‘0.32.0’
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... NOTE
Found the following non-portable file paths:
  bdgenomics.adam/man/rightOuterBroadcastRegionJoinAndGroupByRight-GenomicDataset-GenomicDataset-method.Rd
  bdgenomics.adam/man/rightOuterShuffleRegionJoinAndGroupByLeft-GenomicDataset-GenomicDataset-method.Rd

Tarballs are only required to store paths of up to 100 bytes and cannot
store those of more than 256 bytes, with restrictions including to 100
bytes for the final component.
See section ‘Package structure’ in the ‘Writing R Extensions’ manual.
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘bdgenomics.adam’ can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking DESCRIPTION meta-information ... NOTE
Checking should be performed on sources prepared by ‘R CMD build’.
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... WARNING
Undocumented arguments in documentation object 'toVariants,GenotypeDataset-method'
  ‘ardd’

Undocumented arguments in documentation object 'toVariants'
  ‘...’

Functions with \usage entries need to have the appropriate \alias
entries, and all their arguments documented.
The \usage entries must correspond to syntactically valid R code.
See chapter ‘Writing R documentation files’ in the ‘Writing R
Extensions’ manual.
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking examples ... NONE
* checking for unstated dependencies in ‘tests’ ... OK
* checking tests ...
  Running ‘testthat.R’
 OK
* checking PDF version of manual ... OK
* DONE

Status: 1 WARNING, 2 NOTEs
See
  ‘/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-r/bdgenomics.adam.Rcheck/00check.log’
for details.
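
# The WARNING above comes from generated Rd files that lack @param entries for
# 'ardd' and '...'. A hedged sketch of the usual remedy, run from the package
# directory as the doc-r execution above does (the exact R source file to edit
# is not shown in this log):
#
#   add roxygen tags such as  #' @param ardd  and  #' @param ...  to the
#   method's R source, then regenerate the docs:
#
#   R -e 'library(devtools); devtools::document()'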


[INFO] 
[INFO] --- maven-compiler-plugin:3.8.0:testCompile (default-testCompile) @ adam-r-spark2_2.12 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M3:test (default-test) @ adam-r-spark2_2.12 ---
[INFO] No tests to run.
[INFO] 
[INFO] --- maven-jar-plugin:3.2.0:jar (default-jar) @ adam-r-spark2_2.12 ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-r/target/adam-r-spark2_2.12-0.33.0-SNAPSHOT.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for ADAM_2.12 0.33.0-SNAPSHOT:
[INFO] 
[INFO] ADAM_2.12 .......................................... SUCCESS [  8.109 s]
[INFO] ADAM_2.12: Shader workaround ....................... SUCCESS [  1.295 s]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils ........... SUCCESS [  1.192 s]
[INFO] ADAM_2.12: Core .................................... SUCCESS [ 36.751 s]
[INFO] ADAM_2.12: APIs for Java, Python ................... SUCCESS [  2.910 s]
[INFO] ADAM_2.12: CLI ..................................... SUCCESS [  4.100 s]
[INFO] ADAM_2.12: Assembly ................................ SUCCESS [ 14.868 s]
[INFO] ADAM_2.12: R APIs .................................. SUCCESS [01:06 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  02:15 min
[INFO] Finished at: 2020-09-09T12:27:46-07:00
[INFO] ------------------------------------------------------------------------

# define filenames
BAM=mouse_chrM.bam
+ BAM=mouse_chrM.bam
READS=${BAM}.reads.adam
+ READS=mouse_chrM.bam.reads.adam
SORTED_READS=${BAM}.reads.sorted.adam
+ SORTED_READS=mouse_chrM.bam.reads.sorted.adam
FRAGMENTS=${BAM}.fragments.adam
+ FRAGMENTS=mouse_chrM.bam.fragments.adam
    
# fetch our input dataset
echo "Fetching BAM file"
+ echo 'Fetching BAM file'
Fetching BAM file
rm -rf ${BAM}
+ rm -rf mouse_chrM.bam
wget -q https://s3.amazonaws.com/bdgenomics-test/${BAM}
+ wget -q https://s3.amazonaws.com/bdgenomics-test/mouse_chrM.bam

# once fetched, convert BAM to ADAM
echo "Converting BAM to ADAM read format"
+ echo 'Converting BAM to ADAM read format'
Converting BAM to ADAM read format
rm -rf ${READS}
+ rm -rf mouse_chrM.bam.reads.adam
${ADAM} transformAlignments ${BAM} ${READS}
+ ./bin/adam-submit transformAlignments mouse_chrM.bam mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.6-bin-without-hadoop-scala-2.12/bin/spark-submit
20/09/09 12:27:58 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/09/09 12:27:58 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/09/09 12:27:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/09/09 12:27:58 INFO cli.ADAMMain: ADAM invoked with args: "transformAlignments" "mouse_chrM.bam" "mouse_chrM.bam.reads.adam"
20/09/09 12:27:58 INFO spark.SparkContext: Running Spark version 2.4.6
20/09/09 12:27:58 INFO spark.SparkContext: Submitted application: transformAlignments
20/09/09 12:27:59 INFO spark.SecurityManager: Changing view acls to: jenkins
20/09/09 12:27:59 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/09/09 12:27:59 INFO spark.SecurityManager: Changing view acls groups to: 
20/09/09 12:27:59 INFO spark.SecurityManager: Changing modify acls groups to: 
20/09/09 12:27:59 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/09/09 12:27:59 INFO util.Utils: Successfully started service 'sparkDriver' on port 40843.
20/09/09 12:27:59 INFO spark.SparkEnv: Registering MapOutputTracker
20/09/09 12:27:59 INFO spark.SparkEnv: Registering BlockManagerMaster
20/09/09 12:27:59 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/09/09 12:27:59 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/09/09 12:27:59 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-48f070f0-e1f0-41ca-a87d-39f877bbe86c
20/09/09 12:27:59 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/09/09 12:27:59 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/09/09 12:27:59 INFO util.log: Logging initialized @2623ms
20/09/09 12:27:59 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/09/09 12:27:59 INFO server.Server: Started @2705ms
20/09/09 12:27:59 INFO server.AbstractConnector: Started ServerConnector@7692cd34{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:27:59 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49ef32e0{/jobs,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3be8821f{/jobs/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/jobs/job,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/jobs/job/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@74a9c4b0{/stages,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/stages/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c05a54d{/stages/stage,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@214894fc{/stages/stage/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10567255{/stages/pool,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/stages/pool/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c4ee95c{/storage,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79c4715d{/storage/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5aa360ea{/storage/rdd,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6548bb7d{/storage/rdd/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e27ba81{/environment,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54336c81{/environment/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1556f2dd{/executors,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@35e52059{/executors/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62577d6{/executors/threadDump,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49bd54f7{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b5f8707{/static,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@17ae98d7{/,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59221b97{/api,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@704b2127{/jobs/job/kill,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ee39da0{/stages/stage/kill,null,AVAILABLE,@Spark}
20/09/09 12:27:59 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/09/09 12:27:59 INFO spark.SparkContext: Added JAR file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar at spark://192.168.10.31:40843/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679679750
20/09/09 12:27:59 INFO executor.Executor: Starting executor ID driver on host localhost
20/09/09 12:27:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41685.
20/09/09 12:27:59 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:41685
20/09/09 12:27:59 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/09/09 12:27:59 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 41685, None)
20/09/09 12:27:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:41685 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 41685, None)
20/09/09 12:27:59 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 41685, None)
20/09/09 12:27:59 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 41685, None)
20/09/09 12:28:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7bef452c{/metrics/json,null,AVAILABLE,@Spark}
20/09/09 12:28:00 INFO rdd.ADAMContext: Loading mouse_chrM.bam as BAM/CRAM/SAM and converting to Alignments.
20/09/09 12:28:00 INFO rdd.ADAMContext: Loaded header from file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam
20/09/09 12:28:01 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 295.6 KB, free 366.0 MB)
20/09/09 12:28:01 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 24.0 KB, free 366.0 MB)
20/09/09 12:28:01 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:41685 (size: 24.0 KB, free: 366.3 MB)
20/09/09 12:28:01 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:2054
20/09/09 12:28:03 INFO read.RDDBoundAlignmentDataset: Saving data in ADAM format
20/09/09 12:28:03 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:03 INFO input.FileInputFormat: Total input paths to process : 1
20/09/09 12:28:03 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
20/09/09 12:28:03 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
20/09/09 12:28:03 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (runJob at SparkHadoopWriter.scala:78)
20/09/09 12:28:03 INFO scheduler.DAGScheduler: Parents of final stage: List()
20/09/09 12:28:03 INFO scheduler.DAGScheduler: Missing parents: List()
20/09/09 12:28:03 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3805), which has no missing parents
20/09/09 12:28:03 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 85.6 KB, free 365.9 MB)
20/09/09 12:28:03 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 31.5 KB, free 365.9 MB)
20/09/09 12:28:03 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:41685 (size: 31.5 KB, free: 366.2 MB)
20/09/09 12:28:03 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/09/09 12:28:03 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3805) (first 15 tasks are for partitions Vector(0))
20/09/09 12:28:03 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/09/09 12:28:03 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7441 bytes)
20/09/09 12:28:03 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/09/09 12:28:03 INFO executor.Executor: Fetching spark://192.168.10.31:40843/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679679750
20/09/09 12:28:03 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:40843 after 39 ms (0 ms spent in bootstraps)
20/09/09 12:28:03 INFO util.Utils: Fetching spark://192.168.10.31:40843/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to /tmp/spark-33dabec3-1c0a-4a10-9f6a-f46b712814e4/userFiles-df7f2771-3eea-41f0-9873-4e9a0baeb8cd/fetchFileTemp22799955782647872.tmp
20/09/09 12:28:03 INFO executor.Executor: Adding file:/tmp/spark-33dabec3-1c0a-4a10-9f6a-f46b712814e4/userFiles-df7f2771-3eea-41f0-9873-4e9a0baeb8cd/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to class loader
20/09/09 12:28:03 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam:83361792-833134657535
20/09/09 12:28:04 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:04 INFO codec.CodecConfig: Compression: GZIP
20/09/09 12:28:04 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Dictionary is on
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Validation is off
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
20/09/09 12:28:04 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
20/09/09 12:28:04 INFO compress.CodecPool: Got brand-new compressor [.gz]
Ignoring SAM validation error: ERROR: Record 162622, Read name 613F0AAXX100423:3:58:9979:16082, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162624, Read name 613F0AAXX100423:6:13:3141:11793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162625, Read name 613F0AAXX100423:8:39:18592:13552, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162635, Read name 613F1AAXX100423:7:2:13114:10698, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162637, Read name 613F1AAXX100423:6:100:8840:11167, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162639, Read name 613F1AAXX100423:8:15:10944:11181, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162640, Read name 613F1AAXX100423:8:17:5740:10104, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162651, Read name 613F1AAXX100423:1:53:11097:8261, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162654, Read name 613F1AAXX100423:2:112:16779:19612, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162657, Read name 613F0AAXX100423:8:28:7084:17683, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162659, Read name 613F0AAXX100423:8:39:19796:12794, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162662, Read name 613F1AAXX100423:5:116:9339:3264, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162667, Read name 613F0AAXX100423:4:67:2015:3054, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162669, Read name 613F0AAXX100423:7:7:11297:11738, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162674, Read name 613F0AAXX100423:6:59:10490:20829, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162678, Read name 613F1AAXX100423:8:11:17603:4766, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162682, Read name 613F0AAXX100423:5:86:10814:10257, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162683, Read name 613F0AAXX100423:5:117:14178:6111, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162685, Read name 613F0AAXX100423:2:3:13563:6720, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162689, Read name 613F0AAXX100423:7:59:16009:15799, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162696, Read name 613F0AAXX100423:5:31:9663:18252, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162698, Read name 613F1AAXX100423:2:27:12264:14626, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162699, Read name 613F0AAXX100423:1:120:19003:6647, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162702, Read name 613F1AAXX100423:3:37:6972:18407, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162704, Read name 613F1AAXX100423:3:77:6946:3880, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162706, Read name 613F0AAXX100423:7:48:2692:3492, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162708, Read name 613F1AAXX100423:7:80:8790:1648, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162710, Read name 6141AAAXX100423:5:30:15036:17610, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162712, Read name 613F1AAXX100423:8:80:6261:4465, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162713, Read name 6141AAAXX100423:5:74:5542:6195, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162715, Read name 613F1AAXX100423:5:14:14844:13639, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162718, Read name 613F1AAXX100423:7:112:14569:8480, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162725, Read name 613F1AAXX100423:4:56:10160:9879, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162727, Read name 6141AAAXX100423:7:89:12209:9221, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162731, Read name 6141AAAXX100423:6:55:1590:19793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162732, Read name 6141AAAXX100423:7:102:16679:12368, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162734, Read name 613F1AAXX100423:2:7:4909:18472, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162737, Read name 6141AAAXX100423:4:73:6574:10572, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162741, Read name 6141AAAXX100423:1:8:14113:12655, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162743, Read name 6141AAAXX100423:3:40:7990:5056, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162744, Read name 6141AAAXX100423:4:36:15793:3411, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162745, Read name 6141AAAXX100423:8:83:1139:18985, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162746, Read name 6141AAAXX100423:5:7:18196:13562, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162748, Read name 6141AAAXX100423:3:114:5639:7123, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162751, Read name 6141AAAXX100423:7:47:4898:8640, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162753, Read name 6141AAAXX100423:3:64:8064:8165, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162756, Read name 613F1AAXX100423:1:105:14386:1684, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162757, Read name 613F1AAXX100423:6:98:1237:19470, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162761, Read name 613F1AAXX100423:7:106:19658:9261, MAPQ should be 0 for unmapped read.
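
# The htsjdk validation errors above are ignored because the BAM is being
# parsed leniently. As an assumption not exercised by this run:
# transformAlignments accepts a -stringency option (htsjdk
# ValidationStringency values), so these messages could be silenced or made
# fatal, e.g.:
#
#   ${ADAM} transformAlignments -stringency SILENT mouse_chrM.bam mouse_chrM.bam.reads.adam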
20/09/09 12:28:13 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16043959
20/09/09 12:28:13 INFO output.FileOutputCommitter: Saved output of task 'attempt_20200909122803_0002_r_000000_0' to file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/_temporary/0/task_20200909122803_0002_r_000000
20/09/09 12:28:13 INFO mapred.SparkHadoopMapRedUtil: attempt_20200909122803_0002_r_000000_0: Committed
20/09/09 12:28:13 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 856 bytes result sent to driver
20/09/09 12:28:13 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 10294 ms on localhost (executor driver) (1/1)
20/09/09 12:28:13 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/09/09 12:28:13 INFO scheduler.DAGScheduler: ResultStage 0 (runJob at SparkHadoopWriter.scala:78) finished in 10.446 s
20/09/09 12:28:13 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 10.516714 s
20/09/09 12:28:13 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
20/09/09 12:28:13 INFO io.SparkHadoopWriter: Job job_20200909122803_0002 committed.
20/09/09 12:28:13 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/09/09 12:28:13 INFO server.AbstractConnector: Stopped Spark@7692cd34{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:28:13 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/09/09 12:28:13 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/09/09 12:28:13 INFO memory.MemoryStore: MemoryStore cleared
20/09/09 12:28:13 INFO storage.BlockManager: BlockManager stopped
20/09/09 12:28:13 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/09/09 12:28:13 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/09/09 12:28:13 INFO spark.SparkContext: Successfully stopped SparkContext
20/09/09 12:28:13 INFO util.ShutdownHookManager: Shutdown hook called
20/09/09 12:28:13 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-33dabec3-1c0a-4a10-9f6a-f46b712814e4
20/09/09 12:28:13 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-caa3921b-c8e5-497c-af52-c1a1d174059b
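
# The conversion writes the reads as a Parquet directory rather than a single
# file; a quick sanity check of the output (the part-file name matches what
# the sort job reads back later in this log):
#
#   ls mouse_chrM.bam.reads.adam
#   # expect part-r-00000.gz.parquet, plus a _SUCCESS marker from the Hadoop
#   # output committer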

# then, sort the BAM
echo "Converting BAM to ADAM read format with sorting"
+ echo 'Converting BAM to ADAM read format with sorting'
Converting BAM to ADAM read format with sorting
rm -rf ${SORTED_READS}
+ rm -rf mouse_chrM.bam.reads.sorted.adam
${ADAM} transformAlignments -sort_by_reference_position ${READS} ${SORTED_READS}
+ ./bin/adam-submit transformAlignments -sort_by_reference_position mouse_chrM.bam.reads.adam mouse_chrM.bam.reads.sorted.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.6-bin-without-hadoop-scala-2.12/bin/spark-submit
20/09/09 12:28:15 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/09/09 12:28:15 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/09/09 12:28:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/09/09 12:28:16 INFO cli.ADAMMain: ADAM invoked with args: "transformAlignments" "-sort_by_reference_position" "mouse_chrM.bam.reads.adam" "mouse_chrM.bam.reads.sorted.adam"
20/09/09 12:28:16 INFO spark.SparkContext: Running Spark version 2.4.6
20/09/09 12:28:16 INFO spark.SparkContext: Submitted application: transformAlignments
20/09/09 12:28:16 INFO spark.SecurityManager: Changing view acls to: jenkins
20/09/09 12:28:16 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/09/09 12:28:16 INFO spark.SecurityManager: Changing view acls groups to: 
20/09/09 12:28:16 INFO spark.SecurityManager: Changing modify acls groups to: 
20/09/09 12:28:16 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/09/09 12:28:16 INFO util.Utils: Successfully started service 'sparkDriver' on port 40321.
20/09/09 12:28:16 INFO spark.SparkEnv: Registering MapOutputTracker
20/09/09 12:28:16 INFO spark.SparkEnv: Registering BlockManagerMaster
20/09/09 12:28:16 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/09/09 12:28:16 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/09/09 12:28:16 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-43a0d814-af01-44f2-abd6-2e7b6c51f628
20/09/09 12:28:17 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/09/09 12:28:17 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/09/09 12:28:17 INFO util.log: Logging initialized @2758ms
20/09/09 12:28:17 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/09/09 12:28:17 INFO server.Server: Started @2850ms
20/09/09 12:28:17 INFO server.AbstractConnector: Started ServerConnector@33aa93c{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:28:17 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@271f18d3{/jobs,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/jobs/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b65e559{/jobs/job,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@74a9c4b0{/jobs/job/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/stages,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c05a54d{/stages/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65ef722a{/stages/stage,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10567255{/stages/stage/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/stages/pool,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c4ee95c{/stages/pool/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79c4715d{/storage,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5aa360ea{/storage/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6548bb7d{/storage/rdd,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e27ba81{/storage/rdd/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54336c81{/environment,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1556f2dd{/environment/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@35e52059{/executors,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62577d6{/executors/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49bd54f7{/executors/threadDump,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b5f8707{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@772485dd{/static,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59221b97{/,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ac4944a{/api,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ee39da0{/jobs/job/kill,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5d332969{/stages/stage/kill,null,AVAILABLE,@Spark}
20/09/09 12:28:17 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/09/09 12:28:17 INFO spark.SparkContext: Added JAR file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar at spark://192.168.10.31:40321/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679697342
20/09/09 12:28:17 INFO executor.Executor: Starting executor ID driver on host localhost
20/09/09 12:28:17 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42945.
20/09/09 12:28:17 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:42945
20/09/09 12:28:17 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/09/09 12:28:17 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 42945, None)
20/09/09 12:28:17 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:42945 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 42945, None)
20/09/09 12:28:17 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 42945, None)
20/09/09 12:28:17 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 42945, None)
20/09/09 12:28:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4bb8855f{/metrics/json,null,AVAILABLE,@Spark}
20/09/09 12:28:18 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
20/09/09 12:28:19 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
20/09/09 12:28:20 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 315.0 KB, free 366.0 MB)
20/09/09 12:28:20 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.1 KB, free 366.0 MB)
20/09/09 12:28:20 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:42945 (size: 28.1 KB, free: 366.3 MB)
20/09/09 12:28:20 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1797
20/09/09 12:28:20 INFO cli.TransformAlignments: Sorting alignments by reference position, with references ordered by name
20/09/09 12:28:20 INFO read.RDDBoundAlignmentDataset: Sorting alignments by reference position
20/09/09 12:28:20 INFO input.FileInputFormat: Total input paths to process : 1
20/09/09 12:28:20 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
20/09/09 12:28:20 INFO read.RDDBoundAlignmentDataset: Saving data in ADAM format
20/09/09 12:28:20 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:20 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
20/09/09 12:28:20 INFO scheduler.DAGScheduler: Registering RDD 2 (sortBy at AlignmentDataset.scala:1004) as input to shuffle 0
20/09/09 12:28:20 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
20/09/09 12:28:20 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
20/09/09 12:28:20 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
20/09/09 12:28:20 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
20/09/09 12:28:20 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentDataset.scala:1004), which has no missing parents
20/09/09 12:28:20 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.9 KB, free 366.0 MB)
20/09/09 12:28:21 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.4 KB, free 366.0 MB)
20/09/09 12:28:21 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:42945 (size: 3.4 KB, free: 366.3 MB)
20/09/09 12:28:21 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/09/09 12:28:21 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentDataset.scala:1004) (first 15 tasks are for partitions Vector(0))
20/09/09 12:28:21 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/09/09 12:28:21 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7480 bytes)
20/09/09 12:28:21 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/09/09 12:28:21 INFO executor.Executor: Fetching spark://192.168.10.31:40321/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679697342
20/09/09 12:28:21 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:40321 after 38 ms (0 ms spent in bootstraps)
20/09/09 12:28:21 INFO util.Utils: Fetching spark://192.168.10.31:40321/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to /tmp/spark-62e85345-0b2d-455c-ada0-f907f46ef739/userFiles-64780499-be62-4f6f-a0e2-71400789897a/fetchFileTemp1486303399237492066.tmp
20/09/09 12:28:21 INFO executor.Executor: Adding file:/tmp/spark-62e85345-0b2d-455c-ada0-f907f46ef739/userFiles-64780499-be62-4f6f-a0e2-71400789897a/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to class loader
20/09/09 12:28:21 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
20/09/09 12:28:21 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
20/09/09 12:28:21 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
20/09/09 12:28:21 INFO compress.CodecPool: Got brand-new decompressor [.gz]
20/09/09 12:28:21 INFO hadoop.InternalParquetRecordReader: block read in memory in 40 ms. row count = 163064
20/09/09 12:28:25 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 962 bytes result sent to driver
20/09/09 12:28:25 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 4077 ms on localhost (executor driver) (1/1)
20/09/09 12:28:25 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/09/09 12:28:25 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (sortBy at AlignmentDataset.scala:1004) finished in 4.227 s
20/09/09 12:28:25 INFO scheduler.DAGScheduler: looking for newly runnable stages
20/09/09 12:28:25 INFO scheduler.DAGScheduler: running: Set()
20/09/09 12:28:25 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
20/09/09 12:28:25 INFO scheduler.DAGScheduler: failed: Set()
20/09/09 12:28:25 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3805), which has no missing parents
20/09/09 12:28:25 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 86.9 KB, free 365.9 MB)
20/09/09 12:28:25 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 32.4 KB, free 365.8 MB)
20/09/09 12:28:25 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:42945 (size: 32.4 KB, free: 366.2 MB)
20/09/09 12:28:25 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1163
20/09/09 12:28:25 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3805) (first 15 tasks are for partitions Vector(0))
20/09/09 12:28:25 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
20/09/09 12:28:25 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7141 bytes)
20/09/09 12:28:25 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
20/09/09 12:28:25 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
20/09/09 12:28:25 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 6 ms
20/09/09 12:28:26 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:26 INFO codec.CodecConfig: Compression: GZIP
20/09/09 12:28:26 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Dictionary is on
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Validation is off
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
20/09/09 12:28:26 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
20/09/09 12:28:26 INFO compress.CodecPool: Got brand-new compressor [.gz]
20/09/09 12:28:30 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16004474
20/09/09 12:28:30 INFO output.FileOutputCommitter: Saved output of task 'attempt_20200909122820_0005_r_000000_0' to file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.sorted.adam/_temporary/0/task_20200909122820_0005_r_000000
20/09/09 12:28:30 INFO mapred.SparkHadoopMapRedUtil: attempt_20200909122820_0005_r_000000_0: Committed
20/09/09 12:28:30 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1243 bytes result sent to driver
20/09/09 12:28:30 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 5636 ms on localhost (executor driver) (1/1)
20/09/09 12:28:30 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
20/09/09 12:28:30 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 5.697 s
20/09/09 12:28:30 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 10.034002 s
20/09/09 12:28:30 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
20/09/09 12:28:30 INFO io.SparkHadoopWriter: Job job_20200909122820_0005 committed.
20/09/09 12:28:31 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/09/09 12:28:31 INFO server.AbstractConnector: Stopped Spark@33aa93c{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:28:31 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/09/09 12:28:31 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/09/09 12:28:31 INFO memory.MemoryStore: MemoryStore cleared
20/09/09 12:28:31 INFO storage.BlockManager: BlockManager stopped
20/09/09 12:28:31 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/09/09 12:28:31 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/09/09 12:28:31 INFO spark.SparkContext: Successfully stopped SparkContext
20/09/09 12:28:31 INFO util.ShutdownHookManager: Shutdown hook called
20/09/09 12:28:31 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-62e85345-0b2d-455c-ada0-f907f46ef739
20/09/09 12:28:31 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-dc9ef5e3-5c17-45e1-bbdf-a2d94b516b1e
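
# Before re-pairing, the converted reads could also be inspected with the CLI
# commands bundled in the assembly jar (FlagStat and PrintADAM appear in the
# overlapping-class list earlier in this log); the invocation below is a
# sketch, not exercised by this run:
#
#   ${ADAM} flagstat mouse_chrM.bam.reads.adam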

# convert the reads to fragments to re-pair the reads
echo "Converting read file to fragments"
+ echo 'Converting read file to fragments'
Converting read file to fragments
rm -rf ${FRAGMENTS}
+ rm -rf mouse_chrM.bam.fragments.adam
${ADAM} transformFragments -load_as_alignments ${READS} ${FRAGMENTS}
+ ./bin/adam-submit transformFragments -load_as_alignments mouse_chrM.bam.reads.adam mouse_chrM.bam.fragments.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.6-bin-without-hadoop-scala-2.12/bin/spark-submit
20/09/09 12:28:32 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/09/09 12:28:32 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/09/09 12:28:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/09/09 12:28:33 INFO cli.ADAMMain: ADAM invoked with args: "transformFragments" "-load_as_alignments" "mouse_chrM.bam.reads.adam" "mouse_chrM.bam.fragments.adam"
20/09/09 12:28:33 INFO spark.SparkContext: Running Spark version 2.4.6
20/09/09 12:28:33 INFO spark.SparkContext: Submitted application: transformFragments
20/09/09 12:28:33 INFO spark.SecurityManager: Changing view acls to: jenkins
20/09/09 12:28:33 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/09/09 12:28:33 INFO spark.SecurityManager: Changing view acls groups to: 
20/09/09 12:28:33 INFO spark.SecurityManager: Changing modify acls groups to: 
20/09/09 12:28:33 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/09/09 12:28:34 INFO util.Utils: Successfully started service 'sparkDriver' on port 46683.
20/09/09 12:28:34 INFO spark.SparkEnv: Registering MapOutputTracker
20/09/09 12:28:34 INFO spark.SparkEnv: Registering BlockManagerMaster
20/09/09 12:28:34 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/09/09 12:28:34 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/09/09 12:28:34 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-c36fbc17-f744-4db5-9bd0-2a631976bb7b
20/09/09 12:28:34 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/09/09 12:28:34 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/09/09 12:28:34 INFO util.log: Logging initialized @2806ms
20/09/09 12:28:34 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/09/09 12:28:34 INFO server.Server: Started @2895ms
20/09/09 12:28:34 INFO server.AbstractConnector: Started ServerConnector@7561db12{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:28:34 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@11963225{/jobs,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25243bc1{/jobs/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e287667{/jobs/job,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4201a617{/jobs/job/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@467f77a5{/stages,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/stages/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@420bc288{/stages/stage,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/stages/stage/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a34b7b8{/stages/pool,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58cd06cb{/stages/pool/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3be8821f{/storage,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/storage/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b65e559{/storage/rdd,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/storage/rdd/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@74a9c4b0{/environment,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/environment/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c05a54d{/executors,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65ef722a{/executors/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fd9b663{/executors/threadDump,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@214894fc{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10567255{/static,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51bde877{/,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@60b85ba1{/api,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@d71adc2{/jobs/job/kill,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3add81c4{/stages/stage/kill,null,AVAILABLE,@Spark}
20/09/09 12:28:34 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/09/09 12:28:34 INFO spark.SparkContext: Added JAR file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar at spark://192.168.10.31:46683/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679714551
20/09/09 12:28:34 INFO executor.Executor: Starting executor ID driver on host localhost
20/09/09 12:28:34 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33801.
20/09/09 12:28:34 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:33801
20/09/09 12:28:34 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/09/09 12:28:34 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 33801, None)
20/09/09 12:28:34 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:33801 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 33801, None)
20/09/09 12:28:34 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 33801, None)
20/09/09 12:28:34 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 33801, None)
20/09/09 12:28:34 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@52fc5eb1{/metrics/json,null,AVAILABLE,@Spark}
20/09/09 12:28:35 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
20/09/09 12:28:36 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
20/09/09 12:28:37 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 315.0 KB, free 366.0 MB)
20/09/09 12:28:37 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.1 KB, free 366.0 MB)
20/09/09 12:28:37 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:33801 (size: 28.1 KB, free: 366.3 MB)
20/09/09 12:28:37 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1797
20/09/09 12:28:38 INFO input.FileInputFormat: Total input paths to process : 1
20/09/09 12:28:38 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
20/09/09 12:28:38 INFO fragment.RDDBoundFragmentDataset: Saving data in ADAM format
20/09/09 12:28:38 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:38 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
20/09/09 12:28:38 INFO scheduler.DAGScheduler: Registering RDD 2 (groupBy at SingleReadBucket.scala:97) as input to shuffle 0
20/09/09 12:28:38 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
20/09/09 12:28:38 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
20/09/09 12:28:38 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
20/09/09 12:28:38 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
20/09/09 12:28:38 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97), which has no missing parents
20/09/09 12:28:38 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 6.4 KB, free 366.0 MB)
20/09/09 12:28:38 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.5 KB, free 366.0 MB)
20/09/09 12:28:38 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:33801 (size: 3.5 KB, free: 366.3 MB)
20/09/09 12:28:38 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/09/09 12:28:38 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97) (first 15 tasks are for partitions Vector(0))
20/09/09 12:28:38 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/09/09 12:28:38 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7480 bytes)
20/09/09 12:28:38 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/09/09 12:28:38 INFO executor.Executor: Fetching spark://192.168.10.31:46683/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679714551
20/09/09 12:28:38 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:46683 after 39 ms (0 ms spent in bootstraps)
20/09/09 12:28:38 INFO util.Utils: Fetching spark://192.168.10.31:46683/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to /tmp/spark-e188255b-a9bb-44d6-97b8-010f837d36a0/userFiles-4953cf0d-0c81-4d29-8090-c67fac63ef19/fetchFileTemp3156617243385889465.tmp
20/09/09 12:28:38 INFO executor.Executor: Adding file:/tmp/spark-e188255b-a9bb-44d6-97b8-010f837d36a0/userFiles-4953cf0d-0c81-4d29-8090-c67fac63ef19/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to class loader
20/09/09 12:28:39 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
20/09/09 12:28:39 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
20/09/09 12:28:39 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
20/09/09 12:28:39 INFO compress.CodecPool: Got brand-new decompressor [.gz]
20/09/09 12:28:39 INFO hadoop.InternalParquetRecordReader: block read in memory in 40 ms. row count = 163064
20/09/09 12:28:42 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 962 bytes result sent to driver
20/09/09 12:28:42 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 4030 ms on localhost (executor driver) (1/1)
20/09/09 12:28:42 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/09/09 12:28:42 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (groupBy at SingleReadBucket.scala:97) finished in 4.146 s
20/09/09 12:28:42 INFO scheduler.DAGScheduler: looking for newly runnable stages
20/09/09 12:28:42 INFO scheduler.DAGScheduler: running: Set()
20/09/09 12:28:42 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
20/09/09 12:28:42 INFO scheduler.DAGScheduler: failed: Set()
20/09/09 12:28:42 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3805), which has no missing parents
20/09/09 12:28:42 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 90.2 KB, free 365.9 MB)
20/09/09 12:28:42 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 33.5 KB, free 365.8 MB)
20/09/09 12:28:42 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:33801 (size: 33.5 KB, free: 366.2 MB)
20/09/09 12:28:42 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1163
20/09/09 12:28:42 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3805) (first 15 tasks are for partitions Vector(0))
20/09/09 12:28:42 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
20/09/09 12:28:42 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7141 bytes)
20/09/09 12:28:42 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
20/09/09 12:28:42 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
20/09/09 12:28:42 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 6 ms
20/09/09 12:28:44 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:44 INFO codec.CodecConfig: Compression: GZIP
20/09/09 12:28:44 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Dictionary is on
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Validation is off
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
20/09/09 12:28:44 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
20/09/09 12:28:44 INFO compress.CodecPool: Got brand-new compressor [.gz]
20/09/09 12:28:45 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.10.31:33801 in memory (size: 3.5 KB, free: 366.2 MB)
20/09/09 12:28:50 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 21417928
20/09/09 12:28:51 INFO output.FileOutputCommitter: Saved output of task 'attempt_20200909122838_0006_r_000000_0' to file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.fragments.adam/_temporary/0/task_20200909122838_0006_r_000000
20/09/09 12:28:51 INFO mapred.SparkHadoopMapRedUtil: attempt_20200909122838_0006_r_000000_0: Committed
20/09/09 12:28:51 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1200 bytes result sent to driver
20/09/09 12:28:51 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 8649 ms on localhost (executor driver) (1/1)
20/09/09 12:28:51 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
20/09/09 12:28:51 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 8.707 s
20/09/09 12:28:51 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 12.945453 s
20/09/09 12:28:51 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
20/09/09 12:28:51 INFO io.SparkHadoopWriter: Job job_20200909122838_0006 committed.
20/09/09 12:28:51 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/09/09 12:28:51 INFO server.AbstractConnector: Stopped Spark@7561db12{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:28:51 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/09/09 12:28:51 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/09/09 12:28:51 INFO memory.MemoryStore: MemoryStore cleared
20/09/09 12:28:51 INFO storage.BlockManager: BlockManager stopped
20/09/09 12:28:51 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/09/09 12:28:51 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/09/09 12:28:51 INFO spark.SparkContext: Successfully stopped SparkContext
20/09/09 12:28:51 INFO util.ShutdownHookManager: Shutdown hook called
20/09/09 12:28:51 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-cd0f4a86-7155-4998-9acc-2084ed627f7d
20/09/09 12:28:51 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-e188255b-a9bb-44d6-97b8-010f837d36a0
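
The transformFragments run above re-pairs the reads by read name — the groupBy at SingleReadBucket.scala:97 in the log is the bucketing step — and writes the result out as a fragment dataset. A minimal sketch of the same conversion through the Scala API, again assuming an active SparkContext `sc` and the 0.33-era toFragments/saveAsParquet methods:

    import org.bdgenomics.adam.rdd.ADAMContext._

    // Group alignments that share a read name into fragments, then save.
    val reads = sc.loadAlignments("mouse_chrM.bam.reads.adam")
    reads.toFragments().saveAsParquet("mouse_chrM.bam.fragments.adam")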

# test that printing works
echo "Printing reads and fragments"
+ echo 'Printing reads and fragments'
Printing reads and fragments
${ADAM} print ${READS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.reads.adam
${ADAM} print ${FRAGMENTS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.fragments.adam
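
Both print invocations discard stdout and stderr, so this step only verifies that the command exits cleanly rather than checking its output. An API-level analogue of the same smoke test — a sketch assuming an adam-shell session with SparkContext `sc` — would be:

    import org.bdgenomics.adam.rdd.ADAMContext._

    // Load a few records and print them; any failure to deserialize
    // the Parquet data would surface here.
    sc.loadAlignments("mouse_chrM.bam.reads.adam").rdd.take(5).foreach(println)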

# run flagstat to verify that flagstat runs OK
echo "Printing read statistics"
+ echo 'Printing read statistics'
Printing read statistics
${ADAM} flagstat ${READS}
+ ./bin/adam-submit flagstat mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.6-bin-without-hadoop-scala-2.12/bin/spark-submit
20/09/09 12:29:11 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/09/09 12:29:11 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/09/09 12:29:11 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/09/09 12:29:11 INFO cli.ADAMMain: ADAM invoked with args: "flagstat" "mouse_chrM.bam.reads.adam"
20/09/09 12:29:11 INFO spark.SparkContext: Running Spark version 2.4.6
20/09/09 12:29:11 INFO spark.SparkContext: Submitted application: flagstat
20/09/09 12:29:11 INFO spark.SecurityManager: Changing view acls to: jenkins
20/09/09 12:29:11 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/09/09 12:29:11 INFO spark.SecurityManager: Changing view acls groups to: 
20/09/09 12:29:11 INFO spark.SecurityManager: Changing modify acls groups to: 
20/09/09 12:29:11 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/09/09 12:29:12 INFO util.Utils: Successfully started service 'sparkDriver' on port 33717.
20/09/09 12:29:12 INFO spark.SparkEnv: Registering MapOutputTracker
20/09/09 12:29:12 INFO spark.SparkEnv: Registering BlockManagerMaster
20/09/09 12:29:12 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/09/09 12:29:12 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/09/09 12:29:12 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-4ae4c616-0c91-40c8-a140-10c04bf0c560
20/09/09 12:29:12 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/09/09 12:29:12 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/09/09 12:29:12 INFO util.log: Logging initialized @2703ms
20/09/09 12:29:12 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/09/09 12:29:12 INFO server.Server: Started @2792ms
20/09/09 12:29:12 INFO server.AbstractConnector: Started ServerConnector@7674a051{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:29:12 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@622ef26a{/jobs,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51abf713{/jobs/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@eadb475{/jobs/job,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@315df4bb{/jobs/job/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3fc08eec{/stages,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5cad8b7d{/stages/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b02e036{/stages/stage,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2e6ee0bc{/stages/stage/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4201a617{/stages/pool,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@467f77a5{/stages/pool/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/storage,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@420bc288{/storage/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df5f5c0{/storage/rdd,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@308a6984{/storage/rdd/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/environment,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a34b7b8{/environment/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58cd06cb{/executors,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3be8821f{/executors/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/executors/threadDump,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b65e559{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/static,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b5f8707{/,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@772485dd{/api,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d829787{/jobs/job/kill,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@71652c98{/stages/stage/kill,null,AVAILABLE,@Spark}
20/09/09 12:29:12 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/09/09 12:29:12 INFO spark.SparkContext: Added JAR file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar at spark://192.168.10.31:33717/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679752759
20/09/09 12:29:12 INFO executor.Executor: Starting executor ID driver on host localhost
20/09/09 12:29:12 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45781.
20/09/09 12:29:12 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:45781
20/09/09 12:29:12 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/09/09 12:29:12 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 45781, None)
20/09/09 12:29:12 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:45781 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 45781, None)
20/09/09 12:29:12 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 45781, None)
20/09/09 12:29:12 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 45781, None)
20/09/09 12:29:13 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79a1728c{/metrics/json,null,AVAILABLE,@Spark}
20/09/09 12:29:13 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
20/09/09 12:29:13 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
20/09/09 12:29:13 INFO rdd.ADAMContext: Using the specified projection schema
20/09/09 12:29:14 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 322.5 KB, free 366.0 MB)
20/09/09 12:29:14 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 29.1 KB, free 366.0 MB)
20/09/09 12:29:14 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:45781 (size: 29.1 KB, free: 366.3 MB)
20/09/09 12:29:14 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1797
20/09/09 12:29:16 INFO input.FileInputFormat: Total input paths to process : 1
20/09/09 12:29:16 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
20/09/09 12:29:16 INFO spark.SparkContext: Starting job: aggregate at FlagStat.scala:115
20/09/09 12:29:16 INFO scheduler.DAGScheduler: Got job 0 (aggregate at FlagStat.scala:115) with 1 output partitions
20/09/09 12:29:16 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (aggregate at FlagStat.scala:115)
20/09/09 12:29:16 INFO scheduler.DAGScheduler: Parents of final stage: List()
20/09/09 12:29:16 INFO scheduler.DAGScheduler: Missing parents: List()
20/09/09 12:29:16 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at map at FlagStat.scala:96), which has no missing parents
20/09/09 12:29:16 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.3 KB, free 366.0 MB)
20/09/09 12:29:16 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.9 KB, free 365.9 MB)
20/09/09 12:29:16 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:45781 (size: 2.9 KB, free: 366.3 MB)
20/09/09 12:29:16 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/09/09 12:29:16 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at map at FlagStat.scala:96) (first 15 tasks are for partitions Vector(0))
20/09/09 12:29:16 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/09/09 12:29:16 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7491 bytes)
20/09/09 12:29:16 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/09/09 12:29:16 INFO executor.Executor: Fetching spark://192.168.10.31:33717/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar with timestamp 1599679752759
20/09/09 12:29:16 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:33717 after 38 ms (0 ms spent in bootstraps)
20/09/09 12:29:16 INFO util.Utils: Fetching spark://192.168.10.31:33717/jars/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to /tmp/spark-c5547927-f9af-4de1-8b06-6f9e174ee0ed/userFiles-76f96163-fecb-44a5-9390-0eadef401f92/fetchFileTemp5925805617320519724.tmp
20/09/09 12:29:16 INFO executor.Executor: Adding file:/tmp/spark-c5547927-f9af-4de1-8b06-6f9e174ee0ed/userFiles-76f96163-fecb-44a5-9390-0eadef401f92/adam-assembly-spark2_2.12-0.33.0-SNAPSHOT.jar to class loader
20/09/09 12:29:17 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
20/09/09 12:29:17 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
20/09/09 12:29:17 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
20/09/09 12:29:17 INFO compress.CodecPool: Got brand-new decompressor [.gz]
20/09/09 12:29:17 INFO hadoop.InternalParquetRecordReader: block read in memory in 24 ms. row count = 163064
20/09/09 12:29:18 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 822 bytes result sent to driver
20/09/09 12:29:18 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1729 ms on localhost (executor driver) (1/1)
20/09/09 12:29:18 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/09/09 12:29:18 INFO scheduler.DAGScheduler: ResultStage 0 (aggregate at FlagStat.scala:115) finished in 1.863 s
20/09/09 12:29:18 INFO scheduler.DAGScheduler: Job 0 finished: aggregate at FlagStat.scala:115, took 1.936719 s
163064 + 0 in total (QC-passed reads + QC-failed reads)
0 + 0 primary duplicates
0 + 0 primary duplicates - both read and mate mapped
0 + 0 primary duplicates - only read mapped
0 + 0 primary duplicates - cross chromosome
0 + 0 secondary duplicates
0 + 0 secondary duplicates - both read and mate mapped
0 + 0 secondary duplicates - only read mapped
0 + 0 secondary duplicates - cross chromosome
160512 + 0 mapped (98.43%:0.00%)
163064 + 0 paired in sequencing
81524 + 0 read1
81540 + 0 read2
154982 + 0 properly paired (95.04%:0.00%)
158044 + 0 with itself and mate mapped
2468 + 0 singletons (1.51%:0.00%)
418 + 0 with mate mapped to a different chr
120 + 0 with mate mapped to a different chr (mapQ>=5)
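
The percentages in the report are ratios against the 163064-read total: 160512 / 163064 ≈ 98.43% mapped, 154982 / 163064 ≈ 95.04% properly paired, 2468 / 163064 ≈ 1.51% singletons. A quick check of that arithmetic (the pct helper below is hypothetical, not part of ADAM):

    // Recompute the flagstat percentages from the counts above.
    val total = 163064.0
    def pct(n: Long): String = f"${100.0 * n / total}%.2f%%"
    println(pct(160512L)) // 98.43% mapped
    println(pct(154982L)) // 95.04% properly paired
    println(pct(2468L))   // 1.51% singletons
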
20/09/09 12:29:18 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/09/09 12:29:18 INFO server.AbstractConnector: Stopped Spark@7674a051{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/09/09 12:29:18 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/09/09 12:29:18 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/09/09 12:29:18 INFO memory.MemoryStore: MemoryStore cleared
20/09/09 12:29:18 INFO storage.BlockManager: BlockManager stopped
20/09/09 12:29:18 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/09/09 12:29:18 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/09/09 12:29:18 INFO spark.SparkContext: Successfully stopped SparkContext
20/09/09 12:29:18 INFO util.ShutdownHookManager: Shutdown hook called
20/09/09 12:29:18 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-3f63c463-767f-4337-8cce-097448ca7596
20/09/09 12:29:18 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-c5547927-f9af-4de1-8b06-6f9e174ee0ed
rm -rf ${ADAM_TMP_DIR}
+ rm -rf /tmp/adamTestqx4EBo7/deleteMePleaseThisIsNoLongerNeeded
popd
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu

pushd ${PROJECT_ROOT}
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu

# move back to Scala 2.12 as default
if [ ${SCALA_VERSION} == 2.11 ];
then
    set +e
    ./scripts/move_to_scala_2.12.sh
    set -e
fi
+ '[' 2.12 == 2.11 ']'
# move back to Spark 3.x as default
if [ ${SPARK_VERSION} == 2.4.6 ];
then
    set +e
    ./scripts/move_to_spark_3.sh
    set -e
fi
+ '[' 2.4.6 == 2.4.6 ']'
+ set +e
+ ./scripts/move_to_spark_3.sh
+ set -e

# test that the source is formatted correctly
./scripts/format-source
+ ./scripts/format-source
+++ dirname ./scripts/format-source
++ cd ./scripts
++ pwd
+ DIR=/home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu/scripts
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu
+ mvn org.scalariform:scalariform-maven-plugin:format license:format
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=1g; support was removed in 8.0
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.12                                                          [pom]
[INFO] ADAM_2.12: Shader workaround                                       [jar]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils                           [jar]
[INFO] ADAM_2.12: Core                                                    [jar]
[INFO] ADAM_2.12: APIs for Java, Python                                   [jar]
[INFO] ADAM_2.12: CLI                                                     [jar]
[INFO] ADAM_2.12: Assembly                                                [jar]
[INFO] 
[INFO] ------------< org.bdgenomics.adam:adam-parent-spark3_2.12 >-------------
[INFO] Building ADAM_2.12 0.33.0-SNAPSHOT                                 [1/7]
[INFO] --------------------------------[ pom ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent-spark3_2.12 ---
[INFO] Modified 2 of 244 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-parent-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-shade-spark3_2.12 >-------------
[INFO] Building ADAM_2.12: Shader workaround 0.33.0-SNAPSHOT              [2/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-shade-spark3_2.12 ---
[INFO] Modified 0 of 0 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-shade-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] ------------< org.bdgenomics.adam:adam-codegen-spark3_2.12 >------------
[INFO] Building ADAM_2.12: Avro-to-Dataset codegen utils 0.33.0-SNAPSHOT  [3/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-codegen-spark3_2.12 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-codegen-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-core-spark3_2.12 >--------------
[INFO] Building ADAM_2.12: Core 0.33.0-SNAPSHOT                           [4/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core-spark3_2.12 ---
[INFO] Modified 0 of 204 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-core-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-apis-spark3_2.12 >--------------
[INFO] Building ADAM_2.12: APIs for Java, Python 0.33.0-SNAPSHOT          [5/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis-spark3_2.12 ---
[INFO] Modified 0 of 5 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-apis-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] --------------< org.bdgenomics.adam:adam-cli-spark3_2.12 >--------------
[INFO] Building ADAM_2.12: CLI 0.33.0-SNAPSHOT                            [6/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli-spark3_2.12 ---
[INFO] Modified 0 of 29 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-cli-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -----------< org.bdgenomics.adam:adam-assembly-spark3_2.12 >------------
[INFO] Building ADAM_2.12: Assembly 0.33.0-SNAPSHOT                       [7/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-assembly-spark3_2.12 ---
[INFO] Modified 0 of 1 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-assembly-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for ADAM_2.12 0.33.0-SNAPSHOT:
[INFO] 
[INFO] ADAM_2.12 .......................................... SUCCESS [  6.233 s]
[INFO] ADAM_2.12: Shader workaround ....................... SUCCESS [  0.032 s]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils ........... SUCCESS [  0.047 s]
[INFO] ADAM_2.12: Core .................................... SUCCESS [  3.304 s]
[INFO] ADAM_2.12: APIs for Java, Python ................... SUCCESS [  0.129 s]
[INFO] ADAM_2.12: CLI ..................................... SUCCESS [  0.186 s]
[INFO] ADAM_2.12: Assembly ................................ SUCCESS [  0.013 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  10.395 s
[INFO] Finished at: 2020-09-09T12:29:30-07:00
[INFO] ------------------------------------------------------------------------
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu
if test -n "$(git status --porcelain)"
then
    echo "Please run './scripts/format-source'"
    exit 1
fi
git status --porcelain
++ git status --porcelain
+ test -n ''
popd
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALA_VERSION/2.12/SPARK_VERSION/2.4.6/label/ubuntu

echo
+ echo

echo "All the tests passed"
+ echo 'All the tests passed'
All the tests passed
echo
+ echo

Recording test results
Publishing Scoverage XML and HTML report...
Setting commit status on GitHub for https://github.com/bigdatagenomics/adam/commit/a30a822f2f1fae4c474b307d1f73ba8b63ce8f37
Finished: SUCCESS