Success: Console Output

[Skipping 2,643 KB of log output]
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantContextDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToCoverageDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFragmentDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/python/DataFrameConversionWrapper.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariantsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/Reads2CoverageArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountContigKmers.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountContigKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/MergeShardsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountContigKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/Reads2Coverage$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeaturesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/repo/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/repo/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT-sources.jar longer than 100 characters.
[INFO] Building tar: /tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/adam-distribution/target/adam-distribution-spark2_2.11-0.27.0-SNAPSHOT-bin.tar.bz2
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/ longer than 100 characters.
[WARNING] Resulting tar file can only be processed successfully by GNU compatible tar commands
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/AvroSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/ADAMKryoRegistrator.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/WritableSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/InputStreamWithDecoder.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWatermanConstantGapScoring.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWaterman.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/VariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/AlignmentRecordConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/VariantContextConverter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/DefaultHeaderLines$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rich/RichAlignmentRecord$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rich/RichAlignmentRecord.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/IndexedFastaFile.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ParquetFileTraversable.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/GenomeFileReader$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/TwoBitFileSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/SequenceDictionaryReader$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMapSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ParquetLogger$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMap.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/FileExtensions$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMap$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/AttributeUtils$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/TextRddWriter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantCallingAnnotations.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantAnnotation.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantAnnotation$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/NucleotideContigFragment$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/AlignmentRecord.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/TranscriptEffect$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/NucleotideContigFragment.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/AlignmentRecord$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/ProcessingStep$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/TranscriptEffect.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantCallingAnnotations$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/ReferenceField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/SequenceField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/SampleField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FieldValue.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/VariantCallingAnnotationsField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/VariantAnnotationField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/ReadField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/DbxrefField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FragmentField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/VariantField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/GenotypeField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/Filter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/AlignmentRecordField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FieldEnumeration.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/Projection$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/OntologyTermField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FieldEnumeration$SchemaVal.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/ReadGroupField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/NucleotideContigFragmentField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/TranscriptEffectField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/SliceField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FeatureField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InFormatterCompanion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SAMInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMBAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/DatasetBoundAlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/RDDBoundAlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/QualityScoreBin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMSAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMCRAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMSAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMSAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/BAMInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMBAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AlignmentRecordDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ReferencePositionPairSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMBAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/FASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMCRAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMCRAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ParquetUnboundAlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/QualityScoreBin$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SingleReadBucketSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AnySAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AnySAMInFormatterCompanion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/FASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/BAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMBAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AnySAMOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMCRAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/IncorrectMDTagException.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMSAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/AvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicPositionPartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/ADAMSaveAnyArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/AvroReadGroupGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicDatasetWithLineage.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/contig/NucleotideContigFragmentDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/contig/ParquetUnboundNucleotideContigFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/contig/RDDBoundNucleotideContigFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/contig/DatasetBoundNucleotideContigFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/contig/NucleotideContigFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicRegionPartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicPositionPartitioner$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RightOuterShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicRegionPartitioner$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundVariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundVariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/GenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ParquetUnboundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ADAMVCFOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantContextDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ParquetUnboundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/GenotypeDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/VictimlessSortedIntervalPartitionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/LeftOuterShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RightOuterTreeRegionJoinAndGroupByRight.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RDDBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/DatasetBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/SortedIntervalPartitionJoinWithVictims.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/FullOuterShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab5InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/RDDBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab6InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/FragmentDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab5InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab6InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/FragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/ParquetUnboundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/InterleavedFASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/DatasetBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/InterleavedFASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/MultisampleGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenericConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerTreeRegionJoinAndGroupByRight.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/ReferencePartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/MultisampleAvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/ShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RightOuterTreeRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/ParquetUnboundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/DatasetBoundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/CoverageDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/RDDBoundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/ParquetUnboundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/CoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/FeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/RDDBoundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3OutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/DatasetBoundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/FeatureDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicBroadcast.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/LeftOuterShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerTreeRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/DatasetBoundGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReadGroupDictionary$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/SequenceDictionary$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/PositionOrdering$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/SequenceDictionary.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReferenceRegion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/VariantContext.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/TagType$$TypeVal.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/OptionalPositionOrdering$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/OptionalReferenceOrdering.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/OptionalRegionOrdering$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReferencePosition.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReferencePosition$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/SequenceRecord.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReferencePositionSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReferenceRegionSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/RegionOrdering$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReferenceOrdering.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/VariantContextSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/VariantContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReferenceRegion$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/SequenceRecord$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/models/ReadGroupDictionary.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/instrumentation/package.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/instrumentation/Timers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/io/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/io/FastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/io/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/io/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/apache/parquet/avro/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/apache/parquet/avro/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/javadocs/org/apache/parquet/avro/class-use/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFeatureConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToContigDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToAlignmentRecordDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToGenotypeDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFeatureDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantContextDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToCoverageDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToContigsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToContigsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFragmentDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ContigsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/python/DataFrameConversionWrapper.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariantsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/Reads2CoverageArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountContigKmers.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountContigKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/MergeShardsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountContigKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/Reads2Coverage$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeaturesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/repo/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT-javadoc.jar longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/repo/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT-sources.jar longer than 100 characters.
[INFO] Building zip: /tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/adam-distribution/target/adam-distribution-spark2_2.11-0.27.0-SNAPSHOT-bin.zip
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.11 .......................................... SUCCESS [  9.274 s]
[INFO] ADAM_2.11: Shader workaround ....................... SUCCESS [  4.956 s]
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils ........... SUCCESS [  4.244 s]
[INFO] ADAM_2.11: Core .................................... SUCCESS [01:15 min]
[INFO] ADAM_2.11: APIs for Java, Python ................... SUCCESS [  7.199 s]
[INFO] ADAM_2.11: CLI ..................................... SUCCESS [ 10.557 s]
[INFO] ADAM_2.11: Assembly ................................ SUCCESS [ 19.478 s]
[INFO] ADAM_2.11: Python APIs ............................. SUCCESS [01:34 min]
[INFO] ADAM_2.11: R APIs .................................. SUCCESS [01:02 min]
[INFO] ADAM_2.11: Distribution ............................ SUCCESS [ 35.235 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:23 min
[INFO] Finished at: 2019-05-13T17:34:27-07:00
[INFO] Final Memory: 68M/1496M
[INFO] ------------------------------------------------------------------------
+ grep bdgenomics.adam
+ grep egg
+ tar tzvf adam-distribution/target/adam-distribution-spark2_2.11-0.27.0-SNAPSHOT-bin.tar.gz
drwxrwxr-x jenkins/jenkins        0 2019-05-13 17:31 adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/r/bdgenomics.adam.egg-info/
-rw-r--r-- jenkins/jenkins 39294771 2019-05-13 17:31 adam-distribution-spark2_2.11-0.27.0-SNAPSHOT/repo/bdgenomics.adam-0.26.0a0-py3.6.egg
+ ./bin/pyadam
Using PYSPARK=/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/bin/pyspark
2019-05-13 17:34:30 WARN  Utils:66 - Your hostname, amp-jenkins-staging-worker-01 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface eno1)
2019-05-13 17:34:30 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-05-13 17:34:31 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2019-05-13 17:34:38 WARN  Utils:66 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.

[Stage 0:>                                                          (0 + 1) / 1]
+ source deactivate
#!/bin/bash

# Determine the directory containing this script
if [[ -n $BASH_VERSION ]]; then
    _SCRIPT_LOCATION=${BASH_SOURCE[0]}
    _SHELL="bash"
elif [[ -n $ZSH_VERSION ]]; then
    _SCRIPT_LOCATION=${funcstack[1]}
    _SHELL="zsh"
else
    echo "Only bash and zsh are supported"
    return 1
fi
++ [[ -n 4.3.48(1)-release ]]
++ _SCRIPT_LOCATION=/home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80/bin/deactivate
++ _SHELL=bash
_CONDA_DIR=$(dirname "$_SCRIPT_LOCATION")
dirname "$_SCRIPT_LOCATION"
+++ dirname /home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80/bin/deactivate
++ _CONDA_DIR=/home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80/bin

case "$(uname -s)" in
    CYGWIN*|MINGW*|MSYS*)
        EXT=".exe"
        export MSYS2_ENV_CONV_EXCL=CONDA_PATH
        ;;
    *)
        EXT=""
        ;;
esac
++ case "$(uname -s)" in
uname -s
+++ uname -s
++ EXT=

# shift over all args.  We don't accept any, so it's OK that we ignore them all here.
while [[ $# > 0 ]]
do
    key="$1"
    case $key in
        -h|--help)
            "$_CONDA_DIR/conda" ..deactivate $_SHELL$EXT -h
            if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
                exit 0
            else
                return 0
            fi
            ;;
    esac
    shift # past argument or value
done
++ [[ 0 > 0 ]]

# Ensure that this script is sourced, not executed
# Note that if the script was executed, we're running inside bash!
# Also note that errors are ignored as `activate foo` doesn't generate a bad
# value for $0 which would cause errors.
if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
    (>&2 echo "Error: deactivate must be sourced. Run 'source deactivate'
instead of 'deactivate'.
")
    "$_CONDA_DIR/conda" ..deactivate $_SHELL$EXT -h
    exit 1
fi
++ [[ -n 4.3.48(1)-release ]]
basename "$0" 2> /dev/null
+++ basename /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu/scripts/jenkins-test
++ [[ jenkins-test == \d\e\a\c\t\i\v\a\t\e ]]

if [[ -z "$CONDA_PATH_BACKUP" ]]; then
    if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
        exit 0
    else
        return 0
    fi
fi
++ [[ -z /usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games ]]

if (( $? == 0 )); then
    # Inverse of activation: run deactivate scripts prior to deactivating env
    _CONDA_D="${CONDA_PREFIX}/etc/conda/deactivate.d"
    if [[ -d $_CONDA_D ]]; then
        eval $(find "$_CONDA_D" -iname "*.sh" -exec echo source \'{}\'';' \;)
    fi

#    # get the activation path that would have been provided for this prefix
#    _LAST_ACTIVATE_PATH=$("$_CONDA_DIR/conda" ..activate $_SHELL$EXT "$CONDA_PREFIX")
#
#    # in activate, we replace a placeholder so that conda keeps its place in the PATH order
#    # The activate script sets _CONDA_HOLD here to activate that behavior.
#    #   Otherwise, PATH is simply removed.
#    if [ -n "$_CONDA_HOLD" ]; then
#        export PATH="$($_CONDA_PYTHON2 -c "import re; print(re.sub(r'$_LAST_ACTIVATE_PATH(:?)', r'CONDA_PATH_PLACEHOLDER\1', '$PATH', 1))")"
#    else
#        export PATH="$($_CONDA_PYTHON2 -c "import re; print(re.sub(r'$_LAST_ACTIVATE_PATH(:?)', r'', '$PATH', 1))")"
#    fi
#
#    unset _LAST_ACTIVATE_PATH

    export PATH=$("$_CONDA_DIR/conda" ..deactivate.path $_SHELL$EXT "$CONDA_PREFIX")

    unset CONDA_DEFAULT_ENV
    unset CONDA_PREFIX
    unset CONDA_PATH_BACKUP
    export PS1="$CONDA_PS1_BACKUP"
    unset CONDA_PS1_BACKUP
    unset _CONDA_PYTHON2
else
    unset _CONDA_PYTHON2
    return $?
fi
++ ((  0 == 0  ))
++ _CONDA_D=/home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80/etc/conda/deactivate.d
++ [[ -d /home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80/etc/conda/deactivate.d ]]
"$_CONDA_DIR/conda" ..deactivate.path $_SHELL$EXT "$CONDA_PREFIX"
+++ /home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80/bin/conda ..deactivate.path bash /home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80
++ export PATH=/usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
++ PATH=/usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
++ unset CONDA_DEFAULT_ENV
++ unset CONDA_PREFIX
++ unset CONDA_PATH_BACKUP
++ export PS1=
++ PS1=
++ unset CONDA_PS1_BACKUP
++ unset _CONDA_PYTHON2

if [[ -n $BASH_VERSION ]]; then
    hash -r
elif [[ -n $ZSH_VERSION ]]; then
    rehash
fi
++ [[ -n 4.3.48(1)-release ]]
++ hash -r
+ conda remove -y -n adam-build-0cc65668-cdf7-4945-b18c-214435201a80 --all

Package plan for package removal in environment /home/jenkins/anaconda2/envs/adam-build-0cc65668-cdf7-4945-b18c-214435201a80:

The following packages will be REMOVED:

    ca-certificates: 2019.1.23-0            
    certifi:         2019.3.9-py36_0        
    libedit:         3.1.20181209-hc058e9b_0
    libffi:          3.2.1-hd88cf55_4       
    libgcc-ng:       8.2.0-hdf63c60_1       
    libstdcxx-ng:    8.2.0-hdf63c60_1       
    ncurses:         6.1-he6710b0_1         
    openssl:         1.1.1b-h7b6447c_1      
    pip:             19.1.1-py36_0          
    python:          3.6.8-h0371630_0       
    readline:        7.0-h7b6447c_5         
    setuptools:      41.0.1-py36_0          
    sqlite:          3.28.0-h7b6447c_0      
    tk:              8.6.8-hbc83047_0       
    wheel:           0.33.2-py36_0          
    xz:              5.2.4-h14c3975_4       
    zlib:            1.2.11-h7b6447c_3      

+ cp -r adam-python/target /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu/scripts/../adam-python/
+ pushd adam-python
/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/adam-python /tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu
+ make clean
pip uninstall -y adam
Skipping adam as it is not installed.
rm -rf bdgenomics/*.egg*
rm -rf build/
+ make clean_sdist
rm -rf dist
+ popd
/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu
	
# define filenames
BAM=mouse_chrM.bam
+ BAM=mouse_chrM.bam
READS=${BAM}.reads.adam
+ READS=mouse_chrM.bam.reads.adam
SORTED_READS=${BAM}.reads.sorted.adam
+ SORTED_READS=mouse_chrM.bam.reads.sorted.adam
FRAGMENTS=${BAM}.fragments.adam
+ FRAGMENTS=mouse_chrM.bam.fragments.adam
    
# fetch our input dataset
echo "Fetching BAM file"
+ echo 'Fetching BAM file'
Fetching BAM file
rm -rf ${BAM}
+ rm -rf mouse_chrM.bam
wget -q https://s3.amazonaws.com/bdgenomics-test/${BAM}
+ wget -q https://s3.amazonaws.com/bdgenomics-test/mouse_chrM.bam

# once fetched, convert BAM to ADAM
echo "Converting BAM to ADAM read format"
+ echo 'Converting BAM to ADAM read format'
Converting BAM to ADAM read format
rm -rf ${READS}
+ rm -rf mouse_chrM.bam.reads.adam
${ADAM} transformAlignments ${BAM} ${READS}
+ ./bin/adam-submit transformAlignments mouse_chrM.bam mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/bin/spark-submit
19/05/13 17:34:45 WARN Utils: Your hostname, amp-jenkins-staging-worker-01 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface eno1)
19/05/13 17:34:45 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/13 17:34:46 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/13 17:34:46 INFO SparkContext: Running Spark version 2.4.3
19/05/13 17:34:46 INFO SparkContext: Submitted application: transformAlignments
19/05/13 17:34:46 INFO SecurityManager: Changing view acls to: jenkins
19/05/13 17:34:46 INFO SecurityManager: Changing modify acls to: jenkins
19/05/13 17:34:46 INFO SecurityManager: Changing view acls groups to: 
19/05/13 17:34:46 INFO SecurityManager: Changing modify acls groups to: 
19/05/13 17:34:46 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/05/13 17:34:47 INFO Utils: Successfully started service 'sparkDriver' on port 46645.
19/05/13 17:34:47 INFO SparkEnv: Registering MapOutputTracker
19/05/13 17:34:47 INFO SparkEnv: Registering BlockManagerMaster
19/05/13 17:34:47 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/13 17:34:47 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/13 17:34:47 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-47cfd3a3-6089-4246-949a-79b7bde6bf6d
19/05/13 17:34:47 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
19/05/13 17:34:47 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/13 17:34:47 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/13 17:34:47 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/05/13 17:34:47 INFO SparkContext: Added JAR file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar at spark://192.168.10.31:46645/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794087507
19/05/13 17:34:47 INFO Executor: Starting executor ID driver on host localhost
19/05/13 17:34:47 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44813.
19/05/13 17:34:47 INFO NettyBlockTransferService: Server created on 192.168.10.31:44813
19/05/13 17:34:47 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/13 17:34:47 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 44813, None)
19/05/13 17:34:47 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:44813 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 44813, None)
19/05/13 17:34:47 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 44813, None)
19/05/13 17:34:47 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 44813, None)
19/05/13 17:34:48 INFO ADAMContext: Loading mouse_chrM.bam as BAM/CRAM/SAM and converting to AlignmentRecords.
19/05/13 17:34:48 INFO ADAMContext: Loaded header from file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam
19/05/13 17:34:49 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 284.3 KB, free 366.0 MB)
19/05/13 17:34:49 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 23.1 KB, free 366.0 MB)
19/05/13 17:34:49 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:44813 (size: 23.1 KB, free: 366.3 MB)
19/05/13 17:34:49 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1582
19/05/13 17:34:50 INFO RDDBoundAlignmentRecordDataset: Saving data in ADAM format
19/05/13 17:34:50 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:34:51 INFO FileInputFormat: Total input paths to process : 1
19/05/13 17:34:51 INFO SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
19/05/13 17:34:51 INFO DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
19/05/13 17:34:51 INFO DAGScheduler: Final stage: ResultStage 0 (runJob at SparkHadoopWriter.scala:78)
19/05/13 17:34:51 INFO DAGScheduler: Parents of final stage: List()
19/05/13 17:34:51 INFO DAGScheduler: Missing parents: List()
19/05/13 17:34:51 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3814), which has no missing parents
19/05/13 17:34:51 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 82.0 KB, free 365.9 MB)
19/05/13 17:34:51 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 30.2 KB, free 365.9 MB)
19/05/13 17:34:51 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:44813 (size: 30.2 KB, free: 366.2 MB)
19/05/13 17:34:51 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/05/13 17:34:51 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
19/05/13 17:34:51 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/05/13 17:34:51 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7962 bytes)
19/05/13 17:34:51 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/05/13 17:34:51 INFO Executor: Fetching spark://192.168.10.31:46645/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794087507
19/05/13 17:34:51 INFO TransportClientFactory: Successfully created connection to /192.168.10.31:46645 after 55 ms (0 ms spent in bootstraps)
19/05/13 17:34:51 INFO Utils: Fetching spark://192.168.10.31:46645/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to /tmp/spark-9b9f9a15-9d7b-4c57-937c-542455d095c0/userFiles-b7150faf-cc08-4acb-bcfc-9369cbdbf80f/fetchFileTemp1635909534437033011.tmp
19/05/13 17:34:51 INFO Executor: Adding file:/tmp/spark-9b9f9a15-9d7b-4c57-937c-542455d095c0/userFiles-b7150faf-cc08-4acb-bcfc-9369cbdbf80f/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to class loader
19/05/13 17:34:51 INFO NewHadoopRDD: Input split: file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam:83361792-833134657535
19/05/13 17:34:52 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:34:52 INFO CodecConfig: Compression: GZIP
19/05/13 17:34:52 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:34:52 INFO ParquetOutputFormat: Parquet block size to 134217728
19/05/13 17:34:52 INFO ParquetOutputFormat: Parquet page size to 1048576
19/05/13 17:34:52 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
19/05/13 17:34:52 INFO ParquetOutputFormat: Dictionary is on
19/05/13 17:34:52 INFO ParquetOutputFormat: Validation is off
19/05/13 17:34:52 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
19/05/13 17:34:52 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
19/05/13 17:34:52 INFO ParquetOutputFormat: Page size checking is: estimated
19/05/13 17:34:52 INFO ParquetOutputFormat: Min row count for page size check is: 100
19/05/13 17:34:52 INFO ParquetOutputFormat: Max row count for page size check is: 10000
19/05/13 17:34:52 INFO CodecPool: Got brand-new compressor [.gz]
Ignoring SAM validation error: ERROR: Record 162622, Read name 613F0AAXX100423:3:58:9979:16082, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162624, Read name 613F0AAXX100423:6:13:3141:11793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162625, Read name 613F0AAXX100423:8:39:18592:13552, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162635, Read name 613F1AAXX100423:7:2:13114:10698, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162637, Read name 613F1AAXX100423:6:100:8840:11167, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162639, Read name 613F1AAXX100423:8:15:10944:11181, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162640, Read name 613F1AAXX100423:8:17:5740:10104, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162651, Read name 613F1AAXX100423:1:53:11097:8261, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162654, Read name 613F1AAXX100423:2:112:16779:19612, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162657, Read name 613F0AAXX100423:8:28:7084:17683, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162659, Read name 613F0AAXX100423:8:39:19796:12794, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162662, Read name 613F1AAXX100423:5:116:9339:3264, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162667, Read name 613F0AAXX100423:4:67:2015:3054, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162669, Read name 613F0AAXX100423:7:7:11297:11738, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162674, Read name 613F0AAXX100423:6:59:10490:20829, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162678, Read name 613F1AAXX100423:8:11:17603:4766, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162682, Read name 613F0AAXX100423:5:86:10814:10257, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162683, Read name 613F0AAXX100423:5:117:14178:6111, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162685, Read name 613F0AAXX100423:2:3:13563:6720, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162689, Read name 613F0AAXX100423:7:59:16009:15799, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162696, Read name 613F0AAXX100423:5:31:9663:18252, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162698, Read name 613F1AAXX100423:2:27:12264:14626, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162699, Read name 613F0AAXX100423:1:120:19003:6647, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162702, Read name 613F1AAXX100423:3:37:6972:18407, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162704, Read name 613F1AAXX100423:3:77:6946:3880, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162706, Read name 613F0AAXX100423:7:48:2692:3492, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162708, Read name 613F1AAXX100423:7:80:8790:1648, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162710, Read name 6141AAAXX100423:5:30:15036:17610, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162712, Read name 613F1AAXX100423:8:80:6261:4465, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162713, Read name 6141AAAXX100423:5:74:5542:6195, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162715, Read name 613F1AAXX100423:5:14:14844:13639, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162718, Read name 613F1AAXX100423:7:112:14569:8480, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162725, Read name 613F1AAXX100423:4:56:10160:9879, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162727, Read name 6141AAAXX100423:7:89:12209:9221, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162731, Read name 6141AAAXX100423:6:55:1590:19793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162732, Read name 6141AAAXX100423:7:102:16679:12368, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162734, Read name 613F1AAXX100423:2:7:4909:18472, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162737, Read name 6141AAAXX100423:4:73:6574:10572, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162741, Read name 6141AAAXX100423:1:8:14113:12655, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162743, Read name 6141AAAXX100423:3:40:7990:5056, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162744, Read name 6141AAAXX100423:4:36:15793:3411, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162745, Read name 6141AAAXX100423:8:83:1139:18985, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162746, Read name 6141AAAXX100423:5:7:18196:13562, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162748, Read name 6141AAAXX100423:3:114:5639:7123, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162751, Read name 6141AAAXX100423:7:47:4898:8640, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162753, Read name 6141AAAXX100423:3:64:8064:8165, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162756, Read name 613F1AAXX100423:1:105:14386:1684, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162757, Read name 613F1AAXX100423:6:98:1237:19470, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162761, Read name 613F1AAXX100423:7:106:19658:9261, MAPQ should be 0 for unmapped read.
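The validation errors above all flag the same SAM specification rule: a read marked unmapped (FLAG bit 0x4 set) should carry a MAPQ of 0. A minimal sketch of that check is below; the field layout and function name are illustrative, not htsjdk's actual API.

```python
# Sketch of the SAM validation rule flagged above: an unmapped read
# (FLAG bit 0x4 set) should have MAPQ 0. Record fields are illustrative
# (read_name, flag, mapq) tuples, not htsjdk's API.
UNMAPPED_FLAG = 0x4

def mapq_violations(records):
    """Yield (read_name, mapq) for unmapped reads with a nonzero MAPQ."""
    for name, flag, mapq in records:
        if flag & UNMAPPED_FLAG and mapq != 0:
            yield (name, mapq)

records = [
    ("613F0AAXX100423:6:13:3141:11793", 0x4 | 0x1, 37),  # unmapped, bad MAPQ
    ("mapped_read", 0x0, 37),                            # mapped, fine
    ("clean_unmapped", 0x4, 0),                          # unmapped, MAPQ 0
]
```

Under lenient validation stringency htsjdk logs such records and continues rather than failing, which matches the "Ignoring SAM validation error" behaviour in this run.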
19/05/13 17:35:02 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16043959
19/05/13 17:35:02 INFO FileOutputCommitter: Saved output of task 'attempt_20190513173450_0002_r_000000_0' to file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/_temporary/0/task_20190513173450_0002_r_000000
19/05/13 17:35:02 INFO SparkHadoopMapRedUtil: attempt_20190513173450_0002_r_000000_0: Committed
19/05/13 17:35:02 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 893 bytes result sent to driver
19/05/13 17:35:02 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 11262 ms on localhost (executor driver) (1/1)
19/05/13 17:35:02 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/05/13 17:35:02 INFO DAGScheduler: ResultStage 0 (runJob at SparkHadoopWriter.scala:78) finished in 11.429 s
19/05/13 17:35:02 INFO DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 11.512173 s
19/05/13 17:35:02 INFO ParquetFileReader: Initiating action with parallelism: 5
19/05/13 17:35:02 INFO SparkHadoopWriter: Job job_20190513173450_0002 committed.
19/05/13 17:35:02 INFO TransformAlignments: Overall Duration: 16.32 secs
19/05/13 17:35:02 INFO SparkContext: Invoking stop() from shutdown hook
19/05/13 17:35:02 INFO SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/05/13 17:35:02 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/13 17:35:02 INFO MemoryStore: MemoryStore cleared
19/05/13 17:35:02 INFO BlockManager: BlockManager stopped
19/05/13 17:35:02 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/13 17:35:02 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/13 17:35:02 INFO SparkContext: Successfully stopped SparkContext
19/05/13 17:35:02 INFO ShutdownHookManager: Shutdown hook called
19/05/13 17:35:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-9b9f9a15-9d7b-4c57-937c-542455d095c0
19/05/13 17:35:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-88ed1690-cc5b-4b5b-8576-03558f33e58f

# then, sort the BAM
echo "Converting BAM to ADAM read format with sorting"
+ echo 'Converting BAM to ADAM read format with sorting'
Converting BAM to ADAM read format with sorting
rm -rf ${SORTED_READS}
+ rm -rf mouse_chrM.bam.reads.sorted.adam
${ADAM} transformAlignments -sort_reads ${READS} ${SORTED_READS}
+ ./bin/adam-submit transformAlignments -sort_reads mouse_chrM.bam.reads.adam mouse_chrM.bam.reads.sorted.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/bin/spark-submit
19/05/13 17:35:04 WARN Utils: Your hostname, amp-jenkins-staging-worker-01 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface eno1)
19/05/13 17:35:04 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/13 17:35:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/13 17:35:05 INFO SparkContext: Running Spark version 2.4.3
19/05/13 17:35:05 INFO SparkContext: Submitted application: transformAlignments
19/05/13 17:35:05 INFO SecurityManager: Changing view acls to: jenkins
19/05/13 17:35:05 INFO SecurityManager: Changing modify acls to: jenkins
19/05/13 17:35:05 INFO SecurityManager: Changing view acls groups to: 
19/05/13 17:35:05 INFO SecurityManager: Changing modify acls groups to: 
19/05/13 17:35:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/05/13 17:35:05 INFO Utils: Successfully started service 'sparkDriver' on port 37144.
19/05/13 17:35:05 INFO SparkEnv: Registering MapOutputTracker
19/05/13 17:35:05 INFO SparkEnv: Registering BlockManagerMaster
19/05/13 17:35:05 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/13 17:35:05 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/13 17:35:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0269e983-e436-4187-ba94-5b0dd2e8a0ac
19/05/13 17:35:05 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
19/05/13 17:35:05 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/13 17:35:05 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/13 17:35:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/05/13 17:35:06 INFO SparkContext: Added JAR file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar at spark://192.168.10.31:37144/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794106098
19/05/13 17:35:06 INFO Executor: Starting executor ID driver on host localhost
19/05/13 17:35:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44055.
19/05/13 17:35:06 INFO NettyBlockTransferService: Server created on 192.168.10.31:44055
19/05/13 17:35:06 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/13 17:35:06 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 44055, None)
19/05/13 17:35:06 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:44055 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 44055, None)
19/05/13 17:35:06 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 44055, None)
19/05/13 17:35:06 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 44055, None)
19/05/13 17:35:06 INFO ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of AlignmentRecords.
19/05/13 17:35:08 INFO ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
19/05/13 17:35:09 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 303.7 KB, free 366.0 MB)
19/05/13 17:35:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 27.4 KB, free 366.0 MB)
19/05/13 17:35:09 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:44055 (size: 27.4 KB, free: 366.3 MB)
19/05/13 17:35:09 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1320
19/05/13 17:35:09 INFO TransformAlignments: Sorting reads
19/05/13 17:35:09 INFO RDDBoundAlignmentRecordDataset: Sorting reads by reference index, using SequenceDictionary{
chrM->16299, 0
chr1->194532772, 1
chr2->181993374, 2
chr3->159377569, 3
chr4->155544171, 4
chr5->151694555, 5
chr6->149307077, 6
chr7->142600211, 7
chr8->129558836, 8
chr9->124827824, 9
chr10->130854437, 10
chr11->122260198, 11
chr12->119596406, 12
chr13->120421117, 13
chr14->124162362, 14
chr15->104036045, 15
chr16->98095624, 16
chr17->95052499, 17
chr18->90827538, 18
chr19->61533103, 19
chrX->170682864, 20
chrY->91744698, 21
chr1_GL456210_random->169725, 22
chr1_GL456211_random->241735, 23
chr1_GL456212_random->153618, 24
chr1_GL456213_random->39340, 25
chr1_GL456221_random->206961, 26
chr4_GL456216_random->66673, 27
chr4_GL456350_random->227966, 28
chr4_JH584292_random->14945, 29
chr4_JH584293_random->207968, 30
chr4_JH584294_random->191905, 31
chr4_JH584295_random->1976, 32
chr5_GL456354_random->195993, 33
chr5_JH584296_random->199368, 34
chr5_JH584297_random->205776, 35
chr5_JH584298_random->184189, 36
chr5_JH584299_random->953012, 37
chr7_GL456219_random->175968, 38
chrUn_GL456239->40056, 39
chrUn_GL456359->22974, 40
chrUn_GL456360->31704, 41
chrUn_GL456366->47073, 42
chrUn_GL456367->42057, 43
chrUn_GL456368->20208, 44
chrUn_GL456370->26764, 45
chrUn_GL456372->28664, 46
chrUn_GL456378->31602, 47
chrUn_GL456379->72385, 48
chrUn_GL456381->25871, 49
chrUn_GL456382->23158, 50
chrUn_GL456383->38659, 51
chrUn_GL456385->35240, 52
chrUn_GL456387->24685, 53
chrUn_GL456389->28772, 54
chrUn_GL456390->24668, 55
chrUn_GL456392->23629, 56
chrUn_GL456393->55711, 57
chrUn_GL456394->24323, 58
chrUn_GL456396->21240, 59
chrUn_JH584304->114452, 60
chrX_GL456233_random->336933, 61
chrY_JH584300_random->182347, 62
chrY_JH584301_random->259875, 63
chrY_JH584302_random->155838, 64
chrY_JH584303_random->158099, 65}.
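The sort step orders alignments by the reference index assigned in the SequenceDictionary printed above (chrM is index 0, chr1 is 1, and so on), then by coordinate. A small sketch of what that ordering means, assuming a simple contig-to-index mapping and tuple-shaped reads (not ADAM's implementation):

```python
# Illustrative sketch of sorting reads by (reference index, start position)
# using a contig -> index mapping like the SequenceDictionary above.
# Names and record shape are hypothetical, not ADAM's schema.
seq_dict = {"chrM": 0, "chr1": 1, "chr2": 2}  # contig -> reference index

def sort_reads(reads, seq_dict):
    """Order (contig, start, name) records by reference index, then start."""
    return sorted(reads, key=lambda r: (seq_dict[r[0]], r[1]))

reads = [
    ("chr2", 500, "r3"),
    ("chrM", 42, "r1"),
    ("chr1", 10, "r2"),
]
```

Sorting by dictionary index rather than lexicographic contig name is what keeps chr2 before chr10 in the output.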
19/05/13 17:35:09 INFO FileInputFormat: Total input paths to process : 1
19/05/13 17:35:09 INFO ParquetInputFormat: Total input paths to process : 1
19/05/13 17:35:10 INFO RDDBoundAlignmentRecordDataset: Saving data in ADAM format
19/05/13 17:35:10 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:35:10 INFO SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
19/05/13 17:35:10 INFO DAGScheduler: Registering RDD 2 (sortBy at AlignmentRecordDataset.scala:996)
19/05/13 17:35:10 INFO DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
19/05/13 17:35:10 INFO DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
19/05/13 17:35:10 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
19/05/13 17:35:10 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
19/05/13 17:35:10 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentRecordDataset.scala:996), which has no missing parents
19/05/13 17:35:10 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 15.3 KB, free 366.0 MB)
19/05/13 17:35:10 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 6.9 KB, free 366.0 MB)
19/05/13 17:35:10 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:44055 (size: 6.9 KB, free: 366.3 MB)
19/05/13 17:35:10 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/05/13 17:35:10 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentRecordDataset.scala:996) (first 15 tasks are for partitions Vector(0))
19/05/13 17:35:10 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/05/13 17:35:10 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8001 bytes)
19/05/13 17:35:10 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/05/13 17:35:10 INFO Executor: Fetching spark://192.168.10.31:37144/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794106098
19/05/13 17:35:10 INFO TransportClientFactory: Successfully created connection to /192.168.10.31:37144 after 53 ms (0 ms spent in bootstraps)
19/05/13 17:35:10 INFO Utils: Fetching spark://192.168.10.31:37144/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to /tmp/spark-1f8b11e5-675b-47d3-b564-81f940221f12/userFiles-d95f0623-adf3-44fb-a58b-426dbee9e478/fetchFileTemp8402367469216770113.tmp
19/05/13 17:35:10 INFO Executor: Adding file:/tmp/spark-1f8b11e5-675b-47d3-b564-81f940221f12/userFiles-d95f0623-adf3-44fb-a58b-426dbee9e478/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to class loader
19/05/13 17:35:11 INFO NewHadoopRDD: Input split: file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132194
19/05/13 17:35:11 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
19/05/13 17:35:11 INFO InternalParquetRecordReader: at row 0. reading next block
19/05/13 17:35:11 INFO CodecPool: Got brand-new decompressor [.gz]
19/05/13 17:35:11 INFO InternalParquetRecordReader: block read in memory in 50 ms. row count = 163064
19/05/13 17:35:14 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 956 bytes result sent to driver
19/05/13 17:35:14 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3934 ms on localhost (executor driver) (1/1)
19/05/13 17:35:14 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/05/13 17:35:14 INFO DAGScheduler: ShuffleMapStage 0 (sortBy at AlignmentRecordDataset.scala:996) finished in 4.104 s
19/05/13 17:35:14 INFO DAGScheduler: looking for newly runnable stages
19/05/13 17:35:14 INFO DAGScheduler: running: Set()
19/05/13 17:35:14 INFO DAGScheduler: waiting: Set(ResultStage 1)
19/05/13 17:35:14 INFO DAGScheduler: failed: Set()
19/05/13 17:35:14 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3814), which has no missing parents
19/05/13 17:35:14 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 83.1 KB, free 365.9 MB)
19/05/13 17:35:14 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 30.8 KB, free 365.8 MB)
19/05/13 17:35:14 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:44055 (size: 30.8 KB, free: 366.2 MB)
19/05/13 17:35:14 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1161
19/05/13 17:35:14 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
19/05/13 17:35:14 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
19/05/13 17:35:14 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7662 bytes)
19/05/13 17:35:14 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
19/05/13 17:35:14 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/05/13 17:35:14 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 8 ms
19/05/13 17:35:16 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:35:16 INFO CodecConfig: Compression: GZIP
19/05/13 17:35:16 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:35:16 INFO ParquetOutputFormat: Parquet block size to 134217728
19/05/13 17:35:16 INFO ParquetOutputFormat: Parquet page size to 1048576
19/05/13 17:35:16 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
19/05/13 17:35:16 INFO ParquetOutputFormat: Dictionary is on
19/05/13 17:35:16 INFO ParquetOutputFormat: Validation is off
19/05/13 17:35:16 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
19/05/13 17:35:16 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
19/05/13 17:35:16 INFO ParquetOutputFormat: Page size checking is: estimated
19/05/13 17:35:16 INFO ParquetOutputFormat: Min row count for page size check is: 100
19/05/13 17:35:16 INFO ParquetOutputFormat: Max row count for page size check is: 10000
19/05/13 17:35:16 INFO CodecPool: Got brand-new compressor [.gz]
19/05/13 17:35:20 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16004474
19/05/13 17:35:21 INFO FileOutputCommitter: Saved output of task 'attempt_20190513173510_0005_r_000000_0' to file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.sorted.adam/_temporary/0/task_20190513173510_0005_r_000000
19/05/13 17:35:21 INFO SparkHadoopMapRedUtil: attempt_20190513173510_0005_r_000000_0: Committed
19/05/13 17:35:21 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1280 bytes result sent to driver
19/05/13 17:35:21 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 6786 ms on localhost (executor driver) (1/1)
19/05/13 17:35:21 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
19/05/13 17:35:21 INFO DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 6.858 s
19/05/13 17:35:21 INFO DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 11.078936 s
19/05/13 17:35:21 INFO ParquetFileReader: Initiating action with parallelism: 5
19/05/13 17:35:21 INFO SparkHadoopWriter: Job job_20190513173510_0005 committed.
19/05/13 17:35:21 INFO TransformAlignments: Overall Duration: 16.25 secs
19/05/13 17:35:21 INFO SparkContext: Invoking stop() from shutdown hook
19/05/13 17:35:21 INFO SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/05/13 17:35:21 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/13 17:35:21 INFO MemoryStore: MemoryStore cleared
19/05/13 17:35:21 INFO BlockManager: BlockManager stopped
19/05/13 17:35:21 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/13 17:35:21 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/13 17:35:21 INFO SparkContext: Successfully stopped SparkContext
19/05/13 17:35:21 INFO ShutdownHookManager: Shutdown hook called
19/05/13 17:35:21 INFO ShutdownHookManager: Deleting directory /tmp/spark-8ae56134-0674-4f15-aa23-4c3b6be213ae
19/05/13 17:35:21 INFO ShutdownHookManager: Deleting directory /tmp/spark-1f8b11e5-675b-47d3-b564-81f940221f12

# convert the reads to fragments to re-pair the reads
echo "Converting read file to fragments"
+ echo 'Converting read file to fragments'
Converting read file to fragments
rm -rf ${FRAGMENTS}
+ rm -rf mouse_chrM.bam.fragments.adam
${ADAM} transformFragments -load_as_reads ${READS} ${FRAGMENTS}
+ ./bin/adam-submit transformFragments -load_as_reads mouse_chrM.bam.reads.adam mouse_chrM.bam.fragments.adam
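The re-pairing this step performs shows up below as a groupBy at SingleReadBucket.scala:97: alignments are grouped by read name so both mates of a pair land in the same fragment. A minimal sketch of that grouping idea, with hypothetical field names rather than ADAM's actual schema:

```python
from collections import defaultdict

# Minimal sketch of re-pairing reads into fragments by grouping on read
# name, mirroring the groupBy that transformFragments performs.
# Record shape (read_name, mate_index) is illustrative only.
def to_fragments(reads):
    """Bucket (read_name, mate_index) records into per-fragment lists."""
    fragments = defaultdict(list)
    for name, mate in reads:
        fragments[name].append(mate)
    return {name: sorted(mates) for name, mates in fragments.items()}

reads = [("frag1", 1), ("frag2", 0), ("frag1", 0)]
```

Grouping on read name works regardless of input order, which is why the fragments conversion here takes the unsorted reads as input.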
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/bin/spark-submit
19/05/13 17:35:23 WARN Utils: Your hostname, amp-jenkins-staging-worker-01 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface eno1)
19/05/13 17:35:23 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/13 17:35:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/13 17:35:24 INFO SparkContext: Running Spark version 2.4.3
19/05/13 17:35:24 INFO SparkContext: Submitted application: transformFragments
19/05/13 17:35:24 INFO SecurityManager: Changing view acls to: jenkins
19/05/13 17:35:24 INFO SecurityManager: Changing modify acls to: jenkins
19/05/13 17:35:24 INFO SecurityManager: Changing view acls groups to: 
19/05/13 17:35:24 INFO SecurityManager: Changing modify acls groups to: 
19/05/13 17:35:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/05/13 17:35:24 INFO Utils: Successfully started service 'sparkDriver' on port 46752.
19/05/13 17:35:24 INFO SparkEnv: Registering MapOutputTracker
19/05/13 17:35:24 INFO SparkEnv: Registering BlockManagerMaster
19/05/13 17:35:24 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/13 17:35:24 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/13 17:35:24 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-6feae4e8-a459-4e20-9ab0-34d586cda665
19/05/13 17:35:24 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
19/05/13 17:35:24 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/13 17:35:25 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/13 17:35:25 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/05/13 17:35:25 INFO SparkContext: Added JAR file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar at spark://192.168.10.31:46752/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794125217
19/05/13 17:35:25 INFO Executor: Starting executor ID driver on host localhost
19/05/13 17:35:25 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42803.
19/05/13 17:35:25 INFO NettyBlockTransferService: Server created on 192.168.10.31:42803
19/05/13 17:35:25 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/13 17:35:25 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 42803, None)
19/05/13 17:35:25 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:42803 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 42803, None)
19/05/13 17:35:25 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 42803, None)
19/05/13 17:35:25 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 42803, None)
19/05/13 17:35:26 INFO ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of AlignmentRecords.
19/05/13 17:35:27 INFO ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
19/05/13 17:35:28 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 303.7 KB, free 366.0 MB)
19/05/13 17:35:28 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 27.4 KB, free 366.0 MB)
19/05/13 17:35:28 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:42803 (size: 27.4 KB, free: 366.3 MB)
19/05/13 17:35:28 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1320
19/05/13 17:35:28 INFO FileInputFormat: Total input paths to process : 1
19/05/13 17:35:28 INFO ParquetInputFormat: Total input paths to process : 1
19/05/13 17:35:29 INFO RDDBoundFragmentDataset: Saving data in ADAM format
19/05/13 17:35:29 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:35:29 INFO SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
19/05/13 17:35:29 INFO DAGScheduler: Registering RDD 2 (groupBy at SingleReadBucket.scala:97)
19/05/13 17:35:29 INFO DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
19/05/13 17:35:29 INFO DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
19/05/13 17:35:29 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
19/05/13 17:35:29 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
19/05/13 17:35:29 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97), which has no missing parents
19/05/13 17:35:29 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.3 KB, free 366.0 MB)
19/05/13 17:35:29 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.0 KB, free 366.0 MB)
19/05/13 17:35:29 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:42803 (size: 3.0 KB, free: 366.3 MB)
19/05/13 17:35:29 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/05/13 17:35:29 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97) (first 15 tasks are for partitions Vector(0))
19/05/13 17:35:29 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/05/13 17:35:29 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8001 bytes)
19/05/13 17:35:29 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/05/13 17:35:29 INFO Executor: Fetching spark://192.168.10.31:46752/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794125217
19/05/13 17:35:29 INFO TransportClientFactory: Successfully created connection to /192.168.10.31:46752 after 57 ms (0 ms spent in bootstraps)
19/05/13 17:35:29 INFO Utils: Fetching spark://192.168.10.31:46752/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to /tmp/spark-7acb1bce-8bab-4652-9587-791065bbbfd4/userFiles-453ea751-fb4d-457b-9f3c-242f0a25f119/fetchFileTemp3443492678647539022.tmp
19/05/13 17:35:29 INFO Executor: Adding file:/tmp/spark-7acb1bce-8bab-4652-9587-791065bbbfd4/userFiles-453ea751-fb4d-457b-9f3c-242f0a25f119/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to class loader
19/05/13 17:35:29 INFO NewHadoopRDD: Input split: file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132194
19/05/13 17:35:30 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
19/05/13 17:35:30 INFO InternalParquetRecordReader: at row 0. reading next block
19/05/13 17:35:30 INFO CodecPool: Got brand-new decompressor [.gz]
19/05/13 17:35:30 INFO InternalParquetRecordReader: block read in memory in 56 ms. row count = 163064
19/05/13 17:35:33 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 999 bytes result sent to driver
19/05/13 17:35:33 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 4611 ms on localhost (executor driver) (1/1)
19/05/13 17:35:33 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/05/13 17:35:33 INFO DAGScheduler: ShuffleMapStage 0 (groupBy at SingleReadBucket.scala:97) finished in 4.753 s
19/05/13 17:35:33 INFO DAGScheduler: looking for newly runnable stages
19/05/13 17:35:33 INFO DAGScheduler: running: Set()
19/05/13 17:35:33 INFO DAGScheduler: waiting: Set(ResultStage 1)
19/05/13 17:35:33 INFO DAGScheduler: failed: Set()
19/05/13 17:35:33 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3814), which has no missing parents
19/05/13 17:35:34 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 86.2 KB, free 365.9 MB)
19/05/13 17:35:34 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 32.1 KB, free 365.9 MB)
19/05/13 17:35:34 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:42803 (size: 32.1 KB, free: 366.2 MB)
19/05/13 17:35:34 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1161
19/05/13 17:35:34 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
19/05/13 17:35:34 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
19/05/13 17:35:34 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7662 bytes)
19/05/13 17:35:34 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
19/05/13 17:35:34 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/05/13 17:35:34 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 8 ms
19/05/13 17:35:35 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:35:35 INFO CodecConfig: Compression: GZIP
19/05/13 17:35:35 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
19/05/13 17:35:35 INFO ParquetOutputFormat: Parquet block size to 134217728
19/05/13 17:35:35 INFO ParquetOutputFormat: Parquet page size to 1048576
19/05/13 17:35:35 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
19/05/13 17:35:35 INFO ParquetOutputFormat: Dictionary is on
19/05/13 17:35:35 INFO ParquetOutputFormat: Validation is off
19/05/13 17:35:35 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
19/05/13 17:35:35 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
19/05/13 17:35:35 INFO ParquetOutputFormat: Page size checking is: estimated
19/05/13 17:35:35 INFO ParquetOutputFormat: Min row count for page size check is: 100
19/05/13 17:35:35 INFO ParquetOutputFormat: Max row count for page size check is: 10000
19/05/13 17:35:36 INFO CodecPool: Got brand-new compressor [.gz]
19/05/13 17:35:38 INFO BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.10.31:42803 in memory (size: 3.0 KB, free: 366.2 MB)
19/05/13 17:35:43 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 21417928
19/05/13 17:35:44 INFO FileOutputCommitter: Saved output of task 'attempt_20190513173529_0006_r_000000_0' to file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.fragments.adam/_temporary/0/task_20190513173529_0006_r_000000
19/05/13 17:35:44 INFO SparkHadoopMapRedUtil: attempt_20190513173529_0006_r_000000_0: Committed
19/05/13 17:35:44 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1280 bytes result sent to driver
19/05/13 17:35:44 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 10343 ms on localhost (executor driver) (1/1)
19/05/13 17:35:44 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
19/05/13 17:35:44 INFO DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 10.396 s
19/05/13 17:35:44 INFO DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 15.262672 s
19/05/13 17:35:44 INFO ParquetFileReader: Initiating action with parallelism: 5
19/05/13 17:35:44 INFO SparkHadoopWriter: Job job_20190513173529_0006 committed.
19/05/13 17:35:44 INFO TransformFragments: Overall Duration: 20.54 secs
19/05/13 17:35:44 INFO SparkContext: Invoking stop() from shutdown hook
19/05/13 17:35:44 INFO SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/05/13 17:35:44 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/13 17:35:44 INFO MemoryStore: MemoryStore cleared
19/05/13 17:35:44 INFO BlockManager: BlockManager stopped
19/05/13 17:35:44 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/13 17:35:44 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/13 17:35:44 INFO SparkContext: Successfully stopped SparkContext
19/05/13 17:35:44 INFO ShutdownHookManager: Shutdown hook called
19/05/13 17:35:44 INFO ShutdownHookManager: Deleting directory /tmp/spark-7acb1bce-8bab-4652-9587-791065bbbfd4
19/05/13 17:35:44 INFO ShutdownHookManager: Deleting directory /tmp/spark-372fa00c-3c63-4a4a-b317-54df3e8cb2bb

# test that printing works
echo "Printing reads and fragments"
+ echo 'Printing reads and fragments'
Printing reads and fragments
${ADAM} print ${READS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.reads.adam
${ADAM} print ${FRAGMENTS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.fragments.adam

# run flagstat to verify that flagstat runs OK
echo "Printing read statistics"
+ echo 'Printing read statistics'
Printing read statistics
${ADAM} flagstat -print_metrics ${READS}
+ ./bin/adam-submit flagstat -print_metrics mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.3-bin-hadoop2.7/bin/spark-submit
19/05/13 17:36:05 WARN Utils: Your hostname, amp-jenkins-staging-worker-01 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface eno1)
19/05/13 17:36:05 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/13 17:36:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/13 17:36:05 INFO SparkContext: Running Spark version 2.4.3
19/05/13 17:36:05 INFO SparkContext: Submitted application: flagstat
19/05/13 17:36:06 INFO SecurityManager: Changing view acls to: jenkins
19/05/13 17:36:06 INFO SecurityManager: Changing modify acls to: jenkins
19/05/13 17:36:06 INFO SecurityManager: Changing view acls groups to: 
19/05/13 17:36:06 INFO SecurityManager: Changing modify acls groups to: 
19/05/13 17:36:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/05/13 17:36:06 INFO Utils: Successfully started service 'sparkDriver' on port 43177.
19/05/13 17:36:06 INFO SparkEnv: Registering MapOutputTracker
19/05/13 17:36:06 INFO SparkEnv: Registering BlockManagerMaster
19/05/13 17:36:06 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/13 17:36:06 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/13 17:36:06 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-55df33bb-11a6-4e86-93db-4e32244a70a0
19/05/13 17:36:06 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
19/05/13 17:36:06 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/13 17:36:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/13 17:36:07 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/05/13 17:36:07 INFO SparkContext: Added JAR file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar at spark://192.168.10.31:43177/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794167048
19/05/13 17:36:07 INFO Executor: Starting executor ID driver on host localhost
19/05/13 17:36:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35344.
19/05/13 17:36:07 INFO NettyBlockTransferService: Server created on 192.168.10.31:35344
19/05/13 17:36:07 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/13 17:36:07 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 35344, None)
19/05/13 17:36:07 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:35344 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 35344, None)
19/05/13 17:36:07 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 35344, None)
19/05/13 17:36:07 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 35344, None)
19/05/13 17:36:08 INFO ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of AlignmentRecords.
19/05/13 17:36:08 INFO ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
19/05/13 17:36:08 INFO ADAMContext: Using the specified projection schema
19/05/13 17:36:08 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 313.2 KB, free 366.0 MB)
19/05/13 17:36:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.3 KB, free 366.0 MB)
19/05/13 17:36:09 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:35344 (size: 28.3 KB, free: 366.3 MB)
19/05/13 17:36:09 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1320
19/05/13 17:36:11 INFO FileInputFormat: Total input paths to process : 1
19/05/13 17:36:11 INFO ParquetInputFormat: Total input paths to process : 1
19/05/13 17:36:11 INFO SparkContext: Starting job: aggregate at FlagStat.scala:115
19/05/13 17:36:11 INFO DAGScheduler: Got job 0 (aggregate at FlagStat.scala:115) with 1 output partitions
19/05/13 17:36:11 INFO DAGScheduler: Final stage: ResultStage 0 (aggregate at FlagStat.scala:115)
19/05/13 17:36:11 INFO DAGScheduler: Parents of final stage: List()
19/05/13 17:36:11 INFO DAGScheduler: Missing parents: List()
19/05/13 17:36:11 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[4] at map at FlagStat.scala:96), which has no missing parents
19/05/13 17:36:11 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 12.6 KB, free 366.0 MB)
19/05/13 17:36:11 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 5.5 KB, free 365.9 MB)
19/05/13 17:36:11 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:35344 (size: 5.5 KB, free: 366.3 MB)
19/05/13 17:36:11 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/05/13 17:36:11 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[4] at map at FlagStat.scala:96) (first 15 tasks are for partitions Vector(0))
19/05/13 17:36:11 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/05/13 17:36:11 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8012 bytes)
19/05/13 17:36:11 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/05/13 17:36:11 INFO Executor: Fetching spark://192.168.10.31:43177/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar with timestamp 1557794167048
19/05/13 17:36:11 INFO TransportClientFactory: Successfully created connection to /192.168.10.31:43177 after 54 ms (0 ms spent in bootstraps)
19/05/13 17:36:11 INFO Utils: Fetching spark://192.168.10.31:43177/jars/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to /tmp/spark-a963bf9f-1909-4563-8cb8-b4e5fcbf2e24/userFiles-50cb7eab-dc16-45fc-860b-28ae6a6257ab/fetchFileTemp5532692192775832707.tmp
19/05/13 17:36:11 INFO Executor: Adding file:/tmp/spark-a963bf9f-1909-4563-8cb8-b4e5fcbf2e24/userFiles-50cb7eab-dc16-45fc-860b-28ae6a6257ab/adam-assembly-spark2_2.11-0.27.0-SNAPSHOT.jar to class loader
19/05/13 17:36:12 INFO NewHadoopRDD: Input split: file:/tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132194
19/05/13 17:36:12 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
19/05/13 17:36:12 INFO InternalParquetRecordReader: at row 0. reading next block
19/05/13 17:36:12 INFO CodecPool: Got brand-new decompressor [.gz]
19/05/13 17:36:12 INFO InternalParquetRecordReader: block read in memory in 42 ms. row count = 163064
19/05/13 17:36:13 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 8196 bytes result sent to driver
19/05/13 17:36:13 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 2013 ms on localhost (executor driver) (1/1)
19/05/13 17:36:13 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/05/13 17:36:13 INFO DAGScheduler: ResultStage 0 (aggregate at FlagStat.scala:115) finished in 2.130 s
19/05/13 17:36:13 INFO DAGScheduler: Job 0 finished: aggregate at FlagStat.scala:115, took 2.207197 s
163064 + 0 in total (QC-passed reads + QC-failed reads)
0 + 0 primary duplicates
0 + 0 primary duplicates - both read and mate mapped
0 + 0 primary duplicates - only read mapped
0 + 0 primary duplicates - cross chromosome
0 + 0 secondary duplicates
0 + 0 secondary duplicates - both read and mate mapped
0 + 0 secondary duplicates - only read mapped
0 + 0 secondary duplicates - cross chromosome
160512 + 0 mapped (98.43%:0.00%)
163064 + 0 paired in sequencing
81524 + 0 read1
81540 + 0 read2
154982 + 0 properly paired (95.04%:0.00%)
158044 + 0 with itself and mate mapped
2468 + 0 singletons (1.51%:0.00%)
418 + 0 with mate mapped to a different chr
120 + 0 with mate mapped to a different chr (mapQ>=5)
19/05/13 17:36:13 INFO FlagStat: Overall Duration: 7.55 secs
19/05/13 17:36:13 INFO FlagStat: Metrics:

Timings
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+
|                Metric                | Worker Total | Driver Total | Driver Only | Count  |   Mean    |    Min    |    Max    |
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+
| └─ Load Alignments                   |            - |       3 secs |    2.9 secs |      1 |    3 secs |    3 secs |    3 secs |
|     └─ map at ADAMContext.scala:1329 |            - |    101.01 ms |           - |      1 | 101.01 ms | 101.01 ms | 101.01 ms |
|         └─ function call             |     25.18 ms |            - |           - | 163064 |    154 ns |     75 ns | 266.94 µs |
| └─ map at FlagStat.scala:96          |            - |     37.66 ms |           - |      1 |  37.66 ms |  37.66 ms |  37.66 ms |
|     └─ function call                 |    181.85 ms |            - |           - | 163064 |   1.11 µs |    199 ns |  16.05 ms |
| └─ aggregate at FlagStat.scala:115   |            - |    2.39 secs |           - |      1 | 2.39 secs | 2.39 secs | 2.39 secs |
|     └─ function call                 |     46.68 ms |            - |           - | 163065 |    286 ns |    144 ns | 681.03 µs |
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+

Spark Operations
+----------+---------------------------------+---------------+----------------+--------------+----------+
| Sequence |            Operation            | Is New Stage? | Stage Duration | Driver Total | Stage ID |
+----------+---------------------------------+---------------+----------------+--------------+----------+
| 1        | map at ADAMContext.scala:1329   | false         |              - |    101.01 ms | -        |
| 2        | map at FlagStat.scala:96        | false         |              - |     37.66 ms | -        |
| 3        | aggregate at FlagStat.scala:115 | true          |     -2.13 secs |    2.39 secs | 0        |
+----------+---------------------------------+---------------+----------------+--------------+----------+

Task Timings
+-------------------------------+------------+-------+-----------+-----------+-----------+
|            Metric             | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+------------+-------+-----------+-----------+-----------+
| Task Duration                 |  2.01 secs |     1 | 2.01 secs | 2.01 secs | 2.01 secs |
| Executor Run Time             |  1.57 secs |     1 | 1.57 secs | 1.57 secs | 1.57 secs |
| Executor Deserialization Time |     344 ms |     1 |    344 ms |    344 ms |    344 ms |
| Result Serialization Time     |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+------------+-------+-----------+-----------+-----------+

Task Timings By Host
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+
|            Metric             |  Host  | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+
| Task Duration                 | driver |  2.01 secs |     1 | 2.01 secs | 2.01 secs | 2.01 secs |
| Executor Run Time             | driver |  1.57 secs |     1 | 1.57 secs | 1.57 secs | 1.57 secs |
| Executor Deserialization Time | driver |     344 ms |     1 |    344 ms |    344 ms |    344 ms |
| Result Serialization Time     | driver |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+

Task Timings By Stage
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+
|            Metric             |          Stage ID & Name           | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+
| Task Duration                 | 0: aggregate at FlagStat.scala:115 |  2.01 secs |     1 | 2.01 secs | 2.01 secs | 2.01 secs |
| Executor Run Time             | 0: aggregate at FlagStat.scala:115 |  1.57 secs |     1 | 1.57 secs | 1.57 secs | 1.57 secs |
| Executor Deserialization Time | 0: aggregate at FlagStat.scala:115 |     344 ms |     1 |    344 ms |    344 ms |    344 ms |
| Result Serialization Time     | 0: aggregate at FlagStat.scala:115 |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+

19/05/13 17:36:13 INFO SparkContext: Invoking stop() from shutdown hook
19/05/13 17:36:13 INFO SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/05/13 17:36:13 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/13 17:36:13 INFO MemoryStore: MemoryStore cleared
19/05/13 17:36:13 INFO BlockManager: BlockManager stopped
19/05/13 17:36:13 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/13 17:36:13 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/13 17:36:13 INFO SparkContext: Successfully stopped SparkContext
19/05/13 17:36:13 INFO ShutdownHookManager: Shutdown hook called
19/05/13 17:36:13 INFO ShutdownHookManager: Deleting directory /tmp/spark-a963bf9f-1909-4563-8cb8-b4e5fcbf2e24
19/05/13 17:36:13 INFO ShutdownHookManager: Deleting directory /tmp/spark-6d8f0ce8-aea8-417b-b061-0784d419a65a
rm -rf ${ADAM_TMP_DIR}
+ rm -rf /tmp/adamTestBSaRluR/deleteMePleaseThisIsNoLongerNeeded
popd
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu
# test that the source is formatted correctly
# we had modified the poms to add a temp dir, so back out that modification first
pushd ${PROJECT_ROOT}
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu
./scripts/format-source
+ ./scripts/format-source
+++ dirname ./scripts/format-source
++ cd ./scripts
++ pwd
+ DIR=/home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu/scripts
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu
+ mvn org.scalariform:scalariform-maven-plugin:format license:format
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=1g; support was removed in 8.0
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.11
[INFO] ADAM_2.11: Shader workaround
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils
[INFO] ADAM_2.11: Core
[INFO] ADAM_2.11: APIs for Java, Python
[INFO] ADAM_2.11: CLI
[INFO] ADAM_2.11: Assembly
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent-spark2_2.11 ---
[INFO] Modified 2 of 241 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-parent-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Shader workaround 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-shade-spark2_2.11 ---
[INFO] Modified 0 of 0 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-shade-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Avro-to-Dataset codegen utils 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-codegen-spark2_2.11 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-codegen-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Core 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core-spark2_2.11 ---
[INFO] Modified 0 of 199 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-core-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: APIs for Java, Python 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis-spark2_2.11 ---
[INFO] Modified 0 of 5 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-apis-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: CLI 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli-spark2_2.11 ---
[INFO] Modified 0 of 31 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-cli-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Assembly 0.27.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-assembly-spark2_2.11 ---
[INFO] Modified 0 of 1 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-assembly-spark2_2.11 ---
[INFO] Updating license headers...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.11 .......................................... SUCCESS [  7.274 s]
[INFO] ADAM_2.11: Shader workaround ....................... SUCCESS [  0.039 s]
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils ........... SUCCESS [  0.082 s]
[INFO] ADAM_2.11: Core .................................... SUCCESS [  3.790 s]
[INFO] ADAM_2.11: APIs for Java, Python ................... SUCCESS [  0.101 s]
[INFO] ADAM_2.11: CLI ..................................... SUCCESS [  0.242 s]
[INFO] ADAM_2.11: Assembly ................................ SUCCESS [  0.019 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.215 s
[INFO] Finished at: 2019-05-13T17:36:28-07:00
[INFO] Final Memory: 22M/1131M
[INFO] ------------------------------------------------------------------------
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu
if test -n "$(git status --porcelain)"
then
    echo "Please run './scripts/format-source'"
    exit 1
fi
git status --porcelain
++ git status --porcelain
+ test -n ''
popd    
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.11/SPARK_VERSION/2.4.3/label/ubuntu

echo
+ echo

echo "All the tests passed"
+ echo 'All the tests passed'
All the tests passed
echo
+ echo

Recording test results
Publishing Scoverage XML and HTML report...
Setting commit status on GitHub for https://github.com/bigdatagenomics/adam/commit/0d6565b036217e99af3454b3fcb49a90c1ff6ee0
Finished: SUCCESS