Console Output

[Skipping 3,384 KB of log output]
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/IncorrectMDTagException.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMSAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/DatasetBoundAlignmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMCRAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMBAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMBAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SingleReadBucketSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ParquetUnboundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMBAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/FASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/FASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/RDDBoundAlignmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/RDDBoundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMCRAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AlignmentDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/QualityScoreBin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMSAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AnySAMOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMCRAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AnySAMInFormatterCompanion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ReferencePositionPairSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMSAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/DatasetBoundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/QualityScoreBin$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMBAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ParquetUnboundAlignmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/BAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMSAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ReadDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMCRAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SAMInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/FullOuterShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RDDBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/MultisampleAvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/MultisampleGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/AvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/DatasetBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/VictimlessSortedIntervalPartitionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/FeatureDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/CoverageDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/FeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/DatasetBoundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/ParquetUnboundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/DatasetBoundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/ParquetUnboundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/RDDBoundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3OutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/RDDBoundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/CoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenericConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/SortedIntervalPartitionJoinWithVictims.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/AvroReadGroupGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerTreeRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RightOuterTreeRegionJoinAndGroupByRight.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/DatasetBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/ParquetUnboundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/FragmentDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/InterleavedFASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/FragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab5InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab5InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab6InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/InterleavedFASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab6InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/RDDBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/FASTAInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/RDDBoundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/DatasetBoundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/DatasetBoundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/ParquetUnboundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/RDDBoundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/FASTAInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SequenceDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SliceDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/ParquetUnboundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicPositionPartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicBroadcast.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ADAMVCFOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/GenotypeDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ParquetUnboundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantContextDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundVariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ParquetUnboundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundVariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/GenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/ReferencePartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/ShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/SequenceDictionaryReader$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ParquetFileTraversable.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ParquetLogger$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMap$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMapSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/TextRddWriter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMap.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/IndexedFastaFile.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/FileExtensions$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/GenomeFileReader$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/TwoBitFileSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/AttributeUtils$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/DefaultHeaderLines$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/AlignmentConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/VariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/VariantContextConverter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/instrumentation/Timers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/instrumentation/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWatermanConstantGapScoring.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWaterman.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/apache/parquet/avro/class-use/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/apache/parquet/avro/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/python/DataFrameConversionWrapper.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToAlignmentDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToCoverageDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFeatureConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantContextDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToGenotypeDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFragmentDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToSequenceDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFeatureDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToReadDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToSliceDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariantsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequences$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeaturesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequencesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmers.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlicesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequences.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/MergeShardsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlices.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlices$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/repo/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT-sources.jar longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/repo/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT-javadoc.jar longer than 100 characters.
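The block of `[WARNING] Entry: … longer than 100 characters` lines above comes from the Maven assembly build: the legacy ustar tar header stores an entry name in a 100-byte field, so archive entries with longer paths trigger a warning (the plugin's `tarLongFileMode` setting, if set to `posix` or `gnu` rather than the default `warn`, is the usual way to handle such paths; whether that is appropriate here depends on the consumers of the tarball). The limit itself can be demonstrated with Python's `tarfile` module; the helper name below is illustrative, not part of the build:

```python
import io
import tarfile

def can_store(name: str, fmt: int) -> bool:
    """Try to add a zero-byte member with the given name to an in-memory
    tarball in the given tar format; report whether the name fits."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w", format=fmt) as tar:
        info = tarfile.TarInfo(name=name)
        info.size = 0
        try:
            tar.addfile(info, io.BytesIO(b""))
        except ValueError:
            # ustar cannot split this name into its 155-byte prefix
            # and 100-byte name fields, so it refuses the entry.
            return False
    return True

# A slash-free name well over 100 characters, like the scaladoc paths above.
long_name = "x" * 150

print(can_store(long_name, tarfile.USTAR_FORMAT))  # legacy ustar: rejected
print(can_store(long_name, tarfile.PAX_FORMAT))    # POSIX pax: accepted
```

The build only warns rather than fails because the tool it uses falls back to an extended format for long names; GNU tar and modern BSD tar both read the result.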
[INFO] Building zip: /tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/adam-distribution/target/adam-distribution-spark2_2.12-0.31.0-SNAPSHOT-bin.zip
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.12 .......................................... SUCCESS [  9.226 s]
[INFO] ADAM_2.12: Shader workaround ....................... SUCCESS [  6.108 s]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils ........... SUCCESS [  5.846 s]
[INFO] ADAM_2.12: Core .................................... SUCCESS [01:15 min]
[INFO] ADAM_2.12: APIs for Java, Python ................... SUCCESS [ 11.324 s]
[INFO] ADAM_2.12: CLI ..................................... SUCCESS [ 13.786 s]
[INFO] ADAM_2.12: Assembly ................................ SUCCESS [ 19.988 s]
[INFO] ADAM_2.12: Python APIs ............................. SUCCESS [01:27 min]
[INFO] ADAM_2.12: R APIs .................................. SUCCESS [01:09 min]
[INFO] ADAM_2.12: Distribution ............................ SUCCESS [ 46.961 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:45 min
[INFO] Finished at: 2020-02-10T10:32:44-08:00
[INFO] Final Memory: 63M/1495M
[INFO] ------------------------------------------------------------------------
+ tar tzvf adam-distribution/target/adam-distribution-spark2_2.12-0.31.0-SNAPSHOT-bin.tar.gz
+ grep bdgenomics.adam
+ grep egg
drwxrwxr-x jenkins/jenkins        0 2020-02-10 10:29 adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/r/bdgenomics.adam.egg-info/
-rw-r--r-- jenkins/jenkins 36825984 2020-02-10 10:29 adam-distribution-spark2_2.12-0.31.0-SNAPSHOT/repo/bdgenomics.adam-0.31.0a0-py3.6.egg
+ ./bin/pyadam
Using PYSPARK=/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.5-bin-without-hadoop-scala-2.12/bin/pyspark
2020-02-10 10:32:47 WARN  Utils:66 - Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
2020-02-10 10:32:47 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2020-02-10 10:32:48 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2020-02-10 10:32:54 WARN  Utils:66 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.

[Stage 0:>                                                          (0 + 1) / 1]
+ source deactivate
#!/bin/bash

# Determine the directory containing this script
if [[ -n $BASH_VERSION ]]; then
    _SCRIPT_LOCATION=${BASH_SOURCE[0]}
    _SHELL="bash"
elif [[ -n $ZSH_VERSION ]]; then
    _SCRIPT_LOCATION=${funcstack[1]}
    _SHELL="zsh"
else
    echo "Only bash and zsh are supported"
    return 1
fi
++ [[ -n 4.3.48(1)-release ]]
++ _SCRIPT_LOCATION=/home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e/bin/deactivate
++ _SHELL=bash
_CONDA_DIR=$(dirname "$_SCRIPT_LOCATION")
dirname "$_SCRIPT_LOCATION"
+++ dirname /home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e/bin/deactivate
++ _CONDA_DIR=/home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e/bin

case "$(uname -s)" in
    CYGWIN*|MINGW*|MSYS*)
        EXT=".exe"
        export MSYS2_ENV_CONV_EXCL=CONDA_PATH
        ;;
    *)
        EXT=""
        ;;
esac
++ case "$(uname -s)" in
uname -s
+++ uname -s
++ EXT=

# shift over all args.  We don't accept any, so it's OK that we ignore them all here.
while [[ $# > 0 ]]
do
    key="$1"
    case $key in
        -h|--help)
            "$_CONDA_DIR/conda" ..deactivate $_SHELL$EXT -h
            if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
                exit 0
            else
                return 0
            fi
            ;;
    esac
    shift # past argument or value
done
++ [[ 0 > 0 ]]

# Ensure that this script is sourced, not executed
# Note that if the script was executed, we're running inside bash!
# Also note that errors are ignored as `activate foo` doesn't generate a bad
# value for $0 which would cause errors.
if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
    (>&2 echo "Error: deactivate must be sourced. Run 'source deactivate'
instead of 'deactivate'.
")
    "$_CONDA_DIR/conda" ..deactivate $_SHELL$EXT -h
    exit 1
fi
++ [[ -n 4.3.48(1)-release ]]
basename "$0" 2> /dev/null
+++ basename /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu/scripts/jenkins-test
++ [[ jenkins-test == \d\e\a\c\t\i\v\a\t\e ]]

if [[ -z "$CONDA_PATH_BACKUP" ]]; then
    if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
        exit 0
    else
        return 0
    fi
fi
++ [[ -z /usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games ]]

if (( $? == 0 )); then
    # Inverse of activation: run deactivate scripts prior to deactivating env
    _CONDA_D="${CONDA_PREFIX}/etc/conda/deactivate.d"
    if [[ -d $_CONDA_D ]]; then
        eval $(find "$_CONDA_D" -iname "*.sh" -exec echo source \'{}\'';' \;)
    fi

#    # get the activation path that would have been provided for this prefix
#    _LAST_ACTIVATE_PATH=$("$_CONDA_DIR/conda" ..activate $_SHELL$EXT "$CONDA_PREFIX")
#
#    # in activate, we replace a placeholder so that conda keeps its place in the PATH order
#    # The activate script sets _CONDA_HOLD here to activate that behavior.
#    #   Otherwise, PATH is simply removed.
#    if [ -n "$_CONDA_HOLD" ]; then
#        export PATH="$($_CONDA_PYTHON2 -c "import re; print(re.sub(r'$_LAST_ACTIVATE_PATH(:?)', r'CONDA_PATH_PLACEHOLDER\1', '$PATH', 1))")"
#    else
#        export PATH="$($_CONDA_PYTHON2 -c "import re; print(re.sub(r'$_LAST_ACTIVATE_PATH(:?)', r'', '$PATH', 1))")"
#    fi
#
#    unset _LAST_ACTIVATE_PATH

    export PATH=$("$_CONDA_DIR/conda" ..deactivate.path $_SHELL$EXT "$CONDA_PREFIX")

    unset CONDA_DEFAULT_ENV
    unset CONDA_PREFIX
    unset CONDA_PATH_BACKUP
    export PS1="$CONDA_PS1_BACKUP"
    unset CONDA_PS1_BACKUP
    unset _CONDA_PYTHON2
else
    unset _CONDA_PYTHON2
    return $?
fi
++ ((  0 == 0  ))
++ _CONDA_D=/home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e/etc/conda/deactivate.d
++ [[ -d /home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e/etc/conda/deactivate.d ]]
"$_CONDA_DIR/conda" ..deactivate.path $_SHELL$EXT "$CONDA_PREFIX"
+++ /home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e/bin/conda ..deactivate.path bash /home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e
++ export PATH=/usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
++ PATH=/usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
++ unset CONDA_DEFAULT_ENV
++ unset CONDA_PREFIX
++ unset CONDA_PATH_BACKUP
++ export PS1=
++ PS1=
++ unset CONDA_PS1_BACKUP
++ unset _CONDA_PYTHON2

if [[ -n $BASH_VERSION ]]; then
    hash -r
elif [[ -n $ZSH_VERSION ]]; then
    rehash
fi
++ [[ -n 4.3.48(1)-release ]]
++ hash -r
+ conda remove -y -n adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e --all

Package plan for package removal in environment /home/jenkins/anaconda2/envs/adam-build-3cb8d508-c770-4c56-8861-79128bdc0a9e:

The following packages will be REMOVED:

    _libgcc_mutex:    0.1-main               
    ca-certificates:  2020.1.1-0             
    certifi:          2019.11.28-py36_0      
    ld_impl_linux-64: 2.33.1-h53a641e_7      
    libedit:          3.1.20181209-hc058e9b_0
    libffi:           3.2.1-hd88cf55_4       
    libgcc-ng:        9.1.0-hdf63c60_0       
    libstdcxx-ng:     9.1.0-hdf63c60_0       
    ncurses:          6.1-he6710b0_1         
    openssl:          1.1.1d-h7b6447c_3      
    pip:              20.0.2-py36_1          
    python:           3.6.10-h0371630_0      
    readline:         7.0-h7b6447c_5         
    setuptools:       45.1.0-py36_0          
    sqlite:           3.31.1-h7b6447c_0      
    tk:               8.6.8-hbc83047_0       
    wheel:            0.34.2-py36_0          
    xz:               5.2.4-h14c3975_4       
    zlib:             1.2.11-h7b6447c_3      

+ cp -r adam-python/target /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu/scripts/../adam-python/
+ pushd adam-python
/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/adam-python /tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu
+ make clean
pip uninstall -y adam
Cannot uninstall requirement adam, not installed
You are using pip version 19.1.1, however version 20.0.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Makefile:65: recipe for target 'clean_develop' failed
make: [clean_develop] Error 1 (ignored)
rm -rf bdgenomics/*.egg*
rm -rf build/
rm -rf dist/
+ make clean_sdist
rm -rf dist
+ popd
/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu
	
# define filenames
BAM=mouse_chrM.bam
+ BAM=mouse_chrM.bam
READS=${BAM}.reads.adam
+ READS=mouse_chrM.bam.reads.adam
SORTED_READS=${BAM}.reads.sorted.adam
+ SORTED_READS=mouse_chrM.bam.reads.sorted.adam
FRAGMENTS=${BAM}.fragments.adam
+ FRAGMENTS=mouse_chrM.bam.fragments.adam
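The four filenames above are all derived from the input BAM name by appending a descriptive suffix; a tiny sketch makes the scheme explicit (the `derive_names` helper is hypothetical, introduced only to illustrate the convention):

```shell
# Hypothetical helper mirroring the naming scheme used by the test script:
# each derived dataset name is the input BAM name plus a suffix.
derive_names() {
    BAM="$1"
    READS=${BAM}.reads.adam
    SORTED_READS=${BAM}.reads.sorted.adam
    FRAGMENTS=${BAM}.fragments.adam
}
```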
    
# fetch our input dataset
echo "Fetching BAM file"
+ echo 'Fetching BAM file'
Fetching BAM file
rm -rf ${BAM}
+ rm -rf mouse_chrM.bam
wget -q https://s3.amazonaws.com/bdgenomics-test/${BAM}
+ wget -q https://s3.amazonaws.com/bdgenomics-test/mouse_chrM.bam

# once fetched, convert BAM to ADAM
echo "Converting BAM to ADAM read format"
+ echo 'Converting BAM to ADAM read format'
Converting BAM to ADAM read format
rm -rf ${READS}
+ rm -rf mouse_chrM.bam.reads.adam
${ADAM} transformAlignments ${BAM} ${READS}
+ ./bin/adam-submit transformAlignments mouse_chrM.bam mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.5-bin-without-hadoop-scala-2.12/bin/spark-submit
20/02/10 10:33:01 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/02/10 10:33:01 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/02/10 10:33:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/02/10 10:33:02 INFO cli.ADAMMain: ADAM invoked with args: "transformAlignments" "mouse_chrM.bam" "mouse_chrM.bam.reads.adam"
20/02/10 10:33:02 INFO spark.SparkContext: Running Spark version 2.4.5
20/02/10 10:33:02 INFO spark.SparkContext: Submitted application: transformAlignments
20/02/10 10:33:02 INFO spark.SecurityManager: Changing view acls to: jenkins
20/02/10 10:33:02 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/02/10 10:33:02 INFO spark.SecurityManager: Changing view acls groups to: 
20/02/10 10:33:02 INFO spark.SecurityManager: Changing modify acls groups to: 
20/02/10 10:33:02 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/02/10 10:33:02 INFO util.Utils: Successfully started service 'sparkDriver' on port 38581.
20/02/10 10:33:02 INFO spark.SparkEnv: Registering MapOutputTracker
20/02/10 10:33:02 INFO spark.SparkEnv: Registering BlockManagerMaster
20/02/10 10:33:02 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/02/10 10:33:02 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/02/10 10:33:02 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-a3a1bd45-5aba-477d-afa7-2c390666653a
20/02/10 10:33:02 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/02/10 10:33:02 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/02/10 10:33:03 INFO util.log: Logging initialized @2764ms
20/02/10 10:33:03 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/02/10 10:33:03 INFO server.Server: Started @2865ms
20/02/10 10:33:03 INFO server.AbstractConnector: Started ServerConnector@6e9c413e{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:33:03 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61a5b4ae{/jobs,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@467f77a5{/jobs/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/jobs/job,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df5f5c0{/jobs/job/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@308a6984{/stages,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/stages/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a34b7b8{/stages/stage,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/stages/stage/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b65e559{/stages/pool,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/stages/pool/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@74a9c4b0{/storage,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/storage/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c05a54d{/storage/rdd,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65ef722a{/storage/rdd/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fd9b663{/environment,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@214894fc{/environment/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10567255{/executors,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/executors/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c4ee95c{/executors/threadDump,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79c4715d{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5aa360ea{/static,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2fb68ec6{/,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@d71adc2{/api,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@159e366{/jobs/job/kill,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@57dc9128{/stages/stage/kill,null,AVAILABLE,@Spark}
20/02/10 10:33:03 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/02/10 10:33:03 INFO spark.SparkContext: Added JAR file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar at spark://192.168.10.31:38581/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359583277
20/02/10 10:33:03 INFO executor.Executor: Starting executor ID driver on host localhost
20/02/10 10:33:03 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46155.
20/02/10 10:33:03 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:46155
20/02/10 10:33:03 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/02/10 10:33:03 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 46155, None)
20/02/10 10:33:03 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:46155 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 46155, None)
20/02/10 10:33:03 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 46155, None)
20/02/10 10:33:03 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 46155, None)
20/02/10 10:33:03 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2a49fe{/metrics/json,null,AVAILABLE,@Spark}
20/02/10 10:33:04 INFO rdd.ADAMContext: Loading mouse_chrM.bam as BAM/CRAM/SAM and converting to Alignments.
20/02/10 10:33:04 INFO rdd.ADAMContext: Loaded header from file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam
20/02/10 10:33:04 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 295.6 KB, free 366.0 MB)
20/02/10 10:33:05 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 24.0 KB, free 366.0 MB)
20/02/10 10:33:05 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:46155 (size: 24.0 KB, free: 366.3 MB)
20/02/10 10:33:05 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:2059
20/02/10 10:33:06 INFO read.RDDBoundAlignmentDataset: Saving data in ADAM format
20/02/10 10:33:06 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:06 INFO input.FileInputFormat: Total input paths to process : 1
20/02/10 10:33:06 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
20/02/10 10:33:06 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
20/02/10 10:33:06 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (runJob at SparkHadoopWriter.scala:78)
20/02/10 10:33:06 INFO scheduler.DAGScheduler: Parents of final stage: List()
20/02/10 10:33:06 INFO scheduler.DAGScheduler: Missing parents: List()
20/02/10 10:33:06 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3814), which has no missing parents
20/02/10 10:33:06 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 85.6 KB, free 365.9 MB)
20/02/10 10:33:06 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 31.6 KB, free 365.9 MB)
20/02/10 10:33:06 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:46155 (size: 31.6 KB, free: 366.2 MB)
20/02/10 10:33:06 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/02/10 10:33:06 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
20/02/10 10:33:06 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/02/10 10:33:06 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7441 bytes)
20/02/10 10:33:06 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/02/10 10:33:07 INFO executor.Executor: Fetching spark://192.168.10.31:38581/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359583277
20/02/10 10:33:07 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:38581 after 39 ms (0 ms spent in bootstraps)
20/02/10 10:33:07 INFO util.Utils: Fetching spark://192.168.10.31:38581/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to /tmp/spark-e9b6d33b-c332-493f-b01e-c21027465b34/userFiles-4b9faf99-6d65-43a3-843b-a49db8f58360/fetchFileTemp2383667808408820385.tmp
20/02/10 10:33:07 INFO executor.Executor: Adding file:/tmp/spark-e9b6d33b-c332-493f-b01e-c21027465b34/userFiles-4b9faf99-6d65-43a3-843b-a49db8f58360/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to class loader
20/02/10 10:33:07 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam:83361792-833134657535
20/02/10 10:33:07 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:07 INFO codec.CodecConfig: Compression: GZIP
20/02/10 10:33:07 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Dictionary is on
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Validation is off
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
20/02/10 10:33:07 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
20/02/10 10:33:07 INFO compress.CodecPool: Got brand-new compressor [.gz]
Ignoring SAM validation error: ERROR: Record 162622, Read name 613F0AAXX100423:3:58:9979:16082, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162624, Read name 613F0AAXX100423:6:13:3141:11793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162625, Read name 613F0AAXX100423:8:39:18592:13552, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162635, Read name 613F1AAXX100423:7:2:13114:10698, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162637, Read name 613F1AAXX100423:6:100:8840:11167, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162639, Read name 613F1AAXX100423:8:15:10944:11181, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162640, Read name 613F1AAXX100423:8:17:5740:10104, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162651, Read name 613F1AAXX100423:1:53:11097:8261, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162654, Read name 613F1AAXX100423:2:112:16779:19612, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162657, Read name 613F0AAXX100423:8:28:7084:17683, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162659, Read name 613F0AAXX100423:8:39:19796:12794, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162662, Read name 613F1AAXX100423:5:116:9339:3264, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162667, Read name 613F0AAXX100423:4:67:2015:3054, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162669, Read name 613F0AAXX100423:7:7:11297:11738, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162674, Read name 613F0AAXX100423:6:59:10490:20829, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162678, Read name 613F1AAXX100423:8:11:17603:4766, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162682, Read name 613F0AAXX100423:5:86:10814:10257, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162683, Read name 613F0AAXX100423:5:117:14178:6111, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162685, Read name 613F0AAXX100423:2:3:13563:6720, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162689, Read name 613F0AAXX100423:7:59:16009:15799, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162696, Read name 613F0AAXX100423:5:31:9663:18252, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162698, Read name 613F1AAXX100423:2:27:12264:14626, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162699, Read name 613F0AAXX100423:1:120:19003:6647, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162702, Read name 613F1AAXX100423:3:37:6972:18407, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162704, Read name 613F1AAXX100423:3:77:6946:3880, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162706, Read name 613F0AAXX100423:7:48:2692:3492, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162708, Read name 613F1AAXX100423:7:80:8790:1648, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162710, Read name 6141AAAXX100423:5:30:15036:17610, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162712, Read name 613F1AAXX100423:8:80:6261:4465, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162713, Read name 6141AAAXX100423:5:74:5542:6195, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162715, Read name 613F1AAXX100423:5:14:14844:13639, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162718, Read name 613F1AAXX100423:7:112:14569:8480, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162725, Read name 613F1AAXX100423:4:56:10160:9879, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162727, Read name 6141AAAXX100423:7:89:12209:9221, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162731, Read name 6141AAAXX100423:6:55:1590:19793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162732, Read name 6141AAAXX100423:7:102:16679:12368, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162734, Read name 613F1AAXX100423:2:7:4909:18472, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162737, Read name 6141AAAXX100423:4:73:6574:10572, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162741, Read name 6141AAAXX100423:1:8:14113:12655, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162743, Read name 6141AAAXX100423:3:40:7990:5056, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162744, Read name 6141AAAXX100423:4:36:15793:3411, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162745, Read name 6141AAAXX100423:8:83:1139:18985, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162746, Read name 6141AAAXX100423:5:7:18196:13562, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162748, Read name 6141AAAXX100423:3:114:5639:7123, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162751, Read name 6141AAAXX100423:7:47:4898:8640, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162753, Read name 6141AAAXX100423:3:64:8064:8165, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162756, Read name 613F1AAXX100423:1:105:14386:1684, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162757, Read name 613F1AAXX100423:6:98:1237:19470, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162761, Read name 613F1AAXX100423:7:106:19658:9261, MAPQ should be 0 for unmapped read.
20/02/10 10:33:16 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16043959
20/02/10 10:33:17 INFO output.FileOutputCommitter: Saved output of task 'attempt_20200210103306_0002_r_000000_0' to file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/_temporary/0/task_20200210103306_0002_r_000000
20/02/10 10:33:17 INFO mapred.SparkHadoopMapRedUtil: attempt_20200210103306_0002_r_000000_0: Committed
20/02/10 10:33:17 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 856 bytes result sent to driver
20/02/10 10:33:17 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 10104 ms on localhost (executor driver) (1/1)
20/02/10 10:33:17 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/02/10 10:33:17 INFO scheduler.DAGScheduler: ResultStage 0 (runJob at SparkHadoopWriter.scala:78) finished in 10.250 s
20/02/10 10:33:17 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 10.326711 s
20/02/10 10:33:17 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
20/02/10 10:33:17 INFO io.SparkHadoopWriter: Job job_20200210103306_0002 committed.
20/02/10 10:33:17 INFO cli.TransformAlignments: Overall Duration: 14.99 secs
20/02/10 10:33:17 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/02/10 10:33:17 INFO server.AbstractConnector: Stopped Spark@6e9c413e{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:33:17 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/02/10 10:33:17 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/02/10 10:33:17 INFO memory.MemoryStore: MemoryStore cleared
20/02/10 10:33:17 INFO storage.BlockManager: BlockManager stopped
20/02/10 10:33:17 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/02/10 10:33:17 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/02/10 10:33:17 INFO spark.SparkContext: Successfully stopped SparkContext
20/02/10 10:33:17 INFO util.ShutdownHookManager: Shutdown hook called
20/02/10 10:33:17 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-e157595f-79e5-4a92-9f08-25931aa25fb6
20/02/10 10:33:17 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-e9b6d33b-c332-493f-b01e-c21027465b34
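The run that just completed and the sorted run that follows differ only in an extra flag placed before the input and output paths. A sketch of how those command lines are assembled (the `adam_transform_cmd` helper is hypothetical and only echoes the command line; it does not launch Spark):

```shell
# Hypothetical helper: prints the adam-submit command line for a
# transformAlignments run, with any extra flags before the paths.
adam_transform_cmd() {
    in="$1"; out="$2"; shift 2
    if [ "$#" -gt 0 ]; then
        echo "./bin/adam-submit transformAlignments $* ${in} ${out}"
    else
        echo "./bin/adam-submit transformAlignments ${in} ${out}"
    fi
}
```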

# then, sort the BAM
echo "Converting BAM to ADAM read format with sorting"
+ echo 'Converting BAM to ADAM read format with sorting'
Converting BAM to ADAM read format with sorting
rm -rf ${SORTED_READS}
+ rm -rf mouse_chrM.bam.reads.sorted.adam
${ADAM} transformAlignments -sort_by_reference_position ${READS} ${SORTED_READS}
+ ./bin/adam-submit transformAlignments -sort_by_reference_position mouse_chrM.bam.reads.adam mouse_chrM.bam.reads.sorted.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.5-bin-without-hadoop-scala-2.12/bin/spark-submit
20/02/10 10:33:19 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/02/10 10:33:19 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/02/10 10:33:19 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/02/10 10:33:19 INFO cli.ADAMMain: ADAM invoked with args: "transformAlignments" "-sort_by_reference_position" "mouse_chrM.bam.reads.adam" "mouse_chrM.bam.reads.sorted.adam"
20/02/10 10:33:20 INFO spark.SparkContext: Running Spark version 2.4.5
20/02/10 10:33:20 INFO spark.SparkContext: Submitted application: transformAlignments
20/02/10 10:33:20 INFO spark.SecurityManager: Changing view acls to: jenkins
20/02/10 10:33:20 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/02/10 10:33:20 INFO spark.SecurityManager: Changing view acls groups to: 
20/02/10 10:33:20 INFO spark.SecurityManager: Changing modify acls groups to: 
20/02/10 10:33:20 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/02/10 10:33:20 INFO util.Utils: Successfully started service 'sparkDriver' on port 45811.
20/02/10 10:33:20 INFO spark.SparkEnv: Registering MapOutputTracker
20/02/10 10:33:20 INFO spark.SparkEnv: Registering BlockManagerMaster
20/02/10 10:33:20 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/02/10 10:33:20 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/02/10 10:33:20 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-6999e9d8-cc6e-49f5-a0fa-5dd87bb6497c
20/02/10 10:33:20 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/02/10 10:33:20 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/02/10 10:33:20 INFO util.log: Logging initialized @2535ms
20/02/10 10:33:20 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/02/10 10:33:20 INFO server.Server: Started @2621ms
20/02/10 10:33:20 INFO server.AbstractConnector: Started ServerConnector@57a4d5ee{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:33:20 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a71c100{/jobs,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/jobs/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@420bc288{/jobs/job,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@308a6984{/jobs/job/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/stages,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a34b7b8{/stages/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58cd06cb{/stages/stage,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b65e559{/stages/stage/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/stages/pool,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@74a9c4b0{/stages/pool/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/storage,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c05a54d{/storage/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65ef722a{/storage/rdd,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fd9b663{/storage/rdd/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@214894fc{/environment,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10567255{/environment/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/executors,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c4ee95c{/executors/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79c4715d{/executors/threadDump,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5aa360ea{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6548bb7d{/static,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@d71adc2{/,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3add81c4{/api,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@57dc9128{/jobs/job/kill,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24528a25{/stages/stage/kill,null,AVAILABLE,@Spark}
20/02/10 10:33:20 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/02/10 10:33:20 INFO spark.SparkContext: Added JAR file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar at spark://192.168.10.31:45811/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359600826
20/02/10 10:33:20 INFO executor.Executor: Starting executor ID driver on host localhost
20/02/10 10:33:21 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45287.
20/02/10 10:33:21 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:45287
20/02/10 10:33:21 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/02/10 10:33:21 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 45287, None)
20/02/10 10:33:21 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:45287 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 45287, None)
20/02/10 10:33:21 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 45287, None)
20/02/10 10:33:21 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 45287, None)
20/02/10 10:33:21 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66596a88{/metrics/json,null,AVAILABLE,@Spark}
20/02/10 10:33:21 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
20/02/10 10:33:23 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
20/02/10 10:33:23 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 315.0 KB, free 366.0 MB)
20/02/10 10:33:24 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.1 KB, free 366.0 MB)
20/02/10 10:33:24 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:45287 (size: 28.1 KB, free: 366.3 MB)
20/02/10 10:33:24 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1801
20/02/10 10:33:24 INFO cli.TransformAlignments: Sorting alignments by reference position, with references ordered by name
20/02/10 10:33:24 INFO read.RDDBoundAlignmentDataset: Sorting alignments by reference position
20/02/10 10:33:24 INFO input.FileInputFormat: Total input paths to process : 1
20/02/10 10:33:24 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
20/02/10 10:33:24 INFO read.RDDBoundAlignmentDataset: Saving data in ADAM format
20/02/10 10:33:24 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:24 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
20/02/10 10:33:24 INFO scheduler.DAGScheduler: Registering RDD 2 (sortBy at AlignmentDataset.scala:1008) as input to shuffle 0
20/02/10 10:33:24 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
20/02/10 10:33:24 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
20/02/10 10:33:24 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
20/02/10 10:33:24 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
20/02/10 10:33:24 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentDataset.scala:1008), which has no missing parents
20/02/10 10:33:24 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.9 KB, free 366.0 MB)
20/02/10 10:33:24 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.4 KB, free 366.0 MB)
20/02/10 10:33:24 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:45287 (size: 3.4 KB, free: 366.3 MB)
20/02/10 10:33:24 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/02/10 10:33:24 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentDataset.scala:1008) (first 15 tasks are for partitions Vector(0))
20/02/10 10:33:24 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/02/10 10:33:24 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7480 bytes)
20/02/10 10:33:24 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/02/10 10:33:24 INFO executor.Executor: Fetching spark://192.168.10.31:45811/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359600826
20/02/10 10:33:24 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:45811 after 35 ms (0 ms spent in bootstraps)
20/02/10 10:33:24 INFO util.Utils: Fetching spark://192.168.10.31:45811/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to /tmp/spark-ba751115-11db-4c47-baa2-4b19653aeb94/userFiles-5c50b20b-0af0-4944-8f02-fdbc44923423/fetchFileTemp5142225177308491892.tmp
20/02/10 10:33:25 INFO executor.Executor: Adding file:/tmp/spark-ba751115-11db-4c47-baa2-4b19653aeb94/userFiles-5c50b20b-0af0-4944-8f02-fdbc44923423/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to class loader
20/02/10 10:33:25 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
20/02/10 10:33:25 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
20/02/10 10:33:25 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
20/02/10 10:33:25 INFO compress.CodecPool: Got brand-new decompressor [.gz]
20/02/10 10:33:25 INFO hadoop.InternalParquetRecordReader: block read in memory in 46 ms. row count = 163064
20/02/10 10:33:28 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 919 bytes result sent to driver
20/02/10 10:33:28 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3977 ms on localhost (executor driver) (1/1)
20/02/10 10:33:28 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/02/10 10:33:28 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (sortBy at AlignmentDataset.scala:1008) finished in 4.192 s
20/02/10 10:33:28 INFO scheduler.DAGScheduler: looking for newly runnable stages
20/02/10 10:33:28 INFO scheduler.DAGScheduler: running: Set()
20/02/10 10:33:28 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
20/02/10 10:33:28 INFO scheduler.DAGScheduler: failed: Set()
20/02/10 10:33:28 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3814), which has no missing parents
20/02/10 10:33:28 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 86.8 KB, free 365.9 MB)
20/02/10 10:33:28 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 32.4 KB, free 365.8 MB)
20/02/10 10:33:28 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:45287 (size: 32.4 KB, free: 366.2 MB)
20/02/10 10:33:28 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1163
20/02/10 10:33:28 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
20/02/10 10:33:28 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
20/02/10 10:33:28 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7141 bytes)
20/02/10 10:33:28 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
20/02/10 10:33:28 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
20/02/10 10:33:28 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 8 ms
20/02/10 10:33:30 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:30 INFO codec.CodecConfig: Compression: GZIP
20/02/10 10:33:30 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Dictionary is on
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Validation is off
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
20/02/10 10:33:30 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
20/02/10 10:33:30 INFO compress.CodecPool: Got brand-new compressor [.gz]
20/02/10 10:33:32 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.10.31:45287 in memory (size: 3.4 KB, free: 366.2 MB)
20/02/10 10:33:35 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16004474
20/02/10 10:33:35 INFO output.FileOutputCommitter: Saved output of task 'attempt_20200210103324_0005_r_000000_0' to file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.sorted.adam/_temporary/0/task_20200210103324_0005_r_000000
20/02/10 10:33:35 INFO mapred.SparkHadoopMapRedUtil: attempt_20200210103324_0005_r_000000_0: Committed
20/02/10 10:33:35 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1243 bytes result sent to driver
20/02/10 10:33:35 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 7176 ms on localhost (executor driver) (1/1)
20/02/10 10:33:35 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
20/02/10 10:33:35 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 7.243 s
20/02/10 10:33:35 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 11.548152 s
20/02/10 10:33:36 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
20/02/10 10:33:36 INFO io.SparkHadoopWriter: Job job_20200210103324_0005 committed.
20/02/10 10:33:36 INFO cli.TransformAlignments: Overall Duration: 16.18 secs
20/02/10 10:33:36 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/02/10 10:33:36 INFO server.AbstractConnector: Stopped Spark@57a4d5ee{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:33:36 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/02/10 10:33:36 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/02/10 10:33:36 INFO memory.MemoryStore: MemoryStore cleared
20/02/10 10:33:36 INFO storage.BlockManager: BlockManager stopped
20/02/10 10:33:36 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/02/10 10:33:36 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/02/10 10:33:36 INFO spark.SparkContext: Successfully stopped SparkContext
20/02/10 10:33:36 INFO util.ShutdownHookManager: Shutdown hook called
20/02/10 10:33:36 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-924988c7-3fc5-4251-9ff3-896c747437c1
20/02/10 10:33:36 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-ba751115-11db-4c47-baa2-4b19653aeb94

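The run above finishes with `FileOutputCommitter` promoting the task attempt out of `_temporary` and `SparkHadoopWriter` reporting the job committed. A quick way to sanity-check such a commit from the shell is to look for the committed part files and the absence of a leftover `_temporary` directory (Hadoop's committer also writes a `_SUCCESS` marker by default). A minimal sketch, using a temp directory as a stand-in for the real `mouse_chrM.bam.reads.sorted.adam` output:

```shell
# Stand-in for the committed Spark output directory from the run above.
OUT="$(mktemp -d)/mouse_chrM.bam.reads.sorted.adam"
mkdir -p "$OUT"
# Simulate what a committed job leaves behind: part files plus _SUCCESS,
# with the _temporary staging directory removed.
touch "$OUT/_SUCCESS" "$OUT/part-r-00000.gz.parquet"

if [ -e "$OUT/_SUCCESS" ] && [ ! -d "$OUT/_temporary" ]; then
  echo "commit looks complete"
fi
```

A surviving `_temporary` directory usually means the job was interrupted before the output committer finished.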
# convert the reads to fragments to re-pair the reads
echo "Converting read file to fragments"
+ echo 'Converting read file to fragments'
Converting read file to fragments
rm -rf ${FRAGMENTS}
+ rm -rf mouse_chrM.bam.fragments.adam
${ADAM} transformFragments -load_as_alignments ${READS} ${FRAGMENTS}
+ ./bin/adam-submit transformFragments -load_as_alignments mouse_chrM.bam.reads.adam mouse_chrM.bam.fragments.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.5-bin-without-hadoop-scala-2.12/bin/spark-submit
20/02/10 10:33:38 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/02/10 10:33:38 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/02/10 10:33:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/02/10 10:33:38 INFO cli.ADAMMain: ADAM invoked with args: "transformFragments" "-load_as_alignments" "mouse_chrM.bam.reads.adam" "mouse_chrM.bam.fragments.adam"
20/02/10 10:33:38 INFO spark.SparkContext: Running Spark version 2.4.5
20/02/10 10:33:38 INFO spark.SparkContext: Submitted application: transformFragments
20/02/10 10:33:38 INFO spark.SecurityManager: Changing view acls to: jenkins
20/02/10 10:33:38 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/02/10 10:33:38 INFO spark.SecurityManager: Changing view acls groups to: 
20/02/10 10:33:38 INFO spark.SecurityManager: Changing modify acls groups to: 
20/02/10 10:33:38 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/02/10 10:33:39 INFO util.Utils: Successfully started service 'sparkDriver' on port 38241.
20/02/10 10:33:39 INFO spark.SparkEnv: Registering MapOutputTracker
20/02/10 10:33:39 INFO spark.SparkEnv: Registering BlockManagerMaster
20/02/10 10:33:39 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/02/10 10:33:39 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/02/10 10:33:39 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-c7acf534-7704-4dcd-9ff9-4d3f5396ec32
20/02/10 10:33:39 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/02/10 10:33:39 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/02/10 10:33:39 INFO util.log: Logging initialized @2440ms
20/02/10 10:33:39 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/02/10 10:33:39 INFO server.Server: Started @2519ms
20/02/10 10:33:39 INFO server.AbstractConnector: Started ServerConnector@1e53135d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:33:39 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b739528{/jobs,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61e3a1fd{/jobs/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51abf713{/jobs/job,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4d4d48a6{/jobs/job/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@315df4bb{/stages,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3fc08eec{/stages/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5cad8b7d{/stages/stage,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e287667{/stages/stage/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2e6ee0bc{/stages/pool,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4201a617{/stages/pool/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@467f77a5{/storage,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/storage/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@420bc288{/storage/rdd,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df5f5c0{/storage/rdd/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@308a6984{/environment,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/environment/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a34b7b8{/executors,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58cd06cb{/executors/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3be8821f{/executors/threadDump,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b65e559{/static,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49bd54f7{/,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b5f8707{/api,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e5bfdfc{/jobs/job/kill,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d829787{/stages/stage/kill,null,AVAILABLE,@Spark}
20/02/10 10:33:39 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/02/10 10:33:39 INFO spark.SparkContext: Added JAR file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar at spark://192.168.10.31:38241/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359619602
20/02/10 10:33:39 INFO executor.Executor: Starting executor ID driver on host localhost
20/02/10 10:33:39 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33155.
20/02/10 10:33:39 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:33155
20/02/10 10:33:39 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/02/10 10:33:39 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 33155, None)
20/02/10 10:33:39 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:33155 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 33155, None)
20/02/10 10:33:39 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 33155, None)
20/02/10 10:33:39 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 33155, None)
20/02/10 10:33:39 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79a1728c{/metrics/json,null,AVAILABLE,@Spark}
20/02/10 10:33:40 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
20/02/10 10:33:42 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
20/02/10 10:33:42 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 315.0 KB, free 366.0 MB)
20/02/10 10:33:43 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.1 KB, free 366.0 MB)
20/02/10 10:33:43 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:33155 (size: 28.1 KB, free: 366.3 MB)
20/02/10 10:33:43 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1801
20/02/10 10:33:43 INFO input.FileInputFormat: Total input paths to process : 1
20/02/10 10:33:43 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
20/02/10 10:33:43 INFO fragment.RDDBoundFragmentDataset: Saving data in ADAM format
20/02/10 10:33:43 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:43 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
20/02/10 10:33:43 INFO scheduler.DAGScheduler: Registering RDD 2 (groupBy at SingleReadBucket.scala:97) as input to shuffle 0
20/02/10 10:33:43 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
20/02/10 10:33:43 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
20/02/10 10:33:43 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
20/02/10 10:33:43 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
20/02/10 10:33:43 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97), which has no missing parents
20/02/10 10:33:43 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 6.4 KB, free 366.0 MB)
20/02/10 10:33:43 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.5 KB, free 366.0 MB)
20/02/10 10:33:43 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:33155 (size: 3.5 KB, free: 366.3 MB)
20/02/10 10:33:43 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/02/10 10:33:43 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97) (first 15 tasks are for partitions Vector(0))
20/02/10 10:33:43 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/02/10 10:33:43 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7480 bytes)
20/02/10 10:33:43 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/02/10 10:33:43 INFO executor.Executor: Fetching spark://192.168.10.31:38241/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359619602
20/02/10 10:33:43 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:38241 after 45 ms (0 ms spent in bootstraps)
20/02/10 10:33:43 INFO util.Utils: Fetching spark://192.168.10.31:38241/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to /tmp/spark-93df1475-0c95-49e3-bb65-9a0b97bc6eb6/userFiles-47c50963-4807-4a8a-9567-4230664830d5/fetchFileTemp2383600045680094222.tmp
20/02/10 10:33:44 INFO executor.Executor: Adding file:/tmp/spark-93df1475-0c95-49e3-bb65-9a0b97bc6eb6/userFiles-47c50963-4807-4a8a-9567-4230664830d5/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to class loader
20/02/10 10:33:44 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
20/02/10 10:33:44 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
20/02/10 10:33:44 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
20/02/10 10:33:44 INFO compress.CodecPool: Got brand-new decompressor [.gz]
20/02/10 10:33:44 INFO hadoop.InternalParquetRecordReader: block read in memory in 46 ms. row count = 163064
20/02/10 10:33:47 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 962 bytes result sent to driver
20/02/10 10:33:47 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3985 ms on localhost (executor driver) (1/1)
20/02/10 10:33:47 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/02/10 10:33:47 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (groupBy at SingleReadBucket.scala:97) finished in 4.121 s
20/02/10 10:33:47 INFO scheduler.DAGScheduler: looking for newly runnable stages
20/02/10 10:33:47 INFO scheduler.DAGScheduler: running: Set()
20/02/10 10:33:47 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
20/02/10 10:33:47 INFO scheduler.DAGScheduler: failed: Set()
20/02/10 10:33:47 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3814), which has no missing parents
20/02/10 10:33:47 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 90.2 KB, free 365.9 MB)
20/02/10 10:33:47 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 33.5 KB, free 365.8 MB)
20/02/10 10:33:47 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:33155 (size: 33.5 KB, free: 366.2 MB)
20/02/10 10:33:47 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1163
20/02/10 10:33:47 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
20/02/10 10:33:47 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
20/02/10 10:33:47 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7141 bytes)
20/02/10 10:33:47 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
20/02/10 10:33:47 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
20/02/10 10:33:47 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 9 ms
20/02/10 10:33:49 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:49 INFO codec.CodecConfig: Compression: GZIP
20/02/10 10:33:49 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Dictionary is on
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Validation is off
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
20/02/10 10:33:49 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
20/02/10 10:33:49 INFO compress.CodecPool: Got brand-new compressor [.gz]
20/02/10 10:33:50 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.10.31:33155 in memory (size: 3.5 KB, free: 366.2 MB)
20/02/10 10:33:56 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 21417928
20/02/10 10:33:56 INFO output.FileOutputCommitter: Saved output of task 'attempt_20200210103343_0006_r_000000_0' to file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.fragments.adam/_temporary/0/task_20200210103343_0006_r_000000
20/02/10 10:33:56 INFO mapred.SparkHadoopMapRedUtil: attempt_20200210103343_0006_r_000000_0: Committed
20/02/10 10:33:56 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1200 bytes result sent to driver
20/02/10 10:33:56 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 8931 ms on localhost (executor driver) (1/1)
20/02/10 10:33:56 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
20/02/10 10:33:56 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 8.989 s
20/02/10 10:33:56 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 13.227325 s
20/02/10 10:33:56 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
20/02/10 10:33:56 INFO io.SparkHadoopWriter: Job job_20200210103343_0006 committed.
20/02/10 10:33:56 INFO cli.TransformFragments: Overall Duration: 18.15 secs
20/02/10 10:33:56 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/02/10 10:33:56 INFO server.AbstractConnector: Stopped Spark@1e53135d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:33:56 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/02/10 10:33:56 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/02/10 10:33:56 INFO memory.MemoryStore: MemoryStore cleared
20/02/10 10:33:56 INFO storage.BlockManager: BlockManager stopped
20/02/10 10:33:56 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/02/10 10:33:56 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/02/10 10:33:57 INFO spark.SparkContext: Successfully stopped SparkContext
20/02/10 10:33:57 INFO util.ShutdownHookManager: Shutdown hook called
20/02/10 10:33:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-93df1475-0c95-49e3-bb65-9a0b97bc6eb6
20/02/10 10:33:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-95083872-af91-4adc-9619-688bc450fabb

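Note the `rm -rf ${FRAGMENTS}` before each transform: Hadoop's output format refuses to write into an existing directory, so the script clears the target first to keep reruns idempotent. A minimal sketch of that pattern, with a temp directory standing in for the script's `${FRAGMENTS}` path:

```shell
# Hypothetical stand-in for the script's ${FRAGMENTS} output directory.
FRAGMENTS="$(mktemp -d)/mouse_chrM.bam.fragments.adam"
mkdir -p "$FRAGMENTS"   # simulate leftovers from a previous run

# Clear any stale output before re-running the transform; the Hadoop
# writer would otherwise fail with "output directory already exists".
rm -rf "$FRAGMENTS"
echo "cleared: $FRAGMENTS"
```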
# test that printing works
echo "Printing reads and fragments"
+ echo 'Printing reads and fragments'
Printing reads and fragments
${ADAM} print ${READS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.reads.adam
${ADAM} print ${FRAGMENTS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.fragments.adam

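The `print` step discards both stdout and stderr and relies solely on the exit status, i.e. it is a pure smoke test: under the script's `set -e`, a non-zero exit from `adam-submit` would abort the run. A sketch of the same idiom, with a hypothetical `fake_print` function standing in for the ADAM command:

```shell
# fake_print is a hypothetical stand-in for `adam-submit print <dataset>`:
# it writes data to stdout and a warning to stderr, like a real run would.
fake_print() { echo "record data"; echo "some warning" >&2; }

# Discard both streams; only the exit status matters for the smoke test.
if fake_print 1>/dev/null 2>/dev/null; then
  echo "print smoke test passed"
fi
```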
# run flagstat to verify that it completes successfully
echo "Printing read statistics"
+ echo 'Printing read statistics'
Printing read statistics
${ADAM} flagstat -print_metrics ${READS}
+ ./bin/adam-submit flagstat -print_metrics mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.5-bin-without-hadoop-scala-2.12/bin/spark-submit
20/02/10 10:34:17 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
20/02/10 10:34:17 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/02/10 10:34:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/02/10 10:34:18 INFO cli.ADAMMain: ADAM invoked with args: "flagstat" "-print_metrics" "mouse_chrM.bam.reads.adam"
20/02/10 10:34:18 INFO spark.SparkContext: Running Spark version 2.4.5
20/02/10 10:34:18 INFO spark.SparkContext: Submitted application: flagstat
20/02/10 10:34:18 INFO spark.SecurityManager: Changing view acls to: jenkins
20/02/10 10:34:18 INFO spark.SecurityManager: Changing modify acls to: jenkins
20/02/10 10:34:18 INFO spark.SecurityManager: Changing view acls groups to: 
20/02/10 10:34:18 INFO spark.SecurityManager: Changing modify acls groups to: 
20/02/10 10:34:18 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/02/10 10:34:18 INFO util.Utils: Successfully started service 'sparkDriver' on port 43959.
20/02/10 10:34:18 INFO spark.SparkEnv: Registering MapOutputTracker
20/02/10 10:34:18 INFO spark.SparkEnv: Registering BlockManagerMaster
20/02/10 10:34:18 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/02/10 10:34:18 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/02/10 10:34:18 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-d01b6b85-b8a0-449a-8109-77283930941e
20/02/10 10:34:18 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
20/02/10 10:34:18 INFO spark.SparkEnv: Registering OutputCommitCoordinator
20/02/10 10:34:19 INFO util.log: Logging initialized @2754ms
20/02/10 10:34:19 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/02/10 10:34:19 INFO server.Server: Started @2849ms
20/02/10 10:34:19 INFO server.AbstractConnector: Started ServerConnector@6f6a7463{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:34:19 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@19868320{/jobs,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63a5e46c{/jobs/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e8e8651{/jobs/job,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@271f18d3{/jobs/job/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bd51ed8{/stages,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61e3a1fd{/stages/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51abf713{/stages/stage,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@315df4bb{/stages/stage/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3fc08eec{/stages/pool,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5cad8b7d{/stages/pool/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b02e036{/storage,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25243bc1{/storage/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e287667{/storage/rdd,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2e6ee0bc{/storage/rdd/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4201a617{/environment,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@467f77a5{/environment/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/executors,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@420bc288{/executors/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df5f5c0{/executors/threadDump,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@308a6984{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/static,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e27ba81{/,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54336c81{/api,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49bd54f7{/jobs/job/kill,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b5f8707{/stages/stage/kill,null,AVAILABLE,@Spark}
20/02/10 10:34:19 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
20/02/10 10:34:19 INFO spark.SparkContext: Added JAR file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar at spark://192.168.10.31:43959/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359659255
20/02/10 10:34:19 INFO executor.Executor: Starting executor ID driver on host localhost
20/02/10 10:34:19 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40419.
20/02/10 10:34:19 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:40419
20/02/10 10:34:19 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/02/10 10:34:19 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 40419, None)
20/02/10 10:34:19 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:40419 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 40419, None)
20/02/10 10:34:19 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 40419, None)
20/02/10 10:34:19 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 40419, None)
20/02/10 10:34:19 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6fca2a8f{/metrics/json,null,AVAILABLE,@Spark}
20/02/10 10:34:20 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
20/02/10 10:34:20 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
20/02/10 10:34:20 INFO rdd.ADAMContext: Using the specified projection schema
20/02/10 10:34:20 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 322.5 KB, free 366.0 MB)
20/02/10 10:34:21 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 29.1 KB, free 366.0 MB)
20/02/10 10:34:21 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:40419 (size: 29.1 KB, free: 366.3 MB)
20/02/10 10:34:21 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1801
20/02/10 10:34:22 INFO input.FileInputFormat: Total input paths to process : 1
20/02/10 10:34:22 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
20/02/10 10:34:22 INFO spark.SparkContext: Starting job: aggregate at FlagStat.scala:115
20/02/10 10:34:22 INFO scheduler.DAGScheduler: Got job 0 (aggregate at FlagStat.scala:115) with 1 output partitions
20/02/10 10:34:22 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (aggregate at FlagStat.scala:115)
20/02/10 10:34:22 INFO scheduler.DAGScheduler: Parents of final stage: List()
20/02/10 10:34:22 INFO scheduler.DAGScheduler: Missing parents: List()
20/02/10 10:34:22 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[4] at map at FlagStat.scala:96), which has no missing parents
20/02/10 10:34:22 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 13.6 KB, free 365.9 MB)
20/02/10 10:34:22 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 6.0 KB, free 365.9 MB)
20/02/10 10:34:22 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:40419 (size: 6.0 KB, free: 366.3 MB)
20/02/10 10:34:22 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
20/02/10 10:34:22 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[4] at map at FlagStat.scala:96) (first 15 tasks are for partitions Vector(0))
20/02/10 10:34:22 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
20/02/10 10:34:22 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7491 bytes)
20/02/10 10:34:22 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
20/02/10 10:34:22 INFO executor.Executor: Fetching spark://192.168.10.31:43959/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar with timestamp 1581359659255
20/02/10 10:34:23 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:43959 after 32 ms (0 ms spent in bootstraps)
20/02/10 10:34:23 INFO util.Utils: Fetching spark://192.168.10.31:43959/jars/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to /tmp/spark-4de4ba53-86dd-47ce-8921-623f3cf6ceae/userFiles-62ab9049-553b-4407-adee-e00358f33f41/fetchFileTemp1290764546656693068.tmp
20/02/10 10:34:23 INFO executor.Executor: Adding file:/tmp/spark-4de4ba53-86dd-47ce-8921-623f3cf6ceae/userFiles-62ab9049-553b-4407-adee-e00358f33f41/adam-assembly-spark2_2.12-0.31.0-SNAPSHOT.jar to class loader
20/02/10 10:34:23 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
20/02/10 10:34:23 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
20/02/10 10:34:23 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
20/02/10 10:34:23 INFO compress.CodecPool: Got brand-new decompressor [.gz]
20/02/10 10:34:23 INFO hadoop.InternalParquetRecordReader: block read in memory in 23 ms. row count = 163064
20/02/10 10:34:24 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 9641 bytes result sent to driver
20/02/10 10:34:24 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1880 ms on localhost (executor driver) (1/1)
20/02/10 10:34:24 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
20/02/10 10:34:24 INFO scheduler.DAGScheduler: ResultStage 0 (aggregate at FlagStat.scala:115) finished in 2.003 s
20/02/10 10:34:24 INFO scheduler.DAGScheduler: Job 0 finished: aggregate at FlagStat.scala:115, took 2.067871 s
163064 + 0 in total (QC-passed reads + QC-failed reads)
0 + 0 primary duplicates
0 + 0 primary duplicates - both read and mate mapped
0 + 0 primary duplicates - only read mapped
0 + 0 primary duplicates - cross chromosome
0 + 0 secondary duplicates
0 + 0 secondary duplicates - both read and mate mapped
0 + 0 secondary duplicates - only read mapped
0 + 0 secondary duplicates - cross chromosome
160512 + 0 mapped (98.43%:0.00%)
163064 + 0 paired in sequencing
81524 + 0 read1
81540 + 0 read2
154982 + 0 properly paired (95.04%:0.00%)
158044 + 0 with itself and mate mapped
2468 + 0 singletons (1.51%:0.00%)
418 + 0 with mate mapped to a different chr
120 + 0 with mate mapped to a different chr (mapQ>=5)
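The flagstat report above is internally consistent, and that consistency can be checked by hand. As a sanity check (a sketch only; the counts below are copied from this log, not recomputed from the BAM), read1 + read2 should equal the paired-in-sequencing total, and mapped / total should reproduce the reported 98.43%:

```shell
# Counts copied from the flagstat output above (hypothetical check script).
total=163064      # QC-passed reads
read1=81524
read2=81540
mapped=160512

# read1 + read2 must equal the paired-in-sequencing total
[ $((read1 + read2)) -eq "$total" ] && echo "pair counts consistent"

# mapped / total should reproduce the reported mapped percentage
awk -v m="$mapped" -v t="$total" 'BEGIN { printf "%.2f%%\n", 100 * m / t }'
# prints 98.43%
```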
20/02/10 10:34:24 INFO cli.FlagStat: Overall Duration: 6.57 secs
20/02/10 10:34:24 INFO cli.FlagStat: Metrics:

Timings
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+
|                Metric                | Worker Total | Driver Total | Driver Only | Count  |   Mean    |    Min    |    Max    |
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+
| └─ Load Alignments                   |            - |    2.54 secs |   2.43 secs |      1 | 2.54 secs | 2.54 secs | 2.54 secs |
|     └─ map at ADAMContext.scala:1805 |            - |    106.48 ms |           - |      1 | 106.48 ms | 106.48 ms | 106.48 ms |
|         └─ function call             |     21.55 ms |            - |           - | 163064 |    132 ns |     47 ns | 102.22 µs |
| └─ map at FlagStat.scala:96          |            - |     30.57 ms |           - |      1 |  30.57 ms |  30.57 ms |  30.57 ms |
|     └─ function call                 |    189.06 ms |            - |           - | 163064 |   1.16 µs |    182 ns |      1 ms |
| └─ aggregate at FlagStat.scala:115   |            - |    2.19 secs |           - |      1 | 2.19 secs | 2.19 secs | 2.19 secs |
|     └─ function call                 |     33.11 ms |            - |           - | 163065 |    203 ns |     77 ns | 365.24 µs |
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+

Spark Operations
+----------+---------------------------------+---------------+----------------+--------------+----------+
| Sequence |            Operation            | Is New Stage? | Stage Duration | Driver Total | Stage ID |
+----------+---------------------------------+---------------+----------------+--------------+----------+
| 1        | map at ADAMContext.scala:1805   | false         |              - |    106.48 ms | -        |
| 2        | map at FlagStat.scala:96        | false         |              - |     30.57 ms | -        |
| 3        | aggregate at FlagStat.scala:115 | true          |        -2 secs |    2.19 secs | 0        |
+----------+---------------------------------+---------------+----------------+--------------+----------+

Task Timings
+-------------------------------+------------+-------+-----------+-----------+-----------+
|            Metric             | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+------------+-------+-----------+-----------+-----------+
| Task Duration                 |  1.88 secs |     1 | 1.88 secs | 1.88 secs | 1.88 secs |
| Executor Run Time             |  1.21 secs |     1 | 1.21 secs | 1.21 secs | 1.21 secs |
| Executor Deserialization Time |     585 ms |     1 |    585 ms |    585 ms |    585 ms |
| Result Serialization Time     |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+------------+-------+-----------+-----------+-----------+

Task Timings By Host
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+
|            Metric             |  Host  | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+
| Task Duration                 | driver |  1.88 secs |     1 | 1.88 secs | 1.88 secs | 1.88 secs |
| Executor Run Time             | driver |  1.21 secs |     1 | 1.21 secs | 1.21 secs | 1.21 secs |
| Executor Deserialization Time | driver |     585 ms |     1 |    585 ms |    585 ms |    585 ms |
| Result Serialization Time     | driver |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+

Task Timings By Stage
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+
|            Metric             |          Stage ID & Name           | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+
| Task Duration                 | 0: aggregate at FlagStat.scala:115 |  1.88 secs |     1 | 1.88 secs | 1.88 secs | 1.88 secs |
| Executor Run Time             | 0: aggregate at FlagStat.scala:115 |  1.21 secs |     1 | 1.21 secs | 1.21 secs | 1.21 secs |
| Executor Deserialization Time | 0: aggregate at FlagStat.scala:115 |     585 ms |     1 |    585 ms |    585 ms |    585 ms |
| Result Serialization Time     | 0: aggregate at FlagStat.scala:115 |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+

20/02/10 10:34:24 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/02/10 10:34:24 INFO server.AbstractConnector: Stopped Spark@6f6a7463{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/02/10 10:34:24 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
20/02/10 10:34:24 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/02/10 10:34:24 INFO memory.MemoryStore: MemoryStore cleared
20/02/10 10:34:24 INFO storage.BlockManager: BlockManager stopped
20/02/10 10:34:24 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/02/10 10:34:24 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/02/10 10:34:24 INFO spark.SparkContext: Successfully stopped SparkContext
20/02/10 10:34:24 INFO util.ShutdownHookManager: Shutdown hook called
20/02/10 10:34:24 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-4de4ba53-86dd-47ce-8921-623f3cf6ceae
20/02/10 10:34:24 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-68ea5d95-8dcb-410d-8513-d19fbf7e7527
rm -rf ${ADAM_TMP_DIR}
+ rm -rf /tmp/adamTestn3MEIQ6/deleteMePleaseThisIsNoLongerNeeded
popd
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu

pushd ${PROJECT_ROOT}
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu
# move back to Scala 2.11 as default
if [ ${SCALAVER} == 2.12 ];
then
    set +e
    ./scripts/move_to_scala_2.11.sh
    set -e
fi
+ '[' 2.12 == 2.12 ']'
+ set +e
+ ./scripts/move_to_scala_2.11.sh
+ set -e
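The `set +e` / `set -e` bracketing around `move_to_scala_2.11.sh` temporarily disables bash's errexit so that a non-zero exit from the move script does not abort the whole test run. A minimal standalone sketch of the same pattern (the `false` stands in for any command whose failure is tolerated):

```shell
#!/usr/bin/env bash
set -e                      # abort on any command failure...

set +e                      # ...except inside this window
false                       # a failing command whose exit status we tolerate
status=$?
set -e

echo "tolerated exit status: $status"
# prints: tolerated exit status: 1
```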

# test that the source is formatted correctly
./scripts/format-source
+ ./scripts/format-source
+++ dirname ./scripts/format-source
++ cd ./scripts
++ pwd
+ DIR=/home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu/scripts
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu
+ mvn org.scalariform:scalariform-maven-plugin:format license:format
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=1g; support was removed in 8.0
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.11
[INFO] ADAM_2.11: Shader workaround
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils
[INFO] ADAM_2.11: Core
[INFO] ADAM_2.11: APIs for Java, Python
[INFO] ADAM_2.11: CLI
[INFO] ADAM_2.11: Assembly
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11 0.31.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent-spark2_2.11 ---
[INFO] Modified 2 of 243 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-parent-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Shader workaround 0.31.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-shade-spark2_2.11 ---
[INFO] Modified 0 of 0 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-shade-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Avro-to-Dataset codegen utils 0.31.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-codegen-spark2_2.11 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-codegen-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Core 0.31.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core-spark2_2.11 ---
[INFO] Modified 0 of 203 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-core-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: APIs for Java, Python 0.31.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis-spark2_2.11 ---
[INFO] Modified 0 of 5 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-apis-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: CLI 0.31.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli-spark2_2.11 ---
[INFO] Modified 0 of 29 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-cli-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Assembly 0.31.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-assembly-spark2_2.11 ---
[INFO] Modified 0 of 1 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-assembly-spark2_2.11 ---
[INFO] Updating license headers...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.11 .......................................... SUCCESS [  6.340 s]
[INFO] ADAM_2.11: Shader workaround ....................... SUCCESS [  0.025 s]
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils ........... SUCCESS [  0.059 s]
[INFO] ADAM_2.11: Core .................................... SUCCESS [  3.319 s]
[INFO] ADAM_2.11: APIs for Java, Python ................... SUCCESS [  0.131 s]
[INFO] ADAM_2.11: CLI ..................................... SUCCESS [  0.198 s]
[INFO] ADAM_2.11: Assembly ................................ SUCCESS [  0.019 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.696 s
[INFO] Finished at: 2020-02-10T10:34:38-08:00
[INFO] Final Memory: 21M/1093M
[INFO] ------------------------------------------------------------------------
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu
if test -n "$(git status --porcelain)"
then
    echo "Please run './scripts/format-source'"
    exit 1
fi
git status --porcelain
++ git status --porcelain
+ test -n ''
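The formatting gate above follows a common CI pattern: run the formatter, then fail the build if `git status --porcelain` reports any modified files (empty output means a clean tree). A hypothetical helper wrapping the same check, for reuse outside this script:

```shell
# Hypothetical helper mirroring the formatting gate above: fail when
# `git status --porcelain` reports changes, i.e. the formatter touched files.
require_clean_tree() {
    if test -n "$(git status --porcelain)"; then
        echo "Please run './scripts/format-source'" >&2
        return 1
    fi
    echo "working tree clean"
}
```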
popd    
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.5/label/ubuntu

echo
+ echo

echo "All the tests passed"
+ echo 'All the tests passed'
All the tests passed
echo
+ echo

Recording test results
Publishing Scoverage XML and HTML report...
Setting commit status on GitHub for https://github.com/bigdatagenomics/adam/commit/e363df6b0a2f3fba3277416dc46dde6004ebcd3a
Finished: SUCCESS