Console Output

[Skipping 3,388 KB of earlier output.]
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMSAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMCRAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMBAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMBAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SingleReadBucketSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ParquetUnboundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMBAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/FASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/FASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AlignmentRecordDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/DatasetBoundAlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/RDDBoundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMCRAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/QualityScoreBin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMSAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AnySAMOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/RDDBoundAlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMCRAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/AnySAMInFormatterCompanion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ReferencePositionPairSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMSAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/DatasetBoundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/QualityScoreBin$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMBAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/BAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/InstrumentedADAMSAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ReadDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ParquetUnboundAlignmentRecordDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/ADAMCRAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/read/SAMInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/FullOuterShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RDDBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/MultisampleAvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/MultisampleGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/AvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/DatasetBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/VictimlessSortedIntervalPartitionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/FeatureDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/CoverageDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/FeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/DatasetBoundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/ParquetUnboundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/DatasetBoundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GTFOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/ParquetUnboundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/RDDBoundFeatureDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/GFF3OutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/BEDOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/NarrowPeakInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/RDDBoundCoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/feature/CoverageDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenericConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/SortedIntervalPartitionJoinWithVictims.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/AvroReadGroupGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/InnerTreeRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/RightOuterTreeRegionJoinAndGroupByRight.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/DatasetBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/ParquetUnboundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/FragmentDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/InterleavedFASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/FragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab5InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab5InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab6InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/InterleavedFASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/Tab6InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/fragment/RDDBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/RDDBoundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/DatasetBoundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/DatasetBoundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/ParquetUnboundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/RDDBoundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SequenceDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/SliceDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/sequence/ParquetUnboundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicPositionPartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/GenomicBroadcast.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ADAMVCFOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/GenotypeDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ParquetUnboundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantContextDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/RDDBoundVariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/ParquetUnboundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VCFOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/DatasetBoundVariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/VariantDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/variant/GenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/ReferencePartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rdd/ShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/SequenceDictionaryReader$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ParquetFileTraversable.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ParquetLogger$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMap$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMapSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/TextRddWriter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/ReferenceContigMap.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/IndexedFastaFile.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/FileExtensions$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/GenomeFileReader$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/TwoBitFileSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/util/AttributeUtils$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/DefaultHeaderLines$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/VariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/AlignmentRecordConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/converters/VariantContextConverter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/instrumentation/Timers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/instrumentation/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWatermanConstantGapScoring.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWaterman.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/apache/parquet/avro/class-use/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/apache/parquet/avro/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/python/DataFrameConversionWrapper.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToCoverageDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFeatureConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToAlignmentRecordDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantContextDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToGenotypeDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFragmentDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToSequenceDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFeatureDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToReadDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToAlignmentRecordsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentRecordsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToAlignmentRecordsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToSliceDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariantsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequences$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeaturesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequencesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmers.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlicesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequences.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/MergeShardsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlices.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlices$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/repo/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT-javadoc.jar longer than 100 characters.
[WARNING] Entry: adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/repo/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT-sources.jar longer than 100 characters.
[INFO] Building zip: /tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/adam-distribution/target/adam-distribution-spark2_2.12-0.29.0-SNAPSHOT-bin.zip
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.12 .......................................... SUCCESS [  8.247 s]
[INFO] ADAM_2.12: Shader workaround ....................... SUCCESS [  5.216 s]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils ........... SUCCESS [  5.253 s]
[INFO] ADAM_2.12: Core .................................... SUCCESS [01:15 min]
[INFO] ADAM_2.12: APIs for Java, Python ................... SUCCESS [ 10.703 s]
[INFO] ADAM_2.12: CLI ..................................... SUCCESS [ 12.942 s]
[INFO] ADAM_2.12: Assembly ................................ SUCCESS [ 18.809 s]
[INFO] ADAM_2.12: Python APIs ............................. SUCCESS [01:22 min]
[INFO] ADAM_2.12: R APIs .................................. SUCCESS [01:03 min]
[INFO] ADAM_2.12: Distribution ............................ SUCCESS [ 47.481 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:29 min
[INFO] Finished at: 2019-09-11T12:41:10-07:00
[INFO] Final Memory: 67M/1487M
[INFO] ------------------------------------------------------------------------
+ tar tzvf adam-distribution/target/adam-distribution-spark2_2.12-0.29.0-SNAPSHOT-bin.tar.gz
+ grep egg
+ grep bdgenomics.adam
drwxrwxr-x jenkins/jenkins        0 2019-09-11 12:38 adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/r/bdgenomics.adam.egg-info/
-rw-r--r-- jenkins/jenkins 36825340 2019-09-11 12:38 adam-distribution-spark2_2.12-0.29.0-SNAPSHOT/repo/bdgenomics.adam-0.28.0a0-py3.6.egg
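The ./bin/pyadam shell launched below drives the bdgenomics.adam Python API bundled in the egg listed above. A minimal sketch of the kind of session it runs is shown here; the module path bdgenomics.adam.adamContext, the ADAMContext(spark) constructor, and loadAlignments are assumptions based on the ADAM Python API of this era, and the file path is purely illustrative, not taken from this log.
# Illustrative sketch only: assumes the bdgenomics.adam Python API shown in the egg above.
from pyspark.sql import SparkSession
from bdgenomics.adam.adamContext import ADAMContext

spark = SparkSession.builder.appName("pyadam-smoke-test").getOrCreate()
ac = ADAMContext(spark)                        # wrap the active Spark session
reads = ac.loadAlignments("some_reads.bam")    # hypothetical input; format is inferred from the extension
print(reads.toDF().count())                    # forces a small Spark job, like the Stage 0 progress bar below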
+ ./bin/pyadam
Using PYSPARK=/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.4-bin-without-hadoop-scala-2.12/bin/pyspark
2019-09-11 12:41:14 WARN  Utils:66 - Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
2019-09-11 12:41:14 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-09-11 12:41:14 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2019-09-11 12:41:21 WARN  Utils:66 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.

[Stage 0:>                                                          (0 + 1) / 1]
                                                                                
+ source deactivate
#!/bin/bash

# Determine the directory containing this script
if [[ -n $BASH_VERSION ]]; then
    _SCRIPT_LOCATION=${BASH_SOURCE[0]}
    _SHELL="bash"
elif [[ -n $ZSH_VERSION ]]; then
    _SCRIPT_LOCATION=${funcstack[1]}
    _SHELL="zsh"
else
    echo "Only bash and zsh are supported"
    return 1
fi
++ [[ -n 4.3.48(1)-release ]]
++ _SCRIPT_LOCATION=/home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217/bin/deactivate
++ _SHELL=bash
_CONDA_DIR=$(dirname "$_SCRIPT_LOCATION")
dirname "$_SCRIPT_LOCATION"
+++ dirname /home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217/bin/deactivate
++ _CONDA_DIR=/home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217/bin

case "$(uname -s)" in
    CYGWIN*|MINGW*|MSYS*)
        EXT=".exe"
        export MSYS2_ENV_CONV_EXCL=CONDA_PATH
        ;;
    *)
        EXT=""
        ;;
esac
++ case "$(uname -s)" in
uname -s
+++ uname -s
++ EXT=

# shift over all args.  We don't accept any, so it's OK that we ignore them all here.
while [[ $# > 0 ]]
do
    key="$1"
    case $key in
        -h|--help)
            "$_CONDA_DIR/conda" ..deactivate $_SHELL$EXT -h
            if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
                exit 0
            else
                return 0
            fi
            ;;
    esac
    shift # past argument or value
done
++ [[ 0 > 0 ]]

# Ensure that this script is sourced, not executed
# Note that if the script was executed, we're running inside bash!
# Also note that errors are ignored as `activate foo` doesn't generate a bad
# value for $0 which would cause errors.
if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
    (>&2 echo "Error: deactivate must be sourced. Run 'source deactivate'
instead of 'deactivate'.
")
    "$_CONDA_DIR/conda" ..deactivate $_SHELL$EXT -h
    exit 1
fi
++ [[ -n 4.3.48(1)-release ]]
basename "$0" 2> /dev/null
+++ basename /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu/scripts/jenkins-test
++ [[ jenkins-test == \d\e\a\c\t\i\v\a\t\e ]]

if [[ -z "$CONDA_PATH_BACKUP" ]]; then
    if [[ -n $BASH_VERSION ]] && [[ "$(basename "$0" 2> /dev/null)" == "deactivate" ]]; then
        exit 0
    else
        return 0
    fi
fi
++ [[ -z /usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games ]]

if (( $? == 0 )); then
    # Inverse of activation: run deactivate scripts prior to deactivating env
    _CONDA_D="${CONDA_PREFIX}/etc/conda/deactivate.d"
    if [[ -d $_CONDA_D ]]; then
        eval $(find "$_CONDA_D" -iname "*.sh" -exec echo source \'{}\'';' \;)
    fi

#    # get the activation path that would have been provided for this prefix
#    _LAST_ACTIVATE_PATH=$("$_CONDA_DIR/conda" ..activate $_SHELL$EXT "$CONDA_PREFIX")
#
#    # in activate, we replace a placeholder so that conda keeps its place in the PATH order
#    # The activate script sets _CONDA_HOLD here to activate that behavior.
#    #   Otherwise, PATH is simply removed.
#    if [ -n "$_CONDA_HOLD" ]; then
#        export PATH="$($_CONDA_PYTHON2 -c "import re; print(re.sub(r'$_LAST_ACTIVATE_PATH(:?)', r'CONDA_PATH_PLACEHOLDER\1', '$PATH', 1))")"
#    else
#        export PATH="$($_CONDA_PYTHON2 -c "import re; print(re.sub(r'$_LAST_ACTIVATE_PATH(:?)', r'', '$PATH', 1))")"
#    fi
#
#    unset _LAST_ACTIVATE_PATH

    export PATH=$("$_CONDA_DIR/conda" ..deactivate.path $_SHELL$EXT "$CONDA_PREFIX")

    unset CONDA_DEFAULT_ENV
    unset CONDA_PREFIX
    unset CONDA_PATH_BACKUP
    export PS1="$CONDA_PS1_BACKUP"
    unset CONDA_PS1_BACKUP
    unset _CONDA_PYTHON2
else
    unset _CONDA_PYTHON2
    return $?
fi
++ ((  0 == 0  ))
++ _CONDA_D=/home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217/etc/conda/deactivate.d
++ [[ -d /home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217/etc/conda/deactivate.d ]]
"$_CONDA_DIR/conda" ..deactivate.path $_SHELL$EXT "$CONDA_PREFIX"
+++ /home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217/bin/conda ..deactivate.path bash /home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217
++ export PATH=/usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
++ PATH=/usr/lib/jvm/java-8-oracle/bin/:/usr/lib/jvm/java-8-oracle/bin/:/home/anaconda/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
++ unset CONDA_DEFAULT_ENV
++ unset CONDA_PREFIX
++ unset CONDA_PATH_BACKUP
++ export PS1=
++ PS1=
++ unset CONDA_PS1_BACKUP
++ unset _CONDA_PYTHON2

if [[ -n $BASH_VERSION ]]; then
    hash -r
elif [[ -n $ZSH_VERSION ]]; then
    rehash
fi
++ [[ -n 4.3.48(1)-release ]]
++ hash -r
+ conda remove -y -n adam-build-207727c3-b99c-4941-8689-fefd7b498217 --all

Package plan for package removal in environment /home/jenkins/anaconda2/envs/adam-build-207727c3-b99c-4941-8689-fefd7b498217:

The following packages will be REMOVED:

    _libgcc_mutex:   0.1-main               
    ca-certificates: 2019.5.15-1            
    certifi:         2019.6.16-py36_1       
    libedit:         3.1.20181209-hc058e9b_0
    libffi:          3.2.1-hd88cf55_4       
    libgcc-ng:       9.1.0-hdf63c60_0       
    libstdcxx-ng:    9.1.0-hdf63c60_0       
    ncurses:         6.1-he6710b0_1         
    openssl:         1.1.1d-h7b6447c_0      
    pip:             19.2.2-py36_0          
    python:          3.6.9-h265db76_0       
    readline:        7.0-h7b6447c_5         
    setuptools:      41.0.1-py36_0          
    sqlite:          3.29.0-h7b6447c_0      
    tk:              8.6.8-hbc83047_0       
    wheel:           0.33.4-py36_0          
    xz:              5.2.4-h14c3975_4       
    zlib:            1.2.11-h7b6447c_3      

+ cp -r adam-python/target /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu/scripts/../adam-python/
+ pushd adam-python
/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/adam-python /tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu
+ make clean
pip uninstall -y adam
Cannot uninstall requirement adam, not installed
You are using pip version 19.1.1, however version 19.2.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Makefile:65: recipe for target 'clean_develop' failed
make: [clean_develop] Error 1 (ignored)
rm -rf bdgenomics/*.egg*
rm -rf build/
+ make clean_sdist
rm -rf dist
+ popd
/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu
	
# define filenames
BAM=mouse_chrM.bam
+ BAM=mouse_chrM.bam
READS=${BAM}.reads.adam
+ READS=mouse_chrM.bam.reads.adam
SORTED_READS=${BAM}.reads.sorted.adam
+ SORTED_READS=mouse_chrM.bam.reads.sorted.adam
FRAGMENTS=${BAM}.fragments.adam
+ FRAGMENTS=mouse_chrM.bam.fragments.adam
    
# fetch our input dataset
echo "Fetching BAM file"
+ echo 'Fetching BAM file'
Fetching BAM file
rm -rf ${BAM}
+ rm -rf mouse_chrM.bam
wget -q https://s3.amazonaws.com/bdgenomics-test/${BAM}
+ wget -q https://s3.amazonaws.com/bdgenomics-test/mouse_chrM.bam

# once fetched, convert BAM to ADAM
echo "Converting BAM to ADAM read format"
+ echo 'Converting BAM to ADAM read format'
Converting BAM to ADAM read format
rm -rf ${READS}
+ rm -rf mouse_chrM.bam.reads.adam
${ADAM} transformAlignments ${BAM} ${READS}
+ ./bin/adam-submit transformAlignments mouse_chrM.bam mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.4-bin-without-hadoop-scala-2.12/bin/spark-submit
19/09/11 12:41:38 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
19/09/11 12:41:38 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/09/11 12:41:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/11 12:41:39 INFO cli.ADAMMain: ADAM invoked with args: "transformAlignments" "mouse_chrM.bam" "mouse_chrM.bam.reads.adam"
19/09/11 12:41:39 INFO spark.SparkContext: Running Spark version 2.4.4
19/09/11 12:41:39 INFO spark.SparkContext: Submitted application: transformAlignments
19/09/11 12:41:39 INFO spark.SecurityManager: Changing view acls to: jenkins
19/09/11 12:41:39 INFO spark.SecurityManager: Changing modify acls to: jenkins
19/09/11 12:41:39 INFO spark.SecurityManager: Changing view acls groups to: 
19/09/11 12:41:39 INFO spark.SecurityManager: Changing modify acls groups to: 
19/09/11 12:41:39 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/09/11 12:41:39 INFO util.Utils: Successfully started service 'sparkDriver' on port 37939.
19/09/11 12:41:39 INFO spark.SparkEnv: Registering MapOutputTracker
19/09/11 12:41:39 INFO spark.SparkEnv: Registering BlockManagerMaster
19/09/11 12:41:39 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/09/11 12:41:39 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/09/11 12:41:39 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-3e1ca8ea-40de-4c59-bd57-50148578fea4
19/09/11 12:41:39 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
19/09/11 12:41:39 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/09/11 12:41:39 INFO util.log: Logging initialized @2810ms
19/09/11 12:41:40 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
19/09/11 12:41:40 INFO server.Server: Started @2923ms
19/09/11 12:41:40 INFO server.AbstractConnector: Started ServerConnector@3c321bdb{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:41:40 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15deb1dc{/jobs,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c1e3314{/jobs/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4b770e40{/jobs/job,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54a3ab8f{/jobs/job/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1968a49c{/stages,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a1ebcff{/stages/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@19868320{/stages/stage,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@13c612bd{/stages/stage/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ef41c66{/stages/pool,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b739528{/stages/pool/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@622ef26a{/storage,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@41de5768{/storage/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5f577419{/storage/rdd,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@28fa700e{/storage/rdd/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d526ad9{/environment,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e041f0c{/environment/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a175569{/executors,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@11963225{/executors/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f3c966c{/executors/threadDump,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@11ee02f8{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4102b1b1{/static,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b02e036{/,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25243bc1{/api,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@467f77a5{/jobs/job/kill,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/stages/stage/kill,null,AVAILABLE,@Spark}
19/09/11 12:41:40 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/09/11 12:41:40 INFO spark.SparkContext: Added JAR file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar at spark://192.168.10.31:37939/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230900183
19/09/11 12:41:40 INFO executor.Executor: Starting executor ID driver on host localhost
19/09/11 12:41:40 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38399.
19/09/11 12:41:40 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:38399
19/09/11 12:41:40 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/09/11 12:41:40 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 38399, None)
19/09/11 12:41:40 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:38399 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 38399, None)
19/09/11 12:41:40 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 38399, None)
19/09/11 12:41:40 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 38399, None)
19/09/11 12:41:40 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4c0884e8{/metrics/json,null,AVAILABLE,@Spark}
19/09/11 12:41:41 INFO rdd.ADAMContext: Loading mouse_chrM.bam as BAM/CRAM/SAM and converting to AlignmentRecords.
19/09/11 12:41:41 INFO rdd.ADAMContext: Loaded header from file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam
19/09/11 12:41:41 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 295.6 KB, free 366.0 MB)
19/09/11 12:41:42 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 24.0 KB, free 366.0 MB)
19/09/11 12:41:42 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:38399 (size: 24.0 KB, free: 366.3 MB)
19/09/11 12:41:42 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:2059
19/09/11 12:41:43 INFO read.RDDBoundAlignmentRecordDataset: Saving data in ADAM format
19/09/11 12:41:44 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:41:44 INFO input.FileInputFormat: Total input paths to process : 1
19/09/11 12:41:44 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
19/09/11 12:41:44 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
19/09/11 12:41:44 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (runJob at SparkHadoopWriter.scala:78)
19/09/11 12:41:44 INFO scheduler.DAGScheduler: Parents of final stage: List()
19/09/11 12:41:44 INFO scheduler.DAGScheduler: Missing parents: List()
19/09/11 12:41:44 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3814), which has no missing parents
19/09/11 12:41:44 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 85.6 KB, free 365.9 MB)
19/09/11 12:41:44 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 31.5 KB, free 365.9 MB)
19/09/11 12:41:44 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:38399 (size: 31.5 KB, free: 366.2 MB)
19/09/11 12:41:44 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/09/11 12:41:44 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
19/09/11 12:41:44 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/09/11 12:41:44 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7441 bytes)
19/09/11 12:41:44 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
19/09/11 12:41:44 INFO executor.Executor: Fetching spark://192.168.10.31:37939/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230900183
19/09/11 12:41:44 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:37939 after 40 ms (0 ms spent in bootstraps)
19/09/11 12:41:44 INFO util.Utils: Fetching spark://192.168.10.31:37939/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to /tmp/spark-2f5d9118-60f5-443e-8610-ab8bd0c8eebb/userFiles-f17c8d34-929f-4793-b857-5a7ce73ddb18/fetchFileTemp1727078678551789773.tmp
19/09/11 12:41:44 INFO executor.Executor: Adding file:/tmp/spark-2f5d9118-60f5-443e-8610-ab8bd0c8eebb/userFiles-f17c8d34-929f-4793-b857-5a7ce73ddb18/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to class loader
19/09/11 12:41:44 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam:83361792-833134657535
19/09/11 12:41:44 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:41:44 INFO codec.CodecConfig: Compression: GZIP
19/09/11 12:41:44 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Dictionary is on
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Validation is off
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
19/09/11 12:41:44 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
19/09/11 12:41:45 INFO compress.CodecPool: Got brand-new compressor [.gz]
Ignoring SAM validation error: ERROR: Record 162622, Read name 613F0AAXX100423:3:58:9979:16082, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162624, Read name 613F0AAXX100423:6:13:3141:11793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162625, Read name 613F0AAXX100423:8:39:18592:13552, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162635, Read name 613F1AAXX100423:7:2:13114:10698, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162637, Read name 613F1AAXX100423:6:100:8840:11167, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162639, Read name 613F1AAXX100423:8:15:10944:11181, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162640, Read name 613F1AAXX100423:8:17:5740:10104, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162651, Read name 613F1AAXX100423:1:53:11097:8261, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162654, Read name 613F1AAXX100423:2:112:16779:19612, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162657, Read name 613F0AAXX100423:8:28:7084:17683, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162659, Read name 613F0AAXX100423:8:39:19796:12794, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162662, Read name 613F1AAXX100423:5:116:9339:3264, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162667, Read name 613F0AAXX100423:4:67:2015:3054, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162669, Read name 613F0AAXX100423:7:7:11297:11738, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162674, Read name 613F0AAXX100423:6:59:10490:20829, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162678, Read name 613F1AAXX100423:8:11:17603:4766, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162682, Read name 613F0AAXX100423:5:86:10814:10257, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162683, Read name 613F0AAXX100423:5:117:14178:6111, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162685, Read name 613F0AAXX100423:2:3:13563:6720, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162689, Read name 613F0AAXX100423:7:59:16009:15799, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162696, Read name 613F0AAXX100423:5:31:9663:18252, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162698, Read name 613F1AAXX100423:2:27:12264:14626, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162699, Read name 613F0AAXX100423:1:120:19003:6647, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162702, Read name 613F1AAXX100423:3:37:6972:18407, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162704, Read name 613F1AAXX100423:3:77:6946:3880, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162706, Read name 613F0AAXX100423:7:48:2692:3492, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162708, Read name 613F1AAXX100423:7:80:8790:1648, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162710, Read name 6141AAAXX100423:5:30:15036:17610, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162712, Read name 613F1AAXX100423:8:80:6261:4465, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162713, Read name 6141AAAXX100423:5:74:5542:6195, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162715, Read name 613F1AAXX100423:5:14:14844:13639, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162718, Read name 613F1AAXX100423:7:112:14569:8480, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162725, Read name 613F1AAXX100423:4:56:10160:9879, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162727, Read name 6141AAAXX100423:7:89:12209:9221, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162731, Read name 6141AAAXX100423:6:55:1590:19793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162732, Read name 6141AAAXX100423:7:102:16679:12368, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162734, Read name 613F1AAXX100423:2:7:4909:18472, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162737, Read name 6141AAAXX100423:4:73:6574:10572, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162741, Read name 6141AAAXX100423:1:8:14113:12655, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162743, Read name 6141AAAXX100423:3:40:7990:5056, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162744, Read name 6141AAAXX100423:4:36:15793:3411, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162745, Read name 6141AAAXX100423:8:83:1139:18985, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162746, Read name 6141AAAXX100423:5:7:18196:13562, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162748, Read name 6141AAAXX100423:3:114:5639:7123, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162751, Read name 6141AAAXX100423:7:47:4898:8640, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162753, Read name 6141AAAXX100423:3:64:8064:8165, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162756, Read name 613F1AAXX100423:1:105:14386:1684, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162757, Read name 613F1AAXX100423:6:98:1237:19470, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162761, Read name 613F1AAXX100423:7:106:19658:9261, MAPQ should be 0 for unmapped read.
19/09/11 12:41:54 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16043959
19/09/11 12:41:55 INFO output.FileOutputCommitter: Saved output of task 'attempt_20190911124144_0002_r_000000_0' to file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/_temporary/0/task_20190911124144_0002_r_000000
19/09/11 12:41:55 INFO mapred.SparkHadoopMapRedUtil: attempt_20190911124144_0002_r_000000_0: Committed
19/09/11 12:41:55 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 856 bytes result sent to driver
19/09/11 12:41:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 10907 ms on localhost (executor driver) (1/1)
19/09/11 12:41:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/09/11 12:41:55 INFO scheduler.DAGScheduler: ResultStage 0 (runJob at SparkHadoopWriter.scala:78) finished in 11.064 s
19/09/11 12:41:55 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 11.144441 s
19/09/11 12:41:55 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
19/09/11 12:41:55 INFO io.SparkHadoopWriter: Job job_20190911124144_0002 committed.
19/09/11 12:41:55 INFO cli.TransformAlignments: Overall Duration: 16.32 secs
19/09/11 12:41:55 INFO spark.SparkContext: Invoking stop() from shutdown hook
19/09/11 12:41:55 INFO server.AbstractConnector: Stopped Spark@3c321bdb{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:41:55 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/09/11 12:41:55 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/09/11 12:41:55 INFO memory.MemoryStore: MemoryStore cleared
19/09/11 12:41:55 INFO storage.BlockManager: BlockManager stopped
19/09/11 12:41:55 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/09/11 12:41:55 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/09/11 12:41:55 INFO spark.SparkContext: Successfully stopped SparkContext
19/09/11 12:41:55 INFO util.ShutdownHookManager: Shutdown hook called
19/09/11 12:41:55 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-4c9ddc87-ce78-4d23-9446-cc7632bc2361
19/09/11 12:41:55 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-2f5d9118-60f5-443e-8610-ab8bd0c8eebb
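The transformAlignments step that just finished (BAM in, ADAM/Parquet out) can also be driven through the Python API instead of the CLI. This is a rough sketch under the assumption that ADAMContext.loadAlignments and a save method on the returned alignment dataset exist with these names; it is not lifted from this log.
# Sketch of the CLI step above in Python; method names and paths are assumptions.
from pyspark.sql import SparkSession
from bdgenomics.adam.adamContext import ADAMContext

spark = SparkSession.builder.appName("transformAlignments-sketch").getOrCreate()
ac = ADAMContext(spark)
reads = ac.loadAlignments("mouse_chrM.bam")    # reads the BAM header, then the records
reads.save("mouse_chrM.bam.reads.adam")        # writes Parquet plus metadata, as the job above does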

# then, sort the BAM
echo "Converting BAM to ADAM read format with sorting"
+ echo 'Converting BAM to ADAM read format with sorting'
Converting BAM to ADAM read format with sorting
rm -rf ${SORTED_READS}
+ rm -rf mouse_chrM.bam.reads.sorted.adam
${ADAM} transformAlignments -sort_by_reference_position ${READS} ${SORTED_READS}
+ ./bin/adam-submit transformAlignments -sort_by_reference_position mouse_chrM.bam.reads.adam mouse_chrM.bam.reads.sorted.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.4-bin-without-hadoop-scala-2.12/bin/spark-submit
19/09/11 12:41:57 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
19/09/11 12:41:57 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/09/11 12:41:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/11 12:41:57 INFO cli.ADAMMain: ADAM invoked with args: "transformAlignments" "-sort_by_reference_position" "mouse_chrM.bam.reads.adam" "mouse_chrM.bam.reads.sorted.adam"
19/09/11 12:41:58 INFO spark.SparkContext: Running Spark version 2.4.4
19/09/11 12:41:58 INFO spark.SparkContext: Submitted application: transformAlignments
19/09/11 12:41:58 INFO spark.SecurityManager: Changing view acls to: jenkins
19/09/11 12:41:58 INFO spark.SecurityManager: Changing modify acls to: jenkins
19/09/11 12:41:58 INFO spark.SecurityManager: Changing view acls groups to: 
19/09/11 12:41:58 INFO spark.SecurityManager: Changing modify acls groups to: 
19/09/11 12:41:58 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/09/11 12:41:58 INFO util.Utils: Successfully started service 'sparkDriver' on port 34329.
19/09/11 12:41:58 INFO spark.SparkEnv: Registering MapOutputTracker
19/09/11 12:41:58 INFO spark.SparkEnv: Registering BlockManagerMaster
19/09/11 12:41:58 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/09/11 12:41:58 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/09/11 12:41:58 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-267d13a0-c195-49bf-bb83-2b6ed1309715
19/09/11 12:41:58 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
19/09/11 12:41:58 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/09/11 12:41:58 INFO util.log: Logging initialized @2501ms
19/09/11 12:41:58 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
19/09/11 12:41:58 INFO server.Server: Started @2575ms
19/09/11 12:41:58 INFO server.AbstractConnector: Started ServerConnector@24855019{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:41:58 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e9c413e{/jobs,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4b770e40{/jobs/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78e16155{/jobs/job,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1968a49c{/jobs/job/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a1ebcff{/stages,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@19868320{/stages/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50b0bc4c{/stages/stage,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ef41c66{/stages/stage/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b739528{/stages/pool,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@622ef26a{/stages/pool/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@41de5768{/storage,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5f577419{/storage/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@28fa700e{/storage/rdd,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d526ad9{/storage/rdd/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e041f0c{/environment,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a175569{/environment/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@11963225{/executors,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f3c966c{/executors/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@11ee02f8{/executors/threadDump,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4102b1b1{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61a5b4ae{/static,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25243bc1{/,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e287667{/api,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/jobs/job/kill,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@420bc288{/stages/stage/kill,null,AVAILABLE,@Spark}
19/09/11 12:41:58 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/09/11 12:41:58 INFO spark.SparkContext: Added JAR file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar at spark://192.168.10.31:34329/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230918837
19/09/11 12:41:58 INFO executor.Executor: Starting executor ID driver on host localhost
19/09/11 12:41:58 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41405.
19/09/11 12:41:58 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:41405
19/09/11 12:41:58 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/09/11 12:41:59 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 41405, None)
19/09/11 12:41:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:41405 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 41405, None)
19/09/11 12:41:59 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 41405, None)
19/09/11 12:41:59 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 41405, None)
19/09/11 12:41:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@231baf51{/metrics/json,null,AVAILABLE,@Spark}
19/09/11 12:41:59 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of AlignmentRecords.
19/09/11 12:42:00 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
19/09/11 12:42:01 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 315.0 KB, free 366.0 MB)
19/09/11 12:42:02 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.2 KB, free 366.0 MB)
19/09/11 12:42:02 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:41405 (size: 28.2 KB, free: 366.3 MB)
19/09/11 12:42:02 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1801
19/09/11 12:42:02 INFO cli.TransformAlignments: Sorting alignments by reference position, with references ordered by name
19/09/11 12:42:02 INFO read.RDDBoundAlignmentRecordDataset: Sorting alignments by reference position
19/09/11 12:42:02 INFO input.FileInputFormat: Total input paths to process : 1
19/09/11 12:42:02 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
19/09/11 12:42:02 INFO read.RDDBoundAlignmentRecordDataset: Saving data in ADAM format
19/09/11 12:42:02 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:42:02 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
19/09/11 12:42:02 INFO scheduler.DAGScheduler: Registering RDD 2 (sortBy at AlignmentRecordDataset.scala:1008)
19/09/11 12:42:02 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
19/09/11 12:42:02 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
19/09/11 12:42:02 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
19/09/11 12:42:02 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
19/09/11 12:42:02 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentRecordDataset.scala:1008), which has no missing parents
19/09/11 12:42:02 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.8 KB, free 366.0 MB)
19/09/11 12:42:02 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.4 KB, free 366.0 MB)
19/09/11 12:42:02 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:41405 (size: 3.4 KB, free: 366.3 MB)
19/09/11 12:42:02 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/09/11 12:42:02 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentRecordDataset.scala:1008) (first 15 tasks are for partitions Vector(0))
19/09/11 12:42:02 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/09/11 12:42:02 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7480 bytes)
19/09/11 12:42:02 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
19/09/11 12:42:02 INFO executor.Executor: Fetching spark://192.168.10.31:34329/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230918837
19/09/11 12:42:02 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:34329 after 38 ms (0 ms spent in bootstraps)
19/09/11 12:42:02 INFO util.Utils: Fetching spark://192.168.10.31:34329/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to /tmp/spark-ac35a200-2dd2-452a-8028-190c02d7fafd/userFiles-65c27fa3-7f23-4ecf-968a-e036bc024f7a/fetchFileTemp9138508265854120996.tmp
19/09/11 12:42:02 INFO executor.Executor: Adding file:/tmp/spark-ac35a200-2dd2-452a-8028-190c02d7fafd/userFiles-65c27fa3-7f23-4ecf-968a-e036bc024f7a/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to class loader
19/09/11 12:42:03 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132230
19/09/11 12:42:03 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
19/09/11 12:42:03 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
19/09/11 12:42:03 INFO compress.CodecPool: Got brand-new decompressor [.gz]
19/09/11 12:42:03 INFO hadoop.InternalParquetRecordReader: block read in memory in 63 ms. row count = 163064
19/09/11 12:42:06 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 962 bytes result sent to driver
19/09/11 12:42:06 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3981 ms on localhost (executor driver) (1/1)
19/09/11 12:42:06 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/09/11 12:42:06 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (sortBy at AlignmentRecordDataset.scala:1008) finished in 4.119 s
19/09/11 12:42:06 INFO scheduler.DAGScheduler: looking for newly runnable stages
19/09/11 12:42:06 INFO scheduler.DAGScheduler: running: Set()
19/09/11 12:42:06 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
19/09/11 12:42:06 INFO scheduler.DAGScheduler: failed: Set()
19/09/11 12:42:06 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3814), which has no missing parents
19/09/11 12:42:06 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 86.8 KB, free 365.9 MB)
19/09/11 12:42:06 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 32.3 KB, free 365.8 MB)
19/09/11 12:42:06 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:41405 (size: 32.3 KB, free: 366.2 MB)
19/09/11 12:42:06 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1161
19/09/11 12:42:06 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
19/09/11 12:42:06 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
19/09/11 12:42:06 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7141 bytes)
19/09/11 12:42:06 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
19/09/11 12:42:06 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/09/11 12:42:06 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 7 ms
19/09/11 12:42:09 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.10.31:41405 in memory (size: 3.4 KB, free: 366.2 MB)
19/09/11 12:42:09 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:42:09 INFO codec.CodecConfig: Compression: GZIP
19/09/11 12:42:09 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Dictionary is on
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Validation is off
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
19/09/11 12:42:09 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
19/09/11 12:42:09 INFO compress.CodecPool: Got brand-new compressor [.gz]
19/09/11 12:42:13 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16004474
19/09/11 12:42:13 INFO output.FileOutputCommitter: Saved output of task 'attempt_20190911124202_0005_r_000000_0' to file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.sorted.adam/_temporary/0/task_20190911124202_0005_r_000000
19/09/11 12:42:13 INFO mapred.SparkHadoopMapRedUtil: attempt_20190911124202_0005_r_000000_0: Committed
19/09/11 12:42:13 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1243 bytes result sent to driver
19/09/11 12:42:13 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 7125 ms on localhost (executor driver) (1/1)
19/09/11 12:42:13 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
19/09/11 12:42:13 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 7.190 s
19/09/11 12:42:13 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 11.409399 s
19/09/11 12:42:13 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
19/09/11 12:42:13 INFO io.SparkHadoopWriter: Job job_20190911124202_0005 committed.
19/09/11 12:42:13 INFO cli.TransformAlignments: Overall Duration: 15.84 secs
19/09/11 12:42:13 INFO spark.SparkContext: Invoking stop() from shutdown hook
19/09/11 12:42:13 INFO server.AbstractConnector: Stopped Spark@24855019{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:42:13 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/09/11 12:42:13 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/09/11 12:42:13 INFO memory.MemoryStore: MemoryStore cleared
19/09/11 12:42:13 INFO storage.BlockManager: BlockManager stopped
19/09/11 12:42:13 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/09/11 12:42:13 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/09/11 12:42:13 INFO spark.SparkContext: Successfully stopped SparkContext
19/09/11 12:42:13 INFO util.ShutdownHookManager: Shutdown hook called
19/09/11 12:42:13 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-ac35a200-2dd2-452a-8028-190c02d7fafd
19/09/11 12:42:13 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-26a65f0c-dd0e-4122-8c14-648e68f68b2e
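The sorting run above (-sort_by_reference_position) has a similar Python-API shape. The sketch below assumes a sortByReferencePosition() method on the alignment dataset wrapper, inferred from the "Sorting alignments by reference position" log line; the actual method name may differ between ADAM releases.
# Sketch of the sorting step; sortByReferencePosition() is an assumed name, not confirmed by this log.
from pyspark.sql import SparkSession
from bdgenomics.adam.adamContext import ADAMContext

spark = SparkSession.builder.appName("sort-sketch").getOrCreate()
ac = ADAMContext(spark)
reads = ac.loadAlignments("mouse_chrM.bam.reads.adam")   # load the unsorted Parquet output from the previous step
sorted_reads = reads.sortByReferencePosition()           # corresponds to the shuffle seen in ShuffleMapStage 0 above
sorted_reads.save("mouse_chrM.bam.reads.sorted.adam")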

# convert the reads to fragments to re-pair the reads
echo "Converting read file to fragments"
+ echo 'Converting read file to fragments'
Converting read file to fragments
rm -rf ${FRAGMENTS}
+ rm -rf mouse_chrM.bam.fragments.adam
${ADAM} transformFragments -load_as_alignments ${READS} ${FRAGMENTS}
+ ./bin/adam-submit transformFragments -load_as_alignments mouse_chrM.bam.reads.adam mouse_chrM.bam.fragments.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.4-bin-without-hadoop-scala-2.12/bin/spark-submit
19/09/11 12:42:15 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
19/09/11 12:42:15 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/09/11 12:42:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/11 12:42:16 INFO cli.ADAMMain: ADAM invoked with args: "transformFragments" "-load_as_alignments" "mouse_chrM.bam.reads.adam" "mouse_chrM.bam.fragments.adam"
19/09/11 12:42:16 INFO spark.SparkContext: Running Spark version 2.4.4
19/09/11 12:42:16 INFO spark.SparkContext: Submitted application: transformFragments
19/09/11 12:42:16 INFO spark.SecurityManager: Changing view acls to: jenkins
19/09/11 12:42:16 INFO spark.SecurityManager: Changing modify acls to: jenkins
19/09/11 12:42:16 INFO spark.SecurityManager: Changing view acls groups to: 
19/09/11 12:42:16 INFO spark.SecurityManager: Changing modify acls groups to: 
19/09/11 12:42:16 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/09/11 12:42:16 INFO util.Utils: Successfully started service 'sparkDriver' on port 41263.
19/09/11 12:42:16 INFO spark.SparkEnv: Registering MapOutputTracker
19/09/11 12:42:16 INFO spark.SparkEnv: Registering BlockManagerMaster
19/09/11 12:42:16 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/09/11 12:42:16 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/09/11 12:42:17 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-adb54867-98e4-447a-a1c7-3e7e4356917e
19/09/11 12:42:17 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
19/09/11 12:42:17 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/09/11 12:42:17 INFO util.log: Logging initialized @2506ms
19/09/11 12:42:17 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
19/09/11 12:42:17 INFO server.Server: Started @2591ms
19/09/11 12:42:17 INFO server.AbstractConnector: Started ServerConnector@7577b641{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:42:17 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50d68830{/jobs,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@32c0915e{/jobs/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@106faf11{/jobs/job,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26d10f2e{/jobs/job/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10ad20cb{/stages,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7dd712e8{/stages/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c282004{/stages/stage,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e792ce3{/stages/stage/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53bc1328{/stages/pool,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26f143ed{/stages/pool/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c1e3314{/storage,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4b770e40{/storage/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78e16155{/storage/rdd,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54a3ab8f{/storage/rdd/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1968a49c{/environment,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a1ebcff{/environment/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@19868320{/executors,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50b0bc4c{/executors/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c20be82{/executors/threadDump,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@13c612bd{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ef41c66{/static,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63a5e46c{/,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e8e8651{/api,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61e3a1fd{/jobs/job/kill,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51abf713{/stages/stage/kill,null,AVAILABLE,@Spark}
19/09/11 12:42:17 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/09/11 12:42:17 INFO spark.SparkContext: Added JAR file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar at spark://192.168.10.31:41263/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230937322
19/09/11 12:42:17 INFO executor.Executor: Starting executor ID driver on host localhost
19/09/11 12:42:17 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35639.
19/09/11 12:42:17 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:35639
19/09/11 12:42:17 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/09/11 12:42:17 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 35639, None)
19/09/11 12:42:17 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:35639 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 35639, None)
19/09/11 12:42:17 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 35639, None)
19/09/11 12:42:17 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 35639, None)
19/09/11 12:42:17 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2ca6546f{/metrics/json,null,AVAILABLE,@Spark}
19/09/11 12:42:18 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of AlignmentRecords.
19/09/11 12:42:19 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
19/09/11 12:42:20 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 315.0 KB, free 366.0 MB)
19/09/11 12:42:20 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.2 KB, free 366.0 MB)
19/09/11 12:42:20 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:35639 (size: 28.2 KB, free: 366.3 MB)
19/09/11 12:42:20 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1801
19/09/11 12:42:20 INFO input.FileInputFormat: Total input paths to process : 1
19/09/11 12:42:20 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
19/09/11 12:42:20 INFO fragment.RDDBoundFragmentDataset: Saving data in ADAM format
19/09/11 12:42:20 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:42:20 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
19/09/11 12:42:20 INFO scheduler.DAGScheduler: Registering RDD 2 (groupBy at SingleReadBucket.scala:97)
19/09/11 12:42:20 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 1 output partitions
19/09/11 12:42:20 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
19/09/11 12:42:20 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
19/09/11 12:42:20 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
19/09/11 12:42:20 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97), which has no missing parents
19/09/11 12:42:20 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 6.3 KB, free 366.0 MB)
19/09/11 12:42:20 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.4 KB, free 366.0 MB)
19/09/11 12:42:20 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:35639 (size: 3.4 KB, free: 366.3 MB)
19/09/11 12:42:20 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/09/11 12:42:20 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97) (first 15 tasks are for partitions Vector(0))
19/09/11 12:42:20 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/09/11 12:42:20 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7480 bytes)
19/09/11 12:42:20 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
19/09/11 12:42:20 INFO executor.Executor: Fetching spark://192.168.10.31:41263/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230937322
19/09/11 12:42:21 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:41263 after 39 ms (0 ms spent in bootstraps)
19/09/11 12:42:21 INFO util.Utils: Fetching spark://192.168.10.31:41263/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to /tmp/spark-d3edd7d3-80ee-491c-8419-c1428fbd276e/userFiles-78ac8688-77b9-486e-a69f-63ae43fa98d7/fetchFileTemp338152770744441402.tmp
19/09/11 12:42:21 INFO executor.Executor: Adding file:/tmp/spark-d3edd7d3-80ee-491c-8419-c1428fbd276e/userFiles-78ac8688-77b9-486e-a69f-63ae43fa98d7/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to class loader
19/09/11 12:42:21 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132230
19/09/11 12:42:21 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
19/09/11 12:42:21 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
19/09/11 12:42:21 INFO compress.CodecPool: Got brand-new decompressor [.gz]
19/09/11 12:42:21 INFO hadoop.InternalParquetRecordReader: block read in memory in 40 ms. row count = 163064
19/09/11 12:42:24 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 919 bytes result sent to driver
19/09/11 12:42:24 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3937 ms on localhost (executor driver) (1/1)
19/09/11 12:42:24 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/09/11 12:42:24 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (groupBy at SingleReadBucket.scala:97) finished in 4.056 s
19/09/11 12:42:24 INFO scheduler.DAGScheduler: looking for newly runnable stages
19/09/11 12:42:24 INFO scheduler.DAGScheduler: running: Set()
19/09/11 12:42:24 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
19/09/11 12:42:24 INFO scheduler.DAGScheduler: failed: Set()
19/09/11 12:42:24 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3814), which has no missing parents
19/09/11 12:42:24 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 90.2 KB, free 365.9 MB)
19/09/11 12:42:24 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 33.5 KB, free 365.8 MB)
19/09/11 12:42:24 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.10.31:35639 (size: 33.5 KB, free: 366.2 MB)
19/09/11 12:42:24 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1161
19/09/11 12:42:24 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3814) (first 15 tasks are for partitions Vector(0))
19/09/11 12:42:24 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
19/09/11 12:42:24 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 7141 bytes)
19/09/11 12:42:24 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
19/09/11 12:42:25 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/09/11 12:42:25 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 7 ms
19/09/11 12:42:27 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.10.31:35639 in memory (size: 3.4 KB, free: 366.2 MB)
19/09/11 12:42:27 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:42:27 INFO codec.CodecConfig: Compression: GZIP
19/09/11 12:42:27 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Parquet block size to 134217728
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Parquet page size to 1048576
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Dictionary is on
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Validation is off
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Page size checking is: estimated
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Min row count for page size check is: 100
19/09/11 12:42:27 INFO hadoop.ParquetOutputFormat: Max row count for page size check is: 10000
19/09/11 12:42:27 INFO compress.CodecPool: Got brand-new compressor [.gz]
19/09/11 12:42:33 INFO hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 21417928
19/09/11 12:42:33 INFO output.FileOutputCommitter: Saved output of task 'attempt_20190911124220_0006_r_000000_0' to file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.fragments.adam/_temporary/0/task_20190911124220_0006_r_000000
19/09/11 12:42:33 INFO mapred.SparkHadoopMapRedUtil: attempt_20190911124220_0006_r_000000_0: Committed
19/09/11 12:42:33 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1243 bytes result sent to driver
19/09/11 12:42:33 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 8893 ms on localhost (executor driver) (1/1)
19/09/11 12:42:33 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
19/09/11 12:42:33 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 8.933 s
19/09/11 12:42:33 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 13.087174 s
19/09/11 12:42:33 INFO hadoop.ParquetFileReader: Initiating action with parallelism: 5
19/09/11 12:42:33 INFO io.SparkHadoopWriter: Job job_20190911124220_0006 committed.
19/09/11 12:42:33 INFO cli.TransformFragments: Overall Duration: 17.53 secs
19/09/11 12:42:33 INFO spark.SparkContext: Invoking stop() from shutdown hook
19/09/11 12:42:33 INFO server.AbstractConnector: Stopped Spark@7577b641{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:42:33 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/09/11 12:42:34 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/09/11 12:42:34 INFO memory.MemoryStore: MemoryStore cleared
19/09/11 12:42:34 INFO storage.BlockManager: BlockManager stopped
19/09/11 12:42:34 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/09/11 12:42:34 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/09/11 12:42:34 INFO spark.SparkContext: Successfully stopped SparkContext
19/09/11 12:42:34 INFO util.ShutdownHookManager: Shutdown hook called
19/09/11 12:42:34 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-d3edd7d3-80ee-491c-8419-c1428fbd276e
19/09/11 12:42:34 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-a0507bbe-a053-4401-902b-bce4436946ae
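
The run above loaded mouse_chrM.bam.reads.adam as a Parquet dataset of AlignmentRecords, grouped the reads into fragments (groupBy at SingleReadBucket.scala:97), and committed mouse_chrM.bam.fragments.adam as GZIP-compressed Parquet with a 134217728-byte block size and 1048576-byte page size. For readers who would rather drive the same step from the ADAM API than from adam-submit transformFragments, here is a minimal Scala sketch; the loadAlignments, toFragments, and saveAsParquet(blockSize, pageSize, compressCodec) names are assumptions inferred from the class names in this log, not verified against this exact 0.29.0-SNAPSHOT build.

    import org.apache.parquet.hadoop.metadata.CompressionCodecName
    import org.apache.spark.{SparkConf, SparkContext}
    import org.bdgenomics.adam.rdd.ADAMContext._

    // Sketch of the transformFragments step above; API names are assumed, not verified.
    val sc = new SparkContext(new SparkConf().setAppName("transformFragments-sketch"))
    val reads = sc.loadAlignments("mouse_chrM.bam.reads.adam")   // Parquet of AlignmentRecords
    val fragments = reads.toFragments()                          // pair reads up into fragments
    fragments.saveAsParquet(
      "mouse_chrM.bam.fragments.adam",
      blockSize = 134217728,                                     // Parquet block size seen in the log
      pageSize = 1048576,                                        // Parquet page size seen in the log
      compressCodec = CompressionCodecName.GZIP)                 // GZIP codec seen in the log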

# test that printing works
echo "Printing reads and fragments"
+ echo 'Printing reads and fragments'
Printing reads and fragments
${ADAM} print ${READS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.reads.adam
${ADAM} print ${FRAGMENTS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.fragments.adam
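
Because the .adam outputs are ordinary Parquet directories, the records the print command dumps above can also be inspected directly with Spark SQL. A small sketch follows, assuming the bdg-formats column names readName, start, and cigar; this is just a convenience for eyeballing the data, not what adam-submit print does internally.

    import org.apache.spark.sql.SparkSession

    // Peek at a few alignment records straight from the Parquet files.
    val spark = SparkSession.builder().appName("peek-at-adam-parquet").getOrCreate()
    spark.read.parquet("mouse_chrM.bam.reads.adam")
      .select("readName", "start", "cigar")    // column names assumed from the bdg-formats schema
      .show(5, false)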

# run flagstat to verify that flagstat runs OK
echo "Printing read statistics"
+ echo 'Printing read statistics'
Printing read statistics
${ADAM} flagstat -print_metrics ${READS}
+ ./bin/adam-submit flagstat -print_metrics mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/spark-2.4.4-bin-without-hadoop-scala-2.12/bin/spark-submit
19/09/11 12:42:53 WARN util.Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
19/09/11 12:42:53 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/09/11 12:42:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/11 12:42:54 INFO cli.ADAMMain: ADAM invoked with args: "flagstat" "-print_metrics" "mouse_chrM.bam.reads.adam"
19/09/11 12:42:54 INFO spark.SparkContext: Running Spark version 2.4.4
19/09/11 12:42:54 INFO spark.SparkContext: Submitted application: flagstat
19/09/11 12:42:54 INFO spark.SecurityManager: Changing view acls to: jenkins
19/09/11 12:42:54 INFO spark.SecurityManager: Changing modify acls to: jenkins
19/09/11 12:42:54 INFO spark.SecurityManager: Changing view acls groups to: 
19/09/11 12:42:54 INFO spark.SecurityManager: Changing modify acls groups to: 
19/09/11 12:42:54 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/09/11 12:42:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 36791.
19/09/11 12:42:54 INFO spark.SparkEnv: Registering MapOutputTracker
19/09/11 12:42:55 INFO spark.SparkEnv: Registering BlockManagerMaster
19/09/11 12:42:55 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/09/11 12:42:55 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/09/11 12:42:55 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-00a87a6a-fa72-4825-a5a0-07d7d5e8297f
19/09/11 12:42:55 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
19/09/11 12:42:55 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/09/11 12:42:55 INFO util.log: Logging initialized @2606ms
19/09/11 12:42:55 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
19/09/11 12:42:55 INFO server.Server: Started @2685ms
19/09/11 12:42:55 INFO server.AbstractConnector: Started ServerConnector@751e664e{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:42:55 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2b27cc70{/jobs,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@47a64f7d{/jobs/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@33d05366{/jobs/job,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7692cd34{/jobs/job/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@33aa93c{/stages,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@32c0915e{/stages/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@106faf11{/stages/stage,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10ad20cb{/stages/stage/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7dd712e8{/stages/pool,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c282004{/stages/pool/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22ee2d0{/storage,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7bfc3126{/storage/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e792ce3{/storage/rdd,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53bc1328{/storage/rdd/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26f143ed{/environment,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c1e3314{/environment/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4b770e40{/executors,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78e16155{/executors/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54a3ab8f{/executors/threadDump,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1968a49c{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a1ebcff{/static,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a71c100{/,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5b69fd74{/api,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63a5e46c{/jobs/job/kill,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e8e8651{/stages/stage/kill,null,AVAILABLE,@Spark}
19/09/11 12:42:55 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.31:4040
19/09/11 12:42:55 INFO spark.SparkContext: Added JAR file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar at spark://192.168.10.31:36791/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230975327
19/09/11 12:42:55 INFO executor.Executor: Starting executor ID driver on host localhost
19/09/11 12:42:55 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35013.
19/09/11 12:42:55 INFO netty.NettyBlockTransferService: Server created on 192.168.10.31:35013
19/09/11 12:42:55 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/09/11 12:42:55 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.31, 35013, None)
19/09/11 12:42:55 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.10.31:35013 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 35013, None)
19/09/11 12:42:55 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.31, 35013, None)
19/09/11 12:42:55 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 35013, None)
19/09/11 12:42:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@12a160c2{/metrics/json,null,AVAILABLE,@Spark}
19/09/11 12:42:56 INFO rdd.ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of AlignmentRecords.
19/09/11 12:42:56 INFO rdd.ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
19/09/11 12:42:56 INFO rdd.ADAMContext: Using the specified projection schema
19/09/11 12:42:56 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 322.6 KB, free 366.0 MB)
19/09/11 12:42:57 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 29.1 KB, free 366.0 MB)
19/09/11 12:42:57 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.31:35013 (size: 29.1 KB, free: 366.3 MB)
19/09/11 12:42:57 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1801
19/09/11 12:42:58 INFO input.FileInputFormat: Total input paths to process : 1
19/09/11 12:42:58 INFO hadoop.ParquetInputFormat: Total input paths to process : 1
19/09/11 12:42:58 INFO spark.SparkContext: Starting job: aggregate at FlagStat.scala:115
19/09/11 12:42:59 INFO scheduler.DAGScheduler: Got job 0 (aggregate at FlagStat.scala:115) with 1 output partitions
19/09/11 12:42:59 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (aggregate at FlagStat.scala:115)
19/09/11 12:42:59 INFO scheduler.DAGScheduler: Parents of final stage: List()
19/09/11 12:42:59 INFO scheduler.DAGScheduler: Missing parents: List()
19/09/11 12:42:59 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[4] at map at FlagStat.scala:96), which has no missing parents
19/09/11 12:42:59 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 13.5 KB, free 365.9 MB)
19/09/11 12:42:59 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 5.9 KB, free 365.9 MB)
19/09/11 12:42:59 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.10.31:35013 (size: 5.9 KB, free: 366.3 MB)
19/09/11 12:42:59 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/09/11 12:42:59 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[4] at map at FlagStat.scala:96) (first 15 tasks are for partitions Vector(0))
19/09/11 12:42:59 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/09/11 12:42:59 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7491 bytes)
19/09/11 12:42:59 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
19/09/11 12:42:59 INFO executor.Executor: Fetching spark://192.168.10.31:36791/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar with timestamp 1568230975327
19/09/11 12:42:59 INFO client.TransportClientFactory: Successfully created connection to /192.168.10.31:36791 after 36 ms (0 ms spent in bootstraps)
19/09/11 12:42:59 INFO util.Utils: Fetching spark://192.168.10.31:36791/jars/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to /tmp/spark-f6b105c3-4598-44bf-8255-da8e184e5746/userFiles-2f218101-17a6-4a48-9a09-cd81ea7643b6/fetchFileTemp517949029801336657.tmp
19/09/11 12:42:59 INFO executor.Executor: Adding file:/tmp/spark-f6b105c3-4598-44bf-8255-da8e184e5746/userFiles-2f218101-17a6-4a48-9a09-cd81ea7643b6/adam-assembly-spark2_2.12-0.29.0-SNAPSHOT.jar to class loader
19/09/11 12:42:59 INFO rdd.NewHadoopRDD: Input split: file:/tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132230
19/09/11 12:42:59 INFO hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
19/09/11 12:42:59 INFO hadoop.InternalParquetRecordReader: at row 0. reading next block
19/09/11 12:42:59 INFO compress.CodecPool: Got brand-new decompressor [.gz]
19/09/11 12:43:00 INFO hadoop.InternalParquetRecordReader: block read in memory in 25 ms. row count = 163064
19/09/11 12:43:00 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 9641 bytes result sent to driver
19/09/11 12:43:00 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1806 ms on localhost (executor driver) (1/1)
19/09/11 12:43:00 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/09/11 12:43:00 INFO scheduler.DAGScheduler: ResultStage 0 (aggregate at FlagStat.scala:115) finished in 1.931 s
19/09/11 12:43:00 INFO scheduler.DAGScheduler: Job 0 finished: aggregate at FlagStat.scala:115, took 1.999830 s
163064 + 0 in total (QC-passed reads + QC-failed reads)
0 + 0 primary duplicates
0 + 0 primary duplicates - both read and mate mapped
0 + 0 primary duplicates - only read mapped
0 + 0 primary duplicates - cross chromosome
0 + 0 secondary duplicates
0 + 0 secondary duplicates - both read and mate mapped
0 + 0 secondary duplicates - only read mapped
0 + 0 secondary duplicates - cross chromosome
160512 + 0 mapped (98.43%:0.00%)
163064 + 0 paired in sequencing
81524 + 0 read1
81540 + 0 read2
154982 + 0 properly paired (95.04%:0.00%)
158044 + 0 with itself and mate mapped
2468 + 0 singletons (1.51%:0.00%)
418 + 0 with mate mapped to a different chr
120 + 0 with mate mapped to a different chr (mapQ>=5)
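
As a quick sanity check on the flagstat summary: each line is reported as QC-passed + QC-failed counts (everything here is QC-passed), and the percentages are taken against the 163064 total, e.g. 160512 / 163064 ≈ 98.43% mapped, 154982 / 163064 ≈ 95.04% properly paired, and 2468 / 163064 ≈ 1.51% singletons.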
19/09/11 12:43:00 INFO cli.FlagStat: Overall Duration: 6.6 secs
19/09/11 12:43:01 INFO cli.FlagStat: Metrics:

Timings
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+
|                Metric                | Worker Total | Driver Total | Driver Only | Count  |   Mean    |    Min    |    Max    |
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+
| └─ Load Alignments                   |            - |    2.74 secs |   2.63 secs |      1 | 2.74 secs | 2.74 secs | 2.74 secs |
|     └─ map at ADAMContext.scala:1805 |            - |    112.63 ms |           - |      1 | 112.63 ms | 112.63 ms | 112.63 ms |
|         └─ function call             |     19.33 ms |            - |           - | 163064 |    118 ns |     57 ns | 172.55 µs |
| └─ map at FlagStat.scala:96          |            - |      30.2 ms |           - |      1 |   30.2 ms |   30.2 ms |   30.2 ms |
|     └─ function call                 |    125.74 ms |            - |           - | 163064 |    771 ns |    185 ns |   1.47 ms |
| └─ aggregate at FlagStat.scala:115   |            - |    2.11 secs |           - |      1 | 2.11 secs | 2.11 secs | 2.11 secs |
|     └─ function call                 |     56.87 ms |            - |           - | 163065 |    348 ns |     86 ns |  24.13 ms |
+--------------------------------------+--------------+--------------+-------------+--------+-----------+-----------+-----------+

Spark Operations
+----------+---------------------------------+---------------+----------------+--------------+----------+
| Sequence |            Operation            | Is New Stage? | Stage Duration | Driver Total | Stage ID |
+----------+---------------------------------+---------------+----------------+--------------+----------+
| 1        | map at ADAMContext.scala:1805   | false         |              - |    112.63 ms | -        |
| 2        | map at FlagStat.scala:96        | false         |              - |      30.2 ms | -        |
| 3        | aggregate at FlagStat.scala:115 | true          |     -1.93 secs |    2.11 secs | 0        |
+----------+---------------------------------+---------------+----------------+--------------+----------+

Task Timings
+-------------------------------+------------+-------+-----------+-----------+-----------+
|            Metric             | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+------------+-------+-----------+-----------+-----------+
| Task Duration                 |  1.81 secs |     1 | 1.81 secs | 1.81 secs | 1.81 secs |
| Executor Run Time             |  1.14 secs |     1 | 1.14 secs | 1.14 secs | 1.14 secs |
| Executor Deserialization Time |     579 ms |     1 |    579 ms |    579 ms |    579 ms |
| Result Serialization Time     |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+------------+-------+-----------+-----------+-----------+

Task Timings By Host
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+
|            Metric             |  Host  | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+
| Task Duration                 | driver |  1.81 secs |     1 | 1.81 secs | 1.81 secs | 1.81 secs |
| Executor Run Time             | driver |  1.14 secs |     1 | 1.14 secs | 1.14 secs | 1.14 secs |
| Executor Deserialization Time | driver |     579 ms |     1 |    579 ms |    579 ms |    579 ms |
| Result Serialization Time     | driver |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+--------+------------+-------+-----------+-----------+-----------+

Task Timings By Stage
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+
|            Metric             |          Stage ID & Name           | Total Time | Count |   Mean    |    Min    |    Max    |
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+
| Task Duration                 | 0: aggregate at FlagStat.scala:115 |  1.81 secs |     1 | 1.81 secs | 1.81 secs | 1.81 secs |
| Executor Run Time             | 0: aggregate at FlagStat.scala:115 |  1.14 secs |     1 | 1.14 secs | 1.14 secs | 1.14 secs |
| Executor Deserialization Time | 0: aggregate at FlagStat.scala:115 |     579 ms |     1 |    579 ms |    579 ms |    579 ms |
| Result Serialization Time     | 0: aggregate at FlagStat.scala:115 |          0 |     1 |         0 |         0 |         0 |
+-------------------------------+------------------------------------+------------+-------+-----------+-----------+-----------+

19/09/11 12:43:01 INFO spark.SparkContext: Invoking stop() from shutdown hook
19/09/11 12:43:01 INFO server.AbstractConnector: Stopped Spark@751e664e{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/09/11 12:43:01 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.31:4040
19/09/11 12:43:01 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/09/11 12:43:01 INFO memory.MemoryStore: MemoryStore cleared
19/09/11 12:43:01 INFO storage.BlockManager: BlockManager stopped
19/09/11 12:43:01 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/09/11 12:43:01 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/09/11 12:43:01 INFO spark.SparkContext: Successfully stopped SparkContext
19/09/11 12:43:01 INFO util.ShutdownHookManager: Shutdown hook called
19/09/11 12:43:01 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-f6b105c3-4598-44bf-8255-da8e184e5746
19/09/11 12:43:01 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-50d44015-175b-4834-9e8c-55e209524fa7
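
For anyone who wants to spot-check a couple of those flagstat numbers from the ADAM API rather than the CLI, here is a rough Scala sketch under stated assumptions: the loadAlignments entry point matches the log above, and the getReadMapped, getReadPaired, and getProperPair getters follow the bdg-formats AlignmentRecord schema. It is illustrative only, not the FlagStat implementation referenced at FlagStat.scala:96 and :115.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.bdgenomics.adam.rdd.ADAMContext._

    // Rough spot-check of a few flagstat-style counts (getter names are assumptions).
    val sc = new SparkContext(new SparkConf().setAppName("flagstat-spot-check"))
    val reads = sc.loadAlignments("mouse_chrM.bam.reads.adam").rdd.cache()

    val total = reads.count()                                  // expect 163064
    val mapped = reads.filter(_.getReadMapped).count()         // expect 160512
    val properlyPaired = reads.filter(r =>
      r.getReadPaired && r.getProperPair).count()              // expect 154982

    println(f"$mapped / $total mapped (${100.0 * mapped / total}%.2f%%)")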
rm -rf ${ADAM_TMP_DIR}
+ rm -rf /tmp/adamTests5hCMTI/deleteMePleaseThisIsNoLongerNeeded
popd
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu

pushd ${PROJECT_ROOT}
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu
# move back to Scala 2.11 as default
if [ ${SCALAVER} == 2.12 ];
then
    set +e
    ./scripts/move_to_scala_2.11.sh
    set -e
fi
+ '[' 2.12 == 2.12 ']'
+ set +e
+ ./scripts/move_to_scala_2.11.sh
+ set -e

# test that the source is formatted correctly
./scripts/format-source
+ ./scripts/format-source
+++ dirname ./scripts/format-source
++ cd ./scripts
++ pwd
+ DIR=/home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu/scripts
+ pushd /home/jenkins/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu/scripts/..
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu ~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu
+ mvn org.scalariform:scalariform-maven-plugin:format license:format
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=1g; support was removed in 8.0
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.11
[INFO] ADAM_2.11: Shader workaround
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils
[INFO] ADAM_2.11: Core
[INFO] ADAM_2.11: APIs for Java, Python
[INFO] ADAM_2.11: CLI
[INFO] ADAM_2.11: Assembly
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11 0.29.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent-spark2_2.11 ---
[INFO] Modified 2 of 242 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-parent-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Shader workaround 0.29.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-shade-spark2_2.11 ---
[INFO] Modified 0 of 0 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-shade-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Avro-to-Dataset codegen utils 0.29.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-codegen-spark2_2.11 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-codegen-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Core 0.29.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core-spark2_2.11 ---
[INFO] Modified 0 of 202 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-core-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: APIs for Java, Python 0.29.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis-spark2_2.11 ---
[INFO] Modified 0 of 5 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-apis-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: CLI 0.29.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli-spark2_2.11 ---
[INFO] Modified 0 of 29 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-cli-spark2_2.11 ---
[INFO] Updating license headers...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.11: Assembly 0.29.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-assembly-spark2_2.11 ---
[INFO] Modified 0 of 1 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-assembly-spark2_2.11 ---
[INFO] Updating license headers...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.11 .......................................... SUCCESS [  6.062 s]
[INFO] ADAM_2.11: Shader workaround ....................... SUCCESS [  0.028 s]
[INFO] ADAM_2.11: Avro-to-Dataset codegen utils ........... SUCCESS [  0.059 s]
[INFO] ADAM_2.11: Core .................................... SUCCESS [  3.292 s]
[INFO] ADAM_2.11: APIs for Java, Python ................... SUCCESS [  0.149 s]
[INFO] ADAM_2.11: CLI ..................................... SUCCESS [  0.219 s]
[INFO] ADAM_2.11: Assembly ................................ SUCCESS [  0.013 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.307 s
[INFO] Finished at: 2019-09-11T12:43:13-07:00
[INFO] Final Memory: 22M/1089M
[INFO] ------------------------------------------------------------------------
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu
if test -n "$(git status --porcelain)"
then
    echo "Please run './scripts/format-source'"
    exit 1
fi
git status --porcelain
++ git status --porcelain
+ test -n ''
popd    
+ popd
~/workspace/ADAM-prb/HADOOP_VERSION/2.7.5/SCALAVER/2.12/SPARK_VERSION/2.4.4/label/ubuntu

echo
+ echo

echo "All the tests passed"
+ echo 'All the tests passed'
All the tests passed
echo
+ echo

Recording test results
Publishing Scoverage XML and HTML report...
Setting commit status on GitHub for https://github.com/bigdatagenomics/adam/commit/d299f8769e8b9a9341a998663908cb62a0efa40e
Finished: SUCCESS