Console Output

[log truncated: skipping 913 KB of earlier output]
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/RDDBoundVariantContextDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/VCFInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/GenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/VCFInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/VariantContextDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/ParquetUnboundVariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/VariantDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/ParquetUnboundGenotypeDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/VCFOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/variant/ADAMVCFOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/RightOuterTreeRegionJoinAndGroupByRight.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/LeftOuterShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenomicDatasetWithLineage.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/FASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ADAMBAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ReadDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/RDDBoundAlignmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/RDDBoundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ParquetUnboundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/SingleReadBucketSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/AnySAMOutFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ADAMBAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/SAMInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ADAMSAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/BAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ReferencePositionPairSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/AnySAMInFormatterCompanion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ADAMCRAMOutputFormatHeaderLess.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ADAMSAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/QualityScoreBin$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/DatasetBoundReadDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/FASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ADAMCRAMOutputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/DatasetBoundAlignmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/AnySAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/IncorrectMDTagException.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/AlignmentDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/QualityScoreBin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/ParquetUnboundAlignmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/SAMInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/AlignmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/read/BAMInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/RightOuterShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/RightOuterTreeRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/VictimlessSortedIntervalPartitionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenomicPositionPartitioner$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/LeftOuterShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/AvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenomicRegionPartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenomicPositionPartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/ShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/SortedIntervalPartitionJoinWithVictims.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/AvroReadGroupGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenericConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenomicRegionPartitioner$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/MultisampleGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/DatasetBoundGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/InnerTreeRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/InnerShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/InterleavedFASTQInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/Tab6InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/FragmentDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/InterleavedFASTQInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/RDDBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/Tab5InFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/DatasetBoundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/FragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/ParquetUnboundFragmentDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/Tab6InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/fragment/Tab5InFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/InnerShuffleRegionJoinAndGroupByLeft.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/FullOuterShuffleRegionJoin.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/RDDBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/MultisampleAvroGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/DatasetBoundGenericGenomicDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/InFormatterCompanion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/ReferencePartitioner.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/GenomicBroadcast.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/InnerTreeRegionJoinAndGroupByRight.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/FASTAInFormatter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/FASTAInFormatter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/ParquetUnboundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/RDDBoundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/SliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/DatasetBoundSequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/SequenceDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/SliceDataset$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/RDDBoundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/SequenceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/ParquetUnboundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/ds/sequence/DatasetBoundSliceDataset.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/ProcessingStep$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantAnnotation.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantAnnotation$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantCallingAnnotations.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantCallingAnnotations$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/TranscriptEffect$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/TranscriptEffect.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/sql/VariantContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/rich/RichAlignment$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FieldEnumeration$SchemaVal.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/VariantAnnotationField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FragmentField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/AlignmentField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/VariantField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/OntologyTermField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/DbxrefField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/GenotypeField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FieldValue.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/SliceField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/Filter$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/TranscriptEffectField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FieldEnumeration.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/SampleField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/FeatureField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/ReadField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/ReferenceField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/SequenceField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/ReadGroupField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/Projection$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/projections/VariantCallingAnnotationsField$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWatermanConstantGapScoring.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/smithwaterman/SmithWaterman.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/algorithms/consensus/ConsensusGenerator.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/WritableSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/index.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/ADAMKryoRegistrator.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/InputStreamWithDecoder.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/serialization/AvroSerializer.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/apache/parquet/avro/class-use/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/apache/parquet/avro/AvroSchemaConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/InterleavedFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/class-use/FastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/FastqRecordReader.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/javadocs/org/bdgenomics/adam/io/SingleFastqInputFormat.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFragmentDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFeatureConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToAlignmentDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToSequenceDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToSliceDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToReadDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToCoverageDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/JavaADAMContext$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToVariantContextDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToCoverageDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToFeaturesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FeaturesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToVariantContextsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToGenotypesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToFeatureDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFragmentsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSlicesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToVariantsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/CoverageToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SequencesToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToFragmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/SlicesToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToVariantsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/AlignmentsToVariantContextConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToSequencesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSlicesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToGenotypesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToReadsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ToGenotypeDatasetConversion.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToReadsDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantsToSequencesDatasetConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/FragmentsToFeaturesConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/VariantContextsToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/GenotypesToCoverageConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/java/ReadsToAlignmentsConverter.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/api/python/DataFrameConversionWrapper.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeaturesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/MergeShardsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequences.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypes$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequencesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragmentsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlices.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountReadKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmersArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFeatures$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariants$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformAlignments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmers$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformGenotypesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlices$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSequences$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformSlicesArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformVariantsArgs.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/TransformFragments$.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/scaladocs/org/bdgenomics/adam/cli/CountSliceKmers.html longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/repo/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT-sources.jar longer than 100 characters.
[WARNING] Entry: adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/repo/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT-javadoc.jar longer than 100 characters.
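The "longer than 100 characters" warnings above come from the tar archive step: the strict ustar header format caps an entry name at 100 characters (plus a 155-character prefix that can only absorb a split at a `/`). The usual fix documented for maven-assembly-plugin is to set `<tarLongFileMode>posix</tarLongFileMode>` (or `gnu`) in the assembly configuration. The following sketch reproduces the limit outside Maven; the `/tmp` path and file name are hypothetical, and GNU tar is assumed:

```shell
# Illustration of the ustar 100-character entry-name limit behind the
# warnings above (hypothetical /tmp path, not part of this build; GNU tar).
demo=/tmp/tar-name-limit-demo
mkdir -p "$demo" && cd "$demo"
long_name=$(printf 'a%.0s' $(seq 1 120))    # single 120-character component
touch "$long_name"
# Strict ustar cannot store a 120-char name that has no '/' to split on;
# GNU tar reports it and skips the file.
tar --format=ustar -cf ustar.tar "$long_name" 2>&1 | head -n 1
# The POSIX pax format has no such limit and stores the entry normally.
tar --format=posix -cf posix.tar "$long_name"
tar -tf posix.tar
```

With `posix` (pax) or `gnu` selected in the plugin, these warnings disappear and the long scaladoc paths are stored intact.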
[INFO] Building zip: /tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/adam-distribution/target/adam-distribution-spark3_2.12-0.35.0-SNAPSHOT-bin.zip
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for ADAM_2.12 0.35.0-SNAPSHOT:
[INFO] 
[INFO] ADAM_2.12 .......................................... SUCCESS [  9.886 s]
[INFO] ADAM_2.12: Shader workaround ....................... SUCCESS [  7.749 s]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils ........... SUCCESS [ 10.009 s]
[INFO] ADAM_2.12: Core .................................... SUCCESS [01:45 min]
[INFO] ADAM_2.12: APIs for Java, Python ................... SUCCESS [ 29.315 s]
[INFO] ADAM_2.12: CLI ..................................... SUCCESS [ 42.720 s]
[INFO] ADAM_2.12: Assembly ................................ SUCCESS [ 19.894 s]
[INFO] ADAM_2.12: Python APIs ............................. SUCCESS [01:25 min]
[INFO] ADAM_2.12: Distribution ............................ SUCCESS [ 42.234 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  05:53 min
[INFO] Finished at: 2021-04-27T12:52:02-07:00
[INFO] ------------------------------------------------------------------------
+ tar tzvf adam-distribution/target/adam-distribution-spark3_2.12-0.35.0-SNAPSHOT-bin.tar.gz
+ grep bdgenomics.adam
+ grep egg
drwxrwxr-x jenkins/jenkins        0 2021-04-27 12:49 adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/r/bdgenomics.adam.egg-info/
-rw-r--r-- jenkins/jenkins 46836375 2021-04-27 12:50 adam-distribution-spark3_2.12-0.35.0-SNAPSHOT/repo/bdgenomics.adam-0.34.0a0-py3.6.egg
+ ./bin/pyadam
Using PYSPARK=/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/spark-3.0.2-bin-hadoop3.2/bin/pyspark
2021-04-27 12:52:05 WARN  Utils:69 - Your hostname, research-jenkins-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
2021-04-27 12:52:05 WARN  Utils:69 - Set SPARK_LOCAL_IP if you need to bind to another address
2021-04-27 12:52:05 WARN  NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2021-04-27 12:52:12 WARN  package:69 - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.sql.debug.maxToStringFields'.
+ source deactivate
#!/bin/sh
_CONDA_ROOT="/home/jenkins/anaconda2"
++ _CONDA_ROOT=/home/jenkins/anaconda2
# Copyright (C) 2012 Anaconda, Inc
# SPDX-License-Identifier: BSD-3-Clause
\echo >&2 "DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'."
++ echo 'DeprecationWarning: '\''source deactivate'\'' is deprecated. Use '\''conda deactivate'\''.'
DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'.
\. "$_CONDA_ROOT/etc/profile.d/conda.sh" || return $?
++ . /home/jenkins/anaconda2/etc/profile.d/conda.sh
export CONDA_EXE='/home/jenkins/anaconda2/bin/conda'
+++ export CONDA_EXE=/home/jenkins/anaconda2/bin/conda
+++ CONDA_EXE=/home/jenkins/anaconda2/bin/conda
export _CE_M=''
+++ export _CE_M=
+++ _CE_M=
export _CE_CONDA=''
+++ export _CE_CONDA=
+++ _CE_CONDA=
export CONDA_PYTHON_EXE='/home/jenkins/anaconda2/bin/python'
+++ export CONDA_PYTHON_EXE=/home/jenkins/anaconda2/bin/python
+++ CONDA_PYTHON_EXE=/home/jenkins/anaconda2/bin/python

# Copyright (C) 2012 Anaconda, Inc
# SPDX-License-Identifier: BSD-3-Clause

__add_sys_prefix_to_path() {
    # In dev-mode CONDA_EXE is python.exe and on Windows
    # it is in a different relative location to condabin.
    if [ -n "${_CE_CONDA}" ] && [ -n "${WINDIR+x}" ]; then
        SYSP=$(\dirname "${CONDA_EXE}")
    else
        SYSP=$(\dirname "${CONDA_EXE}")
        SYSP=$(\dirname "${SYSP}")
    fi

    if [ -n "${WINDIR+x}" ]; then
        PATH="${SYSP}/bin:${PATH}"
        PATH="${SYSP}/Scripts:${PATH}"
        PATH="${SYSP}/Library/bin:${PATH}"
        PATH="${SYSP}/Library/usr/bin:${PATH}"
        PATH="${SYSP}/Library/mingw-w64/bin:${PATH}"
        PATH="${SYSP}:${PATH}"
    else
        PATH="${SYSP}/bin:${PATH}"
    fi
    \export PATH
}

__conda_hashr() {
    if [ -n "${ZSH_VERSION:+x}" ]; then
        \rehash
    elif [ -n "${POSH_VERSION:+x}" ]; then
        :  # pass
    else
        \hash -r
    fi
}

__conda_activate() {
    if [ -n "${CONDA_PS1_BACKUP:+x}" ]; then
        # Handle transition from shell activated with conda <= 4.3 to a subsequent activation
        # after conda updated to >= 4.4. See issue #6173.
        PS1="$CONDA_PS1_BACKUP"
        \unset CONDA_PS1_BACKUP
    fi

    \local cmd="$1"
    shift
    \local ask_conda
    OLDPATH="${PATH}"
    __add_sys_prefix_to_path
    ask_conda="$(PS1="$PS1" "$CONDA_EXE" $_CE_M $_CE_CONDA shell.posix "$cmd" "$@")" || \return $?
    PATH="${OLDPATH}"
    \eval "$ask_conda"
    __conda_hashr
}

__conda_reactivate() {
    \local ask_conda
    OLDPATH="${PATH}"
    __add_sys_prefix_to_path
    ask_conda="$(PS1="$PS1" "$CONDA_EXE" $_CE_M $_CE_CONDA shell.posix reactivate)" || \return $?
    PATH="${OLDPATH}"
    \eval "$ask_conda"
    __conda_hashr
}

conda() {
    if [ "$#" -lt 1 ]; then
        "$CONDA_EXE" $_CE_M $_CE_CONDA
    else
        \local cmd="$1"
        shift
        case "$cmd" in
            activate|deactivate)
                __conda_activate "$cmd" "$@"
                ;;
            install|update|upgrade|remove|uninstall)
                OLDPATH="${PATH}"
                __add_sys_prefix_to_path
                "$CONDA_EXE" $_CE_M $_CE_CONDA "$cmd" "$@"
                \local t1=$?
                PATH="${OLDPATH}"
                if [ $t1 = 0 ]; then
                    __conda_reactivate
                else
                    return $t1
                fi
                ;;
            *)
                OLDPATH="${PATH}"
                __add_sys_prefix_to_path
                "$CONDA_EXE" $_CE_M $_CE_CONDA "$cmd" "$@"
                \local t1=$?
                PATH="${OLDPATH}"
                return $t1
                ;;
        esac
    fi
}

if [ -z "${CONDA_SHLVL+x}" ]; then
    \export CONDA_SHLVL=0
    # In dev-mode CONDA_EXE is python.exe and on Windows
    # it is in a different relative location to condabin.
    if [ -n "${_CE_CONDA+x}" ] && [ -n "${WINDIR+x}" ]; then
        PATH="$(\dirname "$CONDA_EXE")/condabin${PATH:+":${PATH}"}"
    else
        PATH="$(\dirname "$(\dirname "$CONDA_EXE")")/condabin${PATH:+":${PATH}"}"
    fi
    \export PATH

    # We're not allowing PS1 to be unbound. It must at least be set.
    # However, we're not exporting it, which can cause problems when starting a second shell
    # via a first shell (i.e. starting zsh from bash).
    if [ -z "${PS1+x}" ]; then
        PS1=
    fi
fi
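The hook source above follows one pattern throughout: a subprocess (`"$CONDA_EXE" ... shell.posix "$cmd"`) prints eval-able POSIX shell code to stdout, and the calling function `eval`s it so the environment changes take effect in the *current* shell rather than dying with the child process. A minimal sketch of that eval-dispatch pattern, with a hypothetical `emit_env` standing in for the `conda shell.posix` call:

```shell
# emit_env is a stand-in for `"$CONDA_EXE" shell.posix activate`:
# it prints shell code on stdout instead of mutating anything itself.
emit_env() {
    printf "export DEMO_ENV='%s'\n" "$1"
}

# Mirror of __conda_activate's core: capture the emitted code, bail out
# on failure, then eval it so the export lands in this shell.
activate_demo() {
    local ask
    ask="$(emit_env "$1")" || return $?
    eval "$ask"
}

activate_demo myenv
echo "$DEMO_ENV"   # prints "myenv"
```

This indirection is why `conda activate` must be a shell function, not a plain executable: a child process cannot modify its parent's `PATH` or `PS1`.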
+++ '[' -z x ']'

conda deactivate
++ conda deactivate
++ '[' 1 -lt 1 ']'
++ local cmd=deactivate
++ shift
++ case "$cmd" in
++ __conda_activate deactivate
++ '[' -n '' ']'
++ local cmd=deactivate
++ shift
++ local ask_conda
++ OLDPATH=/home/jenkins/anaconda2/envs/adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33/bin:/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
++ __add_sys_prefix_to_path
++ '[' -n '' ']'
+++ dirname /home/jenkins/anaconda2/bin/conda
++ SYSP=/home/jenkins/anaconda2/bin
+++ dirname /home/jenkins/anaconda2/bin
++ SYSP=/home/jenkins/anaconda2
++ '[' -n '' ']'
++ PATH=/home/jenkins/anaconda2/bin:/home/jenkins/anaconda2/envs/adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33/bin:/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
++ export PATH
+++ PS1='(adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33) '
+++ /home/jenkins/anaconda2/bin/conda shell.posix deactivate
++ ask_conda='export PATH='\''/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin'\''
unset CONDA_PREFIX
unset CONDA_DEFAULT_ENV
unset CONDA_PROMPT_MODIFIER
PS1='\'''\''
export CONDA_SHLVL='\''0'\''
export CONDA_EXE='\''/home/jenkins/anaconda2/bin/conda'\''
export _CE_M='\'''\''
export _CE_CONDA='\'''\''
export CONDA_PYTHON_EXE='\''/home/jenkins/anaconda2/bin/python'\'''
++ PATH=/home/jenkins/anaconda2/envs/adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33/bin:/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
++ eval 'export PATH='\''/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin'\''
unset CONDA_PREFIX
unset CONDA_DEFAULT_ENV
unset CONDA_PROMPT_MODIFIER
PS1='\'''\''
export CONDA_SHLVL='\''0'\''
export CONDA_EXE='\''/home/jenkins/anaconda2/bin/conda'\''
export _CE_M='\'''\''
export _CE_CONDA='\'''\''
export CONDA_PYTHON_EXE='\''/home/jenkins/anaconda2/bin/python'\'''
export PATH='/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin'
+++ export PATH=/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
+++ PATH=/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
unset CONDA_PREFIX
+++ unset CONDA_PREFIX
unset CONDA_DEFAULT_ENV
+++ unset CONDA_DEFAULT_ENV
unset CONDA_PROMPT_MODIFIER
+++ unset CONDA_PROMPT_MODIFIER
PS1=''
+++ PS1=
export CONDA_SHLVL='0'
+++ export CONDA_SHLVL=0
+++ CONDA_SHLVL=0
export CONDA_EXE='/home/jenkins/anaconda2/bin/conda'
+++ export CONDA_EXE=/home/jenkins/anaconda2/bin/conda
+++ CONDA_EXE=/home/jenkins/anaconda2/bin/conda
export _CE_M=''
+++ export _CE_M=
+++ _CE_M=
export _CE_CONDA=''
+++ export _CE_CONDA=
+++ _CE_CONDA=
export CONDA_PYTHON_EXE='/home/jenkins/anaconda2/bin/python'
+++ export CONDA_PYTHON_EXE=/home/jenkins/anaconda2/bin/python
+++ CONDA_PYTHON_EXE=/home/jenkins/anaconda2/bin/python
++ __conda_hashr
++ '[' -n '' ']'
++ '[' -n '' ']'
++ hash -r
+ conda remove -y -n adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33 --all
+ '[' 5 -lt 1 ']'
+ local cmd=remove
+ shift
+ case "$cmd" in
+ OLDPATH=/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
+ __add_sys_prefix_to_path
+ '[' -n '' ']'
++ dirname /home/jenkins/anaconda2/bin/conda
+ SYSP=/home/jenkins/anaconda2/bin
++ dirname /home/jenkins/anaconda2/bin
+ SYSP=/home/jenkins/anaconda2
+ '[' -n '' ']'
+ PATH=/home/jenkins/anaconda2/bin:/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
+ export PATH
+ /home/jenkins/anaconda2/bin/conda remove -y -n adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33 --all

Remove all packages in environment /home/jenkins/anaconda2/envs/adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33:


## Package Plan ##

  environment location: /home/jenkins/anaconda2/envs/adam-build-6530dcc1-ac60-45ef-9dcf-27cd45540d33


The following packages will be REMOVED:

  _libgcc_mutex-0.1-main
  ca-certificates-2021.4.13-h06a4308_1
  certifi-2020.12.5-py36h06a4308_0
  ld_impl_linux-64-2.33.1-h53a641e_7
  libffi-3.3-he6710b0_2
  libgcc-ng-9.1.0-hdf63c60_0
  libstdcxx-ng-9.1.0-hdf63c60_0
  ncurses-6.2-he6710b0_1
  openssl-1.1.1k-h27cfd23_0
  pip-21.0.1-py36h06a4308_0
  python-3.6.13-hdb3f193_0
  readline-8.1-h27cfd23_0
  setuptools-52.0.0-py36h06a4308_0
  sqlite-3.35.4-hdfb4753_0
  tk-8.6.10-hbc83047_0
  wheel-0.36.2-pyhd3eb1b0_0
  xz-5.2.5-h7b6447c_0
  zlib-1.2.11-h7b6447c_3


Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
+ local t1=0
+ PATH=/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
+ '[' 0 = 0 ']'
+ __conda_reactivate
+ local ask_conda
+ OLDPATH=/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
+ __add_sys_prefix_to_path
+ '[' -n '' ']'
++ dirname /home/jenkins/anaconda2/bin/conda
+ SYSP=/home/jenkins/anaconda2/bin
++ dirname /home/jenkins/anaconda2/bin
+ SYSP=/home/jenkins/anaconda2
+ '[' -n '' ']'
+ PATH=/home/jenkins/anaconda2/bin:/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
+ export PATH
++ PS1=
++ /home/jenkins/anaconda2/bin/conda shell.posix reactivate
+ ask_conda=
+ PATH=/home/jenkins/anaconda2/condabin:/usr/java/latest/bin:/usr/java/latest/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
+ eval ''
+ __conda_hashr
+ '[' -n '' ']'
+ '[' -n '' ']'
+ hash -r
+ cp -r adam-python/target /home/jenkins/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu/scripts/../adam-python/
+ pushd adam-python
/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/adam-python /tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu
+ make clean
pip uninstall -y adam
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
WARNING: Skipping adam as it is not installed.
rm -rf bdgenomics/*.egg*
rm -rf build/
rm -rf dist/
+ make clean_sdist
rm -rf dist
+ popd
/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded ~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu

if [ ${SPARK_VERSION} == 3.0.2 ]
then
    echo "Unable to build R support for Spark 3.0.2, SparkR is not available"
else
    # make a directory to install SparkR into, and set the R user libs path
    export R_LIBS_USER=${SPARK_HOME}/local_R_libs
    mkdir -p ${R_LIBS_USER}
    R CMD INSTALL \
      -l ${R_LIBS_USER} \
      ${SPARK_HOME}/R/lib/SparkR/

    export SPARKR_SUBMIT_ARGS="--jars ${ASSEMBLY_DIR}/${ASSEMBLY_JAR} --driver-class-path ${ASSEMBLY_DIR}/${ASSEMBLY_JAR} sparkr-shell"

    mvn -U \
    	-P r \
    	package \
    	-Dsuites=select.no.suites\* \
    	-Dhadoop.version=${HADOOP_VERSION}
fi
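One portability note on the guard above: `==` inside single brackets is a bash extension; the POSIX `test`/`[` builtin uses a single `=` for string comparison. A self-contained restating of the check in portable form (`SPARK_VERSION` is hard-coded here only so the sketch runs standalone):

```shell
# Portable version guard: skip the SparkR build when the Spark release
# ships no SparkR distribution. Uses POSIX '=' rather than bash's '=='.
SPARK_VERSION=3.0.2
if [ "${SPARK_VERSION}" = "3.0.2" ]; then
    msg="Unable to build R support for Spark ${SPARK_VERSION}, SparkR is not available"
else
    msg="Building SparkR support for Spark ${SPARK_VERSION}"
fi
echo "$msg"
```

Quoting the expansion (`"${SPARK_VERSION}"`) also keeps the test from breaking if the variable is ever empty or unset.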
+ '[' 3.0.2 == 3.0.2 ']'
+ echo 'Unable to build R support for Spark 3.0.2, SparkR is not available'
Unable to build R support for Spark 3.0.2, SparkR is not available

# define filenames
BAM=mouse_chrM.bam
+ BAM=mouse_chrM.bam
READS=${BAM}.reads.adam
+ READS=mouse_chrM.bam.reads.adam
SORTED_READS=${BAM}.reads.sorted.adam
+ SORTED_READS=mouse_chrM.bam.reads.sorted.adam
FRAGMENTS=${BAM}.fragments.adam
+ FRAGMENTS=mouse_chrM.bam.fragments.adam
    
# fetch our input dataset
echo "Fetching BAM file"
+ echo 'Fetching BAM file'
Fetching BAM file
rm -rf ${BAM}
+ rm -rf mouse_chrM.bam
wget -q https://s3.amazonaws.com/bdgenomics-test/${BAM}
+ wget -q https://s3.amazonaws.com/bdgenomics-test/mouse_chrM.bam

# once fetched, convert BAM to ADAM
echo "Converting BAM to ADAM read format"
+ echo 'Converting BAM to ADAM read format'
Converting BAM to ADAM read format
rm -rf ${READS}
+ rm -rf mouse_chrM.bam.reads.adam
${ADAM} transformAlignments ${BAM} ${READS}
+ ./bin/adam-submit transformAlignments mouse_chrM.bam mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/spark-3.0.2-bin-hadoop3.2/bin/spark-submit
21/04/27 12:52:20 WARN Utils: Your hostname, research-jenkins-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/04/27 12:52:20 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/04/27 12:52:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/27 12:52:21 INFO SparkContext: Running Spark version 3.0.2
21/04/27 12:52:21 INFO ResourceUtils: ==============================================================
21/04/27 12:52:21 INFO ResourceUtils: Resources for spark.driver:

21/04/27 12:52:21 INFO ResourceUtils: ==============================================================
21/04/27 12:52:21 INFO SparkContext: Submitted application: transformAlignments
21/04/27 12:52:21 INFO SecurityManager: Changing view acls to: jenkins
21/04/27 12:52:21 INFO SecurityManager: Changing modify acls to: jenkins
21/04/27 12:52:21 INFO SecurityManager: Changing view acls groups to: 
21/04/27 12:52:21 INFO SecurityManager: Changing modify acls groups to: 
21/04/27 12:52:21 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
21/04/27 12:52:21 INFO Utils: Successfully started service 'sparkDriver' on port 34389.
21/04/27 12:52:21 INFO SparkEnv: Registering MapOutputTracker
21/04/27 12:52:21 INFO SparkEnv: Registering BlockManagerMaster
21/04/27 12:52:21 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/04/27 12:52:21 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/04/27 12:52:21 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/04/27 12:52:21 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-a1ae5915-f1c8-4166-97db-2e79abe5cbe5
21/04/27 12:52:21 INFO MemoryStore: MemoryStore started with capacity 408.9 MiB
21/04/27 12:52:22 INFO SparkEnv: Registering OutputCommitCoordinator
21/04/27 12:52:22 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/04/27 12:52:22 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.122.1:4040
21/04/27 12:52:22 INFO SparkContext: Added JAR file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar at spark://192.168.122.1:34389/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553141385
21/04/27 12:52:22 INFO Executor: Starting executor ID driver on host 192.168.122.1
21/04/27 12:52:22 INFO Executor: Fetching spark://192.168.122.1:34389/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553141385
21/04/27 12:52:22 INFO TransportClientFactory: Successfully created connection to /192.168.122.1:34389 after 37 ms (0 ms spent in bootstraps)
21/04/27 12:52:22 INFO Utils: Fetching spark://192.168.122.1:34389/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to /tmp/spark-39b9cc42-d9ce-443d-9de9-630ab1aa0129/userFiles-4f8746eb-ce3a-41bc-8cd1-7230a2ad940f/fetchFileTemp7204681994568972176.tmp
21/04/27 12:52:23 INFO Executor: Adding file:/tmp/spark-39b9cc42-d9ce-443d-9de9-630ab1aa0129/userFiles-4f8746eb-ce3a-41bc-8cd1-7230a2ad940f/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to class loader
21/04/27 12:52:23 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43265.
21/04/27 12:52:23 INFO NettyBlockTransferService: Server created on 192.168.122.1:43265
21/04/27 12:52:23 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/04/27 12:52:23 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.122.1, 43265, None)
21/04/27 12:52:23 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.122.1:43265 with 408.9 MiB RAM, BlockManagerId(driver, 192.168.122.1, 43265, None)
21/04/27 12:52:23 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.122.1, 43265, None)
21/04/27 12:52:23 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.122.1, 43265, None)
21/04/27 12:52:23 INFO ADAMContext: Loading mouse_chrM.bam as BAM/CRAM/SAM and converting to Alignments.
21/04/27 12:52:23 INFO ADAMContext: Loaded header from file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam
21/04/27 12:52:24 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 314.0 KiB, free 408.6 MiB)
21/04/27 12:52:24 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 27.2 KiB, free 408.6 MiB)
21/04/27 12:52:24 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.122.1:43265 (size: 27.2 KiB, free: 408.9 MiB)
21/04/27 12:52:24 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:2054
21/04/27 12:52:25 INFO RDDBoundAlignmentDataset: Saving data in ADAM format
21/04/27 12:52:25 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:52:25 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:52:26 INFO FileInputFormat: Total input files to process : 1
21/04/27 12:52:26 INFO SparkContext: Starting job: runJob at SparkHadoopWriter.scala:83
21/04/27 12:52:26 INFO DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
21/04/27 12:52:26 INFO DAGScheduler: Final stage: ResultStage 0 (runJob at SparkHadoopWriter.scala:83)
21/04/27 12:52:26 INFO DAGScheduler: Parents of final stage: List()
21/04/27 12:52:26 INFO DAGScheduler: Missing parents: List()
21/04/27 12:52:26 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3805), which has no missing parents
21/04/27 12:52:26 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 95.2 KiB, free 408.5 MiB)
21/04/27 12:52:26 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 34.6 KiB, free 408.4 MiB)
21/04/27 12:52:26 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.122.1:43265 (size: 34.6 KiB, free: 408.8 MiB)
21/04/27 12:52:26 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1223
21/04/27 12:52:26 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at map at GenomicDataset.scala:3805) (first 15 tasks are for partitions Vector(0))
21/04/27 12:52:26 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
21/04/27 12:52:26 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.122.1, executor driver, partition 0, PROCESS_LOCAL, 7443 bytes)
21/04/27 12:52:26 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/04/27 12:52:26 INFO NewHadoopRDD: Input split: file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam:83361792-833134657535
21/04/27 12:52:26 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:52:26 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:52:26 INFO CodecConfig: Compression: GZIP
21/04/27 12:52:26 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:52:26 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:52:26 INFO ParquetOutputFormat: Parquet block size to 134217728
21/04/27 12:52:26 INFO ParquetOutputFormat: Parquet page size to 1048576
21/04/27 12:52:26 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
21/04/27 12:52:26 INFO ParquetOutputFormat: Dictionary is on
21/04/27 12:52:26 INFO ParquetOutputFormat: Validation is off
21/04/27 12:52:26 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
21/04/27 12:52:26 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
21/04/27 12:52:26 INFO ParquetOutputFormat: Page size checking is: estimated
21/04/27 12:52:26 INFO ParquetOutputFormat: Min row count for page size check is: 100
21/04/27 12:52:26 INFO ParquetOutputFormat: Max row count for page size check is: 10000
21/04/27 12:52:26 INFO CodecPool: Got brand-new compressor [.gz]
Ignoring SAM validation error: ERROR: Record 162622, Read name 613F0AAXX100423:3:58:9979:16082, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162624, Read name 613F0AAXX100423:6:13:3141:11793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162625, Read name 613F0AAXX100423:8:39:18592:13552, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162635, Read name 613F1AAXX100423:7:2:13114:10698, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162637, Read name 613F1AAXX100423:6:100:8840:11167, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162639, Read name 613F1AAXX100423:8:15:10944:11181, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162640, Read name 613F1AAXX100423:8:17:5740:10104, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162651, Read name 613F1AAXX100423:1:53:11097:8261, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162654, Read name 613F1AAXX100423:2:112:16779:19612, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162657, Read name 613F0AAXX100423:8:28:7084:17683, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162659, Read name 613F0AAXX100423:8:39:19796:12794, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162662, Read name 613F1AAXX100423:5:116:9339:3264, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162667, Read name 613F0AAXX100423:4:67:2015:3054, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162669, Read name 613F0AAXX100423:7:7:11297:11738, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162674, Read name 613F0AAXX100423:6:59:10490:20829, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162678, Read name 613F1AAXX100423:8:11:17603:4766, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162682, Read name 613F0AAXX100423:5:86:10814:10257, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162683, Read name 613F0AAXX100423:5:117:14178:6111, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162685, Read name 613F0AAXX100423:2:3:13563:6720, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162689, Read name 613F0AAXX100423:7:59:16009:15799, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162696, Read name 613F0AAXX100423:5:31:9663:18252, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162698, Read name 613F1AAXX100423:2:27:12264:14626, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162699, Read name 613F0AAXX100423:1:120:19003:6647, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162702, Read name 613F1AAXX100423:3:37:6972:18407, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162704, Read name 613F1AAXX100423:3:77:6946:3880, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162706, Read name 613F0AAXX100423:7:48:2692:3492, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162708, Read name 613F1AAXX100423:7:80:8790:1648, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162710, Read name 6141AAAXX100423:5:30:15036:17610, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162712, Read name 613F1AAXX100423:8:80:6261:4465, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162713, Read name 6141AAAXX100423:5:74:5542:6195, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162715, Read name 613F1AAXX100423:5:14:14844:13639, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162718, Read name 613F1AAXX100423:7:112:14569:8480, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162725, Read name 613F1AAXX100423:4:56:10160:9879, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162727, Read name 6141AAAXX100423:7:89:12209:9221, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162731, Read name 6141AAAXX100423:6:55:1590:19793, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162732, Read name 6141AAAXX100423:7:102:16679:12368, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162734, Read name 613F1AAXX100423:2:7:4909:18472, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162737, Read name 6141AAAXX100423:4:73:6574:10572, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162741, Read name 6141AAAXX100423:1:8:14113:12655, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162743, Read name 6141AAAXX100423:3:40:7990:5056, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162744, Read name 6141AAAXX100423:4:36:15793:3411, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162745, Read name 6141AAAXX100423:8:83:1139:18985, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162746, Read name 6141AAAXX100423:5:7:18196:13562, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162748, Read name 6141AAAXX100423:3:114:5639:7123, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162751, Read name 6141AAAXX100423:7:47:4898:8640, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162753, Read name 6141AAAXX100423:3:64:8064:8165, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162756, Read name 613F1AAXX100423:1:105:14386:1684, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162757, Read name 613F1AAXX100423:6:98:1237:19470, MAPQ should be 0 for unmapped read.
Ignoring SAM validation error: ERROR: Record 162761, Read name 613F1AAXX100423:7:106:19658:9261, MAPQ should be 0 for unmapped read.
21/04/27 12:52:32 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16043959
21/04/27 12:52:33 INFO FileOutputCommitter: Saved output of task 'attempt_202104271252252681172853683272311_0002_r_000000_0' to file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/_temporary/0/task_202104271252252681172853683272311_0002_r_000000
21/04/27 12:52:33 INFO SparkHadoopMapRedUtil: attempt_202104271252252681172853683272311_0002_r_000000_0: Committed
21/04/27 12:52:33 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 978 bytes result sent to driver
21/04/27 12:52:33 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 6858 ms on 192.168.122.1 (executor driver) (1/1)
21/04/27 12:52:33 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/04/27 12:52:33 INFO DAGScheduler: ResultStage 0 (runJob at SparkHadoopWriter.scala:83) finished in 6.978 s
21/04/27 12:52:33 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
21/04/27 12:52:33 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
21/04/27 12:52:33 INFO DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:83, took 7.027250 s
21/04/27 12:52:33 INFO ParquetFileReader: Initiating action with parallelism: 5
21/04/27 12:52:33 INFO SparkHadoopWriter: Job job_202104271252252681172853683272311_0002 committed.
21/04/27 12:52:33 INFO SparkContext: Invoking stop() from shutdown hook
21/04/27 12:52:33 INFO SparkUI: Stopped Spark web UI at http://192.168.122.1:4040
21/04/27 12:52:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/04/27 12:52:33 INFO MemoryStore: MemoryStore cleared
21/04/27 12:52:33 INFO BlockManager: BlockManager stopped
21/04/27 12:52:33 INFO BlockManagerMaster: BlockManagerMaster stopped
21/04/27 12:52:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/04/27 12:52:33 INFO SparkContext: Successfully stopped SparkContext
21/04/27 12:52:33 INFO ShutdownHookManager: Shutdown hook called
21/04/27 12:52:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-39b9cc42-d9ce-443d-9de9-630ab1aa0129
21/04/27 12:52:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-9f7134fe-ef46-4c0d-865b-6c41861a5038
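The "Ignoring SAM validation error" warnings in the run above all report the same SAM validity rule: a read flagged as unmapped must carry a mapping quality of 0. A minimal sketch of that check (hypothetical code for illustration, not the htsjdk/ADAM source):

```python
# Sketch of the SAM validation rule behind the warnings above:
# an unmapped read must have MAPQ == 0.
def mapq_validation_error(name, unmapped, mapq):
    """Return an error message if the read violates the rule, else None."""
    if unmapped and mapq != 0:
        return f"Read name {name}, MAPQ should be 0 for unmapped read."
    return None
```

With lenient validation stringency, such records are logged and kept rather than failing the conversion, which is why the job above still commits successfully.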

# then, sort the converted reads by reference position
echo "Converting BAM to ADAM read format with sorting"
+ echo 'Converting BAM to ADAM read format with sorting'
Converting BAM to ADAM read format with sorting
rm -rf ${SORTED_READS}
+ rm -rf mouse_chrM.bam.reads.sorted.adam
${ADAM} transformAlignments -sort_by_reference_position ${READS} ${SORTED_READS}
+ ./bin/adam-submit transformAlignments -sort_by_reference_position mouse_chrM.bam.reads.adam mouse_chrM.bam.reads.sorted.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/spark-3.0.2-bin-hadoop3.2/bin/spark-submit
21/04/27 12:52:34 WARN Utils: Your hostname, research-jenkins-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/04/27 12:52:34 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/04/27 12:52:35 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/27 12:52:35 INFO SparkContext: Running Spark version 3.0.2
21/04/27 12:52:35 INFO ResourceUtils: ==============================================================
21/04/27 12:52:35 INFO ResourceUtils: Resources for spark.driver:

21/04/27 12:52:35 INFO ResourceUtils: ==============================================================
21/04/27 12:52:35 INFO SparkContext: Submitted application: transformAlignments
21/04/27 12:52:35 INFO SecurityManager: Changing view acls to: jenkins
21/04/27 12:52:35 INFO SecurityManager: Changing modify acls to: jenkins
21/04/27 12:52:35 INFO SecurityManager: Changing view acls groups to: 
21/04/27 12:52:35 INFO SecurityManager: Changing modify acls groups to: 
21/04/27 12:52:35 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
21/04/27 12:52:35 INFO Utils: Successfully started service 'sparkDriver' on port 32995.
21/04/27 12:52:35 INFO SparkEnv: Registering MapOutputTracker
21/04/27 12:52:35 INFO SparkEnv: Registering BlockManagerMaster
21/04/27 12:52:35 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/04/27 12:52:35 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/04/27 12:52:35 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/04/27 12:52:36 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-5bbda186-50fa-4f2c-85ff-6e63b695d720
21/04/27 12:52:36 INFO MemoryStore: MemoryStore started with capacity 408.9 MiB
21/04/27 12:52:36 INFO SparkEnv: Registering OutputCommitCoordinator
21/04/27 12:52:36 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/04/27 12:52:36 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.122.1:4040
21/04/27 12:52:36 INFO SparkContext: Added JAR file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar at spark://192.168.122.1:32995/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553155402
21/04/27 12:52:36 INFO Executor: Starting executor ID driver on host 192.168.122.1
21/04/27 12:52:36 INFO Executor: Fetching spark://192.168.122.1:32995/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553155402
21/04/27 12:52:36 INFO TransportClientFactory: Successfully created connection to /192.168.122.1:32995 after 34 ms (0 ms spent in bootstraps)
21/04/27 12:52:36 INFO Utils: Fetching spark://192.168.122.1:32995/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to /tmp/spark-b9b2bb8e-260c-4293-b336-b7ce1ea6c975/userFiles-3b0f8aa6-8270-4747-a0a6-b466b8956872/fetchFileTemp3209353034417595112.tmp
21/04/27 12:52:37 INFO Executor: Adding file:/tmp/spark-b9b2bb8e-260c-4293-b336-b7ce1ea6c975/userFiles-3b0f8aa6-8270-4747-a0a6-b466b8956872/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to class loader
21/04/27 12:52:37 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43345.
21/04/27 12:52:37 INFO NettyBlockTransferService: Server created on 192.168.122.1:43345
21/04/27 12:52:37 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/04/27 12:52:37 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.122.1, 43345, None)
21/04/27 12:52:37 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.122.1:43345 with 408.9 MiB RAM, BlockManagerId(driver, 192.168.122.1, 43345, None)
21/04/27 12:52:37 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.122.1, 43345, None)
21/04/27 12:52:37 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.122.1, 43345, None)
21/04/27 12:52:37 INFO ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
21/04/27 12:52:38 INFO ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
21/04/27 12:52:39 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 333.7 KiB, free 408.6 MiB)
21/04/27 12:52:39 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 31.0 KiB, free 408.5 MiB)
21/04/27 12:52:39 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.122.1:43345 (size: 31.0 KiB, free: 408.9 MiB)
21/04/27 12:52:39 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1797
21/04/27 12:52:39 INFO TransformAlignments: Sorting alignments by reference position, with references ordered by name
21/04/27 12:52:39 INFO RDDBoundAlignmentDataset: Sorting alignments by reference position
21/04/27 12:52:39 INFO FileInputFormat: Total input files to process : 1
21/04/27 12:52:39 INFO ParquetInputFormat: Total input paths to process : 1
21/04/27 12:52:40 INFO RDDBoundAlignmentDataset: Saving data in ADAM format
21/04/27 12:52:40 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:52:40 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:52:40 INFO SparkContext: Starting job: runJob at SparkHadoopWriter.scala:83
21/04/27 12:52:40 INFO DAGScheduler: Registering RDD 2 (sortBy at AlignmentDataset.scala:1008) as input to shuffle 0
21/04/27 12:52:40 INFO DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
21/04/27 12:52:40 INFO DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:83)
21/04/27 12:52:40 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
21/04/27 12:52:40 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
21/04/27 12:52:40 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentDataset.scala:1008), which has no missing parents
21/04/27 12:52:40 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 6.9 KiB, free 408.5 MiB)
21/04/27 12:52:40 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 4.0 KiB, free 408.5 MiB)
21/04/27 12:52:40 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.122.1:43345 (size: 4.0 KiB, free: 408.9 MiB)
21/04/27 12:52:40 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1223
21/04/27 12:52:40 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at sortBy at AlignmentDataset.scala:1008) (first 15 tasks are for partitions Vector(0))
21/04/27 12:52:40 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
21/04/27 12:52:40 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.122.1, executor driver, partition 0, PROCESS_LOCAL, 7482 bytes)
21/04/27 12:52:40 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/04/27 12:52:40 INFO NewHadoopRDD: Input split: file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
21/04/27 12:52:40 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
21/04/27 12:52:40 INFO InternalParquetRecordReader: at row 0. reading next block
21/04/27 12:52:40 INFO CodecPool: Got brand-new decompressor [.gz]
21/04/27 12:52:40 INFO InternalParquetRecordReader: block read in memory in 45 ms. row count = 163064
21/04/27 12:52:43 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1086 bytes result sent to driver
21/04/27 12:52:43 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3180 ms on 192.168.122.1 (executor driver) (1/1)
21/04/27 12:52:43 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/04/27 12:52:43 INFO DAGScheduler: ShuffleMapStage 0 (sortBy at AlignmentDataset.scala:1008) finished in 3.300 s
21/04/27 12:52:43 INFO DAGScheduler: looking for newly runnable stages
21/04/27 12:52:43 INFO DAGScheduler: running: Set()
21/04/27 12:52:43 INFO DAGScheduler: waiting: Set(ResultStage 1)
21/04/27 12:52:43 INFO DAGScheduler: failed: Set()
21/04/27 12:52:43 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3805), which has no missing parents
21/04/27 12:52:43 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 96.7 KiB, free 408.4 MiB)
21/04/27 12:52:43 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 35.5 KiB, free 408.4 MiB)
21/04/27 12:52:43 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.122.1:43345 (size: 35.5 KiB, free: 408.8 MiB)
21/04/27 12:52:43 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1223
21/04/27 12:52:43 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at map at GenomicDataset.scala:3805) (first 15 tasks are for partitions Vector(0))
21/04/27 12:52:43 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
21/04/27 12:52:43 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, 192.168.122.1, executor driver, partition 0, NODE_LOCAL, 7143 bytes)
21/04/27 12:52:43 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
21/04/27 12:52:43 INFO ShuffleBlockFetcherIterator: Getting 1 (24.5 MiB) non-empty blocks including 1 (24.5 MiB) local and 0 (0.0 B) host-local and 0 (0.0 B) remote blocks
21/04/27 12:52:43 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 9 ms
21/04/27 12:52:44 INFO BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.122.1:43345 in memory (size: 4.0 KiB, free: 408.8 MiB)
21/04/27 12:52:45 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:52:45 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:52:45 INFO CodecConfig: Compression: GZIP
21/04/27 12:52:45 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:52:45 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:52:45 INFO ParquetOutputFormat: Parquet block size to 134217728
21/04/27 12:52:45 INFO ParquetOutputFormat: Parquet page size to 1048576
21/04/27 12:52:45 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
21/04/27 12:52:45 INFO ParquetOutputFormat: Dictionary is on
21/04/27 12:52:45 INFO ParquetOutputFormat: Validation is off
21/04/27 12:52:45 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
21/04/27 12:52:45 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
21/04/27 12:52:45 INFO ParquetOutputFormat: Page size checking is: estimated
21/04/27 12:52:45 INFO ParquetOutputFormat: Min row count for page size check is: 100
21/04/27 12:52:45 INFO ParquetOutputFormat: Max row count for page size check is: 10000
21/04/27 12:52:45 INFO CodecPool: Got brand-new compressor [.gz]
21/04/27 12:52:48 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 16004474
21/04/27 12:52:48 INFO FileOutputCommitter: Saved output of task 'attempt_202104271252405288983992695588507_0005_r_000000_0' to file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.sorted.adam/_temporary/0/task_202104271252405288983992695588507_0005_r_000000
21/04/27 12:52:48 INFO SparkHadoopMapRedUtil: attempt_202104271252405288983992695588507_0005_r_000000_0: Committed
21/04/27 12:52:48 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1365 bytes result sent to driver
21/04/27 12:52:48 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 5426 ms on 192.168.122.1 (executor driver) (1/1)
21/04/27 12:52:48 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
21/04/27 12:52:48 INFO DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:83) finished in 5.461 s
21/04/27 12:52:48 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
21/04/27 12:52:48 INFO TaskSchedulerImpl: Killing all running tasks in stage 1: Stage finished
21/04/27 12:52:48 INFO DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:83, took 8.840379 s
21/04/27 12:52:48 INFO ParquetFileReader: Initiating action with parallelism: 5
21/04/27 12:52:49 INFO SparkHadoopWriter: Job job_202104271252405288983992695588507_0005 committed.
21/04/27 12:52:49 INFO SparkContext: Invoking stop() from shutdown hook
21/04/27 12:52:49 INFO SparkUI: Stopped Spark web UI at http://192.168.122.1:4040
21/04/27 12:52:49 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/04/27 12:52:49 INFO MemoryStore: MemoryStore cleared
21/04/27 12:52:49 INFO BlockManager: BlockManager stopped
21/04/27 12:52:49 INFO BlockManagerMaster: BlockManagerMaster stopped
21/04/27 12:52:49 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/04/27 12:52:49 INFO SparkContext: Successfully stopped SparkContext
21/04/27 12:52:49 INFO ShutdownHookManager: Shutdown hook called
21/04/27 12:52:49 INFO ShutdownHookManager: Deleting directory /tmp/spark-b9b2bb8e-260c-4293-b336-b7ce1ea6c975
21/04/27 12:52:49 INFO ShutdownHookManager: Deleting directory /tmp/spark-a28c9d66-796c-49c4-abbf-3d1100fb7357
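The log for the run above notes "Sorting alignments by reference position, with references ordered by name". Conceptually that is a two-level sort key; a hedged sketch under the assumption that records carry a reference name and a start coordinate (field names here are illustrative):

```python
# Illustrative ordering for -sort_by_reference_position: references
# ordered lexicographically by name, then reads by start position.
def sort_by_reference_position(reads):
    return sorted(reads, key=lambda r: (r["referenceName"], r["start"]))
```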

# convert the reads to fragments to re-pair the reads
echo "Converting read file to fragments"
+ echo 'Converting read file to fragments'
Converting read file to fragments
rm -rf ${FRAGMENTS}
+ rm -rf mouse_chrM.bam.fragments.adam
${ADAM} transformFragments -load_as_alignments ${READS} ${FRAGMENTS}
+ ./bin/adam-submit transformFragments -load_as_alignments mouse_chrM.bam.reads.adam mouse_chrM.bam.fragments.adam
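The transformFragments step re-pairs reads by bucketing alignments that share a read name (the shuffle registered at SingleReadBucket.scala:97 in the log below is a groupBy on read name). A hedged conceptual sketch, not the ADAM implementation:

```python
from itertools import groupby

# Conceptual sketch of fragment conversion: group alignments by read
# name so each bucket becomes one fragment with its mates re-paired.
def to_fragments(reads):
    ordered = sorted(reads, key=lambda r: r["readName"])
    return {name: list(g) for name, g in groupby(ordered, key=lambda r: r["readName"])}
```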
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/spark-3.0.2-bin-hadoop3.2/bin/spark-submit
21/04/27 12:52:50 WARN Utils: Your hostname, research-jenkins-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/04/27 12:52:50 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/04/27 12:52:51 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/27 12:52:51 INFO SparkContext: Running Spark version 3.0.2
21/04/27 12:52:51 INFO ResourceUtils: ==============================================================
21/04/27 12:52:51 INFO ResourceUtils: Resources for spark.driver:

21/04/27 12:52:51 INFO ResourceUtils: ==============================================================
21/04/27 12:52:51 INFO SparkContext: Submitted application: transformFragments
21/04/27 12:52:51 INFO SecurityManager: Changing view acls to: jenkins
21/04/27 12:52:51 INFO SecurityManager: Changing modify acls to: jenkins
21/04/27 12:52:51 INFO SecurityManager: Changing view acls groups to: 
21/04/27 12:52:51 INFO SecurityManager: Changing modify acls groups to: 
21/04/27 12:52:51 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
21/04/27 12:52:51 INFO Utils: Successfully started service 'sparkDriver' on port 41627.
21/04/27 12:52:51 INFO SparkEnv: Registering MapOutputTracker
21/04/27 12:52:51 INFO SparkEnv: Registering BlockManagerMaster
21/04/27 12:52:51 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/04/27 12:52:51 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/04/27 12:52:51 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/04/27 12:52:51 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-7184d92a-d36e-4069-8b36-02cd42d1f61b
21/04/27 12:52:51 INFO MemoryStore: MemoryStore started with capacity 408.9 MiB
21/04/27 12:52:51 INFO SparkEnv: Registering OutputCommitCoordinator
21/04/27 12:52:52 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/04/27 12:52:52 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.122.1:4040
21/04/27 12:52:52 INFO SparkContext: Added JAR file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar at spark://192.168.122.1:41627/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553171279
21/04/27 12:52:52 INFO Executor: Starting executor ID driver on host 192.168.122.1
21/04/27 12:52:52 INFO Executor: Fetching spark://192.168.122.1:41627/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553171279
21/04/27 12:52:52 INFO TransportClientFactory: Successfully created connection to /192.168.122.1:41627 after 34 ms (0 ms spent in bootstraps)
21/04/27 12:52:52 INFO Utils: Fetching spark://192.168.122.1:41627/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to /tmp/spark-28ec1b39-c749-4d0e-8a37-666adf9cbd0a/userFiles-74690338-952b-4041-889b-ac8619d18c2c/fetchFileTemp8991634568482691504.tmp
21/04/27 12:52:52 INFO Executor: Adding file:/tmp/spark-28ec1b39-c749-4d0e-8a37-666adf9cbd0a/userFiles-74690338-952b-4041-889b-ac8619d18c2c/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to class loader
21/04/27 12:52:52 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46111.
21/04/27 12:52:52 INFO NettyBlockTransferService: Server created on 192.168.122.1:46111
21/04/27 12:52:52 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/04/27 12:52:52 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.122.1, 46111, None)
21/04/27 12:52:52 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.122.1:46111 with 408.9 MiB RAM, BlockManagerId(driver, 192.168.122.1, 46111, None)
21/04/27 12:52:52 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.122.1, 46111, None)
21/04/27 12:52:52 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.122.1, 46111, None)
21/04/27 12:52:53 INFO ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
21/04/27 12:52:54 INFO ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
21/04/27 12:52:55 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 333.7 KiB, free 408.6 MiB)
21/04/27 12:52:55 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 31.0 KiB, free 408.5 MiB)
21/04/27 12:52:55 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.122.1:46111 (size: 31.0 KiB, free: 408.9 MiB)
21/04/27 12:52:55 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1797
21/04/27 12:52:55 INFO FileInputFormat: Total input files to process : 1
21/04/27 12:52:55 INFO ParquetInputFormat: Total input paths to process : 1
21/04/27 12:52:55 INFO RDDBoundFragmentDataset: Saving data in ADAM format
21/04/27 12:52:55 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:52:55 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:52:55 INFO SparkContext: Starting job: runJob at SparkHadoopWriter.scala:83
21/04/27 12:52:55 INFO DAGScheduler: Registering RDD 2 (groupBy at SingleReadBucket.scala:97) as input to shuffle 0
21/04/27 12:52:55 INFO DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
21/04/27 12:52:55 INFO DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:83)
21/04/27 12:52:55 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
21/04/27 12:52:55 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
21/04/27 12:52:55 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97), which has no missing parents
21/04/27 12:52:56 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 7.4 KiB, free 408.5 MiB)
21/04/27 12:52:56 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 4.0 KiB, free 408.5 MiB)
21/04/27 12:52:56 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.122.1:46111 (size: 4.0 KiB, free: 408.9 MiB)
21/04/27 12:52:56 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1223
21/04/27 12:52:56 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at groupBy at SingleReadBucket.scala:97) (first 15 tasks are for partitions Vector(0))
21/04/27 12:52:56 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
21/04/27 12:52:56 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.122.1, executor driver, partition 0, PROCESS_LOCAL, 7482 bytes)
21/04/27 12:52:56 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/04/27 12:52:56 INFO NewHadoopRDD: Input split: file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
21/04/27 12:52:56 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
21/04/27 12:52:56 INFO InternalParquetRecordReader: at row 0. reading next block
21/04/27 12:52:56 INFO CodecPool: Got brand-new decompressor [.gz]
21/04/27 12:52:56 INFO InternalParquetRecordReader: block read in memory in 46 ms. row count = 163064
21/04/27 12:52:59 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1086 bytes result sent to driver
21/04/27 12:52:59 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3250 ms on 192.168.122.1 (executor driver) (1/1)
21/04/27 12:52:59 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/04/27 12:52:59 INFO DAGScheduler: ShuffleMapStage 0 (groupBy at SingleReadBucket.scala:97) finished in 3.381 s
21/04/27 12:52:59 INFO DAGScheduler: looking for newly runnable stages
21/04/27 12:52:59 INFO DAGScheduler: running: Set()
21/04/27 12:52:59 INFO DAGScheduler: waiting: Set(ResultStage 1)
21/04/27 12:52:59 INFO DAGScheduler: failed: Set()
21/04/27 12:52:59 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3805), which has no missing parents
21/04/27 12:52:59 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 100.8 KiB, free 408.4 MiB)
21/04/27 12:52:59 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 37.8 KiB, free 408.4 MiB)
21/04/27 12:52:59 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.122.1:46111 (size: 37.8 KiB, free: 408.8 MiB)
21/04/27 12:52:59 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1223
21/04/27 12:52:59 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[6] at map at GenomicDataset.scala:3805) (first 15 tasks are for partitions Vector(0))
21/04/27 12:52:59 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
21/04/27 12:52:59 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, 192.168.122.1, executor driver, partition 0, NODE_LOCAL, 7143 bytes)
21/04/27 12:52:59 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
21/04/27 12:52:59 INFO ShuffleBlockFetcherIterator: Getting 1 (26.9 MiB) non-empty blocks including 1 (26.9 MiB) local and 0 (0.0 B) host-local and 0 (0.0 B) remote blocks
21/04/27 12:52:59 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 9 ms
21/04/27 12:53:01 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:53:01 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:53:01 INFO CodecConfig: Compression: GZIP
21/04/27 12:53:01 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
21/04/27 12:53:01 INFO FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
21/04/27 12:53:01 INFO BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.122.1:46111 in memory (size: 4.0 KiB, free: 408.8 MiB)
21/04/27 12:53:01 INFO ParquetOutputFormat: Parquet block size to 134217728
21/04/27 12:53:01 INFO ParquetOutputFormat: Parquet page size to 1048576
21/04/27 12:53:01 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
21/04/27 12:53:01 INFO ParquetOutputFormat: Dictionary is on
21/04/27 12:53:01 INFO ParquetOutputFormat: Validation is off
21/04/27 12:53:01 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
21/04/27 12:53:01 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
21/04/27 12:53:01 INFO ParquetOutputFormat: Page size checking is: estimated
21/04/27 12:53:01 INFO ParquetOutputFormat: Min row count for page size check is: 100
21/04/27 12:53:01 INFO ParquetOutputFormat: Max row count for page size check is: 10000
21/04/27 12:53:01 INFO CodecPool: Got brand-new compressor [.gz]
21/04/27 12:53:06 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 21417928
21/04/27 12:53:07 INFO FileOutputCommitter: Saved output of task 'attempt_202104271252559113291369121829820_0006_r_000000_0' to file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.fragments.adam/_temporary/0/task_202104271252559113291369121829820_0006_r_000000
21/04/27 12:53:07 INFO SparkHadoopMapRedUtil: attempt_202104271252559113291369121829820_0006_r_000000_0: Committed
21/04/27 12:53:07 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1365 bytes result sent to driver
21/04/27 12:53:07 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 7689 ms on 192.168.122.1 (executor driver) (1/1)
21/04/27 12:53:07 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
21/04/27 12:53:07 INFO DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:83) finished in 7.721 s
21/04/27 12:53:07 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
21/04/27 12:53:07 INFO TaskSchedulerImpl: Killing all running tasks in stage 1: Stage finished
21/04/27 12:53:07 INFO DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:83, took 11.182083 s
21/04/27 12:53:07 INFO ParquetFileReader: Initiating action with parallelism: 5
21/04/27 12:53:07 INFO SparkHadoopWriter: Job job_202104271252559113291369121829820_0006 committed.
21/04/27 12:53:07 INFO SparkContext: Invoking stop() from shutdown hook
21/04/27 12:53:07 INFO SparkUI: Stopped Spark web UI at http://192.168.122.1:4040
21/04/27 12:53:07 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/04/27 12:53:07 INFO MemoryStore: MemoryStore cleared
21/04/27 12:53:07 INFO BlockManager: BlockManager stopped
21/04/27 12:53:07 INFO BlockManagerMaster: BlockManagerMaster stopped
21/04/27 12:53:07 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/04/27 12:53:07 INFO SparkContext: Successfully stopped SparkContext
21/04/27 12:53:07 INFO ShutdownHookManager: Shutdown hook called
21/04/27 12:53:07 INFO ShutdownHookManager: Deleting directory /tmp/spark-24bba8cb-c8f5-4084-92d1-51ceced3e950
21/04/27 12:53:07 INFO ShutdownHookManager: Deleting directory /tmp/spark-28ec1b39-c749-4d0e-8a37-666adf9cbd0a

# test that printing works
echo "Printing reads and fragments"
+ echo 'Printing reads and fragments'
Printing reads and fragments
${ADAM} print ${READS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.reads.adam
${ADAM} print ${FRAGMENTS} 1>/dev/null 2>/dev/null
+ ./bin/adam-submit print mouse_chrM.bam.fragments.adam

# run flagstat to verify that flagstat runs OK
echo "Printing read statistics"
+ echo 'Printing read statistics'
Printing read statistics
${ADAM} flagstat ${READS}
+ ./bin/adam-submit flagstat mouse_chrM.bam.reads.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using spark-submit=/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/spark-3.0.2-bin-hadoop3.2/bin/spark-submit
21/04/27 12:53:26 WARN Utils: Your hostname, research-jenkins-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/04/27 12:53:26 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/04/27 12:53:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (org.bdgenomics.adam.cli.ADAMMain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/27 12:53:27 INFO SparkContext: Running Spark version 3.0.2
21/04/27 12:53:27 INFO ResourceUtils: ==============================================================
21/04/27 12:53:27 INFO ResourceUtils: Resources for spark.driver:

21/04/27 12:53:27 INFO ResourceUtils: ==============================================================
21/04/27 12:53:27 INFO SparkContext: Submitted application: flagstat
21/04/27 12:53:27 INFO SecurityManager: Changing view acls to: jenkins
21/04/27 12:53:27 INFO SecurityManager: Changing modify acls to: jenkins
21/04/27 12:53:27 INFO SecurityManager: Changing view acls groups to: 
21/04/27 12:53:27 INFO SecurityManager: Changing modify acls groups to: 
21/04/27 12:53:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
21/04/27 12:53:28 INFO Utils: Successfully started service 'sparkDriver' on port 43007.
21/04/27 12:53:28 INFO SparkEnv: Registering MapOutputTracker
21/04/27 12:53:28 INFO SparkEnv: Registering BlockManagerMaster
21/04/27 12:53:28 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/04/27 12:53:28 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/04/27 12:53:28 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/04/27 12:53:28 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-5e31d281-4c69-4ba5-8866-3cf1a86866b8
21/04/27 12:53:28 INFO MemoryStore: MemoryStore started with capacity 408.9 MiB
21/04/27 12:53:28 INFO SparkEnv: Registering OutputCommitCoordinator
21/04/27 12:53:28 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/04/27 12:53:28 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.122.1:4040
21/04/27 12:53:28 INFO SparkContext: Added JAR file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/adam-assembly/target/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar at spark://192.168.122.1:43007/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553207591
21/04/27 12:53:28 INFO Executor: Starting executor ID driver on host 192.168.122.1
21/04/27 12:53:28 INFO Executor: Fetching spark://192.168.122.1:43007/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar with timestamp 1619553207591
21/04/27 12:53:28 INFO TransportClientFactory: Successfully created connection to /192.168.122.1:43007 after 33 ms (0 ms spent in bootstraps)
21/04/27 12:53:28 INFO Utils: Fetching spark://192.168.122.1:43007/jars/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to /tmp/spark-deaf9a1c-4422-4bd3-9b96-fc2e11171bd9/userFiles-acb8321e-9557-493a-b9f9-7eea9dca0ed0/fetchFileTemp6091523227868091776.tmp
21/04/27 12:53:29 INFO Executor: Adding file:/tmp/spark-deaf9a1c-4422-4bd3-9b96-fc2e11171bd9/userFiles-acb8321e-9557-493a-b9f9-7eea9dca0ed0/adam-assembly-spark3_2.12-0.35.0-SNAPSHOT.jar to class loader
21/04/27 12:53:29 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33371.
21/04/27 12:53:29 INFO NettyBlockTransferService: Server created on 192.168.122.1:33371
21/04/27 12:53:29 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/04/27 12:53:29 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.122.1, 33371, None)
21/04/27 12:53:29 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.122.1:33371 with 408.9 MiB RAM, BlockManagerId(driver, 192.168.122.1, 33371, None)
21/04/27 12:53:29 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.122.1, 33371, None)
21/04/27 12:53:29 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.122.1, 33371, None)
21/04/27 12:53:29 INFO ADAMContext: Loading mouse_chrM.bam.reads.adam as Parquet of Alignments.
21/04/27 12:53:29 INFO ADAMContext: Reading the ADAM file at mouse_chrM.bam.reads.adam to create RDD
21/04/27 12:53:29 INFO ADAMContext: Using the specified projection schema
21/04/27 12:53:30 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 338.3 KiB, free 408.6 MiB)
21/04/27 12:53:30 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 31.8 KiB, free 408.5 MiB)
21/04/27 12:53:30 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.122.1:33371 (size: 31.8 KiB, free: 408.9 MiB)
21/04/27 12:53:30 INFO SparkContext: Created broadcast 0 from newAPIHadoopFile at ADAMContext.scala:1797
21/04/27 12:53:32 INFO FileInputFormat: Total input files to process : 1
21/04/27 12:53:32 INFO ParquetInputFormat: Total input paths to process : 1
21/04/27 12:53:32 INFO SparkContext: Starting job: aggregate at FlagStat.scala:115
21/04/27 12:53:32 INFO DAGScheduler: Got job 0 (aggregate at FlagStat.scala:115) with 1 output partitions
21/04/27 12:53:32 INFO DAGScheduler: Final stage: ResultStage 0 (aggregate at FlagStat.scala:115)
21/04/27 12:53:32 INFO DAGScheduler: Parents of final stage: List()
21/04/27 12:53:32 INFO DAGScheduler: Missing parents: List()
21/04/27 12:53:32 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at map at FlagStat.scala:96), which has no missing parents
21/04/27 12:53:32 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.3 KiB, free 408.5 MiB)
21/04/27 12:53:32 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.9 KiB, free 408.5 MiB)
21/04/27 12:53:32 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.122.1:33371 (size: 2.9 KiB, free: 408.9 MiB)
21/04/27 12:53:32 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1223
21/04/27 12:53:32 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at map at FlagStat.scala:96) (first 15 tasks are for partitions Vector(0))
21/04/27 12:53:32 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
21/04/27 12:53:32 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.122.1, executor driver, partition 0, PROCESS_LOCAL, 7493 bytes)
21/04/27 12:53:32 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/04/27 12:53:32 INFO NewHadoopRDD: Input split: file:/tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded/mouse_chrM.bam.reads.adam/part-r-00000.gz.parquet:0+10132211
21/04/27 12:53:32 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 163064 records.
21/04/27 12:53:32 INFO InternalParquetRecordReader: at row 0. reading next block
21/04/27 12:53:32 INFO CodecPool: Got brand-new decompressor [.gz]
21/04/27 12:53:32 INFO InternalParquetRecordReader: block read in memory in 22 ms. row count = 163064
21/04/27 12:53:33 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 988 bytes result sent to driver
21/04/27 12:53:33 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1254 ms on 192.168.122.1 (executor driver) (1/1)
21/04/27 12:53:33 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/04/27 12:53:33 INFO DAGScheduler: ResultStage 0 (aggregate at FlagStat.scala:115) finished in 1.348 s
21/04/27 12:53:33 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
21/04/27 12:53:33 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
21/04/27 12:53:33 INFO DAGScheduler: Job 0 finished: aggregate at FlagStat.scala:115, took 1.403282 s
163064 + 0 in total (QC-passed reads + QC-failed reads)
0 + 0 primary duplicates
0 + 0 primary duplicates - both read and mate mapped
0 + 0 primary duplicates - only read mapped
0 + 0 primary duplicates - cross chromosome
0 + 0 secondary duplicates
0 + 0 secondary duplicates - both read and mate mapped
0 + 0 secondary duplicates - only read mapped
0 + 0 secondary duplicates - cross chromosome
160512 + 0 mapped (98.43%:0.00%)
163064 + 0 paired in sequencing
81524 + 0 read1
81540 + 0 read2
154982 + 0 properly paired (95.04%:0.00%)
158044 + 0 with itself and mate mapped
2468 + 0 singletons (1.51%:0.00%)
418 + 0 with mate mapped to a different chr
120 + 0 with mate mapped to a different chr (mapQ>=5)
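As a quick sanity check (not part of the CI script), the percentages flagstat reports can be re-derived from its raw counts; this is a minimal sketch assuming flagstat formats ratios as percentages with two decimals:

```shell
#!/usr/bin/env bash
# Re-derive the percentages printed by `adam-submit flagstat` above
# from the raw read counts in the same output.
set -euo pipefail

total=163064            # QC-passed reads in total
mapped=160512           # "mapped" count
properly_paired=154982  # "properly paired" count
singletons=2468         # "singletons" count

pct() {
    # Percentage of $1 over $2, rounded to two decimals.
    awk -v n="$1" -v d="$2" 'BEGIN { printf "%.2f", 100 * n / d }'
}

mapped_pct=$(pct "$mapped" "$total")
paired_pct=$(pct "$properly_paired" "$total")
singleton_pct=$(pct "$singletons" "$total")

echo "mapped:          ${mapped_pct}%"    # matches 98.43% above
echo "properly paired: ${paired_pct}%"    # matches 95.04% above
echo "singletons:      ${singleton_pct}%" # matches 1.51% above
```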
21/04/27 12:53:33 INFO SparkContext: Invoking stop() from shutdown hook
21/04/27 12:53:33 INFO SparkUI: Stopped Spark web UI at http://192.168.122.1:4040
21/04/27 12:53:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/04/27 12:53:33 INFO MemoryStore: MemoryStore cleared
21/04/27 12:53:33 INFO BlockManager: BlockManager stopped
21/04/27 12:53:33 INFO BlockManagerMaster: BlockManagerMaster stopped
21/04/27 12:53:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/04/27 12:53:33 INFO SparkContext: Successfully stopped SparkContext
21/04/27 12:53:33 INFO ShutdownHookManager: Shutdown hook called
21/04/27 12:53:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-1f3cf09e-4698-49c0-add2-4982da55d6bc
21/04/27 12:53:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-deaf9a1c-4422-4bd3-9b96-fc2e11171bd9
rm -rf ${ADAM_TMP_DIR}
+ rm -rf /tmp/adamTestZ4RBPRF/deleteMePleaseThisIsNoLongerNeeded
popd
+ popd
~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu

pushd ${PROJECT_ROOT}
+ pushd /home/jenkins/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu/scripts/..
~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu ~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu

# test that the source is formatted correctly
./scripts/format-source
+ ./scripts/format-source
+++ dirname ./scripts/format-source
++ cd ./scripts
++ pwd
+ DIR=/home/jenkins/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu/scripts
+ pushd /home/jenkins/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu/scripts/..
~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu ~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu
+ mvn org.scalariform:scalariform-maven-plugin:format license:format
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=1g; support was removed in 8.0
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.12                                                          [pom]
[INFO] ADAM_2.12: Shader workaround                                       [jar]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils                           [jar]
[INFO] ADAM_2.12: Core                                                    [jar]
[INFO] ADAM_2.12: APIs for Java, Python                                   [jar]
[INFO] ADAM_2.12: CLI                                                     [jar]
[INFO] ADAM_2.12: Assembly                                                [jar]
[INFO] 
[INFO] ------------< org.bdgenomics.adam:adam-parent-spark3_2.12 >-------------
[INFO] Building ADAM_2.12 0.35.0-SNAPSHOT                                 [1/7]
[INFO] --------------------------------[ pom ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent-spark3_2.12 ---
[INFO] Modified 2 of 243 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-parent-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-shade-spark3_2.12 >-------------
[INFO] Building ADAM_2.12: Shader workaround 0.35.0-SNAPSHOT              [2/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-shade-spark3_2.12 ---
[INFO] Modified 0 of 0 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-shade-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] ------------< org.bdgenomics.adam:adam-codegen-spark3_2.12 >------------
[INFO] Building ADAM_2.12: Avro-to-Dataset codegen utils 0.35.0-SNAPSHOT  [3/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-codegen-spark3_2.12 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-codegen-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-core-spark3_2.12 >--------------
[INFO] Building ADAM_2.12: Core 0.35.0-SNAPSHOT                           [4/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core-spark3_2.12 ---
[INFO] Modified 0 of 202 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-core-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -------------< org.bdgenomics.adam:adam-apis-spark3_2.12 >--------------
[INFO] Building ADAM_2.12: APIs for Java, Python 0.35.0-SNAPSHOT          [5/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis-spark3_2.12 ---
[INFO] Modified 0 of 5 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-apis-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] --------------< org.bdgenomics.adam:adam-cli-spark3_2.12 >--------------
[INFO] Building ADAM_2.12: CLI 0.35.0-SNAPSHOT                            [6/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli-spark3_2.12 ---
[INFO] Modified 0 of 30 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-cli-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] 
[INFO] -----------< org.bdgenomics.adam:adam-assembly-spark3_2.12 >------------
[INFO] Building ADAM_2.12: Assembly 0.35.0-SNAPSHOT                       [7/7]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-assembly-spark3_2.12 ---
[INFO] Modified 0 of 1 .scala files
[INFO] 
[INFO] --- maven-license-plugin:1.10.b1:format (default-cli) @ adam-assembly-spark3_2.12 ---
[INFO] Updating license headers...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for ADAM_2.12 0.35.0-SNAPSHOT:
[INFO] 
[INFO] ADAM_2.12 .......................................... SUCCESS [  6.285 s]
[INFO] ADAM_2.12: Shader workaround ....................... SUCCESS [  0.043 s]
[INFO] ADAM_2.12: Avro-to-Dataset codegen utils ........... SUCCESS [  0.061 s]
[INFO] ADAM_2.12: Core .................................... SUCCESS [  3.512 s]
[INFO] ADAM_2.12: APIs for Java, Python ................... SUCCESS [  0.126 s]
[INFO] ADAM_2.12: CLI ..................................... SUCCESS [  0.199 s]
[INFO] ADAM_2.12: Assembly ................................ SUCCESS [  0.022 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  10.678 s
[INFO] Finished at: 2021-04-27T12:53:45-07:00
[INFO] ------------------------------------------------------------------------
+ popd
~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu
if test -n "$(git status --porcelain)"
then
    echo "Please run './scripts/format-source'"
    exit 1
fi
++ git status --porcelain
+ test -n ''
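The formatting guard above relies on `git status --porcelain` printing one line per changed or untracked file and nothing at all when the working tree is clean. A self-contained sketch of the same check, using a throwaway repository rather than the CI workspace:

```shell
#!/usr/bin/env bash
# Demonstrate the clean-tree guard used at the end of the CI script:
# `git status --porcelain` is empty iff the working tree is clean.
set -euo pipefail

repo=$(mktemp -d)       # throwaway repository for the demo
git -C "$repo" init -q

check_clean() {
    # Mirrors the script's `test -n "$(git status --porcelain)"` guard.
    if test -n "$(git -C "$repo" status --porcelain)"; then
        echo "dirty"
    else
        echo "clean"
    fi
}

first=$(check_clean)        # nothing tracked or untracked yet
touch "$repo/stray.txt"     # an untracked file dirties the tree
second=$(check_clean)

echo "$first"
echo "$second"
rm -rf "$repo"
```

In the CI run above the command substitution was empty (`test -n ''`), so the format check passed and the build continued.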
popd
+ popd
~/workspace/ADAM/HADOOP_VERSION/3.2.1/SCALA_VERSION/2.12/SPARK_VERSION/3.0.2/label/ubuntu

echo
+ echo

echo "All the tests passed"
+ echo 'All the tests passed'
All the tests passed
echo
+ echo

+ mvn --settings=/home/jenkins/workspace/settings.xml deploy -DskipTests -Pdistribution
[ERROR] Error executing Maven.
[ERROR] The specified user settings file does not exist: /home/jenkins/workspace/settings.xml
Build step 'Execute shell' marked build as failure
Sending e-mails to: heuermh@berkeley.edu akmorrow@berkeley.edu
ERROR: Couldn't connect to host, port: 127.0.0.1, 25; timeout 60000
com.sun.mail.util.MailConnectException: Couldn't connect to host, port: 127.0.0.1, 25; timeout 60000;
  nested exception is:
	java.net.ConnectException: Connection refused (Connection refused)
	at com.sun.mail.smtp.SMTPTransport.openServer(SMTPTransport.java:2210)
	at com.sun.mail.smtp.SMTPTransport.protocolConnect(SMTPTransport.java:722)
	at javax.mail.Service.connect(Service.java:342)
	at javax.mail.Service.connect(Service.java:222)
	at javax.mail.Service.connect(Service.java:171)
	at javax.mail.Transport.send0(Transport.java:230)
	at javax.mail.Transport.send(Transport.java:100)
	at hudson.tasks.MailSender.run(MailSender.java:130)
	at hudson.tasks.Mailer.perform(Mailer.java:176)
	at hudson.tasks.Mailer.perform(Mailer.java:139)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:803)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:752)
	at hudson.model.Build$BuildExecution.post2(Build.java:177)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:697)
	at hudson.model.Run.execute(Run.java:1932)
	at hudson.matrix.MatrixRun.run(MatrixRun.java:153)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:607)
	at com.sun.mail.util.SocketFetcher.createSocket(SocketFetcher.java:333)
	at com.sun.mail.util.SocketFetcher.getSocket(SocketFetcher.java:214)
	at com.sun.mail.smtp.SMTPTransport.openServer(SMTPTransport.java:2160)
	... 18 more
Finished: FAILURE