Console Output

Started by an SCM change
Running as SYSTEM
[EnvInject] - Loading node environment variables.
[EnvInject] - Preparing an environment for the build.
[EnvInject] - Keeping Jenkins system variables.
[EnvInject] - Keeping Jenkins build variables.
[EnvInject] - Injecting as environment variables the properties content 
PATH=/home/anaconda/envs/py36/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
AMPLAB_JENKINS="true"
JAVA_HOME=/usr/java/latest
AMPLAB_JENKINS_BUILD_HIVE_PROFILE=hive2.3
SPARK_TESTING=1
AMPLAB_JENKINS_BUILD_PROFILE=hadoop3.2
LANG=en_US.UTF-8
SPARK_BRANCH=branch-3.1

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting contributions.
Building remotely on research-jenkins-worker-05 (ubuntu20 ubuntu worker-05) in workspace /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2
The recommended git tool is: NONE
No credentials specified
 > git rev-parse --resolve-git-dir /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/.git # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/spark.git # timeout=10
Fetching upstream changes from https://github.com/apache/spark.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/spark.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/branch-3.1^{commit} # timeout=10
Checking out Revision c51f6449d38d30d0bff22df895dca515898a520b (origin/branch-3.1)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c51f6449d38d30d0bff22df895dca515898a520b # timeout=10
Commit message: "[SPARK-37392][SQL] Fix the performance bug when inferring constraints for Generate"
 > git rev-list --no-walk 281601739de100521de6009b4a65efc3e922622a # timeout=10
[spark-branch-3.1-test-sbt-hadoop-3.2] $ /bin/bash /tmp/jenkins5041413115634312173.sh
Removing R/cran-check.out
Removing R/lib/
Removing R/pkg/man/
Removing R/pkg/tests/fulltests/Rplots.pdf
Removing R/pkg/tests/fulltests/_snaps/
Removing R/unit-tests.out
Removing append/
Removing assembly/target/
Removing build/sbt-launch-1.4.6.jar
Removing common/kvstore/target/
Removing common/network-common/target/
Removing common/network-shuffle/target/
Removing common/network-yarn/target/
Removing common/sketch/target/
Removing common/tags/target/
Removing common/unsafe/target/
Removing core/derby.log
Removing core/dummy/
Removing core/ignored/
Removing core/target/
Removing core/temp-secrets/
Removing derby.log
Removing dev/__pycache__/
Removing dev/create-release/__pycache__/
Removing dev/lint-r-report.log
Removing dev/sparktestsupport/__pycache__/
Removing dev/target/
Removing examples/src/main/python/__pycache__/
Removing examples/src/main/python/ml/__pycache__/
Removing examples/src/main/python/mllib/__pycache__/
Removing examples/src/main/python/sql/__pycache__/
Removing examples/src/main/python/sql/streaming/__pycache__/
Removing examples/src/main/python/streaming/__pycache__/
Removing examples/target/
Removing external/avro/spark-warehouse/
Removing external/avro/target/
Removing external/docker-integration-tests/target/
Removing external/kafka-0-10-assembly/target/
Removing external/kafka-0-10-sql/spark-warehouse/
Removing external/kafka-0-10-sql/target/
Removing external/kafka-0-10-token-provider/target/
Removing external/kafka-0-10/target/
Removing external/kinesis-asl-assembly/target/
Removing external/kinesis-asl/checkpoint/
Removing external/kinesis-asl/src/main/python/examples/streaming/__pycache__/
Removing external/kinesis-asl/target/
Removing external/spark-ganglia-lgpl/target/
Removing graphx/target/
Removing hadoop-cloud/target/
Removing launcher/target/
Removing lib/
Removing logs/
Removing metastore_db/
Removing mllib-local/target/
Removing mllib/checkpoint/
Removing mllib/spark-warehouse/
Removing mllib/target/
Removing project/project/
Removing project/target/
Removing python/__pycache__/
Removing python/dist/
Removing python/docs/source/__pycache__/
Removing python/lib/pyspark.zip
Removing python/pyspark.egg-info/
Removing python/pyspark/__pycache__/
Removing python/pyspark/cloudpickle/__pycache__/
Removing python/pyspark/ml/__pycache__/
Removing python/pyspark/ml/linalg/__pycache__/
Removing python/pyspark/ml/param/__pycache__/
Removing python/pyspark/ml/tests/__pycache__/
Removing python/pyspark/mllib/__pycache__/
Removing python/pyspark/mllib/linalg/__pycache__/
Removing python/pyspark/mllib/stat/__pycache__/
Removing python/pyspark/mllib/tests/__pycache__/
Removing python/pyspark/python/
Removing python/pyspark/resource/__pycache__/
Removing python/pyspark/resource/tests/__pycache__/
Removing python/pyspark/sql/__pycache__/
Removing python/pyspark/sql/avro/__pycache__/
Removing python/pyspark/sql/pandas/__pycache__/
Removing python/pyspark/sql/tests/__pycache__/
Removing python/pyspark/streaming/__pycache__/
Removing python/pyspark/streaming/tests/__pycache__/
Removing python/pyspark/testing/__pycache__/
Removing python/pyspark/tests/__pycache__/
Removing python/target/
Removing python/test_coverage/__pycache__/
Removing python/test_support/__pycache__/
Removing repl/spark-warehouse/
Removing repl/target/
Removing resource-managers/kubernetes/core/target/
Removing resource-managers/kubernetes/core/temp-secret/
Removing resource-managers/kubernetes/integration-tests/target/
Removing resource-managers/kubernetes/integration-tests/tests/__pycache__/
Removing resource-managers/mesos/target/
Removing resource-managers/yarn/target/
Removing scalastyle-on-compile.generated.xml
Removing spark-warehouse/
Removing sql/__pycache__/
Removing sql/catalyst/fake/
Removing sql/catalyst/spark-warehouse/
Removing sql/catalyst/target/
Removing sql/core/spark-warehouse/
Removing sql/core/src/test/resources/__pycache__/
Removing sql/core/target/
Removing sql/hive-thriftserver/derby.log
Removing sql/hive-thriftserver/metastore_db/
Removing sql/hive-thriftserver/spark-warehouse/
Removing sql/hive-thriftserver/spark_derby/
Removing sql/hive-thriftserver/target/
Removing sql/hive/derby.log
Removing sql/hive/metastore_db/
Removing sql/hive/src/test/resources/data/scripts/__pycache__/
Removing sql/hive/target/
Removing streaming/checkpoint/
Removing streaming/target/
Removing target/
Removing tools/target/
Removing work/
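
The "Removing ..." lines above are a pre-build cleanup of the previous run's artifacts; the Jenkins wrapper script lives in /tmp and its contents are not shown. A minimal local equivalent, assuming the job uses git clean (the exact flags are an assumption, not visible in this log):

    # Hypothetical reproduction of the cleanup step: delete every untracked
    # and ignored file/directory so the workspace matches a fresh checkout
    git clean -fdx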
+++ dirname /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/install-dev.sh
++ cd /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R
++ pwd
+ FWDIR=/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R
+ LIB_DIR=/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/lib
+ mkdir -p /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/lib
+ pushd /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R
+ . /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/find-r.sh
++ '[' -z '' ']'
++ '[' '!' -z '' ']'
+++ command -v R
++ '[' '!' /usr/bin/R ']'
++++ which R
+++ dirname /usr/bin/R
++ R_SCRIPT_PATH=/usr/bin
++ echo 'Using R_SCRIPT_PATH = /usr/bin'
Using R_SCRIPT_PATH = /usr/bin
+ . /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/create-rd.sh
++ set -o pipefail
++ set -e
++++ dirname /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/create-rd.sh
+++ cd /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R
++ pushd /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R
++ . /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ /usr/bin/Rscript -e ' if(requireNamespace("devtools", quietly=TRUE)) { setwd("/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R"); devtools::document(pkg="./pkg", roclets="rd") }'
Updating SparkR documentation
First time using roxygen2. Upgrading automatically...
Loading SparkR
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
Writing structType.Rd
Writing print.structType.Rd
Writing structField.Rd
Writing print.structField.Rd
Writing summarize.Rd
Writing alias.Rd
Writing arrange.Rd
Writing as.data.frame.Rd
Writing cache.Rd
Writing checkpoint.Rd
Writing coalesce.Rd
Writing collect.Rd
Writing columns.Rd
Writing coltypes.Rd
Writing count.Rd
Writing cov.Rd
Writing corr.Rd
Writing createOrReplaceTempView.Rd
Writing cube.Rd
Writing dapply.Rd
Writing dapplyCollect.Rd
Writing gapply.Rd
Writing gapplyCollect.Rd
Writing describe.Rd
Writing distinct.Rd
Writing drop.Rd
Writing dropDuplicates.Rd
Writing nafunctions.Rd
Writing dtypes.Rd
Writing explain.Rd
Writing except.Rd
Writing exceptAll.Rd
Writing filter.Rd
Writing first.Rd
Writing groupBy.Rd
Writing hint.Rd
Writing insertInto.Rd
Writing intersect.Rd
Writing intersectAll.Rd
Writing isLocal.Rd
Writing isStreaming.Rd
Writing limit.Rd
Writing localCheckpoint.Rd
Writing merge.Rd
Writing mutate.Rd
Writing orderBy.Rd
Writing persist.Rd
Writing printSchema.Rd
Writing registerTempTable-deprecated.Rd
Writing rename.Rd
Writing repartition.Rd
Writing repartitionByRange.Rd
Writing sample.Rd
Writing rollup.Rd
Writing sampleBy.Rd
Writing saveAsTable.Rd
Writing take.Rd
Writing write.df.Rd
Writing write.jdbc.Rd
Writing write.json.Rd
Writing write.orc.Rd
Writing write.parquet.Rd
Writing write.stream.Rd
Writing write.text.Rd
Writing schema.Rd
Writing select.Rd
Writing selectExpr.Rd
Writing showDF.Rd
Writing subset.Rd
Writing summary.Rd
Writing union.Rd
Writing unionAll.Rd
Writing unionByName.Rd
Writing unpersist.Rd
Writing with.Rd
Writing withColumn.Rd
Writing withWatermark.Rd
Writing randomSplit.Rd
Writing broadcast.Rd
Writing columnfunctions.Rd
Writing between.Rd
Writing cast.Rd
Writing endsWith.Rd
Writing startsWith.Rd
Writing column_nonaggregate_functions.Rd
Writing otherwise.Rd
Writing over.Rd
Writing eq_null_safe.Rd
Writing withField.Rd
Writing dropFields.Rd
Writing partitionBy.Rd
Writing rowsBetween.Rd
Writing rangeBetween.Rd
Writing windowPartitionBy.Rd
Writing windowOrderBy.Rd
Writing column_datetime_diff_functions.Rd
Writing column_aggregate_functions.Rd
Writing column_collection_functions.Rd
Writing column_ml_functions.Rd
Writing column_string_functions.Rd
Writing column_misc_functions.Rd
Writing avg.Rd
Writing column_math_functions.Rd
Writing column.Rd
Writing column_window_functions.Rd
Writing column_datetime_functions.Rd
Writing column_avro_functions.Rd
Writing last.Rd
Writing not.Rd
Writing fitted.Rd
Writing predict.Rd
Writing rbind.Rd
Writing spark.als.Rd
Writing spark.bisectingKmeans.Rd
Writing spark.fmClassifier.Rd
Writing spark.fmRegressor.Rd
Writing spark.gaussianMixture.Rd
Writing spark.gbt.Rd
Writing spark.glm.Rd
Writing spark.isoreg.Rd
Writing spark.kmeans.Rd
Writing spark.kstest.Rd
Writing spark.lda.Rd
Writing spark.logit.Rd
Writing spark.mlp.Rd
Writing spark.naiveBayes.Rd
Writing spark.decisionTree.Rd
Writing spark.randomForest.Rd
Writing spark.survreg.Rd
Writing spark.svmLinear.Rd
Writing spark.fpGrowth.Rd
Writing spark.prefixSpan.Rd
Writing spark.powerIterationClustering.Rd
Writing spark.lm.Rd
Writing write.ml.Rd
Writing awaitTermination.Rd
Writing isActive.Rd
Writing lastProgress.Rd
Writing queryName.Rd
Writing status.Rd
Writing stopQuery.Rd
Writing print.jobj.Rd
Writing show.Rd
Writing substr.Rd
Writing match.Rd
Writing GroupedData.Rd
Writing pivot.Rd
Writing SparkDataFrame.Rd
Writing storageLevel.Rd
Writing toJSON.Rd
Writing nrow.Rd
Writing ncol.Rd
Writing dim.Rd
Writing head.Rd
Writing join.Rd
Writing crossJoin.Rd
Writing attach.Rd
Writing str.Rd
Writing histogram.Rd
Writing getNumPartitions.Rd
Writing sparkR.conf.Rd
Writing sparkR.version.Rd
Writing createDataFrame.Rd
Writing read.json.Rd
Writing read.orc.Rd
Writing read.parquet.Rd
Writing read.text.Rd
Writing sql.Rd
Writing tableToDF.Rd
Writing read.df.Rd
Writing read.jdbc.Rd
Writing read.stream.Rd
Writing WindowSpec.Rd
Writing createExternalTable-deprecated.Rd
Writing createTable.Rd
Writing cacheTable.Rd
Writing uncacheTable.Rd
Writing clearCache.Rd
Writing dropTempTable-deprecated.Rd
Writing dropTempView.Rd
Writing tables.Rd
Writing tableNames.Rd
Writing currentDatabase.Rd
Writing setCurrentDatabase.Rd
Writing listDatabases.Rd
Writing listTables.Rd
Writing listColumns.Rd
Writing listFunctions.Rd
Writing recoverPartitions.Rd
Writing refreshTable.Rd
Writing refreshByPath.Rd
Writing spark.addFile.Rd
Writing spark.getSparkFilesRootDirectory.Rd
Writing spark.getSparkFiles.Rd
Writing spark.lapply.Rd
Writing setLogLevel.Rd
Writing setCheckpointDir.Rd
Writing unresolved_named_lambda_var.Rd
Writing create_lambda.Rd
Writing invoke_higher_order_function.Rd
Writing install.spark.Rd
Writing sparkR.callJMethod.Rd
Writing sparkR.callJStatic.Rd
Writing sparkR.newJObject.Rd
Writing LinearSVCModel-class.Rd
Writing LogisticRegressionModel-class.Rd
Writing MultilayerPerceptronClassificationModel-class.Rd
Writing NaiveBayesModel-class.Rd
Writing FMClassificationModel-class.Rd
Writing BisectingKMeansModel-class.Rd
Writing GaussianMixtureModel-class.Rd
Writing KMeansModel-class.Rd
Writing LDAModel-class.Rd
Writing PowerIterationClustering-class.Rd
Writing FPGrowthModel-class.Rd
Writing PrefixSpan-class.Rd
Writing ALSModel-class.Rd
Writing AFTSurvivalRegressionModel-class.Rd
Writing GeneralizedLinearRegressionModel-class.Rd
Writing IsotonicRegressionModel-class.Rd
Writing LinearRegressionModel-class.Rd
Writing FMRegressionModel-class.Rd
Writing glm.Rd
Writing KSTest-class.Rd
Writing GBTRegressionModel-class.Rd
Writing GBTClassificationModel-class.Rd
Writing RandomForestRegressionModel-class.Rd
Writing RandomForestClassificationModel-class.Rd
Writing DecisionTreeRegressionModel-class.Rd
Writing DecisionTreeClassificationModel-class.Rd
Writing read.ml.Rd
Writing sparkR.session.stop.Rd
Writing sparkR.init-deprecated.Rd
Writing sparkRSQL.init-deprecated.Rd
Writing sparkRHive.init-deprecated.Rd
Writing sparkR.session.Rd
Writing sparkR.uiWebUrl.Rd
Writing setJobGroup.Rd
Writing clearJobGroup.Rd
Writing cancelJobGroup.Rd
Writing setJobDescription.Rd
Writing setLocalProperty.Rd
Writing getLocalProperty.Rd
Writing crosstab.Rd
Writing freqItems.Rd
Writing approxQuantile.Rd
Writing StreamingQuery.Rd
Writing hashCode.Rd
+ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/lib /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/pkg/
* installing *source* package ‘SparkR’ ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (SparkR)
+ cd /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/lib
+ jar cfM /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/R/lib/sparkr.zip SparkR
+ popd
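
The '+'-prefixed trace above is R/install-dev.sh (run with shell xtrace enabled) regenerating the SparkR Rd documentation via devtools/roxygen2, installing the package into R/lib, and packaging it as sparkr.zip. A rough local equivalent, assuming a Spark checkout at $SPARK_HOME with R and a JDK on the PATH:

    # Sketch only: install-dev.sh drives the documentation and install steps itself
    cd "$SPARK_HOME"
    ./R/install-dev.sh                       # devtools::document + R CMD INSTALL into R/lib
    (cd R/lib && jar cfM sparkr.zip SparkR)  # same jar command as in the trace above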
[info] Using build tool sbt with Hadoop profile hadoop3.2 and Hive profile hive2.3 under environment amplab_jenkins
[info] Found the following changed modules: root
[info] Setup the following environment variables for tests: 

========================================================================
Running Apache RAT checks
========================================================================
Attempting to fetch rat
RAT checks passed.
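
The license audit can be run on its own; a sketch, assuming Spark's dev/check-license helper (presumably what printed "Attempting to fetch rat" above):

    # Fetches Apache RAT on first use, then audits license headers in the tree
    ./dev/check-license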

========================================================================
Running Scala style checks
========================================================================
[info] Checking Scala style using SBT with these profiles:  -Phadoop-3.2 -Phive-2.3 -Pmesos -Pkubernetes -Phadoop-cloud -Phive-thriftserver -Pspark-ganglia-lgpl -Phive -Pkinesis-asl -Pyarn
Scalastyle checks passed.
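
A sketch of the equivalent standalone Scala style run, assuming the dev/lint-scala wrapper from the Spark source tree:

    # Runs the Scalastyle checks across the full build, as the CI step above does
    ./dev/lint-scala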

========================================================================
Running Python style checks
========================================================================
starting python compilation test...
python compilation succeeded.

starting pycodestyle test...
pycodestyle checks passed.

starting flake8 test...
flake8 checks passed.

The mypy command was not found. Skipping for now.
python3 has Sphinx 3.1+ installed but it requires lower than 3.1. Skipping Sphinx build for now.


all lint-python tests passed!
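
The Python lint pass above (compile check, pycodestyle, flake8, with mypy and the Sphinx doc build skipped on this worker) comes from Spark's lint wrapper; a sketch of reproducing it, assuming dev/lint-python:

    # Skips mypy and the Sphinx build when compatible versions are absent,
    # as this run did
    ./dev/lint-python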

========================================================================
Running R style checks
========================================================================
Loading required namespace: SparkR
Loading required namespace: lintr
lintr checks passed.
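
A sketch of the matching local R style run, assuming dev/lint-r plus the SparkR and lintr packages this log shows being loaded:

    # Requires the SparkR package built above and the lintr R package
    ./dev/lint-r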

========================================================================
Building Spark
========================================================================
[info] Building Spark using SBT with these arguments:  -Phadoop-3.2 -Phive-2.3 -Pmesos -Pkubernetes -Phadoop-cloud -Phive-thriftserver -Pspark-ganglia-lgpl -Phive -Pkinesis-asl -Pyarn test:package streaming-kinesis-asl-assembly/assembly
Using /usr/java/latest as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
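
Reproduced standalone, the build invocation reported above is roughly:

    # Same profiles and targets as the [info] line above; JAVA_HOME defaults
    # to /usr/java/latest on this worker unless -java-home overrides it
    build/sbt -Phadoop-3.2 -Phive-2.3 -Pmesos -Pkubernetes -Phadoop-cloud \
      -Phive-thriftserver -Pspark-ganglia-lgpl -Phive -Pkinesis-asl -Pyarn \
      test:package streaming-kinesis-asl-assembly/assembly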
[info] welcome to sbt 1.4.6 (Private Build Java 1.8.0_275)
[info] loading settings for project spark-branch-3-1-test-sbt-hadoop-3-2-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project
[info] resolving key references (36199 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/)
[warn] there are 204 keys that are not used by any other settings/tasks:
[warn]  
[warn] (condensed: the same six unused keys are reported for each of these
[warn] 34 projects: assembly, avro, catalyst, core, examples, ganglia-lgpl,
[warn] graphx, hadoop-cloud, hive, hive-thriftserver, kubernetes, kvstore,
[warn] launcher, mesos, mllib, mllib-local, network-common, network-shuffle,
[warn] network-yarn, repl, sketch, spark, sql, sql-kafka-0-10, streaming,
[warn] streaming-kafka-0-10, streaming-kafka-0-10-assembly,
[warn] streaming-kinesis-asl, streaming-kinesis-asl-assembly, tags,
[warn] token-provider-kafka-0-10, tools, unsafe, yarn)
[warn] * <project> / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * <project> / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * <project> / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * <project> / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * <project> / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * <project> / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn]  
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] compiling 1 Scala source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/tools/target/scala-2.12/classes ...
[info] compiling 2 Scala sources and 8 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/tags/target/scala-2.12/classes ...
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] compiling 78 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/network-common/target/scala-2.12/classes ...
[info] done compiling
[info] compiling 12 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/kvstore/target/scala-2.12/classes ...
[info] compiling 20 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/launcher/target/scala-2.12/classes ...
[info] compiling 6 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/tags/target/scala-2.12/test-classes ...
[info] compiling 9 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/sketch/target/scala-2.12/classes ...
[info] compiling 18 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/unsafe/target/scala-2.12/classes ...
[info] compiling 5 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/mllib-local/target/scala-2.12/classes ...
[info] done compiling
[info] compiling 39 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/network-shuffle/target/scala-2.12/classes ...
[info] done compiling
[info] compiling 24 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/network-common/target/scala-2.12/test-classes ...
[info] done compiling
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:22:1:  Unsafe is internal proprietary API and may be removed in a future release
[warn] import sun.misc.Unsafe;
[warn]                ^
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:28:1:  Unsafe is internal proprietary API and may be removed in a future release
[warn]   private static final Unsafe _UNSAFE;
[warn]                        ^
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:150:1:  Unsafe is internal proprietary API and may be removed in a future release
[warn]     sun.misc.Unsafe unsafe;
[warn]             ^
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:152:1:  Unsafe is internal proprietary API and may be removed in a future release
[warn]       Field unsafeField = Unsafe.class.getDeclaredField("theUnsafe");
[warn]                           ^
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:154:1:  Unsafe is internal proprietary API and may be removed in a future release
[warn]       unsafe = (sun.misc.Unsafe) unsafeField.get(null);
[warn]                         ^
[warn] 5 warnings
[info] done compiling
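The five `sun.misc.Unsafe` warnings above are javac flagging Platform.java for touching a JDK-internal class. The pattern being warned about is the common reflective grab of the Unsafe singleton, roughly as sketched below (a minimal Scala rendering, not Spark's exact code):

    import java.lang.reflect.Field

    // Unsafe.getUnsafe() rejects callers that are not on the boot classpath,
    // so code reads the private static `theUnsafe` field reflectively instead.
    val field: Field = classOf[sun.misc.Unsafe].getDeclaredField("theUnsafe")
    field.setAccessible(true)
    val unsafe: sun.misc.Unsafe = field.get(null).asInstanceOf[sun.misc.Unsafe]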
[info] compiling 3 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/sketch/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[info] done compiling
[info] compiling 7 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/launcher/target/scala-2.12/test-classes ...
[info] compiling 11 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/kvstore/target/scala-2.12/test-classes ...
[info] compiling 1 Scala source and 5 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/unsafe/target/scala-2.12/test-classes ...
[info] done compiling
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:105:1:  [unchecked] unchecked conversion
[warn]     Iterator<ManagedBuffer> buffers = Mockito.mock(Iterator.class);
[warn]                                                   ^
[warn]   required: Iterator<ManagedBuffer>
[warn]   found:    Iterator
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:111:1:  [unchecked] unchecked conversion
[warn]     Iterator<ManagedBuffer> buffers2 = Mockito.mock(Iterator.class);
[warn]                                                    ^
[warn]   required: Iterator<ManagedBuffer>
[warn]   found:    Iterator
[warn] Note: Some input files use or override a deprecated API.
[warn] Note: Recompile with -Xlint:deprecation for details.
[warn] 2 warnings
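javac only summarizes here: the note above says to recompile with -Xlint:deprecation for details, and the [unchecked] diagnostics are likewise expanded by -Xlint:unchecked. In an sbt build those flags belong on javacOptions; a minimal sketch:

    // build.sbt: ask javac for full deprecation and unchecked diagnostics
    Compile / javacOptions ++= Seq("-Xlint:deprecation", "-Xlint:unchecked")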
[info] compiling 3 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/network-yarn/target/scala-2.12/classes ...
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Note: /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/launcher/src/test/java/org/apache/spark/launcher/SparkSubmitCommandBuilderSuite.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] done compiling
[info] compiling 16 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/common/network-shuffle/target/scala-2.12/test-classes ...
[info] done compiling
[info] compiling 560 Scala sources and 99 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/core/target/scala-2.12/classes ...
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[info] compiling 10 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/mllib-local/target/scala-2.12/test-classes ...
[info] done compiling
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Note: /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/core/src/main/java/org/apache/spark/SparkFirehoseListener.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
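sbt prints 'multiple main classes detected' when a module contains several objects with a main method, so it cannot choose a default for `run` or for the jar manifest. The candidates can be listed, or a default pinned; a sketch (SparkPi is just one example main class from the examples module):

    // from the sbt shell: list every detected main class
    > show discoveredMainClasses

    // build.sbt: pin the default explicitly
    Compile / mainClass := Some("org.apache.spark.examples.SparkPi")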
[info] compiling 1 Scala source and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/spark-ganglia-lgpl/target/scala-2.12/classes ...
[info] compiling 104 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/streaming/target/scala-2.12/classes ...
[info] compiling 38 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/graphx/target/scala-2.12/classes ...
[info] compiling 5 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/classes ...
[info] compiling 327 Scala sources and 103 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/catalyst/target/scala-2.12/classes ...
[info] compiling 25 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/resource-managers/yarn/target/scala-2.12/classes ...
[info] compiling 20 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/resource-managers/mesos/target/scala-2.12/classes ...
[info] compiling 41 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/resource-managers/kubernetes/core/target/scala-2.12/classes ...
[info] compiling 302 Scala sources and 27 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/core/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 10 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kafka-0-10/target/scala-2.12/classes ...
[info] compiling 11 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kinesis-asl/target/scala-2.12/classes ...
[info] done compiling
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java:157:1:  [unchecked] unchecked method invocation: method union in class JavaStreamingContext is applied to given types
[warn]       unionStreams = jssc.union(streamsList.toArray(new JavaDStream[0]));
[warn]                                ^
[warn]   required: JavaDStream<T>[]
[warn]   found: JavaDStream[]
[warn]   where T is a type-variable:
[warn]     T extends Object declared in method <T>union(JavaDStream<T>...)
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java:157:1:  [unchecked] unchecked conversion
[warn]       unionStreams = jssc.union(streamsList.toArray(new JavaDStream[0]));
[warn]                                                    ^
[warn]   required: JavaDStream<T>[]
[warn]   found:    JavaDStream[]
[warn]   where T is a type-variable:
[warn]     T extends Object declared in method <T>union(JavaDStream<T>...)
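Both warnings above are the usual generic-array-varargs problem: `streamsList.toArray(new JavaDStream[0])` can only produce a raw `JavaDStream[]`, which javac cannot check against the `JavaDStream<T>[]` varargs parameter. The Scala streaming API avoids the array entirely by accepting a Seq; a hedged sketch of the equivalent union (the `unionAll` helper name is mine):

    import org.apache.spark.streaming.StreamingContext
    import org.apache.spark.streaming.dstream.DStream

    // No generic array is ever constructed, so nothing is unchecked.
    def unionAll(ssc: StreamingContext, streams: Seq[DStream[Array[Byte]]]): DStream[Array[Byte]] =
      ssc.union(streams)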
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsAtomicPartitionManagement.java:48:1:  [unchecked] unchecked method invocation: method createPartitions in interface SupportsAtomicPartitionManagement is applied to given types
[warn]       createPartitions(new InternalRow[]{ident}, new Map[]{properties});
[warn]                       ^
[warn]   required: InternalRow[],Map<String,String>[]
[warn]   found: InternalRow[],Map[]
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsAtomicPartitionManagement.java:48:1:  [unchecked] unchecked conversion
[warn]       createPartitions(new InternalRow[]{ident}, new Map[]{properties});
[warn]                                                  ^
[warn]   required: Map<String,String>[]
[warn]   found:    Map[]
[warn] 2 warnings
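Same root cause as the Kinesis warnings earlier: `new Map[]{properties}` is a raw array standing in for `Map<String,String>[]`, and Java has no checked way to build a generic array. Called from Scala, the element type is carried by a ClassTag and no raw type appears in source; a sketch with illustrative identifiers (`createOne` is not part of the interface):

    import java.util.{Map => JMap}
    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.connector.catalog.SupportsAtomicPartitionManagement

    // Single-partition convenience call expressed without raw arrays.
    def createOne(table: SupportsAtomicPartitionManagement,
                  ident: InternalRow,
                  properties: JMap[String, String]): Unit =
      table.createPartitions(Array(ident), Array(properties))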
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 11 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/resource-managers/mesos/target/scala-2.12/test-classes ...
[info] compiling 6 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/test-classes ...
[info] compiling 19 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/graphx/target/scala-2.12/test-classes ...
[info] compiling 41 Scala sources and 9 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/streaming/target/scala-2.12/test-classes ...
[info] compiling 35 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/resource-managers/kubernetes/core/target/scala-2.12/test-classes ...
[info] compiling 21 Scala sources and 3 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/resource-managers/yarn/target/scala-2.12/test-classes ...
[info] done compiling
[info] compiling 274 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/catalyst/target/scala-2.12/test-classes ...
[info] compiling 490 Scala sources and 59 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/core/target/scala-2.12/classes ...
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[info] compiling 6 Scala sources and 4 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kafka-0-10/target/scala-2.12/test-classes ...
[info] compiling 8 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kinesis-asl/target/scala-2.12/test-classes ...
[info] Note: /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kinesis-asl/src/test/java/org/apache/spark/streaming/kinesis/JavaKinesisInputDStreamBuilderSuite.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] done compiling
[info] Note: /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] done compiling
[info] compiling 4 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/repl/target/scala-2.12/classes ...
[info] compiling 18 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/avro/target/scala-2.12/classes ...
[info] compiling 30 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/classes ...
[info] compiling 2 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/hadoop-cloud/target/scala-2.12/classes ...
[info] compiling 29 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/target/scala-2.12/classes ...
[info] compiling 324 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/mllib/target/scala-2.12/classes ...
[info] done compiling
[info] compiling 2 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/hadoop-cloud/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:55:1:  [unchecked] unchecked call to SparkAvroKeyRecordWriter(Schema,GenericData,CodecFactory,OutputStream,int,Map<String,String>) as a member of the raw type SparkAvroKeyRecordWriter
[warn]       return new SparkAvroKeyRecordWriter(
[warn]              ^
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:74:1:  [unchecked] unchecked call to DataFileWriter(DatumWriter<D>) as a member of the raw type DataFileWriter
[warn]     this.mAvroFileWriter = new DataFileWriter(dataModel.createDatumWriter(writerSchema));
[warn]                            ^
[warn]   where D is a type-variable:
[warn]     D extends Object declared in class DataFileWriter
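These two warnings come from constructing Avro's `DataFileWriter` as a raw type in SparkAvroKeyOutputFormat. The writer API is generic, and parameterizing it removes the unchecked calls; a minimal sketch against the stock Avro API (the `writeAvro` helper, file, and schema are placeholders, not Spark's wrapper):

    import java.io.File
    import org.apache.avro.Schema
    import org.apache.avro.file.DataFileWriter
    import org.apache.avro.generic.{GenericDatumWriter, GenericRecord}

    // Fully parameterized writer: no raw DataFileWriter construction.
    def writeAvro(schema: Schema, out: File, records: Seq[GenericRecord]): Unit = {
      val writer = new DataFileWriter[GenericRecord](new GenericDatumWriter[GenericRecord](schema))
      try {
        writer.create(schema, out)
        records.foreach(writer.append)
      } finally writer.close()
    }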
[info] done compiling
[info] done compiling
[info] Note: /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/main/java/org/apache/hadoop/hive/ql/io/orc/SparkOrcNewRecordReader.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] compiling 25 Scala sources and 86 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive-thriftserver/target/scala-2.12/classes ...
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 424 Scala sources and 40 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/core/target/scala-2.12/test-classes ...
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 5 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/repl/target/scala-2.12/test-classes ...
[info] compiling 197 Scala sources and 134 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/examples/target/scala-2.12/classes ...
[info] done compiling
[info] Note: /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/examples/src/main/java/org/apache/spark/examples/ml/JavaChiSqSelectorExample.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 10 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/avro/target/scala-2.12/test-classes ...
[info] compiling 104 Scala sources and 17 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/target/scala-2.12/test-classes ...
[info] compiling 21 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/test-classes ...
[info] compiling 204 Scala sources and 66 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/mllib/target/scala-2.12/test-classes ...
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:464:1:  [unchecked] unchecked cast
[warn]         setLint((List<Integer>)value);
[warn]                                ^
[warn]   required: List<Integer>
[warn]   found:    Object
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:472:1:  [unchecked] unchecked cast
[warn]         setLString((List<String>)value);
[warn]                                  ^
[warn]   required: List<String>
[warn]   found:    Object
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:480:1:  [unchecked] unchecked cast
[warn]         setLintString((List<IntString>)value);
[warn]                                        ^
[warn]   required: List<IntString>
[warn]   found:    Object
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:488:1:  [unchecked] unchecked cast
[warn]         setMStringString((Map<String,String>)value);
[warn]                                              ^
[warn]   required: Map<String,String>
[warn]   found:    Object
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:749:1:  [unchecked] unchecked call to read(TProtocol,T) as a member of the raw type IScheme
[warn]     schemes.get(iprot.getScheme()).getScheme().read(iprot, this);
[warn]                                                    ^
[warn]   where T is a type-variable:
[warn]     T extends TBase declared in interface IScheme
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:753:1:  [unchecked] unchecked call to write(TProtocol,T) as a member of the raw type IScheme
[warn]     schemes.get(oprot.getScheme()).getScheme().write(oprot, this);
[warn]                                                     ^
[warn]   where T is a type-variable:
[warn]     T extends TBase declared in interface IScheme
[warn] /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:1027:1:  [unchecked] getScheme() in ComplexTupleSchemeFactory implements <S>getScheme() in SchemeFactory
[warn]     public ComplexTupleScheme getScheme() {
[warn]                               ^
[warn]   return type requires unchecked conversion from ComplexTupleScheme to S
[warn]   where S is a type-variable:
[warn]     S extends IScheme declared in method <S>getScheme()
[warn] Note: /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/JavaDataFrameSuite.java uses or overrides a deprecated API.
[warn] Note: Recompile with -Xlint:deprecation for details.
[warn] 8 warnings
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 17 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/sql/hive-thriftserver/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[success] Total time: 424 s (07:04), completed Dec 7, 2021 9:25:14 PM
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] Strategy 'discard' was applied to 2 files (Run the task at debug level to see details)
[info] Strategy 'filterDistinctLines' was applied to 11 files (Run the task at debug level to see details)
[info] Strategy 'first' was applied to 50 files (Run the task at debug level to see details)
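The three 'Strategy ... was applied' lines are sbt-assembly resolving files that occur in more than one jar while building the assembly: 'discard' drops the duplicates, 'filterDistinctLines' unions their text lines (typical for reference.conf and service registries), and 'first' keeps the first copy seen. In a build this is configured roughly as below (a generic sketch, not Spark's actual assembly settings):

    // build.sbt (sbt-assembly): pick a merge strategy per conflicting path
    assembly / assemblyMergeStrategy := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case "reference.conf"              => MergeStrategy.filterDistinctLines
      case _                             => MergeStrategy.first
    }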
[success] Total time: 24 s, completed Dec 7, 2021 9:25:37 PM

========================================================================
Detecting binary incompatibilities with MiMa
========================================================================
[info] Detecting binary incompatibilities with MiMa using SBT with these profiles:  -Phadoop-3.2 -Phive-2.3 -Pmesos -Pkubernetes -Phadoop-cloud -Phive-thriftserver -Pspark-ganglia-lgpl -Phive -Pkinesis-asl -Pyarn
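MiMa (the Migration Manager) diffs the freshly compiled class files against a previously released artifact and reports binary-incompatible changes; the [WARN] and 'Error instrumenting' lines below come from the reflective classpath scan that assembles the ignore list and are expected noise rather than failures. Spark wires this up in project/MimaBuild.scala and project/MimaExcludes.scala; a generic sbt-mima-plugin sketch (module and version are illustrative):

    // build.sbt: compare the current module against the previous release
    mimaPreviousArtifacts := Set("org.apache.spark" %% "spark-core" % "3.1.0")

    // then, from the sbt shell:
    > mimaReportBinaryIssues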
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Strategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.MainClassOptionParser
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ShuffleCleanedEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SetupDriver
Error instrumenting class:org.apache.spark.mapred.SparkHadoopMapRedUtil$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.V1FallbackWriters.toV1WriteBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStatementBodyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Hello
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.CryptoHelperChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.SessionCatalogAndTable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LegacyDecimalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.SignalUtils.ActionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleMultipartIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.api.r.BaseRRunner.ReaderIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentityTransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedNameListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$3
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetWriteBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuerySpecificationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.DeleteColumn
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DateAccessor
Error instrumenting class:org.apache.spark.sql.execution.streaming.StreamExecution$
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$FileTypes$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.ImplicitTypeCasts
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.SchemaPruning.RootField
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.BucketSpecHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.ZooKeeperLeaderElectionAgent.LeadershipStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnsetTablePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.DefaultPartitionCoalescer.PartitionLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.InConversion
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.Listener
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.ArrayConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.TempFileBasedBlockStoreUpdater
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestWorkerState
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.Accessor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillMerger.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColumnReferenceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcWriteBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMatchingBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerExecutorStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$3
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.MutableResourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.Data
Error instrumenting class:org.apache.spark.input.StreamInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.CaseWhenCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockFetcher.BlockFetchStarter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator2$2
Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesType
Error instrumenting class:org.apache.spark.deploy.history.RollingEventLogFilesWriter$
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$PartitionWriterStream
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.TaskIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableBlockLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractWindowExpressions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.ChiSquareResult
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.PrefixCache
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedRelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.FreqSequence
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$4
Error instrumenting class:org.apache.spark.sql.execution.PartitionedFileUtil$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.CubeType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatIsConflicted
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGroupingAnalytics
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineReader
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableObjectArray
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.Division
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.DoubleHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiUnitsIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.TrackerState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.PlanChangeLogger
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryTerminatedEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySink.AddedData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ParenthesizedExpressionContext
Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.Tracker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableProviderContext
Error instrumenting class:org.apache.spark.sql.execution.command.DDLUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBytesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.api.r.BaseRRunner.WriterThread
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MergeIntoTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.expressions
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.EventFilter.FilterStatistics
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFunctionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$7
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperReader
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LocationSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Tokenizer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStatusResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.$$typecreator1$2
Error instrumenting class:org.apache.spark.sql.execution.command.LoadDataCommand$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveCatalogs.NonSessionCatalogAndTable
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.optimization.LBFGS.CostFun
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.Arguments
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.NGram.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyKeyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.COMMITTED
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.ClientPool
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator1$1
Error instrumenting class:org.apache.spark.scheduler.SplitInfo$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowNamespacesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExtractContext
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkBuildInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTablePartitionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SmallIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$11
Error instrumenting class:org.apache.spark.api.python.DoubleArrayWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0
Error instrumenting class:org.apache.spark.ml.tuning.TrainValidationSplitModel$TrainValidationSplitModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.TransformHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingGlobalLimitStrategy
Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.api.java.JavaUtils.SerializableMapWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesStore
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnalysisErrorAt
[WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Once
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RepairTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator17$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.FloatConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigest
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Std
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowConstructorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.OptimizeMetadataOnlyQuery.PartitionedRelation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AggregationClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WindowFunctionType.Python
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.AssociationRules.Rule
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedGroupConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummaryAggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.SequenceImpl
Error instrumenting class:org.apache.spark.streaming.CheckpointWriter$CheckpointWriteHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowDefContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorIdentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNewInstance
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.IntervalUtils.ParseState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DateConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Aggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.IteratorForPartition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotColumnContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.StageAttempt
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.IntervalUtils.IntervalUnit
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetQuantifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.client.StandaloneAppClient.ClientEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CtesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans.DslLogicalPlan
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.ReceiverTrackerEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.IdentityProjection
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.PowerIterationClustering.$$typecreator5$1
Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.$SortedIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.DaysWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.WidenSetOperationTypes
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AssignmentListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.StandaloneResourceAllocation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.DateTimeOperations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.output
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.CatalystTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StrictIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.EltCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.$ClientCallbackHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.IntArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayes.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.BlacklistTracker.ExecutorFailureList
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveTableValuedFunctions.ArgumentList
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetScan
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowCurrentNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator4$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigIntLiteralContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.FilePartition$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportServer.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnPosition
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Count
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveLastAllocatedExecutorId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WindowFunctionType.SQL
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$RenameHelperMethods
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnNullability
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.FloatConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.BinaryPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.InputFileBlockHolder.FileBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FailNativeCommandContext
Error instrumenting class:org.apache.spark.api.python.WriteInputFormatTestDataGenerator$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableSchemaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteSkippedQueue
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.EncryptedDownloadFile
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Cholesky
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$16
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.KolmogorovSmirnovTest.NullHypothesis
Error instrumenting class:org.apache.spark.deploy.master.ui.MasterWebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableAttemptInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
Error instrumenting class:org.apache.spark.sql.execution.datasources.DataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.RollupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator25$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableValuedFunctionContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonTable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.StopBlockManagerMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.NaturalKeys
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.EventFilter.FilterStatistics
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeFixedWidthAggregationMap.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetNamespaceLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Min
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator2$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextScan
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStoreProvider$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.WindowBoundType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.ColumnChange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.PromoteStrings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingDeduplicationStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.Window
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ReplaceTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableLongArray
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.EncryptedDownloadFile.$EncryptedDownloadWritableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumericLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveBinaryArithmetic
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionValContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcScan$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WhereClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator3$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.TextBasedFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionSummary.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$6
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.TransportFrameDecoder.Interceptor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.FixedLengthRowBasedKeyValueBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.LabeledPointPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.RemoveProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.LevelDBLogger
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OverlayContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerSlave
Error instrumenting class:org.apache.spark.ml.tuning.CrossValidatorModel$CrossValidatorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter.ElementConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleInsertQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.errors.TreeNodeException
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreType
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.State
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDescNullsFirst
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TypeConstructorContext
Error instrumenting class:org.apache.spark.SSLOptions
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Append
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.ArrayDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcTable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$11
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableAliasContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Logit
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.SchemaPruning.RootField
Error instrumenting class:org.apache.spark.kafka010.KafkaDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.TextSocketContinuousStream.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ImplicitOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.AsTableIdentifier
Error instrumenting class:org.apache.spark.input.WholeTextFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveMissingReferences
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveRandomSeed
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnquotedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PrimitiveConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$LocalDiskShufflePartitionWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNaturalAndUsingJoin
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver
Error instrumenting class:org.apache.spark.deploy.history.HistoryServer
Error instrumenting class:org.apache.spark.sql.execution.streaming.ManifestFileCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
Error instrumenting class:org.apache.spark.api.python.TestOutputKeyConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedColTypeWithPositionListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.optimization.NNLS.Workspace
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.linalg.distributed.RowMatrix.$SVDMode$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnitToUnitIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.binaryfile.BinaryFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IndexToString.$$typecreator1$4
Error instrumenting class:org.apache.spark.api.python.TestWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.RawStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SendHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMax
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.NonCachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.LeftSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.Optimizer.OptimizeSubqueries
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.StatFunctions.CovarianceCounter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RecoverPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToValuePair
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.ParquetOutputTimestampType
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetReadSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PolynomialExpansion.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.Hasher
Error instrumenting class:org.apache.spark.input.StreamBasedRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$3
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskReaper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.STATE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LocalIndexEncoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.SharedReadWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.IsExecutorAlive
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Replaced
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.StringType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.Deserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedColTypeWithPositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableFileStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.CreateStageResult
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor
Error instrumenting class:org.apache.spark.WritableConverter$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnsiNonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterInStandby
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.DatabaseDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.After
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.ChainedIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.IdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DereferenceContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.MultiLineCSVDataSource$
Error instrumenting class:org.apache.spark.deploy.security.HBaseDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.DriverEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.TaskCommitMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntegerLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.LookupFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BooleanAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Complete
Error instrumenting class:org.apache.spark.input.StreamFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnalyzeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InstanceList.CountingRemoveIfForEach
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.ChiSquareResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.TimestampConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowFunctionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DoubleAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StructContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.MultipartIdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.MapOutputTrackerMaster.MessageLoop
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformQuerySpecificationContext
Error instrumenting class:org.apache.spark.metrics.sink.PrometheusServlet
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.NonCachedKafkaDataConsumer
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicatedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.OneWayRpcCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationPrimaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator8$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteDirContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.$$typecreator1$2
Error instrumenting class:org.apache.spark.input.FixedLengthBinaryInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase$NullIntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SortItemContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubquery
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$RpcChannelListener
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.ReceiverInputDStream.ReceiverRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.TypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Key
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.1
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.util.BytecodeUtils.MethodInvocationFinder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExponentLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.StringLiteralCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.BooleanEquality
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ByteConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.$DigestCallbackHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinExec.OneSideHashJoiner
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.IntHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.ABORTED
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ByteType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$10
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitioningUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.trees.TreeNodeRef
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveDeserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVTable
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.ByteArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.BlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UncacheTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MatchedActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslSymbol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Max
Error instrumenting class:org.apache.spark.sql.internal.SharedState$
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.VectorizedParquetRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ElementwiseProduct.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingJoinStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ShuffleMetricsSource
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.InMemoryScans
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$9
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextTable
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierAlternativeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.FunctionArgumentConversion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetPrimitiveField
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.CachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableFileStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StructAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.Aggregation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.KolmogorovSmirnovTestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMemoryStatus
Error instrumenting class:org.apache.spark.ml.source.libsvm.LibSVMFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.AttributeSeq
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.UPDATING
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.TableHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$3
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator5$1
Error instrumenting class:org.apache.spark.deploy.history.EventLogFileReader$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors
Error instrumenting class:org.apache.spark.sql.execution.datasources.DataSourceUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnType
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.RemovedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRow
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$PartitionWriterChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggAliasInGroupBy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.PushLeftSemiLeftAntiThroughJoin.AllowedJoin
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.1
Error instrumenting class:org.apache.spark.deploy.rest.RestSubmissionServer
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDeBase.BasePickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.Cluster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.SpecialLimits
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2n
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.PartitionOverwriteMode
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.TemporalSequenceImpl
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FlatMapGroupsWithStateExec.InputProcessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$17
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrowVectorAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingUnitToUnitIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedWindowContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.MapAccessor
Error instrumenting class:org.apache.spark.sql.execution.command.CommandUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CacheTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.Shutdown
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Update
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.SpillableArrayIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverterFormatV2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.WindowBoundType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter.$KeyValueConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.StringArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StorageHandlerContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ReplaceTableHeaderContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRow
Error instrumenting class:org.apache.spark.input.Configurable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DataType.JSortedObject
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsFractional
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Expression
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$ChunkCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropFunctionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FrameBoundContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DoubleConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetSchemaType
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.$$typecreator1$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$FileTypes
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.DCT.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.StageAttempt
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorNullsLast
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchBlockInfo
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.TextInputCSVDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateViewContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$1
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.SortComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.IntegralSequenceImpl
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.SerializerBuildHelper.MapElementInformation
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.IsExecutorAlive
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StatefulAggregationStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperReader
Error instrumenting class:org.apache.spark.ui.ProxyRedirectHandler$ResponseWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsSource.ExecutorMetricGauge
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonScan
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.RatingPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetOperationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.MessageDecoder.1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.Data$
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.DefaultPartitionCoalescer.partitionGroupOrdering
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.EpochMarker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStore.MaintenanceTask
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeColNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinTypeContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.joins.BuildSide
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultipartIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticBinaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$12
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableFileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.joins.BuildRight
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.ColumnReference
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.InConversion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExplainContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.FlattenStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$CancellableFSDataOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$13
Error instrumenting class:org.apache.spark.ui.JettyUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.$KVSorterIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillableIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.implicits
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveOrdinalInOrderByAndGroupBy
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockDownloadFileManager.ReferenceWithCleanup
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data
Error instrumenting class:org.apache.spark.input.FixedLengthBinaryRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineWriter
Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedWriteConfigUtil
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.StopAppClient
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingWritableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.LongAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Identity
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.QueryExecution.debug
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingSetContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Normalizer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Nominal$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ShortAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.TaskIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchedExecutor
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionPath$
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVWriteBuilder
Error instrumenting class:org.apache.spark.sql.execution.datasources.SchemaMergeUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.DefaultStep
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplV2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.$Index
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.TextSocketContinuousStream.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.Data$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.CatalogDatabaseHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.NGram.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HavingClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveTempViews
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslExpression
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrayAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.CreateStageResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Subscript
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveReferences
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyRDDWithPartitions
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDescNullsFirst
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data
Error instrumenting class:org.apache.spark.sql.catalyst.parser.ParserUtils$EnhancedLogicalPlan$
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.ShuffleSorterIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.HashComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.ConcatCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan
Error instrumenting class:org.apache.spark.deploy.worker.ui.WorkerWebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.GeneratorState
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseMatrixPickler
Error instrumenting class:org.apache.spark.metrics.MetricsSystem
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsPoller.TCMP
Error instrumenting class:org.apache.spark.status.api.v1.PrometheusResource$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNamespace
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplBase
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.RenameColumn
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.xml.UDFXPathUtil.ReusableStringReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StringLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.JsonRebaseRecord
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.ImplicitAttribute
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.util.Utils.Lock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanValueContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetTable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryStartedEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData
[WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.TASK_END_REASON_FORMATTED_CLASS_NAMES
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.ColumnReference
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator5$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.FlatMapGroupsWithStateStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.EltCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Type
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixture.$$typecreator4$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CompleteRecovery
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.impl.ShippableVertexPartition.ShippableVertexPartitionOpsConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.EquivalentExpressions.Expr
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.LimitMarker
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileContextBasedCheckpointFileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByPercentileContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.JsonFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubqueryColumnAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$4
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BinaryType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gaussian
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowColumnsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.StoreVersion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockSort
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleMetrics
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDA.LDAReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.EquivalentExpressions.Expr
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ShuffleCleanedEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Log
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSinkLog
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalNotContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeLocalShuffleReader.BroadcastJoinWithShuffleRight
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PolynomialExpansion.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Binary$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.ClassInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.RandomVertexCut
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteQueued
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.types.UTF8String.IntWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.OneForOneStreamManager.StreamState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupByType
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.types.UTF8String.LongWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingRelationStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.ParserUtils.EnhancedLogicalPlan
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.MapZipWithCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.PushLeftSemiLeftAntiThroughJoin.PushdownDirection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierListContext
Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableBlockLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetPrimitiveField
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueStore
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Unresolved$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableIntArray
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetNamespacePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SortPrefixUtils.NoOpPrefixComparator
Error instrumenting class:org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CheckForWorkerTimeOut
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$MethodAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ReviveOffers
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTableColumnContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.Serializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetLongDictionaryAwareDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.OutputCommitCoordinatorEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DoubleConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ByteBufferBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LeastSquaresNESolver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.ExecutorAllocationListener
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat
Error instrumenting class:org.apache.spark.sql.execution.datasources.binaryfile.BinaryFileFormat$
Error instrumenting class:org.apache.spark.metrics.sink.MetricsServlet
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.DataTypeJsonUtils.DataTypeJsonDeserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.SPARK_LISTENER_EVENT_FORMATTED_CLASS_NAMES
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Message
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus
Error instrumenting class:org.apache.spark.ui.DelegatingServletContextHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NumNonZeros
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.NullIntolerant
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data
Error instrumenting class:org.apache.spark.ml.tuning.CrossValidatorModel$CrossValidatorModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.$ManualCloseBufferedOutputStream$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.RebaseInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.VariableLengthRowBasedKeyValueBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.IntegerType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteQueueResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.DCT.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.QueryPlanningTracker.PhaseSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Poisson
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.QuasiNewton
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StringAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$8
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStore$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockFetcher.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySink.AddedData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.StandaloneResourceAllocation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Probit
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleMethodContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BinaryAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator9$1
Error instrumenting class:org.apache.spark.sql.execution.streaming.StreamMetadata$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator4$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.RightSide
Error instrumenting class:org.apache.spark.ui.ServerInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo
Error instrumenting class:org.apache.spark.kafka010.KafkaTokenUtil$KafkaDelegationTokenIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubstringContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.TriggerThreadDump
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.Strings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.WindowsSubstitution
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.DiskMapIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.GradientBoostedTreesModel.SaveLoadV1_0
Error instrumenting class:org.apache.spark.sql.execution.datasources.NoopCache$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.CholeskySolver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CurrentDatetimeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.JoinSelection
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryLists
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LambdaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BucketSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.SourceFileRemover
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$9
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.IDF.DocumentFrequencyAggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DoubleLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.Block.InlineHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.SetProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNamespace
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.SerializationDebugger
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block
[WARN] Unable to detect inner functions for class:org.apache.spark.BarrierCoordinator.ContextBarrierState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionCallContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.CalendarConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$14
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextWriteBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassReflection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugQuery
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableClausesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptedMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.PromoteStrings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.SessionCatalogAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$Location
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.IntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ApplyTransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.LongArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RowUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableByteArray
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.OldData
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ByteBufferBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$5
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LateralViewContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndMultipartIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryView
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedPrimitiveConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingMultiUnitsIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ShortType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.ResolveCoalesceHints
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator2$1
Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Inverse
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Tokenizer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingGlobalLimitStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ClearCacheContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpilledFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.StructTypePickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StringConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Sqrt
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.LogicalPlanWithDatasetId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformArgumentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.plugin.PluginContextImpl.PluginMetricsSource
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator5$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.1
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter$
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition1D
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NotMatchedActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.HistoryServerDiskManager.Lease
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockDownloadFileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.LocalDateConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.AddWebUIFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReconnectWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Deprecated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveRelations
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition2D
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TinyIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TrimContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.$SortState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BasicNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ColumnarBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetBinaryDictionaryAwareDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RegularQuerySpecificationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.BooleanConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.FamilyAndLink
[WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.CountMinSketch.Version
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.TempFileBasedBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ExecutorIdCollector
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.BaseErrorHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.1
Error instrumenting class:org.apache.spark.ml.tuning.TrainValidationSplitModel$TrainValidationSplitModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator1$2
Error instrumenting class:org.apache.spark.sql.catalyst.util.CompressionCodecs$
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskRunner
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.RLEIntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableHeaderContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryProgressEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassMethods
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Numeric$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.DoublePrefixComparator
Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcColumnarBatchReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LoadDataContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.QuotedStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Auto
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeNNZ
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DateType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WhenClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.Arguments
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkSubmitCommandBuilder.$OptionParser
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ManageResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.ExecutorAllocationManagerSource
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.Metadata$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RealIdentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AssignmentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryBodyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.DecryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslAttribute
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileSystemBasedCheckpointFileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.SimpleDownloadFile.$SimpleDownloadWritableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.SeenFilesMap
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatSerdeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$SetAccumulator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinCriteriaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropViewContext
Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$Metadata
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StrictNonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$3
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.1
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSourceLog
[WARN] Unable to detect inner functions for class:org.apache.spark.util.random.StratifiedSamplingUtils.RandomDataGenerator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.LongType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.Data
Error instrumenting class:org.apache.spark.WritableFactory$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$StdChannelListener
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.AutoBatchedPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc
Error instrumenting class:org.apache.spark.ui.ServerInfo$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultipartIdentifierListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ElementwiseProduct.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RegexTokenizer.$$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.GetExecutorLossReason
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.InMemoryTableScanExec.ExtractableLiteral
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ShortConverter
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSink$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMean
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.JOB_RESULT_FORMATTED_CLASS_NAMES
Error instrumenting class:org.apache.spark.ui.WebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.KVStoreScalaSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimaryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.PlanStringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplV1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticUnaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.CLogLog
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckSuccess
Error instrumenting class:org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BooleanConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexer.CategoryStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.SessionCatalogAndNamespace
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowthModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.ValuesReaderIntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBucketContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SearchedCaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.RevokedLeadership
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColPositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.lib.SVDPlusPlus.Conf
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator13$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterColumnActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockFetcher.$RetryingBlockFetchListener
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByRowsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.CatalystDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.ElectedLeader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExistsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.RemoveProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.StringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterTableAlterColumnContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVScan
Error instrumenting class:org.apache.spark.executor.ExecutorSource
Error instrumenting class:org.apache.spark.sql.execution.datasources.FileFormatWriter$
[WARN] Unable to detect inner functions for class:org.apache.spark.TestUtils.JavaSourceFromString
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.MultiLineJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NullLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TruncateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Sum
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeLocalShuffleReader.BroadcastJoinWithShuffleLeft
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StarContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$18
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverterFormatV1
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.ReaderIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.AddColumn
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.Builder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.RowPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveTables
Error instrumenting class:org.apache.spark.input.StreamRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UpdateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.RowComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.BasicOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedLDAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.PartitionTypeHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$12
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.AbstractLauncher.ArgumentValidator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleIsConflicted
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.SerializerBuildHelper.MapElementInformation
Error instrumenting class:org.apache.spark.ml.source.image.ImageFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SimpleCaseContext
Error instrumenting class:org.apache.spark.sql.catalyst.expressions.codegen.Block$InlineHelper$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowFrame
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NestedConstantListContext
Error instrumenting class:org.apache.spark.api.python.JavaToWritableConverter
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionDirectory$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryOrganizationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkDirCleanup
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$3
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalUnitContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.TextBasedFileScan
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.$SpillableIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator.$StreamBuffer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.WriterThread
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$2
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.JsonRebaseRecord
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$7
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NNLSSolver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.NettyUtils.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableLikeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.ByteArrayConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.Instrumentation.loggerTags
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.TableDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.FileInputDStream.FileInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SkewSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$2
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$3
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingInputStream
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Projection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$5
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.RemoveAllHints
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherBackend.BackendConnection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetIntDictionaryAwareDecimalConverter
Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$EdgeData
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BooleanType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.CatalogHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.JdbcRDD.ConnectionFactory
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveInsertInto
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsPoller.TCMP
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$RenameBasedFSDataOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.KVComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.Block.BlockHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowOrder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.InstantConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.TempViewOrV1Table
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.BlacklistTracker.ExecutorFailureList.TaskId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader.$$typecreator4$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Solver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.KolmogorovSmirnovTestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.QueryPlanningTracker.RuleSummary
Error instrumenting class:org.apache.spark.ui.ProxyRedirectHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTableColumnsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DmlStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.joins.BuildLeft
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Wildcard
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTablePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetScan$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.TableIdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionBase.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DeleteFromTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.WindowFrameCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ByteConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.FloatHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorNullsLast
Error instrumenting class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex$
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutput
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$ServerConnection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Node
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslString
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$HDFSBackedStateStore
Error instrumenting class:org.apache.spark.api.python.TestOutputValueConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.RadixSortSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeRelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator.AliasedGenerator$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.AttrWithCast
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.PythonEvals
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.OptionsHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$6
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGenerate
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0.$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Binomial
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.ExternalAppendOnlyUnsafeRowArrayIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.StringToAttributeConversionHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.PullOutNondeterministic
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinRelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetSchemaType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FirstContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalBinaryContext
Error instrumenting class:org.apache.spark.SparkEnv$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ByteAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetStorageStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStoreOps
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.EdgeData$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubscriptContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.api.r.SQLUtils.RegexContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.PythonWorkerFactory.MonitorThread
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveTableValuedFunctions.ArgumentList
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$13
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.CachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator1$1
Error instrumenting class:org.apache.spark.deploy.history.EventLogFileWriter$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.ManualCloseOutputStream
Error instrumenting class:org.apache.spark.streaming.StreamingContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeWeightSum
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DecimalAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data$
Error instrumenting class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.PMMLLinearRegressionModelWriter.Data
Error instrumenting class:org.apache.spark.streaming.api.java.JavaStreamingContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.DecryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InstanceList
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$8
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.First
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BasicNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Metric
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.NullOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.PythonForeachWriter.UnsafeRowBuffer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.MutableStateArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.PipedRDD.NotEqualsFileNameFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowViewsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.DumpByteCode
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTablePartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Variance
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.BoundedWindow
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTempViewUsingContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BeginRecovery
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StringConverter
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.CSVFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.EpochMarkerGenerator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$FieldAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Power
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.ValueAndMatchPair
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDriverStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileStreamSourceCleaner
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.DummySerializerInstance.$1
Error instrumenting class:org.apache.spark.input.ConfigurableCombineFileRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NotMatchedClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Summary
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcScan
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveOutputRelation
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinExec.OneSideHashJoiner.$AddingProcessedRowToStateCompletionIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.UnboundedWindow
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.JobScheduler.JobHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.HandleNullInputsForUDF
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Message.Type
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.impl.VertexPartition.VertexPartitionOpsConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$DownloadCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.EmptyTaskCommitMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Normalizer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.TypeAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.Empty2Null
Error instrumenting class:org.apache.spark.sql.catalyst.expressions.codegen.Block$BlockHelper$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMetric
Error instrumenting class:org.apache.spark.sql.execution.streaming.SinkFileStatus$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.HashingTFReader
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonWriteBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.StackCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnComment
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.MonitorThread
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.GlobalAggregates
Error instrumenting class:org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.SearchState
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.input
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.RowUpdater
Error instrumenting class:org.apache.spark.status.api.v1.ApiRootResource$
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleRead
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowFrameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.UDTConverter
Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault1Context
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingIdentifierExtraContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.TimestampType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchedExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggregateFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault2Context
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReregisterWithMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.TimestampAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.logging.DriverLogger.DfsAsyncWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.SortComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.ReceiverSupervisor.ReceiverState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTablePartitionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.CryptoParams
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Stop
[WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.BloomFilter.Version
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.DataTypeJsonUtils.DataTypeJsonSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.V1Table.IdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumberContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.MODE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.KeyWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.LongConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterViewQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.NullHypothesis
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$10
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFuncNameContext
Error instrumenting class:org.apache.spark.SparkContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NormalEquation
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.StreamFallbackChannelWrapper
Error instrumenting class:org.apache.spark.sql.execution.datasources.CodecStreams$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLContext.implicits
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$7
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.NonSessionCatalogAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SelectClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.RebaseInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.ArraySortLike.NullOrder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatAsIfIntegral
Error instrumenting class:org.apache.spark.sql.execution.command.PathFilterIgnoreNonData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.RemovedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.BlacklistTracker.ExecutorFailureList.TaskId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigestSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.HashMapGrowthStrategy.Doubling
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStmtContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.impl.RandomForest.NodeIndexInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleFunctionIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0.$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.MetadataMismatchException
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndNamespace
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$MapIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.QuasiNewtonSolver.NormalEquationCostFun
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Mean
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTablesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.aggregate.TungstenAggregationIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$15
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.LongHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.MapKeyDedupPolicy
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.KryoSerializer.PoolWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestMasterState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MatchedClauseContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport
Error instrumenting class:org.apache.spark.ui.SparkUI
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.ResolveJoinStrategyHints
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.DeclarativeAggregate.RichAttribute
Error instrumenting class:org.apache.spark.sql.catalyst.catalog.ExternalCatalogUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcShimUtils.VectorizedRowBatchWrap
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.SetAppId
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeL1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.ConcatCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveUpCast
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolvePivot
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.InMemoryBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.RegisteredExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.FrequentItems.FreqItemCounter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.StringPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.InternalKMeansModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.ColumnPosition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GenericFileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableSerDeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.MyByteArrayOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowRefContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTblPropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTableColumnsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ShortConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.NamespaceHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.IfCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetStringConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.Event
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowth.FreqItemset
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DmlStatementNoWithContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMin
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableApplicationInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRecord
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.IntConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsConflicted
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator5$5
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptedMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.AppExecId
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAlterTableChanges
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IntConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.SetProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DecimalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.Data$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator3$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.MutableResourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigDecimalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RegexTokenizer.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gamma
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.IntegralDivision
Error instrumenting class:org.apache.spark.input.WholeTextFileRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlockBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.StringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator3$1
Error instrumenting class:org.apache.spark.internal.io.HadoopMapReduceCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.StringToColumn
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatDelimitedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.SetState
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingReadableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugStreamQuery
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.DataReaderThread
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.LatchedTriggers
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.RandomForestModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.FloatAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.SpillableIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToValuePair
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueType
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpanModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.LegacyBehaviorPolicy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimitiveDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Fixed
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFunctionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.LongConverter
Error instrumenting class:org.apache.spark.sql.execution.datasources.text.TextFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowCreateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator5$1
Created: .generated-mima-class-excludes in current directory.
Created: .generated-mima-member-excludes in current directory.
Using /usr/java/latest as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.4.6 (Private Build Java 1.8.0_275)
[info] loading settings for project spark-branch-3-1-test-sbt-hadoop-3-2-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project
[info] resolving key references (36193 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/)
[warn] there are 204 keys that are not used by any other settings/tasks:
[warn]  
[warn] * assembly / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * assembly / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * assembly / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * assembly / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * assembly / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * assembly / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * avro / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * avro / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * avro / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * avro / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * avro / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * avro / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * catalyst / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * catalyst / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * catalyst / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * catalyst / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * catalyst / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * catalyst / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * core / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * core / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * core / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * core / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * core / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * core / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * examples / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * examples / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * examples / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * examples / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * examples / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * examples / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * ganglia-lgpl / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * ganglia-lgpl / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * ganglia-lgpl / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * ganglia-lgpl / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * ganglia-lgpl / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * ganglia-lgpl / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * graphx / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * graphx / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * graphx / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * graphx / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * graphx / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * graphx / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * hadoop-cloud / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * hadoop-cloud / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * hadoop-cloud / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * hadoop-cloud / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * hadoop-cloud / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * hadoop-cloud / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * hive / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * hive / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * hive / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * hive / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * hive / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * hive / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * hive-thriftserver / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * hive-thriftserver / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * hive-thriftserver / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * hive-thriftserver / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * hive-thriftserver / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * hive-thriftserver / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * kubernetes / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * kubernetes / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * kubernetes / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * kubernetes / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * kubernetes / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * kubernetes / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * kvstore / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * kvstore / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * kvstore / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * kvstore / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * kvstore / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * kvstore / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * launcher / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * launcher / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * launcher / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * launcher / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * launcher / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * launcher / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * mesos / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * mesos / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * mesos / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * mesos / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * mesos / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * mesos / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * mllib / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * mllib / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * mllib / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * mllib / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * mllib / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * mllib / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * mllib-local / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * mllib-local / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * mllib-local / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * mllib-local / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * mllib-local / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * mllib-local / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * network-common / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * network-common / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * network-common / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * network-common / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * network-common / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * network-common / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * network-shuffle / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * network-shuffle / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * network-shuffle / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * network-shuffle / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * network-shuffle / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * network-shuffle / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * network-yarn / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * network-yarn / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * network-yarn / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * network-yarn / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * network-yarn / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * network-yarn / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * repl / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * repl / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * repl / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * repl / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * repl / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * repl / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * sketch / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * sketch / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * sketch / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * sketch / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * sketch / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * sketch / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * spark / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * spark / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * spark / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * spark / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * spark / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * spark / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * sql / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * sql / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * sql / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * sql / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * sql / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * sql / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * sql-kafka-0-10 / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * sql-kafka-0-10 / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * sql-kafka-0-10 / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * sql-kafka-0-10 / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * sql-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * sql-kafka-0-10 / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * streaming / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * streaming / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * streaming / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * streaming / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * streaming / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * streaming / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * streaming-kafka-0-10 / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * streaming-kafka-0-10 / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * streaming-kafka-0-10 / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * streaming-kafka-0-10 / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * streaming-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * streaming-kafka-0-10 / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * streaming-kafka-0-10-assembly / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * streaming-kafka-0-10-assembly / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * streaming-kafka-0-10-assembly / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * streaming-kafka-0-10-assembly / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * streaming-kinesis-asl / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * streaming-kinesis-asl / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * streaming-kinesis-asl / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * streaming-kinesis-asl / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * streaming-kinesis-asl / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * streaming-kinesis-asl / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * streaming-kinesis-asl-assembly / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * streaming-kinesis-asl-assembly / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * streaming-kinesis-asl-assembly / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * streaming-kinesis-asl-assembly / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * tags / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * tags / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * tags / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * tags / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * tags / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * tags / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * token-provider-kafka-0-10 / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * token-provider-kafka-0-10 / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * token-provider-kafka-0-10 / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * token-provider-kafka-0-10 / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * token-provider-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * token-provider-kafka-0-10 / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * tools / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * tools / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * tools / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * tools / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * tools / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * tools / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * unsafe / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * unsafe / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * unsafe / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * unsafe / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * unsafe / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * unsafe / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn] * yarn / Compile / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:998
[warn] * yarn / M2r / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:285
[warn] * yarn / Sbt / publishMavenStyle
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:286
[warn] * yarn / Test / checkstyle / javaSource
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:999
[warn] * yarn / scalaStyleOnCompile / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:188
[warn] * yarn / scalaStyleOnTest / logLevel
[warn]   +- /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:189
[warn]  
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
[info] spark-parent: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-tags: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-kvstore: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-unsafe: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-common: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-shuffle: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-yarn: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-tools: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-catalyst: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-mesos: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-ganglia-lgpl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-kubernetes: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-token-provider-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-yarn: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-streaming-kinesis-asl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-avro: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-sql-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-hive: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-streaming-kinesis-asl-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-hadoop-cloud: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-repl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-streaming-kafka-0-10-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-examples: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-hive-thriftserver: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] spark-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[success] Total time: 87 s (01:27), completed Dec 7, 2021 9:28:22 PM
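
The `mimaPreviousArtifacts not set` lines are the MiMa plugin reporting that no baseline artifact is configured for those modules, so binary-compatibility analysis is skipped for them. A minimal sketch of opting one module in (hypothetical coordinates; Spark's real baselines are computed in project/MimaBuild.scala):

    // build.sbt -- hypothetical MiMa opt-in, assuming sbt-mima-plugin is enabled
    mimaPreviousArtifacts := Set("org.apache.spark" %% "spark-core" % "3.1.2")
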

[info] Building Spark assembly using SBT with these arguments:  -Phadoop-3.2 -Phive-2.3 -Pmesos -Pkubernetes -Phadoop-cloud -Phive-thriftserver -Pspark-ganglia-lgpl -Phive -Pkinesis-asl -Pyarn assembly/package
Using /usr/java/latest as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.4.6 (Private Build Java 1.8.0_275)
[info] loading settings for project spark-branch-3-1-test-sbt-hadoop-3-2-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project
[info] resolving key references (36199 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/)
[warn] there are 204 keys that are not used by any other settings/tasks:
[warn]  
[warn] * (list of 204 unused keys omitted; identical to the list printed by the previous sbt invocation above)
[warn]  
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
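A minimal sketch of the two options this note names, written as sbt build-definition code; the key declaration below is illustrative, not an excerpt from the actual SparkBuild.scala:

    // Option 1: exclude the flagged keys from the lintUnused check globally
    // (publishMavenStyle and logLevel are the keys reported above).
    Global / excludeLintKeys ++= Set(publishMavenStyle, logLevel)

    // Option 2: give a custom key an invisible rank at declaration time,
    // so the linter never considers it (hypothetical key declaration).
    val scalaStyleOnCompile =
      taskKey[Unit]("Run scalastyle at compile time").withRank(KeyRanks.Invisible)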
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings. (printed 15 times)
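When 'evicted' reports a harmful version conflict, the usual build-file remedy is to pin one version explicitly; a hedged sketch with illustrative coordinates, not taken from this build:

    // Hypothetical: force a single guava version so the transitive
    // conflicts reported by 'evicted' resolve deterministically.
    dependencyOverrides += "com.google.guava" % "guava" % "14.0.1"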
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list (printed 6 times)
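sbt prints this when it discovers several classes with a main method in one module; a minimal sketch of silencing it by declaring which one to use (the class name is illustrative):

    // Hypothetical: pin the main class so 'run' and packaging
    // need not choose among the discovered candidates.
    Compile / mainClass := Some("org.apache.spark.examples.SparkPi")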
[success] Total time: 66 s (01:06), completed Dec 7, 2021 9:29:44 PM

========================================================================
Running Java style checks
========================================================================
[info] Checking Java style using SBT with these profiles:  -Phadoop-3.2 -Phive-2.3 -Pmesos -Pkubernetes -Phadoop-cloud -Phive-thriftserver -Pspark-ganglia-lgpl -Phive -Pkinesis-asl -Pyarn
Checkstyle checks passed.

========================================================================
Running Spark unit tests
========================================================================
[info] Running Spark tests using SBT with these arguments:  -Phadoop-3.2 -Phive-2.3 -Phadoop-cloud -Pmesos -Pkubernetes -Phive-thriftserver -Pspark-ganglia-lgpl -Pkinesis-asl -Pyarn -Phive test
Using /usr/java/latest as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.4.6 (Private Build Java 1.8.0_275)
[info] loading settings for project spark-branch-3-1-test-sbt-hadoop-3-2-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project
[info] resolving key references (36199 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/)
[warn] there are 204 keys that are not used by any other settings/tasks:
[warn]  
[warn] * (condensed — this repeats the warning block from the previous sbt invocation) The same six unused
[warn]   keys are reported for each of the 34 projects: assembly, avro, catalyst, core, examples,
[warn]   ganglia-lgpl, graphx, hadoop-cloud, hive, hive-thriftserver, kubernetes, kvstore, launcher, mesos,
[warn]   mllib, mllib-local, network-common, network-shuffle, network-yarn, repl, sketch, spark, sql,
[warn]   sql-kafka-0-10, streaming, streaming-kafka-0-10, streaming-kafka-0-10-assembly,
[warn]   streaming-kinesis-asl, streaming-kinesis-asl-assembly, tags, token-provider-kafka-0-10, tools,
[warn]   unsafe, yarn. Per project, all defined in
[warn]   /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/project/SparkBuild.scala:
[warn]     Compile / checkstyle / javaSource   (line 998)
[warn]     M2r / publishMavenStyle             (line 285)
[warn]     Sbt / publishMavenStyle             (line 286)
[warn]     Test / checkstyle / javaSource      (line 999)
[warn]     scalaStyleOnCompile / logLevel      (line 188)
[warn]     scalaStyleOnTest / logLevel         (line 189)
[warn]  
[warn] note: a setting might still be used by a command; see the lintUnused exclusion guidance above.
[info] ScalaTest
[info] Run completed in 196 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 136 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 222 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 132 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthMessagesSuite.testPublicKeyEncodeDecode started
[info] Test run started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testNoRedirectToLog started
[info] BloomFilterSuite:
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBBenchmark ignored
[info] Test run finished: 0 failed, 1 ignored, 0 total, 0.011s
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithOutputRedirection started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectOutputToLog started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.182s
[info] - accuracy - Byte (44 milliseconds)
[info] Test run started
[info] Test org.apache.spark.util.kvstore.ArrayWrappersSuite.testGenericArrayKey started
[info] Test run started
[info] Test org.apache.spark.network.ChunkFetchRequestHandlerSuite.handleChunkFetchRequest started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.006s
[info] Test run started
[info] - mergeInPlace - Byte (25 milliseconds)
[info] - accuracy - Short (10 milliseconds)
[info] - mergeInPlace - Short (19 milliseconds)
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithStart started
[info] - accuracy - Int (35 milliseconds)
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectsSimple started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorToLog started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithLogRedirection started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithStart started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testFailedChildProc started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndex started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorTwiceFails started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testBadLogRedirect started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectLastWins started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.testRefWithIntNaturalKey started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndex started
[info] - mergeInPlace - Int (152 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.258s
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectToLog started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.374s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testMissingArg started
[info] - accuracy - Long (61 milliseconds)
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleTypesWriteReadDelete started
[info] - mergeInPlace - Long (115 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testReopenAndVersionCheckDb started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testAllOptions started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMetadata started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testCloseLevelDBIterator started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.96s
[info] Test run started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.reuseClientsUpToConfigVariable started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testEqualSeparatedOption started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testExtraOptions started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.788s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkLauncher started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testAlternateSyntaxParsing started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunner started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testSparkRShell started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testMissingAppResource started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerPrimaryResource started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testShellCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testClusterCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testDriverCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoMainClass started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliKillAndStatus started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoArg started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testIsClientMode started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkFallback started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerWithMasterNoMainClass started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliHelpAndNoArg started
[info] Test run finished: 0 failed, 0 ignored, 17 total, 0.154s
[info] Test run started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testTimeout started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testStreamFiltering started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testSparkSubmitVmShutsDown started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testLauncherServerReuse started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testAppHandleDisconnect started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testCommunication started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.064s
[info] Test run started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testKill started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testLauncher started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testErrorPropagation started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.039s
[info] Test run started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testValidOptionStrings started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testJavaMajorVersion started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testPythonArgQuoting started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testWindowsBatchQuoting started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testInvalidOptionStrings started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.003s
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.reuseClientsUpToConfigVariableConcurrent started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.fastFailConnectionInTimeWindow started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeFactoryBeforeCreateClient started
[info] UTF8StringPropertyCheckSuite:
[info] - toString (128 milliseconds)
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeBlockClientsWithFactory started
[info] - numChars (35 milliseconds)
[info] - startsWith (30 milliseconds)
[info] - endsWith (12 milliseconds)
[info] - toUpperCase (8 milliseconds)
[info] - toLowerCase (7 milliseconds)
[info] - compare (20 milliseconds)
[info] - substring (70 milliseconds)
[info] - contains (54 milliseconds)
[info] - trim, trimLeft, trimRight (31 milliseconds)
[info] - reverse (8 milliseconds)
[info] - indexOf (29 milliseconds)
[info] - repeat (12 milliseconds)
[info] - lpad, rpad (7 milliseconds)
[info] - concat (104 milliseconds)
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.neverReturnInactiveClients started
[info] - concatWs (44 milliseconds)
[info] - split !!! IGNORED !!!
[info] - levenshteinDistance (9 milliseconds)
[info] - hashCode (3 milliseconds)
[info] - equals (3 milliseconds)
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeIdleConnectionForRequestTimeOut started
[info] Test run started
[info] Test org.apache.spark.unsafe.array.LongArraySuite.basicTest started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.007s
[info] Test run started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOnHeapMemoryBlockResetsBaseObjectAndOffset started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.overlappingCopyMemory started
[info] - accuracy - String (3 seconds, 551 milliseconds)
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.memoryDebugFillEnabledInTest started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.offHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.heapMemoryReuse started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorPoolingReUsesLongArrays started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOffHeapMemoryBlockResetsOffset started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.042s
[info] Test run started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.titleCase started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.soundex started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.basicTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamUnderflow started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToShort started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.startsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.compareTo started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.levenshteinDistance started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamOverflow started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamIntArray started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.upperAndLower started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToInt started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.createBlankString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.prefix started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatWsTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.repeat started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.contains started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.skipWrongFirstByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.emptyStringTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamSlice started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimBothWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substringSQL started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring_index started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.pad started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.split started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trims started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimRightWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.findInSet started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.translate started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.replace started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.reverse started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimLeftWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.endsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToLong started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStream started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.indexOf started
[info] Test run finished: 0 failed, 0 ignored, 39 total, 0.049s
[info] Test run started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownLongInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownIntegerInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTest started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownBytesInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestPaddedStrings started
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestBytes started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.407s
[info] Test run started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.periodAndDurationTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.equalsTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.toStringTest started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.007s
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.returnDifferentClientsForDifferentServers started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 4.933s
[info] Test run started
[info] Test org.apache.spark.network.util.CryptoUtilsSuite.testConfConversion started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.014s
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeSalt started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeAppId started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testFixedChallengeResponse started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeCiphertext started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testMismatchedSecret started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptResponseSalt started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessage started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessageWhenTransferringZeroBytes started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testFixedChallenge started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptServerCiphertext started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptResponseAppId started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testAuthEngine started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testBadKeySize started
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] Test run finished: 0 failed, 0 ignored, 13 total, 0.268s
[info] Test run started
[info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testGeneralNettyMemoryMetrics started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testRemoveAll started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testNegativeIndexValues started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 6.09s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testBasicIteration started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testDeleteParentIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMetadata started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testArrayIndices started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testRemoveAll started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.008s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testDuplicateIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testEmptyIndexName started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIndexAnnotation started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNumEncoding started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIllegalIndexMethod started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testKeyClashes started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testArrayIndices started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex2 started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIllegalIndexName started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.011s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithStart started
[info] BLASSuite:
[info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testAdditionalMetrics started
Dec 07, 2021 9:30:34 PM com.github.fommil.netlib.BLAS <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
Dec 07, 2021 9:30:34 PM com.github.fommil.netlib.BLAS <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
[info] - nativeL1Threshold (68 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithStart started
[info] - copy (31 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescending started
[info] - scal (3 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.458s
[info] - axpy (8 milliseconds)
[info] Test run started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testDeallocateReleasesManagedBuffer started
[info] - dot (9 milliseconds)
[info] - spr (8 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithStart started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testByteBufBody started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testShortWrite started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodySingleBuffer started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithSkip started
[info] - syr (28 milliseconds)
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodyMultipleBuffers started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescending started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testSingleWrite started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.063s
[info] - mergeInPlace - String (2 seconds, 730 milliseconds)
[info] Test run started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testEmptyFrame started
[info] - incompatible merge (5 milliseconds)
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testNegativeFrameSize started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testSplitLengthField started
[info] - gemm (23 milliseconds)
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testFrameDecoding started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithLast started
[info] - gemv (24 milliseconds)
[info] BitArraySuite:
[info] - spmv (3 milliseconds)
[info] - error case when create BitArray (2 milliseconds)
[info] - bitSize (1 millisecond)
[info] - set (2 milliseconds)
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testInterception started
[info] UtilsSuite:
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testRetainedFrames started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescending started
[info] - EPSILON (8 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithLast started
[info] - normal operation (33 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndex started
[info] TestingUtilsSuite:
[info] - merge (11 milliseconds)
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testConsolidationPerf started
[info] - Comparing doubles using relative error. (17 milliseconds)
[info] - Comparing doubles using absolute error. (6 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithLast started
[info] - Comparing vectors using relative error. (8 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithStart started
[info] - Comparing vectors using absolute error. (5 milliseconds)
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithLast started
[info] CountMinSketchSuite:
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.testRefWithIntNaturalKey started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndex started
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.82s
[info] - accuracy - Byte (426 milliseconds)
[info] - Comparing Matrices using absolute error. (479 milliseconds)
[info] - Comparing Matrices using relative error. (10 milliseconds)
[info] BreezeMatrixConversionSuite:
[info] - dense matrix to breeze (1 millisecond)
[info] - dense breeze matrix to matrix (2 milliseconds)
[info] - sparse matrix to breeze (187 milliseconds)
[info] - sparse breeze matrix to sparse matrix (7 milliseconds)
[info] BreezeVectorConversionSuite:
[info] - mergeInPlace - Byte (423 milliseconds)
[info] - dense to breeze (400 milliseconds)
[info] - sparse to breeze (153 milliseconds)
[info] - dense breeze to vector (1 millisecond)
[info] - sparse breeze to vector (0 milliseconds)
[info] - sparse breeze with partially-used arrays to vector (3 milliseconds)
[info] MultivariateGaussianSuite:
Dec 07, 2021 9:30:36 PM com.github.fommil.netlib.LAPACK <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
Dec 07, 2021 9:30:36 PM com.github.fommil.netlib.LAPACK <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
[info] - univariate (155 milliseconds)
[info] - multivariate (18 milliseconds)
[info] - multivariate degenerate (3 milliseconds)
[info] - SPARK-11302 (12 milliseconds)
[info] MatricesSuite:
[info] - dense matrix construction (1 millisecond)
[info] - dense matrix construction with wrong dimension (1 millisecond)
[info] - accuracy - Short (837 milliseconds)
[info] - sparse matrix construction (190 milliseconds)
[info] - sparse matrix construction with wrong number of elements (2 milliseconds)
[info] - index in matrices incorrect input (7 milliseconds)
[info] - equals (5 milliseconds)
[info] - matrix copies are deep copies (1 millisecond)
[info] - matrix indexing and updating (3 milliseconds)
[info] - dense to dense (3 milliseconds)
[info] - dense to sparse (3 milliseconds)
[info] - sparse to sparse (5 milliseconds)
[info] - sparse to dense (5 milliseconds)
[info] - compressed dense (8 milliseconds)
[info] - compressed sparse (3 milliseconds)
[info] - map, update (4 milliseconds)
[info] - transpose (1 millisecond)
[info] - foreachActive (2 milliseconds)
[info] Test run started
[info] Test org.apache.spark.network.sasl.ShuffleSecretManagerSuite.testMultipleRegisters started
[info] - horzcat, vertcat, eye, speye (18 milliseconds)
[info] - zeros (1 millisecond)
[info] - ones (2 milliseconds)
[info] - eye (1 millisecond)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.268s
[info] - mergeInPlace - Short (387 milliseconds)
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupOnlyRemovedApp started
[info] - rand (832 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupUsesExecutor started
[info] - randn (7 milliseconds)
[info] - diag (1 millisecond)
[info] - sprand (46 milliseconds)
[info] - sprandn (6 milliseconds)
[info] - toString (15 milliseconds)
[info] - accuracy - Int (658 milliseconds)
[info] - numNonzeros and numActives (1 millisecond)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.noCleanupAndCleanup started
[info] - fromBreeze with sparse matrix (38 milliseconds)
[info] - row/col iterator (9 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupMultipleExecutors started
[info] VectorsSuite:
[info] - dense vector construction with varargs (2 milliseconds)
[info] - dense vector construction from a double array (0 milliseconds)
[info] - sparse vector construction (1 millisecond)
[info] - sparse vector construction with unordered elements (4 milliseconds)
[info] - sparse vector construction with mismatched indices/values array (2 milliseconds)
[info] - sparse vector construction with too many indices vs size (1 millisecond)
[info] - sparse vector construction with negative indices (1 millisecond)
[info] - dense to array (0 milliseconds)
[info] - dense argmax (1 millisecond)
[info] - sparse to array (0 milliseconds)
[info] - sparse argmax (1 millisecond)
[info] - vector equals (5 milliseconds)
[info] - vectors equals with explicit 0 (4 milliseconds)
[info] - indexing dense vectors (1 millisecond)
[info] - indexing sparse vectors (2 milliseconds)
[info] - zeros (1 millisecond)
[info] - Vector.copy (1 millisecond)
[info] - fromBreeze (3 milliseconds)
[info] - sqdist (82 milliseconds)
[info] - foreach (12 milliseconds)
[info] - foreachActive (4 milliseconds)
[info] - foreachNonZero (3 milliseconds)
[info] - vector p-norm (8 milliseconds)
[info] - Vector numActive and numNonzeros (2 milliseconds)
[info] - Vector toSparse and toDense (2 milliseconds)
[info] - Vector.compressed (1 millisecond)
[info] - SparseVector.slice (2 milliseconds)
[info] - sparse vector only support non-negative length (3 milliseconds)
[info] - dot product only supports vectors of same size (2 milliseconds)
[info] - dense vector dot product (1 millisecond)
[info] - sparse vector dot product (1 millisecond)
[info] - mixed sparse and dense vector dot product (0 milliseconds)
[info] - iterator (2 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.963s
[info] - activeIterator (3 milliseconds)
[info] - nonZeroIterator (3 milliseconds)
[info] Test run started
[info] - mergeInPlace - Int (329 milliseconds)
[info] ScalaTest
[info] Run completed in 11 seconds, 859 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 46, Failed 0, Errors 0, Passed 46
[info] ScalaTest
[info] Run completed in 11 seconds, 855 milliseconds.
[info] Total number of tests run: 19
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 19, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
[info] Passed: Total 76, Failed 0, Errors 0, Passed 76, Ignored 1
[info] ScalaTest
[info] Run completed in 11 seconds, 924 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 106, Failed 0, Errors 0, Passed 105, Skipped 1
[info] ScalaTest
[info] Run completed in 11 seconds, 911 milliseconds.
[info] Total number of tests run: 95
[info] Suites: completed 8, aborted 0
[info] Tests: succeeded 95, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 95, Failed 0, Errors 0, Passed 95
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testGoodClient started
[info] - accuracy - Long (732 milliseconds)
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslClient started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslServer started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testBadClient started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 1.173s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testRetryAndUnrecoverable started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testSingleIOExceptionOnFirst started
[info] - mergeInPlace - Long (571 milliseconds)
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testUnrecoverableFailure started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testSingleIOExceptionOnSecond started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testThreeIOExceptions started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testNoFailures started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testTwoIOExceptions started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.186s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testEmptyBlockFetch started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailure started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailureAndSuccess started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testBatchFetchThreeShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchThreeShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testUseOldProtocol started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchThree started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testBatchFetchShuffleBlocksOrder started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchOne started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchShuffleBlocksOrder started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.168s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ErrorHandlerSuite.testPushErrorLogging started
[info] Test org.apache.spark.network.shuffle.ErrorHandlerSuite.testPushErrorRetry started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.004s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenDiskPersistedRDDBlocksWithMissingBlock started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFinalizeShuffleMerge started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testRegisterExecutor started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testBadMessages started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testCompatibilityWithOldVersion started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleBlocksInBatch started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenDiskPersistedRDDBlocks started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.256s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithFilesToKeepFetchRddEnabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithFilesToKeepFetchRddDisabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithFilesToKeepFetchRddEnabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupUsesExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithFilesToKeepFetchRddDisabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithFilesToKeepFetchRddDisabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupUsesExecutorWithFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithFilesToKeepFetchRddEnabled started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.33s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testHandlingRetriableFailures started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testPushOne started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testServerFailures started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testPushThree started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.018s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadSecret started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadAppId started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testValid started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.805s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDuplicateBlocksAreIgnoredWhenPrevStreamIsInProgress started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRequestForAbortedShufflePartitionThrowsException started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterMultipleDataBlocks started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBasicBlockMerge started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCollision started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterDuplicateBlockDoesNotInterfereActiveStream started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDuplicateBlocksAreIgnoredWhenPrevStreamHasCompleted started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureWhileTruncatingFiles started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIOExceptionsExceededThreshold started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testOnFailureInvokedMoreThanOncePerBlock started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterData started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testWritingPendingBufsIsAbortedImmediatelyDuringComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeWithMultipleReducePartitions started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testUpdateLocalDirsOnlyOnce started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testTooLateArrival started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testNoIndexFile started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverIndexFileAfterIOExceptionsInFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverMetaFileAfterIOExceptionsInFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIncompleteStreamsAreOverwritten started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureInAStreamDoesNotInterfereWithStreamWhichIsWriting started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDeferredBufsAreWrittenDuringOnData started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverIndexFileAfterIOExceptions started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCleanUpDirectory started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIOExceptionsDuringMetaUpdateIncreasesExceptionCount started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDividingMergedBlocksIntoChunks started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testPendingBlockIsAbortedImmediately started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDeferredBufsAreWrittenDuringOnComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverMetaFileAfterIOExceptions started
[info] Test run finished: 0 failed, 0 ignored, 29 total, 0.404s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.BlockTransferMessagesSuite.serializeOpenShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.BlockTransferMessagesSuite.testLocalDirsMessages started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.004s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchUnregisteredExecutor started
[info] ExternalSorterSuite:
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongExecutor started
[info] - accuracy - String (2 seconds, 351 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testRegisterWithCustomShuffleManager started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchCorruptRddBlock started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNoServer started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchDeletedRddBlock started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchValidRddBlock started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testRemoveRddBlocks started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchThreeSort started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongBlockId started
[info] - mergeInPlace - String (1 second, 208 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNonexistent started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchOneSort started
[info] Test run finished: 0 failed, 0 ignored, 12 total, 1.761s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testSortShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testBadRequests started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.jsonSerializationOfExecutorRegistration started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.216s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testSaslAppIsolation started
[info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testAuthEngineAppIsolation started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.645s
[info] - empty data stream with kryo ser (2 seconds, 838 milliseconds)
[info] - empty data stream with java ser (285 milliseconds)
[info] - few elements per partition with kryo ser (452 milliseconds)
[info] - few elements per partition with java ser (375 milliseconds)
[info] - empty partitions with spilling with kryo ser (831 milliseconds)
[info] DistributedSuite:
[info] - empty partitions with spilling with java ser (457 milliseconds)
[info] - accuracy - Byte array (4 seconds, 893 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 7 total, 15.543s
[info] Test run started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnException started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnClose started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.testActiveStreams started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulFetch started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedRPC started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedFetch started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulRPC started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.clearAllOutstandingRequests started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.067s
[info] Test run started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testNonMatching started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslAuthentication started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessageChunking started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testServerAlwaysEncrypt started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDataEncryptionIsActuallyEnabled started
[info] - mergeInPlace - Byte array (3 seconds, 54 milliseconds)
[info] - incompatible merge (3 milliseconds)
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption started
[info] ScalaTest
[info] Run completed in 24 seconds, 583 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 103, Failed 0, Errors 0, Passed 103
[info] ScalaTest
[info] Run completed in 24 seconds, 802 milliseconds.
[info] Total number of tests run: 29
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 29, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 29, Failed 0, Errors 0, Passed 29
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDelegates started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessage started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testMatching started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testRpcHandlerDelegate started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.846s
[info] Test run started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchNonExistentChunk started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchFileChunk started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBothChunks started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchChunkAndNonExistent started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBufferChunk started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.526s
[info] Test run started
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.furtherRequestsDelay started
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@5bd4b885 rejected from java.util.concurrent.ThreadPoolExecutor@1ed63d9b[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:668)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
	at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
	at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
	at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[info] - task throws not serializable exception (11 seconds, 80 milliseconds)
[info] - local-cluster format (10 milliseconds)
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] MesosCoarseGrainedSchedulerBackendSuite:
[info] - spilling in local cluster with kryo ser (12 seconds, 268 milliseconds)
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutCleanlyClosesClient started
[info] - mesos supports killing and limiting executors (4 seconds, 986 milliseconds)
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - mesos supports killing and relaunching tasks with executors (548 milliseconds)
[info] - mesos supports spark.executor.cores (460 milliseconds)
[info] - simple groupByKey (8 seconds, 10 milliseconds)
[info] - mesos supports unset spark.executor.cores (260 milliseconds)
[info] - mesos does not acquire more than spark.cores.max (232 milliseconds)
[info] - mesos does not acquire gpus if not specified (256 milliseconds)
[info] - mesos does not acquire more than spark.mesos.gpus.max (180 milliseconds)
[info] - mesos declines offers that violate attribute constraints (327 milliseconds)
[info] - mesos declines offers with a filter when reached spark.cores.max (242 milliseconds)
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] - mesos declines offers with a filter when maxCores not a multiple of executor.cores (548 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] - mesos declines offers with a filter when reached spark.cores.max with executor.cores (417 milliseconds)
[info] - mesos assigns tasks round-robin on offers (633 milliseconds)
[info] - mesos creates multiple executors on a single agent (295 milliseconds)
[info] - mesos doesn't register twice with the same shuffle service (237 milliseconds)
[info] - Port offer decline when there is no appropriate range (254 milliseconds)
[info] - Port offer accepted when ephemeral ports are used (226 milliseconds)
[info] - Port offer accepted with user defined port numbers (191 milliseconds)
[info] - spilling in local cluster with java ser (12 seconds, 430 milliseconds)
[info] - mesos kills an executor when told (278 milliseconds)
[info] - weburi is set in created scheduler driver (204 milliseconds)
[info] - failover timeout is set in created scheduler driver (284 milliseconds)
[info] - honors unset spark.mesos.containerizer (563 milliseconds)
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutInactiveRequests started
[info] - honors spark.mesos.containerizer="mesos" (389 milliseconds)
[info] - docker settings are reflected in created tasks (410 milliseconds)
[info] - force-pull-image option is disabled by default (181 milliseconds)
[info] - groupByKey where map output sizes exceed maxMbInFlight (8 seconds, 828 milliseconds)
[info] - mesos supports spark.executor.uri (204 milliseconds)
[info] - mesos supports setting fetcher cache (119 milliseconds)
[info] - mesos supports disabling fetcher cache (170 milliseconds)
[info] - mesos sets task name to spark.app.name (129 milliseconds)
[info] - mesos sets configurable labels on tasks (158 milliseconds)
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] - mesos supports spark.mesos.network.name and spark.mesos.network.labels (202 milliseconds)
[info] ScalaTest
[info] Run completed in 87 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] - SPARK-28778 '--hostname' shouldn't be set for executor when virtual network is enabled (2 seconds, 435 milliseconds)
[info] - supports spark.scheduler.minRegisteredResourcesRatio (465 milliseconds)
[info] - accumulators (6 seconds, 918 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 3 total, 32.355s
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslClientFallback started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslServerFallback started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthReplay started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testNewAuth started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testLargeMessageEncryption started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthFailure started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.916s
[info] Test run started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.streamStatesAreFreedWhenConnectionIsClosedEvenIfBufferIteratorThrowsException started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.testMissingChunk started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.managedBuffersAreFreedWhenConnectionIsClosed started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.047s
[info] Test run started
[info] Test org.apache.spark.network.TransportRequestHandlerSuite.handleStreamRequest started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.004s
[info] Test run started
[info] Test org.apache.spark.network.protocol.EncodersSuite.testBitmapArraysEncodeDecode started
[info] Test org.apache.spark.network.protocol.EncodersSuite.testRoaringBitmapEncodeShouldFailWhenBufferIsSmall started
[info] Test org.apache.spark.network.protocol.EncodersSuite.testRoaringBitmapEncodeDecode started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.033s
[info] Test run started
[info] Test org.apache.spark.network.StreamSuite.testSingleStream started
[info] Test org.apache.spark.network.StreamSuite.testMultipleStreams started
[info] Test org.apache.spark.network.StreamSuite.testConcurrentStreams started
[info] Test org.apache.spark.network.StreamSuite.testZeroLengthStream started
[info] - supports data locality with dynamic allocation (6 seconds, 133 milliseconds)
[info] - Creates an env-based reference secrets. (326 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.91s
[info] Test run started
[info] Test org.apache.spark.network.crypto.TransportCipherSuite.testBufferNotLeaksOnInternalError started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.078s
[info] Test run started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamConcurrently started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendOneWayMessage started
[info] Test org.apache.spark.network.RpcIntegrationSuite.singleRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.throwErrorRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.doubleTrouble started
[info] Test org.apache.spark.network.RpcIntegrationSuite.doubleRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.returnErrorRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamFailures started
[info] - Creates an env-based value secrets. (177 milliseconds)
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamOneAtATime started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendSuccessAndFailure started
[info] - Creates file-based reference secrets. (203 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.578s
[info] Test run started
[info] Test org.apache.spark.network.ProtocolSuite.responses started
[info] Test org.apache.spark.network.ProtocolSuite.requests started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.01s
[info] - spilling in local cluster with many reduce tasks with kryo ser (14 seconds, 685 milliseconds)
[info] - Creates a file-based value secrets. (280 milliseconds)
[info] MesosClusterManagerSuite:
[info] - mesos fine-grained (152 milliseconds)
[info] - mesos coarse-grained (104 milliseconds)
[info] - mesos with zookeeper (94 milliseconds)
[info] - mesos with i/o encryption throws error (211 milliseconds)
[info] MesosFineGrainedSchedulerBackendSuite:
[info] - weburi is set in created scheduler driver (619 milliseconds)
[info] - Use configured mesosExecutor.cores for ExecutorInfo (97 milliseconds)
[info] - check spark-class location correctly (9 milliseconds)
[info] - spark docker properties correctly populate the DockerInfo message (34 milliseconds)
[info] - mesos resource offers result in launching tasks (36 milliseconds)
[info] - can handle multiple roles (10 milliseconds)
[info] MesosProtoUtilsSuite:
[info] - mesosLabels (2 milliseconds)
[info] - broadcast variables (6 seconds, 286 milliseconds)
[info] MesosSchedulerUtilsSuite:
[info] - use at-least minimum overhead (27 milliseconds)
[info] - use overhead if it is greater than minimum value (2 milliseconds)
[info] - use spark.mesos.executor.memoryOverhead (if set) (1 millisecond)
[info] - parse a non-empty constraint string correctly (7 milliseconds)
[info] - parse an empty constraint string correctly (1 millisecond)
[info] - throw an exception when the input is malformed (4 milliseconds)
[info] - empty values for attributes' constraints matches all values (4 milliseconds)
[info] - subset match is performed for set attributes (1 millisecond)
[info] - less than equal match is performed on scalar attributes (2 milliseconds)
[info] - contains match is performed for range attributes (28 milliseconds)
[info] - equality match is performed for text attributes (1 millisecond)
[info] - Port reservation is done correctly with user specified ports only (11 milliseconds)
[info] - Port reservation is done correctly with all random ports (2 milliseconds)
[info] - Port reservation is done correctly with user specified ports only - multiple ranges (2 milliseconds)
[info] - Port reservation is done correctly with all random ports - multiple ranges (1 millisecond)
[info] - Principal specified via spark.mesos.principal (15 milliseconds)
[info] - Principal specified via spark.mesos.principal.file (19 milliseconds)
[info] - Principal specified via spark.mesos.principal.file that does not exist (3 milliseconds)
[info] - Principal specified via SPARK_MESOS_PRINCIPAL (1 millisecond)
[info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE (1 millisecond)
[info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE that does not exist (0 milliseconds)
[info] - Secret specified via spark.mesos.secret (0 milliseconds)
[info] - Principal specified via spark.mesos.secret.file (1 millisecond)
[info] - Principal specified via spark.mesos.secret.file that does not exist (1 millisecond)
[info] - Principal specified via SPARK_MESOS_SECRET (0 milliseconds)
[info] - Principal specified via SPARK_MESOS_SECRET_FILE (1 millisecond)
[info] - Secret specified with no principal (2 milliseconds)
[info] - Principal specification preference (0 milliseconds)
[info] - Secret specification preference (1 millisecond)
[info] MesosRestServerSuite:
[info] - test default driver overhead memory (100 milliseconds)
[info] - test driver overhead memory with overhead factor (25 milliseconds)
[info] - test configured driver overhead memory (25 milliseconds)
[info] MesosClusterDispatcherArgumentsSuite:
Using host: 192.168.123.1
Using port: 7077
Using webUiPort: 8081
Framework Name: Spark Cluster
Spark Config properties set:
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.test.home,/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2)
(spark.mesos.key2,value2)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test if spark config args are passed successfully (39 milliseconds)
Using host: localhost
Using port: 1212
Using webUiPort: 2323
Framework Name: myFramework
Spark Config properties set:
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.test.home,/home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test non conf settings (3 milliseconds)
[info] MesosSchedulerBackendUtilSuite:
[info] - ContainerInfo fails to parse invalid docker parameters (7 milliseconds)
[info] - ContainerInfo parses docker parameters (1 millisecond)
[info] - SPARK-28778 ContainerInfo respects Docker network configuration (2 milliseconds)
[info] MesosClusterSchedulerSuite:
[info] - can queue drivers (39 milliseconds)
[info] - can kill queued drivers (30 milliseconds)
[info] - can handle multiple roles (55 milliseconds)
[info] - escapes commandline args for the shell (60 milliseconds)
[info] - supports spark.mesos.driverEnv.* (33 milliseconds)
[info] - supports spark.mesos.network.name and spark.mesos.network.labels (31 milliseconds)
[info] - supports setting fetcher cache (38 milliseconds)
[info] - supports setting fetcher cache on the dispatcher (31 milliseconds)
[info] - supports disabling fetcher cache (30 milliseconds)
[info] - accept/decline offers with driver constraints (77 milliseconds)
[info] - supports spark.mesos.driver.labels (28 milliseconds)
[info] - can kill supervised drivers (31 milliseconds)
[info] JobGeneratorSuite:
[info] - SPARK-27347: do not restart outdated supervised drivers (1 second, 532 milliseconds)
[info] - Declines offer with refuse seconds = 120. (32 milliseconds)
[info] - Creates an env-based reference secrets. (38 milliseconds)
[info] - Creates an env-based value secrets. (31 milliseconds)
[info] - Creates file-based reference secrets. (32 milliseconds)
[info] - Creates a file-based value secrets. (42 milliseconds)
[info] - assembles a valid driver command, escaping all confs and args (42 milliseconds)
[info] - SPARK-23499: Test dispatcher priority queue with non float value (27 milliseconds)
[info] - SPARK-23499: Get driver priority (38 milliseconds)
[info] - SPARK-23499: Can queue drivers with priority (42 milliseconds)
[info] - SPARK-23499: Can queue drivers with negative priority (37 milliseconds)
[info] MesosClusterDispatcherSuite:
[info] - prints usage on empty input (14 milliseconds)
[info] - prints usage with only --help (24 milliseconds)
[info] - prints error with unrecognized options (13 milliseconds)
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] PregelSuite:
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] - repeatedly failing task (6 seconds, 731 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] - SPARK-6222: Do not clear received block data too soon (9 seconds, 431 milliseconds)
[info] ReceiverInputDStreamSuite:
[info] - Without WAL enabled: createBlockRDD creates empty BlockRDD when no block info (345 milliseconds)
[info] - Without WAL enabled: createBlockRDD creates correct BlockRDD with block info (507 milliseconds)
[info] - Without WAL enabled: createBlockRDD filters non-existent blocks before creating BlockRDD (308 milliseconds)
[info] - With WAL enabled: createBlockRDD creates empty WALBackedBlockRDD when no block info (270 milliseconds)
[info] - With WAL enabled: createBlockRDD creates correct WALBackedBlockRDD with all block info having WAL info (243 milliseconds)
[info] - With WAL enabled: createBlockRDD creates BlockRDD when some block info don't have WAL info (267 milliseconds)
[info] FileBasedWriteAheadLogWithFileCloseAfterWriteSuite:
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] - FileBasedWriteAheadLog - read all logs (293 milliseconds)
[info] - 1 iteration (7 seconds, 353 milliseconds)
[info] - FileBasedWriteAheadLog - write logs (1 second, 785 milliseconds)
[info] - spilling in local cluster with many reduce tasks with java ser (17 seconds, 257 milliseconds)
[info] - FileBasedWriteAheadLog - read all logs after write (1 second, 113 milliseconds)
[info] - cleanup of intermediate files in sorter (316 milliseconds)
[info] - cleanup of intermediate files in sorter with failures (266 milliseconds)
[info] - chain propagation (3 seconds, 165 milliseconds)
[info] ConnectedComponentsSuite:
[info] - FileBasedWriteAheadLog - clean old logs (675 milliseconds)
[info] - cleanup of intermediate files in shuffle (1 second, 227 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs synchronously (1 second, 276 milliseconds)
[info] - cleanup of intermediate files in shuffle with failures (611 milliseconds)
[info] - no sorting or partial aggregation with kryo ser (398 milliseconds)
[info] - no sorting or partial aggregation with java ser (269 milliseconds)
[info] - no sorting or partial aggregation with spilling with kryo ser (397 milliseconds)
[info] - no sorting or partial aggregation with spilling with java ser (215 milliseconds)
[info] - sorting, no partial aggregation with kryo ser (161 milliseconds)
[info] - sorting, no partial aggregation with java ser (191 milliseconds)
[info] - Grid Connected Components (3 seconds, 959 milliseconds)
[info] - sorting, no partial aggregation with spilling with kryo ser (202 milliseconds)
[info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (2 seconds, 735 milliseconds)
[info] - FileBasedWriteAheadLog - do not create directories or files unless write (23 milliseconds)
[info] - sorting, no partial aggregation with spilling with java ser (302 milliseconds)
[info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (224 milliseconds)
[info] - repeatedly failing task that crashes JVM (13 seconds, 617 milliseconds)
[info] - FileBasedWriteAheadLog - close after write flag (35 milliseconds)
[info] - partial aggregation, no sorting with kryo ser (147 milliseconds)
[info] RateLimiterSuite:
[info] - rate limiter initializes even without a maxRate set (2 milliseconds)
[info] - rate limiter updates when below maxRate (1 millisecond)
[info] - rate limiter stays below maxRate despite large updates (1 millisecond)
[info] ReceivedBlockTrackerSuite:
[info] - block addition, and block to batch allocation (32 milliseconds)
[info] - partial aggregation, no sorting with java ser (151 milliseconds)
[info] - partial aggregation, no sorting with spilling with kryo ser (230 milliseconds)
[info] - partial aggregation, no sorting with spilling with java ser (174 milliseconds)
[info] - partial aggregation and sorting with kryo ser (330 milliseconds)
[info] - partial aggregation and sorting with java ser (314 milliseconds)
[info] - partial aggregation and sorting with spilling with kryo ser (186 milliseconds)
[info] - partial aggregation and sorting with spilling with java ser (293 milliseconds)
[info] - Reverse Grid Connected Components (2 seconds, 675 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] ScalaTest
[info] Run completed in 133 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - sort without breaking sorting contracts with kryo ser (3 seconds, 500 milliseconds)
[info] - Chain Connected Components (3 seconds, 590 milliseconds)
[info] ScalaTest
[info] Run completed in 58 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] - sort without breaking sorting contracts with java ser (3 seconds, 332 milliseconds)
[info] - sort without breaking timsort contracts for large arrays !!! IGNORED !!!
[info] - Reverse Chain Connected Components (4 seconds, 60 milliseconds)
[info] - spilling with hash collisions (1 second, 188 milliseconds)
[info] - Connected Components on a Toy Connected Graph (1 second, 63 milliseconds)
[info] ShortestPathsSuite:
[info] - Shortest Path Computations (1 second, 176 milliseconds)
[info] PeriodicGraphCheckpointerSuite:
[info] - Persisting (648 milliseconds)
[info] - spilling with many hash collisions (2 seconds, 131 milliseconds)
[info] - spilling with hash collisions using the Int.MaxValue key (1 second, 5 milliseconds)
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] - Checkpointing (2 seconds, 415 milliseconds)
[info] PageRankSuite:
[info] - spilling with null keys and values (804 milliseconds)
[info] - repeatedly failing task that crashes JVM with a zero exit code (SPARK-16925) (15 seconds, 502 milliseconds)
[info] - Star PageRank (2 seconds, 311 milliseconds)
[info] - sorting updates peak execution memory (2 seconds, 77 milliseconds)
[info] - force to spill for external sorter (1 second, 366 milliseconds)
[info] DAGSchedulerSuite:
[info] - Star PersonalPageRank (3 seconds, 59 milliseconds)
[info] - [SPARK-3353] parent stage should have lower stage id (1 second, 664 milliseconds)
[info] - [SPARK-13902] Ensure no duplicate stages are created (207 milliseconds)
[info] - All shuffle files on the storage endpoint should be cleaned up when it is lost (151 milliseconds)
[info] - SPARK-32003: All shuffle files for executor should be cleaned up on fetch failure (182 milliseconds)
[info] - zero split job (120 milliseconds)
[info] - run trivial job (222 milliseconds)
[info] - run trivial job w/ dependency (216 milliseconds)
[info] - equals and hashCode AccumulableInfo (1 millisecond)
[info] - cache location preferences w/ dependency (126 milliseconds)
[info] - regression test for getCacheLocs (144 milliseconds)
[info] - getMissingParentStages should consider all ancestor RDDs' cache statuses (100 milliseconds)
[info] - avoid exponential blowup when getting preferred locs list (226 milliseconds)
[info] - unserializable task (114 milliseconds)
[info] - trivial job failure (176 milliseconds)
[info] - trivial job cancellation (100 milliseconds)
[info] - job cancellation no-kill backend (101 milliseconds)
[info] - run trivial shuffle (124 milliseconds)
[info] - run trivial shuffle with fetch failure (121 milliseconds)
[info] - shuffle files not lost when executor process lost with shuffle service (121 milliseconds)
[info] - shuffle files lost when worker lost with shuffle service (125 milliseconds)
[info] - shuffle files lost when worker lost without shuffle service (134 milliseconds)
[info] - caching (encryption = off) (8 seconds, 652 milliseconds)
[info] - shuffle files not lost when executor failure with shuffle service (127 milliseconds)
[info] - shuffle files lost when executor failure without shuffle service (120 milliseconds)
[info] - SPARK-28967 properties must be cloned before posting to listener bus for 0 partition (90 milliseconds)
[info] - Single stage fetch failure should not abort the stage. (151 milliseconds)
[info] - Multiple consecutive stage fetch failures should lead to job being aborted. (257 milliseconds)
[info] - Failures in different stages should not trigger an overall abort (312 milliseconds)
[info] - Non-consecutive stage failures don't trigger abort (292 milliseconds)
[info] - trivial shuffle with multiple fetch failures (124 milliseconds)
[info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by FetchFailure (153 milliseconds)
[info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by TaskKilled (115 milliseconds)
[info] - Fail the job if a barrier ResultTask failed (405 milliseconds)
[info] - block addition, and block to batch allocation with many blocks (26 seconds, 769 milliseconds)
[info] - late fetch failures don't cause multiple concurrent attempts for the same map stage (142 milliseconds)
[info] - recovery with write ahead logs should remove only allocated blocks from received queue (31 milliseconds)
[info] - extremely late fetch failures don't cause multiple concurrent attempts for the same stage (177 milliseconds)
[info] - task events always posted in speculation / when stage is killed (152 milliseconds)
[info] - ignore late map task completions (138 milliseconds)
[info] - run shuffle with map stage failure (161 milliseconds)
[info] - block allocation to batch should not lose blocks from received queue (877 milliseconds)
[info] - shuffle fetch failure in a reused shuffle dependency (191 milliseconds)
[info] - recovery and cleanup with write ahead logs (193 milliseconds)
[info] - disable write ahead log when checkpoint directory is not set (1 millisecond)
[info] - Grid PageRank (7 seconds, 644 milliseconds)
[info] - parallel file deletion in FileBasedWriteAheadLog is robust to deletion error (88 milliseconds)
[info] - don't submit stage until its dependencies map outputs are registered (SPARK-5259) (182 milliseconds)
[info] ReceivedBlockHandlerWithEncryptionSuite:
[info] - register map outputs correctly after ExecutorLost and task Resubmitted (139 milliseconds)
[info] - failure of stage used by two jobs (225 milliseconds)
[info] - BlockManagerBasedBlockHandler - store blocks (403 milliseconds)
[info] - stage used by two jobs, the first no longer active (SPARK-6880) (184 milliseconds)
[info] - BlockManagerBasedBlockHandler - handle errors in storing block (11 milliseconds)
[info] - stage used by two jobs, some fetch failures, and the first job no longer active (SPARK-6880) (253 milliseconds)
[info] - run trivial shuffle with out-of-band executor failure and retry (135 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - store blocks (216 milliseconds)
[info] - recursive shuffle failures (155 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (40 milliseconds)
[info] - cached post-shuffle (133 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - clean old blocks (79 milliseconds)
[info] - SPARK-30388: shuffle fetch failed on speculative task, but original task succeed (655 milliseconds)
[info] - Test Block - count messages (498 milliseconds)
[info] - misbehaved accumulator should not crash DAGScheduler and SparkContext (289 milliseconds)
[info] - Test Block - isFullyConsumed (58 milliseconds)
[info] ReceiverSchedulingPolicySuite:
[info] - rescheduleReceiver: empty executors (1 millisecond)
[info] - rescheduleReceiver: receiver preferredLocation (5 milliseconds)
[info] - rescheduleReceiver: return all idle executors if there are any idle executors (5 milliseconds)
[info] - rescheduleReceiver: return all executors that have minimum weight if no idle executors (5 milliseconds)
[info] - scheduleReceivers: schedule receivers evenly when there are more receivers than executors (9 milliseconds)
[info] - scheduleReceivers: schedule receivers evenly when there are more executors than receivers (6 milliseconds)
[info] - scheduleReceivers: schedule receivers evenly when the preferredLocations are even (11 milliseconds)
[info] - scheduleReceivers: return empty if no receiver (1 millisecond)
[info] - scheduleReceivers: return empty scheduled executors if no executors (4 milliseconds)
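The ReceiverSchedulingPolicy cases above pin down a simple property: receivers spread evenly over executors, and an empty executor list yields an empty schedule. A minimal round-robin sketch of that property in plain Scala; names are illustrative, not Spark's scheduler, and preferredLocation constraints are ignored:

// Hypothetical even assignment of receivers to executors.
def scheduleEvenly(receivers: Seq[String], executors: Seq[String]): Map[String, String] =
  if (executors.isEmpty) Map.empty // "return empty scheduled executors if no executors"
  else receivers.zipWithIndex.map { case (r, i) =>
    r -> executors(i % executors.length)
  }.toMap

// scheduleEvenly(Seq("r1", "r2", "r3"), Seq("e1", "e2"))
//   == Map(r1 -> e1, r2 -> e2, r3 -> e1)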
[info] MapWithStateSuite:
[info] - misbehaved accumulator should not impact other accumulators (221 milliseconds)
[info] - state - get, exists, update, remove (8 milliseconds)
[info] - misbehaved resultHandler should not crash DAGScheduler and SparkContext (170 milliseconds)
[info] - invalid spark.job.interruptOnCancel should not crash DAGScheduler (134 milliseconds)
[info] - getPartitions exceptions should not crash DAGScheduler and SparkContext (SPARK-8606) (123 milliseconds)
[info] - getPreferredLocations errors should not crash DAGScheduler and SparkContext (SPARK-8606) (142 milliseconds)
[info] - accumulator not calculated for resubmitted result stage (147 milliseconds)
[info] - accumulator not calculated for resubmitted task in result stage (175 milliseconds)
[info] - accumulators are updated on exception failures and task killed (120 milliseconds)
[info] - reduce tasks should be placed locally with map output (187 milliseconds)
[info] - caching (encryption = on) (7 seconds, 904 milliseconds)
[info] - reduce task locality preferences should only include machines with largest map outputs (215 milliseconds)
[info] - mapWithState - basic operations with simple API (1 second, 564 milliseconds)
[info] - stages with both narrow and shuffle dependencies use narrow ones for locality (113 milliseconds)
[info] - Spark exceptions should include call site in stack trace (153 milliseconds)
[info] - catch errors in event loop (197 milliseconds)
[info] - simple map stage submission (182 milliseconds)
[info] - map stage submission with reduce stage also depending on the data (151 milliseconds)
[info] - mapWithState - basic operations with advanced API (811 milliseconds)
[info] - map stage submission with fetch failure (205 milliseconds)
[info] - mapWithState - type inferencing and class tags (17 milliseconds)
[info] - map stage submission with multiple shared stages and failures (378 milliseconds)
[info] - Trigger mapstage's job listener in submitMissingTasks (188 milliseconds)
[info] - map stage submission with executor failure late map task completions (207 milliseconds)
[info] - mapWithState - states as mapped data (775 milliseconds)
[info] - getShuffleDependenciesAndResourceProfiles correctly returns only direct shuffle parents (169 milliseconds)
[info] - mapWithState - initial states, with nothing returned as from mapping function (1 second, 50 milliseconds)
[info] - SPARK-17644: After one stage is aborted for too many failed attempts, subsequent stages still behave correctly on fetch failures (2 seconds, 45 milliseconds)
[info] - [SPARK-19263] DAGScheduler should not submit multiple active tasksets, even with late completions from earlier stage attempts (364 milliseconds)
[info] - mapWithState - state removing (1 second, 336 milliseconds)
[info] - task end event should have updated accumulators (SPARK-20342) (716 milliseconds)
[info] - Barrier task failures from the same stage attempt don't trigger multiple stage retries (203 milliseconds)
[info] - Barrier task failures from a previous stage attempt don't trigger stage retry (222 milliseconds)
[info] - SPARK-25341: abort stage while using old fetch protocol (226 milliseconds)
[info] - SPARK-25341: retry all the succeeding stages when the map stage is indeterminate (478 milliseconds)
[info] - SPARK-25341: continuous indeterminate stage roll back (397 milliseconds)
[info] - SPARK-29042: Sampled RDD with unordered input should be indeterminate (208 milliseconds)
[info] - mapWithState - state timing out (2 seconds, 375 milliseconds)
[info] - caching on disk (encryption = off) (7 seconds, 939 milliseconds)
[info] - SPARK-23207: cannot rollback a result stage (241 milliseconds)
[info] - mapWithState - checkpoint durations (181 milliseconds)
[info] - SPARK-23207: local checkpoint fail to rollback (checkpointed before) (225 milliseconds)
[info] - SPARK-23207: local checkpoint fail to rollback (checkpointing now) (119 milliseconds)
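The MapWithStateSuite lines above exercise keyed state that is updated once per batch and can time out when idle. A dependency-free model of that update step, assuming a state map with last-update timestamps and a timeout threshold (all names here are illustrative, not Spark's API):

// Hypothetical, plain-Scala model of the keyed-state update exercised by
// the mapWithState tests above; not Spark's implementation.
object KeyedStateModel {
  final case class State(value: Int, lastUpdated: Long)

  // Apply one batch of (key, increment) records, then drop entries that
  // have been idle longer than timeoutMs.
  def updateBatch(
      state: Map[String, State],
      batch: Seq[(String, Int)],
      now: Long,
      timeoutMs: Long): Map[String, State] = {
    val updated = batch.foldLeft(state) { case (acc, (k, v)) =>
      val prev = acc.get(k).map(_.value).getOrElse(0)
      acc.updated(k, State(prev + v, now))
    }
    updated.filter { case (_, s) => now - s.lastUpdated <= timeoutMs }
  }

  def main(args: Array[String]): Unit = {
    var s = Map.empty[String, State]
    s = updateBatch(s, Seq("a" -> 1, "b" -> 1), now = 1000L, timeoutMs = 2000L)
    s = updateBatch(s, Seq("a" -> 1), now = 2500L, timeoutMs = 2000L)
    println(s) // "a" counted twice; "b" still present but idle since t=1000
  }
}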
-------------------------------------------
Time: 1000 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,1)

-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,2)
(b,1)

[info] - SPARK-23207: reliable checkpoint can avoid rollback (checkpointed before) (573 milliseconds)
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,3)
(b,2)
(c,1)

-------------------------------------------
Time: 5000 ms
-------------------------------------------
(c,1)
(a,4)
(b,3)

[info] - SPARK-23207: reliable checkpoint fail to rollback (checkpointing now) (220 milliseconds)
-------------------------------------------
Time: 6000 ms
-------------------------------------------
(b,3)
(c,1)
(a,5)

-------------------------------------------
Time: 7000 ms
-------------------------------------------
(a,5)
(b,3)
(c,1)

[info] - mapWithState - driver failure recovery (1 second, 68 milliseconds)
[info] - SPARK-27164: RDD.countApprox on empty RDDs schedules jobs which never complete (117 milliseconds)
[info] JavaStreamingListenerWrapperSuite:
[info] - basic (17 milliseconds)
[info] DurationSuite:
[info] - less (1 millisecond)
[info] - lessEq (0 milliseconds)
[info] - greater (0 milliseconds)
[info] - greaterEq (0 milliseconds)
[info] - plus (1 millisecond)
[info] - minus (0 milliseconds)
[info] - times (0 milliseconds)
[info] - div (1 millisecond)
[info] - isMultipleOf (0 milliseconds)
[info] - min (0 milliseconds)
[info] - max (0 milliseconds)
[info] - isZero (0 milliseconds)
[info] - Milliseconds (0 milliseconds)
[info] - Seconds (0 milliseconds)
[info] - Minutes (1 millisecond)
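The DurationSuite cases above are simple arithmetic and comparisons over a millisecond count. A minimal stand-in value class with the same operations, for illustration only (not the streaming Duration class itself):

// Illustrative millisecond wrapper mirroring the operations tested above.
final case class Dur(ms: Long) extends Ordered[Dur] {
  def +(o: Dur): Dur = Dur(ms + o.ms)                 // plus
  def -(o: Dur): Dur = Dur(ms - o.ms)                 // minus
  def *(n: Int): Dur = Dur(ms * n)                    // times
  def /(o: Dur): Double = ms.toDouble / o.ms          // div
  def isMultipleOf(o: Dur): Boolean = o.ms != 0 && ms % o.ms == 0
  def isZero: Boolean = ms == 0
  def min(o: Dur): Dur = if (this < o) this else o
  def max(o: Dur): Dur = if (this > o) this else o
  def compare(o: Dur): Int = java.lang.Long.compare(ms, o.ms) // less/greater etc.
}

// Dur(3000).isMultipleOf(Dur(1000)) == true; Dur(4000) - Dur(1000) == Dur(3000)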
[info] PIDRateEstimatorSuite:
[info] - the right estimator is created (7 milliseconds)
[info] - estimator checks ranges (3 milliseconds)
[info] - first estimate is None (3 milliseconds)
[info] - second estimate is not None (1 millisecond)
[info] - no estimate when no time difference between successive calls (2 milliseconds)
[info] - no estimate when no records in previous batch (1 millisecond)
[info] - no estimate when there is no processing delay (0 milliseconds)
[info] - Completions in zombie tasksets update status of non-zombie taskset (143 milliseconds)
[info] - estimate is never less than min rate (6 milliseconds)
[info] - with no accumulated or positive error, |I| > 0, follow the processing speed (6 milliseconds)
[info] - with no accumulated but some positive error, |I| > 0, follow the processing speed (5 milliseconds)
[info] - with some accumulated and some positive error, |I| > 0, stay below the processing speed (22 milliseconds)
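The PIDRateEstimatorSuite lines above assert boundary behavior: no estimate on the first call, with no time delta, no records, or no processing delay, and an estimate that is never below the minimum rate. A hedged sketch of a PID-style estimator with those guards; the gains and the exact update rule are assumptions, not Spark's implementation:

// Hedged sketch of a PID-style rate estimator with the guards asserted above.
class PidRateEstimator(
    kp: Double, ki: Double, kd: Double, minRate: Double, initRate: Double) {
  private var firstRun = true
  private var lastTimeMs = 0L
  private var lastRate = initRate
  private var lastError = 0.0
  private var accError = 0.0 // crude integral term

  // None on the first call, when time has not advanced, or when the last
  // batch had no records or no processing delay; otherwise a new rate,
  // clamped so it is never below minRate.
  def compute(timeMs: Long, elements: Long, procDelayMs: Long): Option[Double] =
    if (firstRun) { firstRun = false; lastTimeMs = timeMs; None }
    else if (timeMs <= lastTimeMs || elements == 0L || procDelayMs == 0L) None
    else {
      val dtSec = (timeMs - lastTimeMs) / 1000.0
      val processingRate = elements.toDouble / (procDelayMs / 1000.0)
      val error = lastRate - processingRate
      accError += error * dtSec
      val dError = (error - lastError) / dtSec
      val newRate = math.max(lastRate - kp * error - ki * accError - kd * dError, minRate)
      lastTimeMs = timeMs; lastRate = newRate; lastError = error
      Some(newRate)
    }
}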
[info] WindowOperationsSuite:
[info] - test default resource profile (160 milliseconds)
[info] - test 1 resource profile (253 milliseconds)
[info] - window - basic window (467 milliseconds)
[info] - test 2 resource profiles errors by default (176 milliseconds)
[info] - test 2 resource profile with merge conflict config true (138 milliseconds)
[info] - window - tumbling window (428 milliseconds)
[info] - test multiple resource profiles created from merging use same rp (177 milliseconds)
[info] - test merge 2 resource profiles multiple configs (3 milliseconds)
[info] - test merge 3 resource profiles (1 millisecond)
[info] - getShuffleDependenciesAndResourceProfiles returns deps and profiles correctly (174 milliseconds)
[info] FsHistoryProviderSuite:
[info] - window - larger window (582 milliseconds)
[info] - Parse application logs (inMemory = true) (343 milliseconds)
[info] - window - non-overlapping window (438 milliseconds)
[info] - window - persistence level (242 milliseconds)
[info] - Parse application logs (inMemory = false) (749 milliseconds)
[info] - SPARK-31608: parse application logs with HybridStore (298 milliseconds)
[info] - reduceByKeyAndWindow - basic reduction (721 milliseconds)
[info] - SPARK-3697: ignore files that cannot be read. (108 milliseconds)
[info] - history file is renamed from inprogress to completed (144 milliseconds)
[info] - Parse logs for an application that has not started (48 milliseconds)
[info] - SPARK-5582: empty log directory (131 milliseconds)
[info] - reduceByKeyAndWindow - key already in window and new value added into window (520 milliseconds)
[info] - apps with multiple attempts with order (597 milliseconds)
[info] - reduceByKeyAndWindow - new key added into window (651 milliseconds)
[info] - log urls without customization (301 milliseconds)
[info] - custom log urls, including FILE_NAME (354 milliseconds)
[info] - custom log urls, excluding FILE_NAME (238 milliseconds)
[info] - reduceByKeyAndWindow - key removed from window (877 milliseconds)
[info] - custom log urls with invalid attribute (205 milliseconds)
[info] - custom log urls, LOG_FILES not available while FILE_NAME is specified (190 milliseconds)
[info] - reduceByKeyAndWindow - larger slide time (642 milliseconds)
[info] - caching on disk (encryption = on) (7 seconds, 67 milliseconds)
[info] - reduceByKeyAndWindow - big test (876 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - basic reduction (341 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - key already in window and new value added into window (415 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - new key added into window (385 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - key removed from window (741 milliseconds)
[info] - custom log urls, app not finished, applyIncompleteApplication: true (3 seconds, 585 milliseconds)
[info] - custom log urls, app not finished, applyIncompleteApplication: false (164 milliseconds)
[info] - log cleaner (98 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - larger slide time (788 milliseconds)
[info] - should not clean inprogress application with lastUpdated time less than maxTime (77 milliseconds)
[info] - log cleaner for inProgress files (81 milliseconds)
[info] - Event log copy (105 milliseconds)
[info] - driver log cleaner (163 milliseconds)
[info] - SPARK-8372: new logs with no app ID are ignored (42 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - big test (1 second, 168 milliseconds)
[info] - provider correctly checks whether fs is in safe mode (830 milliseconds)
[info] - provider waits for safe mode to finish before initializing (55 milliseconds)
[info] - provider reports error after FS leaves safe mode (111 milliseconds)
[info] - ignore hidden files (62 milliseconds)
[info] - support history server ui admin acls (452 milliseconds)
[info] - mismatched version discards old listing (132 milliseconds)
[info] - reduceByKeyAndWindow with inverse and filter functions - big test (1 second, 107 milliseconds)
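The reduceByKeyAndWindow tests with an inverse function exercise the incremental form of windowed aggregation: as the window slides, the entering batch is reduced in and the leaving batch is reduced out, instead of re-reducing the whole window. A dependency-free model of one slide; function and object names are illustrative, not Spark's API:

// Plain-Scala model of one window slide with an invertible reduce.
object InvertibleWindowModel {
  def slide(
      windowSum: Map[String, Int],
      entering: Map[String, Int],
      leaving: Map[String, Int]): Map[String, Int] = {
    val added = entering.foldLeft(windowSum) { case (acc, (k, v)) =>
      acc.updated(k, acc.getOrElse(k, 0) + v) // reduce in the new batch
    }
    leaving.foldLeft(added) { case (acc, (k, v)) =>
      acc.updated(k, acc.getOrElse(k, 0) - v) // inverse-reduce the old batch
    }.filter(_._2 != 0) // keys whose count reaches zero leave the window
  }

  def main(args: Array[String]): Unit = {
    val w = slide(Map("a" -> 2, "b" -> 1), entering = Map("a" -> 1), leaving = Map("b" -> 1))
    println(w) // Map(a -> 3): "b" fell out of the window
  }
}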
[info] - invalidate cached UI (296 milliseconds)
[info] - clean up stale app information (210 milliseconds)
[info] - SPARK-21571: clean up removes invalid history files (69 milliseconds)
[info] - always find end event for finished apps (288 milliseconds)
[info] - parse event logs with optimizations off (77 milliseconds)
[info] - SPARK-24948: ignore files we don't have read permission on (488 milliseconds)
[info] - check in-progress event logs absolute length (343 milliseconds)
[info] - groupByKeyAndWindow (1 second, 854 milliseconds)
[info] - log cleaner with the maximum number of log files (456 milliseconds)
[info] - backwards compatibility with LogInfo from Spark 2.4 (8 milliseconds)
[info] - SPARK-29755 LogInfo should be serialized/deserialized by jackson properly (12 milliseconds)
[info] - SPARK-29755 AttemptInfoWrapper should be serialized/deserialized by jackson properly (11 milliseconds)
[info] - SPARK-29043: clean up specified event log (76 milliseconds)
[info] - caching in memory, replicated (encryption = off) (7 seconds, 918 milliseconds)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@7ab9f010 rejected from java.util.concurrent.ThreadPoolExecutor@3fbb7123[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:668)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
	at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
	at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
	at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[info] - countByWindow (799 milliseconds)
[info] - compact event log files (295 milliseconds)
[info] - SPARK-33146: don't let one bad rolling log folder prevent loading other applications (133 milliseconds)
[info] - SPARK-36354: EventLogFileReader should skip rolling event log directories with no logs (123 milliseconds)
[info] - countByValueAndWindow (442 milliseconds)
[info] TimeSuite:
[info] - less (0 milliseconds)
[info] - lessEq (0 milliseconds)
[info] - greater (0 milliseconds)
[info] - greaterEq (1 millisecond)
[info] - plus (0 milliseconds)
[info] - minus Time (0 milliseconds)
[info] - minus Duration (0 milliseconds)
[info] - floor (1 millisecond)
[info] - isMultipleOf (1 millisecond)
[info] - min (1 millisecond)
[info] - max (0 milliseconds)
[info] - until (0 milliseconds)
[info] - to (0 milliseconds)
[info] DStreamScopeSuite:
[info] - SPARK-33215: check ui view permissions without retrieving ui (204 milliseconds)
[info] RDDSuite:
[info] - dstream without scope (3 milliseconds)
[info] - input dstream without scope (4 milliseconds)
[info] - scoping simple operations (27 milliseconds)
[info] - scoping nested operations (72 milliseconds)
[info] - transform should allow RDD operations to be captured in scopes (38 milliseconds)
[info] - foreachRDD should allow RDD operations to be captured in scope (53 milliseconds)
[info] StreamingContextSuite:
[info] - from no conf constructor (106 milliseconds)
[info] - from no conf + spark home (166 milliseconds)
[info] - from no conf + spark home + env (115 milliseconds)
[info] - basic operations (1 second, 605 milliseconds)
[info] - serialization (3 milliseconds)
[info] - from conf with settings (707 milliseconds)
[info] - distinct with known partitioner preserves partitioning (501 milliseconds)
[info] - from existing SparkContext (249 milliseconds)
[info] - countApproxDistinct (277 milliseconds)
[info] - SparkContext.union (117 milliseconds)
[info] - from existing SparkContext with settings (224 milliseconds)
[info] - SparkContext.union parallel partition listing (136 milliseconds)
[info] - SparkContext.union creates UnionRDD if at least one RDD has no partitioner (2 milliseconds)
[info] - SparkContext.union creates PartitionAwareUnionRDD if all RDDs have partitioners (4 milliseconds)
[info] - PartitionAwareUnionRDD raises exception if at least one RDD has no partitioner (3 milliseconds)
[info] - SPARK-23778: empty RDD in union should not produce a UnionRDD (12 milliseconds)
[info] - partitioner aware union (234 milliseconds)
[info] - UnionRDD partition serialized size should be small (11 milliseconds)
[info] - fold (24 milliseconds)
[info] - fold with op modifying first arg (33 milliseconds)
[info] - aggregate (39 milliseconds)
[info] - from checkpoint (508 milliseconds)
[info] - checkPoint from conf (225 milliseconds)
[info] - state matching (1 millisecond)
[info] - treeAggregate (758 milliseconds)
[info] - start and stop state check (252 milliseconds)
[info] - start with non-serializable DStream checkpoints (148 milliseconds)
[info] - start failure should stop internal components (130 milliseconds)
[info] - start should set local properties of streaming jobs correctly (241 milliseconds)
[info] - treeAggregate with ops modifying first args (781 milliseconds)
[info] - start multiple times (201 milliseconds)
[info] - Grid PageRank with checkpoint (33 seconds, 573 milliseconds)
[info] - stop multiple times (380 milliseconds)
[info] - treeReduce (874 milliseconds)
[info] - stop before start (370 milliseconds)
[info] - basic caching (66 milliseconds)
[info] - caching with failures (25 milliseconds)
[info] - start after stop (177 milliseconds)
[info] - empty RDD (285 milliseconds)
[info] - repartitioned RDDs (339 milliseconds)
[info] - stop only streaming context (575 milliseconds)
[info] - stop(stopSparkContext=true) after stop(stopSparkContext=false) (371 milliseconds)
[info] - caching in memory, replicated (encryption = off) (with replication as stream) (7 seconds, 473 milliseconds)
[info] - Chain PageRank (2 seconds, 233 milliseconds)
[info] - repartitioned RDDs perform load balancing (3 seconds, 351 milliseconds)
[info] - coalesced RDDs (269 milliseconds)
[info] - coalesced RDDs with locality (99 milliseconds)
[info] - coalesced RDDs with partial locality (76 milliseconds)
[info] - coalesced RDDs with locality, large scale (10K partitions) (1 second, 701 milliseconds)
[info] - coalesced RDDs with partial locality, large scale (10K partitions) (571 milliseconds)
[info] - coalesced RDDs with locality, fail first pass (9 milliseconds)
[info] - zipped RDDs (71 milliseconds)
[info] - partition pruning (25 milliseconds)
[info] - caching in memory, replicated (encryption = on) (7 seconds, 565 milliseconds)
[info] - collect large number of empty partitions (4 seconds, 59 milliseconds)
[info] - stop gracefully (11 seconds, 175 milliseconds)
[info] - Chain PageRank with checkpoint (10 seconds, 965 milliseconds)
[info] - stop gracefully even if a receiver misses StopReceiver (981 milliseconds)
[info] - take (3 seconds, 172 milliseconds)
[info] - top with predefined ordering (135 milliseconds)
[info] - top with custom ordering (19 milliseconds)
[info] - takeOrdered with predefined ordering (16 milliseconds)
[info] - takeOrdered with limit 0 (0 milliseconds)
[info] - takeOrdered with custom ordering (15 milliseconds)
[info] - isEmpty (129 milliseconds)
[info] - sample preserves partitioner (4 milliseconds)
[info] - Chain PersonalizedPageRank (2 seconds, 425 milliseconds)
[info] - caching in memory, replicated (encryption = on) (with replication as stream) (8 seconds, 482 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = off) (7 seconds, 669 milliseconds)
[info] - Loop with source PageRank (12 seconds, 597 milliseconds)
[info] - stop slow receiver gracefully (16 seconds, 43 milliseconds)
[info] - registering and de-registering of streamingSource (198 milliseconds)
[info] - SPARK-28709 registering and de-registering of progressListener (403 milliseconds)
[info] - awaitTermination (2 seconds, 319 milliseconds)
[info] - awaitTermination after stop (324 milliseconds)
[info] - awaitTermination with error in task (1 second, 219 milliseconds)
[info] - awaitTermination with error in job generation (348 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = off) (with replication as stream) (8 seconds, 492 milliseconds)
[info] - awaitTerminationOrTimeout (1 second, 294 milliseconds)
[info] - getOrCreate (1 second, 195 milliseconds)
[info] - getActive and getActiveOrCreate (371 milliseconds)
[info] - takeSample (23 seconds, 779 milliseconds)
[info] - takeSample from an empty rdd (16 milliseconds)
[info] - randomSplit (647 milliseconds)
[info] - runJob on an invalid partition (9 milliseconds)
[info] - getActiveOrCreate with checkpoint (1 second, 517 milliseconds)
[info] - sort an empty RDD (29 milliseconds)
[info] - multiple streaming contexts (116 milliseconds)
[info] - sortByKey (174 milliseconds)
[info] - sortByKey ascending parameter (140 milliseconds)
[info] - sortByKey with explicit ordering (138 milliseconds)
[info] - repartitionAndSortWithinPartitions (41 milliseconds)
[info] - cartesian on empty RDD (40 milliseconds)
[info] - cartesian on non-empty RDDs (79 milliseconds)
[info] - intersection (98 milliseconds)
[info] - DStream and generated RDD creation sites (637 milliseconds)
[info] - intersection strips duplicates in an input (144 milliseconds)
[info] - zipWithIndex (67 milliseconds)
[info] - throw exception on using active or stopped context (179 milliseconds)
[info] - zipWithIndex with a single partition (16 milliseconds)
[info] - zipWithIndex chained with other RDDs (SPARK-4433) (36 milliseconds)
[info] - zipWithUniqueId (68 milliseconds)
[info] - retag with implicit ClassTag (32 milliseconds)
[info] - parent method (10 milliseconds)
[info] - getNarrowAncestors (46 milliseconds)
[info] - getNarrowAncestors with multiple parents (25 milliseconds)
[info] - getNarrowAncestors with cycles (37 milliseconds)
[info] - task serialization exception should not hang scheduler (52 milliseconds)
[info] - RDD.partitions() fails fast when partitions indices are incorrect (SPARK-13021) (2 milliseconds)
[info] - nested RDDs are not supported (SPARK-5063) (30 milliseconds)
[info] - actions cannot be performed inside of transformations (SPARK-5063) (44 milliseconds)
[info] - queueStream doesn't support checkpointing (712 milliseconds)
[info] - custom RDD coalescer (1 second, 6 milliseconds)
[info] - SPARK-18406: race between end-of-task and completion iterator read lock release (77 milliseconds)
[info] - Creating an InputDStream but not using it should not crash (967 milliseconds)
[info] - SPARK-27666: Do not release lock while TaskContext already completed (1 second, 81 milliseconds)
[info] - SPARK-23496: order of input partitions can result in severe skew in coalesce (14 milliseconds)
[info] - cannot run actions after SparkContext has been stopped (SPARK-5063) (154 milliseconds)
[info] - cannot call methods on a stopped SparkContext (SPARK-5063) (4 milliseconds)
[info] ExecutorSuite:
[info] - SPARK-15963: Catch `TaskKilledException` correctly in Executor.TaskRunner (354 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = on) (7 seconds, 837 milliseconds)
[info] - SPARK-19276: Handle FetchFailedExceptions that are hidden by user exceptions (201 milliseconds)
[info] - Executor's worker threads should be UninterruptibleThread (102 milliseconds)
[info] - SPARK-19276: OOMs correctly handled with a FetchFailure (108 milliseconds)
[info] - SPARK-23816: interrupts are not masked by a FetchFailure (69 milliseconds)
[info] - Gracefully handle error in task deserialization (10 milliseconds)
[info] - Heartbeat should drop zero accumulator updates (108 milliseconds)
[info] - Heartbeat should not drop zero accumulator updates when the conf is disabled (8 milliseconds)
[info] - Send task executor metrics in DirectTaskResult (114 milliseconds)
[info] - Send task executor metrics in TaskKilled (151 milliseconds)
[info] - Send task executor metrics in ExceptionFailure (126 milliseconds)
[info] - SPARK-34949: do not re-register BlockManager when executor is shutting down (9 milliseconds)
[info] - SPARK-33587: isFatalError (234 milliseconds)
[info] SerDeUtilSuite:
[info] - Converting an empty pair RDD to python does not throw an exception (SPARK-5441) (72 milliseconds)
[info] - Converting an empty python RDD to pair RDD does not throw an exception (SPARK-5441) (69 milliseconds)
[info] UtilsSuite:
[info] - timeConversion (4 milliseconds)
[info] - Test byteString conversion (5 milliseconds)
[info] - bytesToString (2 milliseconds)
[info] - copyStream (8 milliseconds)
[info] - copyStreamUpTo (18 milliseconds)
[info] - memoryStringToMb (1 millisecond)
[info] - splitCommandString (1 millisecond)
[info] - string formatting of time durations (2 milliseconds)
[info] - reading offset bytes of a file (11 milliseconds)
[info] - reading offset bytes of a file (compressed) (15 milliseconds)
[info] - reading offset bytes across multiple files (15 milliseconds)
[info] - reading offset bytes across multiple files (compressed) (24 milliseconds)
[info] - deserialize long value (0 milliseconds)
[info] - writeByteBuffer should not change ByteBuffer position (1 millisecond)
[info] - get iterator size (1 millisecond)
[info] - getIteratorZipWithIndex (1 millisecond)
[info] - doesDirectoryContainFilesNewerThan (19 milliseconds)
[info] - resolveURI (2 milliseconds)
[info] - resolveURIs with multiple paths (2 milliseconds)
[info] - nonLocalPaths (6 milliseconds)
[info] - isBindCollision (2 milliseconds)
[info] - log4j log level change (1 millisecond)
[info] - deleteRecursively (50 milliseconds)
[info] - loading properties from file (24 milliseconds)
[info] - timeIt with prepare (2 seconds, 3 milliseconds)
[info] - fetch hcfs dir (29 milliseconds)
[info] - shutdown hook manager (4 milliseconds)
[info] - isInDirectory (3 milliseconds)
[info] - circular buffer: if nothing was written to the buffer, display nothing (0 milliseconds)
[info] - circular buffer: if the buffer isn't full, print only the contents written (1 millisecond)
[info] - circular buffer: data written == size of the buffer (1 millisecond)
[info] - circular buffer: multiple overflow (1 millisecond)
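The circular-buffer cases above describe a fixed-size window over written data: nothing written displays nothing, a partial fill prints only what was written, and overflow keeps the most recent contents. An illustrative re-creation, not Spark's org.apache.spark.util.CircularBuffer:

// Illustrative fixed-size character ring mirroring the behavior above.
class RingBuffer(size: Int) {
  private val buf = new Array[Char](size)
  private var written = 0L

  def write(s: String): Unit = s.foreach { c =>
    buf((written % size).toInt) = c
    written += 1
  }

  override def toString: String =
    if (written == 0L) "" // nothing written: display nothing
    else if (written <= size) new String(buf, 0, written.toInt) // partial fill
    else { // one or more overflows: keep only the most recent `size` chars
      val start = (written % size).toInt
      new String(buf, start, size - start) + new String(buf, 0, start)
    }
}

// val b = new RingBuffer(4); b.write("abcdefg"); b.toString == "defg"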
[info] - isDynamicAllocationEnabled (0 milliseconds)
[info] - getDynamicAllocationInitialExecutors (3 milliseconds)
[info] - Set Spark CallerContext (0 milliseconds)
[info] - encodeFileNameToURIRawPath (0 milliseconds)
[info] - decodeFileNameInURI (1 millisecond)
[info] - caching in memory, serialized, replicated (encryption = on) (with replication as stream) (7 seconds, 722 milliseconds)
[info] - Kill process (5 seconds, 58 milliseconds)
[info] - chi square test of randomizeInPlace (30 milliseconds)
[info] - redact sensitive information (2 milliseconds)
[info] - redact sensitive information in command line args (3 milliseconds)
[info] - redact sensitive information in sequence of key value pairs (1 millisecond)
[info] - tryWithSafeFinally (5 milliseconds)
[info] - tryWithSafeFinallyAndFailureCallbacks (7 milliseconds)
[info] - load extensions (7 milliseconds)
[info] - check Kubernetes master URL (3 milliseconds)
[info] - stringHalfWidth (0 milliseconds)
[info] - trimExceptCRLF standalone (4 milliseconds)
[info] - pathsToMetadata (1 millisecond)
[info] - checkHost supports both IPV4 and IPV6 (3 milliseconds)
[info] - checkHostPort supports IPV6 and IPV4 (2 milliseconds)
[info] - parseHostPort supports IPV6 and IPV4 (0 milliseconds)
[info] - executorOffHeapMemorySizeAsMb when MEMORY_OFFHEAP_ENABLED is false (0 milliseconds)
[info] - executorOffHeapMemorySizeAsMb when MEMORY_OFFHEAP_ENABLED is true (1 millisecond)
[info] - executorMemoryOverhead when MEMORY_OFFHEAP_ENABLED is true but MEMORY_OFFHEAP_SIZE is not configured (1 millisecond)
[info] - isPushBasedShuffleEnabled when both PUSH_BASED_SHUFFLE_ENABLED and SHUFFLE_SERVICE_ENABLED are true (1 millisecond)
[info] PagedDataSourceSuite:
[info] - basic (3 milliseconds)
[info] CheckpointStorageSuite:
[info] - checkpoint compression (298 milliseconds)
[info] - cache checkpoint preferred location (195 milliseconds)
[info] - SPARK-31484: checkpoint should not fail in retry (790 milliseconds)
[info] SortingSuite:
[info] - sortByKey (62 milliseconds)
[info] - large array (66 milliseconds)
[info] - large array with one split (46 milliseconds)
[info] - large array with many partitions (129 milliseconds)
[info] - sort descending (81 milliseconds)
[info] - sort descending with one split (58 milliseconds)
[info] - sort descending with many partitions (169 milliseconds)
[info] - more partitions than elements (167 milliseconds)
[info] - empty RDD (45 milliseconds)
[info] - partition balancing (118 milliseconds)
[info] - partition balancing for descending sort (117 milliseconds)
[info] - get a range of elements in a sorted RDD that is on one partition (82 milliseconds)
[info] - get a range of elements over multiple partitions in a descendingly sorted RDD (141 milliseconds)
[info] - get a range of elements in an array not partitioned by a range partitioner (36 milliseconds)
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:187)
	at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:334)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:893)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2196)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2217)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2236)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2261)
	at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
	at org.apache.spark.streaming.StreamingContextSuite.$anonfun$new$133(StreamingContextSuite.scala:841)
	at org.apache.spark.streaming.StreamingContextSuite.$anonfun$new$133$adapted(StreamingContextSuite.scala:839)
	at org.apache.spark.streaming.dstream.DStream.$anonfun$foreachRDD$2(DStream.scala:629)
	at org.apache.spark.streaming.dstream.DStream.$anonfun$foreachRDD$2$adapted(DStream.scala:629)
	at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$2(ForEachDStream.scala:51)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:417)
	at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$1(ForEachDStream.scala:51)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.util.Try$.apply(Try.scala:213)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.$anonfun$run$1(JobScheduler.scala:256)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
[info] - get a range of elements over multiple partitions but not taking up full partitions (117 milliseconds)
[info] RpcAddressSuite:
[info] - hostPort (0 milliseconds)
[info] - fromSparkURL (0 milliseconds)
[info] - fromSparkURL: a typo url (1 millisecond)
[info] - fromSparkURL: invalid scheme (1 millisecond)
[info] - toSparkURL (0 milliseconds)
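The RpcAddressSuite cases above cover round-tripping spark://host:port URLs and rejecting malformed ones or bad schemes. A small sketch with the same shape, using java.net.URI; field and method names are assumptions, not Spark's RpcAddress:

// Hedged sketch of spark:// URL round-tripping.
final case class Address(host: String, port: Int) {
  def hostPort: String = s"$host:$port"
  def toSparkURL: String = s"spark://$hostPort"
}

object Address {
  def fromSparkURL(url: String): Address = {
    val uri = new java.net.URI(url) // malformed URLs throw here
    require(uri.getScheme == "spark", s"invalid scheme in $url")
    require(uri.getHost != null && uri.getPort != -1, s"invalid host/port in $url")
    Address(uri.getHost, uri.getPort)
  }
}

// Address.fromSparkURL("spark://1.2.3.4:1234") == Address("1.2.3.4", 1234)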
[info] JavaSerializerSuite:
[info] - JavaSerializer instances are serializable (0 milliseconds)
[info] - Deserialize object containing a primitive Class as attribute (6 milliseconds)
[info] LocalDirsSuite:
[info] - Utils.getLocalDir() returns a valid directory, even if some local dirs are missing (6 milliseconds)
[info] - SPARK_LOCAL_DIRS override also affects driver (6 milliseconds)
[info] - Utils.getLocalDir() throws an exception if any temporary directory cannot be retrieved (8 milliseconds)
[info] - SPARK-18560 Receiver data should be deserialized properly. (13 seconds, 951 milliseconds)
[info] TaskContextSuite:
[info] - provide metrics sources (247 milliseconds)
[info] - calls TaskCompletionListener after failure (86 milliseconds)
[info] - calls TaskFailureListeners after failure (91 milliseconds)
[info] - all TaskCompletionListeners should be called even if some fail (9 milliseconds)
[info] - all TaskFailureListeners should be called even if some fail (9 milliseconds)
[info] - TaskContext.attemptNumber should return attempt number, not task id (SPARK-4014) (110 milliseconds)
[info] - SPARK-22955 graceful shutdown shouldn't lead to job generation error (1 second, 39 milliseconds)
[info] DStreamClosureSuite:
[info] - user provided closures are actually cleaned (116 milliseconds)
[info] BatchedWriteAheadLogSuite:
[info] - TaskContext.stageAttemptNumber getter (633 milliseconds)
[info] - BatchedWriteAheadLog - read all logs (86 milliseconds)
[info] - accumulators are updated on exception failures (222 milliseconds)
[info] - failed tasks collect only accumulators whose values count during failures (229 milliseconds)
[info] - BatchedWriteAheadLog - write logs (469 milliseconds)
[info] - only updated internal accumulators will be sent back to driver (180 milliseconds)
[info] - localProperties are propagated to executors correctly (217 milliseconds)
[info] - immediately call a completion listener if the context is completed (1 millisecond)
[info] - immediately call a failure listener if the context has failed (1 millisecond)
[info] - TaskCompletionListenerException.getMessage should include previousError (1 millisecond)
[info] - all TaskCompletionListeners should be called even if some fail or a task (4 milliseconds)
[info] - BatchedWriteAheadLog - read all logs after write (360 milliseconds)
[info] HistoryServerSuite:
[info] - BatchedWriteAheadLog - clean old logs (253 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs synchronously (244 milliseconds)
[info] - caching on disk, replicated 2 (encryption = off) (6 seconds, 985 milliseconds)
[info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (473 milliseconds)
[info] - BatchedWriteAheadLog - do not create directories or files unless write (15 milliseconds)
[info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (48 milliseconds)
[info] - BatchedWriteAheadLog - serializing and deserializing batched records (2 milliseconds)
[info] - BatchedWriteAheadLog - failures in wrappedLog get bubbled up (99 milliseconds)
[info] - BatchedWriteAheadLog - name log with the highest timestamp of aggregated entries (24 milliseconds)
[info] - BatchedWriteAheadLog - shutdown properly (1 millisecond)
[info] - BatchedWriteAheadLog - fail everything in queue during shutdown (5 milliseconds)
[info] ReceiverTrackerSuite:
[info] - application list json (1 second, 419 milliseconds)
[info] - send rate update to receivers (441 milliseconds)
[info] - completed app list json (51 milliseconds)
[info] - running app list json (10 milliseconds)
[info] - minDate app list json (13 milliseconds)
[info] - maxDate app list json (10 milliseconds)
[info] - maxDate2 app list json (10 milliseconds)
[info] - minEndDate app list json (13 milliseconds)
[info] - maxEndDate app list json (11 milliseconds)
[info] - minEndDate and maxEndDate app list json (9 milliseconds)
[info] - minDate and maxEndDate app list json (9 milliseconds)
[info] - limit app list json (7 milliseconds)
[info] - one app json (52 milliseconds)
[info] - one app multi-attempt json (7 milliseconds)
[info] - job list json (334 milliseconds)
[info] - job list from multi-attempt app json(1) (190 milliseconds)
[info] - should restart receiver after stopping it (801 milliseconds)
[info] - job list from multi-attempt app json(2) (180 milliseconds)
[info] - one job json (10 milliseconds)
[info] - succeeded job list json (14 milliseconds)
[info] - succeeded&failed job list json (13 milliseconds)
[info] - executor list json (35 milliseconds)
[info] - executor list with executor metrics json (382 milliseconds)
[info] - SPARK-11063: TaskSetManager should use Receiver RDD's preferredLocations (624 milliseconds)
[info] - stage list json (104 milliseconds)
[info] - complete stage list json (16 milliseconds)
[info] - failed stage list json (26 milliseconds)
[info] - one stage json (86 milliseconds)
[info] - one stage attempt json (46 milliseconds)
[info] - stage task summary w shuffle write (410 milliseconds)
[info] - stage task summary w shuffle read (34 milliseconds)
[info] - stage task summary w/ custom quantiles (43 milliseconds)
[info] - stage task list (69 milliseconds)
[info] - stage task list w/ offset & length (57 milliseconds)
[info] - get allocated executors (839 milliseconds)
[info] StateMapSuite:
[info] - EmptyStateMap (1 millisecond)
[info] - stage task list w/ sortBy (35 milliseconds)
[info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove (22 milliseconds)
[info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove with copy (3 milliseconds)
[info] - stage task list w/ sortBy short names: -runtime (30 milliseconds)
[info] - stage task list w/ sortBy short names: runtime (20 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing (51 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing with compaction (9 milliseconds)
[info] - stage task list w/ status (315 milliseconds)
[info] - stage task list w/ status & offset & length (63 milliseconds)
[info] - stage task list w/ status & sortBy short names: runtime (30 milliseconds)
[info] - stage list with accumulable json (47 milliseconds)
[info] - stage with accumulable json (44 milliseconds)
[info] - stage task list from multi-attempt app json(1) (14 milliseconds)
[info] - stage task list from multi-attempt app json(2) (40 milliseconds)
[info] - excludeOnFailure for stage (300 milliseconds)
[info] - excludeOnFailure node for stage (331 milliseconds)
[info] - rdd list storage json (22 milliseconds)
[info] - executor node excludeOnFailure (240 milliseconds)
[info] - executor node excludeOnFailure unexcluding (10 milliseconds)
[info] - executor memory usage (19 milliseconds)
[info] - executor resource information (193 milliseconds)
[info] - multiple resource profiles (291 milliseconds)
[info] - stage list with peak metrics (509 milliseconds)
[info] - stage with peak metrics (58 milliseconds)
[info] - app environment (120 milliseconds)
[info] - one rdd storage json (13 milliseconds)
[info] - download all logs for app with multiple attempts (77 milliseconds)
[info] - download one log for app with multiple attempts (74 milliseconds)
[info] - response codes on bad paths (56 milliseconds)
[info] - automatically retrieve uiRoot from request through Knox (85 milliseconds)
[info] - static relative links are prefixed with uiRoot (spark.ui.proxyBase) (7 milliseconds)
[info] - /version api endpoint (18 milliseconds)
[info] - security manager starts with spark.authenticate set (23 milliseconds)
[info] - caching on disk, replicated 2 (encryption = off) (with replication as stream) (6 seconds, 706 milliseconds)
[info] - OpenHashMapBasedStateMap - all possible sequences of operations with copies  (10 seconds, 285 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing with KryoSerializable states (17 milliseconds)
[info] - EmptyStateMap - serializing and deserializing (42 milliseconds)
[info] - MapWithStateRDDRecord - serializing and deserializing with KryoSerializable states (12 milliseconds)
[info] RecurringTimerSuite:
[info] - basic (15 milliseconds)
[info] - SPARK-10224: call 'callback' after stopping (33 milliseconds)
[info] CheckpointSuite:
[info] - non-existent checkpoint dir (3 milliseconds)
[info] - caching on disk, replicated 2 (encryption = on) (7 seconds, 57 milliseconds)
[info] - incomplete apps get refreshed (10 seconds, 344 milliseconds)
[info] - ui and api authorization checks (1 second, 133 milliseconds)
[info] - SPARK-33215: speed up event log download by skipping UI rebuild (527 milliseconds)
[info] - access history application defaults to the last attempt id (532 milliseconds)
[info] - SPARK-31697: HistoryServer should set Content-Type (4 milliseconds)
[info] - Redirect to the root page when accessed to /history/ (2 milliseconds)
[info] NextIteratorSuite:
[info] - one iteration (2 milliseconds)
[info] - two iterations (0 milliseconds)
[info] - empty iteration (0 milliseconds)
[info] - close is called once for empty iterations (0 milliseconds)
[info] - close is called once for non-empty iterations (0 milliseconds)
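NextIteratorSuite pins down the contract that a one-element-lookahead iterator closes its resource exactly once, including for empty iterations. A sketch of that pattern; an illustrative re-creation, not org.apache.spark.util.NextIterator:

// One-element-lookahead iterator that closes its resource exactly once.
abstract class SimpleNextIterator[U] extends Iterator[U] {
  private var gotNext = false
  private var closed = false
  private var nextValue: U = _
  protected var finished = false

  protected def getNext(): U   // must set finished = true when exhausted
  protected def close(): Unit  // release the underlying resource

  override def hasNext: Boolean = {
    if (!finished && !gotNext) {
      nextValue = getNext()
      gotNext = true
      if (finished && !closed) { closed = true; close() } // close exactly once
    }
    !finished
  }

  override def next(): U = {
    if (!hasNext) throw new NoSuchElementException("end of stream")
    gotNext = false
    nextValue
  }
}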
[info] ParallelCollectionSplitSuite:
[info] - one element per slice (1 millisecond)
[info] - one slice (0 milliseconds)
[info] - equal slices (1 millisecond)
[info] - non-equal slices (0 milliseconds)
[info] - splitting exclusive range (0 milliseconds)
[info] - splitting inclusive range (1 millisecond)
[info] - empty data (1 millisecond)
[info] - zero slices (1 millisecond)
[info] - negative number of slices (1 millisecond)
[info] - exclusive ranges sliced into ranges (2 milliseconds)
[info] - inclusive ranges sliced into ranges (1 millisecond)
[info] - identical slice sizes between Range and NumericRange (2 milliseconds)
[info] - identical slice sizes between List and NumericRange (1 millisecond)
[info] - large ranges don't overflow (2 milliseconds)
[info] - random array tests (188 milliseconds)
[info] - random exclusive range tests (12 milliseconds)
[info] - random inclusive range tests (8 milliseconds)
[info] - exclusive ranges of longs (2 milliseconds)
[info] - inclusive ranges of longs (1 millisecond)
[info] - exclusive ranges of doubles (2 milliseconds)
[info] - inclusive ranges of doubles (1 millisecond)
[info] - inclusive ranges with Int.MaxValue and Int.MinValue (1 millisecond)
[info] - empty ranges with Int.MaxValue and Int.MinValue (1 millisecond)
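ParallelCollectionSplitSuite checks that a collection is cut into near-equal slices with every element appearing exactly once, that non-positive slice counts fail, and that large ranges do not overflow Int arithmetic. A minimal slicing function with those properties; illustrative, not Spark's ParallelCollectionRDD:

// Slice a collection into numSlices near-equal pieces.
def slice[T](seq: Seq[T], numSlices: Int): Seq[Seq[T]] = {
  require(numSlices > 0, "number of slices must be positive")
  val n = seq.length
  (0 until numSlices).map { i =>
    val start = (i * n.toLong / numSlices).toInt       // Long math: no overflow
    val end = ((i + 1) * n.toLong / numSlices).toInt
    seq.slice(start, end)
  }
}

// slice(Seq(1, 2, 3, 4, 5), 2) yields slices (1, 2) and (3, 4, 5)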
[info] UISeleniumSuite:
[info] - caching on disk, replicated 2 (encryption = on) (with replication as stream) (6 seconds, 398 milliseconds)
[info] - basic rdd checkpoints + dstream graph checkpoint recovery (7 seconds, 177 milliseconds)
[info] - recovery of conf through checkpoints (204 milliseconds)
[info] - get correct spark.driver.[host|port] from checkpoint (197 milliseconds)
[info] - SPARK-30199 get ui port and blockmanager port (289 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

[info] - effects of unpersist() / persist() should be reflected (3 seconds, 21 milliseconds)
-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with map and reduceByKey operations (704 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 4500 ms
-------------------------------------------
(a,4)

[info] - failed stages should not appear to be active (1 second, 708 milliseconds)
-------------------------------------------
Time: 5000 ms
-------------------------------------------
(a,4)

[info] - recovery with invertible reduceByKeyAndWindow operation (1 second, 597 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

[info] - spark.ui.killEnabled should properly control kill button display (1 second, 736 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with saveAsHadoopFiles operation (1 second, 804 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)

[info] - jobs page should not display job group name unless some job was submitted in a job group (1 second, 527 milliseconds)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

[info] - caching on disk, replicated 3 (encryption = off) (6 seconds, 314 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with saveAsNewAPIHadoopFiles operation (1 second, 600 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(b,1)
(a,2)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 1500 ms
-------------------------------------------

[info] - job progress bars should handle stage / task failures (1 second, 384 milliseconds)
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(b,1)
(a,2)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with saveAsHadoopFile inside transform operation (1 second, 703 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)

[info] - job details page should display useful information for stages that haven't started (962 milliseconds)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,5)

-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,6)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)

[info] - job progress bars / cells reflect skipped stages / tasks (902 milliseconds)
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,8)

-------------------------------------------
Time: 4500 ms
-------------------------------------------
(a,9)

-------------------------------------------
Time: 5000 ms
-------------------------------------------
(a,10)

[info] - recovery with updateStateByKey operation (1 second, 556 milliseconds)
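The batch printouts above show a per-key running count, (a,1) through (a,10), that survives checkpoint recovery. updateStateByKey takes an update function of shape (Seq[V], Option[S]) => Option[S]; here is a standalone version of the counting function, driven by plain Scala rather than a StreamingContext:

// Running-count update function in the shape updateStateByKey expects.
def updateCount(newValues: Seq[Int], state: Option[Int]): Option[Int] =
  Some(newValues.sum + state.getOrElse(0))

// Four batches of a single "a" each, as in the printout above:
val counts = (1 to 4).scanLeft(Option.empty[Int]) { (st, _) =>
  updateCount(Seq(1), st)
}
// counts.flatten == Vector(1, 2, 3, 4)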
[info] - stages that aren't run appear as 'skipped stages' after a job finishes (776 milliseconds)
[info] - jobs with stages that are skipped should show correct link descriptions on all jobs page (667 milliseconds)
[info] - attaching and detaching a new tab (887 milliseconds)
[info] - kill stage POST/GET response is correct (298 milliseconds)
[info] - kill job POST/GET response is correct (241 milliseconds)
[info] - recovery maintains rate controller (2 seconds, 776 milliseconds)
[info] - caching on disk, replicated 3 (encryption = off) (with replication as stream) (6 seconds, 266 milliseconds)
[info] - Loop with source PageRank with checkpoint (1 minute, 6 seconds)
[info] - stage & job retention (2 seconds, 616 milliseconds)
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:187)
	at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:334)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:893)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2196)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2217)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2236)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2261)
	at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
	at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:99)
	at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:98)
	at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$2(ForEachDStream.scala:51)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:417)
	at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$1(ForEachDStream.scala:51)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.util.Try$.apply(Try.scala:213)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.$anonfun$run$1(JobScheduler.scala:256)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
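This trace is a shutdown artifact rather than a test failure: the streaming job executor thread is blocked in ThreadUtils.awaitReady on a job result when teardown interrupts it. A self-contained sketch of the same failure mode, assuming nothing beyond the Scala standard library:

    // Hedged illustration of the trace above: a thread blocked awaiting a
    // Promise (as DAGScheduler.runJob does via awaitReady) is interrupted
    // during shutdown and surfaces InterruptedException.
    import scala.concurrent.{Await, Promise}
    import scala.concurrent.duration.Duration

    object InterruptedAwaitSketch extends App {
      val never = Promise[Unit]().future // a result that never arrives
      val waiter = new Thread(() => {
        try Await.ready(never, Duration.Inf)
        catch { case e: InterruptedException => println(s"interrupted: $e") }
      })
      waiter.start()
      Thread.sleep(100)  // let the waiter block
      waiter.interrupt() // as teardown does to streaming-job-executor-0
      waiter.join()
    }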
[info] - live UI json application list (492 milliseconds)
[info] - job stages should have expected dotfile under DAG visualization (289 milliseconds)
[info] - recovery with file input stream (3 seconds, 436 milliseconds)
[info] - DStreamCheckpointData.restore invoking times (438 milliseconds)
[info] - stages page should show skipped stages (1 second, 427 milliseconds)
[info] - recovery from checkpoint contains array object (983 milliseconds)
[info] - SPARK-11267: the race condition of two checkpoints in a batch (143 milliseconds)
[info] - Staleness of Spark UI should not last minutes or hours (522 milliseconds)
[info] - SPARK-28912: Fix MatchError in getCheckpointFiles (30 milliseconds)
[info] - SPARK-6847: stack overflow when updateStateByKey is followed by a checkpointed dstream (509 milliseconds)
[info] - description for empty jobs (586 milliseconds)
[info] FailureSuite:
[info] HadoopDelegationTokenManagerSuite:
[info] - default configuration (16 milliseconds)
[info] - disable hadoopfs credential provider (2 milliseconds)
[info] - using deprecated configurations (3 milliseconds)
[info] - caching on disk, replicated 3 (encryption = on) (5 seconds, 960 milliseconds)
[info] - SPARK-29082: do not fail if current user does not have credentials (1 second, 202 milliseconds)
[info] RollingEventLogFilesWriterSuite:
[info] - create EventLogFileWriter with enable/disable rolling (69 milliseconds)
[info] - initialize, write, stop - with codec None (57 milliseconds)
[info] - initialize, write, stop - with codec Some(lz4) (51 milliseconds)
[info] - initialize, write, stop - with codec Some(lzf) (48 milliseconds)
[info] - initialize, write, stop - with codec Some(snappy) (71 milliseconds)
[info] - initialize, write, stop - with codec Some(zstd) (45 milliseconds)
[info] - spark.eventLog.compression.codec overrides spark.io.compression.codec (18 milliseconds)
[info] - Event log names (1 millisecond)
[info] - Log overwriting (47 milliseconds)
[info] - rolling event log files - codec None (304 milliseconds)
[info] - rolling event log files - codec Some(lz4) (272 milliseconds)
[info] - rolling event log files - codec Some(lzf) (360 milliseconds)
[info] - Loop with sink PageRank (6 seconds, 645 milliseconds)
[info] - rolling event log files - codec Some(snappy) (390 milliseconds)
[info] - rolling event log files - codec Some(zstd) (394 milliseconds)
[info] - rolling event log files - the max size of event log file size less than lower limit (20 milliseconds)
[info] RandomBlockReplicationPolicyBehavior:
[info] - block replication - random block replication policy (10 milliseconds)
[info] RDDCleanerSuite:
[info] - RDD shuffle cleanup standalone (200 milliseconds)
[info] LocalDiskShuffleMapOutputWriterSuite:
[info] - writing to an outputstream (5 milliseconds)
[info] - writing to a channel (5 milliseconds)
[info] TestMemoryManagerSuite:
[info] - tracks allocated execution memory by task (3 milliseconds)
[info] - markconsequentOOM (1 millisecond)
[info] ResourceProfileManagerSuite:
[info] - ResourceProfileManager (3 milliseconds)
[info] - isSupported yarn no dynamic allocation (2 milliseconds)
[info] - isSupported yarn with dynamic allocation (1 millisecond)
[info] - isSupported k8s with dynamic allocation (1 millisecond)
[info] - isSupported with local mode (1 millisecond)
[info] - ResourceProfileManager has equivalent profile (141 milliseconds)
[info] ExecutorRunnerTest:
[info] - command includes appId (34 milliseconds)
[info] BlockTransferServiceSuite:
[info] - fetchBlockSync should not hang when BlockFetchingListener.onBlockFetchSuccess fails (6 milliseconds)
[info] EventLoggingListenerSuite:
[info] - Basic event logging with compression (347 milliseconds)
[info] - caching on disk, replicated 3 (encryption = on) (with replication as stream) (6 seconds, 15 milliseconds)
[info] - End-to-end event logging (5 seconds, 563 milliseconds)
[info] - caching in memory and disk, replicated (encryption = off) (6 seconds, 316 milliseconds)
[info] - Loop with sink PageRank with checkpoint (13 seconds, 893 milliseconds)
[info] EdgeRDDSuite:
[info] - cache, getStorageLevel (95 milliseconds)
[info] - checkpointing (270 milliseconds)
[info] - count (159 milliseconds)
[info] GraphOpsSuite:
[info] - joinVertices (293 milliseconds)
[info] - collectNeighborIds (509 milliseconds)
[info] - removeSelfEdges (265 milliseconds)
[info] - filter (325 milliseconds)
[info] - convertToCanonicalEdges (286 milliseconds)
[info] - collectEdgesCycleDirectionOut (430 milliseconds)
[info] - caching in memory and disk, replicated (encryption = off) (with replication as stream) (6 seconds, 413 milliseconds)
[info] - collectEdgesCycleDirectionIn (267 milliseconds)
[info] - collectEdgesCycleDirectionEither (227 milliseconds)
[info] - collectEdgesChainDirectionOut (236 milliseconds)
[info] - collectEdgesChainDirectionIn (285 milliseconds)
[info] - collectEdgesChainDirectionEither (268 milliseconds)
[info] VertexPartitionSuite:
[info] - isDefined, filter (10 milliseconds)
[info] - map (1 millisecond)
[info] - diff (3 milliseconds)
[info] - leftJoin (5 milliseconds)
[info] - innerJoin (3 milliseconds)
[info] - createUsingIndex (0 milliseconds)
[info] - innerJoinKeepLeft (0 milliseconds)
[info] - aggregateUsingIndex (1 millisecond)
[info] - reindex (2 milliseconds)
[info] - serialization (26 milliseconds)
[info] GraphGeneratorsSuite:
[info] - GraphGenerators.generateRandomEdges (4 milliseconds)
[info] - GraphGenerators.sampleLogNormal (8 milliseconds)
[info] - GraphGenerators.logNormalGraph (305 milliseconds)
[info] - SPARK-5064 GraphGenerators.rmatGraph numEdges upper bound (179 milliseconds)
[info] VertexRDDSuite:
[info] - filter (322 milliseconds)
[info] - mapValues (274 milliseconds)
[info] - minus (261 milliseconds)
[info] - minus with RDD[(VertexId, VD)] (191 milliseconds)
[info] - minus with non-equal number of partitions (278 milliseconds)
[info] - diff (422 milliseconds)
[info] - diff with RDD[(VertexId, VD)] (395 milliseconds)
[info] - diff vertices with non-equal number of partitions (276 milliseconds)
[info] - leftJoin (359 milliseconds)
[info] - leftJoin vertices with non-equal number of partitions (205 milliseconds)
[info] - innerJoin (286 milliseconds)
[info] - innerJoin vertices with the non-equal number of partitions (283 milliseconds)
[info] - aggregateUsingIndex (256 milliseconds)
[info] - mergeFunc (145 milliseconds)
[info] - cache, getStorageLevel (93 milliseconds)
[info] - checkpoint (417 milliseconds)
[info] - count (230 milliseconds)
[info] GraphLoaderSuite:
[info] - caching in memory and disk, replicated (encryption = on) (6 seconds, 464 milliseconds)
[info] - GraphLoader.edgeListFile (467 milliseconds)
[info] StronglyConnectedComponentsSuite:
[info] - Island Strongly Connected Components (436 milliseconds)
[info] - Cycle Strongly Connected Components (1 second, 579 milliseconds)
[info] - 2 Cycle Strongly Connected Components (1 second, 181 milliseconds)
[info] TriangleCountSuite:
[info] - Count a single triangle (319 milliseconds)
[info] - Count two triangles (330 milliseconds)
[info] - Count two triangles with bi-directed edges (347 milliseconds)
[info] - Count a single triangle with duplicate edges (526 milliseconds)
[info] EdgePartitionSuite:
[info] - reverse (3 milliseconds)
[info] - map (3 milliseconds)
[info] - filter (3 milliseconds)
[info] - groupEdges (2 milliseconds)
[info] - innerJoin (3 milliseconds)
[info] - isActive, numActives, replaceActives (1 millisecond)
[info] - tripletIterator (1 millisecond)
[info] - serialization (16 milliseconds)
[info] EdgeSuite:
[info] - compare (1 millisecond)
[info] LabelPropagationSuite:
[info] - End-to-end event logging with compression (21 seconds, 785 milliseconds)
[info] - Event logging with password redaction (22 milliseconds)
[info] - Spark-33504 sensitive attributes redaction in properties (47 milliseconds)
[info] - Executor metrics update (103 milliseconds)
[info] - caching in memory and disk, replicated (encryption = on) (with replication as stream) (5 seconds, 792 milliseconds)
[info] - SPARK-31764: isBarrier should be logged in event log (266 milliseconds)
[info] PluginContainerSuite:
[info] - plugin initialization and communication (131 milliseconds)
[info] - do nothing if plugins are not configured (1 millisecond)
[info] - merging of config options (48 milliseconds)
[info] - SPARK-33088: executor tasks trigger plugin calls (79 milliseconds)
[info] - SPARK-33088: executor failed tasks trigger plugin calls (97 milliseconds)
[info] - Label Propagation (1 second, 564 milliseconds)
[info] SVDPlusPlusSuite:
[info] - Test SVD++ with mean square error on training set (609 milliseconds)
[info] - Test SVD++ with no edges (210 milliseconds)
[info] GraphSuite:
[info] - Graph.fromEdgeTuples (239 milliseconds)
[info] - Graph.fromEdges (125 milliseconds)
[info] - Graph.apply (340 milliseconds)
[info] - triplets (328 milliseconds)
[info] - multiple failures with map (35 seconds, 540 milliseconds)
[info] - plugin initialization in non-local mode (3 seconds, 952 milliseconds)
[info] - caching in memory and disk, serialized, replicated (encryption = off) (6 seconds, 257 milliseconds)
[info] - partitionBy (3 seconds, 888 milliseconds)
[info] - mapVertices (300 milliseconds)
[info] - mapVertices changing type with same erased type (281 milliseconds)
[info] - mapEdges (166 milliseconds)
[info] - mapTriplets (298 milliseconds)
[info] - reverse (288 milliseconds)
[info] - reverse with join elimination (282 milliseconds)
[info] - subgraph (399 milliseconds)
[info] - mask (333 milliseconds)
[info] - groupEdges (441 milliseconds)
[info] - aggregateMessages (371 milliseconds)
[info] - outerJoinVertices (474 milliseconds)
[info] - more edge partitions than vertex partitions (185 milliseconds)
[info] - plugin initialization in non-local mode with resources (6 seconds, 80 milliseconds)
[info] DriverRunnerTest:
[info] - Process succeeds instantly (87 milliseconds)
[info] - checkpoint (455 milliseconds)
[info] - Process failing several times and then succeeding (30 milliseconds)
[info] - Process doesn't restart if not supervised (26 milliseconds)
[info] - Process doesn't restart if killed (30 milliseconds)
[info] - Reset of backoff counter (36 milliseconds)
[info] - cache, getStorageLevel (144 milliseconds)
[info] - Kill process finalized with state KILLED (37 milliseconds)
[info] - Finalized with state FINISHED (36 milliseconds)
[info] - Finalized with state FAILED (32 milliseconds)
[info] - Handle exception starting process (33 milliseconds)
[info] PrefixComparatorsSuite:
[info] - String prefix comparator (72 milliseconds)
[info] - Binary prefix comparator (18 milliseconds)
[info] - double prefix comparator handles NaNs properly (1 millisecond)
[info] - double prefix comparator handles negative NaNs properly (1 millisecond)
[info] - double prefix comparator handles other special values properly (1 millisecond)
[info] NettyBlockTransferSecuritySuite:
[info] - security default off (76 milliseconds)
[info] - non-default number of edge partitions (312 milliseconds)
[info] - security on same password (100 milliseconds)
[info] - security on mismatch password (68 milliseconds)
[info] - security mismatch auth off on server (76 milliseconds)
[info] - security mismatch auth off on client (107 milliseconds)
[info] - security with aes encryption (215 milliseconds)
[info] CommandUtilsSuite:
[info] - set libraryPath correctly (27 milliseconds)
[info] - unpersist graph RDD (635 milliseconds)
[info] - auth secret shouldn't appear in java opts (71 milliseconds)
[info] PairRDDFunctionsSuite:
[info] - SPARK-14219: pickRandomVertex (355 milliseconds)
[info] - aggregateByKey (153 milliseconds)
[info] - groupByKey (71 milliseconds)
[info] - groupByKey with duplicates (70 milliseconds)
[info] - groupByKey with negative key hash codes (63 milliseconds)
[info] - groupByKey with many output partitions (79 milliseconds)
[info] - caching in memory and disk, serialized, replicated (encryption = off) (with replication as stream) (7 seconds, 120 milliseconds)
[info] HashExpressionsSuite:
[info] - sampleByKey (6 seconds, 209 milliseconds)
[info] - md5 (2 seconds, 286 milliseconds)
[info] - caching in memory and disk, serialized, replicated (encryption = on) (5 seconds, 747 milliseconds)
[info] - sha1 (294 milliseconds)
[info] - sha2 (474 milliseconds)
[info] - crc32 (168 milliseconds)
[info] - hive-hash for null (4 milliseconds)
[info] - hive-hash for boolean (2 milliseconds)
[info] - hive-hash for byte (4 milliseconds)
[info] - hive-hash for short (1 millisecond)
[info] - hive-hash for int (1 millisecond)
[info] - hive-hash for long (2 milliseconds)
[info] - hive-hash for float (1 millisecond)
[info] - hive-hash for double (1 millisecond)
[info] - hive-hash for string (1 millisecond)
[info] - hive-hash for date type (9 milliseconds)
[info] - hive-hash for timestamp type (220 milliseconds)
[info] - hive-hash for CalendarInterval type (11 milliseconds)
[info] - hive-hash for array (2 milliseconds)
[info] - hive-hash for map (1 millisecond)
[info] - hive-hash for struct (2 milliseconds)
[info] - murmur3/xxHash64/hive hash: struct<null:null,boolean:boolean,byte:tinyint,short:smallint,int:int,long:bigint,float:float,double:double,bigDecimal:decimal(38,18),smallDecimal:decimal(10,0),string:string,binary:binary,date:date,timestamp:timestamp,udt:examplepoint> (3 seconds, 537 milliseconds)
[info] - SPARK-30633: xxHash64 with long seed: struct<null:null,boolean:boolean,byte:tinyint,short:smallint,int:int,long:bigint,float:float,double:double,bigDecimal:decimal(38,18),smallDecimal:decimal(10,0),string:string,binary:binary,date:date,timestamp:timestamp,udt:examplepoint> (716 milliseconds)
[info] - caching in memory and disk, serialized, replicated (encryption = on) (with replication as stream) (5 seconds, 985 milliseconds)
[info] - sampleByKeyExact (8 seconds, 582 milliseconds)
[info] - reduceByKey (37 milliseconds)
[info] - reduceByKey with collectAsMap (35 milliseconds)
[info] - reduceByKey with many output partitions (37 milliseconds)
[info] - reduceByKey with partitioner (45 milliseconds)
[info] - countApproxDistinctByKey (148 milliseconds)
[info] - join (39 milliseconds)
[info] - join all-to-all (34 milliseconds)
[info] - leftOuterJoin (41 milliseconds)
[info] - cogroup with empty RDD (36 milliseconds)
[info] - cogroup with groupByed RDD having 0 partitions (47 milliseconds)
[info] - cogroup between multiple RDD with an order of magnitude difference in number of partitions (10 milliseconds)
[info] - cogroup between multiple RDD with number of partitions similar in order of magnitude (8 milliseconds)
[info] - cogroup between multiple RDD when defaultParallelism is set without proper partitioner (5 milliseconds)
[info] - cogroup between multiple RDD when defaultParallelism is set with proper partitioner (6 milliseconds)
[info] - cogroup between multiple RDD when defaultParallelism is set; with huge number of partitions in upstream RDDs (6 milliseconds)
[info] - rightOuterJoin (46 milliseconds)
[info] - fullOuterJoin (31 milliseconds)
[info] - join with no matches (30 milliseconds)
[info] - join with many output partitions (35 milliseconds)
[info] - groupWith (65 milliseconds)
[info] - groupWith3 (43 milliseconds)
[info] - groupWith4 (72 milliseconds)
[info] - zero-partition RDD (70 milliseconds)
[info] - keys and values (37 milliseconds)
[info] - default partitioner uses partition size (10 milliseconds)
[info] - default partitioner uses largest partitioner (9 milliseconds)
[info] - subtract (48 milliseconds)
[info] - subtract with narrow dependency (58 milliseconds)
[info] - subtractByKey (24 milliseconds)
[info] - subtractByKey with narrow dependency (32 milliseconds)
[info] - foldByKey (30 milliseconds)
[info] - foldByKey with mutable result type (43 milliseconds)
[info] - saveNewAPIHadoopFile should call setConf if format is configurable (106 milliseconds)
[info] - The JobId on the driver and executors should be the same during the commit (48 milliseconds)
[info] - saveAsHadoopFile should respect configured output committers (53 milliseconds)
[info] - failure callbacks should be called before calling writer.close() in saveNewAPIHadoopFile (40 milliseconds)
[info] - failure callbacks should be called before calling writer.close() in saveAsHadoopFile (62 milliseconds)
[info] - saveAsNewAPIHadoopDataset should support invalid output paths when there are no files to be committed to an absolute output location (102 milliseconds)
[info] - saveAsHadoopDataset should respect empty output directory when there are no files to be committed to an absolute output location (45 milliseconds)
[info] - lookup (49 milliseconds)
[info] - lookup with partitioner (60 milliseconds)
[info] - lookup with bad partitioner (38 milliseconds)
[info] RBackendSuite:
[info] - close() clears jvmObjectTracker (3 milliseconds)
[info] PrimitiveVectorSuite:
[info] - primitive value (6 milliseconds)
[info] - non-primitive value (4 milliseconds)
[info] - ideal growth (4 milliseconds)
[info] - ideal size (2 milliseconds)
[info] - resizing (5 milliseconds)
[info] MetricsConfigSuite:
[info] - MetricsConfig with default properties (2 milliseconds)
[info] - MetricsConfig with properties set from a file (1 millisecond)
[info] - MetricsConfig with properties set from a Spark configuration (0 milliseconds)
[info] - MetricsConfig with properties set from a file and a Spark configuration (0 milliseconds)
[info] - MetricsConfig with subProperties (1 millisecond)
[info] PartiallySerializedBlockSuite:
[info] - valuesIterator() and finishWritingToStream() cannot be called after discard() is called (65 milliseconds)
[info] - discard() can be called more than once (1 millisecond)
[info] - cannot call valuesIterator() more than once (3 milliseconds)
[info] - cannot call finishWritingToStream() more than once (3 milliseconds)
[info] - cannot call finishWritingToStream() after valuesIterator() (1 millisecond)
[info] - cannot call valuesIterator() after finishWritingToStream() (3 milliseconds)
[info] - buffers are deallocated in a TaskCompletionListener (2 milliseconds)
[info] - basic numbers with discard() and numBuffered = 50 (6 milliseconds)
[info] - basic numbers with finishWritingToStream() and numBuffered = 50 (40 milliseconds)
[info] - basic numbers with valuesIterator() and numBuffered = 50 (5 milliseconds)
[info] - basic numbers with discard() and numBuffered = 0 (1 millisecond)
[info] - basic numbers with finishWritingToStream() and numBuffered = 0 (20 milliseconds)
[info] - basic numbers with valuesIterator() and numBuffered = 0 (2 milliseconds)
[info] - basic numbers with discard() and numBuffered = 1000 (25 milliseconds)
[info] - basic numbers with finishWritingToStream() and numBuffered = 1000 (22 milliseconds)
[info] - basic numbers with valuesIterator() and numBuffered = 1000 (26 milliseconds)
[info] - case classes with discard() and numBuffered = 50 (31 milliseconds)
[info] - case classes with finishWritingToStream() and numBuffered = 50 (155 milliseconds)
[info] - case classes with valuesIterator() and numBuffered = 50 (11 milliseconds)
[info] - case classes with discard() and numBuffered = 0 (1 millisecond)
[info] - case classes with finishWritingToStream() and numBuffered = 0 (171 milliseconds)
[info] - case classes with valuesIterator() and numBuffered = 0 (2 milliseconds)
[info] - case classes with discard() and numBuffered = 1000 (210 milliseconds)
[info] - case classes with finishWritingToStream() and numBuffered = 1000 (210 milliseconds)
[info] - case classes with valuesIterator() and numBuffered = 1000 (253 milliseconds)
[info] - empty iterator with discard() and numBuffered = 0 (2 milliseconds)
[info] - empty iterator with finishWritingToStream() and numBuffered = 0 (4 milliseconds)
[info] - empty iterator with valuesIterator() and numBuffered = 0 (2 milliseconds)
[info] SparkContextSchedulerCreationSuite:
[info] - bad-master (91 milliseconds)
[info] - local (121 milliseconds)
[info] - local-* (112 milliseconds)
[info] - local-n (64 milliseconds)
[info] - local-*-n-failures (74 milliseconds)
[info] - local-n-failures (82 milliseconds)
[info] - bad-local-n (64 milliseconds)
[info] - bad-local-n-failures (50 milliseconds)
[info] - local-default-parallelism (54 milliseconds)
[info] - local-cluster (265 milliseconds)
[info] SerializationDebuggerSuite:
[info] - primitives, strings, and nulls (2 milliseconds)
[info] - primitive arrays (1 millisecond)
[info] - non-primitive arrays (1 millisecond)
[info] - serializable object (1 millisecond)
[info] - nested arrays (0 milliseconds)
[info] - nested objects (1 millisecond)
[info] - cycles (should not loop forever) (0 milliseconds)
[info] - root object not serializable (1 millisecond)
[info] - array containing not serializable element (1 millisecond)
[info] - object containing not serializable field (1 millisecond)
[info] - externalizable class writing out not serializable object (1 millisecond)
[info] - externalizable class writing out serializable objects (0 milliseconds)
[info] - object containing writeReplace() which returns not serializable object (1 millisecond)
[info] - object containing writeReplace() which returns serializable object (0 milliseconds)
[info] - no infinite loop with writeReplace() which returns class of its own type (1 millisecond)
[info] - object containing writeObject() and not serializable field (2 milliseconds)
[info] - object containing writeObject() and serializable field (1 millisecond)
[info] - object of serializable subclass with more fields than superclass (SPARK-7180) (1 millisecond)
[info] - crazy nested objects (1 millisecond)
[info] - improveException (1 millisecond)
[info] - improveException with error in debugger (3 milliseconds)
[info] LoggingSuite:
[info] - spark-shell logging filter (2 milliseconds)
[info] NettyRpcHandlerSuite:
[info] - receive (95 milliseconds)
[info] - connectionTerminated (1 millisecond)
[info] SamplingUtilsSuite:
[info] - reservoirSampleAndCount (1 millisecond)
[info] - SPARK-18678 reservoirSampleAndCount with tiny input (5 milliseconds)
[info] - computeFraction (5 milliseconds)
[info] TimeStampedHashMapSuite:
[info] - HashMap - basic test (5 milliseconds)
[info] - TimeStampedHashMap - basic test (4 milliseconds)
[info] - TimeStampedHashMap - threading safety test (125 milliseconds)
[info] - TimeStampedHashMap - clearing by timestamp (33 milliseconds)
[info] RandomSamplerSuite:
[info] - utilities (9 milliseconds)
[info] - sanity check medianKSD against references (105 milliseconds)
[info] - bernoulli sampling (44 milliseconds)
[info] - bernoulli sampling without iterator (37 milliseconds)
[info] - bernoulli sampling with gap sampling optimization (69 milliseconds)
[info] - murmur3/xxHash64/hive hash: struct<arrayOfNull:array<null>,arrayOfString:array<string>,arrayOfArrayOfString:array<array<string>>,arrayOfArrayOfInt:array<array<int>>,arrayOfStruct:array<struct<str:string>>,arrayOfUDT:array<examplepoint>> (7 seconds, 543 milliseconds)
[info] - bernoulli sampling (without iterator) with gap sampling optimization (81 milliseconds)
[info] - bernoulli boundary cases (0 milliseconds)
[info] - bernoulli (without iterator) boundary cases (2 milliseconds)
[info] - bernoulli data types (78 milliseconds)
[info] - bernoulli clone (15 milliseconds)
[info] - bernoulli set seed (32 milliseconds)
[info] - replacement sampling (35 milliseconds)
[info] - replacement sampling without iterator (40 milliseconds)
[info] - compute without caching when no partitions fit in memory (7 seconds, 1 milliseconds)
[info] - replacement sampling with gap sampling (144 milliseconds)
[info] - replacement sampling (without iterator) with gap sampling (131 milliseconds)
[info] - replacement boundary cases (0 milliseconds)
[info] - replacement (without) boundary cases (1 millisecond)
[info] - replacement data types (85 milliseconds)
[info] - replacement clone (23 milliseconds)
[info] - replacement set seed (39 milliseconds)
[info] - bernoulli partitioning sampling (21 milliseconds)
[info] - bernoulli partitioning sampling without iterator (20 milliseconds)
[info] - bernoulli partitioning boundary cases (0 milliseconds)
[info] - bernoulli partitioning (without iterator) boundary cases (2 milliseconds)
[info] - bernoulli partitioning data (1 millisecond)
[info] - bernoulli partitioning clone (1 millisecond)
[info] ChunkedByteBufferOutputStreamSuite:
[info] - empty output (1 millisecond)
[info] - write a single byte (0 milliseconds)
[info] - write a single near boundary (0 milliseconds)
[info] - write a single at boundary (1 millisecond)
[info] - single chunk output (1 millisecond)
[info] - single chunk output at boundary size (1 millisecond)
[info] - multiple chunk output (1 millisecond)
[info] - multiple chunk output at boundary size (0 milliseconds)
[info] - SPARK-36464: size returns correct positive number even with over 2GB data (1 second, 334 milliseconds)
[info] ProcfsMetricsGetterSuite:
[info] - testGetProcessInfo (2 milliseconds)
[info] - SPARK-34845: partial metrics shouldn't be returned (26 milliseconds)
[info] GraphiteSinkSuite:
[info] - GraphiteSink with default MetricsFilter (34 milliseconds)
[info] - GraphiteSink with regex MetricsFilter (21 milliseconds)
[info] SparkSubmitUtilsSuite:
[info] - incorrect maven coordinate throws error (4 milliseconds)
[info] - create repo resolvers (102 milliseconds)
[info] - create additional resolvers (9 milliseconds)
:: loading settings :: url = jar:file:/home/jenkins/sparkivy/per-executor-caches/5/.cache/coursier/v1/https/maven-central.storage-download.googleapis.com/maven2/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
[info] - add dependencies works correctly (114 milliseconds)
[info] - excludes works correctly (11 milliseconds)
[info] - SPARK-30633: xxHash64 with long seed: struct<arrayOfNull:array<null>,arrayOfString:array<string>,arrayOfArrayOfString:array<array<string>>,arrayOfArrayOfInt:array<array<int>>,arrayOfStruct:array<struct<str:string>>,arrayOfUDT:array<examplepoint>> (4 seconds, 674 milliseconds)
[info] - ivy path works correctly (2 seconds, 222 milliseconds)
[info] - multiple failures with updateStateByKey (35 seconds, 519 milliseconds)
[info] WriteAheadLogUtilsSuite:
[info] - log selection and creation (43 milliseconds)
[info] - wrap WriteAheadLog in BatchedWriteAheadLog when batching is enabled (11 milliseconds)
[info] - batching is enabled by default in WriteAheadLog (1 millisecond)
[info] - closeFileAfterWrite is disabled by default in WriteAheadLog (1 millisecond)
[info] BlockGeneratorSuite:
[info] - block generation and data callbacks (44 milliseconds)
[info] - murmur3/xxHash64/hive hash: struct<structOfString:struct<str:string>,structOfStructOfString:struct<struct:struct<str:string>>,structOfArray:struct<array:array<string>>,structOfUDT:struct<udt:examplepoint>> (1 second, 839 milliseconds)
[info] - search for artifact at local repositories (1 second, 914 milliseconds)
[info] - stop ensures correct shutdown (236 milliseconds)
[info] - compute when only some partitions fit in memory (6 seconds, 278 milliseconds)
[info] - block push errors are reported (25 milliseconds)
[info] UISeleniumSuite:
[info] - dependency not found throws RuntimeException (308 milliseconds)
[info] - SPARK-30633: xxHash64 with long seed: struct<structOfString:struct<str:string>,structOfStructOfString:struct<struct:struct<str:string>>,structOfArray:struct<array:array<string>>,structOfUDT:struct<udt:examplepoint>> (601 milliseconds)
[info] - hive-hash for decimal (3 milliseconds)
[info] - neglects Spark and Spark's dependencies (818 milliseconds)
[info] - exclude dependencies end to end (542 milliseconds)
:: loading settings :: file = /home/jenkins/workspace/spark-branch-3.1-test-sbt-hadoop-3.2/target/tmp/ivy-570ace77-6f3f-462b-bfa7-d05593368bbd/ivysettings.xml
[info] - load ivy settings file (304 milliseconds)
[info] - SPARK-18207: Compute hash for a lot of expressions (1 second, 757 milliseconds)
[info] - SPARK-10878: test resolution files cleaned after resolving artifact (312 milliseconds)
[info] BasicEventFilterBuilderSuite:
[info] - track live jobs (10 milliseconds)
[info] - track live executors (1 millisecond)
[info] ImplicitOrderingSuite:
[info] - basic inference of Orderings (283 milliseconds)
[info] TaskMetricsSuite:
[info] - mutating values (1 millisecond)
[info] - mutating shuffle read metrics values (1 millisecond)
[info] - mutating shuffle write metrics values (1 millisecond)
[info] - mutating input metrics values (0 milliseconds)
[info] - mutating output metrics values (1 millisecond)
[info] - merging multiple shuffle read metrics (0 milliseconds)
[info] - additional accumulables (1 millisecond)
[info] ExternalShuffleServiceSuite:
[info] - groupByKey without compression (166 milliseconds)
[info] - passing environment variables to cluster (5 seconds, 584 milliseconds)
[info] - attaching and detaching a Streaming tab (6 seconds, 931 milliseconds)
[info] ExecutorAllocationManagerSuite:
[info] - basic functionality (308 milliseconds)
[info] - basic decommissioning (73 milliseconds)
[info] - requestExecutors policy (72 milliseconds)
[info] - killExecutor policy (20 milliseconds)
[info] - parameter validation (30 milliseconds)
[info] - shuffle non-zero block size (7 seconds, 185 milliseconds)
[info] - enabling and disabling (1 second, 371 milliseconds)
[info] StreamingJobProgressListenerSuite:
[info] - onBatchSubmitted, onBatchStarted, onBatchCompleted, onReceiverStarted, onReceiverError, onReceiverStopped (82 milliseconds)
[info] - Remove the old completed batches when exceeding the limit (95 milliseconds)
[info] - out-of-order onJobStart and onBatchXXX (98 milliseconds)
[info] - detect memory leak (196 milliseconds)
[info] ReceiverSuite:
[info] - receiver life cycle (338 milliseconds)
[info] - block generator throttling !!! IGNORED !!!
[info] - recover from node failures (6 seconds, 874 milliseconds)
[info] - shuffle serializer (6 seconds, 539 milliseconds)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@2228cb08 rejected from java.util.concurrent.ThreadPoolExecutor@c94c520[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 111]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379)
	at scala.concurrent.impl.Promise.transform(Promise.scala:33)
	at scala.concurrent.impl.Promise.transform$(Promise.scala:31)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379)
	at scala.concurrent.Future.map(Future.scala:292)
	at scala.concurrent.Future.map$(Future.scala:292)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379)
	at scala.concurrent.Future$.apply(Future.scala:659)
	at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:190)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110)
	at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299)
	at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271)
	at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@7d70ae06 rejected from java.util.concurrent.ThreadPoolExecutor@4b95c816[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 111]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379)
	at scala.concurrent.impl.Promise.transform(Promise.scala:33)
	at scala.concurrent.impl.Promise.transform$(Promise.scala:31)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379)
	at scala.concurrent.Future.map(Future.scala:292)
	at scala.concurrent.Future.map$(Future.scala:292)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379)
	at scala.concurrent.Future$.apply(Future.scala:659)
	at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:190)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110)
	at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299)
	at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271)
	at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@53b51efa rejected from java.util.concurrent.ThreadPoolExecutor@c94c520[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 111]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379)
	at scala.concurrent.impl.Promise.transform(Promise.scala:33)
	at scala.concurrent.impl.Promise.transform$(Promise.scala:31)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379)
	at scala.concurrent.Future.map(Future.scala:292)
	at scala.concurrent.Future.map$(Future.scala:292)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379)
	at scala.concurrent.Future$.apply(Future.scala:659)
	at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:203)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110)
	at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299)
	at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271)
	at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@753c7417 rejected from java.util.concurrent.ThreadPoolExecutor@4b95c816[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 111]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372)
	at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379)
	at scala.concurrent.impl.Promise.transform(Promise.scala:33)
	at scala.concurrent.impl.Promise.transform$(Promise.scala:31)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379)
	at scala.concurrent.Future.map(Future.scala:292)
	at scala.concurrent.Future.map$(Future.scala:292)
	at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379)
	at scala.concurrent.Future$.apply(Future.scala:659)
	at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:203)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129)
	at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110)
	at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299)
	at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271)
	at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112)
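The four traces above all show the same pattern: the write-ahead-log handler's thread pool has already terminated while the BlockGenerator is still pushing blocks during teardown, so the default AbortPolicy rejects the callback. A minimal standalone reproduction (names are illustrative, not Spark's):

    // Submitting a task to a ThreadPoolExecutor after it has shut down is
    // rejected by the default AbortPolicy with RejectedExecutionException,
    // exactly as in the storeBlock traces above.
    import java.util.concurrent.{Executors, RejectedExecutionException, TimeUnit}

    object RejectedExecutionSketch extends App {
      val pool = Executors.newFixedThreadPool(1)
      pool.shutdown()
      pool.awaitTermination(1, TimeUnit.SECONDS)
      try pool.execute(() => println("never runs"))
      catch {
        case e: RejectedExecutionException =>
          println(s"rejected as in the log: $e")
      }
    }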
[info] - SPARK-22284: Compute hash for nested structs (17 seconds, 913 milliseconds)
[info] - SPARK-30633: xxHash with different type seeds (345 milliseconds)
[info] - recover from repeated node failures during shuffle-map (9 seconds, 47 milliseconds)
[info] CastSuite:
[info] - zero sized blocks (8 seconds, 760 milliseconds)
[info] - null cast (7 seconds, 950 milliseconds)
[info] - cast string to date (1 second, 188 milliseconds)
[info] - zero sized blocks without kryo (9 seconds, 890 milliseconds)
[info] - recover from repeated node failures during shuffle-reduce (16 seconds, 184 milliseconds)
[info] - cast string to timestamp (9 seconds, 341 milliseconds)
[info] - cast from boolean (165 milliseconds)
[info] - cast from int (411 milliseconds)
[info] - cast from long (284 milliseconds)
[info] - cast from float (205 milliseconds)
[info] - cast from double (198 milliseconds)
[info] - cast from string (8 milliseconds)
[info] - shuffle on mutable pairs (7 seconds, 350 milliseconds)
[info] - sorting on mutable pairs (6 seconds, 561 milliseconds)
[info] - write ahead log - generating and cleaning (39 seconds, 919 milliseconds)
[info] StreamingListenerSuite:
[info] - batch info reporting (653 milliseconds)
[info] - receiver info reporting (179 milliseconds)
[info] - recover from node failures with replication (14 seconds, 191 milliseconds)
[info] - output operation reporting (1 second, 37 milliseconds)
[info] - don't call ssc.stop in listener (984 milliseconds)
[info] - onBatchCompleted with successful batch (1 second, 28 milliseconds)
[info] - cogroup using mutable pairs (5 seconds, 713 milliseconds)
[info] - onBatchCompleted with failed batch and one failed job (972 milliseconds)
[info] - onBatchCompleted with failed batch and multiple failed jobs (992 milliseconds)
[info] - StreamingListener receives no events after stopping StreamingListenerBus (652 milliseconds)
[info] MapWithStateRDDSuite:
[info] - creation from pair RDD (234 milliseconds)
[info] - updating state and generating mapped data in MapWithStateRDDRecord (7 milliseconds)
[info] - unpersist RDDs (5 seconds, 824 milliseconds)
[info] - states generated by MapWithStateRDD (1 second, 754 milliseconds)
[info] - checkpointing (835 milliseconds)
[info] - checkpointing empty state RDD (287 milliseconds)
[info] BasicOperationsSuite:
[info] - subtract mutable pairs (5 seconds, 452 milliseconds)
[info] - map (461 milliseconds)
[info] - flatMap (340 milliseconds)
[info] - filter (354 milliseconds)
[info] - glom (234 milliseconds)
[info] - mapPartitions (260 milliseconds)
[info] - repartition (more partitions) (417 milliseconds)
[info] - repartition (fewer partitions) (403 milliseconds)
[info] - reference partitions inside a task (4 seconds, 975 milliseconds)
[info] - groupByKey (410 milliseconds)
[info] - reduceByKey (300 milliseconds)
[info] - reduce (302 milliseconds)
[info] - count (358 milliseconds)
[info] - countByValue (309 milliseconds)
[info] - mapValues (314 milliseconds)
[info] - flatMapValues (373 milliseconds)
[info] - union (319 milliseconds)
[info] - union with input stream return None (250 milliseconds)
[info] - StreamingContext.union (368 milliseconds)
[info] DateExpressionsSuite:
[info] - transform (314 milliseconds)
[info] - datetime function current_date (56 milliseconds)
[info] - transform with NULL (126 milliseconds)
[info] - sort with Java non serializable class - Kryo (5 seconds, 690 milliseconds)
[info] - transform with input stream return None (147 milliseconds)
[info] - transformWith (427 milliseconds)
[info] - datetime function current_timestamp (752 milliseconds)
[info] - transformWith with input stream return None (182 milliseconds)
[info] - StreamingContext.transform (296 milliseconds)
[info] - StreamingContext.transform with input stream return None (164 milliseconds)
[info] - cogroup (590 milliseconds)
[info] - join (388 milliseconds)
[info] - leftOuterJoin (443 milliseconds)
[info] - rightOuterJoin (439 milliseconds)
[info] - fullOuterJoin (438 milliseconds)
[info] - updateStateByKey (483 milliseconds)
[info] - updateStateByKey - simple with initial value RDD (481 milliseconds)
[info] - DayOfYear (4 seconds, 174 milliseconds)
[info] - updateStateByKey - testing time stamps as input (541 milliseconds)
[info] - sort with Java non serializable class - Java (5 seconds, 264 milliseconds)
[info] - updateStateByKey - with initial value RDD (660 milliseconds)
[info] - updateStateByKey - object lifecycle (392 milliseconds)
[info] - shuffle with different compression settings (SPARK-3426) (625 milliseconds)
[info] - [SPARK-4085] rerun map stage if reduce stage cannot find its local shuffle file (359 milliseconds)
[info] - cannot find its local shuffle file if no execution of the stage and rerun shuffle (107 milliseconds)
[info] - metrics for shuffle without aggregation (244 milliseconds)
[info] - metrics for shuffle with aggregation (655 milliseconds)
[info] - multiple simultaneous attempts for one task (SPARK-8029) (84 milliseconds)
[info] - slice (2 seconds, 189 milliseconds)
[info] - slice - has not been initialized (62 milliseconds)
[info] - rdd cleanup - map and window (368 milliseconds)
[info] - rdd cleanup - updateStateByKey (694 milliseconds)
Exception in thread "receiver-supervisor-future-0" java.lang.Error: java.lang.InterruptedException: sleep interrupted
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException: sleep interrupted
	at java.lang.Thread.sleep(Native Method)
	at org.apache.spark.streaming.receiver.ReceiverSupervisor.$anonfun$restartReceiver$1(ReceiverSupervisor.scala:196)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
[info] - rdd cleanup - input blocks and persisted RDDs (2 seconds, 207 milliseconds)
[info] - Year (6 seconds, 731 milliseconds)
[info] InputStreamsSuite:
Exception in thread "receiver-supervisor-future-0" java.lang.Error: java.lang.InterruptedException: sleep interrupted
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException: sleep interrupted
	at java.lang.Thread.sleep(Native Method)
	at org.apache.spark.streaming.receiver.ReceiverSupervisor.$anonfun$restartReceiver$1(ReceiverSupervisor.scala:196)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
[info] - socket input stream (782 milliseconds)
[info] - socket input stream - no block in a batch (405 milliseconds)
[info] - data type casting (38 seconds, 107 milliseconds)
[info] - cast and add (241 milliseconds)
[info] - from decimal (306 milliseconds)
[info] - using external shuffle service (6 seconds, 272 milliseconds)
[info] - cast from array (214 milliseconds)
[info] - cast from map (173 milliseconds)
[info] - cast from struct (329 milliseconds)
[info] - cast struct with a timestamp field (39 milliseconds)
[info] - complex casting (5 milliseconds)
[info] - cast between string and interval (106 milliseconds)
[info] - cast string to boolean (115 milliseconds)
[info] - SPARK-16729 type checking for casting to date type (0 milliseconds)
[info] - SPARK-20302 cast with same structure (45 milliseconds)
[info] - SPARK-22500: cast for struct should not generate codes beyond 64KB (1 second, 225 milliseconds)
[info] - SPARK-22570: Cast should not create a lot of global variables (2 milliseconds)
[info] - up-cast (42 milliseconds)
[info] - SPARK-27671: cast from nested null type in struct (420 milliseconds)
[info] - Process Infinity, -Infinity, NaN in case insensitive manner (181 milliseconds)
[info] - null cast #2 (567 milliseconds)
[info] - cast from long #2 (48 milliseconds)
[info] - cast from int #2 (84 milliseconds)
[info] - Quarter (6 seconds, 458 milliseconds)
[info] - casting to fixed-precision decimals (1 second, 431 milliseconds)
[info] - SPARK-28470: Cast should honor nullOnOverflow property (100 milliseconds)
[info] - collect_list/collect_set can cast to ArrayType not containsNull (9 milliseconds)
[info] - NullTypes should be able to cast to any complex types (1 millisecond)
[info] - SPARK-31227: Non-nullable null type should not coerce to nullable type (3 milliseconds)
[info] - binary records stream (6 seconds, 184 milliseconds)
[info] - SPARK-25888: using external shuffle service fetching disk persisted blocks (4 seconds, 882 milliseconds)
[info] ClosureCleanerSuite:
[info] - closures inside an object (74 milliseconds)
[info] - Cast should output null for invalid strings when ANSI is not enabled. (298 milliseconds)
[info] - closures inside a class (84 milliseconds)
[info] - file input stream - newFilesOnly = true (376 milliseconds)
[info] - closures inside a class with no default constructor (87 milliseconds)
[info] - closures that don't use fields of the outer class (77 milliseconds)
[info] - cast from date (206 milliseconds)
[info] - nested closures inside an object (100 milliseconds)
[info] - file input stream - newFilesOnly = false (253 milliseconds)
[info] - nested closures inside a class (104 milliseconds)
[info] - toplevel return statements in closures are identified at cleaning time (63 milliseconds)
[info] - return statements from named functions nested in closures don't raise exceptions (63 milliseconds)
[info] - user provided closures are actually cleaned (128 milliseconds)
[info] - cast from timestamp (434 milliseconds)
[info] - createNullValue (4 milliseconds)
[info] UnpersistSuite:
[info] - unpersist RDD (85 milliseconds)
[info] - file input stream - wildcard (416 milliseconds)
[info] - cast a timestamp before the epoch 1970-01-01 00:00:00Z (117 milliseconds)
[info] PeriodicRDDCheckpointerSuite:
[info] - SPARK-32828: cast from a derived user-defined type to a base type (46 milliseconds)
[info] - Persisting (17 milliseconds)
[info] - Fast fail for cast string type to decimal type (147 milliseconds)
[info] - Modified files are correctly detected. (238 milliseconds)
[info] - Checkpointing (420 milliseconds)
[info] TaskSetManagerSuite:
[info] - TaskSet with no preferences (71 milliseconds)
[info] - multiple offers with no preferences (50 milliseconds)
[info] - skip unsatisfiable locality levels (54 milliseconds)
[info] - Month (2 seconds, 741 milliseconds)
[info] - basic delay scheduling (52 milliseconds)
[info] - we do not need to delay scheduling when we only have noPref tasks in the queue (39 milliseconds)
[info] - delay scheduling with fallback (149 milliseconds)
[info] - delay scheduling with failed hosts (62 milliseconds)
[info] - task result lost (42 milliseconds)
[info] - repeated failures lead to task set abortion (53 milliseconds)
[info] - executors should be excluded after task failure, in spite of locality preferences (82 milliseconds)
[info] - new executors get added and lost (60 milliseconds)
[info] - Executors exit for reason unrelated to currently running tasks (48 milliseconds)
[info] - SPARK-31837: Shift to the new highest locality level if there is when recomputeLocality (40 milliseconds)
[info] - SPARK-32653: Decommissioned host should not be used to calculate locality levels (57 milliseconds)
[info] - SPARK-32653: Decommissioned executor should not be used to calculate locality levels (56 milliseconds)
[info] - test RACK_LOCAL tasks (86 milliseconds)
[info] - do not emit warning when serialized task is small (59 milliseconds)
[info] - emit warning when serialized task is large (90 milliseconds)
[info] - multi-thread receiver (1 second, 733 milliseconds)
[info] - Not serializable exception thrown if the task cannot be serialized (68 milliseconds)
[info] - SPARK-22825 Cast array to string (1 second, 932 milliseconds)
[info] - SPARK-33291: Cast array with null elements to string (58 milliseconds)
[info] - SPARK-22973 Cast map to string (479 milliseconds)
[info] - queue input stream - oneAtATime = true (1 second, 100 milliseconds)
[info] - SPARK-22981 Cast struct to string (522 milliseconds)
[info] - SPARK-33291: Cast struct with null elements to string (62 milliseconds)
[info] - data type casting II (116 milliseconds)
[info] - Cast from double II (28 milliseconds)
[info] - SPARK-34727: cast from float II (22 milliseconds)
[info] - abort the job if total size of results is too large (1 second, 509 milliseconds)
Exception in thread "task-result-getter-3" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
	at org.apache.spark.network.BlockTransferService.fetchBlockSync(BlockTransferService.scala:103)
	at org.apache.spark.storage.BlockManager.fetchRemoteManagedBuffer(BlockManager.scala:1061)
	at org.apache.spark.storage.BlockManager.$anonfun$getRemoteBlock$8(BlockManager.scala:1005)
	at scala.Option.orElse(Option.scala:447)
	at org.apache.spark.storage.BlockManager.getRemoteBlock(BlockManager.scala:1005)
	at org.apache.spark.storage.BlockManager.getRemoteBytes(BlockManager.scala:1143)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$3.$anonfun$run$1(TaskResultGetter.scala:88)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$3.run(TaskResultGetter.scala:63)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
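The preceding "abort the job if total size of results is too large" test exercises spark.driver.maxResultSize, and the interrupted task-result-getter trace is an in-flight result fetch being cancelled as the job aborts. A hedged sketch of that behavior under a deliberately tiny limit (object name and sizes are illustrative):

    // Collecting more serialized result bytes than spark.driver.maxResultSize
    // allows makes the driver abort the job with a SparkException.
    import org.apache.spark.{SparkConf, SparkContext, SparkException}

    object MaxResultSizeSketch extends App {
      val conf = new SparkConf()
        .setMaster("local[2]")
        .setAppName("sketch")
        .set("spark.driver.maxResultSize", "1m") // deliberately tiny limit
      val sc = new SparkContext(conf)
      try {
        // Eight ~1 MiB partition results exceed the 1m cap, so the driver
        // aborts the job rather than collecting them.
        sc.parallelize(1 to 8, 8).map(_ => new Array[Byte](1 << 20)).collect()
      } catch {
        case e: SparkException => println(e.getMessage)
      } finally sc.stop()
    }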
[info] - queue input stream - oneAtATime = false (2 seconds, 127 milliseconds)
[info] - test track the number of input stream (135 milliseconds)
[info] UIUtilsSuite:
[info] - shortTimeUnitString (0 milliseconds)
[info] - normalizeDuration (3 milliseconds)
[info] - convertToTimeUnit (0 milliseconds)
[info] - formatBatchTime (0 milliseconds)
[info] RateLimitedOutputStreamSuite:
[info] ReassignLambdaVariableIDSuite:
[info] - basic: replace positive IDs with unique negative IDs (776 milliseconds)
[info] - ignore LambdaVariable with negative IDs (10 milliseconds)
[info] - fail if positive ID LambdaVariable and negative LambdaVariable both exist (10 milliseconds)
[info] AnalysisHelperSuite:
[info] - setAnalyze is recursive (6 milliseconds)
[info] - resolveOperator runs on operators recursively (13 milliseconds)
[info] - resolveOperatorsDown runs on operators recursively (2 milliseconds)
[info] - resolveExpressions runs on operators recursively (9 milliseconds)
[info] - resolveOperator skips already resolved plans (1 millisecond)
[info] - resolveOperatorsDown skips already resolved plans (1 millisecond)
[info] - resolveExpressions skips already resolved plans (1 millisecond)
[info] - resolveOperator skips partially resolved plans (1 millisecond)
[info] - resolveOperatorsDown skips partially resolved plans (2 milliseconds)
[info] - resolveExpressions skips partially resolved plans (2 milliseconds)
[info] - do not allow transform in analyzer (7 milliseconds)
[info] - allow transform in resolveOperators in the analyzer (5 milliseconds)
[info] - allow transform with allowInvokingTransformsInAnalyzer in the analyzer (5 milliseconds)
[info] AggregateEstimationSuite:
[info] - SPARK-26894: propagate child stats for aliases in Aggregate (36 milliseconds)
[info] - set an upper bound if the product of ndv's of group-by columns is too large (6 milliseconds)
[info] - data contains all combinations of distinct values of group-by columns. (3 milliseconds)
[info] - empty group-by column (3 milliseconds)
[info] - aggregate on empty table - with or without group-by column (4 milliseconds)
[info] - group-by column with only null value (5 milliseconds)
[info] - group-by column with null value (3 milliseconds)
[info] - non-cbo estimation (13 milliseconds)
[info] LiteralExpressionSuite:
[info] - write (4 seconds, 104 milliseconds)
[info] FileBasedWriteAheadLogSuite:
[info] - FileBasedWriteAheadLog - read all logs (140 milliseconds)
[info] - FileBasedWriteAheadLog - write logs (273 milliseconds)
[info] - FileBasedWriteAheadLog - read all logs after write (232 milliseconds)
[info] - null (2 seconds, 14 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs (103 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs synchronously (156 milliseconds)
[info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (273 milliseconds)
[info] - FileBasedWriteAheadLog - do not create directories or files unless write (4 milliseconds)
[info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (42 milliseconds)
[info] - FileBasedWriteAheadLog - seqToParIterator (57 milliseconds)
[info] - FileBasedWriteAheadLogWriter - writing data (23 milliseconds)
[info] - FileBasedWriteAheadLogWriter - syncing of data by writing and reading immediately (22 milliseconds)
[info] - FileBasedWriteAheadLogReader - sequentially reading data (8 milliseconds)
[info] - FileBasedWriteAheadLogReader - sequentially reading data written with writer (10 milliseconds)
[info] - FileBasedWriteAheadLogReader - reading data written with writer after corrupted write (794 milliseconds)
[info] - FileBasedWriteAheadLogReader - handles errors when file doesn't exist (7 milliseconds)
[info] - FileBasedWriteAheadLogRandomReader - reading data using random reader (17 milliseconds)
[info] - FileBasedWriteAheadLogRandomReader - reading data using random reader written with writer (11 milliseconds)
[info] RateControllerSuite:
[info] - default (1 second, 778 milliseconds)
[info] - SPARK-32470: do not check total size of intermediate stages (8 seconds, 449 milliseconds)
[info] - RateController - rate controller publishes updates after batches complete (389 milliseconds)
[info] - [SPARK-13931] taskSetManager should not send Resubmitted tasks after being a zombie (77 milliseconds)
[info] - [SPARK-22074] Task killed by other attempt task should not be resubmitted (89 milliseconds)
[info] - speculative and noPref task should be scheduled after node-local (60 milliseconds)
[info] - node-local tasks should be scheduled right away when there are only node-local and no-preference tasks (57 milliseconds)
[info] - SPARK-4939: node-local tasks should be scheduled right after process-local tasks finished (51 milliseconds)
[info] - ReceiverRateController - published rates reach receivers (606 milliseconds)
[info] InputInfoTrackerSuite:
[info] - SPARK-4939: no-pref tasks should be scheduled after process-local tasks finished (59 milliseconds)
[info] - test report and get InputInfo from InputInfoTracker (1 millisecond)
[info] - Ensure TaskSetManager is usable after addition of levels (60 milliseconds)
[info] - test cleanup InputInfo from InputInfoTracker (0 milliseconds)
[info] WriteAheadLogBackedBlockRDDSuite:
[info] - Test that locations with HDFSCacheTaskLocation are treated as PROCESS_LOCAL. (71 milliseconds)
[info] - Test TaskLocation for different host type. (1 millisecond)
[info] - Kill other task attempts when one attempt belonging to the same task succeeds (80 milliseconds)
[info] - Read data available in both block manager and write ahead log (99 milliseconds)
[info] - Killing speculative tasks does not count towards aborting the taskset (65 milliseconds)
[info] - Read data available only in block manager, not in write ahead log (66 milliseconds)
[info] - Read data available only in write ahead log, not in block manager (73 milliseconds)
[info] - Read data partially available in block manager, and rest in write ahead log (71 milliseconds)
[info] - SPARK-19868: DagScheduler only notified of taskEnd when state is ready (151 milliseconds)
[info] - SPARK-17894: Verify TaskSetManagers for different stage attempts have unique names (68 milliseconds)
[info] - Test isBlockValid skips block fetching from BlockManager (135 milliseconds)
[info] - don't update excludelist for shuffle-fetch failures, preemption, denied commits, or killed tasks (87 milliseconds)
[info] - update application healthTracker for shuffle-fetch (88 milliseconds)
[info] - Test whether RDD is valid after removing blocks from block manager (160 milliseconds)
[info] - boolean literals (1 second, 597 milliseconds)
[info] - update healthTracker before adding pending task to avoid race condition (74 milliseconds)
[info] - Test storing of blocks recovered from write ahead log back into block manager (165 milliseconds)
[info] - SPARK-21563 context's added jars shouldn't change mid-TaskSet (71 milliseconds)
Exception in thread "block-manager-storage-async-thread-pool-4" Exception in thread "block-manager-storage-async-thread-pool-0" Exception in thread "block-manager-storage-async-thread-pool-2" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Object.wait(Object.java:502)
	at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:236)
	at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1872)
	at org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$1(BlockManagerStorageEndpoint.scala:47)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at org.apache.spark.storage.BlockManagerStorageEndpoint.$anonfun$doAsync$1(BlockManagerStorageEndpoint.scala:89)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Object.wait(Object.java:502)
	at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:236)
	at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1872)
	at org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$1(BlockManagerStorageEndpoint.scala:47)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at org.apache.spark.storage.BlockManagerStorageEndpoint.$anonfun$doAsync$1(BlockManagerStorageEndpoint.scala:89)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Object.wait(Object.java:502)
	at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:236)
	at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1872)
	at org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$1(BlockManagerStorageEndpoint.scala:47)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
	at org.apache.spark.storage.BlockManagerStorageEndpoint.$anonfun$doAsync$1(BlockManagerStorageEndpoint.scala:89)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@bfea7c5 rejected from java.util.concurrent.ThreadPoolExecutor@60bac7c0[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 3]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
	at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
	at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
	at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@64854f88 rejected from java.util.concurrent.ThreadPoolExecutor@60bac7c0[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 3]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
	at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
	at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
	at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@4ac3adfe rejected from java.util.concurrent.ThreadPoolExecutor@60bac7c0[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 3]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@5a604a98 rejected from java.util.concurrent.ThreadPoolExecutor@60bac7c0[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 3]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
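The RejectedExecutionException entries above are the other face of the same teardown race: once a ThreadPoolExecutor has entered shutdown, its default AbortPolicy rejects further submissions and prints exactly this kind of state summary ("Shutting down, pool size = ..."). A small stand-alone sketch (hypothetical demo, not Spark code):

import java.util.concurrent.{Executors, RejectedExecutionException, TimeUnit}

object RejectedAfterShutdownDemo {
  def main(args: Array[String]): Unit = {
    val pool = Executors.newFixedThreadPool(2)
    pool.execute(() => Thread.sleep(500)) // keep a worker busy
    pool.shutdown()                       // pool state becomes "Shutting down"
    try pool.execute(() => println("never runs"))
    catch {
      case e: RejectedExecutionException =>
        // The message names the pool's state and counters, matching the log.
        println(s"rejected: ${e.getMessage}")
    }
    pool.awaitTermination(1, TimeUnit.SECONDS)
  }
}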
[info] - SPARK-24677: Avoid NoSuchElementException from MedianHeap (68 milliseconds)
[info] - SPARK-24755 Executor loss can cause task to not be resubmitted (72 milliseconds)
[info] - read data in block manager and WAL with encryption on (169 milliseconds)
[info] BatchedWriteAheadLogWithCloseFileAfterWriteSuite:
[info] - SPARK-13343 speculative tasks that didn't commit shouldn't be marked as success (71 milliseconds)
[info] - BatchedWriteAheadLog - read all logs (81 milliseconds)
[info] - SPARK-13704 Rack Resolution is done with a batch of de-duped hosts (107 milliseconds)
[info] - TaskSetManager passes task resource along (71 milliseconds)
[info] - SPARK-26755 Ensure that a speculative task is submitted only once for execution (75 milliseconds)
[info] - SPARK-26755 Ensure that a speculative task obeys original locality preferences (89 milliseconds)
[info] - BatchedWriteAheadLog - write logs (485 milliseconds)
[info] - SPARK-29976 when a speculation time threshold is provided, should speculatively run the task even if there are not enough successful runs, total tasks: 1 (78 milliseconds)
[info] - int literals (939 milliseconds)
[info] - SPARK-29976: when the speculation time threshold is not provided, don't speculatively run if there are not enough successful runs, total tasks: 1 (57 milliseconds)
[info] - SPARK-29976 when a speculation time threshold is provided, should speculatively run the task even if there are not enough successful runs, total tasks: 2 (50 milliseconds)
[info] - SPARK-29976: when the speculation time threshold is not provided, don't speculatively run if there are not enough successful runs, total tasks: 2 (51 milliseconds)
[info] - SPARK-29976 when a speculation time threshold is provided, should not speculate if there are too many tasks in the stage even though a time threshold is provided (54 milliseconds)
[info] - SPARK-21040: Check speculative tasks are launched when an executor is decommissioned and the tasks running on it cannot finish within EXECUTOR_DECOMMISSION_KILL_INTERVAL (76 milliseconds)
[info] - BatchedWriteAheadLog - read all logs after write (399 milliseconds)
[info] - SPARK-29976 Regular speculation configs should still take effect even when a threshold is provided (62 milliseconds)
[info] - SPARK-30417 when spark.task.cpus is greater than spark.executor.cores due to standalone settings, speculate if there is only one task in the stage (56 milliseconds)
[info] - double literals (494 milliseconds)
[info] - TaskOutputFileAlreadyExistException lead to task set abortion (58 milliseconds)
[info] - string literals (105 milliseconds)
[info] - sum two literals (117 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs (363 milliseconds)
[info] - binary literals (57 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs synchronously (484 milliseconds)
[info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (715 milliseconds)
[info] - BatchedWriteAheadLog - do not create directories or files unless write (5 milliseconds)
[info] - decimal (1 second, 224 milliseconds)
[info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (41 milliseconds)
[info] - BatchedWriteAheadLog - close after write flag (14 milliseconds)
[info] ReceivedBlockHandlerSuite:
[info] - BlockManagerBasedBlockHandler - store blocks (30 milliseconds)
[info] - BlockManagerBasedBlockHandler - handle errors in storing block (4 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - store blocks (145 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (33 milliseconds)
[info] - array (576 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - clean old blocks (70 milliseconds)
[info] - seq (131 milliseconds)
[info] - Test Block - count messages (100 milliseconds)
[info] - Test Block - isFullyConsumed (32 milliseconds)
[info] - map (229 milliseconds)
[info] Test run started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testGreaterEq started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testDiv started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMinus started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testTimes started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testLess started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testPlus started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testGreater started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMinutes started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMilliseconds started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testLessEq started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testSeconds started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.014s
[info] Test run started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testStreamingContextTransform started
[info] - struct (197 milliseconds)
[info] - unsupported types (map and struct) in Literal.apply (3 milliseconds)
[info] - SPARK-24571: char literals (46 milliseconds)
[info] - SPARK-33390: Make Literal support char array (35 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMapValues started
[info] - construct literals from java.time.LocalDate (223 milliseconds)
[info] - construct literals from arrays of java.time.LocalDate (31 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithInverse started
[info] - construct literals from java.time.Instant (196 milliseconds)
[info] - construct literals from arrays of java.time.Instant (25 milliseconds)
[info] - format timestamp literal using spark.sql.session.timeZone (40 milliseconds)
[info] - format date literal independently from time zone (28 milliseconds)
[info] - SPARK-33860: Make CatalystTypeConverters.convertToCatalyst match special Array value (1 millisecond)
[info] MiscExpressionsSuite:
[info] - RaiseError (56 milliseconds)
[info] - Day / DayOfMonth (17 seconds, 662 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapPartitions started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFilter started
[info] - uuid (431 milliseconds)
[info] - PrintToStderr (18 milliseconds)
[info] ReplaceOperatorSuite:
[info] - SPARK-30359: don't clean executorsPendingToRemove at the beginning of CoarseGrainedSchedulerBackend.reset (3 seconds, 753 milliseconds)
[info] BlockManagerBasicStrategyReplicationSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionFewerPartitions started
[info] - replace Intersect with Left-semi Join (215 milliseconds)
[info] - get peers with addition and removal of block managers (31 milliseconds)
[info] - replace Except with Filter while both the nodes are of type Filter (64 milliseconds)
[info] - replace Except with Filter while only right node is of type Filter (19 milliseconds)
[info] - replace Except with Filter while both the nodes are of type Project (26 milliseconds)
[info] - replace Except with Filter while only right node is of type Project (20 milliseconds)
[info] - replace Except with Filter while left node is Project and right node is Filter (25 milliseconds)
[info] - replace Except with Left-anti Join (18 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCombineByKey started
[info] - replace Except with Filter when only right filter can be applied to the left (70 milliseconds)
[info] - replace Distinct with Aggregate (5 milliseconds)
[info] - replace batch Deduplicate with Aggregate (12 milliseconds)
[info] - add one grouping key if necessary when replace Deduplicate with Aggregate (3 milliseconds)
[info] - don't replace streaming Deduplicate (2 milliseconds)
[info] - block replication - 2x replication (125 milliseconds)
[info] - SPARK-26366: ReplaceExceptWithFilter should properly handle NULL (34 milliseconds)
[info] - SPARK-26366: ReplaceExceptWithFilter should not transform non-deterministic (48 milliseconds)
[info] AnsiCastSuiteWithAnsiModeOff:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextGetOrCreate started
[info] - block replication - 3x replication (116 milliseconds)
[info] - block replication - mixed between 1x to 5x (203 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindowWithSlideDuration started
[info] - block replication - off-heap (40 milliseconds)
[info] - block replication - 2x replication without peers (1 millisecond)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testQueueStream started
[info] - block replication - replication failures (44 milliseconds)
[info] - test block replication failures when block is received by remote block manager but putBlock fails (stream = false) (55 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValue started
[info] - test block replication failures when block is received by remote block manager but putBlock fails (stream = true) (33 milliseconds)
[info] - block replication - addition and deletion of block managers (133 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMap started
[info] BlockManagerMasterSuite:
[info] - SPARK-31422: getMemoryStatus should not fail after BlockManagerMaster stops (3 milliseconds)
[info] - SPARK-31422: getStorageStatus should not fail after BlockManagerMaster stops (0 milliseconds)
[info] RDDOperationGraphSuite:
[info] - Test simple cluster equals (1 millisecond)
[info] ShuffleExternalSorterSuite:
[info] - nested spill should be no-op (75 milliseconds)
[info] ChunkedByteBufferSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToNormalRDDTransform started
[info] - no chunks (1 millisecond)
[info] - getChunks() duplicates chunks (0 milliseconds)
[info] - copy() does not affect original buffer's position (1 millisecond)
[info] - writeFully() does not affect original buffer's position (0 milliseconds)
[info] - SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize (25 milliseconds)
[info] - toArray() (1 millisecond)
[info] - toArray() throws UnsupportedOperationException if size exceeds 2GB (3 milliseconds)
[info] - toInputStream() (1 millisecond)
[info] HistoryServerDiskManagerSuite:
[info] - leasing space (67 milliseconds)
[info] - tracking active stores (10 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairReduceByKey started
[info] - approximate size heuristic (0 milliseconds)
[info] - SPARK-32024: update ApplicationStoreInfo.size during initializing (12 milliseconds)
[info] PythonBroadcastSuite:
[info] - PythonBroadcast can be serialized with Kryo (SPARK-4882) (18 milliseconds)
[info] KeyLockSuite:
[info] - The same key should wait when its lock is held (9 milliseconds)
[info] - A different key should not be locked (2 milliseconds)
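KeyLockSuite above asserts per-key mutual exclusion: callers with the same key wait on each other while callers with different keys proceed. A rough sketch of that idea using one monitor object per key (hypothetical class, not Spark's KeyLock; monitors are never evicted here, which a real implementation would have to handle):

import java.util.concurrent.ConcurrentHashMap

class PerKeyLock[K] {
  private val locks = new ConcurrentHashMap[K, AnyRef]()
  // Same key => same monitor => callers serialize; different keys
  // map to different monitors and run concurrently.
  def withLock[T](key: K)(body: => T): T = {
    val monitor = locks.computeIfAbsent(key, _ => new AnyRef)
    monitor.synchronized(body)
  }
}

// usage: val lock = new PerKeyLock[String]; lock.withLock("block-1") { /* critical section */ }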
[info] NettyBlockTransferServiceSuite:
[info] - can bind to a random port (21 milliseconds)
[info] - can bind to two random ports (33 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCount started
[info] - can bind to a specific port (23 milliseconds)
[info] - can bind to a specific port twice and the second increments (36 milliseconds)
[info] - SPARK-27637: test fetch block with executor dead (65 milliseconds)
[info] BasicSchedulerIntegrationSuite:
[info] - Seconds (3 seconds, 370 milliseconds)
[info] - super simple job (105 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery started
[info] - multi-stage job (182 milliseconds)
[info] - DayOfWeek (303 milliseconds)
[info] - WeekDay (295 milliseconds)
[info] - job with fetch failure (313 milliseconds)
[info] - job failure after 4 attempts (106 milliseconds)
[info] - WeekOfYear (271 milliseconds)
[info] - SPARK-23626: RDD with expensive getPartitions() doesn't block scheduler loop (94 milliseconds)
[info] JobWaiterSuite:
[info] - call jobFailed multiple times (2 milliseconds)
[info] RDDBarrierSuite:
[info] - create an RDDBarrier (4 milliseconds)
[info] - null cast (3 seconds, 456 milliseconds)
[info] - RDDBarrier mapPartitionsWithIndex (29 milliseconds)
[info] - create an RDDBarrier in the middle of a chain of RDDs (4 milliseconds)
[info] - RDDBarrier with shuffle (6 milliseconds)
[info] UninterruptibleThreadSuite:
[info] - cast string to date (152 milliseconds)
[info] - DateFormat (547 milliseconds)
[info] - interrupt when runUninterruptibly is running (1 second, 1 millisecond)
[info] - interrupt before runUninterruptibly runs (2 milliseconds)
[info] - nested runUninterruptibly (5 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUnion started
[info] - stress test (1 second, 938 milliseconds)
[info] BlockManagerDecommissionUnitSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMap started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindowWithInverse started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testGlom started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testJoin started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFlatMap started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToPairFlatMapWithChangingTypes started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMapPartitions started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionMorePartitions started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithoutInverse started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testLeftOuterJoin started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransform started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransformWith started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransformWith started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTextFileStream started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairGroupByKey started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCoGroup started
[info] - test that with no blocks we finish migration (5 seconds, 64 milliseconds)
[info] - block decom manager with no migrations configured (10 milliseconds)
[info] - block decom manager with no peers (4 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testInitialization started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketString started
[info] - cast string to timestamp (8 seconds, 431 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testGroupByKeyAndWindow started
[info] - cast from boolean (137 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindow started
[info] - cast from int (294 milliseconds)
[info] - cast from long (240 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testForeachRDD started
[info] - cast from float (152 milliseconds)
[info] - cast from double (193 milliseconds)
[info] - cast from string (1 millisecond)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFileStream started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairTransform started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFilter started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap2 started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapValues started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduce started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKey started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransform started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindow started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValueAndWindow started
[info] - Hour (11 seconds, 785 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRawSocketStream started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketTextStream started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKeyWithInitial started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextState started
[info] Test run finished: 0 failed, 0 ignored, 53 total, 18.235s
[info] Test run started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testStreamingContextTransform started
[info] - block decom manager with only shuffle files time moves forward (5 seconds, 35 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMapValues started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMapPartitions started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFilter started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testCombineByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToNormalRDDTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairReduceByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindowWithInverse started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByWindow started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFlatMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToPairFlatMapWithChangingTypes started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMapPartitions started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testTransformWith started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransformWith started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindow started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFilter started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap2 started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMapValues started
[info] - block decom manager does not re-add removed shuffle files (5 seconds, 4 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduce started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testUpdateStateByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testTransform started
[info] Test run finished: 0 failed, 0 ignored, 26 total, 6.26s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaMapWithStateSuite.testBasicFunction started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.429s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaWriteAheadLogSuite.testCustomWAL started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.002s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaReceiverAPISuite.testReceiver started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 1.201s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testGreaterEq started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testLess started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testPlus started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusDuration started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testGreater started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testLessEq started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusTime started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.001s
[info] - block decom manager handles IO failures (5 seconds, 35 milliseconds)
[info] MathExpressionsSuite:
[info] - conv (2 seconds, 959 milliseconds)
[info] - e (94 milliseconds)
[info] - pi (41 milliseconds)
[info] - block decom manager short circuits removed blocks (5 seconds, 34 milliseconds)
[info] - test shuffle and cached rdd migration without any error (51 milliseconds)
[info] DriverSuite:
[info] - driver should exit after finishing without cleanup (SPARK-530) !!! IGNORED !!!
[info] CompactBufferSuite:
[info] - empty buffer (2 milliseconds)
[info] - basic inserts (9 milliseconds)
[info] - adding sequences (4 milliseconds)
[info] - adding the same buffer to itself (2 milliseconds)
[info] MapStatusSuite:
[info] - compressSize (1 millisecond)
[info] - decompressSize (1 millisecond)
[info] - Minute (16 seconds, 283 milliseconds)
[info] - sin (2 seconds, 148 milliseconds)
[info] - MapStatus should never report non-empty blocks' sizes as 0 (643 milliseconds)
[info] - large tasks should use org.apache.spark.scheduler.HighlyCompressedMapStatus (1 millisecond)
[info] - HighlyCompressedMapStatus: estimated size should be the average non-empty block size (12 milliseconds)
[info] - SPARK-22540: ensure HighlyCompressedMapStatus calculates correct avgSize (20 milliseconds)
[info] - RoaringBitmap: runOptimize succeeded (16 milliseconds)
[info] - RoaringBitmap: runOptimize failed (6 milliseconds)
[info] - Blocks which are bigger than SHUFFLE_ACCURATE_BLOCK_THRESHOLD should not be underestimated. (10 milliseconds)
[info] - date_add (655 milliseconds)
[info] - date add interval (396 milliseconds)
[info] - asin (1 second, 233 milliseconds)
[info] - date_sub (744 milliseconds)
[info] - sinh (1 second, 466 milliseconds)
[info] - time_add (2 seconds, 156 milliseconds)
[info] - asinh (1 second, 361 milliseconds)
[info] - cos (1 second, 26 milliseconds)
[info] - acos (783 milliseconds)
[info] - time_sub (2 seconds, 553 milliseconds)
[info] - add_months (443 milliseconds)
[info] - cosh (1 second, 106 milliseconds)
[info] - SPARK-21133 HighlyCompressedMapStatus#writeExternal throws NPE (7 seconds, 504 milliseconds)
[info] BlockInfoManagerSuite:
[info] - initial memory usage (1 millisecond)
[info] - get non-existent block (0 milliseconds)
[info] - basic lockNewBlockForWriting (3 milliseconds)
[info] - acosh (1 second, 59 milliseconds)
[info] - lockNewBlockForWriting blocks while write lock is held, then returns false after release (303 milliseconds)
[info] - lockNewBlockForWriting blocks while write lock is held, then returns true after removal (305 milliseconds)
[info] - read locks are reentrant (1 millisecond)
[info] - multiple tasks can hold read locks (3 milliseconds)
[info] - single task can hold write lock (2 milliseconds)
[info] - cannot grab a writer lock while already holding a write lock (1 millisecond)
[info] - assertBlockIsLockedForWriting throws exception if block is not locked (1 millisecond)
[info] - downgrade lock (1 millisecond)
[info] - write lock will block readers (304 milliseconds)
[info] - tan (893 milliseconds)
[info] - read locks will block writer (304 milliseconds)
[info] - removing a non-existent block throws IllegalArgumentException (2 milliseconds)
[info] - removing a block without holding any locks throws IllegalStateException (1 millisecond)
[info] - removing a block while holding only a read lock throws IllegalStateException (0 milliseconds)
[info] - removing a block causes blocked callers to receive None (302 milliseconds)
[info] - releaseAllLocksForTask releases write locks (2 milliseconds)
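The BlockInfoManagerSuite results above describe classic read-write lock behavior: reentrant read locks, a writer blocked while readers hold the lock, and readers blocked by a writer. The JDK primitive shows the same rules (illustrative only; BlockInfoManager implements its own locking):

import java.util.concurrent.locks.ReentrantReadWriteLock

object ReadWriteLockDemo {
  def main(args: Array[String]): Unit = {
    val lock = new ReentrantReadWriteLock()
    lock.readLock().lock()
    lock.readLock().lock()              // read locks are reentrant
    // A writer cannot enter while read locks are held:
    println(lock.writeLock().tryLock()) // false
    lock.readLock().unlock()
    lock.readLock().unlock()
    println(lock.writeLock().tryLock()) // true once the readers are gone
    lock.writeLock().unlock()
  }
}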
[info] StoragePageSuite:
[info] - rddTable (8 milliseconds)
[info] - empty rddTable (0 milliseconds)
[info] - streamBlockStorageLevelDescriptionAndSize (1 millisecond)
[info] - receiverBlockTables (11 milliseconds)
[info] - empty receiverBlockTables (1 millisecond)
[info] TaskSchedulerImplSuite:
[info] - SPARK-32653: Decommissioned host/executor should be considered as inactive (55 milliseconds)
[info] - cot (1 second, 116 milliseconds)
[info] - Scheduler does not always schedule tasks on the same workers (835 milliseconds)
[info] - months_between (3 seconds, 754 milliseconds)
[info] - Scheduler correctly accounts for multiple CPUs per task (68 milliseconds)
[info] - SPARK-18886 - partial offers (isAllFreeResources = false) reset timer before any resources have been rejected (64 milliseconds)
[info] - SPARK-18886 - delay scheduling timer is reset when it accepts all resources offered when isAllFreeResources = true (55 milliseconds)
[info] - SPARK-18886 - task set with no locality requirements should not starve one with them (105 milliseconds)
[info] - SPARK-18886 - partial resource offers (isAllFreeResources = false) reset time if last full resource offer (isAllResources = true) was accepted as well as any following partial resource offers (46 milliseconds)
[info] - SPARK-18886 - partial resource offers (isAllFreeResources = false) do not reset time if any offer was rejected since last full offer was fully accepted (70 milliseconds)
[info] - atan (987 milliseconds)
[info] - Scheduler does not crash when tasks are not serializable (48 milliseconds)
[info] - concurrent attempts for the same stage only have one active taskset (68 milliseconds)
[info] - last_day (638 milliseconds)
[info] - don't schedule more tasks after a taskset is zombie (68 milliseconds)
[info] - if a zombie attempt finishes, continue scheduling tasks for non-zombie attempts (77 milliseconds)
[info] - tasks are not re-scheduled while executor loss reason is pending (72 milliseconds)
[info] - scheduled tasks obey task and stage excludelist (212 milliseconds)
[info] - next_day (471 milliseconds)
[info] - scheduled tasks obey node and executor excludelists (102 milliseconds)
[info] - abort stage when all executors are excluded and we cannot acquire new executor (86 milliseconds)
[info] - SPARK-22148 abort timer should kick in when task is completely excluded & no new executor can be acquired (61 milliseconds)
[info] - SPARK-22148 try to acquire a new executor when task is unschedulable with 1 executor (71 milliseconds)
[info] - TruncDate (402 milliseconds)
[info] - tanh (1 second, 19 milliseconds)
[info] - SPARK-22148 abort timer should clear unschedulableTaskSetToExpiryTime for all TaskSets (89 milliseconds)
[info] - SPARK-22148 Ensure we don't abort the taskSet if we haven't been completely excluded (66 milliseconds)
[info] - SPARK-31418 abort timer should kick in when task is completely excluded & allocation manager could not acquire a new executor before the timeout (73 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 0 (125 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 1 (111 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 2 (148 milliseconds)
[info] - TruncTimestamp (738 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 3 (127 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 4 (110 milliseconds)
[info] - atanh (969 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 5 (118 milliseconds)
[info] - unsupported fmt fields for trunc/date_trunc results null (445 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 6 (180 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 7 (98 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 8 (116 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 9 (173 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 0 (111 milliseconds)
[info] - toDegrees (883 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 1 (108 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 2 (142 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 3 (146 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 4 (111 milliseconds)
[info] - from_unixtime (1 second, 238 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 5 (100 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 6 (126 milliseconds)
[info] - toRadians (882 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 7 (212 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 8 (127 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 9 (123 milliseconds)
[info] - abort stage if executor loss results in unschedulability from previously failed tasks (48 milliseconds)
[info] - don't abort if there is an executor available, though it hasn't had scheduled tasks yet (113 milliseconds)
[info] - SPARK-16106 locality levels updated if executor added to existing host (50 milliseconds)
[info] - scheduler checks for executors that can be expired from excludeOnFailure (54 milliseconds)
[info] - if an executor is lost then the state for its running tasks is cleaned up (SPARK-18553) (72 milliseconds)
[info] - unix_timestamp (1 second, 191 milliseconds)
[info] - if a task finishes with TaskState.LOST its executor is marked as dead (71 milliseconds)
[info] - cbrt (909 milliseconds)
[info] - Locality should be used for bulk offers even with delay scheduling off (67 milliseconds)
[info] - data type casting (34 seconds, 760 milliseconds)
[info] - With delay scheduling off, tasks can be run at any locality level immediately (47 milliseconds)
[info] - TaskScheduler should throw IllegalArgumentException when schedulingMode is not supported (52 milliseconds)
[info] - don't schedule for a barrier taskSet if available slots are less than pending tasks (60 milliseconds)
[info] - cast and add (169 milliseconds)
[info] - don't schedule for a barrier taskSet if available slots are less than pending tasks gpus limiting (52 milliseconds)
[info] - schedule tasks for a barrier taskSet if all tasks can be launched together gpus (53 milliseconds)
[info] - from decimal (181 milliseconds)
[info] - schedule tasks for a barrier taskSet if all tasks can be launched together diff ResourceProfile (59 milliseconds)
[info] - schedule tasks for a barrier taskSet if all tasks can be launched together diff ResourceProfile, but not enough gpus (57 milliseconds)
[info] - cast from array (142 milliseconds)
[info] - schedule tasks for a barrier taskSet if all tasks can be launched together (51 milliseconds)
[info] - SPARK-29263: barrier TaskSet can't schedule when higher prio taskset takes the slots (43 milliseconds)
[info] - cast from map (125 milliseconds)
[info] - cancelTasks shall kill all the running tasks and fail the stage (60 milliseconds)
[info] - killAllTaskAttempts shall kill all the running tasks and not fail the stage (49 milliseconds)
[info] - mark taskset for a barrier stage as zombie in case a task fails (59 milliseconds)
[info] - Scheduler correctly accounts for GPUs per task (49 milliseconds)
[info] - cast from struct (326 milliseconds)
[info] - Scheduler correctly accounts for GPUs per task with fractional amount (121 milliseconds)
[info] - cast struct with a timestamp field (46 milliseconds)
[info] - complex casting (1 millisecond)
[info] - to_unix_timestamp (1 second, 153 milliseconds)
[info] - cast between string and interval (109 milliseconds)
[info] - Scheduler works with multiple ResourceProfiles and gpus (70 milliseconds)
[info] - datediff (169 milliseconds)
[info] - scheduler should keep the decommission state where host was decommissioned (56 milliseconds)
[info] - cast string to boolean (112 milliseconds)
[info] - SPARK-16729 type checking for casting to date type (1 millisecond)
[info] - SPARK-20302 cast with same structure (39 milliseconds)
[info] - test full decommissioning flow (57 milliseconds)
[info] SparkConfSuite:
[info] - Test byteString conversion (2 milliseconds)
[info] - Test timeString conversion (1 millisecond)
[info] - loading from system properties (0 milliseconds)
[info] - initializing without loading defaults (0 milliseconds)
[info] - named set methods (2 milliseconds)
[info] - basic get and set (1 millisecond)
[info] - basic getAllWithPrefix (0 milliseconds)
[info] - creating SparkContext without master and app name (5 milliseconds)
[info] - creating SparkContext without master (2 milliseconds)
[info] - creating SparkContext without app name (2 milliseconds)
[info] - creating SparkContext with both master and app name (78 milliseconds)
[info] - SparkContext property overriding (81 milliseconds)
[info] - nested property names (1 millisecond)
[info] - to_utc_timestamp (512 milliseconds)
[info] - to_utc_timestamp - invalid time zone id (64 milliseconds)
[info] - ceil (1 second, 972 milliseconds)
[info] - from_utc_timestamp (347 milliseconds)
[info] - from_utc_timestamp - invalid time zone id (49 milliseconds)
[info] - Thread safeness - SPARK-5425 (1 second, 3 milliseconds)
[info] - register kryo classes through registerKryoClasses (4 milliseconds)
[info] - register kryo classes through registerKryoClasses and custom registrator (1 millisecond)
[info] - register kryo classes through conf (1 millisecond)
[info] - deprecated configs (3 milliseconds)
[info] - akka deprecated configs (1 millisecond)
[info] - SPARK-13727 (0 milliseconds)
[info] - SPARK-22500: cast for struct should not generate codes beyond 64KB (1 second, 319 milliseconds)
[info] - SPARK-22570: Cast should not create a lot of global variables (2 milliseconds)
[info] - SPARK-17240: SparkConf should be serializable (java) (11 milliseconds)
[info] - SPARK-17240: SparkConf should be serializable (kryo) (4 milliseconds)
[info] - encryption requires authentication (1 millisecond)
[info] - spark.network.timeout should be bigger than spark.executor.heartbeatInterval (1 millisecond)
[info] - SPARK-26998: SSL configuration not needed on executors (1 millisecond)
[info] - SPARK-27244 toDebugString redacts sensitive information (1 millisecond)
[info] - SPARK-28355: Use Spark conf for threshold at which UDFs are compressed by broadcast (1 millisecond)
[info] - SPARK-24337: getSizeAsKb with default throws a useful error message with key name (1 millisecond)
[info] - SPARK-24337: getTimeAsMs throws a useful error message with key name (1 millisecond)
[info] - SPARK-24337: getTimeAsSeconds throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getTimeAsSeconds with default throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getSizeAsBytes with default long throws a useful error message with key name (1 millisecond)
[info] - SPARK-24337: getSizeAsMb throws a useful error message with key name (1 millisecond)
[info] - SPARK-24337: getSizeAsGb throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getSizeAsBytes with default string throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getDouble throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getTimeAsMs with default throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getSizeAsBytes throws a useful error message with key name (1 millisecond)
[info] - SPARK-24337: getSizeAsGb with default throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getInt throws a useful error message with key name (1 millisecond)
[info] - SPARK-24337: getSizeAsMb with default throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getSizeAsKb throws a useful error message with key name (1 millisecond)
[info] - SPARK-24337: getBoolean throws a useful error message with key name (0 milliseconds)
[info] - SPARK-24337: getLong throws a useful error message with key name (1 millisecond)
[info] - get task resource requirement from config (1 millisecond)
[info] - test task resource requirement with 0 amount (0 milliseconds)
[info] - Ensure that we can configure fractional resources for a task (3 milliseconds)
[info] - Non-task resources are never fractional (2 milliseconds)
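The last few SparkConf tests cover task resource requirements, including fractional amounts. A minimal sketch of the configuration style being tested, assuming the standard spark.{executor,task}.resource.*.amount keys:

import org.apache.spark.SparkConf

object TaskResourceConfSketch {
  def main(args: Array[String]): Unit = {
    // Fractional amounts are only legal for task resources, letting
    // several tasks share one GPU; executor amounts stay integral.
    val conf = new SparkConf()
      .set("spark.executor.resource.gpu.amount", "1")
      .set("spark.task.resource.gpu.amount", "0.25")
    println(conf.get("spark.task.resource.gpu.amount"))
  }
}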
[info] - up-cast (32 milliseconds)
[info] ShuffleBlockFetcherIteratorSuite:
[info] - successful 3 local + 4 host local + 2 remote reads (55 milliseconds)
[info] - error during accessing host local dirs for executors (6 milliseconds)
[info] - Hit maxBytesInFlight limitation before maxBlocksInFlightPerAddress (4 milliseconds)
[info] - Hit maxBlocksInFlightPerAddress limitation before maxBytesInFlight (5 milliseconds)
[info] - fetch continuous blocks in batch successful 3 local + 4 host local + 2 remote reads (11 milliseconds)
[info] - fetch continuous blocks in batch should respect maxBytesInFlight (9 milliseconds)
[info] - fetch continuous blocks in batch should respect maxBlocksInFlightPerAddress (6 milliseconds)
[info] - release current unexhausted buffer in case the task completes early (6 milliseconds)
[info] - fail all blocks if any of the remote request fails (6 milliseconds)
[info] - retry corrupt blocks (11 milliseconds)
[info] - big blocks are also checked for corruption (6 milliseconds)
[info] - creating values of DateType via make_date (585 milliseconds)
[info] - ensure big blocks available as a concatenated stream can be read (46 milliseconds)
[info] - retry corrupt blocks (disabled) (5 milliseconds)
[info] - Blocks should be shuffled to disk when size of the request is above the threshold (maxReqSizeShuffleToMem). (43 milliseconds)
[info] - fail zero-size blocks (6 milliseconds)
[info] - SPARK-31521: correct the fetch size when merging blocks into a merged block (1 millisecond)
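The maxBytesInFlight / maxBlocksInFlightPerAddress / shuffle-to-disk behavior covered by ShuffleBlockFetcherIteratorSuite is driven by standard fetch settings; a hedged example (values are illustrative, not recommendations):

    spark.reducer.maxSizeInFlight               48m
    spark.reducer.maxBlocksInFlightPerAddress   100
    spark.network.maxRemoteBlockSizeFetchToMem  200m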
[info] ConfigEntrySuite:
[info] - conf entry: int (1 millisecond)
[info] - conf entry: long (0 milliseconds)
[info] - conf entry: double (1 millisecond)
[info] - conf entry: boolean (1 millisecond)
[info] - conf entry: optional (0 milliseconds)
[info] - conf entry: fallback (0 milliseconds)
[info] - conf entry: time (1 millisecond)
[info] - conf entry: bytes (1 millisecond)
[info] - conf entry: regex (1 millisecond)
[info] - conf entry: string seq (0 milliseconds)
[info] - conf entry: int seq (1 millisecond)
[info] - conf entry: transformation (1 millisecond)
[info] - conf entry: checkValue() (2 milliseconds)
[info] - conf entry: valid values check (1 millisecond)
[info] - conf entry: conversion error (1 millisecond)
[info] - default value handling is null-safe (0 milliseconds)
[info] - variable expansion of spark config entries (1 millisecond)
[info] - conf entry : default function (1 millisecond)
[info] - conf entry: alternative keys (0 milliseconds)
[info] - conf entry: prepend with default separator (0 milliseconds)
[info] - conf entry: prepend with custom separator (0 milliseconds)
[info] - conf entry: prepend with fallback (1 millisecond)
[info] - conf entry: prepend should work only with string type (5 milliseconds)
[info] - onCreate (2 milliseconds)
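ConfigEntrySuite covers Spark's internal typed config builder; a hedged sketch of that private[spark] API, using a made-up key:

    import org.apache.spark.internal.config.ConfigBuilder

    // Hypothetical entry for illustration only; real entries live in
    // org.apache.spark.internal.config.
    val MAX_ATTEMPTS = ConfigBuilder("spark.test.maxAttempts")
      .doc("Illustrative entry only.")
      .intConf
      .checkValue(_ > 0, "must be positive")
      .createWithDefault(3)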
[info] WorkerSuite:
[info] - test isUseLocalNodeSSLConfig (2 milliseconds)
[info] - test maybeUpdateSSLSettings (25 milliseconds)
[info] - test clearing of finishedExecutors (small number of executors) (60 milliseconds)
[info] - SPARK-27671: cast from nested null type in struct (480 milliseconds)
[info] - test clearing of finishedExecutors (more executors) (65 milliseconds)
[info] - test clearing of finishedDrivers (small number of drivers) (148 milliseconds)
[info] - Process Infinity, -Infinity, NaN in a case-insensitive manner (219 milliseconds)
[info] - floor (1 second, 847 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to byte type (610 milliseconds)
[info] - test clearing of finishedDrivers (more drivers) (928 milliseconds)
[info] - worker could be launched without any resources (66 milliseconds)
[info] - worker could load resources from resources file while launching (67 milliseconds)
[info] - worker could load resources from discovery script while launching (69 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to short type (557 milliseconds)
[info] - worker could load resources from resources file and discovery script while launching (76 milliseconds)
[info] - cleanup non-shuffle files after executor exits when config spark.storage.cleanupFilesAfterExecutorExit=true (41 milliseconds)
[info] - don't cleanup non-shuffle files after executor exits when config spark.storage.cleanupFilesAfterExecutorExit=false (30 milliseconds)
[info] - factorial (879 milliseconds)
[info] - WorkDirCleanup cleans app dirs and shuffle metadata when spark.shuffle.service.db.enabled=true (44 milliseconds)
[info] - WorkDirCleanup cleans only app dirs when spark.shuffle.service.db.enabled=false (48 milliseconds)
[info] - creating values of TimestampType via make_timestamp (1 second, 953 milliseconds)
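A hedged illustration of the datetime constructors exercised above, assuming an active SparkSession named spark (exact output depends on the session time zone):

    spark.sql("SELECT make_date(2021, 11, 30)").show()                    // 2021-11-30
    spark.sql("SELECT make_timestamp(2021, 11, 30, 12, 34, 56.7)").show() // 2021-11-30 12:34:56.7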
[info] BlockManagerSuite:
[info] - ISO 8601 week-numbering year (96 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to int type (477 milliseconds)
[info] - rint (634 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to long type (299 milliseconds)
[info] - extract the seconds part with fraction from timestamps (504 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to decimal type (93 milliseconds)
[info] - ANSI mode: disallow type conversions between Numeric types and Timestamp type (3 milliseconds)
[info] - ANSI mode: disallow type conversions between Numeric types and Date type (2 milliseconds)
[info] - ANSI mode: disallow type conversions between Numeric types and Binary type (1 millisecond)
[info] - ANSI mode: disallow type conversions between Datetime types and Boolean types (1 millisecond)
[info] - ANSI mode: disallow casting complex types as String type (10 milliseconds)
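A short sketch of the ANSI-mode behavior these cast tests cover, assuming a SparkSession named spark; with spark.sql.ansi.enabled=true an out-of-range cast raises an error instead of silently returning null:

    spark.conf.set("spark.sql.ansi.enabled", "true")
    // 1000 is out of range for TINYINT (-128..127), so this throws at runtime:
    spark.sql("SELECT CAST(1000 AS TINYINT)").collect()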
[info] - timestamps difference (122 milliseconds)
[info] - cast from invalid string to numeric should throw NumberFormatException (269 milliseconds)
[info] - subtract dates (202 milliseconds)
[info] - to_timestamp exception mode (52 milliseconds)
[info] - Fast fail for cast string type to decimal type in ansi mode (116 milliseconds)
[info] - Consistent error handling for datetime formatting and parsing functions (183 milliseconds)
[info] - SPARK-31896: Handle am-pm timestamp parsing when hour is missing (18 milliseconds)
[info] - exp (665 milliseconds)
[info] - DATE_FROM_UNIX_DATE (274 milliseconds)
[info] - UNIX_DATE (61 milliseconds)
[info] - UNIX_SECONDS (165 milliseconds)
[info] - UNIX_MILLIS (146 milliseconds)
[info] - expm1 (740 milliseconds)
[info] - UNIX_MICROS (134 milliseconds)
[info] - TIMESTAMP_SECONDS (805 milliseconds)
[info] - signum (860 milliseconds)
[info] - TIMESTAMP_MILLIS (341 milliseconds)
[info] - log (621 milliseconds)
[info] - TIMESTAMP_MICROS (369 milliseconds)
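An illustrative round trip through the epoch-conversion SQL functions listed above, again assuming a SparkSession named spark:

    spark.sql("SELECT timestamp_seconds(1230219000)").show()                  // 2008-12-25 07:30:00 in UTC
    spark.sql("SELECT unix_seconds(TIMESTAMP'1970-01-01 00:00:01Z')").show()  // 1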
[info] - SPARK-33498: GetTimestamp, UnixTimestamp, ToUnixTimestamp with parseError (333 milliseconds)
[info] - log10 (583 milliseconds)
[info] - log1p (583 milliseconds)
[info] - bin (1 second, 9 milliseconds)
[info] BasicDriverFeatureStepSuite:
[info] - log2 (533 milliseconds)
[info] - Check the pod respects all configurations from the user. (633 milliseconds)
[info] - Check driver pod respects kubernetes driver request cores (26 milliseconds)
[info] - Check appropriate entrypoint rerouting for various bindings (15 milliseconds)
[info] - memory overhead factor: java (7 milliseconds)
[info] - memory overhead factor: python default (7 milliseconds)
[info] - memory overhead factor: python w/ override (7 milliseconds)
[info] - memory overhead factor: r default (6 milliseconds)
[info] - SPARK-35493: allow spark.blockManager.port to be used as a fallback in the driver pod (12 milliseconds)
[info] EnvSecretsFeatureStepSuite:
[info] - sets up all keyRefs (10 milliseconds)
[info] ExecutorPodsPollingSnapshotSourceSuite:
[info] - sqrt (541 milliseconds)
[info] - pow (412 milliseconds)
[info] - shift left (450 milliseconds)
[info] - Items returned by the API should be pushed to the event queue (61 milliseconds)
[info] ExecutorPodsSnapshotSuite:
[info] - States are interpreted correctly from pod metadata. (35 milliseconds)
[info] - SPARK-30821: States are interpreted correctly from pod metadata when configured to check all containers. (12 milliseconds)
[info] - Updates add new pods for non-matching ids and edit existing pods for matching ids (5 milliseconds)
[info] ExecutorKubernetesCredentialsFeatureStepSuite:
[info] - configure spark pod with executor service account (6 milliseconds)
[info] - configure spark pod with driver service account and without executor service account (2 milliseconds)
[info] - configure spark pod with driver service account and with executor service account (2 milliseconds)
[info] DriverKubernetesCredentialsFeatureStepSuite:
[info] - shift right (464 milliseconds)
[info] - Don't set any credentials (16 milliseconds)
[info] - Only set credentials that are manually mounted. (5 milliseconds)
[info] - Mount credentials from the submission client as a secret. (89 milliseconds)
[info] PodTemplateConfigMapStepSuite:
[info] - Do nothing when executor template is not specified (3 milliseconds)
[info] - Mounts executor template volume if config specified (26 milliseconds)
[info] KubernetesExecutorBuilderSuite:
[info] - shift right unsigned (380 milliseconds)
[info] - use empty initial pod if template is not specified (282 milliseconds)
[info] - load pod template if specified (135 milliseconds)
[info] - complain about misconfigured pod template (29 milliseconds)
[info] KubernetesConfSuite:
[info] - Resolve driver labels, annotations, secret mount paths, envs, and memory overhead (5 milliseconds)
[info] - Basic executor translated fields. (1 millisecond)
[info] - resource profile not default. (1 millisecond)
[info] - Image pull secrets. (1 millisecond)
[info] - Set executor labels, annotations, and secrets (3 milliseconds)
[info] - Verify that executorEnv key conforms to the regular specification (3 milliseconds)
[info] BasicExecutorFeatureStepSuite:
[info] - test spark resource missing vendor (40 milliseconds)
[info] - test spark resource missing amount (4 milliseconds)
[info] - hex (341 milliseconds)
[info] - basic executor pod with resources (43 milliseconds)
[info] - basic executor pod has reasonable defaults (38 milliseconds)
[info] - executor pod hostnames get truncated to 63 characters (32 milliseconds)
[info] - unhex (124 milliseconds)
[info] - hostname truncation generates valid host names (66 milliseconds)
[info] - classpath and extra java options get translated into environment variables (36 milliseconds)
[info] - SPARK-32655 Support appId/execId placeholder in SPARK_EXECUTOR_DIRS (31 milliseconds)
[info] - test executor pyspark memory (27 milliseconds)
[info] - auth secret propagation (34 milliseconds)
[info] - Auth secret shouldn't propagate if files are loaded. (53 milliseconds)
[info] - SPARK-32661 test executor offheap memory (31 milliseconds)
[info] - basic resourceprofile (32 milliseconds)
[info] - resourceprofile with gpus (27 milliseconds)
[info] - Verify spark conf dir is mounted as configmap volume on executor pod's container. (26 milliseconds)
[info] - SPARK-35482: use correct block manager port for executor pods (29 milliseconds)
[info] KubernetesVolumeUtilsSuite:
[info] - Parses hostPath volumes correctly (5 milliseconds)
[info] - Parses subPath correctly (3 milliseconds)
[info] - Parses persistentVolumeClaim volumes correctly (5 milliseconds)
[info] - Parses emptyDir volumes correctly (3 milliseconds)
[info] - Parses emptyDir volumes when options are omitted (2 milliseconds)
[info] - Defaults optional readOnly to false (1 millisecond)
[info] - Fails on missing mount key (3 milliseconds)
[info] - Fails on missing option key (3 milliseconds)
[info] - SPARK-33063: Fails on missing option key in persistentVolumeClaim (2 milliseconds)
[info] - Parses read-only nfs volumes correctly (2 milliseconds)
[info] - Parses read/write nfs volumes correctly (1 millisecond)
[info] - Fails on missing path option (1 millisecond)
[info] - Fails on missing server option (2 milliseconds)
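The volume specs parsed by KubernetesVolumeUtilsSuite come from configuration keys of the following shape; a hedged sketch in which the volume name "myvol" and the paths are made up:

    spark.kubernetes.executor.volumes.hostPath.myvol.mount.path      /data
    spark.kubernetes.executor.volumes.hostPath.myvol.mount.readOnly  false
    spark.kubernetes.executor.volumes.hostPath.myvol.options.path    /tmp/data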
[info] KubernetesClusterSchedulerBackendSuite:
[info] - hypot (1 second, 179 milliseconds)
[info] - Start all components (4 milliseconds)
[info] - Stop all components (11 milliseconds)
[info] - Remove executor (108 milliseconds)
[info] - Kill executors (94 milliseconds)
[info] - SPARK-34407: CoarseGrainedSchedulerBackend.stop may throw SparkException (6 milliseconds)
[info] - SPARK-34469: Ignore RegisterExecutor when SparkContext is stopped (4 milliseconds)
[info] KubernetesDriverBuilderSuite:
[info] - use empty initial pod if template is not specified (62 milliseconds)
[info] - load pod template if specified (29 milliseconds)
[info] - complain about misconfigured pod template (4 milliseconds)
[info] LocalDirsFeatureStepSuite:
[info] - Resolve to default local dir if neither env nor configuration are set (3 milliseconds)
[info] - Use configured local dirs split on comma if provided. (3 milliseconds)
[info] - Use tmpfs to back default local dir (1 millisecond)
[info] - local dir on mounted volume (2 milliseconds)
[info] ExecutorPodsWatchSnapshotSourceSuite:
[info] - Watch events should be pushed to the snapshots store as snapshot updates. (2 milliseconds)
[info] ExecutorPodsAllocatorSuite:
[info] - Initially request executors in batches. Do not request another batch if the first has not finished. (30 milliseconds)
[info] - ANSI mode: cast string to timestamp with parse error (9 seconds, 657 milliseconds)
[info] - Request executors in batches. Allow another batch to be requested if all pending executors start running. (14 milliseconds)
[info] - When a current batch reaches error states immediately, re-request them on the next batch. (8 milliseconds)
[info] - SPARK-26218: Fix the corner case of codegen when casting float to Integer (28 milliseconds)
[info] - When an executor is requested but the API does not report it in a reasonable time, retry requesting that executor. (7 milliseconds)
[info] - SPARK-28487: scale up and down on target executor count changes (28 milliseconds)
[info] - SPARK-34334: correctly identify timed out pending pod requests as excess (9 milliseconds)
[info] SameResultSuite:
[info] - relations (4 milliseconds)
[info] - projections (21 milliseconds)
[info] - filters (9 milliseconds)
[info] - SPARK-33099: Respect executor idle timeout configuration (9 milliseconds)
[info] - SPARK-34361: scheduler backend known pods with multiple resource profiles at downscaling (22 milliseconds)
[info] - sorts (14 milliseconds)
[info] - union (16 milliseconds)
[info] - hint (12 milliseconds)
[info] - join hint (18 milliseconds)
[info] - SPARK-33288: multiple resource profiles (26 milliseconds)
[info] - SPARK-33262: pod allocator does not stall with pending pods (11 milliseconds)
[info] - print the pod name instead of Some(name) if pod is absent (3 milliseconds)
[info] ExecutorPodsSnapshotsStoreSuite:
[info] - Subscribers get notified of events periodically. (7 milliseconds)
[info] - Even without sending events, initially receive an empty buffer. (2 milliseconds)
[info] - Replacing the snapshot passes the new snapshot to subscribers. (2 milliseconds)
[info] ExecutorPodsLifecycleManagerSuite:
[info] - When an executor reaches error states immediately, remove from the scheduler backend. (16 milliseconds)
[info] - Don't remove executors twice from Spark but remove from K8s repeatedly. (4 milliseconds)
[info] - When the scheduler backend lists executor ids that aren't present in the cluster, remove those executors from Spark. (4 milliseconds)
[info] UnsupportedOperationsSuite:
[info] - Keep executor pods in k8s if configured. (4 milliseconds)
[info] HadoopConfDriverFeatureStepSuite:
[info] - batch plan - local relation: supported (5 milliseconds)
[info] - mount hadoop config map if defined (2 milliseconds)
[info] - batch plan - streaming source: not supported (7 milliseconds)
[info] - batch plan - select on streaming source: not supported (1 millisecond)
[info] - create hadoop config map if config dir is defined (6 milliseconds)
[info] KubernetesClientUtilsSuite:
[info] - streaming plan - no streaming source (7 milliseconds)
[info] - streaming plan - commands: not supported (4 milliseconds)
[info] - streaming plan - aggregate - multiple batch aggregations: supported (5 milliseconds)
[info] - streaming plan - aggregate - multiple aggregations but only one streaming aggregation: supported (1 millisecond)
[info] - streaming plan - aggregate - multiple streaming aggregations: not supported (3 milliseconds)
[info] - streaming plan - aggregate - streaming aggregations in update mode: supported (0 milliseconds)
[info] - streaming plan - aggregate - streaming aggregations in complete mode: supported (0 milliseconds)
[info] - streaming plan - aggregate - streaming aggregations with watermark in append mode: supported (1 millisecond)
[info] - streaming plan - aggregate - streaming aggregations without watermark in append mode: not supported (3 milliseconds)
[info] - streaming plan - distinct aggregate - aggregate on batch relation: supported (1 millisecond)
[info] - verify load files loads only the allowed files, not the disallowed ones. (26 milliseconds)
[info] - streaming plan - distinct aggregate - aggregate on streaming relation: not supported (2 milliseconds)
[info] - batch plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on batch relation: supported (0 milliseconds)
[info] - batch plan - flatMapGroupsWithState - multiple flatMapGroupsWithState(Append)s on batch relation: supported (0 milliseconds)
[info] - batch plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on batch relation: supported (1 millisecond)
[info] - batch plan - flatMapGroupsWithState - multiple flatMapGroupsWithState(Update)s on batch relation: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in update mode: supported (1 millisecond)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in append mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in complete mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Append mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Update mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Complete mode: not supported (3 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation without aggregation in append mode: supported (1 millisecond)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation without aggregation in update mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Append mode: supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Update mode: supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Complete mode: supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation after aggregation in Append mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation after aggregation in Update mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation in complete mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on batch relation inside streaming relation in Append output mode: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on batch relation inside streaming relation in Update output mode: supported (1 millisecond)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on batch relation inside streaming relation in Append output mode: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on batch relation inside streaming relation in Update output mode: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - multiple flatMapGroupsWithStates on streaming relation and all are in append mode: supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - multiple flatMapGroupsWithStates on a streaming relation but some are not in append mode: not supported (2 milliseconds)
[info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation without aggregation in append mode: not supported (2 milliseconds)
[info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation without aggregation in complete mode: not supported (2 milliseconds)
[info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation with aggregation in Append mode: not supported (2 milliseconds)
[info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation with aggregation in Update mode: not supported (2 milliseconds)
[info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation with aggregation in Complete mode: not supported (1 millisecond)
[info] - streaming plan - mapGroupsWithState - multiple mapGroupsWithStates on streaming relation and all are in append mode: not supported (1 millisecond)
[info] - streaming plan - mapGroupsWithState - mixing mapGroupsWithStates and flatMapGroupsWithStates on streaming relation: not supported (1 millisecond)
[info] - streaming plan - mapGroupsWithState - mapGroupsWithState with event time timeout without watermark: not supported (2 milliseconds)
[info] - streaming plan - mapGroupsWithState - mapGroupsWithState with event time timeout with watermark: supported (1 millisecond)
[info] - streaming plan - Deduplicate - Deduplicate on streaming relation before aggregation: supported (1 millisecond)
[info] - streaming plan - Deduplicate - Deduplicate on streaming relation after aggregation: not supported (2 milliseconds)
[info] - streaming plan - Deduplicate - Deduplicate on batch relation inside a streaming query: supported (1 millisecond)
[info] - streaming plan - single inner join in append mode with stream-stream relations: supported (1 millisecond)
[info] - streaming plan - single inner join in append mode with stream-batch relations: supported (0 milliseconds)
[info] - streaming plan - single inner join in append mode with batch-stream relations: supported (1 millisecond)
[info] - streaming plan - single inner join in append mode with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - multiple inner joins in append mode with stream-stream relations: supported (0 milliseconds)
[info] - streaming plan - multiple inner joins in append mode with stream-batch relations: supported (0 milliseconds)
[info] - streaming plan - multiple inner joins in append mode with batch-stream relations: supported (0 milliseconds)
[info] - streaming plan - multiple inner joins in append mode with batch-batch relations: supported (1 millisecond)
[info] - streaming plan - inner join in update mode with stream-stream relations: not supported (2 milliseconds)
[info] - streaming plan - inner join in update mode with stream-batch relations: supported (0 milliseconds)
[info] - streaming plan - inner join in update mode with batch-stream relations: supported (0 milliseconds)
[info] - streaming plan - inner join in update mode with batch-batch relations: supported (1 millisecond)
[info] - streaming plan - FullOuter join with stream-stream relations: not supported (8 milliseconds)
[info] - streaming plan - FullOuter join with stream-batch relations: not supported (1 millisecond)
[info] - streaming plan - FullOuter join with batch-stream relations: not supported (2 milliseconds)
[info] - streaming plan - FullOuter join with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - LeftOuter join with stream-stream relations: not supported (1 millisecond)
[info] - streaming plan - LeftOuter join with stream-batch relations: supported (1 millisecond)
[info] - streaming plan - LeftOuter join with batch-stream relations: not supported (1 millisecond)
[info] - streaming plan - LeftOuter join with batch-batch relations: supported (1 millisecond)
[info] - streaming plan - LeftSemi join with stream-stream relations: not supported (1 millisecond)
[info] - streaming plan - LeftSemi join with stream-batch relations: supported (1 millisecond)
[info] - streaming plan - LeftSemi join with batch-stream relations: not supported (2 milliseconds)
[info] - streaming plan - LeftSemi join with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - LeftAnti join with stream-stream relations: not supported (1 millisecond)
[info] - streaming plan - LeftAnti join with stream-batch relations: supported (0 milliseconds)
[info] - streaming plan - LeftAnti join with batch-stream relations: not supported (1 millisecond)
[info] - streaming plan - LeftAnti join with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - RightOuter join with stream-stream relations: not supported (1 millisecond)
[info] - streaming plan - RightOuter join with stream-batch relations: not supported (1 millisecond)
[info] - streaming plan - RightOuter join with batch-stream relations: supported (0 milliseconds)
[info] - streaming plan - RightOuter join with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - LeftOuter join with stream-stream relations and update mode: not supported (2 milliseconds)
[info] - streaming plan - LeftOuter join with stream-stream relations and complete mode: not supported (1 millisecond)
[info] - streaming plan - LeftOuter join with stream-stream relations and join on attribute with left watermark: supported (3 milliseconds)
[info] - streaming plan - LeftOuter join with stream-stream relations and join on attribute with right watermark: supported (0 milliseconds)
[info] - streaming plan - LeftOuter join with stream-stream relations and join on non-watermarked attribute: not supported (1 millisecond)
[info] - streaming plan - LeftOuter join with stream-stream relations and state value watermark: supported (17 milliseconds)
[info] - streaming plan - LeftOuter join with stream-stream relations and state value watermark: not supported (3 milliseconds)
[info] - streaming plan - RightOuter join with stream-stream relations and update mode: not supported (1 millisecond)
[info] - streaming plan - RightOuter join with stream-stream relations and complete mode: not supported (1 millisecond)
[info] - streaming plan - RightOuter join with stream-stream relations and join on attribute with left watermark: supported (0 milliseconds)
[info] - streaming plan - RightOuter join with stream-stream relations and join on attribute with right watermark: supported (1 millisecond)
[info] - streaming plan - RightOuter join with stream-stream relations and join on non-watermarked attribute: not supported (2 milliseconds)
[info] - streaming plan - RightOuter join with stream-stream relations and state value watermark: supported (2 milliseconds)
[info] - streaming plan - RightOuter join with stream-stream relations and state value watermark: not supported (3 milliseconds)
[info] - streaming plan - FullOuter join with stream-stream relations and update mode: not supported (2 milliseconds)
[info] - atan2 (1 second, 113 milliseconds)
[info] - streaming plan - FullOuter join with stream-stream relations and complete mode: not supported (1 millisecond)
[info] - streaming plan - FullOuter join with stream-stream relations and join on attribute with left watermark: supported (0 milliseconds)
[info] - streaming plan - FullOuter join with stream-stream relations and join on attribute with right watermark: supported (1 millisecond)
[info] - streaming plan - FullOuter join with stream-stream relations and join on non-watermarked attribute: not supported (1 millisecond)
[info] - streaming plan - FullOuter join with stream-stream relations and state value watermark: supported (2 milliseconds)
[info] - streaming plan - FullOuter join with stream-stream relations and state value watermark: not supported (4 milliseconds)
[info] - streaming plan - LeftSemi join with stream-stream relations and update mode: not supported (1 millisecond)
[info] - streaming plan - LeftSemi join with stream-stream relations and complete mode: not supported (1 millisecond)
[info] - streaming plan - LeftSemi join with stream-stream relations and join on attribute with left watermark: supported (1 millisecond)
[info] - streaming plan - LeftSemi join with stream-stream relations and join on attribute with right watermark: supported (1 millisecond)
[info] - streaming plan - LeftSemi join with stream-stream relations and join on non-watermarked attribute: not supported (2 milliseconds)
[info] - streaming plan - LeftSemi join with stream-stream relations and state value watermark: supported (2 milliseconds)
[info] - streaming plan - LeftSemi join with stream-stream relations and state value watermark: not supported (2 milliseconds)
[info] - Global watermark limit - single Inner join in Append mode (1 millisecond)
[info] - Global watermark limit - streaming aggregation after stream-stream Inner join in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream Inner after stream-stream Inner join in Append mode (1 millisecond)
[info] - Global watermark limit - stream-stream LeftOuter after stream-stream Inner join in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream RightOuter after stream-stream Inner join in Append mode (0 milliseconds)
[info] - Global watermark limit - FlatMapGroupsWithState after stream-stream Inner join in Append mode (1 millisecond)
[info] - Global watermark limit - deduplicate after stream-stream Inner join in Append mode (0 milliseconds)
[info] - Global watermark limit - single LeftOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - streaming aggregation after stream-stream LeftOuter join in Append mode (1 millisecond)
[info] - Global watermark limit - stream-stream Inner after stream-stream LeftOuter join in Append mode (1 millisecond)
[info] - Global watermark limit - stream-stream LeftOuter after stream-stream LeftOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream RightOuter after stream-stream LeftOuter join in Append mode (1 millisecond)
[info] - Global watermark limit - FlatMapGroupsWithState after stream-stream LeftOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - deduplicate after stream-stream LeftOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - single RightOuter join in Append mode (1 millisecond)
[info] - Global watermark limit - streaming aggregation after stream-stream RightOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream Inner after stream-stream RightOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream LeftOuter after stream-stream RightOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream RightOuter after stream-stream RightOuter join in Append mode (1 millisecond)
[info] - Global watermark limit - FlatMapGroupsWithState after stream-stream RightOuter join in Append mode (0 milliseconds)
[info] - Global watermark limit - deduplicate after stream-stream RightOuter join in Append mode (0 milliseconds)
[info] - streaming plan - cogroup with stream-stream relations: not supported (5 milliseconds)
[info] - streaming plan - cogroup with stream-batch relations: not supported (4 milliseconds)
[info] - streaming plan - cogroup with batch-stream relations: not supported (3 milliseconds)
[info] - streaming plan - cogroup with batch-batch relations: supported (1 millisecond)
[info] - streaming plan - union with stream-stream relations: supported (2 milliseconds)
[info] - streaming plan - union with stream-batch relations: not supported (2 milliseconds)
[info] - streaming plan - union with batch-stream relations: not supported (2 milliseconds)
[info] - streaming plan - union with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - except with stream-stream relations: not supported (1 millisecond)
[info] - streaming plan - except with stream-batch relations: supported (1 millisecond)
[info] - streaming plan - except with batch-stream relations: not supported (1 millisecond)
[info] - streaming plan - except with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - intersect with stream-stream relations: not supported (2 milliseconds)
[info] - streaming plan - intersect with stream-batch relations: supported (1 millisecond)
[info] - streaming plan - intersect with batch-stream relations: supported (0 milliseconds)
[info] - streaming plan - intersect with batch-batch relations: supported (0 milliseconds)
[info] - streaming plan - sort with stream relation: not supported (2 milliseconds)
[info] - streaming plan - sort with batch relation: supported (0 milliseconds)
[info] - streaming plan - sort - sort after aggregation in Complete output mode: supported (1 millisecond)
[info] - streaming plan - sort - sort before aggregation in Complete output mode: not supported (1 millisecond)
[info] - streaming plan - sort - sort over aggregated data in Update output mode: not supported (2 milliseconds)
[info] - streaming plan - sample with stream relation: not supported (1 millisecond)
[info] - streaming plan - sample with batch relation: supported (0 milliseconds)
[info] - streaming plan - window with stream relation: not supported (3 milliseconds)
[info] - streaming plan - window with batch relation: supported (0 milliseconds)
[info] - streaming plan - Append output mode - aggregation: not supported (1 millisecond)
[info] - streaming plan - Append output mode - no aggregation: supported (0 milliseconds)
[info] - streaming plan - Update output mode - aggregation: supported (1 millisecond)
[info] - streaming plan - Update output mode - no aggregation: supported (0 milliseconds)
[info] - streaming plan - Complete output mode - aggregation: supported (1 millisecond)
[info] - streaming plan - Complete output mode - no aggregation: not supported (1 millisecond)
[info] - streaming plan - MonotonicallyIncreasingID: not supported (4 milliseconds)
[info] - continuous processing - TypedFilter: supported (6 milliseconds)
[info] - Global watermark limit - single streaming aggregation in Append mode (1 millisecond)
[info] - Global watermark limit - chained streaming aggregations in Append mode (0 milliseconds)
[info] - Global watermark limit - Inner join after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - LeftOuter join after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - RightOuter join after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - deduplicate after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - FlatMapGroupsWithState after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - single FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - streaming aggregation after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream Inner after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream LeftOuter after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream RightOuter after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - FlatMapGroupsWithState after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - deduplicate after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - streaming aggregation after deduplicate in Append mode (0 milliseconds)
[info] - Global watermark limit - Inner join after deduplicate in Append mode (0 milliseconds)
[info] - Global watermark limit - LeftOuter join after deduplicate in Append mode (0 milliseconds)
[info] - Global watermark limit - RightOuter join after deduplicate in Append mode (0 milliseconds)
[info] - Global watermark limit - FlatMapGroupsWithState after deduplicate in Append mode (0 milliseconds)
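A hedged sketch of one "not supported" shape from the suite above: sorting a streaming Dataset before aggregation is rejected when the query starts. Assumes a SparkSession named spark; the rate source is used only to obtain a streaming relation:

    val stream = spark.readStream.format("rate").load()  // columns: timestamp, value
    // Sort on an unaggregated streaming relation: the unsupported-operation
    // check raises an AnalysisException when the query is started, not here.
    val bad = stream.orderBy("timestamp").groupBy("value").count()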
[info] V2OverwriteByExpressionANSIAnalysisSuite:
[info] - SPARK-33136: output resolved on complex types for V2 write commands (638 milliseconds)
[info] - skipSchemaResolution should still require query to be resolved (10 milliseconds)
[info] - byName: basic behavior (32 milliseconds)
[info] - byName: does not match by position (30 milliseconds)
[info] - byName: case sensitive column resolution (19 milliseconds)
[info] - byName: case insensitive column resolution (29 milliseconds)
[info] - byName: data columns are reordered by name (24 milliseconds)
[info] - byName: fail nullable data written to required columns (18 milliseconds)
[info] - byName: allow required data written to nullable columns (19 milliseconds)
[info] - byName: missing required columns cause failure and are identified by name (18 milliseconds)
[info] - byName: missing optional columns cause failure and are identified by name (18 milliseconds)
[info] - byName: insert safe cast (23 milliseconds)
[info] - byName: fail extra data fields (20 milliseconds)
[info] - byPosition: basic behavior (23 milliseconds)
[info] - byPosition: data columns are not reordered (22 milliseconds)
[info] - byPosition: fail nullable data written to required columns (18 milliseconds)
[info] - byPosition: allow required data written to nullable columns (19 milliseconds)
[info] - byPosition: missing required columns cause failure (19 milliseconds)
[info] - byPosition: missing optional columns cause failure (16 milliseconds)
[info] - byPosition: insert safe cast (20 milliseconds)
[info] - byPosition: fail extra data fields (17 milliseconds)
[info] - bypass output column resolution (40 milliseconds)
[info] - check fields of struct type column (43 milliseconds)
[info] - delete expression is resolved using table fields (22 milliseconds)
[info] - delete expression is not resolved using query fields (55 milliseconds)
[info] CodeFormatterSuite:
[info] - removing overlapping comments (1 millisecond)
[info] - removing extra new lines and comments (0 milliseconds)
[info] - basic example (4 milliseconds)
[info] - nested example (0 milliseconds)
[info] - single line (0 milliseconds)
[info] - if else on the same line (1 millisecond)
[info] - function calls (0 milliseconds)
[info] - function calls with maxLines=0 (0 milliseconds)
[info] - function calls with maxLines=2 (1 millisecond)
[info] - single line comments (1 millisecond)
[info] - binary log (1 second, 376 milliseconds)
[info] - single line comments /* */  (4 milliseconds)
[info] - multi-line comments (1 millisecond)
[info] - reduce empty lines (1 millisecond)
[info] - comment place holder (2 milliseconds)
[info] AnsiCastSuiteWithAnsiModeOn:
[info] - verify load files truncates the content to maxSize when the number of keys is very large. (3 seconds, 195 milliseconds)
[info] - verify load files truncates the content to maxSize when keys are equal in length. (7 milliseconds)
[info] MountVolumesFeatureStepSuite:
[info] - Mounts hostPath volumes (3 milliseconds)
[info] - Mounts persistentVolumeClaims (10 milliseconds)
[info] - SPARK-32713 Mounts parameterized persistentVolumeClaims in executors (3 milliseconds)
[info] - Create and mounts persistentVolumeClaims in driver (1 millisecond)
[info] - Create and mount persistentVolumeClaims in executors (2 milliseconds)
[info] - Mounts emptyDir (6 milliseconds)
[info] - Mounts emptyDir with no options (2 milliseconds)
[info] - Mounts read/write nfs volumes (4 milliseconds)
[info] - Mounts read-only nfs volumes (2 milliseconds)
[info] - Mounts multiple volumes (2 milliseconds)
[info] - mountPath should be unique (2 milliseconds)
[info] - Mounts subpath on emptyDir (2 milliseconds)
[info] - Mounts subpath on persistentVolumeClaims (2 milliseconds)
[info] - Mounts multiple subpaths (3 milliseconds)
[info] ClientSuite:
[info] - The client should configure the pod using the builder. (15 milliseconds)
[info] - The client should create Kubernetes resources (5 milliseconds)
[info] - All files from SPARK_CONF_DIR that are within the size limit, except templates, spark config and binary files, should be populated into the pod's configMap. (20 milliseconds)
[info] - Waiting for app completion should stall on the watcher (2 milliseconds)
[info] K8sSubmitOpSuite:
[info] - List app status (8 milliseconds)
[info] - List status for multiple apps with glob (4 milliseconds)
[info] - Kill app (4 milliseconds)
[info] - Kill app with gracePeriod (2 milliseconds)
[info] - Kill multiple apps with glob without gracePeriod (4 milliseconds)
[info] KerberosConfDriverFeatureStepSuite:
[info] - mount krb5 config map if defined (37 milliseconds)
[info] - create krb5.conf config map if local config provided (34 milliseconds)
[info] - create keytab secret if client keytab file used (23 milliseconds)
[info] - do nothing if container-local keytab used (14 milliseconds)
[info] - mount delegation tokens if provided (17 milliseconds)
[info] - create delegation tokens if needed (50 milliseconds)
[info] - do nothing if no config and no tokens (31 milliseconds)
[info] MountSecretsFeatureStepSuite:
[info] - mounts all given secrets (3 milliseconds)
[info] DriverServiceFeatureStepSuite:
[info] - Headless service has a port for the driver RPC, the block manager and the driver UI. (5 milliseconds)
[info] - Hostname and ports are set according to the service name. (1 millisecond)
[info] - Ports should resolve to defaults in SparkConf and in the service. (1 millisecond)
[info] - Long prefixes should switch to using a generated unique name. (12 milliseconds)
[info] - Disallow bind address and driver host to be set explicitly. (2 milliseconds)
[info] DriverCommandFeatureStepSuite:
[info] - java resource (2 milliseconds)
[info] - python resource (4 milliseconds)
[info] - python executable precedence (5 milliseconds)
[info] - R resource (1 millisecond)
[info] - SPARK-25355: java resource args with proxy-user (1 millisecond)
[info] - SPARK-25355: python resource args with proxy-user (2 milliseconds)
[info] - SPARK-25355: R resource args with proxy-user (1 millisecond)
[info] KubernetesUtilsSuite:
[info] - Selects the given container as spark container. (2 milliseconds)
[info] - Selects the first container if no container name is given. (1 millisecond)
[info] - Falls back to the first container if given container name does not exist. (2 milliseconds)
[info] - constructs spark pod correctly with pod template with no containers (1 millisecond)
[info] - SPARK-32091: count failures from active executors when remove rdd/broadcast/shuffle (15 seconds, 28 milliseconds)
[info] - null cast (3 seconds, 39 milliseconds)
[info] - cast string to date (118 milliseconds)
[info] - round/bround (4 seconds, 332 milliseconds)
[info] - SPARK-37388: width_bucket (744 milliseconds)
[info] KafkaHadoopDelegationTokenManagerSuite:
[info] - default configuration (318 milliseconds)
[info] KafkaConfigUpdaterSuite:
[info] - set should always set value (13 milliseconds)
[info] - setIfUnset without existing key should set value (2 milliseconds)
[info] - setIfUnset with existing key should not set value (3 milliseconds)
[info] - setAuthenticationConfigIfNeeded with global security should not set values (1 second, 200 milliseconds)
[info] - setAuthenticationConfigIfNeeded with token should set values (55 milliseconds)
[info] - setAuthenticationConfigIfNeeded with token should not override user-defined protocol (32 milliseconds)
[info] - setAuthenticationConfigIfNeeded with invalid mechanism should throw exception (23 milliseconds)
[info] - setAuthenticationConfigIfNeeded without security should not set values (17 milliseconds)
[info] KafkaTokenUtilSuite:
[info] - checkProxyUser with proxy current user should throw exception (81 milliseconds)
[info] - createAdminClientProperties with SASL_PLAINTEXT protocol should not include keystore and truststore config (6 milliseconds)
[info] - createAdminClientProperties with SASL_SSL protocol should include truststore config (4 milliseconds)
[info] - createAdminClientProperties with SSL protocol should include keystore and truststore config (5 milliseconds)
[info] - createAdminClientProperties with global config should not set dynamic jaas config (10 milliseconds)
[info] - createAdminClientProperties with keytab should set keytab dynamic jaas config (10 milliseconds)
[info] - createAdminClientProperties without keytab should set ticket cache dynamic jaas config (7 milliseconds)
[info] - createAdminClientProperties with specified params should include it (2 milliseconds)
[info] - isGlobalJaasConfigurationProvided without global config should return false (4 milliseconds)
[info] - isGlobalJaasConfigurationProvided with global config should return false (2 milliseconds)
[info] - findMatchingTokenClusterConfig without token should return None (17 milliseconds)
[info] - findMatchingTokenClusterConfig with non-matching tokens should return None (16 milliseconds)
[info] - findMatchingTokenClusterConfig with one matching token should return token and cluster configuration (17 milliseconds)
[info] - findMatchingTokenClusterConfig with multiple matching tokens should throw exception (23 milliseconds)
[info] - getTokenJaasParams with token should return scram module (17 milliseconds)
[info] - needTokenUpdate without cluster config should return false (4 milliseconds)
[info] - needTokenUpdate without jaas config should return false (2 milliseconds)
[info] - needTokenUpdate with same token should return false (17 milliseconds)
[info] - needTokenUpdate with different token should return true (24 milliseconds)
[info] KafkaRedactionUtilSuite:
[info] - redactParams shouldn't throw exception when no SparkEnv available (10 milliseconds)
[info] - redactParams should give back empty parameters (3 milliseconds)
[info] - redactParams should give back null value (2 milliseconds)
[info] - redactParams should redact non String parameters (6 milliseconds)
[info] - redactParams should redact token password from parameters (18 milliseconds)
[info] - redactParams should redact passwords from parameters (3 milliseconds)
[info] - redactJaasParam should give back null (1 millisecond)
[info] - redactJaasParam should give back empty string (1 millisecond)
[info] - redactJaasParam should redact token password (15 milliseconds)
[info] KafkaTokenSparkConfSuite:
[info] - getClusterConfig should throw exception when entry does not exist (3 milliseconds)
[info] - getClusterConfig should return entry with defaults (2 milliseconds)
[info] - getClusterConfig should return entry overwrite defaults (2 milliseconds)
[info] - getClusterConfig should return specified kafka params (1 millisecond)
[info] - getAllClusterConfigs should return empty list when nothing configured (4 milliseconds)
[info] - getAllClusterConfigs should return empty list with malformed configuration (2 milliseconds)
[info] - getAllClusterConfigs should return multiple entries (3 milliseconds)
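A hedged example of the per-cluster delegation-token configuration these Kafka suites parse; the cluster identifier "myCluster" and the broker addresses are placeholders:

    spark.kafka.clusters.myCluster.auth.bootstrap.servers  broker1:9092,broker2:9092
    spark.kafka.clusters.myCluster.security.protocol       SASL_SSL
    spark.kafka.clusters.myCluster.sasl.token.mechanism    SCRAM-SHA-512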
[info] ResourceRequestHelperSuite:
[info] KinesisInputDStreamBuilderSuite:
[info] - should raise an exception if the StreamingContext is missing (60 milliseconds)
[info] - cast string to timestamp (7 seconds, 94 milliseconds)
[info] - should raise an exception if the stream name is missing (9 milliseconds)
[info] - should raise an exception if the checkpoint app name is missing (2 milliseconds)
[info] - empty SparkConf should be valid (239 milliseconds)
[info] - just normal resources are defined (6 milliseconds)
[info] - get yarn resources from configs (23 milliseconds)
[info] - get invalid yarn resources from configs (34 milliseconds)
[info] - cast from boolean (143 milliseconds)
[info] - valid request: value with unit (194 milliseconds)
[info] - valid request: value without unit (3 milliseconds)
[info] - valid request: multiple resources (3 milliseconds)
[info] - invalid request: value does not match pattern (4 milliseconds)
[info] - invalid request: only unit defined (3 milliseconds)
[info] - invalid request: invalid unit (4 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.memory.amount (3 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.memory-mb.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.mb.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.cores.amount (1 millisecond)
[info] - disallowed resource request: spark.yarn.executor.resource.vcores.amount (1 millisecond)
[info] - disallowed resource request: spark.yarn.am.resource.memory.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.driver.resource.memory.amount (1 millisecond)
[info] - disallowed resource request: spark.yarn.am.resource.cores.amount (1 millisecond)
[info] - disallowed resource request: spark.yarn.driver.resource.cores.amount (1 millisecond)
[info] - multiple disallowed resources in config (6 milliseconds)
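A hedged example of the spark.yarn.*.resource.* requests that ResourceRequestHelper validates; GPUs are the canonical custom resource and the amounts are illustrative. Memory and cores are deliberately disallowed above because they must go through the standard spark.{driver,executor}.memory and .cores settings:

    spark.yarn.driver.resource.yarn.io/gpu.amount    1
    spark.yarn.executor.resource.yarn.io/gpu.amount  2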
[info] - should propagate required values to KinesisInputDStream (348 milliseconds)
[info] - should propagate default values to KinesisInputDStream (9 milliseconds)
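A sketch of the builder these KinesisInputDStreamBuilderSuite tests exercise, mirroring its required versus defaulted parameters; the stream and app names are placeholders:

    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kinesis.KinesisInputDStream

    val ssc = new StreamingContext(
      new SparkConf().setAppName("kinesis-sketch").setMaster("local[2]"), Seconds(1))

    val stream = KinesisInputDStream.builder
      .streamingContext(ssc)          // required; missing -> exception
      .streamName("my-stream")        // required
      .checkpointAppName("my-app")    // required
      .endpointUrl("https://kinesis.us-east-1.amazonaws.com")  // optional, defaulted
      .storageLevel(StorageLevel.MEMORY_AND_DISK_2)            // optional, defaulted
      .build()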
[info] FailureTrackerSuite:
[info] - failures expire if validity interval is set (43 milliseconds)
[info] - failures never expire if validity interval is not set (-1) (2 milliseconds)
[info] YarnSparkHadoopUtilSuite:
[info] - shell script escaping (55 milliseconds)
[info] - cast from int (349 milliseconds)
[info] - Yarn configuration override (212 milliseconds)
[info] - cast from long (238 milliseconds)
[info] - test getApplicationAclsForYarn acls on (64 milliseconds)
[info] - test getApplicationAclsForYarn acls on and specify users (22 milliseconds)
[info] ClientSuite:
[info] - default Yarn application classpath (8 milliseconds)
[info] - default MR application classpath (1 millisecond)
[info] - resultant classpath for an application that defines a classpath for YARN (13 milliseconds)
[info] - resultant classpath for an application that defines a classpath for MR (13 milliseconds)
[info] - resultant classpath for an application that defines both classpaths, YARN and MR (14 milliseconds)
[info] - should propagate custom non-auth values to KinesisInputDStream (535 milliseconds)
[info] - old API should throw UnsupportedOperationException with AT_TIMESTAMP (3 milliseconds)
[info] - cast from float (181 milliseconds)
[info] WithoutAggregationKinesisBackedBlockRDDSuite:
[info] - Basic reading from Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available in both block manager and Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in block manager, not in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in Kinesis, not in block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available partially in block manager, rest in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test isBlockValid skips block fetching from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test whether RDD is valid after removing blocks from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Local jar URIs (137 milliseconds)
[info] KinesisReceiverSuite:
[info] - cast from double (187 milliseconds)
[info] - cast from string (1 millisecond)
[info] - process records including store and set checkpointer (33 milliseconds)
[info] - split into multiple processes if a limitation is set (3 milliseconds)
[info] - shouldn't store and update checkpointer when receiver is stopped (3 milliseconds)
[info] - shouldn't update checkpointer when exception occurs during store (9 milliseconds)
[info] - shutdown should checkpoint if the reason is TERMINATE (9 milliseconds)
[info] - shutdown should not checkpoint if the reason is something other than TERMINATE (2 milliseconds)
[info] - retry success on first attempt (4 milliseconds)
[info] - retry success on second attempt after a Kinesis throttling exception (93 milliseconds)
[info] - retry success on second attempt after a Kinesis dependency exception (51 milliseconds)
[info] - retry failed after a shutdown exception (4 milliseconds)
[info] - retry failed after an invalid state exception (4 milliseconds)
[info] - retry failed after unexpected exception (4 milliseconds)
[info] - retry failed after exhausting all retries (126 milliseconds)
[info] WithAggregationKinesisBackedBlockRDDSuite:
[info] - Basic reading from Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available in both block manager and Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in block manager, not in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in Kinesis, not in block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available partially in block manager, rest in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test isBlockValid skips block fetching from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test whether RDD is valid after removing blocks from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] WithAggregationKinesisStreamSuite:
[info] - RDD generation (42 milliseconds)
[info] - basic operation [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - custom message handling [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Kinesis read with custom configurations (5 milliseconds)
[info] - split and merge shards in a stream [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - failure recovery [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Prepare KinesisTestUtils [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] KinesisCheckpointerSuite:
[info] - checkpoint is not called twice for the same sequence number (7 milliseconds)
[info] - checkpoint is called after sequence number increases (2 milliseconds)
[info] - should checkpoint if we have exceeded the checkpoint interval (6 milliseconds)
[info] - shouldn't checkpoint if we have not exceeded the checkpoint interval (1 millisecond)
[info] - should not checkpoint for the same sequence number (3 milliseconds)
[info] - removing checkpointer checkpoints one last time (2 milliseconds)
[info] - Jar path propagation through SparkConf (1 second, 300 milliseconds)
[info] - Cluster path translation (7 milliseconds)
[info] - if checkpointing is going on, wait until finished before removing and checkpointing (93 milliseconds)
[info] - configuration and args propagate through createApplicationSubmissionContext (76 milliseconds)
[info] SparkAWSCredentialsBuilderSuite:
[info] - should build DefaultCredentials when given no params (2 milliseconds)
[info] - should build BasicCredentials (1 millisecond)
[info] - should build STSCredentials (1 millisecond)
[info] - SparkAWSCredentials classes should be serializable (5 milliseconds)
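A hedged sketch of the credential builders tested above; all key and ARN values are placeholders:

    import org.apache.spark.streaming.kinesis.SparkAWSCredentials

    // Static key pair (BasicCredentials):
    val basic = SparkAWSCredentials.builder
      .basicCredentials("ACCESS_KEY_ID", "SECRET_KEY")
      .build()

    // Assume-role on top of the default provider chain (STSCredentials):
    val sts = SparkAWSCredentials.builder
      .stsCredentials("arn:aws:iam::123456789012:role/my-role", "my-session")
      .build()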
[info] WithoutAggregationKinesisStreamSuite:
[info] - specify a more specific type for the application !!! CANCELED !!! (18 milliseconds)
[info]   org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was true (ClientSuite.scala:220)
[info]   org.scalatest.exceptions.TestCanceledException:
[info]   at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475)
[info]   at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474)
[info]   at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231)
[info]   at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310)
[info]   at org.apache.spark.deploy.yarn.ClientSuite.$anonfun$new$19(ClientSuite.scala:220)
[info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
[info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:176)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:61)
[info]   at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
[info]   at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
[info]   at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:61)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:233)
[info]   at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
[info]   at scala.collection.immutable.List.foreach(List.scala:392)
[info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:233)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:232)
[info]   at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
[info]   at org.scalatest.Suite.run(Suite.scala:1112)
[info]   at org.scalatest.Suite.run$(Suite.scala:1094)
[info]   at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:237)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:237)
[info]   at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:236)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:61)
[info]   at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:61)
[info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
[info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
[info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
[info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info] - RDD generation (5 milliseconds)
[info] - basic operation [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - custom message handling [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Kinesis read with custom configurations (6 milliseconds)
[info] - split and merge shards in a stream [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - failure recovery [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Prepare KinesisTestUtils [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - spark.yarn.jars with multiple paths and globs (272 milliseconds)
[info] Test run started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilderOldApi started
[info] - distribute jars archive (128 milliseconds)
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilder started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.366s
[info] - distribute archive multiple times (505 milliseconds)
[info] - distribute local spark jars (119 milliseconds)
[info] - ignore same name jars (135 milliseconds)
[info] - custom resource request (client mode) (17 milliseconds)
[info] - custom resource request (cluster mode) (17 milliseconds)
[info] - custom driver resource request yarn config and spark config fails (4 milliseconds)
[info] - custom executor resource request yarn config and spark config fails (3 milliseconds)
[info] - custom resources spark config mapped to yarn config (18 milliseconds)
[info] - test yarn jars path not exists (34 milliseconds)
[info] - SPARK-31582 Being able to not populate Hadoop classpath (41 milliseconds)
[info] - files URI match test1 (1 millisecond)
[info] - files URI match test2 (1 millisecond)
[info] - files URI match test3 (1 millisecond)
[info] - wasb URI match test (1 millisecond)
[info] - hdfs URI match test (1 millisecond)
[info] - files URI unmatch test1 (0 milliseconds)
[info] - files URI unmatch test2 (1 millisecond)
[info] - files URI unmatch test3 (1 millisecond)
[info] - wasb URI unmatch test1 (1 millisecond)
[info] - wasb URI unmatch test2 (1 millisecond)
[info] - s3 URI unmatch test (0 milliseconds)
[info] - hdfs URI unmatch test1 (1 millisecond)
[info] - hdfs URI unmatch test2 (0 milliseconds)
[info] ClientDistributedCacheManagerSuite:
[info] - test getFileStatus empty (441 milliseconds)
[info] - test getFileStatus cached (1 millisecond)
[info] - test addResource (5 milliseconds)
[info] - test addResource link null (2 milliseconds)
[info] - test addResource appmaster only (2 milliseconds)
[info] - test addResource archive (2 milliseconds)
[info] ContainerPlacementStrategySuite:
[info] - allocate locality preferred containers with enough resource and no matched existed containers (218 milliseconds)
[info] - allocate locality preferred containers with enough resource and partially matched containers (30 milliseconds)
[info] - allocate locality preferred containers with limited resource and partially matched containers (31 milliseconds)
[info] - allocate locality preferred containers with fully matched containers (35 milliseconds)
[info] - allocate containers with no locality preference (38 milliseconds)
[info] - allocate locality preferred containers by considering the localities of pending requests (47 milliseconds)
[info] YarnClusterSuite:
21:38:52.234 WARN org.apache.spark.util.Utils: Your hostname, research-jenkins-worker-05 resolves to a loopback address: 127.0.1.1; using 192.168.123.1 instead (on interface virbr0)
21:38:52.236 WARN org.apache.spark.util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
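The two warnings above suggest exporting SPARK_LOCAL_IP before launch; an equivalent in-code sketch using the documented bind-address configs (the address is the one the warning itself picked):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.driver.bindAddress", "192.168.123.1") // interface to bind to
      .set("spark.driver.host", "192.168.123.1")        // address advertised to executors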
[info] SQLQuerySuite:
[info] - SPARK-32091: ignore failures from lost executors when remove rdd/broadcast/shuffle (15 seconds, 22 milliseconds)
21:38:52.874 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[info] - StorageLevel object caching (1 millisecond)
[info] - BlockManagerId object caching (1 millisecond)
[info] - BlockManagerId.isDriver() with DRIVER_IDENTIFIER (SPARK-27090) (1 millisecond)
[info] - master + 1 manager interaction (39 milliseconds)
[info] - master + 2 managers interaction (81 milliseconds)
[info] - removing block (74 milliseconds)
[info] - removing rdd (67 milliseconds)
[info] - removing broadcast (92 milliseconds)
[info] - reregistration on heart beat (20 milliseconds)
[info] - reregistration on block update (26 milliseconds)
[info] - reregistration doesn't deadlock (583 milliseconds)
[info] - correct BlockResult returned from get() calls (24 milliseconds)
[info] - optimize a location order of blocks without topology information (22 milliseconds)
[info] - optimize a location order of blocks with topology information (28 milliseconds)
[info] - SPARK-9591: getRemoteBytes from another location when Exception throw (156 milliseconds)
[info] - SPARK-27622: avoid the network when block requested from same host, StorageLevel(disk, 1 replicas) (108 milliseconds)
[info] - SPARK-27622: avoid the network when block requested from same host, StorageLevel(disk, deserialized, 1 replicas) (79 milliseconds)
[info] - SPARK-27622: avoid the network when block requested from same host, StorageLevel(disk, deserialized, 2 replicas) (77 milliseconds)
[info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, 1 replicas), getRemoteValue() (52 milliseconds)
[info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, 1 replicas), getRemoteBytes() (65 milliseconds)
[info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, deserialized, 1 replicas), getRemoteValue() (48 milliseconds)
[info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, deserialized, 1 replicas), getRemoteBytes() (42 milliseconds)
[info] - SPARK-14252: getOrElseUpdate should still read from remote storage (55 milliseconds)
[info] - in-memory LRU storage (23 milliseconds)
[info] - in-memory LRU storage with serialization (30 milliseconds)
[info] - in-memory LRU storage with off-heap (37 milliseconds)
[info] - in-memory LRU for partitions of same RDD (22 milliseconds)
[info] - in-memory LRU for partitions of multiple RDDs (23 milliseconds)
[info] - on-disk storage (encryption = off) (36 milliseconds)
[info] - on-disk storage (encryption = on) (44 milliseconds)
[info] - disk and memory storage (encryption = off) (25 milliseconds)
[info] - disk and memory storage (encryption = on) (24 milliseconds)
[info] - disk and memory storage with getLocalBytes (encryption = off) (26 milliseconds)
[info] - disk and memory storage with getLocalBytes (encryption = on) (26 milliseconds)
[info] - disk and memory storage with serialization (encryption = off) (29 milliseconds)
[info] - disk and memory storage with serialization (encryption = on) (34 milliseconds)
[info] - disk and memory storage with serialization and getLocalBytes (encryption = off) (29 milliseconds)
[info] - disk and memory storage with serialization and getLocalBytes (encryption = on) (26 milliseconds)
[info] - disk and off-heap memory storage (encryption = off) (40 milliseconds)
[info] - disk and off-heap memory storage (encryption = on) (37 milliseconds)
[info] - disk and off-heap memory storage with getLocalBytes (encryption = off) (26 milliseconds)
[info] - disk and off-heap memory storage with getLocalBytes (encryption = on) (30 milliseconds)
[info] - LRU with mixed storage levels (encryption = off) (37 milliseconds)
[info] - LRU with mixed storage levels (encryption = on) (34 milliseconds)
[info] - in-memory LRU with streams (encryption = off) (24 milliseconds)
[info] - in-memory LRU with streams (encryption = on) (35 milliseconds)
[info] - LRU with mixed storage levels and streams (encryption = off) (46 milliseconds)
[info] - LRU with mixed storage levels and streams (encryption = on) (45 milliseconds)
[info] - negative byte values in ByteBufferInputStream (1 millisecond)
[info] - overly large block (33 milliseconds)
[info] - block compression (262 milliseconds)
[info] - block store put failure (10 milliseconds)
[info] - test putBlockDataAsStream with caching (encryption = off) (38 milliseconds)
[info] - test putBlockDataAsStream with caching (encryption = on) (45 milliseconds)
[info] - test putBlockDataAsStream with caching, serialized (encryption = off) (39 milliseconds)
[info] - test putBlockDataAsStream with caching, serialized (encryption = on) (41 milliseconds)
[info] - test putBlockDataAsStream with caching on disk (encryption = off) (56 milliseconds)
[info] - test putBlockDataAsStream with caching on disk (encryption = on) (53 milliseconds)
[info] - turn off updated block statuses (35 milliseconds)
[info] - updated block statuses (53 milliseconds)
[info] - query block statuses (78 milliseconds)
[info] - get matching blocks (93 milliseconds)
[info] - SPARK-1194 regression: fix the same-RDD rule for cache replacement (32 milliseconds)
[info] - safely unroll blocks through putIterator (disk) (33 milliseconds)
[info] - read-locked blocks cannot be evicted from memory (39 milliseconds)
[info] - remove block if a read fails due to missing DiskStore files (SPARK-15736) (129 milliseconds)
[info] - SPARK-13328: refresh block locations (fetch should fail after hitting a threshold) (25 milliseconds)
[info] - SPARK-13328: refresh block locations (fetch should succeed after location refresh) (36 milliseconds)
[info] - SPARK-17484: block status is properly updated following an exception in put() (74 milliseconds)
[info] - SPARK-17484: master block locations are updated following an invalid remote block fetch (67 milliseconds)
[info] - SPARK-25888: serving of removed file not detected by shuffle service (54 milliseconds)
[info] - test sorting of block locations (41 milliseconds)
[info] - SPARK-8010: promote numeric to string (2 seconds, 347 milliseconds)
[info] - show functions (1 second, 392 milliseconds)
[info] - describe functions (78 milliseconds)
[info] - SPARK-20640: Shuffle registration timeout and maxAttempts conf are working (5 seconds, 175 milliseconds)
[info] - fetch remote block to local disk if block size is larger than threshold (27 milliseconds)
[info] - query locations of blockIds (4 milliseconds)
[info] - SPARK-30594: Do not post SparkListenerBlockUpdated when updateBlockInfo returns false (1 millisecond)
[info] - we reject putting blocks when we have the wrong shuffle resolver (50 milliseconds)
[info] - test decommission block manager should not be part of peers (99 milliseconds)
[info] - test decommissionRddCacheBlocks should offload all cached blocks (92 milliseconds)
[info] - SPARK-14415: All functions should have own descriptions (3 seconds, 498 milliseconds)
[info] - test decommissionRddCacheBlocks should keep the block if it is not able to offload (70 milliseconds)
[info] - test migration of shuffle blocks during decommissioning (92 milliseconds)
[info] - SPARK-35589: test migration of index-only shuffle blocks during decommissioning (79 milliseconds)
[info] - SPARK-32919: Shuffle push merger locations should be bounded with in spark.shuffle.push.retainedMergerLocations (183 milliseconds)
[info] - SPARK-32919: Prefer active executor locations for shuffle push mergers (134 milliseconds)
[info] - SPARK-33387 Support ordered shuffle block migration (3 milliseconds)
[info] - SPARK-34193: Potential race condition during decommissioning with TorrentBroadcast (37 milliseconds)
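The decommissioning tests above are driven by a small set of switches documented for this branch (Spark 3.1); a hedged configuration sketch:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.decommission.enabled", "true")
      .set("spark.storage.decommission.enabled", "true")
      .set("spark.storage.decommission.rddBlocks.enabled", "true")     // offload cached RDD blocks
      .set("spark.storage.decommission.shuffleBlocks.enabled", "true") // migrate shuffle files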
[info] PythonRunnerSuite:
[info] - format path (5 milliseconds)
[info] - format paths (3 milliseconds)
[info] SortShuffleWriterSuite:
[info] - SPARK-6743: no columns from cache (1 second, 911 milliseconds)
[info] - write empty iterator (3 milliseconds)
[info] - write with some records (4 milliseconds)
[info] CryptoStreamUtilsSuite:
[info] - crypto configuration conversion (1 millisecond)
[info] - shuffle encryption key length should be 128 by default (2 milliseconds)
[info] - create 256-bit key (0 milliseconds)
[info] - create key with invalid length (2 milliseconds)
[info] - serializer manager integration (4 milliseconds)
[info] - self join with aliases (984 milliseconds)
[info] - support table.star (803 milliseconds)
[info] - self join with alias in agg (1 second, 596 milliseconds)
[info] - SPARK-8668 expr function (596 milliseconds)
[info] - SPARK-4625 support SORT BY in SimpleSQLParser & DSL (193 milliseconds)
[info] - SPARK-7158 collect and take return different results (807 milliseconds)
[info] - encryption key propagation to executors (5 seconds, 665 milliseconds)
[info] - crypto stream wrappers (9 milliseconds)
[info] - error handling wrapper (13 milliseconds)
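The CryptoStreamUtilsSuite cases above revolve around Spark's I/O encryption settings; a minimal sketch with the documented keys (128-bit is the default key size, per the test above):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.io.encryption.enabled", "true")
      .set("spark.io.encryption.keySizeBits", "256") // default is 128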
[info] StatsdSinkSuite:
[info] - grouping on nested fields (876 milliseconds)
[info] - metrics StatsD sink with Counter (19 milliseconds)
[info] - metrics StatsD sink with Gauge (2 milliseconds)
[info] - metrics StatsD sink with Histogram (13 milliseconds)
[info] - metrics StatsD sink with Timer (7 milliseconds)
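A hedged example of enabling the StatsD sink those tests target, using the spark.metrics.conf.* property prefix; host and port values are placeholders:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.metrics.conf.*.sink.statsd.class",
        "org.apache.spark.metrics.sink.StatsdSink")
      .set("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")
      .set("spark.metrics.conf.*.sink.statsd.port", "8125")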
[info] FileCommitProtocolInstantiationSuite:
[info] - Dynamic partitions require appropriate constructor (1 millisecond)
[info] - Standard partitions work with classic constructor (1 millisecond)
[info] - Three arg constructors have priority (1 millisecond)
[info] - Three arg constructors have priority when dynamic (0 milliseconds)
[info] - The protocol must be of the correct class (2 milliseconds)
[info] - If there is no matching constructor, class hierarchy is irrelevant (8 milliseconds)
[info] CompletionIteratorSuite:
[info] - basic test (1 millisecond)
[info] - SPARK-6201 IN type conversion (231 milliseconds)
[info] - SPARK-11226 Skip empty line in json file (229 milliseconds)
[info] - reference to sub iterator should not be available after completion (664 milliseconds)
[info] LauncherBackendSuite:
[info] - SPARK-8828 sum should return null if all input values are null (327 milliseconds)
[info] - run Spark in yarn-client mode (26 seconds, 142 milliseconds)
[info] - local: launcher handle (4 seconds, 854 milliseconds)
[info] - aggregation with codegen (9 seconds, 868 milliseconds)
[info] - Add Parser of SQL COALESCE() (245 milliseconds)
[info] - SPARK-3176 Added Parser of SQL LAST() (172 milliseconds)
[info] - standalone/client: launcher handle (5 seconds, 531 milliseconds)
[info] LogPageSuite:
[info] - SPARK-2041 column name equals tablename (85 milliseconds)
[info] - SQRT (103 milliseconds)
[info] - get logs simple (179 milliseconds)
[info] UnifiedMemoryManagerSuite:
[info] - SQRT with automatic string casts (89 milliseconds)
[info] - single task requesting on-heap execution memory (2 milliseconds)
[info] - two tasks requesting full on-heap execution memory (3 milliseconds)
[info] - two tasks cannot grow past 1 / N of on-heap execution memory (3 milliseconds)
[info] - tasks can block to get at least 1 / 2N of on-heap execution memory (303 milliseconds)
[info] - SPARK-2407 Added Parser of SQL SUBSTR() (360 milliseconds)
[info] - TaskMemoryManager.cleanUpAllAllocatedMemory (305 milliseconds)
[info] - tasks should not be granted a negative amount of execution memory (2 milliseconds)
[info] - off-heap execution allocations cannot exceed limit (3 milliseconds)
[info] - basic execution memory (3 milliseconds)
[info] - basic storage memory (3 milliseconds)
[info] - execution evicts storage (1 millisecond)
[info] - execution memory requests smaller than free memory should evict storage (SPARK-12165) (1 millisecond)
[info] - storage does not evict execution (1 millisecond)
[info] - small heap (2 milliseconds)
[info] - insufficient executor memory (1 millisecond)
[info] - execution can evict cached blocks when there are multiple active tasks (SPARK-12155) (1 millisecond)
[info] - SPARK-15260: atomically resize memory pools (2 milliseconds)
[info] - not enough free memory in the storage pool --OFF_HEAP (2 milliseconds)
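The UnifiedMemoryManagerSuite cases above exercise the shared execution/storage region; the knobs involved, shown at their documented defaults:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.memory.fraction", "0.6")        // share of (heap - 300 MiB) for execution + storage
      .set("spark.memory.storageFraction", "0.5") // slice of that region protected from eviction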
[info] UnsafeKryoSerializerSuite:
[info] - SPARK-7392 configuration limits (2 milliseconds)
[info] - basic types (12 milliseconds)
[info] - pairs (2 milliseconds)
[info] - Scala data structures (3 milliseconds)
[info] - Bug: SPARK-10251 (5 milliseconds)
[info] - ranges (10 milliseconds)
[info] - asJavaIterable (13 milliseconds)
[info] - custom registrator (7 milliseconds)
[info] - kryo with collect (50 milliseconds)
[info] - kryo with parallelize (18 milliseconds)
[info] - kryo with parallelize for specialized tuples (14 milliseconds)
[info] - kryo with parallelize for primitive arrays (14 milliseconds)
[info] - kryo with collect for specialized tuples (17 milliseconds)
[info] - kryo with SerializableHyperLogLog (26 milliseconds)
[info] - kryo with reduce (16 milliseconds)
[info] - kryo with fold (17 milliseconds)
[info] - kryo with nonexistent custom registrator should fail (2 milliseconds)
[info] - default class loader can be set by a different thread (5 milliseconds)
[info] - registration of HighlyCompressedMapStatus (3 milliseconds)
[info] - registration of TaskCommitMessage (4 milliseconds)
[info] - serialization buffer overflow reporting (74 milliseconds)
[info] - KryoOutputObjectOutputBridge.writeObject and KryoInputObjectInputBridge.readObject (2 milliseconds)
[info] - getAutoReset (3 milliseconds)
[info] - SPARK-25176 ClassCastException when writing a Map after previously reading a Map with different generic type (3 milliseconds)
[info] - instance reuse with autoReset = true, referenceTracking = true, usePool = true (2 milliseconds)
[info] - instance reuse with autoReset = true, referenceTracking = true, usePool = false (2 milliseconds)
[info] - instance reuse with autoReset = false, referenceTracking = true, usePool = true (1 millisecond)
[info] - instance reuse with autoReset = false, referenceTracking = true, usePool = false (1 millisecond)
[info] - instance reuse with autoReset = true, referenceTracking = false, usePool = true (1 millisecond)
[info] - instance reuse with autoReset = true, referenceTracking = false, usePool = false (1 millisecond)
[info] - instance reuse with autoReset = false, referenceTracking = false, usePool = true (1 millisecond)
[info] - instance reuse with autoReset = false, referenceTracking = false, usePool = false (1 millisecond)
[info] - SPARK-25839 KryoPool implementation works correctly in multi-threaded environment (6 milliseconds)
[info] - SPARK-27216: test RoaringBitmap ser/dser with Kryo (3 milliseconds)
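A minimal sketch of the "custom registrator" pattern the Kryo suite tests above; the class name is illustrative:

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.serializer.KryoRegistrator

    class MyKryoRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        kryo.register(classOf[Array[Double]]) // register app-specific classes here
      }
    }
    // wired up via:
    //   conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    //   conf.set("spark.kryo.registrator", "MyKryoRegistrator")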
[info] NettyRpcAddressSuite:
[info] - toString (1 millisecond)
[info] - toString for client mode (0 milliseconds)
[info] ExternalShuffleServiceMetricsSuite:
[info] - SPARK-31646: metrics should be registered (0 milliseconds)
[info] BitSetSuite:
[info] - basic set and get (2 milliseconds)
[info] - 100% full bit set (11 milliseconds)
[info] - nextSetBit (1 millisecond)
[info] - xor len(bitsetX) < len(bitsetY) (1 millisecond)
[info] - xor len(bitsetX) > len(bitsetY) (1 millisecond)
[info] - andNot len(bitsetX) < len(bitsetY) (1 millisecond)
[info] - andNot len(bitsetX) > len(bitsetY) (1 millisecond)
[info] - [gs]etUntil (3 milliseconds)
[info] AsyncRDDActionsSuite:
[info] - SPARK-3173 Timestamp support in the parser (868 milliseconds)
[info] - countAsync (18 milliseconds)
[info] - collectAsync (13 milliseconds)
[info] - foreachAsync (27 milliseconds)
[info] - foreachPartitionAsync (16 milliseconds)
[info] - left semi greater than predicate (219 milliseconds)
[info] - left semi greater than predicate and equal operator (882 milliseconds)
[info] - select * (100 milliseconds)
[info] - simple select (96 milliseconds)
[info] - takeAsync (1 second, 865 milliseconds)
[info] - async success handling (14 milliseconds)
[info] - async failure handling (15 milliseconds)
[info] - FutureAction result, infinite wait (15 milliseconds)
[info] - FutureAction result, finite wait (9 milliseconds)
[info] - FutureAction result, timeout (25 milliseconds)
[info] - SimpleFutureAction callback must not consume a thread while waiting (38 milliseconds)
[info] - ComplexFutureAction callback must not consume a thread while waiting (26 milliseconds)
[info] StagePageSuite:
[info] - ApiHelper.COLUMN_TO_INDEX should match headers of the task table (4 milliseconds)
[info] BarrierStageOnSubmittedSuite:
[info] - submit a barrier ResultStage that contains PartitionPruningRDD (81 milliseconds)
[info] - submit a barrier ShuffleMapStage that contains PartitionPruningRDD (87 milliseconds)
[info] - submit a barrier stage that doesn't contain PartitionPruningRDD (170 milliseconds)
[info] - submit a barrier stage with partial partitions (215 milliseconds)
[info] - submit a barrier stage with union() (88 milliseconds)
[info] - submit a barrier stage with coalesce() (82 milliseconds)
[info] - submit a barrier stage that contains an RDD that depends on multiple barrier RDDs (64 milliseconds)
[info] - submit a barrier stage with zip() (104 milliseconds)
[info] - submit a barrier ResultStage with dynamic resource allocation enabled (81 milliseconds)
[info] - submit a barrier ShuffleMapStage with dynamic resource allocation enabled (88 milliseconds)
[info] - external sorting (3 seconds, 515 milliseconds)
[info] - CTE feature (343 milliseconds)
[info] - Allow only a single WITH clause per query (2 milliseconds)
[info] - date row (174 milliseconds)
[info] - from follow multiple brackets (462 milliseconds)
[info] - average (133 milliseconds)
[info] - average overflow (313 milliseconds)
[info] - count (173 milliseconds)
[info] - submit a barrier ResultStage that requires more slots than current total under local mode (3 seconds, 67 milliseconds)
[info] - count distinct (285 milliseconds)
21:39:34.291 WARN org.apache.spark.sql.catalyst.util.package: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.sql.debug.maxToStringFields'.
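A hedged example of raising the truncation threshold named in the warning above, assuming an active SparkSession named spark:

    spark.conf.set("spark.sql.debug.maxToStringFields", "1000")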
[info] - approximate count distinct (432 milliseconds)
[info] - approximate count distinct with user provided standard deviation (454 milliseconds)
[info] - null count (874 milliseconds)
[info] - data type casting (50 seconds, 903 milliseconds)
[info] - count of empty table (103 milliseconds)
[info] - cast and add (221 milliseconds)
[info] - inner join where, one match per row (449 milliseconds)
[info] - from decimal (371 milliseconds)
[info] - cast from array (199 milliseconds)
[info] - inner join ON, one match per row (383 milliseconds)
[info] - cast from map (156 milliseconds)
[info] - cast from struct (202 milliseconds)
[info] - submit a barrier ShuffleMapStage that requires more slots than current total under local mode (3 seconds, 80 milliseconds)
[info] - inner join, where, multiple matches (325 milliseconds)
[info] - cast struct with a timestamp field (77 milliseconds)
[info] - complex casting (1 millisecond)
[info] - cast between string and interval (129 milliseconds)
[info] - inner join, no matches (229 milliseconds)
[info] - cast string to boolean (125 milliseconds)
[info] - SPARK-16729 type checking for casting to date type (1 millisecond)
[info] - SPARK-20302 cast with same structure (63 milliseconds)
[info] - big inner join, 4 matches per row (871 milliseconds)
[info] - cartesian product join (182 milliseconds)
[info] - left outer join (444 milliseconds)
[info] - SPARK-22500: cast for struct should not generate codes beyond 64KB (1 second, 802 milliseconds)
[info] - SPARK-22570: Cast should not create a lot of global variables (1 millisecond)
[info] - up-cast (13 milliseconds)
[info] - right outer join (448 milliseconds)
[info] - full outer join (545 milliseconds)
[info] - SPARK-27671: cast from nested null type in struct (936 milliseconds)
[info] - SPARK-11111 null-safe join should not use cartesian product (628 milliseconds)
[info] - Process Infinity, -Infinity, NaN in case insensitive manner (316 milliseconds)
[info] - submit a barrier ResultStage that requires more slots than current total under local-cluster mode (3 seconds, 436 milliseconds)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@4687d623 rejected from java.util.concurrent.ThreadPoolExecutor@30907fc3[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:668)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
	at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
	at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
	at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@4fcbf969 rejected from java.util.concurrent.ThreadPoolExecutor@ac96ef4[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:668)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
	at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
	at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
	at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
	at scala.concurrent.Promise.complete(Promise.scala:53)
	at scala.concurrent.Promise.complete$(Promise.scala:52)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[info] - SPARK-3349 partitioning after limit (892 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to byte type (848 milliseconds)
[info] - mixed-case keywords (425 milliseconds)
[info] - select with table name as qualifier (99 milliseconds)
[info] - inner join ON with table name as qualifier (436 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to short type (1 second, 33 milliseconds)
[info] - qualified select with inner join ON with table name as qualifier (418 milliseconds)
[info] - system function upper() (217 milliseconds)
[info] - system function lower() (236 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to int type (900 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to long type (631 milliseconds)
[info] - ANSI mode: Throw exception on casting out-of-range value to decimal type (159 milliseconds)
[info] - ANSI mode: disallow type conversions between Numeric types and Timestamp type (1 millisecond)
[info] - ANSI mode: disallow type conversions between Numeric types and Date type (1 millisecond)
[info] - ANSI mode: disallow type conversions between Numeric types and Binary type (1 millisecond)
[info] - ANSI mode: disallow type conversions between Datetime types and Boolean types (0 milliseconds)
[info] - ANSI mode: disallow casting complex types as String type (9 milliseconds)
[info] - submit a barrier ShuffleMapStage that requires more slots than current total under local-cluster mode (3 seconds, 374 milliseconds)
[info] - UNION (1 second, 149 milliseconds)
[info] - cast from invalid string to numeric should throw NumberFormatException (374 milliseconds)
[info] - Fast fail for cast string type to decimal type in ansi mode (183 milliseconds)
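The ANSI-mode cast tests above all hinge on a single flag; a hedged example, assuming an active SparkSession named spark:

    spark.conf.set("spark.sql.ansi.enabled", "true")
    // Under ANSI mode, an out-of-range or malformed cast throws instead of
    // silently returning null, e.g. SELECT CAST('abc' AS INT) fails.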
[info] - UNION with column mismatches (876 milliseconds)
[info] - run Spark in yarn-cluster mode (27 seconds, 71 milliseconds)
[info] - EXCEPT (1 second, 692 milliseconds)
[info] - MINUS (1 second, 590 milliseconds)
[info] - INTERSECT (1 second, 114 milliseconds)
[info] - SET commands semantics using sql() (183 milliseconds)
[info] - SPARK-19218 SET command should show a result in a sorted order (29 milliseconds)
21:39:50.024 WARN org.apache.spark.sql.internal.WithTestConf$$anon$4: SQL configurations from the Hive module are not loaded
scala.ScalaReflectionException: object org.apache.spark.sql.hive.HiveUtils not found.
	at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:190)
	at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:29)
	at org.apache.spark.sql.internal.SQLConf.loadDefinedConfs(SQLConf.scala:3797)
	at org.apache.spark.sql.internal.SQLConf.getAllDefinedConfs(SQLConf.scala:3818)
	at org.apache.spark.sql.execution.command.SetCommand.$anonfun$x$7$10(SetCommand.scala:118)
	at org.apache.spark.sql.execution.command.SetCommand.run(SetCommand.scala:158)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
	at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
	at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3700)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3698)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
	at org.apache.spark.sql.test.SQLTestUtilsBase.$anonfun$sql$1(SQLTestUtils.scala:231)
	at org.apache.spark.sql.SQLQuerySuite.$anonfun$new$211(SQLQuerySuite.scala:1030)
	at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:176)
	at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:61)
	at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
	at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
	at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:61)
	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:233)
	at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:233)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:232)
	at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
	at org.scalatest.Suite.run(Suite.scala:1112)
	at org.scalatest.Suite.run$(Suite.scala:1094)
	at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:237)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
	at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:237)
	at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:236)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:61)
	at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
	at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
	at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:61)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
	at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[info] - SPARK-19218 `SET -v` should not fail with null value configuration (30 milliseconds)
21:39:50.052 WARN org.apache.spark.sql.execution.command.SetCommand: Property mapred.reduce.tasks is deprecated, automatically converted to spark.sql.shuffle.partitions instead.
21:39:50.055 WARN org.apache.spark.sql.execution.command.SetCommand: Property mapred.reduce.tasks is deprecated, automatically converted to spark.sql.shuffle.partitions instead.
21:39:50.058 WARN org.apache.spark.sql.execution.command.SetCommand: Property mapred.reduce.tasks is deprecated, automatically converted to spark.sql.shuffle.partitions instead.
[info] - SET commands with illegal or inappropriate argument (11 milliseconds)
21:39:50.064 WARN org.apache.spark.sql.execution.command.SetCommand: Property mapreduce.job.reduces is Hadoop's property, automatically converted to spark.sql.shuffle.partitions instead.
21:39:50.069 WARN org.apache.spark.sql.execution.command.SetCommand: Property mapreduce.job.reduces is Hadoop's property, automatically converted to spark.sql.shuffle.partitions instead.
[info] - SET mapreduce.job.reduces automatically converted to spark.sql.shuffle.partitions (10 milliseconds)
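The warnings above show legacy Hadoop keys being rewritten to the Spark SQL setting; the explicit, preferred form (assuming an active SparkSession named spark):

    spark.sql("SET spark.sql.shuffle.partitions=200")
    // equivalently: spark.conf.set("spark.sql.shuffle.partitions", "200")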
[info] - SPARK-35576: Set command should redact sensitive data (68 milliseconds)
[info] - apply schema (605 milliseconds)
[info] - SPARK-3423 BETWEEN (267 milliseconds)
[info] - SPARK-17863: SELECT distinct does not work correctly if order by missing attribute (510 milliseconds)
[info] - cast boolean to string (120 milliseconds)
[info] - metadata is propagated correctly (65 milliseconds)
[info] - SPARK-3371 Renaming a function expression with group by gives error (300 milliseconds)
[info] - SPARK-3813 CASE a WHEN b THEN c [WHEN d THEN e]* [ELSE f] END (220 milliseconds)
[info] - SPARK-3813 CASE WHEN a THEN b [WHEN c THEN d]* [ELSE e] END (207 milliseconds)
[info] - SPARK-16748: SparkExceptions during planning should not be wrapped in TreeNodeException (153 milliseconds)
[info] - Multiple join (948 milliseconds)
[info] - SPARK-3483 Special chars in column names (54 milliseconds)
[info] - SPARK-3814 Support Bitwise & operator (72 milliseconds)
[info] - SPARK-3814 Support Bitwise | operator (86 milliseconds)
[info] - SPARK-3814 Support Bitwise ^ operator (80 milliseconds)
[info] - SPARK-3814 Support Bitwise ~ operator (70 milliseconds)
[info] - SPARK-4120 Join of multiple tables does not work in SparkSQL (566 milliseconds)
[info] - SPARK-4154 Query does not work if it has 'not between' in Spark SQL and HQL (292 milliseconds)
[info] - SPARK-4207 Query which has syntax like 'not like' is not working in Spark SQL (278 milliseconds)
[info] - SPARK-4322 Grouping field with struct field as sub expression (489 milliseconds)
[info] - SPARK-4432 Fix attribute reference resolution error when using ORDER BY (302 milliseconds)
[info] - order by asc by default when not specify ascending and descending (281 milliseconds)
[info] - Supporting relational operator '<=>' in Spark SQL (447 milliseconds)
[info] - ANSI mode: cast string to timestamp with parse error (12 seconds, 26 milliseconds)
[info] - Multi-column COUNT(DISTINCT ...) (331 milliseconds)
[info] - SPARK-26218: Fix the corner case of codegen when casting float to Integer (100 milliseconds)
[info] CheckCartesianProductsSuite:
[info] - SPARK-4699 case sensitivity SQL query (128 milliseconds)
[info] - CheckCartesianProducts doesn't throw an exception if cross joins are enabled (119 milliseconds)
[info] - CheckCartesianProducts throws an exception for join types that require a join condition (19 milliseconds)
[info] - CheckCartesianProducts doesn't throw an exception if a join condition is present (16 milliseconds)
[info] - CheckCartesianProducts doesn't throw an exception if join types don't require conditions (7 milliseconds)
[info] SQLKeywordSuite:
[info] - all keywords are documented (3 milliseconds)
[info] - Spark keywords are documented correctly under ANSI mode (7 milliseconds)
[info] - Spark keywords are documented correctly under default mode (12 milliseconds)
[info] - SQL 2016 keywords are documented correctly (71 milliseconds)
[info] - SPARK-6145: ORDER BY test for nested fields (1 second, 469 milliseconds)
[info] - SPARK-6145: special cases (507 milliseconds)
[info] LookupCatalogSuite:
[info] - SPARK-6898: complete support for special chars in column names (141 milliseconds)
[info] - catalog and identifier (630 milliseconds)
[info] - table identifier (33 milliseconds)
[info] FirstLastTestSuite:
[info] - empty buffer (52 milliseconds)
[info] - update (25 milliseconds)
[info] - update - ignore nulls (26 milliseconds)
[info] - merge (26 milliseconds)
[info] - merge - ignore nulls (2 milliseconds)
[info] - eval (30 milliseconds)
[info] - eval - ignore nulls (1 millisecond)
[info] - SPARK-32344: correct error handling for a type mismatch (6 milliseconds)
[info] ProjectEstimationSuite:
[info] - project with alias (5 milliseconds)
[info] - project on empty table (2 milliseconds)
[info] - test row size estimation (10 milliseconds)
[info] CSVExprUtilsSuite:
[info] - Can parse escaped characters (9 milliseconds)
[info] - Does not accept delimiter larger than one character (3 milliseconds)
[info] - Throws exception for unsupported escaped characters (1 millisecond)
[info]