Started by an SCM change
Running as SYSTEM
[EnvInject] - Loading node environment variables.
[EnvInject] - Preparing an environment for the build.
[EnvInject] - Keeping Jenkins system variables.
[EnvInject] - Keeping Jenkins build variables.
[EnvInject] - Injecting as environment variables the properties content
PATH=/home/anaconda/envs/py36/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
AMPLAB_JENKINS="true"
SPARK_MASTER_SBT_HADOOP_2_7=1
JAVA_HOME=/usr/java/latest
AMPLAB_JENKINS_BUILD_HIVE_PROFILE=hive2.3
SPARK_TESTING=1
AMPLAB_JENKINS_BUILD_PROFILE=hadoop3.2
LANG=en_US.UTF-8
SPARK_BRANCH=master
[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting contributions.
Building remotely on research-jenkins-worker-04 (ubuntu20 ubuntu) in workspace /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2
The recommended git tool is: NONE
No credentials specified
 > git rev-parse --resolve-git-dir /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/.git # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/spark.git # timeout=10
Fetching upstream changes from https://github.com/apache/spark.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/spark.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 03750c046b55f60b43646c8108e5f2e540782755 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03750c046b55f60b43646c8108e5f2e540782755 # timeout=10
Commit message: "[SPARK-37554][BUILD] Add PyArrow, pandas and plotly to release Docker image dependencies"
 > git rev-list --no-walk feba5ac32f2598f6ca8a274850934106be0db64d # timeout=10
[EnvInject] - Mask passwords that will be passed as build parameters.
[spark-master-test-sbt-hadoop-3.2] $ /bin/bash /tmp/jenkins4314740526760586336.sh
Removing R/SparkR.Rcheck/
Removing R/SparkR_3.3.0.tar.gz
Removing R/cran-check.out
Removing R/lib/
Removing R/pkg/man/
Removing R/pkg/tests/fulltests/Rplots.pdf
Removing R/pkg/tests/fulltests/_snaps/
Removing R/unit-tests.out
Removing append/
Removing assembly/target/
Removing build/sbt-launch-1.5.5.jar
Removing common/kvstore/target/
Removing common/network-common/target/
Removing common/network-shuffle/target/
Removing common/network-yarn/target/
Removing common/sketch/target/
Removing common/tags/target/
Removing common/unsafe/target/
Removing core/derby.log
Removing core/dummy/
Removing core/ignored/
Removing core/target/
Removing core/temp-secrets/
Removing derby.log
Removing dev/__pycache__/
Removing dev/ansible-for-test-node/roles/jenkins-worker/files/util_scripts/__pycache__/
Removing dev/create-release/__pycache__/
Removing dev/lint-r-report.log
Removing dev/sparktestsupport/__pycache__/
Removing dev/target/
Removing examples/src/main/python/__pycache__/
Removing examples/src/main/python/ml/__pycache__/
Removing examples/src/main/python/mllib/__pycache__/
Removing examples/src/main/python/sql/__pycache__/
Removing examples/src/main/python/sql/streaming/__pycache__/
Removing examples/src/main/python/streaming/__pycache__/
Removing examples/target/
Removing external/avro/spark-warehouse/
Removing external/avro/target/
Removing external/docker-integration-tests/target/
Removing external/kafka-0-10-assembly/target/
Removing external/kafka-0-10-sql/spark-warehouse/
Removing external/kafka-0-10-sql/target/
Removing external/kafka-0-10-token-provider/target/
Removing external/kafka-0-10/target/
Removing external/kinesis-asl-assembly/target/
Removing external/kinesis-asl/checkpoint/
Removing external/kinesis-asl/src/main/python/examples/streaming/__pycache__/
Removing external/kinesis-asl/target/
Removing external/spark-ganglia-lgpl/target/
Removing graphx/target/
Removing hadoop-cloud/target/
Removing launcher/target/
Removing lib/
Removing logs/
Removing metastore_db/
Removing mllib-local/target/
Removing mllib/checkpoint/
Removing mllib/spark-warehouse/
Removing mllib/target/
Removing project/project/
Removing project/target/
Removing python/__pycache__/
Removing python/dist/
Removing python/docs/source/__pycache__/
Removing python/lib/pyspark.zip
Removing python/pyspark.egg-info/
Removing python/pyspark/__pycache__/
Removing python/pyspark/cloudpickle/__pycache__/
Removing python/pyspark/ml/__pycache__/
Removing python/pyspark/ml/linalg/__pycache__/
Removing python/pyspark/ml/param/__pycache__/
Removing python/pyspark/ml/tests/__pycache__/
Removing python/pyspark/mllib/__pycache__/
Removing python/pyspark/mllib/linalg/__pycache__/
Removing python/pyspark/mllib/stat/__pycache__/
Removing python/pyspark/mllib/tests/__pycache__/
Removing python/pyspark/pandas/__pycache__/
Removing python/pyspark/pandas/data_type_ops/__pycache__/
Removing python/pyspark/pandas/indexes/__pycache__/
Removing python/pyspark/pandas/missing/__pycache__/
Removing python/pyspark/pandas/plot/__pycache__/
Removing python/pyspark/pandas/spark/__pycache__/
Removing python/pyspark/pandas/tests/__pycache__/
Removing python/pyspark/pandas/tests/data_type_ops/__pycache__/
Removing python/pyspark/pandas/tests/indexes/__pycache__/
Removing python/pyspark/pandas/tests/plot/__pycache__/
Removing python/pyspark/pandas/typedef/__pycache__/
Removing python/pyspark/pandas/usage_logging/__pycache__/
Removing python/pyspark/python/
Removing python/pyspark/resource/__pycache__/
Removing python/pyspark/resource/tests/__pycache__/
Removing python/pyspark/sql/__pycache__/
Removing python/pyspark/sql/avro/__pycache__/
Removing python/pyspark/sql/pandas/__pycache__/
Removing python/pyspark/sql/tests/__pycache__/
Removing python/pyspark/streaming/__pycache__/
Removing python/pyspark/streaming/tests/__pycache__/
Removing python/pyspark/testing/__pycache__/
Removing python/pyspark/tests/__pycache__/
Removing python/target/
Removing python/test_coverage/__pycache__/
Removing python/test_support/__pycache__/
Removing repl/derby.log
Removing repl/metastore_db/
Removing repl/spark-warehouse/
Removing repl/target/
Removing resource-managers/kubernetes/core/target/
Removing resource-managers/kubernetes/core/temp-secret/
Removing resource-managers/kubernetes/integration-tests/target/
Removing resource-managers/kubernetes/integration-tests/tests/__pycache__/
Removing resource-managers/mesos/target/
Removing resource-managers/yarn/target/
Removing scalastyle-on-compile.generated.xml
Removing spark-warehouse/
Removing sql/__pycache__/
Removing sql/catalyst/fake/
Removing sql/catalyst/spark-warehouse/
Removing sql/catalyst/target/
Removing sql/core/spark-warehouse/
Removing sql/core/src/test/resources/__pycache__/
Removing sql/core/target/
Removing sql/hive-thriftserver/derby.log
Removing sql/hive-thriftserver/metastore_db/
Removing sql/hive-thriftserver/spark-warehouse/
Removing sql/hive-thriftserver/spark_derby/
Removing sql/hive-thriftserver/target/
Removing sql/hive/derby.log
Removing sql/hive/metastore_db/
Removing sql/hive/src/test/resources/data/scripts/__pycache__/
Removing sql/hive/target/
Removing streaming/checkpoint/
Removing streaming/target/
Removing target/
Removing tools/target/
Removing work/
+++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/install-dev.sh
++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
++ pwd
+ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
+ LIB_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib
+ mkdir -p /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib
+ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/find-r.sh
++ '[' -z '' ']'
++ '[' '!' -z '' ']'
+++ command -v R
++ '[' '!' /usr/bin/R ']'
++++ which R
+++ dirname /usr/bin/R
++ R_SCRIPT_PATH=/usr/bin
++ echo 'Using R_SCRIPT_PATH = /usr/bin'
Using R_SCRIPT_PATH = /usr/bin
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/create-rd.sh
++ set -o pipefail
++ set -e
++++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/create-rd.sh
+++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
++ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ /usr/bin/Rscript -e ' if(requireNamespace("devtools", quietly=TRUE)) { setwd("/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R"); devtools::document(pkg="./pkg", roclets="rd") }'
Updating SparkR documentation
First time using roxygen2. Upgrading automatically...
Loading SparkR
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
Writing structType.Rd
Writing print.structType.Rd
Writing structField.Rd
Writing print.structField.Rd
Writing summarize.Rd
Writing alias.Rd
Writing arrange.Rd
Writing as.data.frame.Rd
Writing cache.Rd
Writing checkpoint.Rd
Writing coalesce.Rd
Writing collect.Rd
Writing columns.Rd
Writing coltypes.Rd
Writing count.Rd
Writing cov.Rd
Writing corr.Rd
Writing createOrReplaceTempView.Rd
Writing cube.Rd
Writing dapply.Rd
Writing dapplyCollect.Rd
Writing gapply.Rd
Writing gapplyCollect.Rd
Writing describe.Rd
Writing distinct.Rd
Writing drop.Rd
Writing dropDuplicates.Rd
Writing nafunctions.Rd
Writing dtypes.Rd
Writing explain.Rd
Writing except.Rd
Writing exceptAll.Rd
Writing filter.Rd
Writing first.Rd
Writing groupBy.Rd
Writing hint.Rd
Writing insertInto.Rd
Writing intersect.Rd
Writing intersectAll.Rd
Writing isLocal.Rd
Writing isStreaming.Rd
Writing limit.Rd
Writing localCheckpoint.Rd
Writing merge.Rd
Writing mutate.Rd
Writing orderBy.Rd
Writing persist.Rd
Writing printSchema.Rd
Writing registerTempTable-deprecated.Rd
Writing rename.Rd
Writing repartition.Rd
Writing repartitionByRange.Rd
Writing sample.Rd
Writing rollup.Rd
Writing sampleBy.Rd
Writing saveAsTable.Rd
Writing take.Rd
Writing write.df.Rd
Writing write.jdbc.Rd
Writing write.json.Rd
Writing write.orc.Rd
Writing write.parquet.Rd
Writing write.stream.Rd
Writing write.text.Rd
Writing schema.Rd
Writing select.Rd
Writing selectExpr.Rd
Writing showDF.Rd
Writing subset.Rd
Writing summary.Rd
Writing union.Rd
Writing unionAll.Rd
Writing unionByName.Rd
Writing unpersist.Rd
Writing with.Rd
Writing withColumn.Rd
Writing withWatermark.Rd
Writing randomSplit.Rd
Writing broadcast.Rd
Writing columnfunctions.Rd
Writing between.Rd
Writing cast.Rd
Writing endsWith.Rd
Writing startsWith.Rd
Writing column_nonaggregate_functions.Rd
Writing otherwise.Rd
Writing over.Rd
Writing eq_null_safe.Rd
Writing withField.Rd
Writing dropFields.Rd
Writing partitionBy.Rd
Writing rowsBetween.Rd
Writing rangeBetween.Rd
Writing windowPartitionBy.Rd
Writing windowOrderBy.Rd
Writing column_datetime_diff_functions.Rd
Writing column_aggregate_functions.Rd
Writing column_collection_functions.Rd
Writing column_ml_functions.Rd
Writing column_string_functions.Rd
Writing column_misc_functions.Rd
Writing avg.Rd
Writing column_math_functions.Rd
Writing column.Rd
Writing column_window_functions.Rd
Writing column_datetime_functions.Rd
Writing column_avro_functions.Rd
Writing last.Rd
Writing not.Rd
Writing fitted.Rd
Writing predict.Rd
Writing rbind.Rd
Writing spark.als.Rd
Writing spark.bisectingKmeans.Rd
Writing spark.fmClassifier.Rd
Writing spark.fmRegressor.Rd
Writing spark.gaussianMixture.Rd
Writing spark.gbt.Rd
Writing spark.glm.Rd
Writing spark.isoreg.Rd
Writing spark.kmeans.Rd
Writing spark.kstest.Rd
Writing spark.lda.Rd
Writing spark.logit.Rd
Writing spark.mlp.Rd
Writing spark.naiveBayes.Rd
Writing spark.decisionTree.Rd
Writing spark.randomForest.Rd
Writing spark.survreg.Rd
Writing spark.svmLinear.Rd
Writing spark.fpGrowth.Rd
Writing spark.prefixSpan.Rd
Writing spark.powerIterationClustering.Rd
Writing spark.lm.Rd
Writing write.ml.Rd
Writing awaitTermination.Rd
Writing isActive.Rd
Writing lastProgress.Rd
Writing queryName.Rd
Writing status.Rd
Writing stopQuery.Rd
Writing print.jobj.Rd
Writing show.Rd
Writing substr.Rd
Writing match.Rd
Writing GroupedData.Rd
Writing pivot.Rd
Writing SparkDataFrame.Rd
Writing storageLevel.Rd
Writing toJSON.Rd
Writing nrow.Rd
Writing ncol.Rd
Writing dim.Rd
Writing head.Rd
Writing join.Rd
Writing crossJoin.Rd
Writing attach.Rd
Writing str.Rd
Writing histogram.Rd
Writing getNumPartitions.Rd
Writing sparkR.conf.Rd
Writing sparkR.version.Rd
Writing createDataFrame.Rd
Writing read.json.Rd
Writing read.orc.Rd
Writing read.parquet.Rd
Writing read.text.Rd
Writing sql.Rd
Writing tableToDF.Rd
Writing read.df.Rd
Writing read.jdbc.Rd
Writing read.stream.Rd
Writing WindowSpec.Rd
Writing createExternalTable-deprecated.Rd
Writing createTable.Rd
Writing cacheTable.Rd
Writing uncacheTable.Rd
Writing clearCache.Rd
Writing dropTempTable-deprecated.Rd
Writing dropTempView.Rd
Writing tables.Rd
Writing tableNames.Rd
Writing currentDatabase.Rd
Writing setCurrentDatabase.Rd
Writing listDatabases.Rd
Writing listTables.Rd
Writing listColumns.Rd
Writing listFunctions.Rd
Writing recoverPartitions.Rd
Writing refreshTable.Rd
Writing refreshByPath.Rd
Writing spark.addFile.Rd
Writing spark.getSparkFilesRootDirectory.Rd
Writing spark.getSparkFiles.Rd
Writing spark.lapply.Rd
Writing setLogLevel.Rd
Writing setCheckpointDir.Rd
Writing unresolved_named_lambda_var.Rd
Writing create_lambda.Rd
Writing invoke_higher_order_function.Rd
Writing install.spark.Rd
Writing sparkR.callJMethod.Rd
Writing sparkR.callJStatic.Rd
Writing sparkR.newJObject.Rd
Writing LinearSVCModel-class.Rd
Writing LogisticRegressionModel-class.Rd
Writing MultilayerPerceptronClassificationModel-class.Rd
Writing NaiveBayesModel-class.Rd
Writing FMClassificationModel-class.Rd
Writing BisectingKMeansModel-class.Rd
Writing GaussianMixtureModel-class.Rd
Writing KMeansModel-class.Rd
Writing LDAModel-class.Rd
Writing PowerIterationClustering-class.Rd
Writing FPGrowthModel-class.Rd
Writing PrefixSpan-class.Rd
Writing ALSModel-class.Rd
Writing AFTSurvivalRegressionModel-class.Rd
Writing GeneralizedLinearRegressionModel-class.Rd
Writing IsotonicRegressionModel-class.Rd
Writing LinearRegressionModel-class.Rd
Writing FMRegressionModel-class.Rd
Writing glm.Rd
Writing KSTest-class.Rd
Writing GBTRegressionModel-class.Rd
Writing GBTClassificationModel-class.Rd
Writing RandomForestRegressionModel-class.Rd
Writing RandomForestClassificationModel-class.Rd
Writing DecisionTreeRegressionModel-class.Rd
Writing DecisionTreeClassificationModel-class.Rd
Writing read.ml.Rd
Writing sparkR.session.stop.Rd
Writing sparkR.init-deprecated.Rd
Writing sparkRSQL.init-deprecated.Rd
Writing sparkRHive.init-deprecated.Rd
Writing sparkR.session.Rd
Writing sparkR.uiWebUrl.Rd
Writing setJobGroup.Rd
Writing clearJobGroup.Rd
Writing cancelJobGroup.Rd
Writing setJobDescription.Rd
Writing setLocalProperty.Rd
Writing getLocalProperty.Rd
Writing crosstab.Rd
Writing freqItems.Rd
Writing approxQuantile.Rd
Writing StreamingQuery.Rd
Writing hashCode.Rd
+ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/pkg/
* installing *source* package ‘SparkR’ ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (SparkR)
+ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib
+ jar cfM /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib/sparkr.zip SparkR
+ popd
[info] Using build tool sbt with profiles -Phadoop-3.2 under environment amplab_jenkins
[info] Found the following changed modules: root
[info] Setup the following environment variables for tests:
========================================================================
Running Apache RAT checks
========================================================================
Attempting to fetch rat
RAT checks passed.
========================================================================
Running Scala style checks
========================================================================
[info] Checking Scala style using SBT with these profiles: -Phadoop-3.2 -Pmesos -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pyarn -Pspark-ganglia-lgpl -Pkubernetes -Phive-thriftserver -Phive
Scalastyle checks passed.
========================================================================
Running Python style checks
========================================================================
starting python compilation test...
python compilation succeeded.
The python3 -m black command was not found. Skipping black checks for now.
starting flake8 test...
flake8 checks passed.
The mypy command was not found. Skipping for now.
all lint-python tests passed!
========================================================================
Running R style checks
========================================================================
Loading required namespace: SparkR
Loading required namespace: lintr
lintr checks passed.
========================================================================
Building Spark
========================================================================
[info] Building Spark using SBT with these arguments: -Phadoop-3.2 -Pmesos -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pyarn -Pspark-ganglia-lgpl -Pkubernetes -Phive-thriftserver -Phive test:package streaming-kinesis-asl-assembly/assembly
Using /usr/java/latest as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_275)
[info] loading settings for project spark-master-test-sbt-hadoop-3-2-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project
[info] resolving key references (36387 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/)
[warn] there are 210 keys that are not used by any other settings/tasks:
[warn]
[warn] * assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * avro / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * avro / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * avro / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * avro / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * avro / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * avro / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * catalyst / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * catalyst / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * catalyst / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * catalyst / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * catalyst / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * catalyst / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * core / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * core / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * core / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * core / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * core / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * core / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * docker-integration-tests / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * docker-integration-tests / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * docker-integration-tests / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * docker-integration-tests / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * docker-integration-tests / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * docker-integration-tests / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * examples / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * examples / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * examples / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * examples / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * examples / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * examples / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * ganglia-lgpl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * ganglia-lgpl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * ganglia-lgpl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * ganglia-lgpl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * ganglia-lgpl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * ganglia-lgpl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * graphx / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * graphx / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * graphx / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * graphx / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * graphx / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * graphx / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * hadoop-cloud / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * hadoop-cloud / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * hadoop-cloud / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * hadoop-cloud / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * hadoop-cloud / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * hadoop-cloud / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * hive / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * hive / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * hive / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * hive / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * hive / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * hive / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * hive-thriftserver / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * hive-thriftserver / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * hive-thriftserver / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * hive-thriftserver / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * hive-thriftserver / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * hive-thriftserver / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * kubernetes / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * kubernetes / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * kubernetes / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * kubernetes / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * kubernetes / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * kubernetes / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * kvstore / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * kvstore / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * kvstore / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * kvstore / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * kvstore / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * kvstore / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * launcher / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * launcher / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * launcher / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * launcher / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * launcher / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * launcher / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * mesos / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * mesos / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * mesos / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * mesos / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * mesos / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * mesos / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * mllib / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * mllib / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * mllib / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * mllib / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * mllib / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * mllib / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * mllib-local / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * mllib-local / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * mllib-local / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * mllib-local / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * mllib-local / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * mllib-local / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * network-common / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * network-common / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * network-common / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * network-common / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * network-common / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * network-common / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * network-shuffle / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * network-shuffle / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * network-shuffle / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * network-shuffle / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * network-shuffle / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * network-shuffle / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * network-yarn / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * network-yarn / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * network-yarn / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * network-yarn / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * network-yarn / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * network-yarn / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * repl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * repl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * repl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * repl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * repl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * repl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * sketch / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * sketch / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * sketch / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * sketch / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * sketch / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * sketch / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * spark / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * spark / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * spark / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * spark / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * spark / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * spark / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * sql / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * sql / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * sql / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * sql / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * sql / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * sql / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * sql-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * sql-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * sql-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * sql-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * sql-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * sql-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kafka-0-10-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kinesis-asl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kinesis-asl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kinesis-asl-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * tags / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * tags / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * tags / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * tags / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * tags / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * tags / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * token-provider-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * token-provider-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * token-provider-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * token-provider-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * token-provider-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * token-provider-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * tools / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * tools / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * tools / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * tools / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * tools / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * tools / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * unsafe / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * unsafe / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * unsafe / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * unsafe / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * unsafe / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * unsafe / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * yarn / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * yarn / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * yarn / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * yarn / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * yarn / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * yarn / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn]
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
[warn]
[warn] sbt 0.13 shell syntax is deprecated; use slash syntax instead: hive-thriftserver / Test / package, docker-integration-tests / Test / package, mllib / Test / package, unsafe / Test / package, repl / Test / package, streaming-kafka-0-10 / Test / package, network-shuffle / Test / package, kubernetes / Test / package, streaming-kinesis-asl / Test / package, tools / Test / package, token-provider-kafka-0-10 / Test / package, streaming / Test / package, sql-kafka-0-10 / Test / package, examples / Test / package, yarn / Test / package, hive / Test / package, graphx / Test / package, core / Test / package, hadoop-cloud / Test / package, kvstore / Test / package, streaming-kafka-0-10-assembly / Test / package, avro / Test / package, catalyst / Test / package, tags / Test / package, ganglia-lgpl / Test / package, mesos / Test / package, assembly / Test / package, mllib-local / Test / package, launcher / Test / package, streaming-kinesis-asl-assembly / Test / package, network-common / Test / package, sketch / Test / package, network-yarn / Test / package, sql / Test / package, Test / package
[info] compiling 2 Scala sources and 8 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/tags/target/scala-2.12/classes ...
[info] compiling 1 Scala source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/tools/target/scala-2.12/classes ...
[info] compiling 85 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/network-common/target/scala-2.12/classes ...
[info] done compiling
[info] done compiling
[info] compiling 9 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/sketch/target/scala-2.12/classes ...
[info] compiling 12 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/kvstore/target/scala-2.12/classes ...
[info] compiling 18 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/unsafe/target/scala-2.12/classes ...
[info] compiling 50 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/network-shuffle/target/scala-2.12/classes ...
[info] compiling 21 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/launcher/target/scala-2.12/classes ...
[info] compiling 8 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/tags/target/scala-2.12/test-classes ...
[info] done compiling
[info] compiling 26 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/network-common/target/scala-2.12/test-classes ...
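The `lintUnused` note printed above by sbt names two remedies. A minimal sketch of both in sbt's Scala DSL, for one of the reported keys (the key names and the project/SparkBuild.scala location come from the warning list itself; the exact scoping form accepted by excludeLintKeys and the key's description string are assumptions, not Spark's actual build code):

    // Option 1: exclude the reported scoped keys from the lintUnused check,
    // e.g. in project/SparkBuild.scala or a *.sbt file (sbt 1.x).
    Global / excludeLintKeys += assembly / Compile / checkstyle / javaSource  // one of the 210 keys above
    Global / excludeLintKeys += assembly / scalaStyleOnTest / logLevel

    // Option 2: mark the key invisible to lint at its definition site.
    val scalaStyleOnCompile = taskKey[Unit]("run scalastyle on compile")  // description is illustrative
      .withRank(KeyRanks.Invisible)

The separate shell-syntax deprecation is unrelated to these keys: at the sbt prompt, the slash form such as core / Test / package replaces the 0.13-era core/test:package, which is all the long [warn] line above is asking for.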
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:22:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] import sun.misc.Unsafe;
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:28:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] private static final Unsafe _UNSAFE;
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:150:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] sun.misc.Unsafe unsafe;
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:152:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] Field unsafeField = Unsafe.class.getDeclaredField("theUnsafe");
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:154:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] unsafe = (sun.misc.Unsafe) unsafeField.get(null);
[warn] ^
[warn] 5 warnings
[info] done compiling
[info] done compiling
[info] compiling 3 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/sketch/target/scala-2.12/test-classes ...
[info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/launcher/src/main/java/org/apache/spark/launcher/LauncherServer.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] compiling 1 Scala source and 6 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/unsafe/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[info] compiling 12 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/kvstore/target/scala-2.12/test-classes ...
[info] compiling 7 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/launcher/target/scala-2.12/test-classes ...
[info] done compiling
[info] compiling 3 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/network-yarn/target/scala-2.12/classes ...
[info] compiling 568 Scala sources and 104 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/core/target/scala-2.12/classes ...
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:105:1: [unchecked] unchecked conversion
[warn] Iterator<ManagedBuffer> buffers = Mockito.mock(Iterator.class);
[warn] ^
[warn] required: Iterator<ManagedBuffer>
[warn] found: Iterator
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:111:1: [unchecked] unchecked conversion
[warn] Iterator<ManagedBuffer> buffers2 = Mockito.mock(Iterator.class);
[warn] ^
[warn] required: Iterator<ManagedBuffer>
[warn] found: Iterator
[warn] 2 warnings
[info] done compiling
[info] done compiling
[info] compiling 18 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/common/network-shuffle/target/scala-2.12/test-classes ...
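The Platform.java fragments quoted in the warnings above use the usual reflective route to sun.misc.Unsafe, since Unsafe.getUnsafe() rejects callers outside the bootstrap class loader. A minimal, self-contained Scala transcription of that pattern (for illustration only; not Spark's code verbatim):

    import java.lang.reflect.Field
    import sun.misc.Unsafe  // internal, unsupported JDK API: referencing it is what javac flags above

    object UnsafeAccess {
      // Reflectively read the private static singleton field "theUnsafe".
      val unsafe: Unsafe = {
        val f: Field = classOf[Unsafe].getDeclaredField("theUnsafe")
        f.setAccessible(true)             // bypass the private modifier
        f.get(null).asInstanceOf[Unsafe]  // static field, so the receiver is null
      }
    }

Because the API is internal, javac emits the "internal proprietary API" warning at every use site, which is why the build logs five of them for Platform.java rather than one.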
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[info] compiling 5 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/mllib-local/target/scala-2.12/classes ...
[info] done compiling
[info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/core/src/main/java/org/apache/spark/SparkFirehoseListener.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 104 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/streaming/target/scala-2.12/classes ...
[info] compiling 47 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/resource-managers/kubernetes/core/target/scala-2.12/classes ...
[info] compiling 5 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/classes ...
[info] compiling 38 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/graphx/target/scala-2.12/classes ...
[info] compiling 1 Scala source and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/spark-ganglia-lgpl/target/scala-2.12/classes ...
[info] compiling 367 Scala sources and 168 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/catalyst/target/scala-2.12/classes ...
[info] compiling 25 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/resource-managers/yarn/target/scala-2.12/classes ...
[info] compiling 20 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/resource-managers/mesos/target/scala-2.12/classes ...
[info] compiling 314 Scala sources and 29 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/core/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 10 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kafka-0-10/target/scala-2.12/classes ...
[info] compiling 11 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kinesis-asl/target/scala-2.12/classes ...
[info] done compiling
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java:157:1: [unchecked] unchecked method invocation: method union in class JavaStreamingContext is applied to given types
[warn] unionStreams = jssc.union(streamsList.toArray(new JavaDStream[0]));
[warn] ^
[warn] required: JavaDStream<T>[]
[warn] found: JavaDStream[]
[warn] where T is a type-variable:
[warn] T extends Object declared in method <T>union(JavaDStream<T>...)
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java:157:1: [unchecked] unchecked conversion
[warn] unionStreams = jssc.union(streamsList.toArray(new JavaDStream[0]));
[warn] ^
[warn] required: JavaDStream<T>[]
[warn] found: JavaDStream[]
[warn] where T is a type-variable:
[warn] T extends Object declared in method <T>union(JavaDStream<T>...)
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 19 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/graphx/target/scala-2.12/test-classes ...
[info] compiling 11 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/mllib-local/target/scala-2.12/test-classes ...
[info] compiling 41 Scala sources and 9 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/streaming/target/scala-2.12/test-classes ...
[info] compiling 38 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/resource-managers/kubernetes/core/target/scala-2.12/test-classes ...
[info] compiling 11 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/resource-managers/mesos/target/scala-2.12/test-classes ...
[info] compiling 6 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/test-classes ...
[info] compiling 22 Scala sources and 3 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/resource-managers/yarn/target/scala-2.12/test-classes ...
[info] done compiling
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsAtomicPartitionManagement.java:55:1: [unchecked] unchecked method invocation: method createPartitions in interface SupportsAtomicPartitionManagement is applied to given types
[warn] createPartitions(new InternalRow[]{ident}, new Map[]{properties});
[warn] ^
[warn] required: InternalRow[],Map<String,String>[]
[warn] found: InternalRow[],Map[]
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsAtomicPartitionManagement.java:55:1: [unchecked] unchecked conversion
[warn] createPartitions(new InternalRow[]{ident}, new Map[]{properties});
[warn] ^
[warn] required: Map<String,String>[]
[warn] found: Map[]
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/catalyst/src/main/java/org/apache/spark/sql/util/NumericHistogram.java:186:1: [unchecked] unchecked method invocation: method sort in class Collections is applied to given types
[warn] Collections.sort(tmp_bins);
[warn] ^
[warn] required: List<T>
[warn] found: ArrayList<Coord>
[warn] where T is a type-variable:
[warn] T extends Comparable<? super T> declared in method <T>sort(List<T>)
[warn] 3 warnings
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 302 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/catalyst/target/scala-2.12/test-classes ...
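Both unchecked diagnostics above (the JavaDStream union and the createPartitions call with new Map[]{properties}) trace back to the same Java limitation: arrays of a parameterized type cannot be created, so call sites hand the compiler raw-typed arrays it cannot check against a generic parameter. A hedged sketch of the mechanism, with a hypothetical Stream<T> standing in for JavaDStream<T>:

    import java.util.List;

    class Stream<T> { }

    class UnionSketch {
        @SafeVarargs
        static <T> Stream<T> union(Stream<T>... streams) {
            return new Stream<T>(); // placeholder combine; real code would merge inputs
        }

        static Stream<String> combineAll(List<Stream<String>> streams) {
            // new Stream[0] is necessarily raw (new Stream<String>[0] is illegal),
            // so this conversion is exactly the [unchecked] warning javac reports
            // for streamsList.toArray(new JavaDStream[0]) in the log.
            @SuppressWarnings("unchecked")
            Stream<String>[] arr = streams.toArray(new Stream[0]);
            return union(arr);
        }
    }

The warnings are therefore inherent to mixing generics with arrays, which is why the build tolerates them rather than failing.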
[info] compiling 544 Scala sources and 71 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/core/target/scala-2.12/classes ...
[info] done compiling
[info] compiling 6 Scala sources and 4 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kafka-0-10/target/scala-2.12/test-classes ...
[info] compiling 8 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kinesis-asl/target/scala-2.12/test-classes ...
[info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kinesis-asl/src/test/java/org/apache/spark/streaming/kinesis/JavaKinesisInputDStreamBuilderSuite.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] done compiling
[info] done compiling
[info] compiling 31 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/classes ...
[info] compiling 29 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/scala-2.12/classes ...
[info] compiling 4 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/repl/target/scala-2.12/classes ...
[info] compiling 18 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/avro/target/scala-2.12/classes ...
[info] compiling 327 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/mllib/target/scala-2.12/classes ...
[info] compiling 2 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/hadoop-cloud/target/scala-2.12/classes ...
[info] done compiling
[info] compiling 2 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/hadoop-cloud/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[info] done compiling
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:55:1: [unchecked] unchecked call to SparkAvroKeyRecordWriter(Schema,GenericData,CodecFactory,OutputStream,int,Map<String,String>) as a member of the raw type SparkAvroKeyRecordWriter
[warn] return new SparkAvroKeyRecordWriter(
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:74:1: [unchecked] unchecked call to DataFileWriter(DatumWriter<D>) as a member of the raw type DataFileWriter
[warn] this.mAvroFileWriter = new DataFileWriter(dataModel.createDatumWriter(writerSchema));
[warn] ^
[warn] where D is a type-variable:
[warn] D extends Object declared in class DataFileWriter
[info] done compiling
[info] done compiling
[info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/main/java/org/apache/hadoop/hive/ql/io/orc/SparkOrcNewRecordReader.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] compiling 27 Scala sources and 86 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive-thriftserver/target/scala-2.12/classes ...
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
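The recurring "uses or overrides a deprecated API" notes mean javac found calls to @Deprecated members but was run without -Xlint:deprecation, so it prints a one-line summary instead of per-site warnings. A hedged, self-contained illustration (all names hypothetical, not from the Spark sources):

    class LegacyApi {
        @Deprecated
        static void oldEntryPoint() { }
    }

    class Caller {
        static void run() {
            // Compiled as-is, this call yields only the summary note seen in the
            // log; with -Xlint:deprecation javac points at this exact line instead.
            // @SuppressWarnings("deprecation") on run() would silence it deliberately.
            LegacyApi.oldEntryPoint();
        }
    }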
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 523 Scala sources and 47 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/core/target/scala-2.12/test-classes ...
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 6 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/repl/target/scala-2.12/test-classes ...
[info] compiling 204 Scala sources and 135 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/examples/target/scala-2.12/classes ...
[info] done compiling
[info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/examples/src/main/java/org/apache/spark/examples/ml/JavaChiSqSelectorExample.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 21 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/test-classes ...
[info] compiling 13 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/avro/target/scala-2.12/test-classes ...
[info] compiling 205 Scala sources and 66 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/mllib/target/scala-2.12/test-classes ...
[info] compiling 117 Scala sources and 17 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/scala-2.12/test-classes ...
[info] compiling 20 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/external/docker-integration-tests/target/scala-2.12/test-classes ...
[info] done compiling
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] done compiling
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:464:1: [unchecked] unchecked cast
[warn] setLint((List<Integer>)value);
[warn] ^
[warn] required: List<Integer>
[warn] found: Object
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:472:1: [unchecked] unchecked cast
[warn] setLString((List<String>)value);
[warn] ^
[warn] required: List<String>
[warn] found: Object
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:480:1: [unchecked] unchecked cast
[warn] setLintString((List<IntString>)value);
[warn] ^
[warn] required: List<IntString>
[warn] found: Object
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:488:1: [unchecked] unchecked cast
[warn] setMStringString((Map<String,String>)value);
[warn] ^
[warn] required: Map<String,String>
[warn] found: Object
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:749:1: [unchecked] unchecked call to read(TProtocol,T) as a member of the raw type IScheme
[warn] schemes.get(iprot.getScheme()).getScheme().read(iprot, this);
[warn] ^
[warn] where T is a type-variable:
[warn] T extends TBase declared in interface IScheme
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:753:1: [unchecked] unchecked call to write(TProtocol,T) as a member of the raw type IScheme
[warn] schemes.get(oprot.getScheme()).getScheme().write(oprot, this);
[warn] ^
[warn] where T is a type-variable:
[warn] T extends TBase declared in interface IScheme
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:1027:1: [unchecked] getScheme() in ComplexTupleSchemeFactory implements <S>getScheme() in SchemeFactory
[warn] public ComplexTupleScheme getScheme() {
[warn] ^
[warn] return type requires unchecked conversion from ComplexTupleScheme to S
[warn] where S is a type-variable:
[warn] S extends IScheme declared in method <S>getScheme()
[warn] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/src/test/java/org/apache/spark/sql/hive/JavaDataFrameSuite.java uses or overrides a deprecated API.
[warn] Note: Recompile with -Xlint:deprecation for details.
[warn] 8 warnings
[info] done compiling
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] compiling 18 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive-thriftserver/target/scala-2.12/test-classes ...
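The Complex.java warnings above come from Thrift-generated setters, which receive an untyped Object and cast it to the field's generic type; because of type erasure only the raw List or Map can be checked at run time, so javac flags each cast as unchecked. A minimal sketch of that pattern (FieldSetterSketch is a hypothetical stand-in, not the generated class):

    import java.util.List;

    class FieldSetterSketch {
        private List<Integer> lint;

        @SuppressWarnings("unchecked")
        void setFieldValue(Object value) {
            // The cast is checkable at run time only as a raw List; the <Integer>
            // argument is erased, which is exactly why javac reports
            // "[unchecked] unchecked cast" at the corresponding Complex.java lines.
            lint = (List<Integer>) value;
        }
    }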
[info] done compiling [info] done compiling [success] Total time: 454 s (07:34), completed Dec 5, 2021 7:59:14 PM [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] Strategy 'discard' was applied to 2 files (Run the task at debug level to see details) [info] Strategy 'filterDistinctLines' was applied to 3 files (Run the task at debug level to see details) [info] Strategy 'first' was applied to 79 files (Run the task at debug level to see details) [warn] Ignored unknown package option FixedTimestamp(Some(1262304000000)) [success] Total time: 32 s, completed Dec 5, 2021 7:59:47 PM ======================================================================== Detecting binary incompatibilities with MiMa ======================================================================== [info] Detecting binary incompatibilities with MiMa using SBT with these profiles: -Phadoop-3.2 -Pmesos -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pyarn -Pspark-ganglia-lgpl -Pkubernetes -Phive-thriftserver -Phive [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Strategy [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedQueryContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitReader [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.MainClassOptionParser [WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetReadState.RowRange [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ShuffleCleanedEvent [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.MiscellaneousProcessAdded [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SetupDriver Error instrumenting class:org.apache.spark.mapred.SparkHadoopMapRedUtil$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStatementBodyContext Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory$FixedLenByteArrayUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Hello [WARN] Unable to detect inner functions for 
class:org.apache.spark.security.CryptoStreamUtils.CryptoHelperChannel [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.SessionCatalogAndTable [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentListContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LegacyDecimalLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.SignalUtils.ActionHandler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleMultipartIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.api.r.BaseRRunner.ReaderIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentityTransformContext [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.HealthTracker.ExecutorFailureList [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedNameListContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuerySpecificationContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.DeleteColumn [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator27$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DateAccessor Error instrumenting class:org.apache.spark.sql.execution.streaming.StreamExecution$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.COMMITTED Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextWrite Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$FileTypes$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.SchemaPruning.RootField [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongWithRebaseUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.BucketSpecHelper [WARN] Unable to detect inner functions for 
class:org.apache.spark.deploy.master.ZooKeeperLeaderElectionAgent.LeadershipStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnsetTablePropertiesContext [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalValueContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Decoder [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged [WARN] Unable to detect inner functions for class:org.apache.spark.rdd.DefaultPartitionCoalescer.PartitionLocations [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.NullAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsHelper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Imputer.$$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.DatabaseInSessionCatalog [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SchemaHelper.SchemaWriter [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.Listener [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.TempFileBasedBlockStoreUpdater Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestWorkerState [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.Accessor [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillMerger.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockTransferor.BlockTransferStarter [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.ABORTED [WARN] Unable to detect inner 
functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate Error instrumenting class:org.apache.spark.sql.execution.datasources.PathGlobFilter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColumnReferenceContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermDefaultContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMatchingBlockIds [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerExecutorStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$3 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.MutableResourceInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.Data Error instrumenting class:org.apache.spark.input.StreamInputFormat [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator26$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.MergedBlockMetaReqHandler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetShufflePushMergerLocations [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$5 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.$$typecreator1$1 [WARN] Unable to detect 
inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$3 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator2$2 Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.IntegerWithRebaseUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerDecommissioner.ShuffleMigrationRunnable [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesType [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.DoubleUpdater Error instrumenting class:org.apache.spark.deploy.history.RollingEventLogFilesWriter$ [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$PartitionWriterStream [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.TaskIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractWindowExpressions [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CurrentLikeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShufflePushMergerLocation [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionColumnContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.PrefixCache [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearRegressionWrapper.LinearRegressionWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedRelationContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.EltCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.FreqSequence [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$4 Error instrumenting class:org.apache.spark.sql.execution.PartitionedFileUtil$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.CubeType [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatIsConflicted [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGroupingAnalytics [WARN] Unable to detect inner functions for 
class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineReader [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableObjectArray [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.DoubleHasher [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveDelegationTokens [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiUnitsIntervalContext [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.TrackerState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryTerminatedEvent [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySink.AddedData [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ParenthesizedExpressionContext Error instrumenting class:org.apache.spark.sql.execution.command.DataWritingCommand$ Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$ [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.Tracker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushRequest [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableProviderContext Error instrumenting class:org.apache.spark.sql.execution.command.DDLUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PeriodConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBytesContext [WARN] Unable to detect inner functions for class:org.apache.spark.api.r.BaseRRunner.WriterThread [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData [WARN] Unable to 
detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator16$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MergeIntoTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.expressions [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.EventFilter.FilterStatistics [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.StringLiteralCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerExecutorStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$7 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperReader Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.Division [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LocationSpecContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.FValueResult [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Tokenizer.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStatusResponse [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.$$typecreator1$2 Error instrumenting class:org.apache.spark.sql.execution.command.LoadDataCommand$ [WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.DefaultProfileExecutorResources [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveCatalogs.NonSessionCatalogAndTable Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV2Reader [WARN] Unable to detect inner functions for 
class:org.apache.spark.mllib.optimization.LBFGS.CostFun [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.Arguments [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.NGram.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyKeyContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.AsFunctionIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.COMMITTED [WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.ClientPool [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator1$1 Error instrumenting class:org.apache.spark.scheduler.SplitInfo$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.IfCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowNamespacesContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExtractContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.YearMonthIntervalType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.SparkBuildInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTablePartitionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SmallIntLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseMatrixPickler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$11 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DurationConverter Error instrumenting class:org.apache.spark.api.python.DoubleArrayWritable [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0 Error instrumenting class:org.apache.spark.ml.tuning.TrainValidationSplitModel$TrainValidationSplitModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.TransformHelper [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingGlobalLimitStrategy [WARN] Unable to detect inner functions for 
class:org.apache.spark.storage.BlockManagerMessages.DecommissionBlockManager [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator13$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.api.java.JavaUtils.SerializableMapWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesStore Error instrumenting class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnalysisErrorAt [WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Once [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RepairTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.FloatConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigest [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.expressions.NullOrdering.1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Std [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowConstructorContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.OptimizeMetadataOnlyQuery.PartitionedRelation [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.ImplicitTypeCasts [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AggregationClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WindowFunctionType.Python [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.AssociationRules.Rule [WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaReader [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedGroupConverter [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummaryAggregator Error instrumenting class:org.apache.spark.streaming.CheckpointWriter$CheckpointWriteHandler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DayTimeIntervalDataTypeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowDefContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.DeferFetchRequestResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorIdentContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNewInstance [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.IntervalUtils.ParseState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DateConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingAnalyticsContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator29$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Aggregator [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$5 [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.IteratorForPartition [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotColumnContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.StageAttempt [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetQuantifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.DecommissionExecutorsOnHost [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator19$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.client.StandaloneAppClient.ClientEndpoint [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$2 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CtesContext [WARN] Unable to detect inner 
functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FallbackOnPushMergedFailureResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans.DslLogicalPlan [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveUserSpecifiedColumns [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.ReceiverTrackerEndpoint [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.IdentityProjection Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedCommitProtocol [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.$SortedIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryContext Error instrumenting class:org.apache.spark.sql.execution.datasources.DaysWritable [WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedV1TableIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.SelectorModel.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$3 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AssignmentListContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator13$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptionHandler [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$MapIteratorWithKeyIndex [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.StandaloneResourceAllocation [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockTransferor.1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator2$1 Error instrumenting 
class:org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.DateTimeOperations [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.ByteUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SetupDriver [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMatchingBlockIds [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue [WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.output [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.CatalystTypeConverter [WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StrictIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.Bitmaps [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DecimalConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonOperatorContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.$ClientCallbackHandler [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStatusResponse [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.IntArrays [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetReplicateInfoForRDDBlocks [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.YearMonthIntervalDataTypeContext Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetScan [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowCurrentNamespaceContext [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$BlocksInfo [WARN] Unable to 
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.StackCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConfigKeyContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.FilePartition$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportServer.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnPosition
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Count
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveLastAllocatedExecutorId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionFieldListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDecommissioning
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator11$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WindowFunctionType.SQL
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$RenameHelperMethods
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnNullability
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.FloatConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.BinaryPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.InputFileBlockHolder.FileBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FailNativeCommandContext
Error instrumenting class:org.apache.spark.api.python.WriteInputFormatTestDataGenerator$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableSchemaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.ExecutorResourcesOrDefaults
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteSkippedQueue
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.EncryptedDownloadFile
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Cholesky
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.json.JsonFilters.JsonPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConfigValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$16
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.KolmogorovSmirnovTest.NullHypothesis
Error instrumenting class:org.apache.spark.deploy.master.ui.MasterWebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableAttemptInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
Error instrumenting class:org.apache.spark.sql.execution.datasources.DataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.RollupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator25$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableValuedFunctionContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonTable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator19$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayes.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.StopBlockManagerMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.HandleAnalysisOnlyCommand
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.NaturalKeys
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.EventFilter.FilterStatistics
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.InConversion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SchemaHelper.SchemaReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeFixedWidthAggregationMap.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetNamespaceLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Min
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator2$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextScan
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStoreProvider$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.WindowBoundType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.ColumnChange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.PromoteStrings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingDeduplicationStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.Window
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ReplaceTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableLongArray
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.EncryptedDownloadFile.$EncryptedDownloadWritableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumericLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveBinaryArithmetic
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionValContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcScan$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WhereClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator3$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.TextBasedFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionSummary.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$6
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.TransportFrameDecoder.Interceptor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.FixedLengthRowBasedKeyValueBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.LabeledPointPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.RemoveProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslAttr
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.LevelDBLogger
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OverlayContext
Error instrumenting class:org.apache.spark.ml.tuning.CrossValidatorModel$CrossValidatorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter.ElementConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleInsertQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
Error instrumenting class:org.apache.spark.sql.execution.datasources.ModifiedBeforeFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableFileStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreType
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.State
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDescNullsFirst
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TypeConstructorContext
Error instrumenting class:org.apache.spark.SSLOptions
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Append
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.ArrayDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcTable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$11
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableAliasContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Logit
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.SchemaPruning.RootField
Error instrumenting class:org.apache.spark.kafka010.KafkaDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.TextSocketContinuousStream.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ImplicitOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.MapZipWithCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedViewIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.AsTableIdentifier
Error instrumenting class:org.apache.spark.input.WholeTextFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveMissingReferences
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveRandomSeed
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnquotedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PrimitiveConverter
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV1Reader
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$LocalDiskShufflePartitionWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNaturalAndUsingJoin
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$16
Error instrumenting class:org.apache.spark.deploy.history.HistoryServer
Error instrumenting class:org.apache.spark.sql.execution.streaming.ManifestFileCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
Error instrumenting class:org.apache.spark.api.python.TestOutputKeyConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.jdbc.connection.SecureConnectionProvider.JDBCConfiguration
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.WidenSetOperationTypes
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.expressions.SortDirection.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedColTypeWithPositionListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.optimization.NNLS.Workspace
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.linalg.distributed.RowMatrix.$SVDMode$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockPusher.$BlockPushCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionFieldContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnitToUnitIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.MetadataColumnsHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.binaryfile.BinaryFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntervalYearAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IndexToString.$$typecreator1$4
Error instrumenting class:org.apache.spark.api.python.TestWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.RawStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SendHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetShufflePushMergerLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMax
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.NonCachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.LeftSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.Optimizer.OptimizeSubqueries
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntervalDayAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.StatFunctions.CovarianceCounter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RecoverPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToValuePair
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.ParquetOutputTimestampType
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetReadSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PolynomialExpansion.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.Hasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.CombinedTypeCoercionRule
Error instrumenting class:org.apache.spark.input.StreamBasedRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FloatLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$3
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskReaper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.STATE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LocalIndexEncoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.IntegralDivision
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockTransferor.$RetryingBlockTransferListener
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.SharedReadWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.IsExecutorAlive
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.AddMetadataColumns
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Replaced
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.StringType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetReplicateInfoForRDDBlocks
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionTransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.Deserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedColTypeWithPositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ExecutorDecommissioning
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.CreateStageResult
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.status.AppStatusListener.StageCompletionTime
Error instrumenting class:org.apache.spark.WritableConverter$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnsiNonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Imputer.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterInStandby
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.DatabaseDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.After
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.ChainedIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.IdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DereferenceContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.MultiLineCSVDataSource$
Error instrumenting class:org.apache.spark.deploy.security.HBaseDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.DriverEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedV1TableAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.TaskCommitMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntegerLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.LookupFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BooleanAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Complete
Error instrumenting class:org.apache.spark.input.StreamFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnalyzeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InstanceList.CountingRemoveIfForEach
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.TimestampConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowFunctionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DoubleAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StructContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.MultipartIdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.MapOutputTrackerMaster.MessageLoop
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformQuerySpecificationContext
Error instrumenting class:org.apache.spark.metrics.sink.PrometheusServlet
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.NonCachedKafkaDataConsumer
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicatedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.OneWayRpcCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationPrimaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteDirContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.$$typecreator1$2
Error instrumenting class:org.apache.spark.input.FixedLengthBinaryInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SortItemContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubquery
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$RpcChannelListener
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.ReceiverInputDStream.ReceiverRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.TypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkers
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnsiTypeCoercion.DateTimeOperations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Key
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.FValueResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.1
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetQuotedConfigurationContext
Error instrumenting class:org.apache.spark.sql.errors.QueryCompilationErrors$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.util.BytecodeUtils.MethodInvocationFinder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExponentLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.TimestampNTZType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.BooleanEquality
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.MergingSortWithSessionWindowStateIterator.SessionRowInformation
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ByteConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.$DigestCallbackHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinExec.OneSideHashJoiner
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.IntHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.ABORTED
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ByteType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$10
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.DecommissionExecutorsOnHost
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitioningUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.FixedLenByteArrayAsLongUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.trees.TreeNodeRef
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveDeserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBConf.ConfEntry
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVTable
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.ByteArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.BlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UncacheTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MatchedActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslSymbol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Max
Error instrumenting class:org.apache.spark.sql.internal.SharedState$
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.VectorizedParquetRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ElementwiseProduct.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingJoinStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ShuffleMetricsSource
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.InMemoryScans
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$9
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextTable
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableBlockLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierAlternativeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetPrimitiveField
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.CachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StructAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.Aggregation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.KolmogorovSmirnovTestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.WithCTEStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMemoryStatus
Error instrumenting class:org.apache.spark.ml.source.libsvm.LibSVMFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.AttributeSeq
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.$UPDATING$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableFileStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.ResolveJoinStrategyHints
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.UPDATING
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.TableHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExpressionSeqContext
Error instrumenting class:org.apache.spark.deploy.history.EventLogFileReader$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors
Error instrumenting class:org.apache.spark.sql.execution.datasources.DataSourceUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnType
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.RemovedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRow
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ErrorHandler.BlockFetchErrorHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$PartitionWriterChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableBlockLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggAliasInGroupBy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.PushLeftSemiLeftAntiThroughJoin.AllowedJoin
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.1
Error instrumenting class:org.apache.spark.deploy.rest.RestSubmissionServer
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.DecommissionBlockManagers
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDeBase.BasePickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator25$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.Cluster
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.Cluster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDecommissionSigReceived
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.SpecialLimits
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2n
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.PartitionOverwriteMode
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.TemporalSequenceImpl
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppShufflePartitionInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FlatMapGroupsWithStateExec.InputProcessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$17
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFailedFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppShuffleInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperReader
Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrowVectorAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushRequest [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingUnitToUnitIntervalContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedWindowContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.MapAccessor Error instrumenting class:org.apache.spark.sql.execution.command.CommandUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearRegressionWrapper.LinearRegressionWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CacheTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.Shutdown [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Update [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.SpillableArrayIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverterFormatV2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.WindowBoundType [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter.$KeyValueConverter [WARN] Unable to detect inner functions for 
class:org.apache.spark.network.protocol.Encoders.StringArrays [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StorageHandlerContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ReplaceTableHeaderContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRow Error instrumenting class:org.apache.spark.input.Configurable [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.1 Error instrumenting class:org.apache.spark.sql.execution.SparkScriptTransformationExec [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DataType.JSortedObject [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsFractional [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Expression [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection.Schema [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$ChunkCallback [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FrameBoundContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparator [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BooleanUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DoubleConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DurationConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetSchemaType [WARN] 
Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData [WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.$$typecreator1$2 Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$FileTypes [WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.DCT.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.StageAttempt [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator19$1 [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorNullsLast [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchBlockInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator5$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.TextInputCSVDataSource$ [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchBlockInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateViewContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveFunctions [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$1 [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.SortComparator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PeriodConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.IntegralSequenceImpl [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.SerializerBuildHelper.MapElementInformation [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.IsExecutorAlive [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.ConcurrentOutputWriterSpec [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StatefulAggregationStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperReader Error instrumenting class:org.apache.spark.ui.ProxyRedirectHandler$ResponseWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsSource.ExecutorMetricGauge Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonScan [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.RatingPickler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetOperationContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.NoopMergedBlockMetaReqHandler [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.MessageDecoder.1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.rdd.DefaultPartitionCoalescer.partitionGroupOrdering [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.EpochMarker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStore.MaintenanceTask [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeColNameContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinTypeContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator6$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongAsMicrosRebaseUpdater Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.IsExecutorAlive [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.FunctionIdentifierHelper [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.TimestampTypes [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultipartIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticBinaryContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$12 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableFileFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.ColumnReference [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExplainContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.FlattenStyle [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$CancellableFSDataOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupByClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$13 Error instrumenting class:org.apache.spark.ui.JettyUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.$KVSorterIterator [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillableIterator [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.implicits [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.r.FMRegressorWrapper.FMRegressorWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveOrdinalInOrderByAndGroupBy [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockDownloadFileManager.ReferenceWithCleanup [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data Error instrumenting class:org.apache.spark.input.FixedLengthBinaryRecordReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$2 [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.BlockStoreClient.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineWriter Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedWriteConfigUtil [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.StopAppClient [WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingWritableChannel [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.LongAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAlterTableCommands [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Identity [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.PeriodSequenceImpl [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.QueryExecution.debug [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingSetContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Normalizer.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Nominal$1$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ShortAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.TaskIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter [WARN] 
Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchedExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixture.$$typecreator5$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionPath$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.ShortUpdater Error instrumenting class:org.apache.spark.sql.execution.datasources.SchemaMergeUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.DefaultStep [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplV2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResourceContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.$Index [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.TextSocketContinuousStream.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.CatalogDatabaseHelper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.NGram.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HavingClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveTempViews [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampRebaseUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslExpression [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrayAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator1$1 [WARN] Unable to detect inner 
functions for class:org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.CreateStageResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Subscript [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveReferences [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyRDDWithPartitions [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDescNullsFirst [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile$ [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data Error instrumenting class:org.apache.spark.sql.catalyst.parser.ParserUtils$EnhancedLogicalPlan$ [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.ShuffleSorterIterator [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.HashComparator [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan Error instrumenting class:org.apache.spark.deploy.worker.ui.WorkerWebUI [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.GeneratorState [WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleWrite [WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseMatrixPickler Error instrumenting class:org.apache.spark.metrics.MetricsSystem [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseVectorPickler [WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsPoller.TCMP Error instrumenting class:org.apache.spark.status.api.v1.PrometheusResource$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator28$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNamespace [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplBase [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.RenameColumn [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata [WARN] Unable to detect inner 
functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.xml.UDFXPathUtil.ReusableStringReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StringLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.JsonRebaseRecord [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.ImplicitAttribute [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.util.Utils.Lock [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator26$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanValueContext Error instrumenting class:org.apache.spark.sql.execution.streaming.state.RocksDB$ Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetTable [WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryStartedEvent [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData [WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.TASK_END_REASON_FORMATTED_CLASS_NAMES Error instrumenting class:org.apache.spark.sql.execution.SparkScriptTransformationWriterThread$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.ColumnReference [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.FlatMapGroupsWithStateStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Type [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnalyzeTablesContext [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CompleteRecovery [WARN] Unable to detect inner functions for class:org.apache.spark.graphx.impl.ShippableVertexPartition.ShippableVertexPartitionOpsConstructor [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.LimitMarker Error instrumenting class:org.apache.spark.sql.execution.streaming.FileContextBasedCheckpointFileManager [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.TargetNumUpdates [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByPercentileContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedReadStateStore [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcFiltersBase.OrcPrimitiveField [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeListContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator2$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.json.JsonFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubqueryColumnAliases [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$4 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BinaryType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gaussian [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowColumnsContext 
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.StoreVersion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockSort
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleMetrics
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDA.LDAReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ShuffleCleanedEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Log
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.jdbc.MsSqlServerDialect.SpecificTypes
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSinkLog
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalNotContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PolynomialExpansion.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Binary$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.ClassInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.RandomVertexCut
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteQueued
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.types.UTF8String.IntWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.OneForOneStreamManager.StreamState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.PartitionSpecsHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupByType
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.types.UTF8String.LongWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingRelationStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.ParserUtils.EnhancedLogicalPlan
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFooterReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkersOnHosts
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.PushLeftSemiLeftAntiThroughJoin.PushdownDirection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierListContext
Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetPrimitiveField
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueStore
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Unresolved$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableIntArray
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetNamespacePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SortPrefixUtils.NoOpPrefixComparator
Error instrumenting class:org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CheckForWorkerTimeOut
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$MethodAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ReviveOffers
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTableColumnContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.Serializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetLongDictionaryAwareDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.OutputCommitCoordinatorEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DoubleConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.MergeShuffleFile
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ByteBufferBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LeastSquaresNESolver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.ExecutorAllocationListener
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat
Error instrumenting class:org.apache.spark.sql.execution.datasources.binaryfile.BinaryFileFormat$
Error instrumenting class:org.apache.spark.metrics.sink.MetricsServlet
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.DataTypeJsonUtils.DataTypeJsonDeserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.DurationSequenceImpl
[WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.SPARK_LISTENER_EVENT_FORMATTED_CLASS_NAMES
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Message
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus
Error instrumenting class:org.apache.spark.ui.DelegatingServletContextHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NumNonZeros
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.NullIntolerant
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$15
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FallbackOnPushMergedFailureResult
Error instrumenting class:org.apache.spark.ml.tuning.CrossValidatorModel$CrossValidatorModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.$ManualCloseBufferedOutputStream$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.RebaseInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.VariableLengthRowBasedKeyValueBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.PushBlockStreamCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.IntegerType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteQueueResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.DCT.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.QueryPlanningTracker.PhaseSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Poisson
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShufflePushMergerLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.QuasiNewton
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StringAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$8
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.OptimizeOneRowRelationSubquery.OneRowSubquery
Error instrumenting class:org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStore$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySink.AddedData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.CombinedTypeCoercionRule
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.CharType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.StandaloneResourceAllocation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Probit
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleMethodContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedLocalMetaFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BinaryAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ExecutorDecommissioning
Error instrumenting class:org.apache.spark.sql.execution.streaming.StreamMetadata$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator9$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.RightSide
Error instrumenting class:org.apache.spark.ui.ServerInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo
Error instrumenting class:org.apache.spark.kafka010.KafkaTokenUtil$KafkaDelegationTokenIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubstringContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.TriggerThreadDump
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.Strings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.WindowsSubstitution
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.DiskMapIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.GradientBoostedTreesModel.SaveLoadV1_0
Error instrumenting class:org.apache.spark.sql.execution.datasources.NoopCache$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.CholeskySolver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleMetrics.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.ConcatCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.JoinSelection
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryLists
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LambdaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BucketSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.SourceFileRemover
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformClauseContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader.$$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.InternalSequenceBase
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.TypedPartValue
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$9
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.IDF.DocumentFrequencyAggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.TargetNumUpdates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DoubleLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.Block.InlineHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.SetProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNamespace
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.SerializationDebugger
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block
[WARN] Unable to detect inner functions for class:org.apache.spark.BarrierCoordinator.ContextBarrierState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.VarcharType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionCallContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.CalendarConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$14
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassReflection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugQuery
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableClausesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DynamicPartitionDataConcurrentWriter.WriterStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptedMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.SessionCatalogAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$Location
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ApplyTransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.LongArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeSkewedJoin.ShuffleStage
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RowUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableByteArray
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.OldData
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ByteBufferBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$5
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.DecommissionBlockManagers
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DayTimeIntervalType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LateralViewContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndMultipartIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryView
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedPrimitiveConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingMultiUnitsIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ShortType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator2$1
Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Inverse
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Tokenizer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingGlobalLimitStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ClearCacheContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpilledFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.ObjectSerializerPruning.IsNullCondition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.StructTypePickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StringConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Sqrt
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongAsMicrosUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.LogicalPlanWithDatasetId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformArgumentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.plugin.PluginContextImpl.PluginMetricsSource
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.TypedPartValue
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedLocalMetaFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.1
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter$
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition1D
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator27$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NotMatchedActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.HistoryServerDiskManager.Lease
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockDownloadFileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.LocalDateConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.AddWebUIFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReconnectWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Deprecated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveRelations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.ResolveCoalesceHints
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition2D
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.OldData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TinyIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TrimContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.$SortState
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkersOnHosts
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BasicNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ColumnarBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetBinaryDictionaryAwareDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$14
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RegularQuerySpecificationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.BooleanConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.FamilyAndLink
[WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.CountMinSketch.Version
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.TempFileBasedBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ExecutorIdCollector
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.BaseErrorHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.1
Error instrumenting class:org.apache.spark.ml.tuning.TrainValidationSplitModel$TrainValidationSplitModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator1$2
Error instrumenting class:org.apache.spark.sql.catalyst.util.CompressionCodecs$
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskRunner
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableHeaderContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryProgressEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassMethods
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Numeric$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveEncodersInUDF
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.DoublePrefixComparator
Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcColumnarBatchReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.SparkScripts
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LoadDataContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.QuotedStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Auto
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeNNZ
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DateType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WhenClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$3
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.Arguments
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkSubmitCommandBuilder.$OptionParser
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ManageResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.Metadata$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcFiltersBase.OrcPrimitiveField
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTimeZoneContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RealIdentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AssignmentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryBodyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.DecryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.InternalSequence
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslAttribute
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileSystemBasedCheckpointFileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.SimpleDownloadFile.$SimpleDownloadWritableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.SeenFilesMap
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampConvertTzUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatSerdeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$SetAccumulator
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.DeferFetchRequestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinCriteriaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropViewContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorModelReader
Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$Metadata
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StrictNonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$3
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.1
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSourceLog
[WARN] Unable to detect inner functions for class:org.apache.spark.util.random.StratifiedSamplingUtils.RandomDataGenerator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.LongType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.Data
Error instrumenting class:org.apache.spark.WritableFactory$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$StdChannelListener
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.AutoBatchedPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator26$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.DecommissionExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc
Error instrumenting class:org.apache.spark.ui.ServerInfo$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultipartIdentifierListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ElementwiseProduct.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RegexTokenizer.$$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.GetExecutorLossReason
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV1Writer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ShortConverter
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSink$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMean
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.JOB_RESULT_FORMATTED_CLASS_NAMES
Error instrumenting class:org.apache.spark.ui.WebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.KVStoreScalaSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimaryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.PlanStringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplV1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticUnaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.CLogLog
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFailedFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.MiscellaneousProcessAdded
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckSuccess
Error instrumenting class:org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BooleanConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexer.CategoryStats
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowthModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpilledFile
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBucketContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SearchedCaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.RevokedLeadership
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColPositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.lib.SVDPlusPlus.Conf
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterColumnActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByRowsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.CatalystDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.ElectedLeader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExistsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.RemoveProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.StringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterTableAlterColumnContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVScan
Error instrumenting class:org.apache.spark.executor.ExecutorSource
Error instrumenting class:org.apache.spark.sql.execution.datasources.FileFormatWriter$
[WARN] Unable to detect inner functions for class:org.apache.spark.TestUtils.JavaSourceFromString
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.MultiLineJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NullLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.HealthTracker.ExecutorFailureList.TaskId
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TruncateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Sum
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StarContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnsiTypeCoercion.GetDateFieldOperations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$18
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverterFormatV1
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.ReaderIterator
Error instrumenting class:org.apache.spark.sql.execution.datasources.ModifiedAfterFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.MetadataColumnHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.AddColumn
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.Builder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.RowPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveTables
Error instrumenting class:org.apache.spark.input.StreamRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UpdateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.RowComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.BasicOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedLDAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.PartitionTypeHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.BitmapArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$12
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.AbstractLauncher.ArgumentValidator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleIsConflicted
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.SerializerBuildHelper.MapElementInformation
Error instrumenting class:org.apache.spark.ml.source.image.ImageFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SimpleCaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.TimestampNTZConverter
Error instrumenting class:org.apache.spark.sql.catalyst.expressions.codegen.Block$InlineHelper$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowFrame
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NestedConstantListContext
Error instrumenting class:org.apache.spark.api.python.JavaToWritableConverter
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionDirectory$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryOrganizationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.FamilyAndLink
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkDirCleanup
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator21$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$3
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushResult
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.TextBasedFileScan
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.$SpillableIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator.$StreamBuffer
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.WriterThread
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$2
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.JsonRebaseRecord
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.ExecutorResourcesOrDefaults
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$7
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NNLSSolver
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.NettyUtils.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableLikeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.ByteArrayConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.Instrumentation.loggerTags
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.TableDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator30$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.FileInputDStream.FileInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SkewSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.GetExecutorLossReason
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkers
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$3
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingInputStream
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Projection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$5
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeShuffleWithLocalRead.BroadcastJoinWithShuffleRight
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResetQuotedConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.RemoveAllHints
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherBackend.BackendConnection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetIntDictionaryAwareDecimalConverter
Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$EdgeData
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BooleanType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.TimerWithCustomTimeUnit.$SnapshotWithCustomTimeUnit
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.CatalogHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.JdbcRDD.ConnectionFactory
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveInsertInto
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsPoller.TCMP
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$RenameBasedFSDataOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.KVComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.Block.BlockHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.IntegerUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowOrder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.InstantConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Solver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.KolmogorovSmirnovTestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyValueContext
Error instrumenting class:org.apache.spark.internal.io.FileCommitProtocol$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.QueryPlanningTracker.RuleSummary
Error instrumenting class:org.apache.spark.ui.ProxyRedirectHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleChunkManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.1
Error instrumenting class:org.apache.spark.sql.execution.datasources.PathFilterWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.AddWebUIFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTableColumnsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DmlStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Wildcard
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTablePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetScan$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.TableIdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionBase.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.write.WriteBuilder.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnsiTypeCoercion.PromoteStringLiterals
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DeleteFromTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ByteConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.FloatHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorNullsLast
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DynamicPartitionDataConcurrentWriter.WriterIndex
Error instrumenting class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex$
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutput
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$ServerConnection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Node
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslString
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$HDFSBackedStateStore
Error instrumenting class:org.apache.spark.api.python.TestOutputValueConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.RadixSortSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeRelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator.AliasedGenerator$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.AttrWithCast
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppShuffleMergePartitionsInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.FloatUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ErrorHandler.BlockPushErrorHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV2Writer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.PythonEvals
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.OptionsHelper
Error instrumenting class:org.apache.spark.sql.errors.QueryExecutionErrors$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$6
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGenerate
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0.$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReconnectWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Binomial
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.ExternalAppendOnlyUnsafeRowArrayIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.StringToAttributeConversionHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$4
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinRelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetSchemaType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FirstContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateData
[WARN] Unable to detect inner functions for
class:org.apache.spark.ml.r.LDAWrapper.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$3 Error instrumenting class:org.apache.spark.sql.catalyst.streaming.WriteToStreamStatement$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalBinaryContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.BlockPushNonFatalFailure.ReturnCode Error instrumenting class:org.apache.spark.SparkEnv$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ByteAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFetchResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetStorageStatus [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStoreOps [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.EdgeData$ [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseMatrixPickler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubscriptContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.api.r.SQLUtils.RegexContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged [WARN] Unable to detect inner functions for class:org.apache.spark.api.python.PythonWorkerFactory.MonitorThread [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0.$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$13 [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.CachedKafkaDataConsumer [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator16$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator1$1 Error instrumenting class:org.apache.spark.deploy.history.EventLogFileWriter$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.ManualCloseOutputStream Error instrumenting class:org.apache.spark.streaming.StreamingContext$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeWeightSum [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DecimalAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data$ Error instrumenting class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.PMMLLinearRegressionModelWriter.Data Error instrumenting class:org.apache.spark.streaming.api.java.JavaStreamingContext$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerMasterStorageEndpoint [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DecimalConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.DecryptionHandler [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InstanceList [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$8 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.First [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BasicNullableTypeConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Metric [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec [WARN] Unable to detect inner functions for 
class:org.apache.spark.serializer.SerializationDebugger.NullOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.status.AppStatusListener.StageCompletionTime [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.PythonForeachWriter.UnsafeRowBuffer [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.MutableStateArrays [WARN] Unable to detect inner functions for class:org.apache.spark.rdd.PipedRDD.NotEqualsFileNameFilter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.DisableHints [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStatementContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableNameContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowViewsContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.DumpByteCode [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTablePartitionsContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.HybridStore.SwitchToLevelDBListener [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Variance [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.BoundedWindow [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator29$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTempViewUsingContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BeginRecovery [WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadCallback [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StringConverter Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.CSVFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.EpochMarkerGenerator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$FieldAccessor [WARN] Unable to 
detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Power [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentNamespaceContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.ValueAndMatchPair [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection.Schema [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDriverStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileStreamSourceCleaner [WARN] Unable to detect inner functions for class:org.apache.spark.serializer.DummySerializerInstance.$1 Error instrumenting class:org.apache.spark.input.ConfigurableCombineFileRecordReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NotMatchedClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Summary [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.CaseWhenCoercion Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcScan [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveOutputRelation [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFileFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinExec.OneSideHashJoiner.$AddingProcessedRowToStateCompletionIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.UnboundedWindow [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.JobScheduler.JobHandler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDecommissioning [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.HandleNullInputsForUDF [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Message.Type [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator28$1 [WARN] Unable to detect inner functions for 
class:org.apache.spark.graphx.impl.VertexPartition.VertexPartitionOpsConstructor [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFetchResult [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$DownloadCallback [WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.EmptyTaskCommitMessage [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Normalizer.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.TypeAliases [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.Empty2Null Error instrumenting class:org.apache.spark.sql.catalyst.expressions.codegen.Block$BlockHelper$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMetric Error instrumenting class:org.apache.spark.sql.execution.streaming.SinkFileStatus$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropNamespaceContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.FunctionArgumentConversion [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.HashingTFReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnComment [WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.MonitorThread [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FMClassifierWrapper.FMClassifierWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.GlobalAggregates Error instrumenting class:org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$ [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ExecutorDecommissionSigReceived [WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.SearchState [WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.input [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicate [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.PowerIterationClustering.$$typecreator6$1 [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.feature.RobustScalerModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.RowUpdater Error instrumenting class:org.apache.spark.status.api.v1.ApiRootResource$ [WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleRead [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.Empty2Null [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowFrameContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.UDTConverter Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault1Context [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DynamicPartitionDataConcurrentWriter.WriterIndex [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingIdentifierExtraContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.TimestampType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchedExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$2 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDriverStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggregateFunctions [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault2Context [WARN] Unable to detect inner functions for 
class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructNullableTypeConverter [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReregisterWithMaster [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.TimestampAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.util.logging.DriverLogger.DfsAsyncWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedV1TableOrViewIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.SortComparator [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.ReceiverSupervisor.ReceiverState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTablePartitionContext [WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.CryptoParams [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Stop [WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.BloomFilter.Version [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.DataTypeJsonUtils.DataTypeJsonSerializer [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.V1Table.IdentifierHelper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.FixedLenByteArrayAsIntUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumberContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.MODE [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.KeyWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.LongConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterViewQueryContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.NullHypothesis [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$10 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFuncNameContext Error instrumenting class:org.apache.spark.SparkContext$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.$typecreator1$1 [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NormalEquation [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.StreamFallbackChannelWrapper Error instrumenting class:org.apache.spark.sql.execution.datasources.CodecStreams$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLContext.implicits [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$7 [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.NonSessionCatalogAndIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleStatementContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SelectClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.ConcurrentOutputWriterSpec [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.RebaseInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockBuilder [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.ArraySortLike.NullOrder [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatAsIfIntegral Error instrumenting class:org.apache.spark.sql.execution.command.PathFilterIgnoreNonData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.RemovedConfig [WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.DowncastLongUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigestSerializer [WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshResourceContext [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.HashMapGrowthStrategy.Doubling [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStmtContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.impl.RandomForest.NodeIndexInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleFunctionIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0.$typecreator1$5 [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.IsExecutorAlive [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileStreamSourceCleaner [WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.MetadataMismatchException [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndNamespace [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$MapIterator [WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.QuasiNewtonSolver.NormalEquationCostFun [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator16$2 [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.HealthTracker.ExecutorFailureList.TaskId [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Mean [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTablesContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampConvertTzRebaseUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data Error instrumenting class:org.apache.spark.sql.execution.aggregate.TungstenAggregationIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$15 [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BeginRecovery [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreHandler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator13$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse [WARN] Unable to detect inner functions for 
class:org.apache.spark.util.collection.OpenHashSet.LongHasher [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.MapKeyDedupPolicy [WARN] Unable to detect inner functions for class:org.apache.spark.serializer.KryoSerializer.PoolWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestMasterState [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MatchedClauseContext Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter.Data Error instrumenting class:org.apache.spark.ui.SparkUI [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.WindowFrameCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.DeclarativeAggregate.RichAttribute Error instrumenting class:org.apache.spark.sql.catalyst.catalog.ExternalCatalogUtils$ Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcWrite [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantListContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FMClassifierWrapper.FMClassifierWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcShimUtils.VectorizedRowBatchWrap [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$4 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator27$2 [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.SetAppId [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerMaster [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeL1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBConf.ConfEntry [WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerReader [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveUpCast [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolvePivot [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.InMemoryBufferIterator [WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.RegisteredExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.FrequentItems.FreqItemCounter [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.StringPrefixComparator [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppPathsInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.InternalKMeansModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.ColumnPosition [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GenericFileFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableSerDeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.UnsignedLongUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.MyByteArrayOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowRefContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTblPropertiesContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTableColumnsContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.UDTConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ShortConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.NamespaceHelper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetStringConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.Event [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowth.FreqItemset [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.MergingSortWithSessionWindowStateIterator.SessionRowInformation [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.SelectorModel.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$3 [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DmlStatementNoWithContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMin [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableApplicationInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeNamespaceContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRecord [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.IntConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedQueryContext [WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.DefaultProfileExecutorResources [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingElementContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsConflicted [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptedMessage [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FMRegressorWrapper.FMRegressorWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.AppExecId [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PositionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IntConverter [WARN] Unable to detect inner 
functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.SetProperty [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DecimalLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseVectorPickler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator3$2 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.MutableResourceInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigDecimalLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RegexTokenizer.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.STATE [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gamma [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.PMMLLinearRegressionModelWriter.Data Error instrumenting class:org.apache.spark.input.WholeTextFileRecordReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlockBuilder [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.StringConcat [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator3$1 Error instrumenting class:org.apache.spark.internal.io.HadoopMapReduceCommitProtocol [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.StringToColumn [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.Optimizer.UpdateCTERelationStats [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatDelimitedContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.ValueAndMatchPair [WARN] Unable to detect inner functions for 
class:org.apache.spark.launcher.LauncherProtocol.SetState [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse [WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingReadableChannel [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugStreamQuery [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator25$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.DataReaderThread [WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.LatchedTriggers [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.RandomForestModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.FloatAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.SpillableIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToValuePair [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueType [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ApplyCharTypePadding.AttrOrOuterRef [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpanModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$3 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.LegacyBehaviorPolicy [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimitiveDataTypeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Fixed [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.json.JsonFilters.JsonPredicate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.LongConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.UnsignedIntegerUpdater Error instrumenting class:org.apache.spark.sql.execution.datasources.text.TextFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTableExtendedContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowCreateTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeShuffleWithLocalRead.BroadcastJoinWithShuffleLeft [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator5$1 Created : .generated-mima-class-excludes in current directory. Created : .generated-mima-member-excludes in current directory. Using /usr/java/latest as default JAVA_HOME. Note, this will be overridden by -java-home if it is set. [info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_275) [info] loading settings for project spark-master-test-sbt-hadoop-3-2-build from plugins.sbt ... [info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project [info] resolving key references (36381 settings) ... 
Using /usr/java/latest as default JAVA_HOME. Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_275)
[info] loading settings for project spark-master-test-sbt-hadoop-3-2-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project
[info] resolving key references (36381 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/)
[warn] there are 210 keys that are not used by any other settings/tasks:
[warn]
[warn] the same six keys are flagged for each of the 35 modules (assembly, avro, catalyst, core, docker-integration-tests, examples, ganglia-lgpl, graphx, hadoop-cloud, hive, hive-thriftserver, kubernetes, kvstore, launcher, mesos, mllib, mllib-local, network-common, network-shuffle, network-yarn, repl, sketch, spark, sql, sql-kafka-0-10, streaming, streaming-kafka-0-10, streaming-kafka-0-10-assembly, streaming-kinesis-asl, streaming-kinesis-asl-assembly, tags, token-provider-kafka-0-10, tools, unsafe, yarn):
[warn]  * <module> / Compile / checkstyle / javaSource
[warn]    +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn]  * <module> / M2r / publishMavenStyle
[warn]    +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn]  * <module> / Sbt / publishMavenStyle
[warn]    +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn]  * <module> / Test / checkstyle / javaSource
[warn]    +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn]  * <module> / scalaStyleOnCompile / logLevel
[warn]    +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn]  * <module> / scalaStyleOnTest / logLevel
[warn]    +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn]
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
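The note above is sbt's standard advice for its `lintUnused` check. A minimal build.sbt sketch of both suppression options, assuming an sbt 1.4+ build (`myCheck` is a hypothetical custom key, not one of Spark's):

    // Option 1: exclude an existing key from the lintUnused check globally.
    // logLevel is the standard sbt key that appears in the warnings above.
    Global / excludeLintKeys += logLevel

    // Option 2: define a custom key with an invisible rank so the linter
    // skips it entirely. `myCheck` is made up for illustration.
    val myCheck = settingKey[String]("hypothetical key").withRank(KeyRanks.Invisible)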
[info] spark-parent: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-tags: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-kvstore: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-unsafe: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-common: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-shuffle: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-yarn: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-tools: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-ganglia-lgpl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-yarn: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-mesos: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-kubernetes: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-token-provider-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-catalyst: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-streaming-kinesis-asl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-streaming-kinesis-asl-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-sql-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-streaming-kafka-0-10-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-hadoop-cloud: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-docker-integration-tests: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-hive: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-avro: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-repl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-hive-thriftserver: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-examples: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[success] Total time: 46 s, completed Dec 5, 2021 8:01:52 PM
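The "mimaPreviousArtifacts not set" lines mean those modules are simply skipped by the MiMa binary-compatibility analysis, and the repeated "multiple main classes detected" warnings come from modules whose classpath contains several `main` methods. A minimal build.sbt sketch of the two settings involved, assuming sbt-mima-plugin is on the build classpath (the organization, module, and class names are placeholders):

    // Opt a module into the MiMa check by declaring the previous release to
    // diff against; without this setting the plugin logs
    // "mimaPreviousArtifacts not set, not analyzing binary compatibility".
    mimaPreviousArtifacts := Set("org.example" %% "example-module" % "1.0.0")

    // Pick an explicit entry point so sbt stops warning about multiple
    // discovered main classes when running or packaging.
    Compile / mainClass := Some("org.example.Main")

Running `show discoveredMainClasses`, as the warning suggests, lists the candidate entry points per module.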
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/) [warn] there are 210 keys that are not used by any other settings/tasks: [warn] [warn] * assembly / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * assembly / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * assembly / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * assembly / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * assembly / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * assembly / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * avro / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * avro / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * avro / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * avro / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * avro / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * avro / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * catalyst / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * catalyst / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * catalyst / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * catalyst / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * catalyst / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * catalyst / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * core / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * core / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * core / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * core / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * core / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * core / scalaStyleOnTest / logLevel [warn] +- 
/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * docker-integration-tests / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * docker-integration-tests / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * docker-integration-tests / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * docker-integration-tests / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * docker-integration-tests / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * docker-integration-tests / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * examples / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * examples / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * examples / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * examples / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * examples / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * examples / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * ganglia-lgpl / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * ganglia-lgpl / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * ganglia-lgpl / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * ganglia-lgpl / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * ganglia-lgpl / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * ganglia-lgpl / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * graphx / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * graphx / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * graphx / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * graphx / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * graphx / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * graphx / 
scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * hadoop-cloud / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * hadoop-cloud / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * hadoop-cloud / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * hadoop-cloud / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * hadoop-cloud / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * hadoop-cloud / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * hive / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * hive / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * hive / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * hive / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * hive / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * hive / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * hive-thriftserver / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * hive-thriftserver / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * hive-thriftserver / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * hive-thriftserver / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * hive-thriftserver / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * hive-thriftserver / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * kubernetes / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * kubernetes / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * kubernetes / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * kubernetes / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * kubernetes / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * kubernetes / 
scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * kvstore / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * kvstore / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * kvstore / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * kvstore / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * kvstore / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * kvstore / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * launcher / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * launcher / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * launcher / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * launcher / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * launcher / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * launcher / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * mesos / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * mesos / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * mesos / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * mesos / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * mesos / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * mesos / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * mllib / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * mllib / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * mllib / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * mllib / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * mllib / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * mllib / scalaStyleOnTest / logLevel [warn] +- 
/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * mllib-local / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * mllib-local / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * mllib-local / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * mllib-local / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * mllib-local / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * mllib-local / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * network-common / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * network-common / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * network-common / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * network-common / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * network-common / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * network-common / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * network-shuffle / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * network-shuffle / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * network-shuffle / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * network-shuffle / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * network-shuffle / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * network-shuffle / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * network-yarn / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * network-yarn / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * network-yarn / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * network-yarn / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * network-yarn / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * 
network-yarn / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * repl / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * repl / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * repl / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * repl / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * repl / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * repl / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * sketch / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * sketch / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * sketch / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * sketch / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * sketch / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * sketch / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * spark / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * spark / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * spark / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * spark / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * spark / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * spark / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * sql / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076 [warn] * sql / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299 [warn] * sql / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300 [warn] * sql / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077 [warn] * sql / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191 [warn] * sql / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192 [warn] * 
[warn] * sql-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * sql-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * sql-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * sql-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * sql-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * sql-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kafka-0-10-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kinesis-asl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kinesis-asl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * streaming-kinesis-asl-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * tags / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * tags / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * tags / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * tags / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * tags / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * tags / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * token-provider-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * token-provider-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * token-provider-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * token-provider-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * token-provider-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * token-provider-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * tools / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * tools / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * tools / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * tools / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * tools / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * tools / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * unsafe / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * unsafe / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * unsafe / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * unsafe / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * unsafe / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * unsafe / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn] * yarn / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1076
[warn] * yarn / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:299
[warn] * yarn / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:300
[warn] * yarn / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:1077
[warn] * yarn / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:191
[warn] * yarn / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project/SparkBuild.scala:192
[warn]
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
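The note above names sbt's two standard ways to silence `lintUnused` for keys that are set but never read. A minimal sketch of both, assuming a hypothetical key demoTask (not one of the Spark build keys flagged above):

    // build.sbt -- illustrative sketch only, not part of Spark's build
    // A key that nothing else consumes, so lintUnused would flag it.
    val demoTask = taskKey[Unit]("demo task")

    // Option 1: exclude the key from the lintUnused check globally.
    Global / excludeLintKeys += demoTask

    // Option 2: give the key an invisible rank so the lint check skips it.
    val quietDemoTask = taskKey[Unit]("demo task").withRank(KeyRanks.Invisible)

Spark defines the flagged keys in project/SparkBuild.scala, so the same two options would apply there.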
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[success] Total time: 36 s, completed Dec 5, 2021 8:02:42 PM
========================================================================
Running Java style checks
========================================================================
[info] Checking Java style using SBT with these profiles: -Phadoop-3.2 -Pmesos -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pyarn -Pspark-ganglia-lgpl -Pkubernetes -Phive-thriftserver -Phive
Checkstyle checks passed.
========================================================================
Running Spark unit tests
========================================================================
[info] Running Spark tests using SBT with these arguments: -Phadoop-3.2 -Pmesos -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pyarn -Pspark-ganglia-lgpl -Pkubernetes -Phive-thriftserver -Phive test
Using /usr/java/latest as default JAVA_HOME. Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_275)
[info] loading settings for project spark-master-test-sbt-hadoop-3-2-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/project
[info] resolving key references (36387 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/)
[warn] there are 210 keys that are not used by any other settings/tasks:
[warn] ... (the same 210 unused-key warnings and lintUnused note as above; truncated)
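Each "multiple main classes detected" warning above means a module contains more than one object with a main method, so sbt will not guess a default for `run` or for the package manifest. Listing the candidates with `show discoveredMainClasses` in the sbt shell and pinning one is the usual fix; a sketch, with a hypothetical class name:

    // build.sbt -- sketch; "org.example.Main" stands in for a real class
    Compile / mainClass := Some("org.example.Main")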
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBBenchmark ignored
[info] Test run finished: 0 failed, 1 ignored, 0 total, 0.009s
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthMessagesSuite.testPublicKeyEncodeDecode started
[info] Test run started
[info] Test org.apache.spark.util.kvstore.ArrayWrappersSuite.testGenericArrayKey started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.009s
[info] Test run started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testNoRedirectToLog started
[info] Test run started
[info] Test run started
[info] Test org.apache.spark.network.sasl.ShuffleSecretManagerSuite.testMultipleRegisters started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithOutputRedirection started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectOutputToLog started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithStart started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.205s
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithStart started
[info] Test run started
[info] Test org.apache.spark.network.ChunkFetchRequestHandlerSuite.handleChunkFetchRequest started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithStart started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectsSimple started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorToLog started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.testRefWithIntNaturalKey started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithMax started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithLogRedirection started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithStart started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testFailedChildProc started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithLast started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorTwiceFails started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testBadLogRedirect started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithStart started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectLastWins started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectToLog started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndex started
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.291s
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.294s
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.284s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupOnlyRemovedApp started
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleTypesWriteReadDelete started
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testMissingArg started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupUsesExecutor started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testSkip started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.noCleanupAndCleanup started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testReopenAndVersionCheckDb started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMetadata started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupMultipleExecutors started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testCloseLevelDBIterator started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.443s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testAllOptions started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testEqualSeparatedOption started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testExtraOptions started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.665s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkLauncher started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testAlternateSyntaxParsing started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunner started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testSparkRShell started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testMissingAppResource started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerPrimaryResource started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testShellCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testClusterCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testDriverCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoMainClass started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliKillAndStatus started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoArg started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testIsClientMode started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkFallback started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerWithMasterNoMainClass started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliHelpAndNoArg started
[info] Test run finished: 0 failed, 0 ignored, 17 total, 0.108s
[info] Test run started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testTimeout started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testStreamFiltering started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testSparkSubmitVmShutsDown started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testLauncherServerReuse started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testAppHandleDisconnect started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.961s
[info] Test run started
[info] Test org.apache.spark.network.protocol.MergedBlockMetaSuccessSuite.testMergedBlocksMetaEncodeDecode started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testCommunication started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.039s
[info] Test run started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testKill started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testLauncher started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testErrorPropagation started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.069s
[info] Test run started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testValidOptionStrings started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testJavaMajorVersion started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testPythonArgQuoting started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testWindowsBatchQuoting started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testInvalidOptionStrings started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.002s
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.178s
[info] Test run started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.reuseClientsUpToConfigVariable started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testGoodClient started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslClient started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testRemoveAll started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testNegativeIndexValues started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslServer started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 1.439s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testBasicIteration started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testDeleteParentIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMetadata started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testArrayIndices started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testRemoveAll started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.006s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testDuplicateIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testEmptyIndexName started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIndexAnnotation started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNumEncoding started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIllegalIndexMethod started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testKeyClashes started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testArrayIndices started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex2 started
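The ChildProcAppHandleSuite, LauncherServerSuite, InProcessLauncherSuite and SparkSubmitCommandBuilderSuite runs above exercise Spark's public launcher API (org.apache.spark.launcher). For orientation, a minimal sketch of how an application would drive that API; the paths and class name are placeholders, not values from this build:

    import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

    object LaunchSketch {
      def main(args: Array[String]): Unit = {
        // startApplication() forks spark-submit and returns a handle whose
        // state the launcher server tracks (CONNECTED, RUNNING, FINISHED, ...).
        val handle: SparkAppHandle = new SparkLauncher()
          .setAppResource("/path/to/app.jar")   // placeholder
          .setMainClass("org.example.MyApp")    // placeholder
          .setMaster("local[2]")
          .startApplication()
        // Poll until the application reaches a final state.
        while (!handle.getState.isFinal) Thread.sleep(500)
        println(s"final state: ${handle.getState}")
      }
    }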
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIllegalIndexName started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.01s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithStart started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.reuseClientsUpToConfigVariableConcurrent started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testBadClient started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 1.138s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testEmptyBlockFetch started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithLast started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailure started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithStart started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailureAndSuccess started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithStart started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testBatchFetchThreeShuffleBlocks started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithLast started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testShuffleBlockChunksFetch started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testShuffleBlockChunkFetchFailure started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchThreeShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testUseOldProtocol started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchThree started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testInvalidShuffleBlockIds started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testBatchFetchShuffleBlocksOrder started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchOne started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchShuffleBlocksOrder started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.testRefWithIntNaturalKey started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithMax started
[info] Test run finished: 0 failed, 0 ignored, 13 total, 0.341s
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndex started
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ErrorHandlerSuite.testErrorRetry started
[info] Test org.apache.spark.network.shuffle.ErrorHandlerSuite.testErrorLogging started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.004s
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithLast started
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisNetworkIssue started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithLast started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.fastFailConnectionInTimeWindow started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndex started
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.775s
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleChunks started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisCRC32 started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisChecksumVerifyPass started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenBlocksWithShuffleChunks started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenDiskPersistedRDDBlocksWithMissingBlock started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisDiskIssue started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFinalizeShuffleMerge started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testRegisterExecutor started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisUnknownIssue started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchMergedBlocksMeta started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testBadMessages started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testCompatibilityWithOldVersion started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleBlocksInBatch started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisUnSupportedAlgorithm started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenDiskPersistedRDDBlocks started
[info] Test run finished: 0 failed, 0 ignored, 17 total, 0.395s
[info] BloomFilterSuite:
[info] Test run started
[info] Test org.apache.spark.network.shuffle.protocol.FetchShuffleBlockChunksSuite.testFetchShuffleBlockChunksEncodeDecode started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.001s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.protocol.FetchShuffleBlocksSuite.testFetchShuffleBlockEncodeDecode started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.001s
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeFactoryBeforeCreateClient started
[info] Test run started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithFilesToKeepFetchRddEnabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithoutFilesToKeep started
[info] - accuracy - Byte (34 milliseconds)
[info] - mergeInPlace - Byte (17 milliseconds)
[info] - intersectInPlace - Byte (11 milliseconds)
[info] - accuracy - Short (9 milliseconds)
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithFilesToKeepFetchRddDisabled started
[info] - mergeInPlace - Short (11 milliseconds)
[info] - intersectInPlace - Short (10 milliseconds)
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithFilesToKeepFetchRddEnabled started
[info] - accuracy - Int (42 milliseconds)
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupUsesExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithFilesToKeepFetchRddDisabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithFilesToKeepFetchRddDisabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupUsesExecutorWithFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithFilesToKeepFetchRddEnabled started
[info] - mergeInPlace - Int (170 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.378s
Test run started [info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testHandlingRetriableFailures started [info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testPushOne started [info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testServerFailures started [info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testPushThree started [info] Test run finished: 0 failed, 0 ignored, 4 total, 0.033s [info] Test run started [info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadSecret started [info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeBlockClientsWithFactory started [info] - intersectInPlace - Int (93 milliseconds) [info] - accuracy - Long (48 milliseconds) [info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadAppId started [info] - mergeInPlace - Long (117 milliseconds) [info] - intersectInPlace - Long (103 milliseconds) [info] Test org.apache.spark.network.client.TransportClientFactorySuite.neverReturnInactiveClients started [info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testValid started [info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption started [info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeIdleConnectionForRequestTimeOut started [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] Test run finished: 0 failed, 0 ignored, 4 total, 0.737s [info] Test run started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBlockFetchWithOlderShuffleMergeId started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDuplicateBlocksAreIgnoredWhenPrevStreamIsInProgress started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRequestForAbortedShufflePartitionThrowsException started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterMultipleDataBlocks started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testOngoingMergeOfBlockFromPreviousAttemptIsAborted started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBasicBlockMerge started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCollision started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testPushBlockFromPreviousAttemptIsRejected started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBlockPushWithOlderShuffleMergeId started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterDuplicateBlockDoesNotInterfereActiveStream started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testExecutorRegisterWithInvalidJsonForPushShuffle started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDuplicateBlocksAreIgnoredWhenPrevStreamHasCompleted started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureWhileTruncatingFiles started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeShuffleMergeFromPreviousAttemptIsAborted started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCleanupOlderShuffleMergeId started [info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIOExceptionsExceededThreshold started 
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testOnFailureInvokedMoreThanOncePerBlock started
[info] UTF8StringPropertyCheckSuite:
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBlockReceivedAfterMergeFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterData started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeWithOlderShuffleMergeId started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testWritingPendingBufsIsAbortedImmediatelyDuringComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeWithMultipleReducePartitions started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testUpdateLocalDirsOnlyOnce started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testNoIndexFile started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeOfDeterminateShuffle started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverIndexFileAfterIOExceptionsInFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testExecutorRegistrationFromTwoAppAttempts started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverMetaFileAfterIOExceptionsInFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIncompleteStreamsAreOverwritten started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureInAStreamDoesNotInterfereWithStreamWhichIsWriting started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDeferredBufsAreWrittenDuringOnData started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverIndexFileAfterIOExceptions started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCleanUpDirectory started
[info] - toString (121 milliseconds)
[info] - numChars (7 milliseconds)
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIOExceptionsDuringMetaUpdateIncreasesExceptionCount started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDividingMergedBlocksIntoChunks started
[info] - startsWith (19 milliseconds)
[info] - endsWith (11 milliseconds)
[info] - toUpperCase (5 milliseconds)
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testErrorLogging started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testPendingBlockIsAbortedImmediately started
[info] - toLowerCase (3 milliseconds)
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDeferredBufsAreWrittenDuringOnComplete started
[info] - compare (9 milliseconds)
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverMetaFileAfterIOExceptions started
[info] Test run finished: 0 failed, 0 ignored, 40 total, 0.696s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testRetryAndUnrecoverable started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testSingleIOExceptionOnFirst started
[info] - substring (77 milliseconds)
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testUnrecoverableFailure started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testSingleIOExceptionOnSecond started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testThreeIOExceptions started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testNoFailures started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testTwoIOExceptions started
[info] - contains (74 milliseconds)
[info] - trim, trimLeft, trimRight (20 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.155s
[info] Test run started
[info] - reverse (4 milliseconds)
[info] Test org.apache.spark.network.shuffle.BlockTransferMessagesSuite.serializeOpenShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.BlockTransferMessagesSuite.testLocalDirsMessages started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.004s
[info] Test run started
[info] - indexOf (31 milliseconds)
[info] - repeat (8 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchUnregisteredExecutor started
[info] - lpad, rpad (5 milliseconds)
[info] - concat (31 milliseconds)
[info] - concatWs (21 milliseconds)
[info] - split !!! IGNORED !!!
[info] - levenshteinDistance (9 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongExecutor started
[info] - hashCode (4 milliseconds)
[info] - equals (3 milliseconds)
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.returnDifferentClientsForDifferentServers started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testRegisterWithCustomShuffleManager started
[info] Test run started
[info] Test org.apache.spark.unsafe.array.LongArraySuite.basicTest started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.008s
[info] Test run started
[info] Test org.apache.spark.unsafe.array.ByteArraySuite.testCompareBinary started
[info] Test org.apache.spark.unsafe.array.ByteArraySuite.testGetPrefix started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.001s
[info] Test run started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOnHeapMemoryBlockResetsBaseObjectAndOffset started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.overlappingCopyMemory started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchCorruptRddBlock started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.memoryDebugFillEnabledInTest started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.offHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.heapMemoryReuse started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorPoolingReUsesLongArrays started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOffHeapMemoryBlockResetsOffset started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.057s
[info] Test run started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.titleCase started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.soundex started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.basicTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamUnderflow started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToShort started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.startsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.compareTo started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.levenshteinDistance started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamOverflow started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamIntArray started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.upperAndLower started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToInt started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.createBlankString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.prefix started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatWsTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.repeat started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.contains started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.skipWrongFirstByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.emptyStringTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamSlice started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimBothWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substringSQL started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring_index started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.pad started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.split started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trims started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimRightWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.findInSet started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.translate started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.replace started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.reverse started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimLeftWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.endsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToLong started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStream started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.indexOf started
[info] Test run finished: 0 failed, 0 ignored, 39 total, 0.09s
[info] Test run started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownLongInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownIntegerInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTest started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNoServer started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownBytesInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestPaddedStrings started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestBytes started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 4.048s
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchDeletedRddBlock started
[info] Test run started
[info] Test org.apache.spark.network.util.CryptoUtilsSuite.testConfConversion started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.017s
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeSalt started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchValidRddBlock started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeAppId started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testFixedChallengeResponse started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testRemoveRddBlocks started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeCiphertext started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.322s
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testMismatchedSecret started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptResponseSalt started
[info] Test run started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.periodAndDurationTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.equalsTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.toStringTest started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.005s
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessage started
[info] Passed: Total 106, Failed 0, Errors 0, Passed 105, Skipped 1
[info] Passed: Total 46, Failed 0, Errors 0, Passed 46
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessageWhenTransferringZeroBytes started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testFixedChallenge started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptServerCiphertext started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptResponseAppId started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchThreeSort started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testAuthEngine started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongBlockId started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testBadKeySize started
[info] Test run finished: 0 failed, 0 ignored, 13 total, 0.253s
[info] Test run started
[info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testGeneralNettyMemoryMetrics started
[info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testAdditionalMetrics started
[info] ScalaTest
[info] Run completed in 5 seconds, 778 milliseconds.
[info] Total number of tests run: 19
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 19, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
[info] Passed: Total 78, Failed 0, Errors 0, Passed 78, Ignored 1
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNonexistent started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchOneSort started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.288s
[info] Test run started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testDeallocateReleasesManagedBuffer started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testByteBufBody started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testShortWrite started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodySingleBuffer started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodyMultipleBuffers started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testSingleWrite started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.051s
[info] Test run started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testEmptyFrame started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testNegativeFrameSize started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testSplitLengthField started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testFrameDecoding started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testInterception started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testRetainedFrames started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testConsolidationPerf started
[info] Test run finished: 0 failed, 0 ignored, 12 total, 1.436s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testSortShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testBadRequests started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.jsonSerializationOfExecutorRegistration started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.168s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testSaslAppIsolation started
[info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testAuthEngineAppIsolation started
[info] - accuracy - String (3 seconds, 93 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.412s
[info] Passed: Total 128, Failed 0, Errors 0, Passed 128
[info] - mergeInPlace - String (2 seconds, 117 milliseconds)
[info] DistributedSuite:
[info] - intersectInPlace - String (2 seconds, 486 milliseconds)
[info] - incompatible merge (3 milliseconds)
[info] BitArraySuite:
[info] - error case when create BitArray (1 millisecond)
[info] - bitSize (1 millisecond)
[info] - set (2 milliseconds)
[info] - normal operation (31 milliseconds)
[info] - merge (10 milliseconds)
[info] CountMinSketchSuite:
[info] - accuracy - Byte (282 milliseconds)
[info] ExternalSorterSuite:
[info] - mergeInPlace - Byte (307 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - accuracy - Short (725 milliseconds)
[info] - mergeInPlace - Short (352 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - accuracy - Int (651 milliseconds)
[info] - mergeInPlace - Int (374 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - accuracy - Long (756 milliseconds)
[info] - empty data stream with kryo ser (3 seconds, 121 milliseconds)
[info] - mergeInPlace - Long (451 milliseconds)
[info] - empty data stream with java ser (253 milliseconds)
[info] - few elements per partition with kryo ser (224 milliseconds)
[info] - few elements per partition with java ser (148 milliseconds)
[info] - empty partitions with spilling with kryo ser (751 milliseconds)
[info] - empty partitions with spilling with java ser (303 milliseconds)
[info] - accuracy - String (2 seconds, 705 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] Test run finished: 0 failed, 0 ignored, 7 total, 12.933s
[info] Test run started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnException started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnClose started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.testActiveStreams started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulFetch started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedRPC started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedFetch started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulMergedBlockMeta started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedMergedBlockMeta started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulRPC started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.clearAllOutstandingRequests started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.297s
[info] Test run started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testNonMatching started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslAuthentication started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessageChunking started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testServerAlwaysEncrypt started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDataEncryptionIsActuallyEnabled started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDelegates started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessage started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testMatching started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testRpcHandlerDelegate started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.607s
[info] Test run started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchNonExistentChunk started
[info] - mergeInPlace - String (2 seconds, 21 milliseconds)
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchFileChunk started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBothChunks started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchChunkAndNonExistent started
[info] - task throws not serializable exception (9 seconds, 110 milliseconds)
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBufferChunk started
[info] - local-cluster format (5 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.116s
[info] Test run started
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.furtherRequestsDelay started
[info] - accuracy - Byte array (5 seconds, 19 milliseconds)
[info] - simple groupByKey (7 seconds, 51 milliseconds)
[info] - spilling in local cluster with kryo ser (10 seconds, 489 milliseconds)
[info] - mergeInPlace - Byte array (2 seconds, 571 milliseconds)
[info] - incompatible merge (3 milliseconds)
[info] JobGeneratorSuite:
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutCleanlyClosesClient started
[info] - groupByKey where map output sizes exceed maxMbInFlight (7 seconds, 223 milliseconds)
[info] - spilling in local cluster with java ser (10 seconds, 650 milliseconds)
[info] - SPARK-6222: Do not clear received block data too soon (8 seconds, 248 milliseconds)
[info] ReceiverInputDStreamSuite:
[info] - Without WAL enabled: createBlockRDD creates empty BlockRDD when no block info (101 milliseconds)
[info] - Without WAL enabled: createBlockRDD creates correct BlockRDD with block info (147 milliseconds)
[info] - Without WAL enabled: createBlockRDD filters non-existent blocks before creating BlockRDD (173 milliseconds)
[info] - With WAL enabled: createBlockRDD creates empty WALBackedBlockRDD when no block info (167 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - With WAL enabled: createBlockRDD creates correct WALBackedBlockRDD with all block info having WAL info (174 milliseconds)
[info] - With WAL enabled: createBlockRDD creates BlockRDD when some block info don't have WAL info (196 milliseconds)
[info] FileBasedWriteAheadLogWithFileCloseAfterWriteSuite:
[info] - FileBasedWriteAheadLog - read all logs (204 milliseconds)
[info] - FileBasedWriteAheadLog - write logs (1 second, 268 milliseconds)
[info] - accumulators (6 seconds, 327 milliseconds)
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutInactiveRequests started
[info] - FileBasedWriteAheadLog - read all logs after write (948 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - FileBasedWriteAheadLog - clean old logs (341 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs synchronously (427 milliseconds)
[info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (1 second, 325 milliseconds)
[info] - FileBasedWriteAheadLog - do not create directories or files unless write (27 milliseconds)
[info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (204 milliseconds)
[info] - FileBasedWriteAheadLog - close after write flag (35 milliseconds)
[info] RateLimiterSuite:
[info] - rate limiter initializes even without a maxRate set (4 milliseconds)
[info] - rate limiter updates when below maxRate (1 millisecond)
[info] - rate limiter stays below maxRate despite large updates (2 milliseconds)
[info] ReceivedBlockTrackerSuite:
[info] - block addition, and block to batch allocation (45 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - broadcast variables (6 seconds, 220 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 3 total, 32.0s
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslClientFallback started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslServerFallback started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthReplay started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testNewAuth started
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testLargeMessageEncryption started
[info] - spilling in local cluster with many reduce tasks with kryo ser (14 seconds, 461 milliseconds)
[info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthFailure started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.862s
[info] Test run started
[info] Test org.apache.spark.network.util.TimerWithCustomUnitSuite.testTimingViaContext started
[info] Test org.apache.spark.network.util.TimerWithCustomUnitSuite.testTimerWithMillisecondTimeUnit started
[info] Test org.apache.spark.network.util.TimerWithCustomUnitSuite.testTimerWithNanosecondTimeUnit started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.03s
[info] Test run started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.streamStatesAreFreedWhenConnectionIsClosedEvenIfBufferIteratorThrowsException started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.testMissingChunk started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.managedBuffersAreFreedWhenConnectionIsClosed started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.057s
[info] Test run started
[info] Test org.apache.spark.network.TransportRequestHandlerSuite.handleMergedBlockMetaRequest started
[info] Test org.apache.spark.network.TransportRequestHandlerSuite.handleStreamRequest started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.014s
[info] Test run started
[info] Test org.apache.spark.network.protocol.EncodersSuite.testBitmapArraysEncodeDecode started
[info] Test org.apache.spark.network.protocol.EncodersSuite.testRoaringBitmapEncodeShouldFailWhenBufferIsSmall started
[info] Test org.apache.spark.network.protocol.EncodersSuite.testRoaringBitmapEncodeDecode started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.015s
[info] Test run started
[info] Test org.apache.spark.network.StreamSuite.testSingleStream started
[info] Test org.apache.spark.network.StreamSuite.testMultipleStreams started
[info] - repeatedly failing task (5 seconds, 825 milliseconds)
[info] Test org.apache.spark.network.StreamSuite.testConcurrentStreams started
[info] Test org.apache.spark.network.StreamSuite.testZeroLengthStream started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.424s
[info] Test run started
[info] Test org.apache.spark.network.crypto.TransportCipherSuite.testBufferNotLeaksOnInternalError started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.061s
[info] Test run started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamConcurrently started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendOneWayMessage started
[info] Test org.apache.spark.network.RpcIntegrationSuite.singleRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.throwErrorRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.doubleTrouble started
[info] Test org.apache.spark.network.RpcIntegrationSuite.doubleRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.returnErrorRPC started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamFailures started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamOneAtATime started
[info] Test org.apache.spark.network.RpcIntegrationSuite.sendSuccessAndFailure started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.332s
[info] Test run started
[info] Test org.apache.spark.network.ProtocolSuite.responses started
[info] Test org.apache.spark.network.ProtocolSuite.requests started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.006s
[info] ResourceRequestHelperSuite:
[info] - empty SparkConf should be valid (264 milliseconds)
[info] - just normal resources are defined (7 milliseconds)
[info] - get yarn resources from configs (15 milliseconds)
[info] - get invalid yarn resources from configs (31 milliseconds)
[info] - valid request: value with unit (198 milliseconds)
[info] - valid request: value without unit (3 milliseconds)
[info] - valid request: multiple resources (5 milliseconds)
[info] - invalid request: value does not match pattern (3 milliseconds)
[info] - invalid request: only unit defined (3 milliseconds)
[info] - invalid request: invalid unit (5 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.memory.amount (3 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.memory-mb.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.mb.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.cores.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.executor.resource.vcores.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.am.resource.memory.amount (1 millisecond)
[info] - disallowed resource request: spark.yarn.driver.resource.memory.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.am.resource.cores.amount (2 milliseconds)
[info] - disallowed resource request: spark.yarn.driver.resource.cores.amount (2 milliseconds)
[info] - multiple disallowed resources in config (10 milliseconds)
[info] FailureTrackerSuite:
[info] - failures expire if validity interval is set (49 milliseconds)
[info] - failures never expire if validity interval is not set (-1) (4 milliseconds)
[info] YarnSparkHadoopUtilSuite:
[info] - shell script escaping (69 milliseconds)
[info] - Yarn configuration override (269 milliseconds)
[info] - test getApplicationAclsForYarn acls on (90 milliseconds)
[info] - test getApplicationAclsForYarn acls on and specify users (24 milliseconds)
[info] - SPARK-35672: test replaceEnvVars in Unix mode (17 milliseconds)
[info] - SPARK-35672: test replaceEnvVars in Windows mode (11 milliseconds)
[info] ClientSuite:
[info] - default Yarn application classpath (11 milliseconds)
[info] - default MR application classpath (1 millisecond)
[info] - resultant classpath for an application that defines a classpath for YARN (20 milliseconds)
[info] - resultant classpath for an application that defines a classpath for MR (14 milliseconds)
[info] - resultant classpath for an application that defines both classpaths, YARN and MR (15 milliseconds)
[info] - Local jar URIs (193 milliseconds)
[info] - Jar path propagation through SparkConf (1 second, 750 milliseconds)
[info] - Cluster path translation (6 milliseconds)
[info] - configuration and args propagate through createApplicationSubmissionContext (80 milliseconds)
[info] - specify a more specific type for the application !!! CANCELED !!! (15 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was true (ClientSuite.scala:221)
[info] org.scalatest.exceptions.TestCanceledException:
[info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475)
[info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474)
[info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310)
[info] at org.apache.spark.deploy.yarn.ClientSuite.$anonfun$new$19(ClientSuite.scala:221)
[info] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
[info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
[info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
[info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
[info] at scala.collection.immutable.List.foreach(List.scala:431)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
[info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
[info] at org.scalatest.Suite.run(Suite.scala:1112)
[info] at org.scalatest.Suite.run$(Suite.scala:1094)
[info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - spark.yarn.jars with multiple paths and globs (395 milliseconds) [info] - distribute jars archive (188 milliseconds) [info] - SPARK-37239: distribute jars archive with set STAGING_FILE_REPLICATION (169 milliseconds) [info] - distribute archive multiple times (868 milliseconds) [info] - distribute local spark jars (205 milliseconds) [info] - ignore same name jars (162 milliseconds) [info] - custom resource request (client mode) (26 milliseconds) [info] - custom resource request (cluster mode) (27 milliseconds) [info] - custom driver resource request yarn config and spark config fails (5 milliseconds) [info] - custom executor resource request yarn config and spark config fails (4 milliseconds) [info] - custom resources spark config mapped to yarn config (30 milliseconds) [info] - gpu/fpga spark resources mapped to custom yarn resources (22 milliseconds) [info] - test yarn jars path not exists (55 milliseconds) [info] - SPARK-31582 Being able to not populate Hadoop classpath (63 milliseconds) [info] - SPARK-35672: test Client.getUserClasspathUrls (12 milliseconds) [info] - files URI match test1 (1 millisecond) [info] - files URI match test2 (2 milliseconds) [info] - files URI match test3 (0 milliseconds) [info] - wasb URI match test (1 millisecond) [info] - hdfs URI match test (0 milliseconds) [info] - files URI unmatch test1 (1 millisecond) [info] - files URI unmatch test2 (0 milliseconds) [info] - files URI unmatch test3 (1 millisecond) [info] - wasb URI unmatch test1 (0 milliseconds) [info] - wasb URI unmatch test2 (1 millisecond) [info] - s3 URI unmatch test (0 milliseconds) [info] - hdfs URI unmatch test1 (1 millisecond) [info] - hdfs URI unmatch test2 (3 milliseconds) [info] ClientDistributedCacheManagerSuite: [info] - test getFileStatus empty (429 milliseconds) [info] - test getFileStatus cached (1 millisecond) [info] - test addResource (5 milliseconds) [info] - test addResource link null (2 milliseconds) [info] - test addResource appmaster only (2 milliseconds) [info] - test addResource archive (2 milliseconds) [info] ContainerPlacementStrategySuite: [info] - allocate locality preferred containers with enough resource and no matched existed containers (266 milliseconds) [info] - allocate locality preferred containers with enough resource and partially matched containers (32 milliseconds) [info] - allocate locality preferred containers with limited resource and partially matched containers (28 milliseconds) [info] - allocate locality preferred containers with fully matched containers (29 milliseconds) [info] - allocate containers with no locality preference (34 milliseconds) [info] - allocate locality preferred containers by considering the localities of pending requests (81 milliseconds) [info] YarnClusterSuite: [info] - repeatedly failing task that crashes JVM (10 seconds, 466 milliseconds) [info] - spilling in local cluster with many reduce tasks with java ser (12 seconds, 
990 milliseconds) [info] - block addition, and block to batch allocation with many blocks (21 seconds, 528 milliseconds) [info] - recovery with write ahead logs should remove only allocated blocks from received queue (63 milliseconds) [info] - cleanup of intermediate files in sorter (322 milliseconds) [info] - cleanup of intermediate files in sorter with failures (249 milliseconds) [info] - block allocation to batch should not loose blocks from received queue (927 milliseconds) [info] - recovery and cleanup with write ahead logs (95 milliseconds) [info] - disable write ahead log when checkpoint directory is not set (3 milliseconds) [info] - parallel file deletion in FileBasedWriteAheadLog is robust to deletion error (135 milliseconds) [info] ReceivedBlockHandlerWithEncryptionSuite: [info] - cleanup of intermediate files in shuffle (1 second, 137 milliseconds) [info] - BlockManagerBasedBlockHandler - store blocks (329 milliseconds) [info] - cleanup of intermediate files in shuffle with failures (345 milliseconds) [info] - BlockManagerBasedBlockHandler - handle errors in storing block (12 milliseconds) [info] - no sorting or partial aggregation with kryo ser (173 milliseconds) [info] - WriteAheadLogBasedBlockHandler - store blocks (183 milliseconds) [info] - no sorting or partial aggregation with java ser (94 milliseconds) [info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (31 milliseconds) [info] - no sorting or partial aggregation with spilling with kryo ser (136 milliseconds) [info] - WriteAheadLogBasedBlockHandler - clean old blocks (64 milliseconds) [info] - no sorting or partial aggregation with spilling with java ser (173 milliseconds) [info] - sorting, no partial aggregation with kryo ser (103 milliseconds) [info] - Test Block - count messages (195 milliseconds) [info] - sorting, no partial aggregation with java ser (115 milliseconds) [info] - Test Block - isFullyConsumed (52 milliseconds) [info] ReceiverSchedulingPolicySuite: [info] - rescheduleReceiver: empty executors (1 millisecond) [info] - rescheduleReceiver: receiver preferredLocation (6 milliseconds) [info] - rescheduleReceiver: return all idle executors if there are any idle executors (7 milliseconds) [info] - rescheduleReceiver: return all executors that have minimum weight if no idle executors (5 milliseconds) [info] - scheduleReceivers: schedule receivers evenly when there are more receivers than executors (6 milliseconds) [info] - scheduleReceivers: schedule receivers evenly when there are more executors than receivers (5 milliseconds) [info] - sorting, no partial aggregation with spilling with kryo ser (110 milliseconds) [info] - scheduleReceivers: schedule receivers evenly when the preferredLocations are even (6 milliseconds) [info] - scheduleReceivers: return empty if no receiver (1 millisecond) [info] - scheduleReceivers: return empty scheduled executors if no executors (2 milliseconds) [info] MapWithStateSuite: [info] - state - get, exists, update, remove, (6 milliseconds) [info] - sorting, no partial aggregation with spilling with java ser (99 milliseconds) [info] - partial aggregation, no sorting with kryo ser (160 milliseconds) [info] - partial aggregation, no sorting with java ser (159 milliseconds) [info] - partial aggregation, no sorting with spilling with kryo ser (111 milliseconds) [info] - partial aggregation, no sorting with spilling with java ser (83 milliseconds) [info] - partial aggregation and sorting with kryo ser (77 milliseconds) [info] - partial aggregation and sorting with 
java ser (124 milliseconds) [info] - partial aggregation and sorting with spilling with kryo ser (127 milliseconds) [info] - partial aggregation and sorting with spilling with java ser (153 milliseconds) [info] - mapWithState - basic operations with simple API (1 second, 796 milliseconds) [info] - mapWithState - basic operations with advanced API (916 milliseconds) [info] - mapWithState - type inferencing and class tags (13 milliseconds) [info] - mapWithState - states as mapped data (690 milliseconds) [info] - sort without breaking sorting contracts with kryo ser (3 seconds, 15 milliseconds) [info] - mapWithState - initial states, with nothing returned as from mapping function (634 milliseconds) [info] - mapWithState - state removing (735 milliseconds) [info] - sort without breaking sorting contracts with java ser (2 seconds, 245 milliseconds) [info] - sort without breaking timsort contracts for large arrays !!! IGNORED !!! [info] - spilling with hash collisions (310 milliseconds) [info] - mapWithState - state timing out (1 second, 732 milliseconds) [info] - repeatedly failing task that crashes JVM with a zero exit code (SPARK-16925) (12 seconds, 896 milliseconds) [info] - mapWithState - checkpoint durations (123 milliseconds) ------------------------------------------- Time: 1000 ms ------------------------------------------- [info] - spilling with many hash collisions (937 milliseconds) ------------------------------------------- Time: 2000 ms ------------------------------------------- (a,1) ------------------------------------------- Time: 3000 ms ------------------------------------------- (a,2) (b,1) ------------------------------------------- Time: 3000 ms ------------------------------------------- (a,2) (b,1) [info] - spilling with hash collisions using the Int.MaxValue key (318 milliseconds) ------------------------------------------- Time: 4000 ms ------------------------------------------- (a,3) (b,2) (c,1) ------------------------------------------- Time: 5000 ms ------------------------------------------- (c,1) (a,4) (b,3) ------------------------------------------- Time: 6000 ms ------------------------------------------- (b,3) (c,1) (a,5) ------------------------------------------- Time: 7000 ms ------------------------------------------- (a,5) (b,3) (c,1) [info] - mapWithState - driver failure recovery (983 milliseconds) [info] - spilling with null keys and values (275 milliseconds) [info] JavaStreamingListenerWrapperSuite: [info] - basic (23 milliseconds) [info] DurationSuite: [info] - less (0 milliseconds) [info] - lessEq (0 milliseconds) [info] - greater (0 milliseconds) [info] - greaterEq (0 milliseconds) [info] - plus (0 milliseconds) [info] - minus (1 millisecond) [info] - times (0 milliseconds) [info] - div (0 milliseconds) [info] - isMultipleOf (0 milliseconds) [info] - min (1 millisecond) [info] - max (0 milliseconds) [info] - isZero (0 milliseconds) [info] - Milliseconds (0 milliseconds) [info] - Seconds (0 milliseconds) [info] - Minutes (1 millisecond) [info] PIDRateEstimatorSuite: [info] - the right estimator is created (10 milliseconds) [info] - estimator checks ranges (3 milliseconds) [info] - first estimate is None (5 milliseconds) [info] - second estimate is not None (1 millisecond) [info] - no estimate when no time difference between successive calls (1 millisecond) [info] - no estimate when no records in previous batch (0 milliseconds) [info] - no estimate when there is no processing delay (0 milliseconds) [info] - estimate is never less than min rate (6 
milliseconds) [info] - with no accumulated or positive error, |I| > 0, follow the processing speed (5 milliseconds) [info] - with no accumulated but some positive error, |I| > 0, follow the processing speed (4 milliseconds) [info] - with some accumulated and some positive error, |I| > 0, stay below the processing speed (24 milliseconds) [info] WindowOperationsSuite: [info] - window - basic window (432 milliseconds) [info] - window - tumbling window (429 milliseconds) [info] - window - larger window (492 milliseconds) [info] - window - non-overlapping window (344 milliseconds) [info] - window - persistence level (61 milliseconds) [info] - sorting updates peak execution memory (1 second, 953 milliseconds) [info] - reduceByKeyAndWindow - basic reduction (503 milliseconds) [info] - reduceByKeyAndWindow - key already in window and new value added into window (308 milliseconds) [info] - reduceByKeyAndWindow - new key added into window (285 milliseconds) [info] - force to spill for external sorter (969 milliseconds) [info] DAGSchedulerSuite: [info] - reduceByKeyAndWindow - key removed from window (423 milliseconds) [info] - reduceByKeyAndWindow - larger slide time (905 milliseconds) [info] - [SPARK-3353] parent stage should have lower stage id (1 second, 970 milliseconds) [info] - [SPARK-13902] Ensure no duplicate stages are created (110 milliseconds) [info] - reduceByKeyAndWindow - big test (927 milliseconds) [info] - All shuffle files on the storage endpoint should be cleaned up when it is lost (117 milliseconds) [info] - SPARK-32003: All shuffle files for executor should be cleaned up on fetch failure (101 milliseconds) [info] - zero split job (70 milliseconds) [info] - reduceByKeyAndWindow with inverse function - basic reduction (313 milliseconds) [info] - run trivial job (70 milliseconds) [info] - run trivial job w/ dependency (70 milliseconds) [info] - equals and hashCode AccumulableInfo (0 milliseconds) [info] - cache location preferences w/ dependency (62 milliseconds) [info] - reduceByKeyAndWindow with inverse function - key already in window and new value added into window (289 milliseconds) [info] - regression test for getCacheLocs (78 milliseconds) [info] - getMissingParentStages should consider all ancestor RDDs' cache statuses (97 milliseconds) [info] - reduceByKeyAndWindow with inverse function - new key added into window (349 milliseconds) [info] - avoid exponential blowup when getting preferred locs list (235 milliseconds) [info] - unserializable task (104 milliseconds) [info] - caching (encryption = off) (7 seconds, 551 milliseconds) [info] - trivial job failure (97 milliseconds) [info] - trivial job cancellation (67 milliseconds) [info] - reduceByKeyAndWindow with inverse function - key removed from window (434 milliseconds) [info] - job cancellation no-kill backend (125 milliseconds) [info] - run trivial shuffle (145 milliseconds) [info] - run trivial shuffle with fetch failure (113 milliseconds) [info] - shuffle files not lost when executor process lost with shuffle service (51 milliseconds) [info] - reduceByKeyAndWindow with inverse function - larger slide time (506 milliseconds) [info] - shuffle files lost when worker lost with shuffle service (54 milliseconds) [info] - shuffle files lost when worker lost without shuffle service (88 milliseconds) [info] - shuffle files not lost when executor failure with shuffle service (92 milliseconds) [info] - shuffle files lost when executor failure without shuffle service (126 milliseconds) [info] - SPARK-28967 properties must be 
cloned before posting to listener bus for 0 partition (60 milliseconds) [info] - Single stage fetch failure should not abort the stage. (91 milliseconds) [info] - Multiple consecutive stage fetch failures should lead to job being aborted. (89 milliseconds) [info] - reduceByKeyAndWindow with inverse function - big test (735 milliseconds) [info] - Failures in different stages should not trigger an overall abort (128 milliseconds) [info] - Non-consecutive stage failures don't trigger abort (191 milliseconds) [info] - trivial shuffle with multiple fetch failures (96 milliseconds) [info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by FetchFailure (162 milliseconds) [info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by TaskKilled (93 milliseconds) [info] - reduceByKeyAndWindow with inverse and filter functions - big test (762 milliseconds) [info] - Fail the job if a barrier ResultTask failed (101 milliseconds) [info] - late fetch failures don't cause multiple concurrent attempts for the same map stage (100 milliseconds) [info] - extremely late fetch failures don't cause multiple concurrent attempts for the same stage (160 milliseconds) [info] - task events always posted in speculation / when stage is killed (144 milliseconds) [info] - ignore late map task completions (83 milliseconds) [info] - groupByKeyAndWindow (775 milliseconds) [info] - run shuffle with map stage failure (191 milliseconds) [info] - shuffle fetch failure in a reused shuffle dependency (106 milliseconds) [info] - don't submit stage until its dependencies map outputs are registered (SPARK-5259) (184 milliseconds) [info] - register map outputs correctly after ExecutorLost and task Resubmitted (77 milliseconds) [info] - countByWindow (593 milliseconds) [info] - failure of stage used by two jobs (89 milliseconds) [info] - stage used by two jobs, the first no longer active (SPARK-6880) (127 milliseconds) [info] - stage used by two jobs, some fetch failures, and the first job no longer active (SPARK-6880) (113 milliseconds) [info] - countByValueAndWindow (382 milliseconds) [info] TimeSuite: [info] - less (1 millisecond) [info] - lessEq (0 milliseconds) [info] - greater (0 milliseconds) [info] - greaterEq (1 millisecond) [info] - plus (1 millisecond) [info] - minus Time (1 millisecond) [info] - minus Duration (0 milliseconds) [info] - floor (0 milliseconds) [info] - isMultipleOf (0 milliseconds) [info] - min (0 milliseconds) [info] - max (1 millisecond) [info] - until (1 millisecond) [info] - to (1 millisecond) [info] DStreamScopeSuite: [info] - run trivial shuffle with out-of-band executor failure and retry (85 milliseconds) [info] - dstream without scope (2 milliseconds) [info] - input dstream without scope (4 milliseconds) [info] - recursive shuffle failures (159 milliseconds) [info] - scoping simple operations (16 milliseconds) [info] - cached post-shuffle (132 milliseconds) [info] - scoping nested operations (61 milliseconds) [info] - transform should allow RDD operations to be captured in scopes (35 milliseconds) [info] - foreachRDD should allow RDD operations to be captured in scope (59 milliseconds) [info] StreamingContextSuite: [info] - from no conf constructor (99 milliseconds) [info] - SPARK-30388: shuffle fetch failed on speculative task, but original task succeed (631 milliseconds) [info] - from no conf + spark home (107 milliseconds) [info] - misbehaved accumulator should not crash DAGScheduler and SparkContext (161 milliseconds) [info] - from no conf + spark 
home + env (123 milliseconds) [info] - misbehaved accumulator should not impact other accumulators (196 milliseconds) [info] - misbehaved resultHandler should not crash DAGScheduler and SparkContext (240 milliseconds) [info] - from conf with settings (449 milliseconds) [info] - from existing SparkContext (42 milliseconds) [info] - invalid spark.job.interruptOnCancel should not crash DAGScheduler (107 milliseconds) [info] - from existing SparkContext with settings (69 milliseconds) [info] - getPartitions exceptions should not crash DAGScheduler and SparkContext (SPARK-8606) (88 milliseconds) [info] - getPreferredLocations errors should not crash DAGScheduler and SparkContext (SPARK-8606) (80 milliseconds) [info] - from checkpoint (176 milliseconds) [info] - accumulator not calculated for resubmitted result stage (68 milliseconds) [info] - accumulator not calculated for resubmitted task in result stage (77 milliseconds) [info] - checkPoint from conf (92 milliseconds) [info] - state matching (1 millisecond) [info] - accumulators are updated on exception failures and task killed (119 milliseconds) [info] - reduce tasks should be placed locally with map output (93 milliseconds) [info] - start and stop state check (327 milliseconds) [info] - reduce task locality preferences should only include machines with largest map outputs (89 milliseconds) [info] - stages with both narrow and shuffle dependencies use narrow ones for locality (69 milliseconds) [info] - start with non-serializable DStream checkpoints (122 milliseconds) [info] - Spark exceptions should include call site in stack trace (101 milliseconds) [info] - start failure should stop internal components (67 milliseconds) [info] - catch errors in event loop (106 milliseconds) [info] - simple map stage submission (89 milliseconds) [info] - map stage submission with reduce stage also depending on the data (77 milliseconds) [info] - map stage submission with fetch failure (94 milliseconds) [info] - start should set local properties of streaming jobs correctly (460 milliseconds) [info] - start multiple times (42 milliseconds) [info] - caching (encryption = on) (7 seconds, 240 milliseconds) [info] - map stage submission with multiple shared stages and failures (133 milliseconds) [info] - stop multiple times (70 milliseconds) [info] - Trigger mapstage's job listener in submitMissingTasks (80 milliseconds) [info] - stop before start (55 milliseconds) [info] - start after stop (49 milliseconds) [info] - map stage submission with executor failure late map task completions (81 milliseconds) [info] - getShuffleDependenciesAndResourceProfiles correctly returns only direct shuffle parents (55 milliseconds) [info] - stop only streaming context (172 milliseconds) [info] - stop(stopSparkContext=true) after stop(stopSparkContext=false) (73 milliseconds) [info] - SPARK-17644: After one stage is aborted for too many failed attempts, subsequent stages still behave correctly on fetch failures (1 second, 537 milliseconds) [info] - [SPARK-19263] DAGScheduler should not submit multiple active tasksets, even with late completions from earlier stage attempts (194 milliseconds) [info] - task end event should have updated accumulators (SPARK-20342) (470 milliseconds) [info] - Barrier task failures from the same stage attempt don't trigger multiple stage retries (170 milliseconds) [info] - Barrier task failures from a previous stage attempt don't trigger stage retry (121 milliseconds) [info] - SPARK-25341: abort stage while using old fetch protocol (110 milliseconds)
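The StreamingContextSuite results above ("start multiple times", "stop only streaming context", "stop(stopSparkContext=true) after stop(stopSparkContext=false)") all exercise the StreamingContext lifecycle. As a hedged illustration only, not taken from this build's sources, here is a minimal Scala sketch of that lifecycle; the app name, batch interval, and queue-backed input are assumptions chosen to keep it self-contained:

```scala
import scala.collection.mutable
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StartStopSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("start-stop-sketch")
    val ssc = new StreamingContext(conf, Seconds(1))

    // A queue-backed input keeps the sketch self-contained; print() emits the
    // "Time: ... ms" banner blocks seen throughout this log.
    val queue = new mutable.Queue[RDD[Int]]()
    ssc.queueStream(queue).count().print()
    queue += ssc.sparkContext.makeRDD(1 to 100)

    ssc.start()
    ssc.awaitTerminationOrTimeout(3000)
    // stopSparkContext = false stops only the streaming side, as in the
    // "stop only streaming context" entry above; the SparkContext survives.
    ssc.stop(stopSparkContext = false, stopGracefully = true)
    ssc.sparkContext.stop()
  }
}
```

Stopping with stopSparkContext = false leaves the underlying SparkContext usable until it is stopped explicitly, which is the distinction these suite entries check.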
[info] - SPARK-25341: retry all the succeeding stages when the map stage is indeterminate (181 milliseconds) [info] - SPARK-25341: continuous indeterminate stage roll back (183 milliseconds) [info] - SPARK-29042: Sampled RDD with unordered input should be indeterminate (138 milliseconds) [info] - SPARK-23207: cannot rollback a result stage (108 milliseconds) [info] - SPARK-23207: local checkpoint fail to rollback (checkpointed before) (237 milliseconds) [info] - SPARK-23207: local checkpoint fail to rollback (checkpointing now) (117 milliseconds) [info] - SPARK-23207: reliable checkpoint can avoid rollback (checkpointed before) (550 milliseconds) [info] - SPARK-23207: reliable checkpoint fail to rollback (checkpointing now) (150 milliseconds) [info] - SPARK-27164: RDD.countApprox on empty RDDs schedules jobs which never complete (145 milliseconds) [info] - Completions in zombie tasksets update status of non-zombie taskset (179 milliseconds) [info] - test default resource profile (107 milliseconds) [info] - test 1 resource profile (91 milliseconds) [info] - test 2 resource profiles errors by default (126 milliseconds) [info] - test 2 resource profile with merge conflict config true (145 milliseconds) [info] - caching on disk (encryption = off) (5 seconds, 948 milliseconds) [info] - test multiple resource profiles created from merging use same rp (182 milliseconds) [info] - test merge 2 resource profiles multiple configs (3 milliseconds) [info] - test merge 3 resource profiles (1 millisecond) [info] - getShuffleDependenciesAndResourceProfiles returns deps and profiles correctly (57 milliseconds) [info] - SPARK-32920: shuffle merge finalization (538 milliseconds) [info] - SPARK-32920: merger locations not empty (97 milliseconds) [info] - run Spark in yarn-client mode (28 seconds, 165 milliseconds) [info] - SPARK-32920: merger locations reuse from shuffle dependency (91 milliseconds) [info] - SPARK-32920: Disable shuffle merge due to not enough mergers available (84 milliseconds) [info] - SPARK-32920: Ensure child stage should not start before all the parent stages are completed with shuffle merge finalized for all the parent stages (89 milliseconds) [info] - SPARK-32920: Reused ShuffleDependency with Shuffle Merge disabled for the corresponding ShuffleDependency should not cause DAGScheduler to hang (113 milliseconds) [info] - SPARK-32920: Reused ShuffleDependency with Shuffle Merge disabled for the corresponding ShuffleDependency with shuffle data loss should recompute missing partitions (148 milliseconds) [info] - SPARK-32920: Empty RDD should not be computed (155 milliseconds) [info] - SPARK-32920: Merge results should be unregistered if the running stage is cancelled before shuffle merge is finalized (158 milliseconds) [info] - SPARK-32920: SPARK-35549: Merge results should not get registered after shuffle merge finalization (140 milliseconds) [info] - SPARK-32920: Disable push based shuffle in the case of a barrier stage (146 milliseconds) [info] - SPARK-32920: metadata fetch failure should not unregister map status (176 milliseconds) [info] - SPARK-32923: handle stage failure for indeterminate map stage with push-based shuffle (199 milliseconds) [info] FsHistoryProviderSuite: [info] - Parse application logs (inMemory = true) (336 milliseconds) [info] - Parse application logs (inMemory = false) (898 milliseconds) [info] - SPARK-31608: parse application logs with HybridStore (357 milliseconds) [info] - SPARK-3697: ignore files that cannot be read. 
(158 milliseconds) [info] - history file is renamed from inprogress to completed (211 milliseconds) [info] - Parse logs that application is not started (80 milliseconds) [info] - SPARK-5582: empty log directory (176 milliseconds) [info] - stop gracefully (11 seconds, 154 milliseconds) [info] - apps with multiple attempts with order (580 milliseconds) [info] - caching on disk (encryption = on) (6 seconds, 206 milliseconds) [info] - log urls without customization (431 milliseconds) [info] - stop gracefully even if a receiver misses StopReceiver (782 milliseconds) [info] - custom log urls, including FILE_NAME (340 milliseconds) [info] - custom log urls, excluding FILE_NAME (280 milliseconds) [info] - custom log urls with invalid attribute (383 milliseconds) [info] - custom log urls, LOG_FILES not available while FILE_NAME is specified (313 milliseconds) [info] - custom log urls, app not finished, applyIncompleteApplication: true (272 milliseconds) [info] - custom log urls, app not finished, applyIncompleteApplication: false (320 milliseconds) [info] - log cleaner (120 milliseconds) [info] - should not clean inprogress application with lastUpdated time less than maxTime (105 milliseconds) [info] - log cleaner for inProgress files (114 milliseconds) [info] - Event log copy (116 milliseconds) [info] - driver log cleaner (113 milliseconds) [info] - SPARK-8372: new logs with no app ID are ignored (68 milliseconds) [info] - provider correctly checks whether fs is in safe mode (649 milliseconds) [info] - provider waits for safe mode to finish before initializing (43 milliseconds) [info] - provider reports error after FS leaves safe mode (63 milliseconds) [info] - ignore hidden files (101 milliseconds) [info] - support history server ui admin acls (528 milliseconds) [info] - mismatched version discards old listing (127 milliseconds) [info] - invalidate cached UI (363 milliseconds) [info] - clean up stale app information (252 milliseconds) [info] - SPARK-21571: clean up removes invalid history files (79 milliseconds) [info] - always find end event for finished apps (154 milliseconds) [info] - parse event logs with optimizations off (138 milliseconds) [info] - SPARK-24948: ignore files we don't have read permission on (384 milliseconds) [info] - check in-progress event logs absolute length (263 milliseconds) [info] - caching in memory, replicated (encryption = off) (6 seconds, 333 milliseconds) [info] - log cleaner with the maximum number of log files (625 milliseconds) [info] - backwards compatibility with LogInfo from Spark 2.4 (8 milliseconds) [info] - SPARK-29755 LogInfo should be serialized/deserialized by jackson properly (4 milliseconds) [info] - SPARK-29755 AttemptInfoWrapper should be serialized/deserialized by jackson properly (39 milliseconds) [info] - SPARK-29043: clean up specified event log (84 milliseconds) [info] - compact event log files (276 milliseconds) [info] - SPARK-33146: don't let one bad rolling log folder prevent loading other applications (104 milliseconds) [info] - SPARK-36354: EventLogFileReader should skip rolling event log directories with no logs (60 milliseconds) [info] - SPARK-33215: check ui view permissions without retrieving ui (215 milliseconds) [info] RDDSuite: [info] - basic operations (1 second, 153 milliseconds) [info] - serialization (3 milliseconds) [info] - distinct with known partitioner preserves partitioning (426 milliseconds) [info] - countApproxDistinct (155 milliseconds) [info] - SparkContext.union (80 milliseconds) [info] - SparkContext.union parallel 
partition listing (135 milliseconds) [info] - SparkContext.union creates UnionRDD if at least one RDD has no partitioner (5 milliseconds) [info] - SparkContext.union creates PartitionAwareUnionRDD if all RDDs have partitioners (4 milliseconds) [info] - PartitionAwareUnionRDD raises exception if at least one RDD has no partitioner (2 milliseconds) [info] - SPARK-23778: empty RDD in union should not produce a UnionRDD (11 milliseconds) [info] - partitioner aware union (170 milliseconds) [info] - UnionRDD partition serialized size should be small (10 milliseconds) [info] - fold (27 milliseconds) [info] - fold with op modifying first arg (29 milliseconds) [info] - aggregate (37 milliseconds) [info] - treeAggregate (781 milliseconds) [info] - treeAggregate with ops modifying first args (857 milliseconds) [info] - SPARK-36419: treeAggregate with finalAggregateOnExecutor set to true (1 second, 39 milliseconds) [info] - caching in memory, replicated (encryption = off) (with replication as stream) (6 seconds, 349 milliseconds) [info] - treeReduce (642 milliseconds) [info] - basic caching (41 milliseconds) [info] - caching with failures (24 milliseconds) [info] - empty RDD (194 milliseconds) [info] - repartitioned RDDs (249 milliseconds) [info] - stop slow receiver gracefully (15 seconds, 883 milliseconds) [info] - registering and de-registering of streamingSource (73 milliseconds) [info] - SPARK-28709 registering and de-registering of progressListener (160 milliseconds) [info] - repartitioned RDDs perform load balancing (3 seconds, 84 milliseconds) [info] - coalesced RDDs (225 milliseconds) [info] - coalesced RDDs with locality (88 milliseconds) [info] - coalesced RDDs with partial locality (57 milliseconds) [info] - coalesced RDDs with locality, large scale (10K partitions) (1 second, 427 milliseconds) [info] - awaitTermination (2 seconds, 222 milliseconds) [info] - awaitTermination after stop (107 milliseconds) [info] - coalesced RDDs with partial locality, large scale (10K partitions) (512 milliseconds) [info] - coalesced RDDs with locality, fail first pass (10 milliseconds) [info] - zipped RDDs (70 milliseconds) [info] - partition pruning (23 milliseconds) [info] - caching in memory, replicated (encryption = on) (6 seconds, 303 milliseconds) [info] - awaitTermination with error in task (600 milliseconds) [info] - awaitTermination with error in job generation (431 milliseconds) [info] - awaitTerminationOrTimeout (1 second, 243 milliseconds) [info] - collect large number of empty partitions (2 seconds, 731 milliseconds) [info] - getOrCreate (1 second, 201 milliseconds) [info] - getActive and getActiveOrCreate (308 milliseconds) [info] - getActiveOrCreate with checkpoint (826 milliseconds) [info] - multiple streaming contexts (85 milliseconds) [info] - DStream and generated RDD creation sites (415 milliseconds) [info] - take (2 seconds, 320 milliseconds) [info] - throw exception on using active or stopped context (114 milliseconds) [info] - top with predefined ordering (140 milliseconds) [info] - top with custom ordering (21 milliseconds) [info] - takeOrdered with predefined ordering (18 milliseconds) [info] - takeOrdered with limit 0 (1 millisecond) [info] - takeOrdered with custom ordering (17 milliseconds) [info] - isEmpty (122 milliseconds) [info] - sample preserves partitioner (3 milliseconds) [info] - queueStream doesn't support checkpointing (305 milliseconds) [info] - Creating an InputDStream but not using it should not crash (975 milliseconds) [info] - run Spark in yarn-cluster mode (31 
seconds, 47 milliseconds) [info] - caching in memory, replicated (encryption = on) (with replication as stream) (6 seconds, 990 milliseconds) [info] - caching in memory, serialized, replicated (encryption = off) (6 seconds, 583 milliseconds)
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
    at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242)
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:187)
    at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:334)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:925)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2227)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2248)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2267)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2292)
    at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1021)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:1020)
    at org.apache.spark.streaming.StreamingContextSuite.$anonfun$new$133(StreamingContextSuite.scala:842)
    at org.apache.spark.streaming.StreamingContextSuite.$anonfun$new$133$adapted(StreamingContextSuite.scala:840)
    at org.apache.spark.streaming.dstream.DStream.$anonfun$foreachRDD$2(DStream.scala:629)
    at org.apache.spark.streaming.dstream.DStream.$anonfun$foreachRDD$2$adapted(DStream.scala:629)
    at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$2(ForEachDStream.scala:51)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:417)
    at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$1(ForEachDStream.scala:51)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.Try$.apply(Try.scala:213)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.$anonfun$run$1(JobScheduler.scala:256)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
[info] - SPARK-18560 Receiver data should be deserialized properly.
(11 seconds, 713 milliseconds) [info] - SPARK-22955 graceful shutdown shouldn't lead to job generation error (1 second, 795 milliseconds) [info] DStreamClosureSuite: [info] - user provided closures are actually cleaned (113 milliseconds) [info] BatchedWriteAheadLogSuite: [info] - BatchedWriteAheadLog - read all logs (58 milliseconds) [info] - BatchedWriteAheadLog - write logs (176 milliseconds) [info] - BatchedWriteAheadLog - read all logs after write (185 milliseconds) [info] - caching in memory, serialized, replicated (encryption = off) (with replication as stream) (6 seconds, 483 milliseconds) [info] - BatchedWriteAheadLog - clean old logs (224 milliseconds) [info] - BatchedWriteAheadLog - clean old logs synchronously (128 milliseconds) [info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (371 milliseconds) [info] - BatchedWriteAheadLog - do not create directories or files unless write (5 milliseconds) [info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (31 milliseconds) [info] - BatchedWriteAheadLog - serializing and deserializing batched records (2 milliseconds) [info] - BatchedWriteAheadLog - failures in wrappedLog get bubbled up (119 milliseconds) [info] - BatchedWriteAheadLog - name log with the highest timestamp of aggregated entries (24 milliseconds) [info] - BatchedWriteAheadLog - shutdown properly (2 milliseconds) [info] - BatchedWriteAheadLog - fail everything in queue during shutdown (8 milliseconds) [info] ReceiverTrackerSuite: [info] - send rate update to receivers (413 milliseconds) [info] - should restart receiver after stopping it (859 milliseconds) [info] - SPARK-11063: TaskSetManager should use Receiver RDD's preferredLocations (563 milliseconds) [info] - get allocated executors (426 milliseconds) [info] StateMapSuite: [info] - EmptyStateMap (1 millisecond) [info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove (1 millisecond) [info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove with copy (1 millisecond) [info] - OpenHashMapBasedStateMap - serializing and deserializing (49 milliseconds) [info] - OpenHashMapBasedStateMap - serializing and deserializing with compaction (7 milliseconds) [info] - takeSample (19 seconds, 49 milliseconds) [info] - takeSample from an empty rdd (15 milliseconds) [info] - randomSplit (528 milliseconds) [info] - runJob on an invalid partition (10 milliseconds) [info] - sort an empty RDD (34 milliseconds) [info] - sortByKey (241 milliseconds) [info] - sortByKey ascending parameter (134 milliseconds) [info] - sortByKey with explicit ordering (110 milliseconds) [info] - repartitionAndSortWithinPartitions (39 milliseconds) [info] - SPARK-32384: repartitionAndSortWithinPartitions without shuffle (17 milliseconds) [info] - cartesian on empty RDD (35 milliseconds) [info] - cartesian on non-empty RDDs (89 milliseconds) [info] - intersection (117 milliseconds) [info] - intersection strips duplicates in an input (114 milliseconds) [info] - zipWithIndex (44 milliseconds) [info] - zipWithIndex with a single partition (17 milliseconds) [info] - zipWithIndex chained with other RDDs (SPARK-4433) (54 milliseconds) [info] - zipWithUniqueId (61 milliseconds) [info] - retag with implicit ClassTag (29 milliseconds) [info] - parent method (9 milliseconds) [info] - getNarrowAncestors (53 milliseconds) [info] - getNarrowAncestors with multiple parents (25 milliseconds) [info] - getNarrowAncestors with cycles (30 milliseconds) [info] - task serialization exception 
should not hang scheduler (116 milliseconds) [info] - RDD.partitions() fails fast when partitions indices are incorrect (SPARK-13021) (3 milliseconds) [info] - nested RDDs are not supported (SPARK-5063) (43 milliseconds) [info] - actions cannot be performed inside of transformations (SPARK-5063) (28 milliseconds) [info] - caching in memory, serialized, replicated (encryption = on) (6 seconds, 159 milliseconds) [info] - custom RDD coalescer (693 milliseconds) [info] - SPARK-18406: race between end-of-task and completion iterator read lock release (45 milliseconds) [info] - SPARK-27666: Do not release lock while TaskContext already completed (1 second, 46 milliseconds) [info] - SPARK-23496: order of input partitions can result in severe skew in coalesce (4 milliseconds) [info] - cannot run actions after SparkContext has been stopped (SPARK-5063) (74 milliseconds) [info] - cannot call methods on a stopped SparkContext (SPARK-5063) (4 milliseconds) [info] ExecutorSuite: [info] - run Spark in yarn-client mode with unmanaged am (22 seconds, 39 milliseconds) [info] - SPARK-15963: Catch `TaskKilledException` correctly in Executor.TaskRunner (280 milliseconds) [info] - SPARK-19276: Handle FetchFailedExceptions that are hidden by user exceptions (101 milliseconds) [info] - Executor's worker threads should be UninterruptibleThread (114 milliseconds) [info] - SPARK-19276: OOMs correctly handled with a FetchFailure (91 milliseconds) [info] - SPARK-23816: interrupts are not masked by a FetchFailure (70 milliseconds) [info] - Gracefully handle error in task deserialization (18 milliseconds) [info] - Heartbeat should drop zero accumulator updates (118 milliseconds) [info] - Heartbeat should not drop zero accumulator updates when the conf is disabled (6 milliseconds) [info] - Send task executor metrics in DirectTaskResult (65 milliseconds) [info] - Send task executor metrics in TaskKilled (52 milliseconds) [info] - Send task executor metrics in ExceptionFailure (59 milliseconds) [info] - SPARK-34949: do not re-register BlockManager when executor is shutting down (9 milliseconds) [info] - SPARK-33587: isFatalError (125 milliseconds) [info] SerDeUtilSuite: [info] - Converting an empty pair RDD to python does not throw an exception (SPARK-5441) (42 milliseconds) [info] - Converting an empty python RDD to pair RDD does not throw an exception (SPARK-5441) (50 milliseconds) [info] UtilsSuite: [info] - timeConversion (2 milliseconds) [info] - Test byteString conversion (3 milliseconds) [info] - bytesToString (1 millisecond) [info] - copyStream (7 milliseconds) [info] - copyStreamUpTo (33 milliseconds) [info] - memoryStringToMb (1 millisecond) [info] - splitCommandString (1 millisecond) [info] - string formatting of time durations (2 milliseconds) [info] - reading offset bytes of a file (8 milliseconds) [info] - reading offset bytes of a file (compressed) (8 milliseconds) [info] - reading offset bytes across multiple files (26 milliseconds) [info] - reading offset bytes across multiple files (compressed) (13 milliseconds) [info] - deserialize long value (0 milliseconds) [info] - writeByteBuffer should not change ByteBuffer position (1 millisecond) [info] - get iterator size (2 milliseconds) [info] - getIteratorZipWithIndex (2 milliseconds) [info] - SPARK-35907: createDirectory (26 milliseconds) [info] - doesDirectoryContainFilesNewerThan (10 milliseconds) [info] - resolveURI (1 millisecond) [info] - resolveURIs with multiple paths (3 milliseconds) [info] - nonLocalPaths (1 millisecond) [info] - isBindCollision (2 
milliseconds) [info] - log4j log level change (1 millisecond) [info] - deleteRecursively (12 milliseconds) [info] - loading properties from file (9 milliseconds) [info] - OpenHashMapBasedStateMap - all possible sequences of operations with copies (8 seconds, 90 milliseconds) [info] - timeIt with prepare (2 seconds, 2 milliseconds) [info] - OpenHashMapBasedStateMap - serializing and deserializing with KryoSerializable states (16 milliseconds) [info] - EmptyStateMap - serializing and deserializing (26 milliseconds) [info] - fetch hcfs dir (37 milliseconds) [info] - shutdown hook manager (5 milliseconds) [info] - MapWithStateRDDRecord - serializing and deserializing with KryoSerializable states (13 milliseconds) [info] - isInDirectory (5 milliseconds) [info] - circular buffer: if nothing was written to the buffer, display nothing (1 millisecond) [info] RecurringTimerSuite: [info] - circular buffer: if the buffer isn't full, print only the contents written (1 millisecond) [info] - circular buffer: data written == size of the buffer (1 millisecond) [info] - circular buffer: multiple overflow (1 millisecond) [info] - isDynamicAllocationEnabled (1 millisecond) [info] - basic (4 milliseconds) [info] - getDynamicAllocationInitialExecutors (4 milliseconds) [info] - Set Spark CallerContext (1 millisecond) [info] - encodeFileNameToURIRawPath (1 millisecond) [info] - SPARK-10224: call 'callback' after stopping (5 milliseconds) [info] - decodeFileNameInURI (0 milliseconds) [info] CheckpointSuite: [info] - non-existent checkpoint dir (2 milliseconds) [info] - caching in memory, serialized, replicated (encryption = on) (with replication as stream) (5 seconds, 741 milliseconds) [info] - Kill process (5 seconds, 44 milliseconds) [info] - chi square test of randomizeInPlace (23 milliseconds) [info] - redact sensitive information (3 milliseconds) [info] - redact sensitive information in command line args (5 milliseconds) [info] - redact sensitive information in sequence of key value pairs (3 milliseconds) [info] - tryWithSafeFinally (7 milliseconds) [info] - tryWithSafeFinallyAndFailureCallbacks (31 milliseconds) [info] - load extensions (17 milliseconds) [info] - check Kubernetes master URL (8 milliseconds) [info] - stringHalfWidth (11 milliseconds) [info] - trimExceptCRLF standalone (6 milliseconds) [info] - pathsToMetadata (2 milliseconds) [info] - checkHost supports both IPV4 and IPV6 (5 milliseconds) [info] - checkHostPort support IPV6 and IPV4 (3 milliseconds) [info] - parseHostPort support IPV6 and IPV4 (2 milliseconds) [info] - executorOffHeapMemorySizeAsMb when MEMORY_OFFHEAP_ENABLED is false (1 millisecond) [info] - executorOffHeapMemorySizeAsMb when MEMORY_OFFHEAP_ENABLED is true (2 milliseconds) [info] - executorMemoryOverhead when MEMORY_OFFHEAP_ENABLED is true, but MEMORY_OFFHEAP_SIZE not config scene (2 milliseconds) [info] - isPushBasedShuffleEnabled when PUSH_BASED_SHUFFLE_ENABLED and SHUFFLE_SERVICE_ENABLED are both set to true in YARN mode with maxAttempts set to 1 (9 milliseconds) [info] PagedDataSourceSuite: [info] - basic (3 milliseconds) [info] CheckpointStorageSuite: [info] - checkpoint compression (318 milliseconds) [info] - cache checkpoint preferred location (159 milliseconds) [info] - caching on disk, replicated 2 (encryption = off) (5 seconds, 249 milliseconds) [info] - SPARK-31484: checkpoint should not fail in retry (633 milliseconds) [info] SortingSuite: [info] - sortByKey (48 milliseconds) [info] - large array (60 milliseconds) [info] - large array with one split (42 
milliseconds) [info] - large array with many partitions (94 milliseconds) [info] - sort descending (58 milliseconds) [info] - sort descending with one split (41 milliseconds) [info] - sort descending with many partitions (109 milliseconds) [info] - basic rdd checkpoints + dstream graph checkpoint recovery (6 seconds, 946 milliseconds) [info] - more partitions than elements (120 milliseconds) [info] - empty RDD (70 milliseconds) [info] - partition balancing (104 milliseconds) [info] - partition balancing for descending sort (112 milliseconds) [info] - get a range of elements in a sorted RDD that is on one partition (85 milliseconds) [info] - recovery of conf through checkpoints (516 milliseconds) [info] - get a range of elements over multiple partitions in a descendingly sorted RDD (89 milliseconds) [info] - get a range of elements in an array not partitioned by a range partitioner (26 milliseconds) [info] - get a range of elements over multiple partitions but not taking up full partitions (71 milliseconds) [info] RpcAddressSuite: [info] - hostPort (0 milliseconds) [info] - fromSparkURL (0 milliseconds) [info] - fromSparkURL: a typo url (1 millisecond) [info] - fromSparkURL: invalid scheme (1 millisecond) [info] - toSparkURL (1 millisecond) [info] JavaSerializerSuite: [info] - JavaSerializer instances are serializable (0 milliseconds) [info] - Deserialize object containing a primitive Class as attribute (5 milliseconds) [info] - SPARK-36627: Deserialize object containing a proxy Class as attribute (7 milliseconds) [info] LocalDirsSuite: [info] - Utils.getLocalDir() returns a valid directory, even if some local dirs are missing (6 milliseconds) [info] - SPARK_LOCAL_DIRS override also affects driver (6 milliseconds) [info] - Utils.getLocalDir() throws an exception if any temporary directory cannot be retrieved (10 milliseconds) [info] TaskContextSuite: [info] - get correct spark.driver.[host|port] from checkpoint (315 milliseconds) [info] - provide metrics sources (150 milliseconds) [info] - calls TaskCompletionListener after failure (172 milliseconds) [info] - SPARK-30199 get ui port and blockmanager port (237 milliseconds) [info] - calls TaskFailureListeners after failure (65 milliseconds) [info] - all TaskCompletionListeners should be called even if some fail (9 milliseconds) [info] - all TaskFailureListeners should be called even if some fail (9 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)
[info] - TaskContext.attemptNumber should return attempt number, not task id (SPARK-4014) (114 milliseconds)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - recovery with map and reduceByKey operations (692 milliseconds) [info] - TaskContext.stageAttemptNumber getter (588 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)
[info] - accumulators are updated on exception failures (191 milliseconds)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)
[info] - failed tasks collect only accumulators whose values count during failures (68 milliseconds)
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)
[info] - only updated internal accumulators will be sent back to driver (51 milliseconds)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,4)
[info] - localProperties are propagated to executors correctly (61 milliseconds) [info] - immediately call a completion listener if the context is completed (1 millisecond) [info] - immediately call a failure listener if the context has failed (1 millisecond) [info] - TaskCompletionListenerException.getMessage should include previousError (1 millisecond) [info] - all TaskCompletionListeners should be called even if some fail or a task (3 milliseconds) [info] - listener registers another listener (reentrancy) (1 millisecond) [info] - listener registers another listener using a second thread (3 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)
[info] - listeners registered from different threads are called sequentially (403 milliseconds) [info] - listeners registered from same thread are called in reverse order (1 millisecond) [info] HistoryServerSuite:
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 4500 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 5000 ms
-------------------------------------------
(a,4)
[info] - recovery with invertible reduceByKeyAndWindow operation (1 second, 365 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
[info] - application list json (2 seconds, 1 milliseconds) [info] - completed app list json (79 milliseconds) [info] - running app list json (12 milliseconds) [info] - minDate app list json (18 milliseconds) [info] - maxDate app list json (16 milliseconds) [info] - maxDate2 app list json (11 milliseconds) [info] - minEndDate app list json (15 milliseconds) [info] - maxEndDate app list json (14 milliseconds) [info] - minEndDate and maxEndDate app list json (12 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - minDate and maxEndDate app list json (11 milliseconds) [info] - limit app list json (13 milliseconds)
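The TaskContextSuite results interleaved above ("calls TaskCompletionListener after failure", "all TaskCompletionListeners should be called even if some fail", the listener-reentrancy cases) revolve around task-level listeners. A hedged sketch of that API, with placeholder names and a local master chosen purely for illustration (none of it comes from this build):

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

object TaskListenerSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("task-listener-sketch"))

    val sums = sc.parallelize(1 to 8, numSlices = 4).mapPartitions { iter =>
      val ctx = TaskContext.get()
      // Completion listeners run when the task finishes, success or failure.
      ctx.addTaskCompletionListener[Unit] { c =>
        println(s"partition ${c.partitionId()} done, attempt ${c.attemptNumber()}")
      }
      // Failure listeners run only if the task throws.
      ctx.addTaskFailureListener((_: TaskContext, error: Throwable) =>
        println(s"task failed: ${error.getMessage}"))
      Iterator.single(iter.sum)
    }.collect()

    println(sums.mkString(","))
    sc.stop()
  }
}
```

The suite entries above assert the ordering and fault-tolerance properties of these callbacks, for example that every registered completion listener still runs even if an earlier one throws.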
[info] - one app json (117 milliseconds) [info] - recovery with saveAsHadoopFiles operation (2 seconds, 243 milliseconds) [info] - one app multi-attempt json (6 milliseconds) [info] - caching on disk, replicated 2 (encryption = off) (with replication as stream) (6 seconds, 184 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)
[info] - job list json (1 second, 49 milliseconds)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - recovery with saveAsNewAPIHadoopFiles operation (1 second, 285 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(b,1)
(a,2)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
[info] - job list from multi-attempt app json(1) (688 milliseconds)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(b,1)
(a,2)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
[info] - job list from multi-attempt app json(2) (669 milliseconds) [info] - one job json (7 milliseconds) [info] - succeeded job list json (7 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - succeeded&failed job list json (11 milliseconds) [info] - recovery with saveAsHadoopFile inside transform operation (1 second, 211 milliseconds) [info] - executor list json (171 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,5)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,6)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,8)
-------------------------------------------
Time: 4500 ms
-------------------------------------------
(a,9)
-------------------------------------------
Time: 5000 ms
-------------------------------------------
(a,10)
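The "recovery with ..." results above come from checkpoint-based recovery tests. As an illustrative sketch only (the checkpoint and input directories below are invented placeholders, and a file-based input stream was chosen because it is checkpointable), this is the getOrCreate pattern such recovery relies on:

```scala
import java.nio.file.{Files, Paths}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object RecoverySketch {
  // Placeholder paths, not taken from the build above.
  val checkpointDir = "/tmp/streaming-recovery-sketch/checkpoint"
  val inputDir = "/tmp/streaming-recovery-sketch/input"

  // Runs only when no usable checkpoint exists; defines the DStream graph.
  def createContext(): StreamingContext = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("recovery-sketch")
    val ssc = new StreamingContext(conf, Seconds(1))
    ssc.checkpoint(checkpointDir)
    ssc.textFileStream(inputDir).count().print()
    ssc
  }

  def main(args: Array[String]): Unit = {
    Files.createDirectories(Paths.get(inputDir))
    // Rebuilds the context from checkpoint data if present, else calls createContext.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTerminationOrTimeout(5000)
    ssc.stop(stopSparkContext = true, stopGracefully = true)
  }
}
```

On a restart after failure, getOrCreate rebuilds the dstream graph and configuration from the checkpoint directory rather than calling createContext again, which is the behavior the recovery tests above verify.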
[info] - recovery with updateStateByKey operation (1 second, 198 milliseconds) [info] - executor list with executor metrics json (1 second, 167 milliseconds) [info] - stage list json (472 milliseconds) [info] - complete stage list json (22 milliseconds) [info] - failed stage list json (25 milliseconds) [info] - one stage json (269 milliseconds) [info] - one stage json with details (47 milliseconds) [info] - one stage attempt json (44 milliseconds) [info] - one stage attempt json details with failed task (18 milliseconds) [info] - stage task summary w shuffle write (1 second, 16 milliseconds) [info] - stage task summary w shuffle read (26 milliseconds) [info] - stage task summary w/ custom quantiles (87 milliseconds) [info] - stage task list (95 milliseconds) [info] - stage task list w/ offset & length (34 milliseconds) [info] - stage task list w/ sortBy (17 milliseconds) [info] - stage task list w/ sortBy short names: -runtime (18 milliseconds) [info] - stage task list w/ sortBy short names: runtime (14 milliseconds) [info] - caching on disk, replicated 2 (encryption = on) (5 seconds, 961 milliseconds) [info] - recovery maintains rate controller (2 seconds, 940 milliseconds) [info] - stage task list w/ status (822 milliseconds) [info] - stage task list w/ status & offset & length (15 milliseconds) [info] - stage task list w/ status & sortBy short names: runtime (19 milliseconds) [info] - stage list with accumulable json (361 milliseconds) [info] - stage with accumulable json (148 milliseconds) [info] - stage task list from multi-attempt app json(1) (9 milliseconds) [info] - stage task list from multi-attempt app json(2) (132 milliseconds) [info] - excludeOnFailure for stage (1 second, 85 milliseconds)
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
    at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242)
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:187)
    at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:334)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:925)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2227)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2248)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2267)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2292)
    at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1021)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:1020)
    at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:99)
    at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:98)
    at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$2(ForEachDStream.scala:51)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:417)
    at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$1(ForEachDStream.scala:51)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.Try$.apply(Try.scala:213)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.$anonfun$run$1(JobScheduler.scala:256)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
[info] - excludeOnFailure node for stage (1 second, 144 milliseconds) [info] - rdd list storage json (70 milliseconds) [info] - run Spark in yarn-client mode with different configurations, ensuring redaction (26 seconds, 45 milliseconds) [info] - recovery with file input stream (3 seconds, 551 milliseconds) [info] - executor node excludeOnFailure (798 milliseconds) [info] - executor node excludeOnFailure unexcluding (9 milliseconds) [info] - executor memory usage (9 milliseconds) [info] - DStreamCheckpointData.restore invoking times (676 milliseconds) [info] - executor resource information (376 milliseconds) [info] - multiple resource profiles (714 milliseconds) [info] - recovery from checkpoint contains array object (1 second, 238 milliseconds) [info] - SPARK-11267: the race condition of two checkpoints in a batch (189 milliseconds) [info] - SPARK-28912: Fix MatchError in getCheckpointFiles (39 milliseconds) [info] - caching on disk, replicated 2 (encryption = on) (with replication as stream) (6 seconds, 345 milliseconds) [info] - SPARK-6847: stack overflow when updateStateByKey is followed by a checkpointed dstream (569 milliseconds) [info] - stage list with peak metrics (1 second, 302 milliseconds) [info] FailureSuite: [info] - stage with peak metrics (158 milliseconds) [info] - stage with summaries (141 milliseconds) [info] - app environment (52 milliseconds) [info] - one rdd storage json (24 milliseconds) [info] - miscellaneous process (30 milliseconds) [info] - download all logs for app with multiple attempts (39 milliseconds) [info] - download one log for app with multiple attempts (35 milliseconds) [info] - response codes on bad paths (37 milliseconds) [info] - automatically retrieve uiRoot from request through Knox (48 milliseconds) [info] - static relative links are prefixed with uiRoot (spark.ui.proxyBase) (6 milliseconds) [info] - /version api endpoint (7 milliseconds) [info] - security manager starts with spark.authenticate set (11 milliseconds) [info] - caching on disk, replicated 3 (encryption = off) (5 seconds, 882 milliseconds) [info] - caching on disk, replicated 3 (encryption = off) (with replication as stream) (5 seconds, 851 milliseconds) [info] - incomplete apps get refreshed (13 seconds, 61 milliseconds) [info] - ui and api authorization checks (1 second, 338 milliseconds) [info] - SPARK-33215: speed up event log download by skipping UI rebuild (588 milliseconds) [info] - access history application defaults to the last attempt id (1 second, 767 milliseconds) [info] - SPARK-31697: HistoryServer should set
Content-Type (5 milliseconds) [info] - Redirect to the root page when accessed to /history/ (2 milliseconds) [info] NextIteratorSuite: [info] - one iteration (2 milliseconds) [info] - two iterations (1 millisecond) [info] - empty iteration (1 millisecond) [info] - close is called once for empty iterations (0 milliseconds) [info] - close is called once for non-empty iterations (0 milliseconds) [info] ParallelCollectionSplitSuite: [info] - one element per slice (1 millisecond) [info] - one slice (0 milliseconds) [info] - equal slices (0 milliseconds) [info] - non-equal slices (0 milliseconds) [info] - splitting exclusive range (0 milliseconds) [info] - splitting inclusive range (1 millisecond) [info] - empty data (1 millisecond) [info] - zero slices (1 millisecond) [info] - negative number of slices (1 millisecond) [info] - exclusive ranges sliced into ranges (2 milliseconds) [info] - inclusive ranges sliced into ranges (2 milliseconds) [info] - identical slice sizes between Range and NumericRange (3 milliseconds) [info] - identical slice sizes between List and NumericRange (2 milliseconds) [info] - large ranges don't overflow (2 milliseconds) [info] - caching on disk, replicated 3 (encryption = on) (5 seconds, 909 milliseconds) [info] - random array tests (210 milliseconds) [info] - random exclusive range tests (12 milliseconds) [info] - random inclusive range tests (9 milliseconds) [info] - exclusive ranges of longs (2 milliseconds) [info] - inclusive ranges of longs (1 millisecond) [info] - exclusive ranges of doubles (3 milliseconds) [info] - inclusive ranges of doubles (1 millisecond) [info] - inclusive ranges with Int.MaxValue and Int.MinValue (1 millisecond) [info] - empty ranges with Int.MaxValue and Int.MinValue (1 millisecond) [info] UISeleniumSuite: [info] - all jobs page should be rendered even though we configure the scheduling mode to fair (859 milliseconds) [info] - effects of unpersist() / persist() should be reflected (2 seconds, 93 milliseconds) [info] - failed stages should not appear to be active (1 second, 364 milliseconds) [info] - caching on disk, replicated 3 (encryption = on) (with replication as stream) (5 seconds, 790 milliseconds) [info] - spark.ui.killEnabled should properly control kill button display (1 second, 509 milliseconds) [info] - run Spark in yarn-cluster mode with different configurations, ensuring redaction (27 seconds, 45 milliseconds) [info] - jobs page should not display job group name unless some job was submitted in a job group (893 milliseconds) [info] - job progress bars should handle stage / task failures (1 second, 430 milliseconds) [info] - job details page should display useful information for stages that haven't started (638 milliseconds) [info] - job progress bars / cells reflect skipped stages / tasks (690 milliseconds) [info] - stages that aren't run appear as 'skipped stages' after a job finishes (554 milliseconds) [info] - jobs with stages that are skipped should show correct link descriptions on all jobs page (569 milliseconds) [info] - attaching and detaching a new tab (789 milliseconds) [info] - kill stage POST/GET response is correct (222 milliseconds) [info] - caching in memory and disk, replicated (encryption = off) (5 seconds, 825 milliseconds) [info] - kill job POST/GET response is correct (218 milliseconds) [info] - stage & job retention (1 second, 913 milliseconds) [info] - live UI json application list (484 milliseconds) [info] - job stages should have expected dotfile under DAG visualization (252 milliseconds) [info] - 
stages page should show skipped stages (1 second, 303 milliseconds) [info] - Staleness of Spark UI should not last minutes or hours (512 milliseconds) [info] - description for empty jobs (382 milliseconds) [info] HadoopDelegationTokenManagerSuite: [info] - default configuration (12 milliseconds) [info] - disable hadoopfs credential provider (1 millisecond) [info] - using deprecated configurations (2 milliseconds) [info] - caching in memory and disk, replicated (encryption = off) (with replication as stream) (5 seconds, 754 milliseconds) [info] - multiple failures with map (35 seconds, 476 milliseconds) [info] - SPARK-29082: do not fail if current user does not have credentials (1 second, 145 milliseconds) [info] RollingEventLogFilesWriterSuite: [info] - create EventLogFileWriter with enable/disable rolling (27 milliseconds) [info] - initialize, write, stop - with codec None (46 milliseconds) [info] - initialize, write, stop - with codec Some(lz4) (43 milliseconds) [info] - initialize, write, stop - with codec Some(lzf) (41 milliseconds) [info] - initialize, write, stop - with codec Some(snappy) (90 milliseconds) [info] - initialize, write, stop - with codec Some(zstd) (75 milliseconds) [info] - Use the default value of spark.eventLog.compression.codec (11 milliseconds) [info] - Event log names (1 millisecond) [info] - Log overwriting (34 milliseconds) [info] - rolling event log files - codec None (288 milliseconds) [info] - rolling event log files - codec Some(lz4) (269 milliseconds) [info] - rolling event log files - codec Some(lzf) (382 milliseconds) [info] - rolling event log files - codec Some(snappy) (372 milliseconds) [info] - rolling event log files - codec Some(zstd) (371 milliseconds) [info] - rolling event log files - the max size of event log file size less than lower limit (16 milliseconds) [info] RandomBlockReplicationPolicyBehavior: [info] - block replication - random block replication policy (12 milliseconds) [info] RDDCleanerSuite: [info] - RDD shuffle cleanup standalone (207 milliseconds) [info] LocalDiskShuffleMapOutputWriterSuite: [info] - writing to an outputstream (5 milliseconds) [info] - writing to a channel (8 milliseconds) [info] TestMemoryManagerSuite: [info] - tracks allocated execution memory by task (4 milliseconds) [info] - markconsequentOOM (1 millisecond) [info] ResourceProfileManagerSuite: [info] - ResourceProfileManager (4 milliseconds) [info] - isSupported yarn no dynamic allocation (1 millisecond) [info] - isSupported yarn with dynamic allocation (1 millisecond) [info] - isSupported k8s with dynamic allocation (0 milliseconds) [info] - isSupported with local mode (1 millisecond) [info] - ResourceProfileManager has equivalent profile (174 milliseconds) [info] ExecutorRunnerTest: [info] - command includes appId (23 milliseconds) [info] BlockTransferServiceSuite: [info] - fetchBlockSync should not hang when BlockFetchingListener.onBlockFetchSuccess fails (4 milliseconds) [info] EventLoggingListenerSuite: [info] - Basic event logging with compression (279 milliseconds) [info] - caching in memory and disk, replicated (encryption = on) (5 seconds, 956 milliseconds) [info] - End-to-end event logging (5 seconds, 325 milliseconds) [info] - caching in memory and disk, replicated (encryption = on) (with replication as stream) (6 seconds, 301 milliseconds) [info] - yarn-cluster should respect conf overrides in SparkHadoopUtil (SPARK-16414, SPARK-23630) (25 seconds, 41 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = off) (6
seconds, 126 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = off) (with replication as stream) (6 seconds, 181 milliseconds) [info] - End-to-end event logging with compression (20 seconds, 442 milliseconds) [info] - Event logging with password redaction (14 milliseconds) [info] - Spark-33504 sensitive attributes redaction in properties (20 milliseconds) [info] - Executor metrics update (55 milliseconds) [info] - SPARK-31764: isBarrier should be logged in event log (166 milliseconds) [info] PluginContainerSuite: [info] - plugin initialization and communication (163 milliseconds) [info] - do nothing if plugins are not configured (2 milliseconds) [info] - merging of config options (61 milliseconds) [info] - SPARK-33088: executor tasks trigger plugin calls (84 milliseconds) [info] - SPARK-33088: executor failed tasks trigger plugin calls (101 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = on) (6 seconds, 175 milliseconds) [info] - plugin initialization in non-local mode (4 seconds, 450 milliseconds) [info] - multiple failures with updateStateByKey (35 seconds, 524 milliseconds) [info] WriteAheadLogUtilsSuite: [info] - log selection and creation (17 milliseconds) [info] - wrap WriteAheadLog in BatchedWriteAheadLog when batching is enabled (3 milliseconds) [info] - batching is enabled by default in WriteAheadLog (0 milliseconds) [info] - closeFileAfterWrite is disabled by default in WriteAheadLog (0 milliseconds) [info] BlockGeneratorSuite: [info] - block generation and data callbacks (26 milliseconds) [info] - stop ensures correct shutdown (241 milliseconds) [info] - block push errors are reported (25 milliseconds) [info] UISeleniumSuite: [info] - caching in memory and disk, serialized, replicated (encryption = on) (with replication as stream) (6 seconds, 145 milliseconds) [info] - plugin initialization in non-local mode with resources (4 seconds, 731 milliseconds) [info] DriverRunnerTest: [info] - Process succeeds instantly (74 milliseconds) [info] - Process failing several times and then succeeding (16 milliseconds) [info] - Process doesn't restart if not supervised (12 milliseconds) [info] - Process doesn't restart if killed (16 milliseconds) [info] - Reset of backoff counter (30 milliseconds) [info] - Kill process finalized with state KILLED (61 milliseconds) [info] - Finalized with state FINISHED (41 milliseconds) [info] - Finalized with state FAILED (39 milliseconds) [info] - Handle exception starting process (28 milliseconds) [info] - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'local' (26 seconds, 42 milliseconds) [info] PrefixComparatorsSuite: [info] - String prefix comparator (75 milliseconds) [info] - Binary prefix comparator (20 milliseconds) [info] - double prefix comparator handles NaNs properly (2 milliseconds) [info] - double prefix comparator handles negative NaNs properly (1 millisecond) [info] - double prefix comparator handles other special values properly (2 milliseconds) [info] NettyBlockTransferSecuritySuite: [info] - security default off (62 milliseconds) [info] - security on same password (134 milliseconds) [info] - security on mismatch password (91 milliseconds) [info] - security mismatch auth off on server (40 milliseconds) [info] - security mismatch auth off on client (54 milliseconds) [info] - security with aes encryption (200 milliseconds) [info] CommandUtilsSuite: [info] - set libraryPath correctly (23 milliseconds) [info] - auth secret shouldn't 
appear in java opts (41 milliseconds) [info] PairRDDFunctionsSuite: [info] - aggregateByKey (119 milliseconds) [info] - groupByKey (67 milliseconds) [info] - groupByKey with duplicates (72 milliseconds) [info] - groupByKey with negative key hash codes (48 milliseconds) [info] - groupByKey with many output partitions (63 milliseconds) [info] - attaching and detaching a Streaming tab (5 seconds, 444 milliseconds) [info] ExecutorAllocationManagerSuite: [info] - basic functionality (179 milliseconds) [info] - basic decommissioning (56 milliseconds) [info] - requestExecutors policy (38 milliseconds) [info] - killExecutor policy (16 milliseconds) [info] - parameter validation (21 milliseconds) [info] - enabling and disabling (936 milliseconds) [info] StreamingJobProgressListenerSuite: [info] - onBatchSubmitted, onBatchStarted, onBatchCompleted, onReceiverStarted, onReceiverError, onReceiverStopped (69 milliseconds) [info] - Remove the old completed batches when exceeding the limit (71 milliseconds) [info] - out-of-order onJobStart and onBatchXXX (80 milliseconds) [info] - detect memory leak (101 milliseconds) [info] ReceiverSuite: [info] - receiver life cycle (334 milliseconds) [info] - block generator throttling !!! IGNORED !!! [info] - compute without caching when no partitions fit in memory (7 seconds, 777 milliseconds) [info] - sampleByKey (6 seconds, 212 milliseconds) [info] - compute when only some partitions fit in memory (6 seconds, 599 milliseconds)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@58238b87 rejected from java.util.concurrent.ThreadPoolExecutor@4cdbb92a[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 108]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372)
    at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371)
    at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379)
    at scala.concurrent.impl.Promise.transform(Promise.scala:33)
    at scala.concurrent.impl.Promise.transform$(Promise.scala:31)
    at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379)
    at scala.concurrent.Future.map(Future.scala:292)
    at scala.concurrent.Future.map$(Future.scala:292)
    at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379)
    at scala.concurrent.Future$.apply(Future.scala:659)
    at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:190)
    at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158)
    at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129)
    at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110)
    at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299)
    at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271)
    at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112)
[three further RejectedExecutionException traces with the identical call path followed (CallbackRunnable@403b7f8e, @4d7c87e6, @78878749); they differ only in the rejecting ThreadPoolExecutor instance and in the storeBlock frame, ReceivedBlockHandler.scala:190 vs 203]
[info] - sampleByKeyExact (8 seconds, 396 milliseconds) [info] - reduceByKey (33 milliseconds) [info] - reduceByKey with collectAsMap (33 milliseconds) [info] - reduceByKey with many output partitions (34 milliseconds) [info] - reduceByKey with partitioner (35 milliseconds) [info] - countApproxDistinctByKey (126 milliseconds) [info] - join (42 milliseconds) [info] - join all-to-all (31 milliseconds) [info] - leftOuterJoin (31 milliseconds) [info] - cogroup with empty RDD (26 milliseconds)
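(A reader's note on what PairRDDFunctionsSuite is exercising here: aggregateByKey, groupByKey, reduceByKey and the join family are the core pair-RDD combinators. The following is a minimal illustrative sketch, not code from this build; the object name and the local master are assumptions.)

  // Hypothetical, minimal use of the operations under test.
  import org.apache.spark.{SparkConf, SparkContext}

  object PairOpsSketch {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(
        new SparkConf().setMaster("local[2]").setAppName("pair-ops-sketch"))
      val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))
      // aggregateByKey folds values per key: a zero value, a within-partition
      // seqOp, and a cross-partition combOp.
      val sums = pairs.aggregateByKey(0)(_ + _, _ + _)
      // groupByKey instead materializes every value for a key.
      val grouped = pairs.groupByKey()
      println(sums.collect().toSeq)    // Seq((a,4), (b,2)), ordering aside
      println(grouped.collect().toSeq)
      sc.stop()
    }
  }

For reductions, aggregateByKey (or reduceByKey) is generally preferred over groupByKey, since values are combined within each partition before the shuffle.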
[info] - cogroup with groupByed RDD having 0 partitions (45 milliseconds) [info] - cogroup between multiple RDD with an order of magnitude difference in number of partitions (10 milliseconds) [info] - cogroup between multiple RDD with number of partitions similar in order of magnitude (7 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set without proper partitioner (4 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set with proper partitioner (4 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set; with huge number of partitions in upstream RDDs (4 milliseconds) [info] - rightOuterJoin (33 milliseconds) [info] - fullOuterJoin (32 milliseconds) [info] - join with no matches (31 milliseconds) [info] - join with many output partitions (37 milliseconds) [info] - groupWith (31 milliseconds) [info] - groupWith3 (38 milliseconds) [info] - groupWith4 (44 milliseconds) [info] - zero-partition RDD (77 milliseconds) [info] - keys and values (77 milliseconds) [info] - default partitioner uses partition size (15 milliseconds) [info] - default partitioner uses largest partitioner (12 milliseconds) [info] - subtract (54 milliseconds) [info] - subtract with narrow dependency (70 milliseconds) [info] - subtractByKey (30 milliseconds) [info] - subtractByKey with narrow dependency (36 milliseconds) [info] - foldByKey (34 milliseconds) [info] - foldByKey with mutable result type (57 milliseconds) [info] - saveNewAPIHadoopFile should call setConf if format is configurable (102 milliseconds) [info] - The JobId on the driver and executors should be the same during the commit (59 milliseconds) [info] - saveAsHadoopFile should respect configured output committers (67 milliseconds) [info] - failure callbacks should be called before calling writer.close() in saveNewAPIHadoopFile (58 milliseconds) [info] - failure callbacks should be called before calling writer.close() in saveAsHadoopFile (76 milliseconds) [info] - saveAsNewAPIHadoopDataset should support invalid output paths when there are no files to be committed to an absolute output location (106 milliseconds) [info] - saveAsHadoopDataset should respect empty output directory when there are no files to be committed to an absolute output location (44 milliseconds) [info] - passing environment variables to cluster (5 seconds, 47 milliseconds) [info] - lookup (57 milliseconds) [info] - lookup with partitioner (57 milliseconds) [info] - lookup with bad partitioner (26 milliseconds) [info] RBackendSuite: [info] - close() clears jvmObjectTracker (3 milliseconds) [info] PrimitiveVectorSuite: [info] - primitive value (7 milliseconds) [info] - non-primitive value (6 milliseconds) [info] - ideal growth (5 milliseconds) [info] - ideal size (2 milliseconds) [info] - resizing (6 milliseconds) [info] MetricsConfigSuite: [info] - MetricsConfig with default properties (2 milliseconds) [info] - MetricsConfig with properties set from a file (1 millisecond) [info] - MetricsConfig with properties set from a Spark configuration (1 millisecond) [info] - MetricsConfig with properties set from a file and a Spark configuration (1 millisecond) [info] - MetricsConfig with subProperties (1 millisecond) [info] PartiallySerializedBlockSuite: [info] - valuesIterator() and finishWritingToStream() cannot be called after discard() is called (60 milliseconds) [info] - discard() can be called more than once (1 millisecond) [info] - cannot call valuesIterator() more than once (4
milliseconds) [info] - cannot call finishWritingToStream() more than once (4 milliseconds) [info] - cannot call finishWritingToStream() after valuesIterator() (2 milliseconds) [info] - cannot call valuesIterator() after finishWritingToStream() (2 milliseconds) [info] - buffers are deallocated in a TaskCompletionListener (2 milliseconds) [info] - basic numbers with discard() and numBuffered = 50 (5 milliseconds) [info] - basic numbers with finishWritingToStream() and numBuffered = 50 (39 milliseconds) [info] - basic numbers with valuesIterator() and numBuffered = 50 (4 milliseconds) [info] - basic numbers with discard() and numBuffered = 0 (1 millisecond) [info] - basic numbers with finishWritingToStream() and numBuffered = 0 (17 milliseconds) [info] - basic numbers with valuesIterator() and numBuffered = 0 (2 milliseconds) [info] - basic numbers with discard() and numBuffered = 1000 (20 milliseconds) [info] - basic numbers with finishWritingToStream() and numBuffered = 1000 (20 milliseconds) [info] - basic numbers with valuesIterator() and numBuffered = 1000 (27 milliseconds) [info] - case classes with discard() and numBuffered = 50 (28 milliseconds) [info] - case classes with finishWritingToStream() and numBuffered = 50 (158 milliseconds) [info] - case classes with valuesIterator() and numBuffered = 50 (11 milliseconds) [info] - case classes with discard() and numBuffered = 0 (2 milliseconds) [info] - case classes with finishWritingToStream() and numBuffered = 0 (120 milliseconds) [info] - case classes with valuesIterator() and numBuffered = 0 (2 milliseconds) [info] - case classes with discard() and numBuffered = 1000 (159 milliseconds) [info] - case classes with finishWritingToStream() and numBuffered = 1000 (148 milliseconds) [info] - case classes with valuesIterator() and numBuffered = 1000 (147 milliseconds) [info] - empty iterator with discard() and numBuffered = 0 (1 millisecond) [info] - empty iterator with finishWritingToStream() and numBuffered = 0 (1 millisecond) [info] - empty iterator with valuesIterator() and numBuffered = 0 (1 millisecond) [info] SparkContextSchedulerCreationSuite: [info] - bad-master (34 milliseconds) [info] - local (39 milliseconds) [info] - local-* (33 milliseconds) [info] - local-n (54 milliseconds) [info] - local-*-n-failures (42 milliseconds) [info] - local-n-failures (36 milliseconds) [info] - bad-local-n (35 milliseconds) [info] - bad-local-n-failures (35 milliseconds) [info] - local-default-parallelism (37 milliseconds) [info] - local-cluster (223 milliseconds) [info] SerializationDebuggerSuite: [info] - primitives, strings, and nulls (2 milliseconds) [info] - primitive arrays (0 milliseconds) [info] - non-primitive arrays (1 millisecond) [info] - serializable object (1 millisecond) [info] - nested arrays (0 milliseconds) [info] - nested objects (0 milliseconds) [info] - cycles (should not loop forever) (0 milliseconds) [info] - root object not serializable (1 millisecond) [info] - array containing not serializable element (2 milliseconds) [info] - object containing not serializable field (1 millisecond) [info] - externalizable class writing out not serializable object (1 millisecond) [info] - externalizable class writing out serializable objects (0 milliseconds) [info] - object containing writeReplace() which returns not serializable object (1 millisecond) [info] - object containing writeReplace() which returns serializable object (0 milliseconds) [info] - no infinite loop with writeReplace() which returns class of its own type (1 millisecond) 
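(SparkContextSchedulerCreationSuite above validates which scheduler backend each master URL form creates. As a rough illustration of those forms, under the assumption of a throwaway local app: "local" runs one task thread, "local[4]" four, "local[4,3]" four threads tolerating up to three task failures, "local[*]" one per core, and "local-cluster[n,c,m]" a pseudo-cluster. A minimal sketch:)

  import org.apache.spark.{SparkConf, SparkContext}

  object MasterUrlSketch {
    def main(args: Array[String]): Unit = {
      // "local[4,3]": 4 worker threads, each task may fail up to 3 times.
      val conf = new SparkConf().setAppName("master-url-sketch").setMaster("local[4,3]")
      val sc = new SparkContext(conf)
      println(sc.master)             // local[4,3]
      println(sc.defaultParallelism) // typically 4 for this master string
      sc.stop()
    }
  }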
[info] - object containing writeObject() and not serializable field (2 milliseconds) [info] - object containing writeObject() and serializable field (1 millisecond) [info] - object of serializable subclass with more fields than superclass (SPARK-7180) (1 millisecond) [info] - crazy nested objects (1 millisecond) [info] - improveException (1 millisecond) [info] - improveException with error in debugger (3 milliseconds) [info] LoggingSuite: [info] - spark-shell logging filter (1 millisecond) [info] NettyRpcHandlerSuite: [info] - receive (79 milliseconds) [info] - connectionTerminated (2 milliseconds) [info] SamplingUtilsSuite: [info] - reservoirSampleAndCount (1 millisecond) [info] - SPARK-18678 reservoirSampleAndCount with tiny input (5 milliseconds) [info] - computeFraction (4 milliseconds) [info] AppStatusListenerWithInMemoryStoreSuite: [info] - environment info (1 millisecond) [info] - scheduler events (30 milliseconds) [info] - storage events (20 milliseconds) [info] - eviction of old data (14 milliseconds) [info] - eviction should respect job completion time (2 milliseconds) [info] - eviction should respect stage completion time (2 milliseconds) [info] - skipped stages should be evicted before completed stages (1 millisecond) [info] - eviction should respect task completion time (2 milliseconds) [info] - lastStageAttempt should fail when the stage doesn't exist (3 milliseconds) [info] - SPARK-24415: update metrics for tasks that finish late (2 milliseconds) [info] - Total tasks in the executor summary should match total stage tasks (live = true) (6 milliseconds) [info] - Total tasks in the executor summary should match total stage tasks (live = false) (1 millisecond) [info] - driver logs (2 milliseconds) [info] - executor metrics updates (9 milliseconds) [info] - stage executor metrics (3 milliseconds) [info] - storage information on executor lost/down (5 milliseconds) [info] - clean up used memory when BlockManager added (1 millisecond) [info] - SPARK-34877 - check YarnAmInfoEvent is populated correctly (2 milliseconds) [info] TimeStampedHashMapSuite: [info] - HashMap - basic test (4 milliseconds) [info] - TimeStampedHashMap - basic test (5 milliseconds) [info] - TimeStampedHashMap - threading safety test (122 milliseconds) [info] - TimeStampedHashMap - clearing by timestamp (33 milliseconds) [info] RandomSamplerSuite: [info] - utilities (12 milliseconds) [info] - sanity check medianKSD against references (71 milliseconds) [info] - bernoulli sampling (30 milliseconds) [info] - bernoulli sampling without iterator (30 milliseconds) [info] - bernoulli sampling with gap sampling optimization (73 milliseconds) [info] - bernoulli sampling (without iterator) with gap sampling optimization (86 milliseconds) [info] - bernoulli boundary cases (1 millisecond) [info] - bernoulli (without iterator) boundary cases (2 milliseconds) [info] - bernoulli data types (81 milliseconds) [info] - bernoulli clone (16 milliseconds) [info] - bernoulli set seed (30 milliseconds) [info] - replacement sampling (37 milliseconds) [info] - replacement sampling without iterator (39 milliseconds) [info] - replacement sampling with gap sampling (114 milliseconds) [info] - replacement sampling (without iterator) with gap sampling (125 milliseconds) [info] - replacement boundary cases (0 milliseconds) [info] - replacement (without) boundary cases (1 millisecond) [info] - replacement data types (147 milliseconds) [info] - replacement clone (30 milliseconds) [info] - replacement set seed (44 milliseconds) [info] - bernoulli 
partitioning sampling (20 milliseconds) [info] - bernoulli partitioning sampling without iterator (23 milliseconds) [info] - bernoulli partitioning boundary cases (1 millisecond) [info] - bernoulli partitioning (without iterator) boundary cases (4 milliseconds) [info] - bernoulli partitioning data (1 millisecond) [info] - bernoulli partitioning clone (0 milliseconds) [info] ChunkedByteBufferOutputStreamSuite: [info] - empty output (2 milliseconds) [info] - write a single byte (1 millisecond) [info] - write a single near boundary (1 millisecond) [info] - write a single at boundary (1 millisecond) [info] - single chunk output (1 millisecond) [info] - single chunk output at boundary size (1 millisecond) [info] - multiple chunk output (1 millisecond) [info] - multiple chunk output at boundary size (1 millisecond) [info] - SPARK-36464: size returns correct positive number even with over 2GB data (821 milliseconds) [info] ProcfsMetricsGetterSuite: [info] - testGetProcessInfo (2 milliseconds) [info] - SPARK-34845: partial metrics shouldn't be returned (30 milliseconds) [info] GraphiteSinkSuite: [info] - GraphiteSink with default MetricsFilter (13 milliseconds) [info] - GraphiteSink with regex MetricsFilter (2 milliseconds) [info] SparkSubmitUtilsSuite: [info] - incorrect maven coordinate throws error (4 milliseconds) [info] - create repo resolvers (113 milliseconds) [info] - create additional resolvers (6 milliseconds) :: loading settings :: url = jar:file:/home/jenkins/sparkivy/per-executor-caches/0/.cache/coursier/v1/https/maven-central.storage-download.googleapis.com/maven2/org/apache/ivy/ivy/2.5.0/ivy-2.5.0.jar!/org/apache/ivy/core/settings/ivysettings.xml [info] - add dependencies works correctly (91 milliseconds) [info] - excludes works correctly (7 milliseconds) [info] - recover from node failures (6 seconds, 224 milliseconds) [info] - ivy path works correctly (1 second, 767 milliseconds) [info] - search for artifact at local repositories (1 second, 879 milliseconds) [info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' (26 seconds, 57 milliseconds) [info] - dependency not found throws RuntimeException (92 milliseconds) [info] - neglects Spark and Spark's dependencies (309 milliseconds) [info] - exclude dependencies end to end (463 milliseconds) :: loading settings :: file = /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/target/tmp/ivy-fea0697b-9a32-4d00-adbe-170e71c300e4/ivysettings.xml [info] - load ivy settings file (492 milliseconds) [info] - SPARK-10878: test resolution files cleaned after resolving artifact (221 milliseconds) [info] - SPARK-34624: should ignore non-jar dependencies (521 milliseconds) [info] BasicEventFilterBuilderSuite: [info] - track live jobs (10 milliseconds) [info] - track live executors (1 millisecond) [info] ImplicitOrderingSuite: [info] - basic inference of Orderings (144 milliseconds) [info] TaskMetricsSuite: [info] - mutating values (1 millisecond) [info] - mutating shuffle read metrics values (1 millisecond) [info] - mutating shuffle write metrics values (0 milliseconds) [info] - mutating input metrics values (0 milliseconds) [info] - mutating output metrics values (1 millisecond) [info] - merging multiple shuffle read metrics (1 millisecond) [info] - additional accumulables (2 milliseconds) [info] ExternalShuffleServiceSuite: [info] - groupByKey without compression (140 milliseconds) [info] - shuffle non-zero block size (6 seconds, 122 milliseconds) [info] - recover from repeated node failures during 
shuffle-map (13 seconds, 153 milliseconds) [info] - shuffle serializer (6 seconds, 109 milliseconds) [info] - write ahead log - generating and cleaning (39 seconds, 276 milliseconds) [info] StreamingListenerSuite: [info] - batch info reporting (610 milliseconds) [info] - receiver info reporting (198 milliseconds) [info] - output operation reporting (1 second, 246 milliseconds) [info] - don't call ssc.stop in listener (859 milliseconds) [info] - onBatchCompleted with successful batch (1 second, 18 milliseconds) [info] - onBatchCompleted with failed batch and one failed job (1 second, 9 milliseconds) [info] - zero sized blocks (8 seconds, 977 milliseconds) [info] - onBatchCompleted with failed batch and multiple failed jobs (917 milliseconds) [info] - StreamingListener receives no events after stopping StreamingListenerBus (402 milliseconds) [info] MapWithStateRDDSuite: [info] - creation from pair RDD (122 milliseconds) [info] - updating state and generating mapped data in MapWithStateRDDRecord (4 milliseconds) [info] - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'local' and gateway-replacement path (26 seconds, 40 milliseconds) [info] - recover from repeated node failures during shuffle-reduce (14 seconds, 979 milliseconds) [info] - states generated by MapWithStateRDD (1 second, 219 milliseconds) [info] - checkpointing (907 milliseconds) [info] - checkpointing empty state RDD (253 milliseconds) [info] BasicOperationsSuite: [info] - map (494 milliseconds) [info] - flatMap (326 milliseconds) [info] - filter (390 milliseconds) [info] - glom (305 milliseconds) [info] - mapPartitions (320 milliseconds) [info] - repartition (more partitions) (414 milliseconds) [info] - repartition (fewer partitions) (424 milliseconds) [info] - groupByKey (541 milliseconds) [info] - reduceByKey (494 milliseconds) [info] - reduce (433 milliseconds) [info] - zero sized blocks without kryo (8 seconds, 173 milliseconds) [info] - count (521 milliseconds) [info] - countByValue (361 milliseconds) [info] - mapValues (294 milliseconds) [info] - flatMapValues (327 milliseconds) [info] - union (282 milliseconds) [info] - union with input stream return None (140 milliseconds) [info] - StreamingContext.union (276 milliseconds) [info] - transform (269 milliseconds) [info] - transform with NULL (104 milliseconds) [info] - transform with input stream return None (164 milliseconds) [info] - transformWith (464 milliseconds) [info] - transformWith with input stream return None (218 milliseconds) [info] - StreamingContext.transform (306 milliseconds) [info] - StreamingContext.transform with input stream return None (189 milliseconds) [info] - cogroup (374 milliseconds) [info] - join (360 milliseconds) [info] - leftOuterJoin (471 milliseconds) [info] - rightOuterJoin (478 milliseconds) [info] - fullOuterJoin (452 milliseconds) [info] - updateStateByKey (531 milliseconds) [info] - shuffle on mutable pairs (6 seconds, 117 milliseconds) [info] - updateStateByKey - simple with initial value RDD (401 milliseconds) [info] - recover from node failures with replication (12 seconds, 903 milliseconds) [info] - updateStateByKey - testing time stamps as input (448 milliseconds) [info] - updateStateByKey - with initial value RDD (384 milliseconds) [info] - updateStateByKey - object lifecycle (334 milliseconds) [info] - slice (2 seconds, 150 milliseconds) [info] - slice - has not been initialized (58 milliseconds) [info] - rdd cleanup - map and window (349 milliseconds) [info] - rdd cleanup - updateStateByKey (753 milliseconds)
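(BasicOperationsSuite above runs the standard DStream transformations; updateStateByKey is the stateful one, carrying per-key state across batches, and it requires a checkpoint directory. A minimal hypothetical sketch, with the socket source, port and checkpoint path chosen arbitrarily, not taken from this build:)

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object UpdateStateSketch {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setMaster("local[2]").setAppName("state-sketch")
      val ssc = new StreamingContext(conf, Seconds(1))
      ssc.checkpoint("/tmp/state-sketch-checkpoint") // required for stateful ops
      val lines = ssc.socketTextStream("localhost", 9999) // hypothetical source
      val counts = lines.flatMap(_.split(" ")).map((_, 1))
        // Carry a running count per key across batches; `values` holds this
        // batch's increments, `state` the previous total.
        .updateStateByKey((values: Seq[Int], state: Option[Int]) =>
          Some(values.sum + state.getOrElse(0)))
      counts.print()
      ssc.start()
      ssc.awaitTermination()
    }
  }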
[info] - sorting on mutable pairs (6 seconds, 57 milliseconds) [info] - unpersist RDDs (5 seconds, 622 milliseconds)
Exception in thread "receiver-supervisor-future-0" java.lang.Error: java.lang.InterruptedException: sleep interrupted
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.$anonfun$restartReceiver$1(ReceiverSupervisor.scala:196)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
[info] - rdd cleanup - input blocks and persisted RDDs (2 seconds, 155 milliseconds) [info] InputStreamsSuite:
[an identical "receiver-supervisor-future-0" InterruptedException trace was printed here a second time]
[info] - socket input stream (586 milliseconds) [info] - socket input stream - no block in a batch (394 milliseconds) [info] - reference partitions inside a task (4 seconds, 126 milliseconds) [info] - cogroup using mutable pairs (4 seconds, 838 milliseconds) [info] MesosCoarseGrainedSchedulerBackendSuite: [info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' and gateway-replacement path (26 seconds, 46 milliseconds) [info] - binary records stream (6 seconds, 90 milliseconds) [info] - file input stream - newFilesOnly = true (276 milliseconds) [info] - file input stream - newFilesOnly = false (298 milliseconds) [info] - file input stream - wildcard (431 milliseconds) [info] - Modified files are correctly detected.
(155 milliseconds) [info] - subtract mutable pairs (4 seconds, 385 milliseconds) [info] - mesos supports killing and limiting executors (3 seconds, 197 milliseconds) [info] - mesos supports killing and relaunching tasks with executors (114 milliseconds) [info] - mesos supports spark.executor.cores (110 milliseconds) [info] - mesos supports unset spark.executor.cores (93 milliseconds) [info] - multi-thread receiver (1 second, 552 milliseconds) [info] - mesos does not acquire more than spark.cores.max (85 milliseconds) [info] - mesos does not acquire gpus if not specified (84 milliseconds) [info] - mesos does not acquire more than spark.mesos.gpus.max (80 milliseconds) [info] - mesos declines offers that violate attribute constraints (82 milliseconds) [info] - mesos declines offers with a filter when reached spark.cores.max (86 milliseconds) [info] - mesos declines offers with a filter when maxCores not a multiple of executor.cores (87 milliseconds) [info] - mesos declines offers with a filter when reached spark.cores.max with executor.cores (85 milliseconds) [info] - mesos assigns tasks round-robin on offers (70 milliseconds) [info] - mesos creates multiple executors on a single agent (76 milliseconds) [info] - mesos doesn't register twice with the same shuffle service (80 milliseconds) [info] - queue input stream - oneAtATime = true (1 second, 58 milliseconds) [info] - Port offer decline when there is no appropriate range (114 milliseconds) [info] - Port offer accepted when ephemeral ports are used (62 milliseconds) [info] - Port offer accepted with user defined port numbers (62 milliseconds) [info] - mesos kills an executor when told (109 milliseconds) [info] - weburi is set in created scheduler driver (89 milliseconds) [info] - failover timeout is set in created scheduler driver (91 milliseconds) [info] - honors unset spark.mesos.containerizer (110 milliseconds) [info] - honors spark.mesos.containerizer="mesos" (89 milliseconds) [info] - docker settings are reflected in created tasks (71 milliseconds) [info] - force-pull-image option is disabled by default (70 milliseconds) [info] - mesos supports spark.executor.uri (83 milliseconds) [info] - mesos supports setting fetcher cache (81 milliseconds) [info] - mesos supports disabling fetcher cache (84 milliseconds) [info] - mesos sets task name to spark.app.name (64 milliseconds) [info] - mesos sets configurable labels on tasks (83 milliseconds) [info] - mesos supports spark.mesos.network.name and spark.mesos.network.labels (91 milliseconds) [info] - queue input stream - oneAtATime = false (2 seconds, 76 milliseconds) [info] - test track the number of input stream (68 milliseconds) [info] UIUtilsSuite: [info] - shortTimeUnitString (1 millisecond) [info] - normalizeDuration (4 milliseconds) [info] - convertToTimeUnit (1 millisecond) [info] - formatBatchTime (1 millisecond) [info] RateLimitedOutputStreamSuite: [info] - SPARK-28778 '--hostname' shouldn't be set for executor when virtual network is enabled (691 milliseconds) [info] - supports spark.scheduler.minRegisteredResourcesRatio (225 milliseconds) [info] - sort with Java non serializable class - Kryo (5 seconds, 339 milliseconds) [info] - write (4 seconds, 103 milliseconds) [info] FileBasedWriteAheadLogSuite: [info] - FileBasedWriteAheadLog - read all logs (47 milliseconds) [info] - FileBasedWriteAheadLog - write logs (82 milliseconds) [info] - FileBasedWriteAheadLog - read all logs after write (88 milliseconds) [info] - FileBasedWriteAheadLog - clean old logs (117 milliseconds) [info] - 
[info] - FileBasedWriteAheadLog - clean old logs synchronously (111 milliseconds) [info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (192 milliseconds) [info] - FileBasedWriteAheadLog - do not create directories or files unless write (5 milliseconds) [info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (33 milliseconds) [info] - FileBasedWriteAheadLog - seqToParIterator (103 milliseconds) [info] - FileBasedWriteAheadLogWriter - writing data (17 milliseconds) [info] - FileBasedWriteAheadLogWriter - syncing of data by writing and reading immediately (18 milliseconds) [info] - FileBasedWriteAheadLogReader - sequentially reading data (5 milliseconds) [info] - FileBasedWriteAheadLogReader - sequentially reading data written with writer (5 milliseconds) [info] - sort with Java non serializable class - Java (4 seconds, 298 milliseconds) [info] - shuffle with different compression settings (SPARK-3426) (403 milliseconds) [info] - FileBasedWriteAheadLogReader - reading data written with writer after corrupted write (844 milliseconds) [info] - FileBasedWriteAheadLogReader - handles errors when file doesn't exist (7 milliseconds) [info] - FileBasedWriteAheadLogRandomReader - reading data using random reader (15 milliseconds) [info] - FileBasedWriteAheadLogRandomReader- reading data using random reader written with writer (12 milliseconds) [info] RateControllerSuite: [info] - [SPARK-4085] rerun map stage if reduce stage cannot find its local shuffle file (333 milliseconds) [info] - cannot find its local shuffle file if no execution of the stage and rerun shuffle (71 milliseconds) [info] - RateController - rate controller publishes updates after batches complete (463 milliseconds) [info] - metrics for shuffle without aggregation (173 milliseconds) [info] - supports data locality with dynamic allocation (6 seconds, 99 milliseconds) [info] - Creates an env-based reference secrets. (69 milliseconds) [info] - Creates an env-based value secrets. (56 milliseconds) [info] - ReceiverRateController - published rates reach receivers (595 milliseconds) [info] InputInfoTrackerSuite: [info] - test report and get InputInfo from InputInfoTracker (1 millisecond) [info] - Creates file-based reference secrets. (67 milliseconds) [info] - test cleanup InputInfo from InputInfoTracker (0 milliseconds) [info] - metrics for shuffle with aggregation (635 milliseconds) [info] WriteAheadLogBackedBlockRDDSuite: [info] - Creates a file-based value secrets. (62 milliseconds)
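(The FileBasedWriteAheadLog* tests above cover the receiver write-ahead log that lets Spark Streaming replay received data after a driver failure. In applications the WAL is driven by configuration rather than code; a hypothetical sketch of enabling it, with an arbitrary checkpoint path:)

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object WalSketch {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf()
        .setMaster("local[2]")
        .setAppName("wal-sketch")
        // Persist received blocks to a write-ahead log under the checkpoint
        // directory so receiver data survives driver failure.
        .set("spark.streaming.receiver.writeAheadLog.enable", "true")
      val ssc = new StreamingContext(conf, Seconds(1))
      ssc.checkpoint("/tmp/wal-sketch") // hypothetical path; WAL files live here
      val lines = ssc.socketTextStream("localhost", 9999)
      lines.count().print()
      ssc.start()
      ssc.awaitTermination()
    }
  }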
[info] - multiple simultaneous attempts for one task (SPARK-8029) (63 milliseconds) [info] - Read data available in both block manager and write ahead log (113 milliseconds) [info] MesosClusterManagerSuite: [info] - SPARK-34541: shuffle can be removed (98 milliseconds) [info] - mesos fine-grained (66 milliseconds) [info] - Read data available only in block manager, not in write ahead log (69 milliseconds) [info] - mesos coarse-grained (43 milliseconds) [info] - Read data available only in write ahead log, not in block manager (81 milliseconds) [info] - mesos with zookeeper (60 milliseconds) [info] - Read data with partially available in block manager, and rest in write ahead log (69 milliseconds) [info] - Test isBlockValid skips block fetching from BlockManager (147 milliseconds) [info] - mesos with i/o encryption throws error (265 milliseconds) [info] MesosFineGrainedSchedulerBackendSuite: [info] - Test whether RDD is valid after removing blocks from block manager (145 milliseconds) [info] - Test storing of blocks recovered from write ahead log back into block manager (136 milliseconds)
Exception in thread "block-manager-storage-async-thread-pool-4" Exception in thread "block-manager-storage-async-thread-pool-0" java.lang.Error: java.lang.InterruptedException
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
    at org.apache.spark.storage.BlockInfoManager.$anonfun$acquireLock$1(BlockInfoManager.scala:221)
    at org.apache.spark.storage.BlockInfoManager.$anonfun$acquireLock$1$adapted(BlockInfoManager.scala:214)
    at org.apache.spark.storage.BlockInfoWrapper.withLock(BlockInfoManager.scala:105)
    at org.apache.spark.storage.BlockInfoManager.acquireLock(BlockInfoManager.scala:214)
    at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:293)
    at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1930)
    at org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$1(BlockManagerStorageEndpoint.scala:47)
    at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
    at org.apache.spark.storage.BlockManagerStorageEndpoint.$anonfun$doAsync$1(BlockManagerStorageEndpoint.scala:89)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
[two further identical java.lang.Error: java.lang.InterruptedException traces followed, one from thread "block-manager-storage-async-thread-pool-1" and one unattributed]
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@15c7d6bb rejected from java.util.concurrent.ThreadPoolExecutor@39c9f9ce[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 3]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[three similar RejectedExecutionException traces followed (CallbackRunnable@3df67aa8, @54fadc80, @4350b9d6), all rejected by the same shutting-down ThreadPoolExecutor@39c9f9ce]
[info] - weburi is set in created scheduler driver (352 milliseconds) [info] - read data in block manager and WAL with encryption on (145 milliseconds) [info] BatchedWriteAheadLogWithCloseFileAfterWriteSuite: [info] - Use configured mesosExecutor.cores for ExecutorInfo (52 milliseconds) [info] - check spark-class location correctly (4 milliseconds) [info] - spark docker properties correctly populate the DockerInfo message (4 milliseconds) [info] - BatchedWriteAheadLog - read all logs (49 milliseconds) [info] - mesos resource offers result in launching tasks (25 milliseconds) [info] - can handle multiple roles (8 milliseconds) [info] MesosProtoUtilsSuite: [info] - mesosLabels (2 milliseconds) [info] MesosSchedulerUtilsSuite: [info] - use at-least minimum overhead (26 milliseconds) [info] - use overhead if it is greater than minimum value (1 millisecond) [info] - use spark.mesos.executor.memoryOverhead (if set) (1 millisecond) [info] - parse a non-empty constraint string correctly (8 milliseconds) [info] - parse an empty constraint string correctly (0 milliseconds) [info] - throw an exception when the input is malformed (3 milliseconds) [info] - empty values for attributes' constraints matches all values (4 milliseconds) [info] - subset match is performed for set attributes (2 milliseconds) [info] - less than equal match is performed on scalar attributes (3 milliseconds) [info] - contains match is performed for range attributes (30 milliseconds) [info] - equality match is performed for text attributes (1 millisecond) [info] - Port reservation is done correctly with user specified ports only (11 milliseconds) [info] - Port reservation is done correctly with all random ports (1 millisecond) [info] - Port reservation is done correctly with user specified ports only - multiple ranges (2 milliseconds) [info] - Port reservation is done correctly with all random ports - multiple ranges (1 millisecond)
[info] - Principal specified via spark.mesos.principal (11 milliseconds) [info] - Principal specified via spark.mesos.principal.file (17 milliseconds) [info] - Principal specified via spark.mesos.principal.file that does not exist (3 milliseconds) [info] - Principal specified via SPARK_MESOS_PRINCIPAL (2 milliseconds) [info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE (1 millisecond) [info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE that does not exist (1 millisecond) [info] - Secret specified via spark.mesos.secret (0 milliseconds) [info] - Principal specified via spark.mesos.secret.file (1 millisecond) [info] - Principal specified via spark.mesos.secret.file that does not exist (1 millisecond) [info] - Principal specified via SPARK_MESOS_SECRET (1 millisecond) [info] - Principal specified via SPARK_MESOS_SECRET_FILE (1 millisecond) [info] - Secret specified with no principal (1 millisecond) [info] - Principal specification preference (0 milliseconds) [info] - Secret specification preference (0 milliseconds) [info] MesosRestServerSuite: [info] - BatchedWriteAheadLog - write logs (361 milliseconds) [info] - test default driver overhead memory (60 milliseconds) [info] - test driver overhead memory with overhead factor (3 milliseconds) [info] - test configured driver overhead memory (2 milliseconds) [info] MesosClusterDispatcherArgumentsSuite:
Using host: 192.168.122.1
Using port: 7077
Using webUiPort: 8081
Framework Name: Spark Cluster
Spark Config properties set:
(spark.hadoop.hadoop.security.key.provider.path,test:///)
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.test.home,/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.mesos.key2,value2)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test if spark config args are passed successfully (20 milliseconds)
Using host: localhost
Using port: 1212
Using webUiPort: 2323
Framework Name: myFramework
Spark Config properties set:
(spark.hadoop.hadoop.security.key.provider.path,test:///)
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.test.home,/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test non conf settings (2 milliseconds) [info] MesosSchedulerBackendUtilSuite: [info] - ContainerInfo fails to parse invalid docker parameters (5 milliseconds) [info] - ContainerInfo parses docker parameters (1 millisecond) [info] - SPARK-28778 ContainerInfo respects Docker network configuration (2 milliseconds) [info] MesosClusterSchedulerSuite: [info] - can queue drivers (10 milliseconds) [info] - can kill queued drivers (7 milliseconds) [info] - can handle multiple roles (34 milliseconds) [info] - escapes commandline args for the shell (6 milliseconds) [info] - SPARK-22256: supports spark.mesos.driver.memoryOverhead with 384mb default (4 milliseconds) [info] - SPARK-22256: supports spark.mesos.driver.memoryOverhead with 10% default (4 milliseconds) [info] - supports spark.mesos.driverEnv.* (4 milliseconds) [info] - supports spark.mesos.network.name and spark.mesos.network.labels (3 milliseconds) [info] - supports setting fetcher cache (4 milliseconds) [info] - supports setting fetcher cache on the dispatcher (3 milliseconds) [info] - supports disabling fetcher cache (4 milliseconds) [info] - accept/decline offers with driver constraints (23 milliseconds)
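(The Mesos suites in this stretch are largely driven by spark.mesos.* and core scheduling settings rather than by application code. A hypothetical sketch of the kind of configuration they parse; the master URL and every value below are placeholders, not taken from this build:)

  import org.apache.spark.SparkConf

  object MesosConfSketch {
    def main(args: Array[String]): Unit = {
      // Settings of the sort the scheduler tests above key off.
      val conf = new SparkConf()
        .setMaster("mesos://zk://zk1:2181/mesos") // hypothetical ZooKeeper-backed master
        .setAppName("mesos-sketch")
        .set("spark.cores.max", "8")              // cap on total acquired cores
        .set("spark.executor.cores", "2")          // cores per Mesos executor
        .set("spark.mesos.network.name", "demo-net")
        .set("spark.mesos.driver.memoryOverhead", "384")
      conf.getAll.foreach { case (k, v) => println(s"$k=$v") }
    }
  }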
[info] - supports spark.mesos.driver.labels (3 milliseconds) [info] - can kill supervised drivers (5 milliseconds) [info] - BatchedWriteAheadLog - read all logs after write (400 milliseconds) [info] - BatchedWriteAheadLog - clean old logs (353 milliseconds) [info] - BatchedWriteAheadLog - clean old logs synchronously (284 milliseconds) [info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (513 milliseconds) [info] - BatchedWriteAheadLog - do not create directories or files unless write (5 milliseconds) [info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (29 milliseconds) [info] - BatchedWriteAheadLog - close after write flag (11 milliseconds) [info] ReceivedBlockHandlerSuite: [info] - SPARK-27347: do not restart outdated supervised drivers (1 second, 509 milliseconds) [info] - Declines offer with refuse seconds = 120. (2 milliseconds) [info] - BlockManagerBasedBlockHandler - store blocks (21 milliseconds) [info] - Creates an env-based reference secrets. (3 milliseconds) [info] - Creates an env-based value secrets. (4 milliseconds) [info] - Creates file-based reference secrets. (4 milliseconds) [info] - Creates a file-based value secrets. (3 milliseconds) [info] - assembles a valid driver command, escaping all confs and args (2 milliseconds) [info] - SPARK-23499: Test dispatcher priority queue with non float value (2 milliseconds) [info] - SPARK-23499: Get driver priority (3 milliseconds) [info] - SPARK-23499: Can queue drivers with priority (4 milliseconds) [info] - SPARK-23499: Can queue drivers with negative priority (1 millisecond) [info] - BlockManagerBasedBlockHandler - handle errors in storing block (2 milliseconds) [info] MesosClusterDispatcherSuite: [info] - prints usage on empty input (7 milliseconds) [info] - prints usage with only --help (1 millisecond) [info] - prints error with unrecognized options (2 milliseconds) [info] - WriteAheadLogBasedBlockHandler - store blocks (116 milliseconds) [info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (25 milliseconds) [info] - WriteAheadLogBasedBlockHandler - clean old blocks (66 milliseconds) [info] - Test Block - count messages (74 milliseconds) [info] - Test Block - isFullyConsumed (26 milliseconds) [info] Test run started [info] Test org.apache.spark.streaming.JavaDurationSuite.testGreaterEq started [info] Test org.apache.spark.streaming.JavaDurationSuite.testDiv started [info] Test org.apache.spark.streaming.JavaDurationSuite.testMinus started [info] Test org.apache.spark.streaming.JavaDurationSuite.testTimes started [info] Test org.apache.spark.streaming.JavaDurationSuite.testLess started [info] Test org.apache.spark.streaming.JavaDurationSuite.testPlus started [info] Test org.apache.spark.streaming.JavaDurationSuite.testGreater started [info] Test org.apache.spark.streaming.JavaDurationSuite.testMinutes started [info] Test org.apache.spark.streaming.JavaDurationSuite.testMilliseconds started [info] Test org.apache.spark.streaming.JavaDurationSuite.testLessEq started [info] Test org.apache.spark.streaming.JavaDurationSuite.testSeconds started [info] Test run finished: 0 failed, 0 ignored, 11 total, 0.015s [info] Test run started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testStreamingContextTransform started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMapValues started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithInverse started [info]
BLASSuite: [info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapPartitions started [info] - nativeL1Threshold (56 milliseconds) [info] - copy (22 milliseconds) [info] - scal (1 millisecond) [info] - axpy (6 milliseconds) [info] - dot (5 milliseconds) [info] - spr (4 milliseconds) [info] - syr (12 milliseconds) [info] - gemm (15 milliseconds) [info] - gemv (12 milliseconds) [info] - spmv (1 millisecond) [info] UtilsSuite: [info] - EPSILON (7 milliseconds) [info] TestingUtilsSuite: [info] - Comparing doubles using relative error. (11 milliseconds) [info] - Comparing doubles using absolute error. (2 milliseconds) [info] - Comparing vectors using relative error. (5 milliseconds) [info] - Comparing vectors using absolute error. (3 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFilter started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionFewerPartitions started [info] - Comparing Matrices using absolute error. (426 milliseconds) [info] - Comparing Matrices using relative error. (12 milliseconds) [info] BreezeMatrixConversionSuite: [info] - dense matrix to breeze (1 millisecond) [info] - dense breeze matrix to matrix (2 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testCombineByKey started [info] - sparse matrix to breeze (145 milliseconds) [info] - sparse breeze matrix to sparse matrix (15 milliseconds) [info] BreezeVectorConversionSuite: [info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextGetOrCreate started [info] - dense to breeze (295 milliseconds) [info] - sparse to breeze (160 milliseconds) [info] - dense breeze to vector (1 millisecond) [info] - sparse breeze to vector (0 milliseconds) [info] - sparse breeze with partially-used arrays to vector (2 milliseconds) [info] MultivariateGaussianSuite: Dec 05, 2021 8:09:12 PM com.github.fommil.netlib.LAPACK <clinit> WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK Dec 05, 2021 8:09:12 PM com.github.fommil.netlib.LAPACK <clinit> WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK Dec 05, 2021 8:09:12 PM com.github.fommil.netlib.BLAS <clinit> WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS Dec 05, 2021 8:09:12 PM com.github.fommil.netlib.BLAS <clinit> WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS [info] - univariate (112 milliseconds) [info] - multivariate (8 milliseconds) [info] - multivariate degenerate (0 milliseconds) [info] - SPARK-11302 (7 milliseconds) [info] MatricesSuite: [info] - dense matrix construction (0 milliseconds) [info] - dense matrix construction with wrong dimension (0 milliseconds) [info] - sparse matrix construction (114 milliseconds) [info] - sparse matrix construction with wrong number of elements (1 millisecond) [info] - index in matrices incorrect input (3 milliseconds) [info] - equals (3 milliseconds) [info] - matrix copies are deep copies (0 milliseconds) [info] - matrix indexing and updating (1 millisecond) [info] - dense to dense (1 millisecond) [info] - dense to sparse (3 milliseconds) [info] - sparse to sparse (4 milliseconds) [info] - sparse to dense (3 milliseconds) [info] - compressed dense (5 milliseconds) [info] - compressed sparse (2 milliseconds) [info] - map, update (2 milliseconds) [info] - transpose (1 millisecond) [info] - foreachActive (1 millisecond) [info] - horzcat, vertcat, eye, speye (13 milliseconds) [info] - zeros (1 millisecond) [info] - 
ones (1 millisecond) [info] - eye (1 millisecond) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindowWithSlideDuration started [info] - SPARK-36206: shuffle checksum detect disk corruption (6 seconds, 709 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testQueueStream started [info] - rand (648 milliseconds) [info] - randn (4 milliseconds) [info] - diag (1 millisecond) [info] - sprand (29 milliseconds) [info] - sprandn (3 milliseconds) [info] - toString (14 milliseconds) [info] - numNonzeros and numActives (1 millisecond) [info] - fromBreeze with sparse matrix (42 milliseconds) [info] - row/col iterator (7 milliseconds) [info] VectorsSuite: [info] - dense vector construction with varargs (0 milliseconds) [info] - dense vector construction from a double array (0 milliseconds) [info] - sparse vector construction (1 millisecond) [info] - sparse vector construction with unordered elements (2 milliseconds) [info] - sparse vector construction with mismatched indices/values array (1 millisecond) [info] - sparse vector construction with too many indices vs size (0 milliseconds) [info] - sparse vector construction with negative indices (0 milliseconds) [info] - dense to array (0 milliseconds) [info] - dense argmax (1 millisecond) [info] - sparse to array (1 millisecond) [info] - sparse argmax (1 millisecond) [info] - vector equals (3 milliseconds) [info] - vectors equals with explicit 0 (2 milliseconds) [info] - indexing dense vectors (1 millisecond) [info] - indexing sparse vectors (1 millisecond) [info] - zeros (0 milliseconds) [info] - Vector.copy (1 millisecond) [info] - fromBreeze (2 milliseconds) [info] - sqdist (50 milliseconds) [info] - foreach (8 milliseconds) [info] - foreachActive (3 milliseconds) [info] - foreachNonZero (1 millisecond) [info] - vector p-norm (5 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValue started [info] - Vector numActive and numNonzeros (1 millisecond) [info] - Vector toSparse and toDense (2 milliseconds) [info] - Vector.compressed (0 milliseconds) [info] - SparseVector.slice (0 milliseconds) [info] - SparseVector.slice with sorted indices (1 millisecond) [info] - sparse vector only supports non-negative length (2 milliseconds) [info] - dot product only supports vectors of same size (1 millisecond) [info] - dense vector dot product (0 milliseconds) [info] - sparse vector dot product (0 milliseconds) [info] - mixed sparse and dense vector dot product (0 milliseconds) [info] - iterator (2 milliseconds) [info] - activeIterator (2 milliseconds) [info] - nonZeroIterator (3 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testMap started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToNormalRDDTransform started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairReduceByKey started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testCount started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery started
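The BLASSuite, MatricesSuite, and VectorsSuite results above exercise Spark's local linear algebra. The BLAS wrapper those suites call is package-private, so the closest public surface is org.apache.spark.ml.linalg; a minimal sketch of the operations named in the test titles (norms, sqdist, dense/sparse conversion, matrix-vector multiply):

```scala
import org.apache.spark.ml.linalg.{Matrices, Vectors}

object LinalgSketch extends App {
  // Dense and sparse vectors over the same values; toSparse/toDense convert between them.
  val dense  = Vectors.dense(1.0, 0.0, 3.0)
  val sparse = Vectors.sparse(3, Array(0, 2), Array(1.0, 3.0))

  println(Vectors.norm(dense, 2.0))      // "vector p-norm": L2 norm, sqrt(10)
  println(Vectors.sqdist(dense, sparse)) // "sqdist": 0.0, same underlying values

  // "gemv"-style product through the public API: column-major 2x2 matrix times a vector.
  val m = Matrices.dense(2, 2, Array(1.0, 0.0, 0.0, 2.0))
  println(m.multiply(Vectors.dense(3.0, 4.0).toDense)) // [3.0, 8.0]
}
```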
[info] BasicDriverFeatureStepSuite: [info] - Check the pod respects all configurations from the user. (652 milliseconds) [info] - Check driver pod respects kubernetes driver request cores (33 milliseconds) [info] - Check appropriate entrypoint rerouting for various bindings (14 milliseconds) [info] - memory overhead factor: java (6 milliseconds) [info] - memory overhead factor: python default (4 milliseconds) [info] - memory overhead factor: python w/ override (4 milliseconds) [info] - memory overhead factor: r default (6 milliseconds) [info] - SPARK-35493: make spark.blockManager.port able to fall back in the driver pod (8 milliseconds) [info] - SPARK-36075: Check driver pod respects nodeSelector/driverNodeSelector (5 milliseconds) [info] EnvSecretsFeatureStepSuite: [info] - sets up all keyRefs (9 milliseconds) [info] ExecutorPodsPollingSnapshotSourceSuite: [info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' and gateway-replacement path containing an environment variable (24 seconds, 34 milliseconds) [info] - Items returned by the API should be pushed to the event queue (59 milliseconds) [info] - SPARK-36334: Support pod listing with resource version (18 milliseconds) [info] ExecutorPodsSnapshotSuite: [info] - States are interpreted correctly from pod metadata. (65 milliseconds) [info] - SPARK-30821: States are interpreted correctly from pod metadata when configured to check all containers. (20 milliseconds) [info] - Updates add new pods for non-matching ids and edit existing pods for matching ids (10 milliseconds) [info] ExecutorKubernetesCredentialsFeatureStepSuite: [info] - configure spark pod with executor service account (10 milliseconds) [info] - configure spark pod with driver service account and without executor service account (6 milliseconds) [info] - configure spark pod with driver service account and with executor service account (3 milliseconds) [info] DriverKubernetesCredentialsFeatureStepSuite: [info] - Don't set any credentials (14 milliseconds) [info] - Only set credentials that are manually mounted. (4 milliseconds) [info] - Mount credentials from the submission client as a secret. (197 milliseconds) [info] PodTemplateConfigMapStepSuite: [info] - Do nothing when executor template is not specified (3 milliseconds) [info] - Mounts executor template volume if config specified (79 milliseconds) [info] KubernetesExecutorBuilderSuite: [info] - use empty initial pod if template is not specified (130 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap started [info] - load pod template if specified (188 milliseconds) [info] - configure a custom test step (62 milliseconds) [info] - complain about misconfigured pod template (42 milliseconds) [info] KubernetesConfSuite: [info] - Resolve driver labels, annotations, secret mount paths, envs, and memory overhead (7 milliseconds) [info] - Basic executor translated fields. (2 milliseconds) [info] - resource profile not default. (1 millisecond) [info] - Image pull secrets.
(2 milliseconds) [info] - Set executor labels, annotations, and secrets (5 milliseconds) [info] - Verify that executorEnv key conforms to the regular specification (3 milliseconds) [info] - SPARK-36075: Set nodeSelector, driverNodeSelector, executorNodeSelector (3 milliseconds) [info] - SPARK-36059: Set driver.scheduler and executor.scheduler (3 milliseconds) [info] - SPARK-36566: get app name label (3 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testUnion started [info] BasicExecutorFeatureStepSuite: [info] - test spark resource missing vendor (23 milliseconds) [info] - test spark resource missing amount (3 milliseconds) [info] - basic executor pod with resources (22 milliseconds) [info] - basic executor pod has reasonable defaults (30 milliseconds) [info] - executor pod hostnames get truncated to 63 characters (25 milliseconds) [info] - SPARK-35460: invalid PodNamePrefixes (5 milliseconds) [info] - hostname truncation generates valid host names (54 milliseconds) [info] - classpath and extra java options get translated into environment variables (21 milliseconds) [info] - SPARK-32655 Support appId/execId placeholder in SPARK_EXECUTOR_DIRS (17 milliseconds) [info] - test executor pyspark memory (17 milliseconds) [info] - auth secret propagation (22 milliseconds) [info] - Auth secret shouldn't propagate if files are loaded. (27 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMap started [info] - SPARK-32661 test executor offheap memory (20 milliseconds) [info] - basic resourceprofile (21 milliseconds) [info] - resourceprofile with gpus (15 milliseconds) [info] - Verify spark conf dir is mounted as configmap volume on executor pod's container. (18 milliseconds) [info] - SPARK-34316 Disable configmap volume on executor pod's container (16 milliseconds) [info] - SPARK-35482: use correct block manager port for executor pods (28 milliseconds) [info] - SPARK-35969: Make the pod prefix more readable and tallied with K8S DNS Label Names (41 milliseconds) [info] - SPARK-36075: Check executor pod respects nodeSelector/executorNodeSelector (20 milliseconds) [info] KubernetesVolumeUtilsSuite: [info] - Parses hostPath volumes correctly (4 milliseconds) [info] - Parses subPath correctly (1 millisecond) [info] - Parses persistentVolumeClaim volumes correctly (3 milliseconds) [info] - Parses emptyDir volumes correctly (1 millisecond) [info] - Parses emptyDir volume options as optional (1 millisecond) [info] - Defaults optional readOnly to false (1 millisecond) [info] - Fails on missing mount key (1 millisecond) [info] - Fails on missing option key (5 milliseconds) [info] - SPARK-33063: Fails on missing option key in persistentVolumeClaim (5 milliseconds) [info] - Parses read-only nfs volumes correctly (6 milliseconds) [info] - Parses read/write nfs volumes correctly (1 millisecond) [info] - Fails on missing path option (1 millisecond) [info] - Fails on missing server option (2 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindowWithInverse started [info] KubernetesClusterSchedulerBackendSuite: [info] Test test.org.apache.spark.streaming.JavaAPISuite.testGlom started [info] - using external shuffle service (6 seconds, 594 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testJoin started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFlatMap started [info] - Start all components (6 milliseconds) [info] - Stop all components (15 milliseconds)
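KubernetesVolumeUtilsSuite parses volume specifications out of configuration keys. As a rough illustration of the property shapes involved (the volume names `checkpoints` and `data` are invented here; the key patterns follow the Spark-on-Kubernetes docs):

```scala
import org.apache.spark.SparkConf

// spark.kubernetes.{driver|executor}.volumes.[volumeType].[volumeName].{mount|options}.*
val conf = new SparkConf()
  .set("spark.kubernetes.executor.volumes.hostPath.checkpoints.mount.path", "/checkpoints")
  .set("spark.kubernetes.executor.volumes.hostPath.checkpoints.mount.readOnly", "false")
  .set("spark.kubernetes.executor.volumes.hostPath.checkpoints.options.path", "/mnt/checkpoints")
  .set("spark.kubernetes.executor.volumes.persistentVolumeClaim.data.mount.path", "/data")
  .set("spark.kubernetes.executor.volumes.persistentVolumeClaim.data.options.claimName", "data-pvc")
  // plain node selector; SPARK-36075 adds driver- and executor-specific variants
  .set("spark.kubernetes.node.selector.disktype", "ssd")
```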
[info] - Remove executor (77 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToPairFlatMapWithChangingTypes started [info] - Kill executors (168 milliseconds) [info] - SPARK-34407: CoarseGrainedSchedulerBackend.stop may throw SparkException (10 milliseconds) [info] - SPARK-34469: Ignore RegisterExecutor when SparkContext is stopped (5 milliseconds) [info] - Dynamically fetch an executor ID (1 millisecond) [info] KubernetesDriverBuilderSuite: [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMapPartitions started [info] - use empty initial pod if template is not specified (72 milliseconds) [info] - load pod template if specified (40 milliseconds) [info] - configure a custom test step (43 milliseconds) [info] - complain about misconfigured pod template (20 milliseconds) [info] LocalDirsFeatureStepSuite: [info] - Resolve to default local dir if neither env nor configuration are set (2 milliseconds) [info] - Use configured local dirs split on comma if provided. (3 milliseconds) [info] - Use tmpfs to back default local dir (1 millisecond) [info] - local dir on mounted volume (3 milliseconds) [info] ExecutorPodsWatchSnapshotSourceSuite: [info] - Watch events should be pushed to the snapshots store as snapshot updates. (4 milliseconds) [info] ExecutorPodsAllocatorSuite: [info] - SPARK-36052: test splitSlots (3 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionMorePartitions started [info] - SPARK-36052: pending pod limit with multiple resource profiles (52 milliseconds) [info] - Initially request executors in batches. Do not request another batch if the first has not finished. (14 milliseconds) [info] - Request executors in batches. Allow another batch to be requested if all pending executors start running. (32 milliseconds) [info] - When a current batch reaches error states immediately, re-request them on the next batch. (13 milliseconds) [info] - Verify stopping deletes the labeled pods (3 milliseconds) [info] - When an executor is requested but the API does not report it in a reasonable time, retry requesting that executor. (8 milliseconds) [info] - SPARK-28487: scale up and down on target executor count changes (19 milliseconds) [info] - SPARK-34334: correctly identify timed out pending pod requests as excess (11 milliseconds) [info] - SPARK-33099: Respect executor idle timeout configuration (14 milliseconds) [info] - SPARK-34361: scheduler backend known pods with multiple resource profiles at downscaling (26 milliseconds) [info] - SPARK-33288: multiple resource profiles (19 milliseconds) [info] - SPARK-33262: pod allocator does not stall with pending pods (10 milliseconds) [info] - SPARK-35416: Support PersistentVolumeClaim Reuse (34 milliseconds) [info] - print the pod name instead of Some(name) if pod is absent (4 milliseconds) [info] ExecutorPodsSnapshotsStoreSuite: [info] - Subscribers get notified of events periodically. (9 milliseconds) [info] - Even without sending events, initially receive an empty buffer. (2 milliseconds) [info] - Replacing the snapshot passes the new snapshot to subscribers. (2 milliseconds) [info] ExecutorPodsLifecycleManagerSuite: [info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithoutInverse started [info] - When an executor reaches error states immediately, remove from the scheduler backend. (55 milliseconds)
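ExecutorPodsAllocatorSuite covers the batched pod-request loop those test names describe: request at most one batch of pods, wait for them to leave Pending, then request the next batch. A sketch of the knobs involved (values illustrative; the PVC-reuse flags correspond to SPARK-35416 as of Spark 3.2, to the best of my recollection):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.kubernetes.allocation.batch.size", "5")    // executor pods requested per round
  .set("spark.kubernetes.allocation.batch.delay", "1s")  // pause between allocation rounds
  .set("spark.dynamicAllocation.enabled", "true")        // drives the scale-up/down paths tested above
  .set("spark.dynamicAllocation.shuffleTracking.enabled", "true") // no external shuffle service on K8s
  .set("spark.kubernetes.driver.ownPersistentVolumeClaim", "true")   // SPARK-35416
  .set("spark.kubernetes.driver.reusePersistentVolumeClaim", "true") // reuse PVCs of deleted executors
```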
[info] - Don't remove executors twice from Spark but remove from K8s repeatedly. (3 milliseconds) [info] - When the scheduler backend lists executor ids that aren't present in the cluster, remove those executors from Spark. (4 milliseconds) [info] - Keep executor pods in k8s if configured. (5 milliseconds) [info] StatefulSetAllocatorSuite: [info] - Validate initial statefulSet creation & cleanup with two resource profiles (27 milliseconds) [info] - Validate statefulSet scale up (2 milliseconds) [info] HadoopConfDriverFeatureStepSuite: [info] - mount hadoop config map if defined (3 milliseconds) [info] - create hadoop config map if config dir is defined (6 milliseconds) [info] KubernetesClusterManagerSuite: [info] - constructing an AbstractPodsAllocator works (6 milliseconds) [info] KubernetesClientUtilsSuite: [info] - verify load files, loads only allowed files and not the disallowed files. (22 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testLeftOuterJoin started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransform started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransformWith started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransformWith started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testTextFileStream started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairGroupByKey started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testCoGroup started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testInitialization started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketString started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testGroupByKeyAndWindow started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindow started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testForeachRDD started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testFileStream started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairTransform started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testFilter started [info] - SPARK-25888: using external shuffle service fetching disk persisted blocks (4 seconds, 654 milliseconds) [info] - verify load files, truncates the content to maxSize, when keys are very large in number. (2 seconds, 936 milliseconds) [info] - verify load files, truncates the content to maxSize, when keys are equal in length. (4 milliseconds) [info] MountVolumesFeatureStepSuite: [info] - Mounts hostPath volumes (3 milliseconds) [info] - Mounts persistentVolumeClaims (3 milliseconds) [info] - SPARK-32713 Mounts parameterized persistentVolumeClaims in executors (2 milliseconds) [info] - Create and mounts persistentVolumeClaims in driver (1 millisecond) [info] - Create and mount persistentVolumeClaims in executors (1 millisecond) [info] - Mounts emptyDir (5 milliseconds) [info] - Mounts emptyDir with no options (1 millisecond) [info] - Mounts read/write nfs volumes (4 milliseconds) [info] - Mounts read-only nfs volumes (2 milliseconds) [info] - Mounts multiple volumes (1 millisecond) [info] ClosureCleanerSuite: [info] - mountPath should be unique (2 milliseconds) [info] - Mounts subpath on emptyDir (1 millisecond) [info] - Mounts subpath on persistentVolumeClaims (1 millisecond) [info] - Mounts multiple subpaths (2 milliseconds) [info] ClientSuite: [info] - The client should configure the pod using the builder.
(12 milliseconds) [info] - The client should create Kubernetes resources (4 milliseconds) [info] - All files from SPARK_CONF_DIR that are within the size limit, except templates, spark config and binary files, should be populated to the pod's configMap. (16 milliseconds) [info] - Waiting for app completion should stall on the watcher (2 milliseconds) [info] K8sSubmitOpSuite: [info] - closures inside an object (180 milliseconds) [info] - List app status (7 milliseconds) [info] - List status for multiple apps with glob (4 milliseconds) [info] - Kill app (3 milliseconds) [info] - Kill app with gracePeriod (2 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap2 started [info] - Kill multiple apps with glob without gracePeriod (4 milliseconds) [info] KubernetesLocalDiskShuffleDataIOSuite: [info] - closures inside a class (159 milliseconds) [info] - closures inside a class with no default constructor (69 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapValues started [info] - closures that don't use fields of the outer class (68 milliseconds) [info] - nested closures inside an object (97 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduce started [info] - nested closures inside a class (145 milliseconds) [info] - top-level return statements in closures are identified at cleaning time (94 milliseconds) [info] - return statements from named functions nested in closures don't raise exceptions (81 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKey started [info] - user provided closures are actually cleaned (166 milliseconds) [info] - createNullValue (6 milliseconds) [info] UnpersistSuite: [info] - unpersist RDD (103 milliseconds) [info] PeriodicRDDCheckpointerSuite: [info] - Persisting (10 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransform started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindow started [info] - Checkpointing (528 milliseconds) [info] TaskSetManagerSuite: [info] - TaskSet with no preferences (84 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValueAndWindow started [info] - multiple offers with no preferences (65 milliseconds) [info] - skip unsatisfiable locality levels (69 milliseconds) [info] - basic delay scheduling (68 milliseconds) [info] - we do not need to delay scheduling when we only have noPref tasks in the queue (41 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testRawSocketStream started [info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketTextStream started [info] - delay scheduling with fallback (168 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKeyWithInitial started [info] - delay scheduling with failed hosts (91 milliseconds) [info] - task result lost (142 milliseconds) [info] - repeated failures lead to task set abortion (100 milliseconds) [info] - executors should be excluded after task failure, in spite of locality preferences (63 milliseconds) [info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextState started [info] - new executors get added and lost (38 milliseconds) [info] - Executors exit for reason unrelated to currently running tasks (45 milliseconds) [info] Test run finished: 0 failed, 0 ignored, 53 total, 17.666s [info] Test run started [info] Test test.org.apache.spark.streaming.Java8APISuite.testStreamingContextTransform started
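The ClosureCleanerSuite entries interleaved above guard the classic capture problem: a closure that reads a field drags the whole enclosing instance into the serialized task. A minimal sketch of the failure mode and the usual local-copy fix (the `Scaler` class is invented for illustration):

```scala
import org.apache.spark.rdd.RDD

class Scaler(factor: Int) { // note: Scaler itself is not Serializable
  // Captures `this` in order to read `factor`; when the outer reference cannot
  // be severed, the job fails with a NotSerializableException.
  def scaleBad(rdd: RDD[Int]): RDD[Int] = rdd.map(_ * factor)

  // Copy the field into a local val first; the closure then captures only an Int.
  def scaleGood(rdd: RDD[Int]): RDD[Int] = {
    val f = factor
    rdd.map(_ * f)
  }
}
```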
[info] - SPARK-31837: Shift to the new highest locality level if there is one when recomputeLocality (50 milliseconds) [info] - SPARK-32653: Decommissioned host should not be used to calculate locality levels (63 milliseconds) [info] - SPARK-32653: Decommissioned executor should not be used to calculate locality levels (65 milliseconds) [info] - test RACK_LOCAL tasks (35 milliseconds) [info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMapValues started [info] - do not emit warning when serialized task is small (33 milliseconds) [info] - emit warning when serialized task is large (46 milliseconds) [info] - Not serializable exception thrown if the task cannot be serialized (35 milliseconds) [info] Test test.org.apache.spark.streaming.Java8APISuite.testMapPartitions started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFilter started [info] Test test.org.apache.spark.streaming.Java8APISuite.testCombineByKey started [info] Test test.org.apache.spark.streaming.Java8APISuite.testMap started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToNormalRDDTransform started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairReduceByKey started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap started [info] - abort the job if total size of results is too large (1 second, 603 milliseconds) [info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMap started [info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindowWithInverse started [info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByWindow started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFlatMap started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToPairFlatMapWithChangingTypes started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMapPartitions started [info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransform started [info] Test test.org.apache.spark.streaming.Java8APISuite.testTransformWith started [info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransformWith started [info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindow started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairTransform started [info] Test test.org.apache.spark.streaming.Java8APISuite.testFilter started [info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap2 started [info] Test test.org.apache.spark.streaming.Java8APISuite.testMapValues started [info] Test test.org.apache.spark.streaming.Java8APISuite.testReduce started [info] Test test.org.apache.spark.streaming.Java8APISuite.testUpdateStateByKey started [info] Test test.org.apache.spark.streaming.Java8APISuite.testTransform started [info] Test run finished: 0 failed, 0 ignored, 26 total, 6.461s [info] Test run started [info] Test org.apache.spark.streaming.JavaMapWithStateSuite.testBasicFunction started [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.438s [info] Test run started [info] Test org.apache.spark.streaming.JavaWriteAheadLogSuite.testCustomWAL started [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.003s [info] Test run started [info] Test org.apache.spark.streaming.JavaReceiverAPISuite.testReceiver started [info] Test run finished: 0 failed, 0 ignored, 1 total, 1.223s [info] Test run started [info] Test org.apache.spark.streaming.JavaTimeSuite.testGreaterEq started [info] Test org.apache.spark.streaming.JavaTimeSuite.testLess started
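The "abort the job if total size of results is too large" case above maps to the driver-side result cap: once the serialized results of an action exceed it, Spark fails the job rather than letting collected results exhaust driver memory. An illustrative setting:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.driver.maxResultSize", "1g") // total serialized results per action; "0" disables the limit
```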
[info] Test org.apache.spark.streaming.JavaTimeSuite.testPlus started [info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusDuration started [info] Test org.apache.spark.streaming.JavaTimeSuite.testGreater started [info] Test org.apache.spark.streaming.JavaTimeSuite.testLessEq started [info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusTime started [info] Test run finished: 0 failed, 0 ignored, 7 total, 0.001s [info] - SPARK-32470: do not check total size of intermediate stages (7 seconds, 296 milliseconds) [info] - [SPARK-13931] taskSetManager should not send Resubmitted tasks after being a zombie (62 milliseconds) [info] - [SPARK-22074] Task killed by other attempt task should not be resubmitted (91 milliseconds) [info] - speculative and noPref task should be scheduled after node-local (65 milliseconds) [info] - node-local tasks should be scheduled right away when there are only node-local and no-preference tasks (52 milliseconds) [info] - recompute is not blocked by the recovery (13 seconds, 69 milliseconds) [info] - SPARK-4939: node-local tasks should be scheduled right after process-local tasks finished (53 milliseconds) [info] - SPARK-4939: no-pref tasks should be scheduled after process-local tasks finished (54 milliseconds) [info] - Ensure TaskSetManager is usable after addition of levels (54 milliseconds) [info] KafkaHadoopDelegationTokenManagerSuite: [info] - Test that locations with HDFSCacheTaskLocation are treated as PROCESS_LOCAL. (61 milliseconds) [info] - Test TaskLocation for different host type. (2 milliseconds) [info] - Kill other task attempts when one attempt belonging to the same task succeeds (79 milliseconds) [info] - Killing speculative tasks does not count towards aborting the taskset (57 milliseconds) [info] - default configuration (220 milliseconds) [info] - SPARK-19868: DAGScheduler only notified of taskEnd when state is ready (186 milliseconds) [info] KafkaConfigUpdaterSuite: [info] - set should always set value (11 milliseconds) [info] - SPARK-17894: Verify TaskSetManagers for different stage attempts have unique names (122 milliseconds) [info] - setIfUnset without existing key should set value (1 millisecond) [info] - setIfUnset with existing key should not set value (4 milliseconds) [info] - don't update excludelist for shuffle-fetch failures, preemption, denied commits, or killed tasks (69 milliseconds) [info] - update application healthTracker for shuffle-fetch (84 milliseconds) [info] - update healthTracker before adding pending task to avoid race condition (67 milliseconds) [info] - SPARK-21563 context's added jars shouldn't change mid-TaskSet (70 milliseconds) [info] - SPARK-24677: Avoid NoSuchElementException from MedianHeap (63 milliseconds) [info] - SPARK-24755 Executor loss can cause task to not be resubmitted (59 milliseconds) [info] - SPARK-13343 speculative tasks that didn't commit shouldn't be marked as success (99 milliseconds) [info] - SPARK-13704 Rack Resolution is done with a batch of de-duped hosts (145 milliseconds) [info] - TaskSetManager passes task resource along (80 milliseconds) [info] - setAuthenticationConfigIfNeeded with global security should not set values (1 second, 78 milliseconds) [info] - SPARK-26755 Ensure that a speculative task is submitted only once for execution (199 milliseconds) [info] - setAuthenticationConfigIfNeeded with token should set values (57 milliseconds) [info] - setAuthenticationConfigIfNeeded with token should not override user-defined protocol (31 milliseconds)
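Several of the TaskSetManagerSuite cases threaded through this stretch (SPARK-26755 and the SPARK-29976 and SPARK-33741 cases below) concern speculative execution. The relevant knobs, with illustrative values; the duration-threshold and minimum-runtime keys are the newer additions as I understand them, so verify against the docs for your Spark version:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.speculation", "true")
  .set("spark.speculation.interval", "100ms")  // how often to scan for stragglers
  .set("spark.speculation.quantile", "0.75")   // fraction of tasks that must finish before speculating
  .set("spark.speculation.multiplier", "1.5")  // straggler = 1.5x the median successful runtime
  .set("spark.speculation.task.duration.threshold", "5min") // SPARK-29976: speculate past this regardless
  .set("spark.speculation.minTaskRuntime", "30s")           // SPARK-33741: never speculate sooner than this
```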
[info] - setAuthenticationConfigIfNeeded with invalid mechanism should throw exception (26 milliseconds) [info] - setAuthenticationConfigIfNeeded without security should not set values (21 milliseconds) [info] KafkaTokenUtilSuite: [info] - SPARK-26755 Ensure that a speculative task obeys original locality preferences (130 milliseconds) [info] - checkProxyUser with proxy current user should throw exception (71 milliseconds) [info] - createAdminClientProperties with SASL_PLAINTEXT protocol should not include keystore and truststore config (9 milliseconds) [info] - createAdminClientProperties with SASL_SSL protocol should include truststore config (5 milliseconds) [info] - createAdminClientProperties with SSL protocol should include keystore and truststore config (4 milliseconds) [info] - createAdminClientProperties with global config should not set dynamic jaas config (2 milliseconds) [info] - createAdminClientProperties with keytab should set keytab dynamic jaas config (9 milliseconds) [info] - createAdminClientProperties without keytab should set ticket cache dynamic jaas config (2 milliseconds) [info] - createAdminClientProperties with specified params should include it (2 milliseconds) [info] - isGlobalJaasConfigurationProvided without global config should return false (2 milliseconds) [info] - isGlobalJaasConfigurationProvided with global config should return false (1 millisecond) [info] - findMatchingTokenClusterConfig without token should return None (16 milliseconds) [info] - findMatchingTokenClusterConfig with non-matching tokens should return None (17 milliseconds) [info] - SPARK-29976 when a speculation time threshold is provided, should speculatively run the task even if there are not enough successful runs, total tasks: 1 (154 milliseconds) [info] - findMatchingTokenClusterConfig with one matching token should return token and cluster configuration (15 milliseconds) [info] - findMatchingTokenClusterConfig with multiple matching tokens should throw exception (15 milliseconds) [info] - getTokenJaasParams with token should return scram module (12 milliseconds) [info] - needTokenUpdate without cluster config should return false (2 milliseconds) [info] - needTokenUpdate without jaas config should return false (2 milliseconds) [info] - needTokenUpdate with same token should return false (15 milliseconds) [info] - SPARK-29976: when the speculation time threshold is not provided, don't speculatively run if there are not enough successful runs, total tasks: 1 (42 milliseconds) [info] - needTokenUpdate with different token should return true (14 milliseconds) [info] KafkaRedactionUtilSuite: [info] - redactParams shouldn't throw exception when no SparkEnv available (4 milliseconds) [info] - redactParams should give back empty parameters (2 milliseconds) [info] - redactParams should give back null value (2 milliseconds) [info] - redactParams should redact non String parameters (7 milliseconds) [info] - SPARK-29976 when a speculation time threshold is provided, should speculatively run the task even if there are not enough successful runs, total tasks: 2 (71 milliseconds) [info] - redactParams should redact token password from parameters (14 milliseconds) [info] - redactParams should redact passwords from parameters (3 milliseconds) [info] - redactJaasParam should give back null (1 millisecond) [info] - redactJaasParam should give back empty string (0 milliseconds) [info] - redactJaasParam should redact token password (13 milliseconds) [info] KafkaTokenSparkConfSuite:
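KafkaTokenSparkConfSuite, whose cases follow, reads per-cluster delegation-token settings keyed by an arbitrary cluster identifier. A sketch of the shape these properties take, with `cluster1` and the broker addresses invented:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.kafka.clusters.cluster1.auth.bootstrap.servers", "broker1:9093,broker2:9093")
  .set("spark.kafka.clusters.cluster1.target.bootstrap.servers.regex", ".*broker.*:9093")
  .set("spark.kafka.clusters.cluster1.security.protocol", "SASL_SSL")
  .set("spark.kafka.clusters.cluster1.sasl.token.mechanism", "SCRAM-SHA-512")
```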
[info] - getClusterConfig should throw exception when not exists (3 milliseconds) [info] - getClusterConfig should return entry with defaults (2 milliseconds) [info] - getClusterConfig should return entry overwrite defaults (1 millisecond) [info] - getClusterConfig should return specified kafka params (1 millisecond) [info] - getAllClusterConfigs should return empty list when nothing configured (4 milliseconds) [info] - getAllClusterConfigs should return empty list with malformed configuration (2 milliseconds) [info] - getAllClusterConfigs should return multiple entries (3 milliseconds) [info] - SPARK-29976: when the speculation time threshold is not provided, don't speculatively run if there are not enough successful runs, total tasks: 2 (75 milliseconds) [info] - SPARK-29976 when a speculation time threshold is provided, should not speculate if there are too many tasks in the stage even though time threshold is provided (45 milliseconds) [info] - SPARK-21040: Check speculative tasks are launched when an executor is decommissioned and the tasks running on it cannot finish within EXECUTOR_DECOMMISSION_KILL_INTERVAL (54 milliseconds) [info] - SPARK-29976 Regular speculation configs should still take effect even when a threshold is provided (53 milliseconds) [info] - SPARK-30417 when spark.task.cpus is greater than spark.executor.cores due to standalone settings, speculate if there is only one task in the stage (73 milliseconds) [info] - TaskOutputFileAlreadyExistException leads to task set abortion (40 milliseconds) [info] - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'file' (24 seconds, 47 milliseconds) [info] KinesisInputDStreamBuilderSuite: [info] - should raise an exception if the StreamingContext is missing (47 milliseconds) [info] - should raise an exception if the stream name is missing (10 milliseconds) [info] - should raise an exception if the checkpoint app name is missing (3 milliseconds) [info] - SPARK-30359: don't clean executorsPendingToRemove at the beginning of CoarseGrainedSchedulerBackend.reset (3 seconds, 831 milliseconds) [info] - SPARK-33741 Test minimum amount of time a task runs before being considered for speculation (60 milliseconds) [info] BlockManagerBasicStrategyReplicationSuite: [info] - get peers with addition and removal of block managers (22 milliseconds) [info] - should propagate required values to KinesisInputDStream (338 milliseconds) [info] - should propagate default values to KinesisInputDStream (7 milliseconds) [info] - block replication - 2x replication (106 milliseconds) [info] - block replication - 3x replication (112 milliseconds) [info] - should propagate custom non-auth values to KinesisInputDStream (551 milliseconds) [info] - old Api should throw UnsupportedOperationException with AT_TIMESTAMP (3 milliseconds) [info] - block replication - mixed between 1x to 5x (160 milliseconds) [info] WithoutAggregationKinesisBackedBlockRDDSuite: [info] - Basic reading from Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Read data available in both block manager and Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Read data available only in block manager, not in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Read data available only in Kinesis, not in block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
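KinesisInputDStreamBuilderSuite validates the builder sketched below; streamingContext, streamName, and checkpointAppName are the required fields whose absence the first three cases assert on. Stream name, app name, and region are placeholders:

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kinesis.{KinesisInitialPositions, KinesisInputDStream}

def kinesisStream(ssc: StreamingContext) =
  KinesisInputDStream.builder
    .streamingContext(ssc)              // required
    .streamName("my-stream")            // required
    .checkpointAppName("my-app")        // required; also names the DynamoDB lease table
    .regionName("us-east-1")
    .initialPosition(new KinesisInitialPositions.Latest)
    .checkpointInterval(Seconds(10))
    .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
    .build()
```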
[info] - Read data available partially in block manager, rest in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Test isBlockValid skips block fetching from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Test whether RDD is valid after removing blocks from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] KinesisReceiverSuite: [info] - block replication - off-heap (35 milliseconds) [info] - block replication - 2x replication without peers (1 millisecond) [info] - block replication - replication failures (52 milliseconds) [info] - process records including store and set checkpointer (37 milliseconds) [info] - split into multiple processes if a limitation is set (3 milliseconds) [info] - shouldn't store and update checkpointer when receiver is stopped (4 milliseconds) [info] - shouldn't update checkpointer when exception occurs during store (11 milliseconds) [info] - shutdown should checkpoint if the reason is TERMINATE (6 milliseconds) [info] - shutdown should not checkpoint if the reason is something other than TERMINATE (1 millisecond) [info] - retry success on first attempt (3 milliseconds) [info] - retry success on second attempt after a Kinesis throttling exception (63 milliseconds) [info] - test block replication failures when block is received by remote block manager but putBlock fails (stream = false) (61 milliseconds) [info] - retry success on second attempt after a Kinesis dependency exception (60 milliseconds) [info] - retry failed after a shutdown exception (4 milliseconds) [info] - retry failed after an invalid state exception (4 milliseconds) [info] - retry failed after unexpected exception (3 milliseconds) [info] - retry failed after exhausting all retries (55 milliseconds) [info] - test block replication failures when block is received by remote block manager but putBlock fails (stream = true) (30 milliseconds) [info] WithAggregationKinesisBackedBlockRDDSuite: [info] - Basic reading from Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Read data available in both block manager and Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Read data available only in block manager, not in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Read data available only in Kinesis, not in block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Read data available partially in block manager, rest in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Test isBlockValid skips block fetching from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Test whether RDD is valid after removing blocks from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] WithAggregationKinesisStreamSuite: [info] - block replication - addition and deletion of block managers (121 milliseconds) [info] BlockManagerMasterSuite: [info] - SPARK-31422: getMemoryStatus should not fail after BlockManagerMaster stops (4 milliseconds) [info] - SPARK-31422: getStorageStatus should not fail after BlockManagerMaster stops (1 millisecond) [info] RDDOperationGraphSuite: [info] - RDD generation (46 milliseconds) [info] - Test simple cluster equals (1 millisecond) [info] ShuffleExternalSorterSuite: [info] - basic operation [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! 
IGNORED !!! [info] - custom message handling [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Kinesis read with custom configurations (6 milliseconds) [info] - nested spill should be no-op (83 milliseconds) [info] - split and merge shards in a stream [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - failure recovery [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Prepare KinesisTestUtils [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] KinesisCheckpointerSuite: [info] ChunkedByteBufferSuite: [info] - checkpoint is not called twice for the same sequence number (8 milliseconds) [info] - checkpoint is called after sequence number increases (2 milliseconds) [info] - should checkpoint if we have exceeded the checkpoint interval (6 milliseconds) [info] - shouldn't checkpoint if we have not exceeded the checkpoint interval (1 millisecond) [info] - should not checkpoint for the same sequence number (3 milliseconds) [info] - removing checkpointer checkpoints one last time (1 millisecond) [info] - no chunks (0 milliseconds) [info] - getChunks() duplicates chunks (0 milliseconds) [info] - copy() does not affect original buffer's position (1 millisecond) [info] - writeFully() does not affect original buffer's position (0 milliseconds) [info] - if checkpointing is going on, wait until finished before removing and checkpointing (77 milliseconds) [info] - SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize (63 milliseconds) [info] - toArray() (2 milliseconds) [info] - toArray() throws UnsupportedOperationException if size exceeds 2GB (3 milliseconds) [info] - toInputStream() (1 millisecond) [info] SparkAWSCredentialsBuilderSuite: [info] - should build DefaultCredentials when given no params (3 milliseconds) [info] - should build BasicCredentials (0 milliseconds) [info] - should build STSCredentials (1 millisecond) [info] - SparkAWSCredentials classes should be serializable (4 milliseconds) [info] HistoryServerDiskManagerSuite: [info] WithoutAggregationKinesisStreamSuite: [info] - leasing space (64 milliseconds) [info] - RDD generation (8 milliseconds) [info] - tracking active stores (11 milliseconds) [info] - approximate size heuristic (1 millisecond) [info] - basic operation [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - custom message handling [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - SPARK-32024: update ApplicationStoreInfo.size during initializing (13 milliseconds) [info] PythonBroadcastSuite: [info] - Kinesis read with custom configurations (4 milliseconds) [info] - PythonBroadcast can be serialized with Kryo (SPARK-4882) (17 milliseconds) [info] - split and merge shards in a stream [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - failure recovery [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! [info] - Prepare KinesisTestUtils [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!! 
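SparkAWSCredentialsBuilderSuite covers the companion credentials builder for the Kinesis connector; a sketch with placeholder values (real deployments should prefer the default provider chain over inline keys):

```scala
import org.apache.spark.streaming.kinesis.SparkAWSCredentials

// "should build BasicCredentials": placeholder keys only, never hardcode real ones.
val basic = SparkAWSCredentials.builder
  .basicCredentials("ACCESS_KEY_ID", "SECRET_KEY")
  .build()

// "should build STSCredentials": assume a role on top of the longer-lived provider.
val sts = SparkAWSCredentials.builder
  .stsCredentials("arn:aws:iam::123456789012:role/example", "example-session")
  .build()
```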
[info] KeyLockSuite: [info] - The same key should wait when its lock is held (10 milliseconds) [info] - A different key should not be locked (2 milliseconds) [info] Test run started [info] Test org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilderOldApi started [info] NettyBlockTransferServiceSuite: [info] - can bind to a random port (17 milliseconds) [info] - can bind to two random ports (32 milliseconds) [info] - can bind to a specific port (20 milliseconds) [info] - can bind to a specific port twice and the second increments (34 milliseconds) [info] - SPARK-27637: test fetch block with executor dead (81 milliseconds) [info] BasicSchedulerIntegrationSuite: [info] Test org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilder started [info] - super simple job (119 milliseconds) [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.352s [info] - multi-stage job (168 milliseconds) [info] - job with fetch failure (381 milliseconds) [info] - job failure after 4 attempts (63 milliseconds) [info] - SPARK-23626: RDD with expensive getPartitions() doesn't block scheduler loop (82 milliseconds) [info] JobWaiterSuite: [info] - call jobFailed multiple times (2 milliseconds) [info] RDDBarrierSuite: [info] - create an RDDBarrier (4 milliseconds) [info] - RDDBarrier mapPartitionsWithIndex (52 milliseconds) [info] - create an RDDBarrier in the middle of a chain of RDDs (6 milliseconds) [info] - RDDBarrier with shuffle (6 milliseconds) [info] UninterruptibleThreadSuite: [info] MathExpressionsSuite: [info] - interrupt when runUninterruptibly is running (1 second, 2 milliseconds) [info] - interrupt before runUninterruptibly runs (1 millisecond) [info] - nested runUninterruptibly (10 milliseconds) [info] - Partial recompute shuffle data (11 seconds, 668 milliseconds) [info] - stress test (1 second, 892 milliseconds) [info] BlockManagerDecommissionUnitSuite: [info] - conv (3 seconds, 574 milliseconds) [info] - e (77 milliseconds) [info] - pi (38 milliseconds) [info] - sin (1 second, 647 milliseconds) [info] - csc (1 second, 245 milliseconds) [info] - test that with no blocks we finish migration (5 seconds, 35 milliseconds) [info] - block decom manager with no migrations configured (8 milliseconds) [info] - block decom manager with no peers (4 milliseconds) [info] - asin (740 milliseconds) [info] - sinh (1 second, 218 milliseconds) [info] - asinh (1 second, 452 milliseconds) [info] - cos (1 second, 425 milliseconds) [info] - block decom manager with only shuffle files time moves forward (5 seconds, 32 milliseconds) [info] - A new rdd and full recovery of old data (12 seconds, 289 milliseconds) [info] - sec (1 second, 323 milliseconds) [info] - acos (745 milliseconds) [info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'file' (23 seconds, 39 milliseconds) [info] - cosh (1 second, 284 milliseconds) [info] - acosh (1 second, 216 milliseconds) [info] - block decom manager does not re-add removed shuffle files (5 seconds, 6 milliseconds) [info] - tan (1 second, 163 milliseconds) [info] - cot (877 milliseconds) [info] - atan (845 milliseconds) [info] - tanh (678 milliseconds) [info] - atanh (783 milliseconds) [info] - toDegrees (797 milliseconds) [info] - block decom manager handles IO failures (5 seconds, 35 milliseconds) [info] - toRadians (973 milliseconds) [info] - cbrt (859 milliseconds) [info] - ceil (2 seconds, 1 milliseconds) [info] - Multi stages (12 seconds, 763 milliseconds) 
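RDDBarrierSuite above exercises barrier execution mode: all tasks of a barrier stage launch together and can synchronize mid-stage. A minimal sketch:

```scala
import org.apache.spark.{BarrierTaskContext, SparkContext}

def barrierStage(sc: SparkContext): Array[Int] =
  sc.parallelize(1 to 100, 4)
    .barrier()                    // marks the stage as a barrier stage
    .mapPartitions { iter =>
      val ctx = BarrierTaskContext.get()
      ctx.barrier()               // every one of the 4 tasks must reach this point
      iter.map(_ * 2)
    }
    .collect()
```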
[info] KerberosConfDriverFeatureStepSuite: [info] - mount krb5 config map if defined (26 milliseconds) [info] - create krb5.conf config map if local config provided (25 milliseconds) [info] - create keytab secret if client keytab file used (17 milliseconds) [info] - do nothing if container-local keytab used (11 milliseconds) [info] - mount delegation tokens if provided (12 milliseconds) [info] - create delegation tokens if needed (39 milliseconds) [info] - do nothing if no config and no tokens (21 milliseconds) [info] MountSecretsFeatureStepSuite: [info] - mounts all given secrets (2 milliseconds) [info] DriverServiceFeatureStepSuite: [info] - Headless service has a port for the driver RPC, the block manager and driver ui. (5 milliseconds) [info] - Hostname and ports are set according to the service name. (1 millisecond) [info] - Ports should resolve to defaults in SparkConf and in the service. (1 millisecond) [info] - Long prefixes should switch to using a generated unique name. (9 milliseconds) [info] - Disallow bind address and driver host to be set explicitly. (1 millisecond) [info] DriverCommandFeatureStepSuite: [info] - java resource (1 millisecond) [info] - python resource (2 milliseconds) [info] - python executable precedence (3 milliseconds) [info] - R resource (0 milliseconds) [info] - SPARK-25355: java resource args with proxy-user (1 millisecond) [info] - SPARK-25355: python resource args with proxy-user (1 millisecond) [info] - SPARK-25355: R resource args with proxy-user (1 millisecond) [info] KubernetesUtilsSuite: [info] - Selects the given container as spark container. (2 milliseconds) [info] - Selects the first container if no container name is given. (1 millisecond) [info] - Falls back to the first container if given container name does not exist. (2 milliseconds) [info] - constructs spark pod correctly with pod template with no containers (0 milliseconds) [info] - floor (1 second, 234 milliseconds) [info] - factorial (643 milliseconds) [info] - block decom manager short circuits removed blocks (5 seconds, 34 milliseconds) [info] - test shuffle and cached rdd migration without any error (35 milliseconds) [info] DriverSuite: [info] - driver should exit after finishing without cleanup (SPARK-530) !!! IGNORED !!! [info] CompactBufferSuite: [info] - empty buffer (1 millisecond) [info] - basic inserts (4 milliseconds) [info] - adding sequences (3 milliseconds) [info] - adding the same buffer to itself (2 milliseconds) [info] MapStatusSuite: [info] - compressSize (0 milliseconds) [info] - decompressSize (2 milliseconds) [info] - MapStatus should never report non-empty blocks' sizes as 0 (442 milliseconds) [info] - large tasks should use org.apache.spark.scheduler.HighlyCompressedMapStatus (1 millisecond) [info] - HighlyCompressedMapStatus: estimated size should be the average non-empty block size (7 milliseconds) [info] - SPARK-22540: ensure HighlyCompressedMapStatus calculates correct avgSize (18 milliseconds) [info] - RoaringBitmap: runOptimize succeeded (12 milliseconds) [info] - RoaringBitmap: runOptimize failed (4 milliseconds) [info] - rint (757 milliseconds)
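MapStatusSuite's compressSize/decompressSize cases refer to the one-byte, log-scale encoding of shuffle block sizes. Roughly, as a sketch of the idea rather than the exact production code:

```scala
// Store ceil(log_{1.1}(size)) in a byte: sizes up to roughly 35 GB fit in 0..255
// with at most ~10% relative error, which is why non-empty blocks never round to 0.
def compressSize(size: Long): Byte =
  if (size == 0L) 0
  else if (size <= 1L) 1
  else math.min(255, math.ceil(math.log(size.toDouble) / math.log(1.1)).toInt).toByte

def decompressSize(compressed: Byte): Long =
  if (compressed == 0) 0L else math.pow(1.1, compressed & 0xFF).toLong
```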
[info] - Blocks which are bigger than SHUFFLE_ACCURATE_BLOCK_THRESHOLD should not be underestimated. (9 milliseconds) [info] - exp (925 milliseconds) [info] ReassignLambdaVariableIDSuite: [info] - expm1 (777 milliseconds) [info] - basic: replace positive IDs with unique negative IDs (797 milliseconds) [info] - ignore LambdaVariable with negative IDs (10 milliseconds) [info] - fail if positive ID LambdaVariable and negative LambdaVariable both exist (7 milliseconds) [info] AnalysisHelperSuite: [info] - setAnalyzed is recursive (5 milliseconds) [info] - resolveOperator runs on operators recursively (9 milliseconds) [info] - resolveOperatorsDown runs on operators recursively (1 millisecond) [info] - resolveExpressions runs on operators recursively (6 milliseconds) [info] - resolveOperator skips already resolved plans (1 millisecond) [info] - resolveOperatorsDown skips already resolved plans (1 millisecond) [info] - resolveExpressions skips already resolved plans (0 milliseconds) [info] - resolveOperator skips partially resolved plans (1 millisecond) [info] - resolveOperatorsDown skips partially resolved plans (1 millisecond) [info] - resolveExpressions skips partially resolved plans (1 millisecond) [info] - do not allow transform in analyzer (83 milliseconds) [info] - allow transform in resolveOperators in the analyzer (3 milliseconds) [info] - allow transform with allowInvokingTransformsInAnalyzer in the analyzer (3 milliseconds) [info] - signum (676 milliseconds) [info] ProductAggSuite: [info] - empty buffer (288 milliseconds) [info] - run Spark in yarn-cluster mode unsuccessfully (16 seconds, 39 milliseconds) [info] - update (65 milliseconds) [info] - update - with nulls (1 millisecond) [info] - update - with specials (1 millisecond) [info] - merge (11 milliseconds) [info] - merge - with nulls (2 milliseconds) [info] - merge - with specials (2 milliseconds) [info] - eval (15 milliseconds) [info] AggregateEstimationSuite: [info] - SPARK-26894: propagate child stats for aliases in Aggregate (26 milliseconds) [info] - set an upper bound if the product of ndv's of group-by columns is too large (3 milliseconds) [info] - data contains all combinations of distinct values of group-by columns.
(1 millisecond) [info] - empty group-by column (1 millisecond) [info] - aggregate on empty table - with or without group-by column (1 millisecond) [info] - group-by column with only null value (1 millisecond) [info] - group-by column with null value (1 millisecond) [info] - non-cbo estimation (10 milliseconds) [info] LiteralExpressionSuite: [info] - log (563 milliseconds) [info] - log10 (605 milliseconds) [info] - log1p (845 milliseconds) [info] - null (1 second, 628 milliseconds) [info] - SPARK-21133 HighlyCompressedMapStatus#writeExternal throws NPE (4 seconds, 880 milliseconds) [info] BlockInfoManagerSuite: [info] - initial memory usage (0 milliseconds) [info] - get non-existent block (0 milliseconds) [info] - basic lockNewBlockForWriting (3 milliseconds) [info] - lockNewBlockForWriting blocks while write lock is held, then returns false after release (303 milliseconds) [info] - lockNewBlockForWriting blocks while write lock is held, then returns true after removal (306 milliseconds) [info] - read locks are reentrant (2 milliseconds) [info] - multiple tasks can hold read locks (4 milliseconds) [info] - single task can hold write lock (3 milliseconds) [info] - cannot grab a writer lock while already holding a write lock (2 milliseconds) [info] - assertBlockIsLockedForWriting throws exception if block is not locked (2 milliseconds) [info] - downgrade lock (2 milliseconds) [info] - write lock will block readers (304 milliseconds) [info] - bin (1 second, 496 milliseconds) [info] - read locks will block writer (303 milliseconds) [info] - removing a non-existent block throws SparkException (1 millisecond) [info] - removing a block without holding any locks throws IllegalStateException (1 millisecond) [info] - removing a block while holding only a read lock throws IllegalStateException (1 millisecond) [info] - default (1 second, 862 milliseconds) [info] - removing a block causes blocked callers to receive None (302 milliseconds) [info] - releaseAllLocksForTask releases write locks (2 milliseconds) [info] StoragePageSuite: [info] - rddTable (9 milliseconds) [info] - empty rddTable (0 milliseconds) [info] - streamBlockStorageLevelDescriptionAndSize (0 milliseconds) [info] - receiverBlockTables (10 milliseconds) [info] - empty receiverBlockTables (0 milliseconds) [info] TaskSchedulerImplSuite: [info] - SPARK-32653: Decommissioned host/executor should be considered as inactive (92 milliseconds) [info] - log2 (757 milliseconds) [info] - sqrt (697 milliseconds) [info] - Scheduler does not always schedule tasks on the same workers (890 milliseconds) [info] - Scheduler correctly accounts for multiple CPUs per task (70 milliseconds) [info] - SPARK-18886 - partial offers (isAllFreeResources = false) reset timer before any resources have been rejected (52 milliseconds) [info] - SPARK-18886 - delay scheduling timer is reset when it accepts all resources offered when isAllFreeResources = true (46 milliseconds) [info] - SPARK-18886 - task set with no locality requirements should not starve one with them (54 milliseconds) [info] - SPARK-18886 - partial resource offers (isAllFreeResources = false) reset time if last full resource offer (isAllResources = true) was accepted as well as any following partial resource offers (49 milliseconds) [info] - SPARK-18886 - partial resource offers (isAllFreeResources = false) do not reset time if any offer was rejected since last full offer was fully accepted (36 milliseconds) [info] - Scheduler does not crash when tasks are not serializable (39 milliseconds)
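BlockInfoManagerSuite, interleaved above, pins down per-block lock semantics: shared reentrant read locks, a single writer, and write-to-read downgrade. The manager itself is internal to Spark, so here is the same contract sketched against a plain JDK lock:

```scala
import java.util.concurrent.locks.ReentrantReadWriteLock

val lock = new ReentrantReadWriteLock()

def writeThenDowngrade(update: () => Unit, read: () => Unit): Unit = {
  lock.writeLock().lock()
  try {
    update()               // exclusive section: "write lock will block readers"
    lock.readLock().lock() // take the read lock while still holding the write lock...
  } finally {
    lock.writeLock().unlock() // ...then drop the write lock: "downgrade lock"
  }
  try read() finally lock.readLock().unlock()
}
```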
[info] - pow (655 milliseconds) [info] - concurrent attempts for the same stage only have one active taskset (36 milliseconds) [info] - don't schedule more tasks after a taskset is zombie (51 milliseconds) [info] - boolean literals (1 second, 778 milliseconds) [info] - if a zombie attempt finishes, continue scheduling tasks for non-zombie attempts (50 milliseconds) [info] - tasks are not re-scheduled while executor loss reason is pending (92 milliseconds) [info] - scheduled tasks obey task and stage excludelist (152 milliseconds) [info] - scheduled tasks obey node and executor excludelists (51 milliseconds) [info] - abort stage when all executors are excluded and we cannot acquire new executor (46 milliseconds) [info] - SPARK-22148 abort timer should kick in when task is completely excluded & no new executor can be acquired (62 milliseconds) [info] - shift left (641 milliseconds) [info] - SPARK-22148 try to acquire a new executor when task is unschedulable with 1 executor (50 milliseconds) [info] - SPARK-22148 abort timer should clear unschedulableTaskSetToExpiryTime for all TaskSets (78 milliseconds) [info] - SPARK-22148 Ensure we don't abort the taskSet if we haven't been completely excluded (57 milliseconds) [info] - SPARK-31418 abort timer should kick in when task is completely excluded & allocation manager could not acquire a new executor before the timeout (57 milliseconds) [info] - int literals (871 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 0 (98 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 1 (110 milliseconds) [info] - shift right (566 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 2 (94 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 3 (88 milliseconds) [info] - double literals (417 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 4 (98 milliseconds) [info] - string literals (153 milliseconds) [info] - shift right unsigned (378 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 5 (104 milliseconds) [info] - sum two literals (65 milliseconds) [info] - binary literals (49 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 6 (91 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 7 (102 milliseconds) [info] - hex (361 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 8 (151 milliseconds) [info] - unhex (130 milliseconds) [info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 9 (116 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 0 (98 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 1 (90 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 2 (113 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 3 (136 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 4 (118 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 5 (121 milliseconds) [info] - Excluded
executor for entire task set prevents per-task exclusion checks: iteration 6 (121 milliseconds) [info] - decimal (1 second, 302 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 7 (143 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 8 (115 milliseconds) [info] - hypot (1 second, 314 milliseconds) [info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 9 (160 milliseconds) [info] - abort stage if executor loss results in unschedulability from previously failed tasks (143 milliseconds) [info] - array (620 milliseconds) [info] - don't abort if there is an executor available, though it hasn't had scheduled tasks yet (124 milliseconds) [info] - seq (287 milliseconds) [info] - SPARK-16106 locality levels updated if executor added to existing host (167 milliseconds) [info] - scheduler checks for executors that can be expired from excludeOnFailure (102 milliseconds) [info] - map (307 milliseconds) [info] - if an executor is lost then the state for its running tasks is cleaned up (SPARK-18553) (57 milliseconds) [info] - if a task finishes with TaskState.LOST its executor is marked as dead (54 milliseconds) [info] - Locality should be used for bulk offers even with delay scheduling off (52 milliseconds) [info] - With delay scheduling off, tasks can be run at any locality level immediately (51 milliseconds) [info] - struct (248 milliseconds) [info] - TaskScheduler should throw IllegalArgumentException when schedulingMode is not supported (40 milliseconds) [info] - unsupported types (map and struct) in Literal.apply (64 milliseconds) [info] - don't schedule for a barrier taskSet if available slots are less than pending tasks (54 milliseconds) [info] - SPARK-24571: char literals (40 milliseconds) [info] - SPARK-33390: Make Literal support char array (29 milliseconds) [info] - don't schedule for a barrier taskSet if available slots are less than pending tasks gpus limiting (89 milliseconds) [info] - schedule tasks for a barrier taskSet if all tasks can be launched together gpus (71 milliseconds) [info] - construct literals from java.time.LocalDate (169 milliseconds) [info] - construct literals from arrays of java.time.LocalDate (21 milliseconds) [info] - schedule tasks for a barrier taskSet if all tasks can be launched together diff ResourceProfile (68 milliseconds) [info] - atan2 (1 second, 485 milliseconds) [info] - schedule tasks for a barrier taskSet if all tasks can be launched together diff ResourceProfile, but not enough gpus (57 milliseconds) [info] - construct literals from java.time.Instant (154 milliseconds) [info] - construct literals from arrays of java.time.Instant (18 milliseconds) [info] - schedule tasks for a barrier taskSet if all tasks can be launched together (52 milliseconds) [info] - format timestamp literal using spark.sql.session.timeZone (20 milliseconds) [info] - format date literal independently from time zone (22 milliseconds) [info] - SPARK-33860: Make CatalystTypeConverters.convertToCatalyst match special Array value (1 millisecond) [info] - SPARK-34342: Date/Timestamp toString (2 milliseconds) [info] - SPARK-29263: barrier TaskSet can't schedule when higher prio taskset takes the slots (41 milliseconds) [info] - SPARK-36055: TimestampNTZ toString (16 milliseconds) [info] - cancelTasks shall kill all the running tasks and fail the stage (45 milliseconds) [info] - SPARK-35664: construct literals from java.time.LocalDateTime 
(68 milliseconds) [info] - killAllTaskAttempts shall kill all the running tasks and not fail the stage (39 milliseconds) [info] - SPARK-34605: construct literals from java.time.Duration (74 milliseconds) [info] - SPARK-34605: construct literals from arrays of java.time.Duration (16 milliseconds) [info] - mark taskset for a barrier stage as zombie in case a task fails (43 milliseconds) [info] - SPARK-34615: construct literals from java.time.Period (72 milliseconds) [info] - Scheduler correctly accounts for GPUs per task (55 milliseconds) [info] - SPARK-34615: construct literals from arrays of java.time.Period (17 milliseconds) [info] - SPARK-35099: convert a literal of day-time interval to SQL string (3 milliseconds) [info] - SPARK-35099: convert a literal of year-month interval to SQL string (1 millisecond) [info] - Scheduler correctly accounts for GPUs per task with fractional amount (64 milliseconds) [info] - Scheduler works with multiple ResourceProfiles and gpus (50 milliseconds) [info] - SPARK-35871: Literal.create(value, dataType) should support fields (220 milliseconds) [info] MiscExpressionsSuite: [info] - scheduler should keep the decommission state where host was decommissioned (118 milliseconds) [info] - RaiseError (67 milliseconds) [info] - test full decommissioning flow (96 milliseconds) [info] - SPARK-24818: test delay scheduling for barrier TaskSetManager (40 milliseconds) [info] - SPARK-24818: test resource revert of barrier TaskSetManager (45 milliseconds) [info] SparkHadoopUtilSuite: [info] - appendSparkHadoopConfigs with propagation and defaults (13 milliseconds) [info] - appendSparkHadoopConfigs with S3A endpoint set to empty string (12 milliseconds) [info] - appendSparkHadoopConfigs with S3A options explicitly set (11 milliseconds) [info] - appendSparkHadoopConfigs with S3A endpoint region set to an empty string (9 milliseconds) [info] SparkConfSuite: [info] - Test byteString conversion (3 milliseconds) [info] - Test timeString conversion (1 millisecond) [info] - loading from system properties (1 millisecond) [info] - initializing without loading defaults (0 milliseconds) [info] - named set methods (2 milliseconds) [info] - basic get and set (1 millisecond) [info] - basic getAllWithPrefix (1 millisecond) [info] - creating SparkContext without master and app name (1 millisecond) [info] - creating SparkContext without master (1 millisecond) [info] - creating SparkContext without app name (2 milliseconds) [info] - creating SparkContext with both master and app name (48 milliseconds) [info] - SparkContext property overriding (37 milliseconds) [info] - nested property names (0 milliseconds) [info] - uuid (652 milliseconds) [info] - PrintToStderr (13 milliseconds) [info] ReplaceOperatorSuite: [info] - replace Intersect with Left-semi Join (234 milliseconds) [info] - replace Except with Filter while both the nodes are of type Filter (44 milliseconds) [info] - replace Except with Filter while only right node is of type Filter (15 milliseconds) [info] - replace Except with Filter while both the nodes are of type Project (27 milliseconds) [info] - replace Except with Filter while only right node is of type Project (23 milliseconds) [info] - replace Except with Filter while left node is Project and right node is Filter (24 milliseconds) [info] - replace Except with Left-anti Join (20 milliseconds) [info] - replace Except with Filter when only right filter can be applied to the left (35 milliseconds) [info] - replace Distinct with Aggregate (3 milliseconds) [info] - binary log (1 second, 826 milliseconds)
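As a minimal sketch of what the LiteralExpressionSuite entries above exercise: the java.time types map onto Catalyst literals through the public lit() helper (local-mode session; all names here are illustrative, not from this build):

```scala
import java.time.{Duration, LocalDate, Period}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit

object LiteralSketch extends App {
  val spark = SparkSession.builder().master("local[1]").appName("literal-sketch").getOrCreate()
  // lit() wraps each JVM value in a Catalyst Literal: LocalDate -> DATE,
  // Duration -> day-time interval, Period -> year-month interval.
  spark.range(1)
    .withColumn("d", lit(LocalDate.of(2021, 12, 6)))
    .withColumn("dt", lit(Duration.ofHours(36)))
    .withColumn("ym", lit(Period.ofMonths(14)))
    .printSchema()
  spark.stop()
}
```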
[info] - replace batch Deduplicate with Aggregate (185 milliseconds) [info] - add one grouping key if necessary when replacing Deduplicate with Aggregate (5 milliseconds) [info] - don't replace streaming Deduplicate (4 milliseconds) [info] - SPARK-26366: ReplaceExceptWithFilter should properly handle NULL (28 milliseconds) [info] - SPARK-26366: ReplaceExceptWithFilter should not transform non-deterministic (34 milliseconds) [info] AnsiCastSuiteWithAnsiModeOff: [info] - Thread safeness - SPARK-5425 (1 second, 3 milliseconds) [info] - register kryo classes through registerKryoClasses (4 milliseconds) [info] - register kryo classes through registerKryoClasses and custom registrator (1 millisecond) [info] - register kryo classes through conf (1 millisecond) [info] - deprecated configs (4 milliseconds) [info] - akka deprecated configs (1 millisecond) [info] - SPARK-13727 (0 milliseconds) [info] - SPARK-17240: SparkConf should be serializable (java) (11 milliseconds) [info] - SPARK-17240: SparkConf should be serializable (kryo) (4 milliseconds) [info] - encryption requires authentication (1 millisecond) [info] - spark.network.timeout should be bigger than spark.executor.heartbeatInterval (1 millisecond) [info] - SPARK-26998: SSL configuration not needed on executors (1 millisecond) [info] - SPARK-27244 toDebugString redacts sensitive information (1 millisecond) [info] - SPARK-28355: Use Spark conf for threshold at which UDFs are compressed by broadcast (0 milliseconds) [info] - SPARK-24337: getSizeAsKb with default throws a useful error message with key name (1 millisecond) [info] - SPARK-24337: getTimeAsMs throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getTimeAsSeconds throws a useful error message with key name (1 millisecond) [info] - SPARK-24337: getTimeAsSeconds with default throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getSizeAsBytes with default long throws a useful error message with key name (1 millisecond) [info] - SPARK-24337: getSizeAsMb throws a useful error message with key name (1 millisecond) [info] - SPARK-24337: getSizeAsGb throws a useful error message with key name (1 millisecond) [info] - SPARK-24337: getSizeAsBytes with default string throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getDouble throws a useful error message with key name (1 millisecond) [info] - SPARK-24337: getTimeAsMs with default throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getSizeAsBytes throws a useful error message with key name (1 millisecond) [info] - SPARK-24337: getSizeAsGb with default throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getInt throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getSizeAsMb with default throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getSizeAsKb throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getBoolean throws a useful error message with key name (0 milliseconds) [info] - SPARK-24337: getLong throws a useful error message with key name (0 milliseconds) [info] - get task resource requirement from config (2 milliseconds) [info] - test task resource requirement with 0 amount (0 milliseconds) [info] - Ensure that we can configure fractional resources for a task (3 milliseconds) [info] - Non-task resources are never fractional (1 millisecond)
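A quick sketch of the size/time accessors those SPARK-24337 entries check error messages for; the spark.demo.* keys are made up for illustration:

```scala
import org.apache.spark.SparkConf

object ConfSketch extends App {
  val conf = new SparkConf(loadDefaults = false)
  conf.set("spark.demo.timeout", "2min") // hypothetical key
  conf.set("spark.demo.buffer", "4m")    // hypothetical key
  assert(conf.getTimeAsMs("spark.demo.timeout") == 120000L) // "2min" parsed to milliseconds
  assert(conf.getSizeAsKb("spark.demo.buffer") == 4096L)    // "4m" parsed to KiB
  // A malformed value fails with an error message that names the offending key:
  conf.set("spark.demo.buffer", "oops")
  try conf.getSizeAsKb("spark.demo.buffer")
  catch { case e: NumberFormatException => println(e.getMessage) }
}
```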
[info] ShuffleBlockFetcherIteratorSuite: [info] - SPARK-36206: diagnose the block when it's corrupted twice (31 milliseconds) [info] - SPARK-36206: diagnose the block when it's corrupted inside BufferReleasingInputStream (5 milliseconds) [info] - successful 3 local + 4 host local + 2 remote reads (36 milliseconds) [info] - error during accessing host local dirs for executors (4 milliseconds) [info] - Hit maxBytesInFlight limitation before maxBlocksInFlightPerAddress (4 milliseconds) [info] - Hit maxBlocksInFlightPerAddress limitation before maxBytesInFlight (6 milliseconds) [info] - fetch continuous blocks in batch successful 3 local + 4 host local + 2 remote reads (10 milliseconds) [info] - fetch continuous blocks in batch should respect maxBytesInFlight (8 milliseconds) [info] - SPARK-35910: Update remoteBlockBytes based on merged fetch request (3 milliseconds) [info] - fetch continuous blocks in batch should respect maxBlocksInFlightPerAddress (7 milliseconds) [info] - release current unexhausted buffer in case the task completes early (5 milliseconds) [info] - fail all blocks if any of the remote requests fails (4 milliseconds) [info] - retry corrupt blocks (10 milliseconds) [info] - big blocks are also checked for corruption (7 milliseconds) [info] - ensure big blocks available as a concatenated stream can be read (74 milliseconds) [info] - retry corrupt blocks (disabled) (6 milliseconds) [info] - Blocks should be shuffled to disk when size of the request is above the threshold (maxReqSizeShuffleToMem). (37 milliseconds) [info] - fail zero-size blocks (3 milliseconds) [info] - SPARK-31521: correct the fetch size when merging blocks into a merged block (1 millisecond) [info] - SPARK-27991: defer shuffle fetch request (one block) on Netty OOM (8 milliseconds) [info] - SPARK-27991: defer shuffle fetch request (multiple blocks) on Netty OOM, oomBlockIndex=0 (6 milliseconds) [info] - SPARK-27991: defer shuffle fetch request (multiple blocks) on Netty OOM, oomBlockIndex=1 (6 milliseconds) [info] - SPARK-27991: defer shuffle fetch request (multiple blocks) on Netty OOM, oomBlockIndex=2 (5 milliseconds) [info] - SPARK-27991: block shouldn't retry endlessly on Netty OOM (15 milliseconds) [info] - SPARK-32922: fetch remote push-merged block meta (22 milliseconds) [info] - SPARK-32922: failed to fetch remote push-merged block meta so fall back to original blocks. (11 milliseconds)
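For context, these fetch-limit tests revolve around a handful of reducer-side knobs; a REPL-style sketch (the keys are real Spark configs, the values only illustrative):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  // Upper bound on the total size of in-flight remote fetch requests per reducer:
  .set("spark.reducer.maxSizeInFlight", "48m")
  // Upper bound on how many blocks are fetched from one address at a time:
  .set("spark.reducer.maxBlocksInFlightPerAddress", "100")
  // Requests above this size are streamed to disk rather than held in memory
  // (the maxReqSizeShuffleToMem behaviour tested above):
  .set("spark.network.maxRemoteBlockSizeFetchToMem", "200m")
```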
[info] - SPARK-32922: iterator has just 1 push-merged block and fails to fetch the meta (6 milliseconds) [info] - SPARK-32922: failure to fetch push-merged-local meta should fall back to fetching original shuffle blocks (14 milliseconds) [info] - SPARK-32922: failure to read chunkBitmaps of push-merged-local meta should fall back to original shuffle blocks (21 milliseconds) [info] - SPARK-32922: failure to fetch push-merged-local data should fall back to fetching original shuffle blocks (6 milliseconds) [info] - SPARK-32922: failure to fetch push-merged-local meta of a single merged block should not drop the fetch of other push-merged-local blocks (9 milliseconds) [info] - SPARK-32922: failure to fetch push-merged block as well as fallback block should throw a FetchFailedException (7 milliseconds) [info] - SPARK-32922: failure to fetch push-merged-local block should fall back to fetching original shuffle blocks which contain host-local blocks (9 milliseconds) [info] - SPARK-32922: fetch host local blocks with push-merged block during initialization and fall back to host-local blocks (9 milliseconds) [info] - SPARK-32922: failure while reading local shuffle chunks should fall back to original shuffle blocks (7 milliseconds) [info] - SPARK-32922: fall back to original shuffle block when a push-merged shuffle chunk is corrupt (14 milliseconds) [info] - SPARK-32922: fall back to original blocks when failing to fetch a remote shuffle chunk (10 milliseconds) [info] - SPARK-32922: fall back to original blocks when failing to parse remote merged block meta (5 milliseconds) [info] - SPARK-32922: failure to fetch a remote shuffle chunk initiates the fallback of pending shuffle chunks immediately (11 milliseconds) [info] - SPARK-32922: failure to fetch a remote shuffle chunk initiates the fallback of pending shuffle chunks immediately which got deferred (10 milliseconds) [info] ConfigEntrySuite: [info] - conf entry: int (0 milliseconds) [info] - conf entry: long (0 milliseconds) [info] - conf entry: double (0 milliseconds) [info] - conf entry: boolean (1 millisecond) [info] - conf entry: optional (0 milliseconds) [info] - conf entry: fallback (1 millisecond) [info] - conf entry: time (0 milliseconds) [info] - conf entry: bytes (1 millisecond) [info] - conf entry: regex (1 millisecond) [info] - conf entry: string seq (1 millisecond) [info] - conf entry: int seq (0 milliseconds) [info] - conf entry: transformation (0 milliseconds) [info] - conf entry: checkValue() (2 milliseconds) [info] - conf entry: valid values check (1 millisecond) [info] - conf entry: conversion error (1 millisecond) [info] - default value handling is null-safe (1 millisecond) [info] - variable expansion of spark config entries (1 millisecond) [info] - conf entry: default function (1 millisecond) [info] - conf entry: alternative keys (1 millisecond) [info] - conf entry: prepend with default separator (0 milliseconds) [info] - conf entry: prepend with custom separator (1 millisecond) [info] - conf entry: prepend with fallback (1 millisecond) [info] - conf entry: prepend should work only with string type (3 milliseconds) [info] - onCreate (2 milliseconds) [info] WorkerSuite: [info] - test isUseLocalNodeSSLConfig (2 milliseconds) [info] - test maybeUpdateSSLSettings (20 milliseconds) [info] - test clearing of finishedExecutors (small number of executors) (50 milliseconds) [info] - test clearing of finishedExecutors (more executors) (45 milliseconds) [info] - test clearing of finishedDrivers (small number of drivers) (80 milliseconds)
[info] - test clearing of finishedDrivers (more drivers) (535 milliseconds) [info] - worker could be launched without any resources (42 milliseconds) [info] - worker could load resources from resources file while launching (57 milliseconds) [info] - worker could load resources from discovery script while launching (50 milliseconds) [info] - worker could load resources from resources file and discovery script while launching (82 milliseconds) [info] - cleanup non-shuffle files after executor exits when config spark.storage.cleanupFilesAfterExecutorExit=true (24 milliseconds) [info] - don't cleanup non-shuffle files after executor exits when config spark.storage.cleanupFilesAfterExecutorExit=false (25 milliseconds) [info] - WorkDirCleanup cleans app dirs and shuffle metadata when spark.shuffle.service.db.enabled=true (40 milliseconds) [info] - WorkDirCleanup cleans only app dirs when spark.shuffle.service.db.enabled=false (29 milliseconds) [info] BlockManagerSuite: [info] - SPARK-36036: make sure temporary download files are deleted (1 second, 679 milliseconds) [info] - null cast (4 seconds, 278 milliseconds) [info] - round/bround (4 seconds, 676 milliseconds) [info] - cast string to date (117 milliseconds) [info] - SPARK-36922: Support ANSI intervals for SIGN/SIGNUM (422 milliseconds) [info] - SPARK-35926: Support YearMonthIntervalType in width-bucket function (353 milliseconds) [info] - SPARK-35925: Support DayTimeIntervalType in width-bucket function (382 milliseconds) [info] - SPARK-37388: width_bucket (926 milliseconds) [info] DateExpressionsSuite: [info] - datetime function current_date (69 milliseconds) [info] - datetime function current_timestamp (752 milliseconds) [info] - datetime function localtimestamp (4 milliseconds) [info] - cast string to timestamp (6 seconds, 741 milliseconds) [info] - cast from boolean (114 milliseconds) [info] - cast from int (238 milliseconds) [info] - cast from long (202 milliseconds) [info] - cast from float (198 milliseconds) [info] - cast from double (124 milliseconds) [info] - cast from string (1 millisecond) [info] - DayOfYear (3 seconds, 994 milliseconds) [info] - Year (2 seconds, 197 milliseconds) [info] - SPARK-32091: count failures from active executors when removing rdd/broadcast/shuffle (15 seconds, 27 milliseconds) [info] - Quarter (6 seconds, 92 milliseconds) [info] - run Spark in yarn-cluster mode failure after sc initialized (36 seconds, 41 milliseconds) [info] - Month (2 seconds, 240 milliseconds) [info] - Day / DayOfMonth (577 milliseconds) [info] - Seconds (5 seconds, 437 milliseconds) [info] - DayOfWeek (217 milliseconds) [info] - WeekDay (270 milliseconds) [info] - WeekOfYear (192 milliseconds) [info] - DateFormat (637 milliseconds) [info] - SPARK-32091: ignore failures from lost executors when removing rdd/broadcast/shuffle (15 seconds, 18 milliseconds) [info] - Hour (2 seconds, 875 milliseconds) [info] - StorageLevel object caching (1 millisecond) [info] - BlockManagerId object caching (1 millisecond) [info] - BlockManagerId.isDriver() with DRIVER_IDENTIFIER (SPARK-27090) (0 milliseconds) [info] - master + 1 manager interaction (22 milliseconds) [info] - master + 2 managers interaction (41 milliseconds) [info] - removing block (30 milliseconds) [info] - removing rdd (24 milliseconds) [info] - removing broadcast (121 milliseconds) [info] - reregistration on heartbeat (17 milliseconds) [info] - reregistration on block update (33 milliseconds) [info] - reregistration doesn't deadlock (474 milliseconds)
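The DateExpressionsSuite entries interleaved above (DayOfYear, Year, Quarter, Month, Hour, ...) correspond to the public field-extraction functions; a spark-shell sketch with an invented input value:

```scala
import org.apache.spark.sql.functions._

val df = Seq("2021-12-06 10:30:00").toDF("s").select(to_timestamp($"s").as("ts"))
df.select(year($"ts"), quarter($"ts"), month($"ts"), dayofyear($"ts"),
  dayofweek($"ts"), weekofyear($"ts"), hour($"ts"), minute($"ts"), second($"ts")).show()
```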
[info] - correct BlockResult returned from get() calls (18 milliseconds) [info] - optimize a location order of blocks without topology information (14 milliseconds) [info] - optimize a location order of blocks with topology information (14 milliseconds) [info] - SPARK-9591: getRemoteBytes from another location when an Exception is thrown (70 milliseconds) [info] - SPARK-27622: avoid the network when block requested from same host, StorageLevel(disk, 1 replicas) (71 milliseconds) [info] - SPARK-27622: avoid the network when block requested from same host, StorageLevel(disk, deserialized, 1 replicas) (43 milliseconds) [info] - SPARK-27622: avoid the network when block requested from same host, StorageLevel(disk, deserialized, 2 replicas) (52 milliseconds) [info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, 1 replicas), getRemoteValue() (34 milliseconds) [info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, 1 replicas), getRemoteBytes() (29 milliseconds) [info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, deserialized, 1 replicas), getRemoteValue() (27 milliseconds) [info] - SPARK-27622: as file is removed fall back to network fetch, StorageLevel(disk, deserialized, 1 replicas), getRemoteBytes() (28 milliseconds) [info] - SPARK-14252: getOrElseUpdate should still read from remote storage (34 milliseconds) [info] - in-memory LRU storage (21 milliseconds) [info] - in-memory LRU storage with serialization (20 milliseconds) [info] - in-memory LRU storage with off-heap (27 milliseconds) [info] - in-memory LRU for partitions of same RDD (17 milliseconds) [info] - in-memory LRU for partitions of multiple RDDs (19 milliseconds) [info] - on-disk storage (encryption = off) (17 milliseconds) [info] - on-disk storage (encryption = on) (32 milliseconds) [info] - disk and memory storage (encryption = off) (18 milliseconds) [info] - disk and memory storage (encryption = on) (18 milliseconds) [info] - disk and memory storage with getLocalBytes (encryption = off) (18 milliseconds) [info] - disk and memory storage with getLocalBytes (encryption = on) (18 milliseconds) [info] - disk and memory storage with serialization (encryption = off) (20 milliseconds) [info] - disk and memory storage with serialization (encryption = on) (18 milliseconds) [info] - disk and memory storage with serialization and getLocalBytes (encryption = off) (18 milliseconds) [info] - disk and memory storage with serialization and getLocalBytes (encryption = on) (17 milliseconds) [info] - disk and off-heap memory storage (encryption = off) (40 milliseconds) [info] - disk and off-heap memory storage (encryption = on) (19 milliseconds) [info] - disk and off-heap memory storage with getLocalBytes (encryption = off) (21 milliseconds) [info] - disk and off-heap memory storage with getLocalBytes (encryption = on) (17 milliseconds) [info] - LRU with mixed storage levels (encryption = off) (21 milliseconds) [info] - LRU with mixed storage levels (encryption = on) (22 milliseconds) [info] - in-memory LRU with streams (encryption = off) (17 milliseconds) [info] - in-memory LRU with streams (encryption = on) (17 milliseconds) [info] - LRU with mixed storage levels and streams (encryption = off) (24 milliseconds) [info] - LRU with mixed storage levels and streams (encryption = on) (26 milliseconds) [info] - negative byte values in ByteBufferInputStream (2 milliseconds) [info] - overly large block (19 milliseconds) [info] - block compression (169 milliseconds)
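The storage-level matrix being walked through here (memory, disk, serialized, off-heap, with and without encryption) maps to StorageLevel constants in user code; a spark-shell sketch:

```scala
import org.apache.spark.storage.StorageLevel

val rdd = sc.parallelize(1 to 1000)
// Serialized in memory, spilling to disk when the LRU store runs out of room:
rdd.persist(StorageLevel.MEMORY_AND_DISK_SER)
rdd.count()
rdd.unpersist()
```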
[info] - block store put failure (8 milliseconds) [info] - test putBlockDataAsStream with caching (encryption = off) (26 milliseconds) [info] - test putBlockDataAsStream with caching (encryption = on) (29 milliseconds) [info] - test putBlockDataAsStream with caching, serialized (encryption = off) (24 milliseconds) [info] - test putBlockDataAsStream with caching, serialized (encryption = on) (25 milliseconds) [info] - test putBlockDataAsStream with caching on disk (encryption = off) (29 milliseconds) [info] - test putBlockDataAsStream with caching on disk (encryption = on) (27 milliseconds) [info] - turn off updated block statuses (16 milliseconds) [info] - updated block statuses (23 milliseconds) [info] - query block statuses (22 milliseconds) [info] - get matching blocks (33 milliseconds) [info] - SPARK-1194 regression: fix the same-RDD rule for cache replacement (19 milliseconds) [info] - safely unroll blocks through putIterator (disk) (22 milliseconds) [info] - read-locked blocks cannot be evicted from memory (22 milliseconds) [info] - remove block if a read fails due to missing DiskStore files (SPARK-15736) (90 milliseconds) [info] - SPARK-13328: refresh block locations (fetch should fail after hitting a threshold) (13 milliseconds) [info] - SPARK-13328: refresh block locations (fetch should succeed after location refresh) (16 milliseconds) [info] - SPARK-17484: block status is properly updated following an exception in put() (26 milliseconds) [info] - SPARK-17484: master block locations are updated following an invalid remote block fetch (43 milliseconds) [info] - SPARK-25888: serving of removed file not detected by shuffle service (29 milliseconds) [info] - test sorting of block locations (13 milliseconds) [info] - Minute (5 seconds, 555 milliseconds) [info] - date_add (348 milliseconds) [info] - date add interval (242 milliseconds) [info] - date_sub (367 milliseconds) [info] - time_add (2 seconds, 599 milliseconds) [info] - SPARK-20640: Shuffle registration timeout and maxAttempts conf are working (5 seconds, 109 milliseconds) [info] - fetch remote block to local disk if block size is larger than threshold (13 milliseconds) [info] - query locations of blockIds (3 milliseconds) [info] - SPARK-30594: Do not post SparkListenerBlockUpdated when updateBlockInfo returns false (1 millisecond) [info] - we reject putting blocks when we have the wrong shuffle resolver (27 milliseconds) [info] - test decommission block manager should not be part of peers (48 milliseconds) [info] - run Python application in yarn-client mode (22 seconds, 34 milliseconds) [info] - test decommissionRddCacheBlocks should migrate all cached blocks (55 milliseconds) [info] - test decommissionRddCacheBlocks should keep the block if it is not able to migrate (44 milliseconds) [info] - test migration of shuffle blocks during decommissioning - no limit (53 milliseconds) [info] - test migration of shuffle blocks during decommissioning - larger limit (55 milliseconds) [info] - time_sub (1 second, 998 milliseconds) [info] - add_months (281 milliseconds) [info] - SPARK-34721: add a year-month interval to a date (133 milliseconds) [info] - data type casting (34 seconds, 966 milliseconds) [info] - [SPARK-34363] test migration of shuffle blocks during decommissioning - small limit (1 second, 29 milliseconds) [info] - cast and add (166 milliseconds) [info] - SPARK-32919: Shuffle push merger locations should be bounded within spark.shuffle.push.retainedMergerLocations (68 milliseconds)
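A spark-shell sketch of the date-arithmetic functions appearing above (date_add, date_sub, add_months, months_between), with an invented input date:

```scala
import org.apache.spark.sql.functions._

val df = Seq("2021-12-06").toDF("s").select(to_date($"s").as("d"))
df.select(date_add($"d", 7), date_sub($"d", 7), add_months($"d", 1),
  months_between(current_date(), $"d")).show()
```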
[info] - SPARK-32919: Prefer active executor locations for shuffle push mergers (85 milliseconds) [info] - SPARK-33387 Support ordered shuffle block migration (2 milliseconds) [info] - from decimal (263 milliseconds) [info] - SPARK-34193: Potential race condition during decommissioning with TorrentBroadcast (15 milliseconds) [info] PythonRunnerSuite: [info] - format path (4 milliseconds) [info] - format paths (3 milliseconds) [info] SortShuffleWriterSuite: [info] - cast from array (92 milliseconds) [info] - cast from map (53 milliseconds) [info] - write empty iterator (2 milliseconds) [info] - write with some records (3 milliseconds) [info] - write checksum file (spill=true, aggregator=false, order=false) (144 milliseconds) [info] - cast from struct (181 milliseconds) [info] - cast struct with a timestamp field (65 milliseconds) [info] - complex casting (2 milliseconds) [info] - write checksum file (spill=true, aggregator=true, order=false) (101 milliseconds) [info] - write checksum file (spill=true, aggregator=false, order=true) (85 milliseconds) [info] - cast between string and interval (132 milliseconds) [info] - write checksum file (spill=true, aggregator=true, order=true) (80 milliseconds) [info] - cast string to boolean (113 milliseconds) [info] - SPARK-16729 type checking for casting to date type (1 millisecond) [info] - write checksum file (spill=false, aggregator=false, order=false) (63 milliseconds) [info] - SPARK-20302 cast with same structure (44 milliseconds) [info] - write checksum file (spill=false, aggregator=true, order=false) (64 milliseconds) [info] - write checksum file (spill=false, aggregator=false, order=true) (67 milliseconds) [info] - write checksum file (spill=false, aggregator=true, order=true) (73 milliseconds) [info] CryptoStreamUtilsSuite: [info] - crypto configuration conversion (1 millisecond) [info] - shuffle encryption key length should be 128 by default (1 millisecond) [info] - create 256-bit key (0 milliseconds) [info] - create key with invalid length (1 millisecond) [info] - serializer manager integration (3 milliseconds) [info] - SPARK-22500: cast for struct should not generate code beyond 64KB (1 second, 59 milliseconds) [info] - SPARK-22570: Cast should not create a lot of global variables (2 milliseconds) [info] - up-cast (28 milliseconds) [info] - SPARK-27671: cast from nested null type in struct (569 milliseconds) [info] - Process Infinity, -Infinity, NaN in a case-insensitive manner (109 milliseconds) [info] - months_between (3 seconds, 447 milliseconds) [info] - SPARK-22825 Cast array to string (267 milliseconds) [info] - SPARK-33291: Cast array with null elements to string (57 milliseconds) [info] - last_day (468 milliseconds) [info] - SPARK-22973 Cast map to string (358 milliseconds) [info] - SPARK-22981 Cast struct to string (432 milliseconds) [info] - next_day (477 milliseconds) [info] - SPARK-33291: Cast struct with null elements to string (60 milliseconds) [info] - TruncDate (254 milliseconds) [info] - SPARK-34667: cast year-month interval to string (949 milliseconds) [info] - TruncTimestamp (778 milliseconds) [info] - encryption key propagation to executors (4 seconds, 88 milliseconds) [info] - crypto stream wrappers (9 milliseconds) [info] - error handling wrapper (7 milliseconds) [info] StatsdSinkSuite: [info] - metrics StatsD sink with Counter (12 milliseconds) [info] - metrics StatsD sink with Gauge (1 millisecond) [info] - metrics StatsD sink with Histogram (9 milliseconds) [info] - metrics StatsD sink with Timer (5 milliseconds)
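The CryptoStreamUtilsSuite entries above concern Spark's I/O encryption; enabling it from user configuration looks roughly like this (REPL-style fragment, keys are real configs, values illustrative):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.authenticate", "true")              // encryption requires authentication
  .set("spark.io.encryption.enabled", "true")     // encrypt shuffle and spill files
  .set("spark.io.encryption.keySizeBits", "256")  // 128 is the default key length
```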
[info] FileCommitProtocolInstantiationSuite: [info] - Dynamic partitions require appropriate constructor (1 millisecond) [info] - Standard partitions work with classic constructor (1 millisecond) [info] - Three arg constructors have priority (0 milliseconds) [info] - Three arg constructors have priority when dynamic (0 milliseconds) [info] - The protocol must be of the correct class (1 millisecond) [info] - If there is no matching constructor, class hierarchy is irrelevant (1 millisecond) [info] CompletionIteratorSuite: [info] - basic test (1 millisecond) [info] - unsupported fmt fields for trunc/date_trunc result in null (478 milliseconds) [info] - reference to sub iterator should not be available after completion (620 milliseconds) [info] LauncherBackendSuite: [info] - from_unixtime (1 second, 296 milliseconds) [info] - SPARK-34668: cast day-time interval to string (2 seconds, 550 milliseconds) [info] - unix_timestamp (851 milliseconds) [info] - SPARK-35698: cast timestamp without time zone to string (148 milliseconds) [info] - SPARK-35711: cast timestamp without time zone to timestamp with local time zone (456 milliseconds) [info] - SPARK-35716: cast timestamp without time zone to date type (179 milliseconds) [info] - SPARK-35718: cast date type to timestamp without timezone (164 milliseconds) [info] - to_unix_timestamp (981 milliseconds) [info] - datediff (100 milliseconds) [info] - to_utc_timestamp (333 milliseconds) [info] - to_utc_timestamp - invalid time zone id (41 milliseconds) [info] - from_utc_timestamp (296 milliseconds) [info] - from_utc_timestamp - invalid time zone id (40 milliseconds) [info] - SPARK-35719: cast timestamp with local time zone to timestamp without timezone (1 second, 46 milliseconds) [info] - disallow type conversions between Numeric types and Timestamp without time zone type (5 milliseconds) [info] - local: launcher handle (3 seconds, 644 milliseconds) [info] - creating values of DateType via make_date (441 milliseconds) [info] - SPARK-35720: cast string to timestamp without timezone (386 milliseconds) [info] - SPARK-35112: Cast string to day-time interval (456 milliseconds) [info] - SPARK-35111: Cast string to year-month interval (413 milliseconds) [info] - SPARK-35820: Support cast DayTimeIntervalType in different fields (383 milliseconds) [info] - SPARK-35819: Support cast YearMonthIntervalType in different fields (102 milliseconds) [info] - SPARK-35768: Take into account year-month interval fields in cast (252 milliseconds) [info] - SPARK-35735: Take into account day-time interval fields in cast (928 milliseconds) [info] - ANSI mode: Throw exception on casting out-of-range value to byte type (668 milliseconds) [info] - creating values of Timestamp/TimestampNTZ via make_timestamp (3 seconds, 285 milliseconds) [info] - ISO 8601 week-numbering year (68 milliseconds) [info] - ANSI mode: Throw exception on casting out-of-range value to short type (681 milliseconds) [info] - extract the seconds part with fraction from timestamps (574 milliseconds) [info] - standalone/client: launcher handle (4 seconds, 381 milliseconds) [info] LogPageSuite: [info] - get logs simple (123 milliseconds) [info] UnifiedMemoryManagerSuite: [info] - single task requesting on-heap execution memory (1 millisecond) [info] - two tasks requesting full on-heap execution memory (2 milliseconds) [info] - two tasks cannot grow past 1 / N of on-heap execution memory (2 milliseconds) [info] - ANSI mode: Throw exception on casting out-of-range value to int type (604 milliseconds) [info] - tasks can block to get at least 1 / 2N of on-heap execution memory (303 milliseconds)
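A spark-shell sketch of the ANSI cast behaviour that the "Throw exception on casting out-of-range value" and "invalid string" entries assert: with spark.sql.ansi.enabled set, a bad cast raises at runtime instead of yielding NULL:

```scala
spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT CAST('abc' AS INT)").show()        // non-ANSI: NULL
spark.conf.set("spark.sql.ansi.enabled", "true")
try spark.sql("SELECT CAST('abc' AS INT)").collect() // ANSI: runtime error
catch { case e: Exception => println(e.getMessage) }
```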
[info] - SPARK-35486: memory freed by self-spilling is taken by another task (304 milliseconds) [info] - ANSI mode: Throw exception on casting out-of-range value to long type (468 milliseconds) [info] - ANSI mode: Throw exception on casting out-of-range value to decimal type (103 milliseconds) [info] - ANSI mode: disallow type conversions between Numeric types and Timestamp type (2 milliseconds) [info] - ANSI mode: disallow type conversions between Numeric types and Date type (1 millisecond) [info] - ANSI mode: disallow type conversions between Numeric types and Binary type (2 milliseconds) [info] - ANSI mode: disallow type conversions between Datetime types and Boolean types (1 millisecond) [info] - cast from invalid string to numeric should throw NumberFormatException (142 milliseconds) [info] - TaskMemoryManager.cleanUpAllAllocatedMemory (304 milliseconds) [info] - tasks should not be granted a negative amount of execution memory (2 milliseconds) [info] - off-heap execution allocations cannot exceed limit (2 milliseconds) [info] - basic execution memory (3 milliseconds) [info] - basic storage memory (2 milliseconds) [info] - execution evicts storage (1 millisecond) [info] - execution memory requests smaller than free memory should evict storage (SPARK-12165) (0 milliseconds) [info] - storage does not evict execution (2 milliseconds) [info] - small heap (1 millisecond) [info] - insufficient executor memory (1 millisecond) [info] - execution can evict cached blocks when there are multiple active tasks (SPARK-12155) (1 millisecond) [info] - SPARK-15260: atomically resize memory pools (2 milliseconds) [info] - not enough free memory in the storage pool --OFF_HEAP (1 millisecond) [info] UnsafeKryoSerializerSuite: [info] - cast from invalid string array to numeric array should throw NumberFormatException (90 milliseconds) [info] - SPARK-7392 configuration limits (2 milliseconds) [info] - basic types (11 milliseconds) [info] - pairs (3 milliseconds) [info] - Scala data structures (2 milliseconds) [info] - Bug: SPARK-10251 (4 milliseconds) [info] - ranges (7 milliseconds) [info] - asJavaIterable (10 milliseconds) [info] - custom registrator (5 milliseconds) [info] - Fast fail for cast string type to decimal type in ansi mode (101 milliseconds) [info] - kryo with collect (40 milliseconds) [info] - kryo with parallelize (16 milliseconds) [info] - ANSI mode: cast string to boolean with parse error (26 milliseconds) [info] - kryo with parallelize for specialized tuples (12 milliseconds) [info] - kryo with parallelize for primitive arrays (12 milliseconds) [info] - kryo with collect for specialized tuples (13 milliseconds) [info] - kryo with SerializableHyperLogLog (23 milliseconds) [info] - kryo with reduce (15 milliseconds) [info] - kryo with fold (14 milliseconds) [info] - kryo with nonexistent custom registrator should fail (2 milliseconds) [info] - default class loader can be set by a different thread (4 milliseconds) [info] - registration of HighlyCompressedMapStatus (2 milliseconds) [info] - registration of TaskCommitMessage (3 milliseconds) [info] - cast from timestamp II (146 milliseconds) [info] - serialization buffer overflow reporting (70 milliseconds) [info] - KryoOutputObjectOutputBridge.writeObject and KryoInputObjectInputBridge.readObject (2 milliseconds) [info] - getAutoReset (2 milliseconds) [info] - SPARK-25176 ClassCastException when writing a Map after previously reading a Map with different generic type (2 milliseconds)
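A standalone sketch of the Kryo registration path named in the "register kryo classes through registerKryoClasses" and "kryo with reduce" entries (Point is a made-up user type):

```scala
import org.apache.spark.{SparkConf, SparkContext}

case class Point(x: Int, y: Int) // hypothetical user type

object KryoSketch extends App {
  val conf = new SparkConf()
    .setMaster("local[2]").setAppName("kryo-sketch")
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .registerKryoClasses(Array(classOf[Point])) // avoids writing full class names
  val sc = new SparkContext(conf)
  val sum = sc.parallelize(Seq(Point(1, 2), Point(3, 4)))
    .reduce((a, b) => Point(a.x + b.x, a.y + b.y))
  println(sum)
  sc.stop()
}
```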
[info] - instance reuse with autoReset = true, referenceTracking = true, usePool = true (1 millisecond) [info] - instance reuse with autoReset = true, referenceTracking = true, usePool = false (1 millisecond) [info] - instance reuse with autoReset = false, referenceTracking = true, usePool = true (1 millisecond) [info] - instance reuse with autoReset = false, referenceTracking = true, usePool = false (1 millisecond) [info] - instance reuse with autoReset = true, referenceTracking = false, usePool = true (1 millisecond) [info] - instance reuse with autoReset = true, referenceTracking = false, usePool = false (0 milliseconds) [info] - instance reuse with autoReset = false, referenceTracking = false, usePool = true (1 millisecond) [info] - instance reuse with autoReset = false, referenceTracking = false, usePool = false (1 millisecond) [info] - SPARK-25839 KryoPool implementation works correctly in multi-threaded environment (2 milliseconds) [info] - SPARK-27216: test RoaringBitmap ser/deser with Kryo (2 milliseconds) [info] - SPARK-37071: OpenHashMap serialize with reference tracking turned off (5 milliseconds) [info] - cast a timestamp before the epoch 1970-01-01 00:00:00Z II (61 milliseconds) [info] NettyRpcAddressSuite: [info] - toString (0 milliseconds) [info] - toString for client mode (0 milliseconds) [info] ExternalShuffleServiceMetricsSuite: [info] - SPARK-31646: metrics should be registered (0 milliseconds) [info] BitSetSuite: [info] - basic set and get (1 millisecond) [info] - 100% full bit set (7 milliseconds) [info] - nextSetBit (0 milliseconds) [info] - xor len(bitsetX) < len(bitsetY) (1 millisecond) [info] - xor len(bitsetX) > len(bitsetY) (1 millisecond) [info] - andNot len(bitsetX) < len(bitsetY) (1 millisecond) [info] - andNot len(bitsetX) > len(bitsetY) (1 millisecond) [info] - [gs]etUntil (3 milliseconds) [info] AsyncRDDActionsSuite: [info] - countAsync (17 milliseconds) [info] - collectAsync (12 milliseconds) [info] - foreachAsync (17 milliseconds) [info] - foreachPartitionAsync (15 milliseconds) [info] - cast from timestamp (306 milliseconds) [info] - cast a timestamp before the epoch 1970-01-01 00:00:00Z (30 milliseconds) [info] - cast from date (258 milliseconds) [info] - cast from array II (42 milliseconds) [info] - SPARK-34903: timestamps difference (2 seconds, 370 milliseconds) [info] - cast from map II (79 milliseconds) [info] - cast from struct II (102 milliseconds) [info] - ANSI mode: cast string to timestamp with parse error (175 milliseconds) [info] - ANSI mode: cast string to date with parse error (151 milliseconds) [info] - SPARK-26218: Fix the corner case of codegen when casting float to Integer (37 milliseconds) [info] - SPARK-35720: cast invalid string input to timestamp without time zone (48 milliseconds) [info] SameResultSuite: [info] - relations (5 milliseconds) [info] - projections (34 milliseconds) [info] - filters (8 milliseconds) [info] - sorts (16 milliseconds) [info] - union (28 milliseconds) [info] - hint (13 milliseconds) [info] - join hint (27 milliseconds) [info] UnsupportedOperationsSuite: [info] - batch plan - local relation: supported (7 milliseconds) [info] - batch plan - streaming source: not supported (12 milliseconds) [info] - batch plan - select on streaming source: not supported (0 milliseconds) [info] - streaming plan - no streaming source (7 milliseconds) [info] - streaming plan - commands: not supported (4 milliseconds) [info] - streaming plan - aggregate - multiple batch aggregations: supported (7 milliseconds) [info] - streaming plan - aggregate -
multiple aggregations but only one streaming aggregation: supported (2 milliseconds) [info] - streaming plan - aggregate - multiple streaming aggregations: not supported (4 milliseconds) [info] - streaming plan - aggregate - streaming aggregations in update mode: supported (1 millisecond) [info] - streaming plan - aggregate - streaming aggregations in complete mode: supported (0 milliseconds) [info] - streaming plan - aggregate - streaming aggregations with watermark in append mode: supported (1 millisecond) [info] - streaming plan - aggregate - streaming aggregations without watermark in append mode: not supported (3 milliseconds) [info] - streaming plan - distinct aggregate - aggregate on batch relation: supported (1 millisecond) [info] - streaming plan - distinct aggregate - aggregate on streaming relation: not supported (3 milliseconds) [info] - batch plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on batch relation: supported (0 milliseconds) [info] - batch plan - flatMapGroupsWithState - multiple flatMapGroupsWithState(Append)s on batch relation: supported (0 milliseconds) [info] - batch plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on batch relation: supported (0 milliseconds) [info] - batch plan - flatMapGroupsWithState - multiple flatMapGroupsWithState(Update)s on batch relation: supported (0 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in update mode: supported (1 millisecond) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in append mode: not supported (4 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in complete mode: not supported (5 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Append mode: not supported (6 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Update mode: not supported (12 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Complete mode: not supported (5 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation without aggregation in append mode: supported (1 millisecond) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation without aggregation in update mode: not supported (4 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Append mode: supported (4 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Update mode: supported (2 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Complete mode: supported (2 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation after aggregation in Append mode: not supported (5 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation after aggregation in Update mode: not supported (4 
milliseconds) [info] - takeAsync (1 second, 498 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation in complete mode: not supported (3 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on batch relation inside streaming relation in Append output mode: supported (0 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on batch relation inside streaming relation in Update output mode: supported (0 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on batch relation inside streaming relation in Append output mode: supported (0 milliseconds) [info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on batch relation inside streaming relation in Update output mode: supported (1 millisecond) [info] - async success handling (9 milliseconds) [info] - streaming plan - flatMapGroupsWithState - multiple flatMapGroupsWithStates on streaming relation and all are in append mode: supported (4 milliseconds) [info] - streaming plan - flatMapGroupsWithState - multiple flatMapGroupsWithStates on a streaming relation but some are not in append mode: not supported (6 milliseconds) [info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation without aggregation in append mode: not supported (2 milliseconds) [info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation without aggregation in complete mode: not supported (3 milliseconds) [info] - async failure handling (15 milliseconds) [info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation with aggregation in Append mode: not supported (4 milliseconds) [info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation with aggregation in Update mode: not supported (3 milliseconds) [info] - FutureAction result, infinite wait (10 milliseconds) [info] - streaming plan - mapGroupsWithState - mapGroupsWithState on streaming relation with aggregation in Complete mode: not supported (3 milliseconds) [info] - streaming plan - mapGroupsWithState - multiple mapGroupsWithStates on streaming relation and all are in append mode: not supported (3 milliseconds) [info] - streaming plan - mapGroupsWithState - mixing mapGroupsWithStates and flatMapGroupsWithStates on streaming relation: not supported (3 milliseconds) [info] - FutureAction result, finite wait (9 milliseconds) [info] - streaming plan - mapGroupsWithState - mapGroupsWithState with event time timeout without watermark: not supported (4 milliseconds) [info] - streaming plan - mapGroupsWithState - mapGroupsWithState with event time timeout with watermark: supported (1 millisecond) [info] - streaming plan - Deduplicate - Deduplicate on streaming relation before aggregation: supported (1 millisecond) [info] - streaming plan - Deduplicate - Deduplicate on streaming relation after aggregation: not supported (2 milliseconds) [info] - streaming plan - Deduplicate - Deduplicate on batch relation inside a streaming query: supported (1 millisecond) [info] - streaming plan - single inner join in append mode with stream-stream relations: supported (1 millisecond) [info] - streaming plan - single inner join in append mode with stream-batch relations: supported (0 milliseconds) [info] - streaming plan - single inner join in append mode with batch-stream relations: supported (0 milliseconds) [info] -
streaming plan - single inner join in append mode with batch-batch relations: supported (0 milliseconds) [info] - streaming plan - multiple inner joins in append mode with stream-stream relations: supported (0 milliseconds) [info] - streaming plan - multiple inner joins in append mode with stream-batch relations: supported (1 millisecond) [info] - streaming plan - multiple inner joins in append mode with batch-stream relations: supported (0 milliseconds) [info] - streaming plan - multiple inner joins in append mode with batch-batch relations: supported (1 millisecond) [info] - streaming plan - inner join in update mode with stream-stream relations: not supported (2 milliseconds) [info] - streaming plan - inner join in update mode with stream-batch relations: supported (1 millisecond) [info] - streaming plan - inner join in update mode with batch-stream relations: supported (0 milliseconds) [info] - streaming plan - inner join in update mode with batch-batch relations: supported (0 milliseconds) [info] - FutureAction result, timeout (24 milliseconds) [info] - streaming plan - FullOuter join with stream-stream relations: not supported (11 milliseconds) [info] - streaming plan - FullOuter join with stream-batch relations: not supported (1 millisecond) [info] - streaming plan - FullOuter join with batch-stream relations: not supported (1 millisecond) [info] - streaming plan - FullOuter join with batch-batch relations: supported (0 milliseconds) [info] - streaming plan - LeftOuter join with stream-stream relations: not supported (1 millisecond) [info] - streaming plan - LeftOuter join with stream-batch relations: supported (0 milliseconds) [info] - streaming plan - LeftOuter join with batch-stream relations: not supported (1 millisecond) [info] - streaming plan - LeftOuter join with batch-batch relations: supported (0 milliseconds) [info] - streaming plan - LeftSemi join with stream-stream relations: not supported (1 millisecond) [info] - streaming plan - LeftSemi join with stream-batch relations: supported (1 millisecond) [info] - streaming plan - LeftSemi join with batch-stream relations: not supported (1 millisecond) [info] - streaming plan - LeftSemi join with batch-batch relations: supported (0 milliseconds) [info] - streaming plan - LeftAnti join with stream-stream relations: not supported (1 millisecond) [info] - streaming plan - LeftAnti join with stream-batch relations: supported (0 milliseconds) [info] - streaming plan - LeftAnti join with batch-stream relations: not supported (2 milliseconds) [info] - streaming plan - LeftAnti join with batch-batch relations: supported (0 milliseconds) [info] - streaming plan - RightOuter join with stream-stream relations: not supported (1 millisecond) [info] - streaming plan - RightOuter join with stream-batch relations: not supported (1 millisecond) [info] - streaming plan - RightOuter join with batch-stream relations: supported (0 milliseconds) [info] - streaming plan - RightOuter join with batch-batch relations: supported (0 milliseconds) [info] - streaming plan - LeftOuter join with stream-stream relations and update mode: not supported (1 millisecond) [info] - streaming plan - LeftOuter join with stream-stream relations and complete mode: not supported (1 millisecond) [info] - streaming plan - LeftOuter join with stream-stream relations and join on attribute with left watermark: supported (4 milliseconds) [info] - streaming plan - LeftOuter join with stream-stream relations and join on attribute with right watermark: supported (0 milliseconds) 
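A spark-shell sketch of the case this matrix marks as supported, a stream-stream inner join in append mode (the source and column names are invented; the rate source just provides a timestamped stream):

```scala
import org.apache.spark.sql.functions.expr

val clicks = spark.readStream.format("rate").load()
  .withColumnRenamed("value", "clickId")
  .withWatermark("timestamp", "10 seconds")
val imps = spark.readStream.format("rate").load()
  .toDF("impTime", "impId")
  .withWatermark("impTime", "10 seconds")
// Inner joins are allowed in append mode; a time-range condition bounds the state:
val joined = clicks.join(imps,
  expr("clickId = impId AND impTime BETWEEN timestamp - interval 5 seconds AND timestamp"))
// joined.writeStream.format("console").outputMode("append").start()
```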
[info] - streaming plan - LeftOuter join with stream-stream relations and join on non-watermarked attribute: not supported (1 millisecond) [info] - SimpleFutureAction callback must not consume a thread while waiting (34 milliseconds) [info] - ComplexFutureAction callback must not consume a thread while waiting (16 milliseconds) [info] - streaming plan - LeftOuter join with stream-stream relations and state value watermark: supported (24 milliseconds) [info] - streaming plan - LeftOuter join with stream-stream relations and state value watermark: not supported (3 milliseconds) [info] - streaming plan - RightOuter join with stream-stream relations and update mode: not supported (1 millisecond) [info] - streaming plan - RightOuter join with stream-stream relations and complete mode: not supported (2 milliseconds) [info] - streaming plan - RightOuter join with stream-stream relations and join on attribute with left watermark: supported (1 millisecond) [info] - streaming plan - RightOuter join with stream-stream relations and join on attribute with right watermark: supported (0 milliseconds) [info] - streaming plan - RightOuter join with stream-stream relations and join on non-watermarked attribute: not supported (2 milliseconds) [info] - streaming plan - RightOuter join with stream-stream relations and state value watermark: supported (2 milliseconds) [info] - streaming plan - RightOuter join with stream-stream relations and state value watermark: not supported (2 milliseconds) [info] - streaming plan - FullOuter join with stream-stream relations and update mode: not supported (1 millisecond) [info] - streaming plan - FullOuter join with stream-stream relations and complete mode: not supported (2 milliseconds) [info] - streaming plan - FullOuter join with stream-stream relations and join on attribute with left watermark: supported (1 millisecond) [info] - streaming plan - FullOuter join with stream-stream relations and join on attribute with right watermark: supported (1 millisecond) [info] - streaming plan - FullOuter join with stream-stream relations and join on non-watermarked attribute: not supported (2 milliseconds) [info] StagePageSuite: [info] - streaming plan - FullOuter join with stream-stream relations and state value watermark: supported (1 millisecond) [info] - streaming plan - FullOuter join with stream-stream relations and state value watermark: not supported (2 milliseconds) [info] - streaming plan - LeftSemi join with stream-stream relations and update mode: not supported (1 millisecond) [info] - streaming plan - LeftSemi join with stream-stream relations and complete mode: not supported (1 millisecond) [info] - ApiHelper.COLUMN_TO_INDEX should match headers of the task table (3 milliseconds) [info] - streaming plan - LeftSemi join with stream-stream relations and join on attribute with left watermark: supported (0 milliseconds) [info] - streaming plan - LeftSemi join with stream-stream relations and join on attribute with right watermark: supported (0 milliseconds) [info] - streaming plan - LeftSemi join with stream-stream relations and join on non-watermarked attribute: not supported (1 millisecond) [info] - streaming plan - LeftSemi join with stream-stream relations and state value watermark: supported (1 millisecond) [info] - streaming plan - LeftSemi join with stream-stream relations and state value watermark: not supported (2 milliseconds) [info] - Global watermark limit - single Inner join in Append mode (4 milliseconds) [info] - Global watermark limit - streaming 
aggregation after stream-stream Inner join in Append mode (0 milliseconds) [info] - Global watermark limit - streaming-stream Inner after stream-stream Inner join in Append mode (1 millisecond) [info] BarrierStageOnSubmittedSuite: [info] - Global watermark limit - streaming-stream LeftOuter after stream-stream Inner join in Append mode (0 milliseconds) [info] - Global watermark limit - streaming-stream RightOuter after stream-stream Inner join in Append mode (1 millisecond) [info] - Global watermark limit - FlatMapGroupsWithState after stream-stream Inner join in Append mode (0 milliseconds) [info] - Global watermark limit - deduplicate after stream-stream Inner join in Append mode (1 millisecond) [info] - Global watermark limit - single LeftOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - streaming aggregation after stream-stream LeftOuter join in Append mode (2 milliseconds) [info] - Global watermark limit - streaming-stream Inner after stream-stream LeftOuter join in Append mode (1 millisecond) [info] - Global watermark limit - streaming-stream LeftOuter after stream-stream LeftOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - streaming-stream RightOuter after stream-stream LeftOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - FlatMapGroupsWithState after stream-stream LeftOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - deduplicate after stream-stream LeftOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - single RightOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - streaming aggregation after stream-stream RightOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - streaming-stream Inner after stream-stream RightOuter join in Append mode (1 millisecond) [info] - Global watermark limit - streaming-stream LeftOuter after stream-stream RightOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - streaming-stream RightOuter after stream-stream RightOuter join in Append mode (1 millisecond) [info] - Global watermark limit - FlatMapGroupsWithState after stream-stream RightOuter join in Append mode (0 milliseconds) [info] - Global watermark limit - deduplicate after stream-stream RightOuter join in Append mode (1 millisecond) [info] - streaming plan - cogroup with stream-stream relations: not supported (4 milliseconds) [info] - streaming plan - cogroup with stream-batch relations: not supported (2 milliseconds) [info] - streaming plan - cogroup with batch-stream relations: not supported (3 milliseconds) [info] - streaming plan - cogroup with batch-batch relations: supported (1 millisecond) [info] - streaming plan - union with stream-stream relations: supported (1 millisecond) [info] - streaming plan - union with stream-batch relations: not supported (1 millisecond) [info] - streaming plan - union with batch-stream relations: not supported (1 millisecond) [info] - streaming plan - union with batch-batch relations: supported (1 millisecond) [info] - streaming plan - except with stream-stream relations: not supported (2 milliseconds) [info] - streaming plan - except with stream-batch relations: supported (1 millisecond) [info] - streaming plan - except with batch-stream relations: not supported (1 millisecond) [info] - streaming plan - except with batch-batch relations: supported (1 millisecond) [info] - streaming plan - intersect with stream-stream relations: not supported (1 millisecond) 
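A spark-shell sketch of the combination these "Global watermark limit" entries allow, a single watermarked aggregation in append mode (aggregation without a watermark is rejected in that mode):

```scala
import org.apache.spark.sql.functions.window

val events = spark.readStream.format("rate").load()
val counts = events
  .withWatermark("timestamp", "10 minutes")
  .groupBy(window($"timestamp", "5 minutes"))
  .count()
// counts.writeStream.outputMode("append").format("console").start()
```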
[info] - streaming plan - intersect with stream-batch relations: not supported (1 millisecond)
[info] - streaming plan - intersect with batch-stream relations: not supported (1 millisecond)
[info] - streaming plan - intersect with batch-batch relations: supported (1 millisecond)
[info] - streaming plan - sort with stream relation: not supported (2 milliseconds)
[info] - streaming plan - sort with batch relation: supported (0 milliseconds)
[info] - streaming plan - sort - sort after aggregation in Complete output mode: supported (1 millisecond)
[info] - streaming plan - sort - sort before aggregation in Complete output mode: not supported (1 millisecond)
[info] - streaming plan - sort - sort over aggregated data in Update output mode: not supported (2 milliseconds)
[info] - streaming plan - sample with stream relation: not supported (1 millisecond)
[info] - streaming plan - sample with batch relation: supported (1 millisecond)
[info] - streaming plan - window with stream relation: not supported (3 milliseconds)
[info] - streaming plan - window with batch relation: supported (0 milliseconds)
[info] - streaming plan - Append output mode - aggregation: not supported (1 millisecond)
[info] - streaming plan - Append output mode - no aggregation: supported (0 milliseconds)
[info] - streaming plan - Update output mode - aggregation: supported (0 milliseconds)
[info] - streaming plan - Update output mode - no aggregation: supported (0 milliseconds)
[info] - streaming plan - Complete output mode - aggregation: supported (0 milliseconds)
[info] - streaming plan - Complete output mode - no aggregation: not supported (1 millisecond)
[info] - streaming plan - MonotonicallyIncreasingID: not supported (5 milliseconds)
[info] - submit a barrier ResultStage that contains PartitionPruningRDD (60 milliseconds)
[info] - continuous processing - TypedFilter: supported (6 milliseconds)
[info] - Global watermark limit - single streaming aggregation in Append mode (1 millisecond)
[info] - Global watermark limit - chained streaming aggregations in Append mode (0 milliseconds)
[info] - Global watermark limit - Inner join after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - LeftOuter join after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - RightOuter join after streaming aggregation in Append mode (1 millisecond)
[info] - Global watermark limit - deduplicate after streaming aggregation in Append mode (0 milliseconds)
[info] - Global watermark limit - FlatMapGroupsWithState after streaming aggregation in Append mode (1 millisecond)
[info] - Global watermark limit - single FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - streaming aggregation after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream Inner after FlatMapGroupsWithState in Append mode (1 millisecond)
[info] - Global watermark limit - stream-stream LeftOuter after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - stream-stream RightOuter after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - FlatMapGroupsWithState after FlatMapGroupsWithState in Append mode (1 millisecond)
[info] - Global watermark limit - deduplicate after FlatMapGroupsWithState in Append mode (0 milliseconds)
[info] - Global watermark limit - streaming aggregation after deduplicate in Append mode (1 millisecond)
[info] - Global watermark limit - Inner join after deduplicate in Append mode (0 milliseconds)
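The "Append/Update/Complete output mode" entries reflect the output-mode rules for streaming queries: an aggregation needs Update or Complete mode (or a watermark, in Append), while a non-aggregating query cannot use Complete. A minimal sketch, assuming a rate source and a console sink purely for illustration:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[2]").appName("mode-sketch").getOrCreate()

val buckets = spark.readStream.format("rate").load()
  .selectExpr("value % 10 AS bucket")

val counts = buckets.groupBy("bucket").count()  // streaming aggregation

// Complete mode re-emits the whole aggregate table each trigger: supported.
counts.writeStream.outputMode("complete").format("console").start()

// Append mode on this aggregation (no watermark) is rejected when the query
// starts, matching the "Append output mode - aggregation: not supported" entry:
// counts.writeStream.outputMode("append").format("console").start()
```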
[info] - Global watermark limit - LeftOuter join after deduplicate in Append mode (0 milliseconds)
[info] - Global watermark limit - RightOuter join after deduplicate in Append mode (1 millisecond)
[info] - Global watermark limit - FlatMapGroupsWithState after deduplicate in Append mode (0 milliseconds)
[info] - submit a barrier ShuffleMapStage that contains PartitionPruningRDD (41 milliseconds)
[info] V2OverwriteByExpressionANSIAnalysisSuite:
[info] - submit a barrier stage that doesn't contain PartitionPruningRDD (76 milliseconds)
[info] - submit a barrier stage with partial partitions (42 milliseconds)
[info] - submit a barrier stage with union() (39 milliseconds)
[info] - submit a barrier stage with coalesce() (34 milliseconds)
[info] - submit a barrier stage that contains an RDD that depends on multiple barrier RDDs (40 milliseconds)
[info] - submit a barrier stage with zip() (54 milliseconds)
[info] - submit a barrier ResultStage with dynamic resource allocation enabled (40 milliseconds)
[info] - SPARK-33136: output resolved on complex types for V2 write commands (451 milliseconds)
[info] - skipSchemaResolution should still require query to be resolved (11 milliseconds)
[info] - submit a barrier ShuffleMapStage with dynamic resource allocation enabled (48 milliseconds)
[info] - byName: basic behavior (26 milliseconds)
[info] - byName: does not match by position (66 milliseconds)
[info] - byName: case sensitive column resolution (22 milliseconds)
[info] - byName: case insensitive column resolution (25 milliseconds)
[info] - byName: data columns are reordered by name (24 milliseconds)
[info] - byName: fail nullable data written to required columns (28 milliseconds)
[info] - byName: allow required data written to nullable columns (22 milliseconds)
[info] - byName: missing required columns cause failure and are identified by name (20 milliseconds)
[info] - byName: missing optional columns cause failure and are identified by name (21 milliseconds)
[info] - byName: insert safe cast (29 milliseconds)
[info] - byName: fail extra data fields (23 milliseconds)
[info] - byName: fail extra data fields in struct (24 milliseconds)
[info] - byPosition: basic behavior (25 milliseconds)
[info] - byPosition: data columns are not reordered (24 milliseconds)
[info] - byPosition: fail nullable data written to required columns (22 milliseconds)
[info] - byPosition: allow required data written to nullable columns (25 milliseconds)
[info] - byPosition: missing required columns cause failure (27 milliseconds)
[info] - byPosition: missing optional columns cause failure (24 milliseconds)
[info] - byPosition: insert safe cast (26 milliseconds)
[info] - byPosition: fail extra data fields (21 milliseconds)
[info] - SPARK-35916: timestamps without time zone difference (2 seconds, 342 milliseconds)
[info] - bypass output column resolution (49 milliseconds)
[info] - check fields of struct type column (52 milliseconds)
[info] - SPARK-36498: reorder inner fields with byName mode (12 milliseconds)
[info] - SPARK-36498: reorder inner fields in array of struct with byName mode (30 milliseconds)
[info] - SPARK-36498: reorder inner fields in map of struct with byName mode (15 milliseconds)
[info] - delete expression is resolved using table fields (29 milliseconds)
[info] - delete expression is not resolved using query fields (62 milliseconds)
[info] CodeFormatterSuite:
[info] - removing overlapping comments (1 millisecond)
[info] - removing extra new lines and comments (0 milliseconds)
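The byName/byPosition entries above come from the analysis of DataSourceV2 writes: DataFrameWriterV2 (df.writeTo) resolves output columns against the target table's schema by name, while the older insertInto path matches purely by position. A hedged sketch of the difference; the table identifiers below are hypothetical placeholders, not tables from this build:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[2]").appName("v2-write-sketch").getOrCreate()
import spark.implicits._

val df = Seq((1L, "a"), (2L, "b")).toDF("id", "data")

// By-name resolution: columns are matched to the target schema by name and
// reordered as needed; extra fields or missing required columns fail analysis,
// as the "byName: ..." tests describe. "demo.db.target" is a hypothetical table.
df.select($"data", $"id").writeTo("demo.db.target").append()

// By-position resolution: columns are taken in order, so this same select
// would try to write `data` into the table's first column.
// df.select($"data", $"id").write.insertInto("db.target")
```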
[info] - basic example (6 milliseconds)
[info] - nested example (1 millisecond)
[info] - single line (0 milliseconds)
[info] - if else on the same line (1 millisecond)
[info] - function calls (0 milliseconds)
[info] - function calls with maxLines=0 (0 milliseconds)
[info] - function calls with maxLines=2 (1 millisecond)
[info] - single line comments (0 milliseconds)
[info] - single line comments /* */ (4 milliseconds)
[info] - multi-line comments (1 millisecond)
[info] - reduce empty lines (1 millisecond)
[info] - comment place holder (0 milliseconds)
[info] AnsiCastSuiteWithAnsiModeOn:
[info] - SPARK-34896: subtract dates (622 milliseconds)
[info] - to_timestamp_ntz (803 milliseconds)
[info] - to_timestamp exception mode (47 milliseconds)
[info] - Consistent error handling for datetime formatting and parsing functions (185 milliseconds)
[info] - SPARK-31896: Handle am-pm timestamp parsing when hour is missing (15 milliseconds)
[info] - DATE_FROM_UNIX_DATE (170 milliseconds)
[info] - UNIX_DATE (38 milliseconds)
[info] - UNIX_SECONDS (114 milliseconds)
[info] - UNIX_MILLIS (88 milliseconds)
[info] - UNIX_MICROS (92 milliseconds)
[info] - submit a barrier ResultStage that requires more slots than current total under local mode (3 seconds, 45 milliseconds)
[info] - TIMESTAMP_SECONDS (513 milliseconds)
[info] - TIMESTAMP_MILLIS (201 milliseconds)
[info] - TIMESTAMP_MICROS (213 milliseconds)
[info] - SPARK-33498: GetTimestamp,UnixTimestamp,ToUnixTimestamp with parseError (303 milliseconds)
[info] - null cast (4 seconds, 367 milliseconds)
[info] - cast string to date (70 milliseconds)
[info] - submit a barrier ShuffleMapStage that requires more slots than current total under local mode (3 seconds, 48 milliseconds)
[info] - run Python application in yarn-cluster mode (26 seconds, 33 milliseconds)
[info] - submit a barrier ResultStage that requires more slots than current total under local-cluster mode (3 seconds, 430 milliseconds)
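The "submit a barrier ..." entries exercise barrier scheduling: all tasks of a barrier stage must launch together, and a stage that asks for more slots than the cluster offers fails fast instead of hanging. A minimal sketch of a barrier stage under the stated assumption of a local[4] master with enough slots for all four partitions:

```scala
import org.apache.spark.BarrierTaskContext
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[4]").appName("barrier-sketch").getOrCreate()
val sc = spark.sparkContext

// A barrier stage: all 4 tasks are scheduled at once, and ctx.barrier()
// blocks each task until every task in the stage reaches this point.
// With more partitions than available slots, submission would fail instead.
val sums = sc.parallelize(1 to 100, numSlices = 4)
  .barrier()
  .mapPartitions { iter =>
    val ctx = BarrierTaskContext.get()
    ctx.barrier()                 // global synchronization across the stage
    Iterator.single(iter.sum)
  }
  .collect()
```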
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@42f3269b rejected from java.util.concurrent.ThreadPoolExecutor@b0d15d[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:668)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[info] - cast string to timestamp (6 seconds, 823 milliseconds)
[info] - cast from boolean (226 milliseconds)
[info] - cast from int (483 milliseconds)
[info] - SPARK-34739,SPARK-35889: add a year-month interval to a timestamp (9 seconds, 108 milliseconds)
[info] - cast from long (373 milliseconds)
[info] - submit a barrier ShuffleMapStage that requires more slots than current total under local-cluster mode (3 seconds, 438 milliseconds)
[info] - cast from float (273 milliseconds)
[info] - cast from double (283 milliseconds)
[info] - cast from string (0 milliseconds)
[info] - SPARK-32518: CoarseGrainedSchedulerBackend.maxNumConcurrentTasks should consider all kinds of resources for the barrier stage (18 seconds, 82 milliseconds)
[info] BlockManagerInfoSuite:
[info] - broadcast block externalShuffleServiceEnabled=true (1 millisecond)
[info] - broadcast block externalShuffleServiceEnabled=false (1 millisecond)
[info] - RDD block with MEMORY_ONLY externalShuffleServiceEnabled=true (0 milliseconds)
[info] - RDD block with MEMORY_ONLY externalShuffleServiceEnabled=false (0 milliseconds)
[info] - RDD block with MEMORY_AND_DISK externalShuffleServiceEnabled=true (0 milliseconds)
[info] - RDD block with MEMORY_AND_DISK externalShuffleServiceEnabled=false (0 milliseconds)
[info] - RDD block with DISK_ONLY externalShuffleServiceEnabled=true (1 millisecond)
[info] - RDD block with DISK_ONLY externalShuffleServiceEnabled=false (1 millisecond)
[info] - update from MEMORY_ONLY to DISK_ONLY externalShuffleServiceEnabled=true (0 milliseconds)
[info] - update from MEMORY_ONLY to DISK_ONLY externalShuffleServiceEnabled=false (0 milliseconds)
[info] - using invalid StorageLevel externalShuffleServiceEnabled=true (1 millisecond)
[info] - using invalid StorageLevel externalShuffleServiceEnabled=false (0 milliseconds)
[info] - remove block and add another one externalShuffleServiceEnabled=true (1 millisecond)
[info] - remove block and add another one externalShuffleServiceEnabled=false (0 milliseconds)
[info] HistoryServerArgumentsSuite:
[info] - No Arguments Parsing (2 milliseconds)
[info] - Properties File Arguments Parsing --properties-file (7 milliseconds)
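The RejectedExecutionException trace above is a Future callback being handed to a thread pool that is already shutting down ("Shutting down, pool size = 1"), a common teardown-race artifact that does not fail the tests: the default ExecutionContext routes the rejection to reportFailure, which just prints the trace. A minimal, Spark-independent sketch of the same mechanism, assuming Scala 2.12-style promise internals as in the trace:

```scala
import java.util.concurrent.Executors
import scala.concurrent.{ExecutionContext, Promise}

object RejectedCallbackDemo extends App {
  val pool = Executors.newFixedThreadPool(1)
  implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)

  val p = Promise[Int]()
  p.future.foreach(n => println(s"got $n"))  // callback registered on `ec`

  pool.shutdownNow() // the pool now rejects any newly submitted work
  p.success(42)      // completing the promise submits the callback to the
                     // shut-down pool; the RejectedExecutionException goes to
                     // ec.reportFailure and is printed, much like the trace above
}
```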
[info] HttpSecurityFilterSuite:
[info] - filter bad user input (34 milliseconds)
[info] - perform access control (29 milliseconds)
[info] - set security-related headers (11 milliseconds)
[info] - doAs impersonation (32 milliseconds)
[info] MetricsSystemSuite:
[info] - MetricsSystem with default config (1 millisecond)
[info] - MetricsSystem with sources add (6 milliseconds)
[info] - MetricsSystem with Driver instance (1 millisecond)
[info] - MetricsSystem with Driver instance and spark.app.id is not set (1 millisecond)
[info] - MetricsSystem with Driver instance and spark.executor.id is not set (2 milliseconds)
[info] - MetricsSystem with Executor instance (1 millisecond)
[info] - MetricsSystem with Executor instance and spark.app.id is not set (1 millisecond)
[info] - MetricsSystem with Executor instance and spark.executor.id is not set (1 millisecond)
[info] - MetricsSystem with instance which is neither Driver nor Executor (2 milliseconds)
[info] - MetricsSystem with Executor instance, with custom namespace (1 millisecond)
[info] - MetricsSystem with Executor instance, custom namespace which is not set (1 millisecond)
[info] - MetricsSystem with Executor instance, custom namespace, spark.executor.id not set (1 millisecond)
[info] - MetricsSystem with non-driver, non-executor instance with custom namespace (1 millisecond)
[info] - SPARK-37078: Support old 3-parameter Sink constructors (2 milliseconds)
[info] JobCancellationSuite:
[info] - run Python application in yarn-cluster mode using spark.yarn.appMasterEnv to override local envvar (25 seconds, 44 milliseconds)
[info] - local mode, FIFO scheduler (85 milliseconds)
[info] - local mode, fair scheduler (74 milliseconds)
[info] - SPARK-34761,SPARK-35889: add a day-time interval to a timestamp (19 seconds, 876 milliseconds)
[info] CastSuite:
[info] - cluster mode, FIFO scheduler (4 seconds, 351 milliseconds)
[info] - cluster mode, fair scheduler (5 seconds, 109 milliseconds)
[info] - do not put partially executed partitions into cache (108 milliseconds)
[info] - job group (66 milliseconds)
[info] - inherited job group (SPARK-6629) (63 milliseconds)
[info] - job group with interruption (91 milliseconds)
[info] - null cast (8 seconds, 389 milliseconds)
[info] - cast string to date (1 second, 72 milliseconds)
[info] - task reaper kills JVM if killed tasks keep running for too long (6 seconds, 567 milliseconds)
[info] - cast string to timestamp (7 seconds, 940 milliseconds)
[info] - cast from boolean (135 milliseconds)
[info] - cast from int (279 milliseconds)
[info] - cast from long (228 milliseconds)
[info] - cast from float (188 milliseconds)
[info] - cast from double (200 milliseconds)
[info] - cast from string (7 milliseconds)
[info] - task reaper will not kill JVM if spark.task.killTimeout == -1 (5 seconds, 707 milliseconds)
[info] - two jobs sharing the same stage (84 milliseconds)
[info] - interruptible iterator of shuffle reader (174 milliseconds)
[info] PartitioningSuite:
[info] - HashPartitioner equality (1 milli
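The JobCancellationSuite "job group" entries cover SparkContext's job-group API: jobs tagged with setJobGroup can be cancelled as a unit, and interruptOnCancel = true additionally interrupts the running task threads. A hedged sketch of that flow; the group id, sleep durations, and local master are illustrative only:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[2]").appName("cancel-sketch").getOrCreate()
val sc = spark.sparkContext

// Run a slow job in its own thread, tagged with a job group. setJobGroup is
// per-thread, so the tag must be set on the thread that triggers the action.
val runner = new Thread {
  override def run(): Unit = {
    sc.setJobGroup("demo-group", "cancellable demo job", interruptOnCancel = true)
    try {
      sc.parallelize(1 to 4, 4).foreach(_ => Thread.sleep(60000))
    } catch {
      case e: Exception => println(s"job ended early: ${e.getMessage}")
    }
  }
}
runner.start()

Thread.sleep(2000)              // let the tasks start
sc.cancelJobGroup("demo-group") // cancels every job tagged with this group
runner.join()
```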