Started by an SCM change
Running as SYSTEM
[EnvInject] - Loading node environment variables.
[EnvInject] - Preparing an environment for the build.
[EnvInject] - Keeping Jenkins system variables.
[EnvInject] - Keeping Jenkins build variables.
[EnvInject] - Injecting as environment variables the properties content
PATH=/home/anaconda/envs/py36/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
AMPLAB_JENKINS="true"
SPARK_MASTER_SBT_HADOOP_2_7=1
JAVA_HOME=/usr/java/latest
AMPLAB_JENKINS_BUILD_HIVE_PROFILE=hive2.3
SPARK_TESTING=1
AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.7
LANG=en_US.UTF-8
SPARK_BRANCH=master

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting contributions.
Building remotely on research-jenkins-worker-01 (ubuntu20 worker-01 ubuntu) in workspace /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7
The recommended git tool is: NONE
No credentials specified
 > git rev-parse --resolve-git-dir /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/.git # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/spark.git # timeout=10
Fetching upstream changes from https://github.com/apache/spark.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/spark.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 030de1d09f121b167aaaa8237a2807f902c1e710 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 030de1d09f121b167aaaa8237a2807f902c1e710 # timeout=10
Commit message: "[SPARK-37543][DOCS] Document Java 17 support"
 > git rev-list --no-walk c411d2681b2e16e28346e1c55df8d71013d746cb # timeout=10
[EnvInject] - Mask passwords that will be passed as build parameters.
[spark-master-test-sbt-hadoop-2.7] $ /bin/bash /tmp/jenkins8478545480283341319.sh
Removing R/SparkR.Rcheck/
Removing R/SparkR_3.3.0.tar.gz
Removing R/cran-check.out
Removing R/lib/
Removing R/pkg/man/
Removing R/pkg/tests/fulltests/Rplots.pdf
Removing R/pkg/tests/fulltests/_snaps/
Removing R/unit-tests.out
Removing append/
Removing assembly/target/
Removing build/sbt-launch-1.5.5.jar
Removing common/kvstore/target/
Removing common/network-common/target/
Removing common/network-shuffle/target/
Removing common/network-yarn/target/
Removing common/sketch/target/
Removing common/tags/target/
Removing common/unsafe/target/
Removing core/derby.log
Removing core/dummy/
Removing core/ignored/
Removing core/target/
Removing core/temp-secrets/
Removing derby.log
Removing dev/__pycache__/
Removing dev/ansible-for-test-node/roles/jenkins-worker/files/util_scripts/__pycache__/
Removing dev/create-release/__pycache__/
Removing dev/lint-r-report.log
Removing dev/sparktestsupport/__pycache__/
Removing dev/target/
Removing examples/src/main/python/__pycache__/
Removing examples/src/main/python/ml/__pycache__/
Removing examples/src/main/python/mllib/__pycache__/
Removing examples/src/main/python/sql/__pycache__/
Removing examples/src/main/python/sql/streaming/__pycache__/
Removing examples/src/main/python/streaming/__pycache__/
Removing examples/target/
Removing external/avro/spark-warehouse/
Removing external/avro/target/
Removing external/docker-integration-tests/target/
Removing external/kafka-0-10-assembly/target/
Removing external/kafka-0-10-sql/spark-warehouse/
Removing external/kafka-0-10-sql/target/
Removing external/kafka-0-10-token-provider/target/
Removing external/kafka-0-10/target/
Removing external/kinesis-asl-assembly/target/
Removing external/kinesis-asl/checkpoint/
Removing external/kinesis-asl/src/main/python/examples/streaming/__pycache__/
Removing external/kinesis-asl/target/
Removing external/spark-ganglia-lgpl/target/
Removing graphx/target/
Removing hadoop-cloud/target/
Removing launcher/target/
Removing lib/
Removing logs/
Removing metastore_db/
Removing mllib-local/target/
Removing mllib/checkpoint/
Removing mllib/spark-warehouse/
Removing mllib/target/
Removing project/project/
Removing project/target/
Removing python/__pycache__/
Removing python/dist/
Removing python/docs/source/__pycache__/
Removing python/lib/pyspark.zip
Removing python/pyspark.egg-info/
Removing python/pyspark/__pycache__/
Removing python/pyspark/cloudpickle/__pycache__/
Removing python/pyspark/ml/__pycache__/
Removing python/pyspark/ml/linalg/__pycache__/
Removing python/pyspark/ml/param/__pycache__/
Removing python/pyspark/ml/tests/__pycache__/
Removing python/pyspark/mllib/__pycache__/
Removing python/pyspark/mllib/linalg/__pycache__/
Removing python/pyspark/mllib/stat/__pycache__/
Removing python/pyspark/mllib/tests/__pycache__/
Removing python/pyspark/pandas/__pycache__/
Removing python/pyspark/pandas/data_type_ops/__pycache__/
Removing python/pyspark/pandas/indexes/__pycache__/
Removing python/pyspark/pandas/missing/__pycache__/
Removing python/pyspark/pandas/plot/__pycache__/
Removing python/pyspark/pandas/spark/__pycache__/
Removing python/pyspark/pandas/tests/__pycache__/
Removing python/pyspark/pandas/tests/data_type_ops/__pycache__/
Removing python/pyspark/pandas/tests/indexes/__pycache__/
Removing python/pyspark/pandas/tests/plot/__pycache__/
Removing python/pyspark/pandas/typedef/__pycache__/
Removing python/pyspark/pandas/usage_logging/__pycache__/
Removing python/pyspark/python/
Removing python/pyspark/resource/__pycache__/
Removing python/pyspark/resource/tests/__pycache__/
Removing python/pyspark/sql/__pycache__/
Removing python/pyspark/sql/avro/__pycache__/
Removing python/pyspark/sql/pandas/__pycache__/
Removing python/pyspark/sql/tests/__pycache__/
Removing python/pyspark/streaming/__pycache__/
Removing python/pyspark/streaming/tests/__pycache__/
Removing python/pyspark/testing/__pycache__/
Removing python/pyspark/tests/__pycache__/
Removing python/target/
Removing python/test_coverage/__pycache__/
Removing python/test_support/__pycache__/
Removing repl/derby.log
Removing repl/metastore_db/
Removing repl/spark-warehouse/
Removing repl/target/
Removing resource-managers/kubernetes/core/target/
Removing resource-managers/kubernetes/core/temp-secret/
Removing resource-managers/kubernetes/integration-tests/target/
Removing resource-managers/kubernetes/integration-tests/tests/__pycache__/
Removing resource-managers/mesos/target/
Removing resource-managers/yarn/target/
Removing scalastyle-on-compile.generated.xml
Removing spark-warehouse/
Removing sql/__pycache__/
Removing sql/catalyst/fake/
Removing sql/catalyst/spark-warehouse/
Removing sql/catalyst/target/
Removing sql/core/spark-warehouse/
Removing sql/core/src/test/resources/__pycache__/
Removing sql/core/target/
Removing sql/hive-thriftserver/derby.log
Removing sql/hive-thriftserver/metastore_db/
Removing sql/hive-thriftserver/spark-warehouse/
Removing sql/hive-thriftserver/spark_derby/
Removing sql/hive-thriftserver/target/
Removing sql/hive/derby.log
Removing sql/hive/metastore_db/
Removing sql/hive/src/test/resources/data/scripts/__pycache__/
Removing sql/hive/target/
Removing streaming/checkpoint/
Removing streaming/target/
Removing target/
Removing tools/target/
Removing work/
+++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/install-dev.sh
++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R
++ pwd
+ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R
+ LIB_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/lib
+ mkdir -p /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/lib
+ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/find-r.sh
++ '[' -z '' ']'
++ '[' '!' -z '' ']'
+++ command -v R
++ '[' '!' /usr/bin/R ']'
++++ which R
+++ dirname /usr/bin/R
++ R_SCRIPT_PATH=/usr/bin
++ echo 'Using R_SCRIPT_PATH = /usr/bin'
Using R_SCRIPT_PATH = /usr/bin
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/create-rd.sh
++ set -o pipefail
++ set -e
++++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/create-rd.sh
+++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R
++ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R
++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ /usr/bin/Rscript -e ' if(requireNamespace("devtools", quietly=TRUE)) { setwd("/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R"); devtools::document(pkg="./pkg", roclets="rd") }'
Updating SparkR documentation
First time using roxygen2. Upgrading automatically...
Loading SparkR
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
Writing structType.Rd
Writing print.structType.Rd
Writing structField.Rd
Writing print.structField.Rd
Writing summarize.Rd
Writing alias.Rd
Writing arrange.Rd
Writing as.data.frame.Rd
Writing cache.Rd
Writing checkpoint.Rd
Writing coalesce.Rd
Writing collect.Rd
Writing columns.Rd
Writing coltypes.Rd
Writing count.Rd
Writing cov.Rd
Writing corr.Rd
Writing createOrReplaceTempView.Rd
Writing cube.Rd
Writing dapply.Rd
Writing dapplyCollect.Rd
Writing gapply.Rd
Writing gapplyCollect.Rd
Writing describe.Rd
Writing distinct.Rd
Writing drop.Rd
Writing dropDuplicates.Rd
Writing nafunctions.Rd
Writing dtypes.Rd
Writing explain.Rd
Writing except.Rd
Writing exceptAll.Rd
Writing filter.Rd
Writing first.Rd
Writing groupBy.Rd
Writing hint.Rd
Writing insertInto.Rd
Writing intersect.Rd
Writing intersectAll.Rd
Writing isLocal.Rd
Writing isStreaming.Rd
Writing limit.Rd
Writing localCheckpoint.Rd
Writing merge.Rd
Writing mutate.Rd
Writing orderBy.Rd
Writing persist.Rd
Writing printSchema.Rd
Writing registerTempTable-deprecated.Rd
Writing rename.Rd
Writing repartition.Rd
Writing repartitionByRange.Rd
Writing sample.Rd
Writing rollup.Rd
Writing sampleBy.Rd
Writing saveAsTable.Rd
Writing take.Rd
Writing write.df.Rd
Writing write.jdbc.Rd
Writing write.json.Rd
Writing write.orc.Rd
Writing write.parquet.Rd
Writing write.stream.Rd
Writing write.text.Rd
Writing schema.Rd
Writing select.Rd
Writing selectExpr.Rd
Writing showDF.Rd
Writing subset.Rd
Writing summary.Rd
Writing union.Rd
Writing unionAll.Rd
Writing unionByName.Rd
Writing unpersist.Rd
Writing with.Rd
Writing withColumn.Rd
Writing withWatermark.Rd
Writing randomSplit.Rd
Writing broadcast.Rd
Writing columnfunctions.Rd
Writing between.Rd
Writing cast.Rd
Writing endsWith.Rd
Writing startsWith.Rd
Writing column_nonaggregate_functions.Rd
Writing otherwise.Rd
Writing over.Rd
Writing eq_null_safe.Rd
Writing withField.Rd
Writing dropFields.Rd
Writing partitionBy.Rd
Writing rowsBetween.Rd
Writing rangeBetween.Rd
Writing windowPartitionBy.Rd
Writing windowOrderBy.Rd
Writing column_datetime_diff_functions.Rd
Writing column_aggregate_functions.Rd
Writing column_collection_functions.Rd
Writing column_ml_functions.Rd
Writing column_string_functions.Rd
Writing column_misc_functions.Rd
Writing avg.Rd
Writing column_math_functions.Rd
Writing column.Rd
Writing column_window_functions.Rd
Writing column_datetime_functions.Rd
Writing column_avro_functions.Rd
Writing last.Rd
Writing not.Rd
Writing fitted.Rd
Writing predict.Rd
Writing rbind.Rd
Writing spark.als.Rd
Writing spark.bisectingKmeans.Rd
Writing spark.fmClassifier.Rd
Writing spark.fmRegressor.Rd
Writing spark.gaussianMixture.Rd
Writing spark.gbt.Rd
Writing spark.glm.Rd
Writing spark.isoreg.Rd
Writing spark.kmeans.Rd
Writing spark.kstest.Rd
Writing spark.lda.Rd
Writing spark.logit.Rd
Writing spark.mlp.Rd
Writing spark.naiveBayes.Rd
Writing spark.decisionTree.Rd
Writing spark.randomForest.Rd
Writing spark.survreg.Rd
Writing spark.svmLinear.Rd
Writing spark.fpGrowth.Rd
Writing spark.prefixSpan.Rd
Writing spark.powerIterationClustering.Rd
Writing spark.lm.Rd
Writing write.ml.Rd
Writing awaitTermination.Rd
Writing isActive.Rd
Writing lastProgress.Rd
Writing queryName.Rd
Writing status.Rd
Writing stopQuery.Rd
Writing print.jobj.Rd
Writing show.Rd
Writing substr.Rd
Writing match.Rd
Writing GroupedData.Rd
Writing pivot.Rd
Writing SparkDataFrame.Rd
Writing storageLevel.Rd
Writing toJSON.Rd
Writing nrow.Rd
Writing ncol.Rd
Writing dim.Rd
Writing head.Rd
Writing join.Rd
Writing crossJoin.Rd
Writing attach.Rd
Writing str.Rd
Writing histogram.Rd
Writing getNumPartitions.Rd
Writing sparkR.conf.Rd
Writing sparkR.version.Rd
Writing createDataFrame.Rd
Writing read.json.Rd
Writing read.orc.Rd
Writing read.parquet.Rd
Writing read.text.Rd
Writing sql.Rd
Writing tableToDF.Rd
Writing read.df.Rd
Writing read.jdbc.Rd
Writing read.stream.Rd
Writing WindowSpec.Rd
Writing createExternalTable-deprecated.Rd
Writing createTable.Rd
Writing cacheTable.Rd
Writing uncacheTable.Rd
Writing clearCache.Rd
Writing dropTempTable-deprecated.Rd
Writing dropTempView.Rd
Writing tables.Rd
Writing tableNames.Rd
Writing currentDatabase.Rd
Writing setCurrentDatabase.Rd
Writing listDatabases.Rd
Writing listTables.Rd
Writing listColumns.Rd
Writing listFunctions.Rd
Writing recoverPartitions.Rd
Writing refreshTable.Rd
Writing refreshByPath.Rd
Writing spark.addFile.Rd
Writing spark.getSparkFilesRootDirectory.Rd
Writing spark.getSparkFiles.Rd
Writing spark.lapply.Rd
Writing setLogLevel.Rd
Writing setCheckpointDir.Rd
Writing unresolved_named_lambda_var.Rd
Writing create_lambda.Rd
Writing invoke_higher_order_function.Rd
Writing install.spark.Rd
Writing sparkR.callJMethod.Rd
Writing sparkR.callJStatic.Rd
Writing sparkR.newJObject.Rd
Writing LinearSVCModel-class.Rd
Writing LogisticRegressionModel-class.Rd
Writing MultilayerPerceptronClassificationModel-class.Rd
Writing NaiveBayesModel-class.Rd
Writing FMClassificationModel-class.Rd
Writing BisectingKMeansModel-class.Rd
Writing GaussianMixtureModel-class.Rd
Writing KMeansModel-class.Rd
Writing LDAModel-class.Rd
Writing PowerIterationClustering-class.Rd
Writing FPGrowthModel-class.Rd
Writing PrefixSpan-class.Rd
Writing ALSModel-class.Rd
Writing AFTSurvivalRegressionModel-class.Rd
Writing GeneralizedLinearRegressionModel-class.Rd
Writing IsotonicRegressionModel-class.Rd
Writing LinearRegressionModel-class.Rd
Writing FMRegressionModel-class.Rd
Writing glm.Rd
Writing KSTest-class.Rd
Writing GBTRegressionModel-class.Rd
Writing GBTClassificationModel-class.Rd
Writing RandomForestRegressionModel-class.Rd
Writing RandomForestClassificationModel-class.Rd
Writing DecisionTreeRegressionModel-class.Rd
Writing DecisionTreeClassificationModel-class.Rd
Writing read.ml.Rd
Writing sparkR.session.stop.Rd
Writing sparkR.init-deprecated.Rd
Writing sparkRSQL.init-deprecated.Rd
Writing sparkRHive.init-deprecated.Rd
Writing sparkR.session.Rd
Writing sparkR.uiWebUrl.Rd
Writing setJobGroup.Rd
Writing clearJobGroup.Rd
Writing cancelJobGroup.Rd
Writing setJobDescription.Rd
Writing setLocalProperty.Rd
Writing getLocalProperty.Rd
Writing crosstab.Rd
Writing freqItems.Rd
Writing approxQuantile.Rd
Writing StreamingQuery.Rd
Writing hashCode.Rd
+ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/lib /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/pkg/
* installing *source* package ‘SparkR’ ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (SparkR)
+ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/lib
+ jar cfM /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/R/lib/sparkr.zip SparkR
+ popd
[info] Using build tool sbt with profiles -Phadoop-2.7 under environment amplab_jenkins
[info] Found the following changed modules: root
[info] Setup the following environment variables for tests:

========================================================================
Running Apache RAT checks
========================================================================
Attempting to fetch rat
RAT checks passed.

========================================================================
Running Scala style checks
========================================================================
[info] Checking Scala style using SBT with these profiles: -Phadoop-2.7 -Pmesos -Phadoop-cloud -Pdocker-integration-tests -Pspark-ganglia-lgpl -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl -Pkubernetes
Scalastyle checks passed.

========================================================================
Running Python style checks
========================================================================
starting python compilation test...
python compilation succeeded.
The python3 -m black command was not found. Skipping black checks for now.
starting flake8 test...
flake8 checks passed.
The mypy command was not found. Skipping for now.
all lint-python tests passed!

========================================================================
Running R style checks
========================================================================
Loading required namespace: SparkR
Loading required namespace: lintr
lintr checks passed.

========================================================================
Building Spark
========================================================================
[info] Building Spark using SBT with these arguments: -Phadoop-2.7 -Pmesos -Phadoop-cloud -Pdocker-integration-tests -Pspark-ganglia-lgpl -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl -Pkubernetes test:package streaming-kinesis-asl-assembly/assembly
Using /usr/java/latest as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_282)
[info] loading settings for project spark-master-test-sbt-hadoop-2-7-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project
[info] resolving key references (36387 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/)
[warn] there are 210 keys that are not used by any other settings/tasks:
[warn]
[warn] * assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * avro / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * avro / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * avro / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * avro / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * avro / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * avro / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * catalyst / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * catalyst / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * catalyst / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * catalyst / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * catalyst / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * catalyst / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * core / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * core / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * core / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * core / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * core / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * core / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * docker-integration-tests / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * docker-integration-tests / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * docker-integration-tests / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * docker-integration-tests / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * docker-integration-tests / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * docker-integration-tests / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * examples / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * examples / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * examples / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * examples / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * examples / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * examples / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * ganglia-lgpl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * ganglia-lgpl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * ganglia-lgpl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * ganglia-lgpl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * ganglia-lgpl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * ganglia-lgpl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * graphx / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * graphx / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * graphx / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * graphx / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * graphx / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * graphx / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * hadoop-cloud / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * hadoop-cloud / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * hadoop-cloud / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * hadoop-cloud / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * hadoop-cloud / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * hadoop-cloud / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * hive / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * hive / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * hive / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * hive / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * hive / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * hive / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * hive-thriftserver / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * hive-thriftserver / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * hive-thriftserver / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * hive-thriftserver / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * hive-thriftserver / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * hive-thriftserver / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * kubernetes / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * kubernetes / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * kubernetes / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * kubernetes / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * kubernetes / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * kubernetes / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * kvstore / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * kvstore / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * kvstore / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * kvstore / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * kvstore / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * kvstore / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * launcher / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * launcher / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * launcher / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * launcher / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * launcher / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * launcher / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * mesos / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * mesos / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * mesos / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * mesos / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * mesos / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * mesos / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * mllib / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * mllib / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * mllib / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * mllib / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * mllib / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * mllib / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * mllib-local / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * mllib-local / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * mllib-local / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * mllib-local / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * mllib-local / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * mllib-local / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * network-common / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * network-common / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * network-common / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * network-common / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * network-common / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * network-common / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * network-shuffle / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * network-shuffle / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * network-shuffle / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * network-shuffle / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * network-shuffle / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * network-shuffle / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * network-yarn / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * network-yarn / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * network-yarn / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * network-yarn / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * network-yarn / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * network-yarn / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * repl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * repl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * repl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * repl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * repl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * repl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * sketch / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * sketch / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * sketch / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * sketch / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * sketch / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * sketch / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * spark / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * spark / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * spark / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * spark / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * spark / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * spark / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * sql / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * sql / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * sql / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * sql / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * sql / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * sql / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * sql-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * sql-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * sql-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * sql-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * sql-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * sql-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kafka-0-10-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kinesis-asl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kinesis-asl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kinesis-asl-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * tags / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * tags / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * tags / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * tags / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * tags / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * tags / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * token-provider-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * token-provider-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * token-provider-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * token-provider-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * token-provider-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * token-provider-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * tools / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * tools / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * tools / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * tools / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * tools / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * tools / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * unsafe / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * unsafe / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * unsafe / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * unsafe / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * unsafe / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * unsafe / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * yarn / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * yarn / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * yarn / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * yarn / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * yarn / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * yarn / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn]
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
[warn] sbt 0.13 shell syntax is deprecated; use slash syntax instead: kvstore / Test / package, mllib / Test / package, tools / Test / package, graphx / Test / package, docker-integration-tests / Test / package, hive / Test / package, unsafe / Test / package, ganglia-lgpl / Test / package, sql-kafka-0-10 / Test / package, avro / Test / package, network-shuffle / Test / package, network-yarn / Test / package, mllib-local / Test / package, assembly / Test / package, launcher / Test / package, streaming / Test / package, kubernetes / Test / package, mesos / Test / package, sql / Test / package, sketch / Test / package, core / Test / package, catalyst / Test / package, repl / Test / package, streaming-kafka-0-10-assembly / Test / package, streaming-kinesis-asl / Test / package, hive-thriftserver / Test / package, token-provider-kafka-0-10 / Test / package, network-common / Test / package, streaming-kafka-0-10 / Test / package, examples / Test / package, hadoop-cloud / Test / package, yarn / Test / package, tags / Test / package, streaming-kinesis-asl-assembly / Test / package, Test / package
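For reference, the deprecated 0.13 form behind this warning is the colon syntax that appears in the build arguments above (test:package); it maps onto the slash syntax the warning recommends as follows (sbt shell commands shown for illustration only; the colon forms do not appear elsewhere in this log):

    sbt 0.13 syntax (deprecated):   test:package        kvstore/test:package
    sbt 1.x slash syntax:           Test / package      kvstore / Test / package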
[info] compiling 2 Scala sources and 8 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/tags/target/scala-2.12/classes ...
[info] compiling 1 Scala source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/tools/target/scala-2.12/classes ...
[info] compiling 85 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/network-common/target/scala-2.12/classes ...
[info] done compiling
[info] done compiling
[info] compiling 9 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/sketch/target/scala-2.12/classes ...
[info] compiling 12 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/kvstore/target/scala-2.12/classes ...
[info] compiling 21 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/launcher/target/scala-2.12/classes ...
[info] compiling 50 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/network-shuffle/target/scala-2.12/classes ...
[info] compiling 8 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/tags/target/scala-2.12/test-classes ...
[info] compiling 18 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/unsafe/target/scala-2.12/classes ...
[info] done compiling
[info] compiling 26 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/network-common/target/scala-2.12/test-classes ...
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:22:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] import sun.misc.Unsafe;
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:28:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] private static final Unsafe _UNSAFE;
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:150:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] sun.misc.Unsafe unsafe;
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:152:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] Field unsafeField = Unsafe.class.getDeclaredField("theUnsafe");
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:154:1: Unsafe is internal proprietary API and may be removed in a future release
[warn] unsafe = (sun.misc.Unsafe) unsafeField.get(null);
[warn] ^
[warn] 5 warnings
[info] done compiling
[info] done compiling
[info] compiling 3 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/sketch/target/scala-2.12/test-classes ...
[info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/launcher/src/main/java/org/apache/spark/launcher/LauncherServer.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] done compiling
[info] done compiling
[info] compiling 12 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/kvstore/target/scala-2.12/test-classes ...
[info] done compiling
[info] compiling 7 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/launcher/target/scala-2.12/test-classes ...
[info] compiling 1 Scala source and 6 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/unsafe/target/scala-2.12/test-classes ...
[info] done compiling
[info] compiling 3 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/network-yarn/target/scala-2.12/classes ...
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:105:1: [unchecked] unchecked conversion
[warn] Iterator<ManagedBuffer> buffers = Mockito.mock(Iterator.class);
[warn] ^
[warn]   required: Iterator<ManagedBuffer>
[warn]   found:    Iterator
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:111:1: [unchecked] unchecked conversion
[warn] Iterator<ManagedBuffer> buffers2 = Mockito.mock(Iterator.class);
[warn] ^
[warn]   required: Iterator<ManagedBuffer>
[warn]   found:    Iterator
[warn] 2 warnings
[info] compiling 568 Scala sources and 104 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/scala-2.12/classes ...
[info] done compiling
[info] done compiling
[info] compiling 18 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/common/network-shuffle/target/scala-2.12/test-classes ...
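The two [unchecked] warnings in OneForOneStreamManagerSuite.java above come from mocking the generic Iterator<ManagedBuffer> interface through its raw Class object, which the compiler cannot type-check. A minimal sketch of the usual remedy, assuming only Mockito on the classpath; Item is a hypothetical stand-in for ManagedBuffer, and this code is illustrative rather than taken from the Spark sources:

    import java.util.Iterator;
    import org.mockito.Mockito;

    class TypedMockSketch {
        interface Item {}  // hypothetical stand-in for ManagedBuffer

        // Mockito.mock(Class) can only return a raw Iterator, so the cast is
        // unavoidable; the annotation scopes the suppression to this method.
        @SuppressWarnings("unchecked")
        static Iterator<Item> mockBuffers() {
            return (Iterator<Item>) Mockito.mock(Iterator.class);
        }
    }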
[info] done compiling [info] done compiling [info] done compiling [info] done compiling [info] done compiling [info] compiling 5 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/mllib-local/target/scala-2.12/classes ... [info] done compiling [info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/src/main/java/org/apache/spark/SparkFirehoseListener.java uses or overrides a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 38 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/graphx/target/scala-2.12/classes ... [info] compiling 1 Scala source and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/spark-ganglia-lgpl/target/scala-2.12/classes ... [info] compiling 104 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/streaming/target/scala-2.12/classes ... [info] compiling 5 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kafka-0-10-token-provider/target/scala-2.12/classes ... [info] compiling 47 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/resource-managers/kubernetes/core/target/scala-2.12/classes ... [info] compiling 367 Scala sources and 168 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/catalyst/target/scala-2.12/classes ... [info] compiling 20 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/resource-managers/mesos/target/scala-2.12/classes ... [info] compiling 25 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/resource-managers/yarn/target/scala-2.12/classes ... [info] compiling 314 Scala sources and 29 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/scala-2.12/test-classes ... [info] done compiling [info] done compiling [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] done compiling [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] done compiling [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 11 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kinesis-asl/target/scala-2.12/classes ... [info] compiling 10 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kafka-0-10/target/scala-2.12/classes ... [info] done compiling [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java:157:1: [unchecked] unchecked method invocation: method union in class JavaStreamingContext is applied to given types [warn] unionStreams = jssc.union(streamsList.toArray(new JavaDStream[0])); [warn] ^ required: JavaDStream<T>[] [warn] found: JavaDStream[] [warn] where T is a type-variable: [warn] T extends Object declared in method <T>union(JavaDStream<T>...) 
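The "uses or overrides a deprecated API" notes (LauncherServer.java earlier, SparkFirehoseListener.java above) are javac's one-line summary; recompiling with -Xlint:deprecation prints the individual warnings. A minimal sketch, with hypothetical class names, that produces the same note:

    // Overriding or calling a @Deprecated member is what javac summarizes as
    // "uses or overrides a deprecated API".
    class LegacyApi {
        @Deprecated
        void onEvent() { }
    }

    public class Listener extends LegacyApi {
        @Override
        void onEvent() { }   // overriding a deprecated method triggers the note

        public static void main(String[] args) {
            new Listener().onEvent();   // so does calling it
        }
    }

    // javac Listener.java               -> summary note only
    // javac -Xlint:deprecation Listener.java -> per-site warnings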
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java:157:1: [unchecked] unchecked conversion [warn] unionStreams = jssc.union(streamsList.toArray(new JavaDStream[0])); [warn] ^ required: JavaDStream<T>[] [warn] found: JavaDStream[] [warn] where T is a type-variable: [warn] T extends Object declared in method <T>union(JavaDStream<T>...) [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 19 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/graphx/target/scala-2.12/test-classes ... [info] compiling 38 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/resource-managers/kubernetes/core/target/scala-2.12/test-classes ... [info] compiling 6 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kafka-0-10-token-provider/target/scala-2.12/test-classes ... [info] compiling 11 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/resource-managers/mesos/target/scala-2.12/test-classes ... [info] compiling 22 Scala sources and 3 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/resource-managers/yarn/target/scala-2.12/test-classes ... [info] compiling 11 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/mllib-local/target/scala-2.12/test-classes ... [info] compiling 41 Scala sources and 9 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/streaming/target/scala-2.12/test-classes ... [info] done compiling [info] done compiling [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsAtomicPartitionManagement.java:55:1: [unchecked] unchecked method invocation: method createPartitions in interface SupportsAtomicPartitionManagement is applied to given types [warn] createPartitions(new InternalRow[]{ident}, new Map[]{properties}); [warn] ^ required: InternalRow[],Map<String,String>[] [warn] found: InternalRow[],Map[] [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsAtomicPartitionManagement.java:55:1: [unchecked] unchecked conversion [warn] createPartitions(new InternalRow[]{ident}, new Map[]{properties}); [warn] ^ required: Map<String,String>[] [warn] found: Map[] [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/catalyst/src/main/java/org/apache/spark/sql/util/NumericHistogram.java:186:1: [unchecked] unchecked method invocation: method sort in class Collections is applied to given types [warn] Collections.sort(tmp_bins); [warn] ^ required: List<T> [warn] found: ArrayList<Coord> [warn] where T is a type-variable: [warn] T extends Comparable<? super T> declared in method <T>sort(List<T>) [warn] 3 warnings [info] done compiling [info] done compiling [info] done compiling [info] done compiling [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 302 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/catalyst/target/scala-2.12/test-classes ... 
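The [unchecked] warnings for jssc.union(streamsList.toArray(new JavaDStream[0])) and createPartitions(new InternalRow[]{ident}, new Map[]{properties}) share one cause: a raw array (JavaDStream[], Map[]) is passed where a generic array (JavaDStream<T>[], Map<String,String>[]) is required, and erasure prevents javac from checking the element type. A self-contained analogue, with hypothetical names Box and merge:

    import java.util.ArrayList;
    import java.util.List;

    public class UncheckedVarargs {
        static class Box<T> { }

        // Generic varargs method, analogous to <T>union(JavaDStream<T>...).
        @SafeVarargs
        static <T> Box<T> merge(Box<T>... boxes) {
            return boxes.length > 0 ? boxes[0] : new Box<T>();
        }

        public static void main(String[] args) {
            List<Box<String>> list = new ArrayList<>();
            list.add(new Box<String>());
            // toArray(new Box[0]) yields a raw Box[]; the call compiles, but
            // javac reports "unchecked method invocation" and "unchecked
            // conversion", mirroring the warnings in the log.
            Box<String> merged = merge(list.toArray(new Box[0]));
            System.out.println(merged != null);
        }
    }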
[info] compiling 544 Scala sources and 71 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/core/target/scala-2.12/classes ... [info] done compiling [info] compiling 6 Scala sources and 4 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kafka-0-10/target/scala-2.12/test-classes ... [info] compiling 8 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kinesis-asl/target/scala-2.12/test-classes ... [info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kinesis-asl/src/test/java/org/apache/spark/streaming/kinesis/JavaKinesisInputDStreamBuilderSuite.java uses or overrides a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] done compiling [info] done compiling [info] done compiling [info] compiling 31 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kafka-0-10-sql/target/scala-2.12/classes ... [info] compiling 4 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/repl/target/scala-2.12/classes ... [info] compiling 29 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/scala-2.12/classes ... [info] compiling 327 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/mllib/target/scala-2.12/classes ... [info] compiling 18 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/avro/target/scala-2.12/classes ... [info] done compiling [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:55:1: [unchecked] unchecked call to SparkAvroKeyRecordWriter(Schema,GenericData,CodecFactory,OutputStream,int,Map<String,String>) as a member of the raw type SparkAvroKeyRecordWriter [warn] return new SparkAvroKeyRecordWriter( [warn] ^ [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:74:1: [unchecked] unchecked call to DataFileWriter(DatumWriter<D>) as a member of the raw type DataFileWriter [warn] this.mAvroFileWriter = new DataFileWriter(dataModel.createDatumWriter(writerSchema)); [warn] ^ where D is a type-variable: [warn] D extends Object declared in class DataFileWriter [info] done compiling [info] done compiling [info] done compiling [info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/main/java/org/apache/hadoop/hive/ql/io/orc/SparkOrcNewRecordReader.java uses or overrides a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] done compiling [info] compiling 27 Scala sources and 86 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive-thriftserver/target/scala-2.12/classes ... [info] Note: Some input files use or override a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 523 Scala sources and 47 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/core/target/scala-2.12/test-classes ... 
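The two SparkAvroKeyOutputFormat warnings above are raw-type constructor calls: new SparkAvroKeyRecordWriter(...) and new DataFileWriter(...) are invoked without type arguments, so the type variable D is erased and javac flags the call as unchecked. A self-contained analogue, with hypothetical names Writer and Sink:

    public class RawTypeCall {
        interface Sink<D> { void accept(D value); }

        static class Writer<D> {
            Writer(Sink<D> sink) { }
        }

        public static void main(String[] args) {
            Sink<String> sink = value -> System.out.println(value);
            // Raw 'new Writer(...)': compiles, but javac reports
            // "[unchecked] unchecked call to Writer(Sink<D>) as a member of
            // the raw type Writer"; 'new Writer<>(sink)' would be clean.
            Writer w = new Writer(sink);
            System.out.println(w != null);
        }
    }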
[info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 6 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/repl/target/scala-2.12/test-classes ... [info] compiling 204 Scala sources and 135 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/examples/target/scala-2.12/classes ... [info] done compiling [info] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/examples/src/main/java/org/apache/spark/examples/ml/JavaChiSqSelectorExample.java uses or overrides a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] Note: Some input files use or override a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 21 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/kafka-0-10-sql/target/scala-2.12/test-classes ... [info] compiling 205 Scala sources and 66 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/mllib/target/scala-2.12/test-classes ... [info] compiling 13 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/avro/target/scala-2.12/test-classes ... [info] compiling 117 Scala sources and 17 Java sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/scala-2.12/test-classes ... [info] compiling 20 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/external/docker-integration-tests/target/scala-2.12/test-classes ... 
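The recurring "multiple main classes detected" warning means sbt discovered more than one entry point in a module, so it cannot pick a default Main-Class for the packaged jar; 'show discoveredMainClasses' lists them, and 'run' prompts for a choice. A minimal sketch with hypothetical names, two entry points in one module:

    public class ToolA {
        public static void main(String[] args) { System.out.println("A"); }
    }

    class ToolB {
        public static void main(String[] args) { System.out.println("B"); }
    }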
[info] done compiling [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] done compiling [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:464:1: [unchecked] unchecked cast [warn] setLint((List<Integer>)value); [warn] ^ required: List<Integer> [warn] found: Object [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:472:1: [unchecked] unchecked cast [warn] setLString((List<String>)value); [warn] ^ required: List<String> [warn] found: Object [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:480:1: [unchecked] unchecked cast [warn] setLintString((List<IntString>)value); [warn] ^ required: List<IntString> [warn] found: Object [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:488:1: [unchecked] unchecked cast [warn] setMStringString((Map<String,String>)value); [warn] ^ required: Map<String,String> [warn] found: Object [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:749:1: [unchecked] unchecked call to read(TProtocol,T) as a member of the raw type IScheme [warn] schemes.get(iprot.getScheme()).getScheme().read(iprot, this); [warn] ^ where T is a type-variable: [warn] T extends TBase declared in interface IScheme [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:753:1: [unchecked] unchecked call to write(TProtocol,T) as a member of the raw type IScheme [warn] schemes.get(oprot.getScheme()).getScheme().write(oprot, this); [warn] ^ where T is a type-variable: [warn] T extends TBase declared in interface IScheme [warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:1027:1: [unchecked] getScheme() in ComplexTupleSchemeFactory implements <S>getScheme() in SchemeFactory [warn] public ComplexTupleScheme getScheme() { [warn] ^ return type requires unchecked conversion from ComplexTupleScheme to S [warn] where S is a type-variable: [warn] S extends IScheme declared in method <S>getScheme() [warn] Note: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/JavaDataFrameSuite.java uses or overrides a deprecated API. [warn] Note: Recompile with -Xlint:deprecation for details. [warn] 8 warnings [info] done compiling [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] compiling 18 Scala sources to /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive-thriftserver/target/scala-2.12/test-classes ... 
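The eight Complex.java warnings above are unchecked casts in Thrift-generated code: generic type arguments are erased at runtime, so a cast like (List<Integer>)value on an Object cannot be verified by the compiler, and a wrong element type would only surface later, at the point of use. A self-contained analogue with hypothetical names:

    import java.util.ArrayList;
    import java.util.List;

    public class UncheckedCast {
        private List<Integer> lint;

        @SuppressWarnings("unchecked")   // the usual remedy in generated code
        void setField(Object value) {
            // Without the suppression, javac reports "[unchecked] unchecked
            // cast; required: List<Integer>, found: Object", as in the log.
            lint = (List<Integer>) value;
        }

        public static void main(String[] args) {
            UncheckedCast c = new UncheckedCast();
            List<Integer> xs = new ArrayList<>();
            xs.add(42);
            c.setField(xs);
            System.out.println(c.lint.get(0));
        }
    }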
[info] done compiling [info] done compiling [success] Total time: 446 s (07:26), completed Dec 3, 2021 10:51:32 PM [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] Strategy 'discard' was applied to 2 files (Run the task at debug level to see details) [info] Strategy 'filterDistinctLines' was applied to 8 files (Run the task at debug level to see details) [info] Strategy 'first' was applied to 90 files (Run the task at debug level to see details) [warn] Ignored unknown package option FixedTimestamp(Some(1262304000000)) [success] Total time: 22 s, completed Dec 3, 2021 10:51:54 PM ======================================================================== Detecting binary incompatibilities with MiMa ======================================================================== [info] Detecting binary incompatibilities with MiMa using SBT with these profiles: -Phadoop-2.7 -Pmesos -Phadoop-cloud -Pdocker-integration-tests -Pspark-ganglia-lgpl -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl -Pkubernetes [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Strategy [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedQueryContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitReader [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.MainClassOptionParser [WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetReadState.RowRange [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ShuffleCleanedEvent [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.MiscellaneousProcessAdded [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SetupDriver Error instrumenting class:org.apache.spark.mapred.SparkHadoopMapRedUtil$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStatementBodyContext Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory$FixedLenByteArrayUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Hello [WARN] Unable to detect inner functions for 
class:org.apache.spark.security.CryptoStreamUtils.CryptoHelperChannel [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.SessionCatalogAndTable [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentListContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LegacyDecimalLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.SignalUtils.ActionHandler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleMultipartIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.api.r.BaseRRunner.ReaderIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentityTransformContext [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.HealthTracker.ExecutorFailureList [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedNameListContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuerySpecificationContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.DeleteColumn [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator27$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DateAccessor Error instrumenting class:org.apache.spark.sql.execution.streaming.StreamExecution$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.COMMITTED Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextWrite Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$FileTypes$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.SchemaPruning.RootField [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongWithRebaseUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.BucketSpecHelper [WARN] Unable to detect inner functions for 
class:org.apache.spark.deploy.master.ZooKeeperLeaderElectionAgent.LeadershipStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnsetTablePropertiesContext [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalValueContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Decoder [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged [WARN] Unable to detect inner functions for class:org.apache.spark.rdd.DefaultPartitionCoalescer.PartitionLocations [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.NullAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsHelper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Imputer.$$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.DatabaseInSessionCatalog [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SchemaHelper.SchemaWriter [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.Listener [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.TempFileBasedBlockStoreUpdater Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestWorkerState [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.Accessor [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillMerger.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockTransferor.BlockTransferStarter [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.ABORTED [WARN] Unable to detect inner 
functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate Error instrumenting class:org.apache.spark.sql.execution.datasources.PathGlobFilter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColumnReferenceContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermDefaultContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMatchingBlockIds [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerExecutorStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$3 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.MutableResourceInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.Data Error instrumenting class:org.apache.spark.input.StreamInputFormat [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator26$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.MergedBlockMetaReqHandler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetShufflePushMergerLocations [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$5 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.$$typecreator1$1 [WARN] Unable to detect 
inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$3 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator2$2 Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.IntegerWithRebaseUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerDecommissioner.ShuffleMigrationRunnable [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesType [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.DoubleUpdater Error instrumenting class:org.apache.spark.deploy.history.RollingEventLogFilesWriter$ [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$PartitionWriterStream [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.TaskIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractWindowExpressions [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CurrentLikeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShufflePushMergerLocation [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionColumnContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.PrefixCache [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearRegressionWrapper.LinearRegressionWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedRelationContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.EltCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.FreqSequence [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$4 Error instrumenting class:org.apache.spark.sql.execution.PartitionedFileUtil$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.CubeType [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatIsConflicted [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGroupingAnalytics [WARN] Unable to detect inner functions for 
class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineReader [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableObjectArray [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.DoubleHasher [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveDelegationTokens [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiUnitsIntervalContext [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.TrackerState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryTerminatedEvent [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySink.AddedData [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ParenthesizedExpressionContext Error instrumenting class:org.apache.spark.sql.execution.command.DataWritingCommand$ Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$ [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.Tracker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushRequest [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableProviderContext Error instrumenting class:org.apache.spark.sql.execution.command.DDLUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PeriodConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBytesContext [WARN] Unable to detect inner functions for class:org.apache.spark.api.r.BaseRRunner.WriterThread [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData [WARN] Unable to 
detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator16$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MergeIntoTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.expressions [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.EventFilter.FilterStatistics [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.StringLiteralCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerExecutorStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$7 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperReader Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.Division [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LocationSpecContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.FValueResult [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Tokenizer.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStatusResponse [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.$$typecreator1$2 Error instrumenting class:org.apache.spark.sql.execution.command.LoadDataCommand$ [WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.DefaultProfileExecutorResources [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveCatalogs.NonSessionCatalogAndTable Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV2Reader [WARN] Unable to detect inner functions for 
class:org.apache.spark.mllib.optimization.LBFGS.CostFun [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.Arguments [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.NGram.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyKeyContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.AsFunctionIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.COMMITTED [WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.ClientPool [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator1$1 Error instrumenting class:org.apache.spark.scheduler.SplitInfo$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.IfCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowNamespacesContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExtractContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.YearMonthIntervalType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.SparkBuildInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTablePartitionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SmallIntLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseMatrixPickler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$11 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DurationConverter Error instrumenting class:org.apache.spark.api.python.DoubleArrayWritable [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0 Error instrumenting class:org.apache.spark.ml.tuning.TrainValidationSplitModel$TrainValidationSplitModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.TransformHelper [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingGlobalLimitStrategy [WARN] Unable to detect inner functions for 
class:org.apache.spark.storage.BlockManagerMessages.DecommissionBlockManager [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator13$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.api.java.JavaUtils.SerializableMapWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesStore Error instrumenting class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnalysisErrorAt [WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Once [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RepairTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.FloatConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigest [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.expressions.NullOrdering.1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Std [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowConstructorContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.OptimizeMetadataOnlyQuery.PartitionedRelation [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.ImplicitTypeCasts [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AggregationClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WindowFunctionType.Python [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.AssociationRules.Rule [WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaReader [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedGroupConverter [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummaryAggregator Error instrumenting class:org.apache.spark.streaming.CheckpointWriter$CheckpointWriteHandler [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DayTimeIntervalDataTypeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowDefContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.DeferFetchRequestResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorIdentContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNewInstance [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.IntervalUtils.ParseState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DateConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingAnalyticsContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator29$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Aggregator [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$5 [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.IteratorForPartition [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotColumnContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.StageAttempt [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetQuantifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.DecommissionExecutorsOnHost [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator19$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.client.StandaloneAppClient.ClientEndpoint [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$2 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CtesContext [WARN] Unable to detect inner 
functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FallbackOnPushMergedFailureResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans.DslLogicalPlan
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveUserSpecifiedColumns
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.ReceiverTrackerEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.IdentityProjection
Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.$SortedIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.DaysWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedV1TableIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.SelectorModel.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AssignmentListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$MapIteratorWithKeyIndex
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.StandaloneResourceAllocation
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockTransferor.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.DateTimeOperations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.ByteUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SetupDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMatchingBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.output
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.CatalystTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StrictIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.Bitmaps
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.$ClientCallbackHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStatusResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.IntArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetReplicateInfoForRDDBlocks
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.YearMonthIntervalDataTypeContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetScan
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowCurrentNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$BlocksInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.StackCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConfigKeyContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.FilePartition$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportServer.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnPosition
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Count
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveLastAllocatedExecutorId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionFieldListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDecommissioning
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator11$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WindowFunctionType.SQL
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$RenameHelperMethods
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnNullability
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.FloatConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.BinaryPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PivotValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.InputFileBlockHolder.FileBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FailNativeCommandContext
Error instrumenting class:org.apache.spark.api.python.WriteInputFormatTestDataGenerator$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableSchemaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.ExecutorResourcesOrDefaults
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteSkippedQueue
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.EncryptedDownloadFile
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Cholesky
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.json.JsonFilters.JsonPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConfigValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$16
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.KolmogorovSmirnovTest.NullHypothesis
Error instrumenting class:org.apache.spark.deploy.master.ui.MasterWebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableAttemptInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
Error instrumenting class:org.apache.spark.sql.execution.datasources.DataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.RollupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator25$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableValuedFunctionContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonTable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator19$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayes.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.StopBlockManagerMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.HandleAnalysisOnlyCommand
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.NaturalKeys
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.EventFilter.FilterStatistics
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.InConversion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SchemaHelper.SchemaReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeFixedWidthAggregationMap.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetNamespaceLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Min
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator2$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextScan
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStoreProvider$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.WindowBoundType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.ColumnChange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.PromoteStrings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingDeduplicationStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.Window
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ReplaceTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableLongArray
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.EncryptedDownloadFile.$EncryptedDownloadWritableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumericLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveBinaryArithmetic
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionValContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcScan$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WhereClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator3$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.TextBasedFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionSummary.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$6
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.TransportFrameDecoder.Interceptor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.FixedLengthRowBasedKeyValueBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.LabeledPointPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.RemoveProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslAttr
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.LevelDBLogger
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OverlayContext
Error instrumenting class:org.apache.spark.ml.tuning.CrossValidatorModel$CrossValidatorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter.ElementConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleInsertQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
Error instrumenting class:org.apache.spark.sql.execution.datasources.ModifiedBeforeFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableFileStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreType
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.State
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDescNullsFirst
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TypeConstructorContext
Error instrumenting class:org.apache.spark.SSLOptions
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Append
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.ArrayDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcTable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$11
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableAliasContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Logit
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.SchemaPruning.RootField
Error instrumenting class:org.apache.spark.kafka010.KafkaDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.TextSocketContinuousStream.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ImplicitOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.MapZipWithCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedViewIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.AsTableIdentifier
Error instrumenting class:org.apache.spark.input.WholeTextFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveMissingReferences
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveRandomSeed
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnquotedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PrimitiveConverter
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV1Reader
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$LocalDiskShufflePartitionWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNaturalAndUsingJoin
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$16
Error instrumenting class:org.apache.spark.deploy.history.HistoryServer
Error instrumenting class:org.apache.spark.sql.execution.streaming.ManifestFileCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
Error instrumenting class:org.apache.spark.api.python.TestOutputKeyConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.jdbc.connection.SecureConnectionProvider.JDBCConfiguration
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.WidenSetOperationTypes
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.expressions.SortDirection.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedColTypeWithPositionListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.optimization.NNLS.Workspace
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.linalg.distributed.RowMatrix.$SVDMode$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockPusher.$BlockPushCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionFieldContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnitToUnitIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.MetadataColumnsHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.binaryfile.BinaryFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntervalYearAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IndexToString.$$typecreator1$4
Error instrumenting class:org.apache.spark.api.python.TestWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.RawStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SendHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetShufflePushMergerLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMax
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.NonCachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.LeftSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.Optimizer.OptimizeSubqueries
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntervalDayAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.StatFunctions.CovarianceCounter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RecoverPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToValuePair
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.ParquetOutputTimestampType
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetReadSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PolynomialExpansion.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.Hasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.CombinedTypeCoercionRule
Error instrumenting class:org.apache.spark.input.StreamBasedRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FloatLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$3
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskReaper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.STATE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LocalIndexEncoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.IntegralDivision
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockTransferor.$RetryingBlockTransferListener
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.SharedReadWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.IsExecutorAlive
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.AddMetadataColumns
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Replaced
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.StringType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetReplicateInfoForRDDBlocks
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionTransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.Deserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedColTypeWithPositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ExecutorDecommissioning
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.CreateStageResult
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.status.AppStatusListener.StageCompletionTime
Error instrumenting class:org.apache.spark.WritableConverter$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnsiNonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Imputer.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterInStandby
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.DatabaseDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.After
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.ChainedIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.IdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DereferenceContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.MultiLineCSVDataSource$
Error instrumenting class:org.apache.spark.deploy.security.HBaseDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.DriverEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedV1TableAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.TaskCommitMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntegerLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.LookupFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BooleanAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Complete
Error instrumenting class:org.apache.spark.input.StreamFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnalyzeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InstanceList.CountingRemoveIfForEach
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.TimestampConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowFunctionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DoubleAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StructContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.MultipartIdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.MapOutputTrackerMaster.MessageLoop
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformQuerySpecificationContext
Error instrumenting class:org.apache.spark.metrics.sink.PrometheusServlet
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.NonCachedKafkaDataConsumer
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicatedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.OneWayRpcCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationPrimaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteDirContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.$$typecreator1$2
Error instrumenting class:org.apache.spark.input.FixedLengthBinaryInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SortItemContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubquery
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$RpcChannelListener
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.ReceiverInputDStream.ReceiverRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.TypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkers
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnsiTypeCoercion.DateTimeOperations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Key
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.FValueResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.1
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetQuotedConfigurationContext
Error instrumenting class:org.apache.spark.sql.errors.QueryCompilationErrors$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.util.BytecodeUtils.MethodInvocationFinder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExponentLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.TimestampNTZType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.BooleanEquality
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.MergingSortWithSessionWindowStateIterator.SessionRowInformation
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ByteConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.$DigestCallbackHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinExec.OneSideHashJoiner
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.IntHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.ABORTED
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ByteType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$10
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.DecommissionExecutorsOnHost
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitioningUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.FixedLenByteArrayAsLongUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.trees.TreeNodeRef
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveDeserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBConf.ConfEntry
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVTable
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.ByteArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.BlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UncacheTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MatchedActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslSymbol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Max
Error instrumenting class:org.apache.spark.sql.internal.SharedState$
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.VectorizedParquetRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ElementwiseProduct.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingJoinStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ShuffleMetricsSource
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.InMemoryScans
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$9
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.text.TextTable
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableBlockLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierAlternativeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetPrimitiveField
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.CachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StructAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.Aggregation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.KolmogorovSmirnovTestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.WithCTEStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMemoryStatus
Error instrumenting class:org.apache.spark.ml.source.libsvm.LibSVMFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.AttributeSeq
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.$UPDATING$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableFileStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.ResolveJoinStrategyHints
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.UPDATING
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.TableHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExpressionSeqContext
Error instrumenting class:org.apache.spark.deploy.history.EventLogFileReader$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors
Error instrumenting class:org.apache.spark.sql.execution.datasources.DataSourceUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnType
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.RemovedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRow
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ErrorHandler.BlockFetchErrorHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.$PartitionWriterChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.util.HadoopFSUtils.SerializableBlockLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggAliasInGroupBy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.PushLeftSemiLeftAntiThroughJoin.AllowedJoin
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter.1
Error instrumenting class:org.apache.spark.deploy.rest.RestSubmissionServer
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.DecommissionBlockManagers
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDeBase.BasePickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator25$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.Cluster
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.Cluster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDecommissionSigReceived
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.SpecialLimits
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2n
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.PartitionOverwriteMode
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.TemporalSequenceImpl
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppShufflePartitionInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FlatMapGroupsWithStateExec.InputProcessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$17
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFailedFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppShuffleInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrowVectorAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingUnitToUnitIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedWindowContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.MapAccessor
Error instrumenting class:org.apache.spark.sql.execution.command.CommandUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearRegressionWrapper.LinearRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CacheTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.Shutdown
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Update
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.SpillableArrayIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverterFormatV2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.WindowBoundType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter.$KeyValueConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.StringArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StorageHandlerContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ReplaceTableHeaderContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRow
Error instrumenting class:org.apache.spark.input.Configurable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.1
Error instrumenting class:org.apache.spark.sql.execution.SparkScriptTransformationExec
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DataType.JSortedObject
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsFractional
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Expression
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$ChunkCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropFunctionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FrameBoundContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BooleanUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DoubleConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DurationConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetSchemaType
[WARN]
Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData [WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.$$typecreator1$2 Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetUtils$FileTypes [WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.DCT.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.StageAttempt [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator19$1 [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorNullsLast [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchBlockInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator5$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.TextInputCSVDataSource$ [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchBlockInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateViewContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveFunctions [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$1 [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.SortComparator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PeriodConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.IntegralSequenceImpl [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.SerializerBuildHelper.MapElementInformation [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.IsExecutorAlive [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.ConcurrentOutputWriterSpec [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StatefulAggregationStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperReader Error instrumenting class:org.apache.spark.ui.ProxyRedirectHandler$ResponseWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsSource.ExecutorMetricGauge Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.json.JsonScan [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.RatingPickler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetOperationContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.NoopMergedBlockMetaReqHandler [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.MessageDecoder.1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.rdd.DefaultPartitionCoalescer.partitionGroupOrdering [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.EpochMarker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStore.MaintenanceTask [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeColNameContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinTypeContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator6$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongAsMicrosRebaseUpdater Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.IsExecutorAlive [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.FunctionIdentifierHelper [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.TimestampTypes [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultipartIdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticBinaryContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$12 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableFileFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.ColumnReference [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExplainContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.FlattenStyle [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$CancellableFSDataOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupByClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$13 Error instrumenting class:org.apache.spark.ui.JettyUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.$KVSorterIterator [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillableIterator [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.implicits [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.r.FMRegressorWrapper.FMRegressorWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveOrdinalInOrderByAndGroupBy [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockDownloadFileManager.ReferenceWithCleanup [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data Error instrumenting class:org.apache.spark.input.FixedLengthBinaryRecordReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$2 [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.BlockStoreClient.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineWriter Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedWriteConfigUtil [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.StopAppClient [WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingWritableChannel [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.LongAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$4 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAlterTableCommands [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Identity [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.PeriodSequenceImpl [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.QueryExecution.debug [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingSetContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Normalizer.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Nominal$1$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ShortAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.TaskIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter [WARN] 
Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchedExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixture.$$typecreator5$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionPath$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.ShortUpdater Error instrumenting class:org.apache.spark.sql.execution.datasources.SchemaMergeUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.DefaultStep [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplV2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResourceContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.$Index [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.TextSocketContinuousStream.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.CatalogDatabaseHelper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.NGram.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HavingClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveTempViews [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampRebaseUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslExpression [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrayAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator1$1 [WARN] Unable to detect inner 
functions for class:org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.CreateStageResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Subscript [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveReferences [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyRDDWithPartitions [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDescNullsFirst [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile$ [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data Error instrumenting class:org.apache.spark.sql.catalyst.parser.ParserUtils$EnhancedLogicalPlan$ [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.ShuffleSorterIterator [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.HashComparator [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan Error instrumenting class:org.apache.spark.deploy.worker.ui.WorkerWebUI [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.GeneratorState [WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleWrite [WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseMatrixPickler Error instrumenting class:org.apache.spark.metrics.MetricsSystem [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseVectorPickler [WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsPoller.TCMP Error instrumenting class:org.apache.spark.status.api.v1.PrometheusResource$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator28$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNamespace [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplBase [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.RenameColumn [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata [WARN] Unable to detect inner 
functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.xml.UDFXPathUtil.ReusableStringReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StringLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.JsonRebaseRecord [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.ImplicitAttribute [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.util.Utils.Lock [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator26$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanValueContext Error instrumenting class:org.apache.spark.sql.execution.streaming.state.RocksDB$ Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetTable [WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryStartedEvent [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData [WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.TASK_END_REASON_FORMATTED_CLASS_NAMES Error instrumenting class:org.apache.spark.sql.execution.SparkScriptTransformationWriterThread$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.ColumnReference [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.FlatMapGroupsWithStateStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Type [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnalyzeTablesContext [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetClauseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CompleteRecovery [WARN] Unable to detect inner functions for class:org.apache.spark.graphx.impl.ShippableVertexPartition.ShippableVertexPartitionOpsConstructor [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.LimitMarker Error instrumenting class:org.apache.spark.sql.execution.streaming.FileContextBasedCheckpointFileManager [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.TargetNumUpdates [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator11$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByPercentileContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedReadStateStore [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcFiltersBase.OrcPrimitiveField [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeListContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator2$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.json.JsonFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubqueryColumnAliases [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$4 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BinaryType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gaussian [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowColumnsContext 
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.StoreVersion [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockSort [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleMetrics [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDA.LDAReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateData [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ShuffleCleanedEvent [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Log [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished [WARN] Unable to detect inner functions for class:org.apache.spark.sql.jdbc.MsSqlServerDialect.SpecificTypes Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSinkLog [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalNotContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PolynomialExpansion.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Binary$1$ [WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.ClassInfo [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedConverter [WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.RandomVertexCut [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment [WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteQueued [WARN] Unable to detect inner functions for 
class:org.apache.spark.unsafe.types.UTF8String.IntWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.OneForOneStreamManager.StreamState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.PartitionSpecsHelper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupByType [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.types.UTF8String.LongWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FileFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingRelationStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.ParserUtils.EnhancedLogicalPlan [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFooterReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkersOnHosts [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.PushLeftSemiLeftAntiThroughJoin.PushdownDirection [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierListContext Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetPrimitiveField [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueStore [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeListContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Unresolved$1$ [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableIntArray [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetNamespacePropertiesContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SortPrefixUtils.NoOpPrefixComparator Error instrumenting class:org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider 
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentSpecContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CheckForWorkerTimeOut [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetDecimalConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$MethodAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ReviveOffers [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Decoder [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTableColumnContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.Serializer [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetLongDictionaryAwareDecimalConverter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.OutputCommitCoordinatorEndpoint [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DoubleConverter [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.MergeShuffleFile [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ByteBufferBlockStoreUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LeastSquaresNESolver [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionNameContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate [WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator2$2 [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.ExecutorAllocationListener Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat Error instrumenting class:org.apache.spark.sql.execution.datasources.binaryfile.BinaryFileFormat$ Error instrumenting class:org.apache.spark.metrics.sink.MetricsServlet [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyListContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$5 [WARN] Unable 
to detect inner functions for class:org.apache.spark.sql.catalyst.util.DataTypeJsonUtils.DataTypeJsonDeserializer [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.DurationSequenceImpl [WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.SPARK_LISTENER_EVENT_FORMATTED_CLASS_NAMES [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Message [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$2 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus Error instrumenting class:org.apache.spark.ui.DelegatingServletContextHandler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NumNonZeros [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.NullIntolerant [WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ArrayConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$15 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FallbackOnPushMergedFailureResult Error instrumenting class:org.apache.spark.ml.tuning.CrossValidatorModel$CrossValidatorModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.$ManualCloseBufferedOutputStream$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.RebaseInfo [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.VariableLengthRowBasedKeyValueBatch.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.PushBlockStreamCallback [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.$$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.IntegerType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.WriteQueueResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Encoder [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.DCT.$$typecreator1$1 
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.QueryPlanningTracker.PhaseSummary [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Poisson [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShufflePushMergerLocation [WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.QuasiNewton [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer.Prefix [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructNullableTypeConverter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StringAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$8 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.OptimizeOneRowRelationSubquery.OneRowSubquery Error instrumenting class:org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$ Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStore$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.$$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySink.AddedData [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.CombinedTypeCoercionRule [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.CharType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.StandaloneResourceAllocation [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Probit [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ManagedBufferIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleMethodContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedLocalMetaFetchResult [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BinaryAccessor 
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecLocationContext [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ExecutorDecommissioning Error instrumenting class:org.apache.spark.sql.execution.streaming.StreamMetadata$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Encoder [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator9$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.RightSide Error instrumenting class:org.apache.spark.ui.ServerInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo Error instrumenting class:org.apache.spark.kafka010.KafkaTokenUtil$KafkaDelegationTokenIdentifier [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubstringContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.CosineSilhouette.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.TriggerThreadDump [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.Strings [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.WindowsSubstitution [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.DiskMapIterator [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.GradientBoostedTreesModel.SaveLoadV1_0 Error instrumenting class:org.apache.spark.sql.execution.datasources.NoopCache$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$1 [WARN] Unable to 
detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Decoder [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$3 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.CholeskySolver [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleMetrics.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.ConcatCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopDriver [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.JoinSelection [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryLists [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$6 [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LambdaContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BucketSpecContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.SourceFileRemover [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformClauseContext Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVWrite [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader.$$typecreator5$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.InternalSequenceBase [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.TypedPartValue [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$9 [WARN] Unable to detect inner functions for 
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.IDF.DocumentFrequencyAggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.TargetNumUpdates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DoubleLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.Block.InlineHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.SetProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNamespace
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.SerializationDebugger
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block
[WARN] Unable to detect inner functions for class:org.apache.spark.BarrierCoordinator.ContextBarrierState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.VarcharType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionCallContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.CalendarConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$14
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassReflection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugQuery
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableClausesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DynamicPartitionDataConcurrentWriter.WriterStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.RobustScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptedMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.SessionCatalogAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$Location
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ApplyTransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.LongArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeSkewedJoin.ShuffleStage
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RowUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableByteArray
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.OldData
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ByteBufferBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$5
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.DecommissionBlockManagers
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DayTimeIntervalType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LateralViewContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndMultipartIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryView
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedPrimitiveConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingMultiUnitsIntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ShortType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator2$1
Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Inverse
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Tokenizer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator23$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingGlobalLimitStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ClearCacheContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpilledFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.ObjectSerializerPruning.IsNullCondition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.StructTypePickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StringConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Sqrt
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongAsMicrosUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.LogicalPlanWithDatasetId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformArgumentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.plugin.PluginContextImpl.PluginMetricsSource
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.TypedPartValue
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedLocalMetaFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.1
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter$
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition1D
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator27$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NotMatchedActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.HistoryServerDiskManager.Lease
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockDownloadFileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.LocalDateConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.AddWebUIFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReconnectWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Deprecated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveRelations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.ResolveCoalesceHints
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition2D
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.OldData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TinyIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TrimContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.$SortState
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkersOnHosts
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BasicNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ColumnarBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetBinaryDictionaryAwareDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$14
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RegularQuerySpecificationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.BooleanConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.FamilyAndLink
[WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.CountMinSketch.Version
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.TempFileBasedBlockStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.dynalloc.ExecutorMonitor.ExecutorIdCollector
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.BaseErrorHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.1
Error instrumenting class:org.apache.spark.ml.tuning.TrainValidationSplitModel$TrainValidationSplitModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator5$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator1$2
Error instrumenting class:org.apache.spark.sql.catalyst.util.CompressionCodecs$
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskRunner
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableHeaderContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryProgressEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassMethods
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Numeric$1$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TransformContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveEncodersInUDF
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.DoublePrefixComparator
Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcColumnarBatchReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.SparkScripts
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LoadDataContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.QuotedStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Auto
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeNNZ
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DateType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WhenClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$3
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.Arguments
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkSubmitCommandBuilder.$OptionParser
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ManageResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.LongUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.Metadata$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcFiltersBase.OrcPrimitiveField
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTimeZoneContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RealIdentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AssignmentContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryBodyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.DecryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Sequence.InternalSequence
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslAttribute
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileSystemBasedCheckpointFileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.SimpleDownloadFile.$SimpleDownloadWritableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.SeenFilesMap
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampConvertTzUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatSerdeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$SetAccumulator
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.DeferFetchRequestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinCriteriaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropViewContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorModelReader
Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$Metadata
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StrictNonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$3
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.1
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSourceLog
[WARN] Unable to detect inner functions for class:org.apache.spark.util.random.StratifiedSamplingUtils.RandomDataGenerator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.LongType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.Data
Error instrumenting class:org.apache.spark.WritableFactory$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$StdChannelListener
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.AutoBatchedPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator26$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.DecommissionExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc
Error instrumenting class:org.apache.spark.ui.ServerInfo$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultipartIdentifierListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ElementwiseProduct.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RegexTokenizer.$$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.GetExecutorLossReason
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV1Writer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ShortConverter
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSink$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMean
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.JOB_RESULT_FORMATTED_CLASS_NAMES
Error instrumenting class:org.apache.spark.ui.WebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.KVStoreScalaSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimaryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator19$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.PlanStringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateManagerImplV1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticUnaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.CLogLog
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFailedFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.MiscellaneousProcessAdded
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckSuccess
Error instrumenting class:org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BooleanConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexer.CategoryStats
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowthModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpilledFile
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBucketContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SearchedCaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.RevokedLeadership
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColPositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.lib.SVDPlusPlus.Conf
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterColumnActionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByRowsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.CatalystDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.ElectedLeader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExistsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.RemoveProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.StringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterTableAlterColumnContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.csv.CSVScan
Error instrumenting class:org.apache.spark.executor.ExecutorSource
Error instrumenting class:org.apache.spark.sql.execution.datasources.FileFormatWriter$
[WARN] Unable to detect inner functions for class:org.apache.spark.TestUtils.JavaSourceFromString
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$2
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.MultiLineJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NullLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.HealthTracker.ExecutorFailureList.TaskId
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TruncateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Sum
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StarContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnsiTypeCoercion.GetDateFieldOperations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$18
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueRowConverterFormatV1
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.ReaderIterator
Error instrumenting class:org.apache.spark.sql.execution.datasources.ModifiedAfterFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.MetadataColumnHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.AddColumn
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.Builder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.RowPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveTables
Error instrumenting class:org.apache.spark.input.StreamRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UpdateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.RowComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.BasicOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedLDAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.PartitionTypeHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.BitmapArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$12
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.AbstractLauncher.ArgumentValidator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.DoubleIsConflicted
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.SerializerBuildHelper.MapElementInformation
Error instrumenting class:org.apache.spark.ml.source.image.ImageFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SimpleCaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.TimestampNTZConverter
Error instrumenting class:org.apache.spark.sql.catalyst.expressions.codegen.Block$InlineHelper$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowFrame
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NestedConstantListContext
Error instrumenting class:org.apache.spark.api.python.JavaToWritableConverter
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionDirectory$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryOrganizationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.FMClassificationModel.FMClassificationModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.FamilyAndLink
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkDirCleanup
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator21$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator10$3
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.ShuffleBlockPusher.PushResult
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.TextBasedFileScan
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.$SpillableIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator.$StreamBuffer
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.WriterThread
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$2
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.JsonRebaseRecord
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator12$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.ExecutorResourcesOrDefaults
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$7
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NNLSSolver
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.NettyUtils.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableLikeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.ByteArrayConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.Instrumentation.loggerTags
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.TableDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator30$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator2$3
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.FileInputDStream.FileInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SkewSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.GetExecutorLossReason
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator21$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DecommissionWorkers
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$3
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingInputStream
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator5$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Projection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator2$5
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeShuffleWithLocalRead.BroadcastJoinWithShuffleRight
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResetQuotedConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.RemoveAllHints
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherBackend.BackendConnection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetIntDictionaryAwareDecimalConverter
Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$EdgeData
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BooleanType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.TimerWithCustomTimeUnit.$SnapshotWithCustomTimeUnit
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.CatalogHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.JdbcRDD.ConnectionFactory
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveInsertInto
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.ExecutorMetricsPoller.TCMP
Error instrumenting class:org.apache.spark.sql.execution.streaming.CheckpointFileManager$RenameBasedFSDataOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$6
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.KVComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.Block.BlockHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.IntegerUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowOrder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.InstantConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Solver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.KolmogorovSmirnovTestResult
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyValueContext
Error instrumenting class:org.apache.spark.internal.io.FileCommitProtocol$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.QueryPlanningTracker.RuleSummary
Error instrumenting class:org.apache.spark.ui.ProxyRedirectHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockHandler.$ShuffleChunkManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.1
Error instrumenting class:org.apache.spark.sql.execution.datasources.PathFilterWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.AddWebUIFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTableColumnsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DmlStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Wildcard
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTablePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.parquet.ParquetScan$
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.TableIdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionBase.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.write.WriteBuilder.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnsiTypeCoercion.PromoteStringLiterals
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DeleteFromTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV2_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.ByteConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.FloatHasher
inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorNullsLast [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DynamicPartitionDataConcurrentWriter.WriterIndex Error instrumenting class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex$ [WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutput [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$ServerConnection [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Node [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslString Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$HDFSBackedStateStore Error instrumenting class:org.apache.spark.api.python.TestOutputValueConverter [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.RadixSortSupport [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.MapConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator4$3 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeRelationContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator.AliasedGenerator$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.analysis.DetectAmbiguousSelfJoin.AttrWithCast [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppShuffleMergePartitionsInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.FloatUpdater [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ErrorHandler.BlockPushErrorHandler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator1$1 Error instrumenting class:org.apache.spark.sql.execution.streaming.state.SchemaHelper$SchemaV2Writer [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.PythonEvals [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.DataSourceV2Implicits.OptionsHelper Error instrumenting class:org.apache.spark.sql.errors.QueryExecutionErrors$ [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementDefaultContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator3$6 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateNamespaceContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGenerate [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.FValueTest.$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0.$typecreator1$6 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReconnectWorker [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Binomial [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.ExternalAppendOnlyUnsafeRowArrayIterator [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator1$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedNameContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.StringToAttributeConversionHelper [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$4 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$ [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinRelationContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetFilters.ParquetSchemaType [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FirstContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator22$2 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableLocationContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.FlatMapGroupsWithStateExecHelper.StateData [WARN] Unable to detect inner functions for 
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator18$3
Error instrumenting class:org.apache.spark.sql.catalyst.streaming.WriteToStreamStatement$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalBinaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.BlockPushNonFatalFailure.ReturnCode
Error instrumenting class:org.apache.spark.SparkEnv$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ByteAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetStorageStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStoreOps
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.EdgeData$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubscriptContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.api.r.SQLUtils.RegexContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.PythonWorkerFactory.MonitorThread
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV2_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$13
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaDataConsumer.CachedKafkaDataConsumer
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator16$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator15$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator1$1
Error instrumenting class:org.apache.spark.deploy.history.EventLogFileWriter$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.LSHModel.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.ManualCloseOutputStream
Error instrumenting class:org.apache.spark.streaming.StreamingContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeWeightSum
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DecimalAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data$
Error instrumenting class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.PMMLLinearRegressionModelWriter.Data
Error instrumenting class:org.apache.spark.streaming.api.java.JavaStreamingContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerMasterStorageEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalBlockStoreClient.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator20$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.DecryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InstanceList
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$8
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.First
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.BasicNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.functions.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Metric
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.NullOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.status.AppStatusListener.StageCompletionTime
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.PythonForeachWriter.UnsafeRowBuffer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.MutableStateArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.PipedRDD.NotEqualsFileNameFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.DisableHints
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowViewsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.DumpByteCode
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTablePartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.HybridStore.SwitchToLevelDBListener
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Variance
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryClassificationSummary.$$typecreator6$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.BoundedWindow
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator29$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTempViewUsingContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BeginRecovery
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StringConverter
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.CSVFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.EpochMarkerGenerator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$FieldAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Power
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CommentNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.ValueAndMatchPair
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection.Schema
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDriverStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileStreamSourceCleaner
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.DummySerializerInstance.$1
Error instrumenting class:org.apache.spark.input.ConfigurableCombineFileRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NotMatchedClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Summary
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.CaseWhenCoercion
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcScan
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveOutputRelation
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinExec.OneSideHashJoiner.$AddingProcessedRowToStateCompletionIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.WindowInPandasExec.UnboundedWindow
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.JobScheduler.JobHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDecommissioning
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.HandleNullInputsForUDF
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Message.Type
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator28$1
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.impl.VertexPartition.VertexPartitionOpsConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.PushMergedRemoteMetaFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$DownloadCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.EmptyTaskCommitMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Normalizer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.TypeAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.Empty2Null
Error instrumenting class:org.apache.spark.sql.catalyst.expressions.codegen.Block$BlockHelper$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMetric
Error instrumenting class:org.apache.spark.sql.execution.streaming.SinkFileStatus$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.FMRegressionModel.FMRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.FunctionArgumentConversion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.HashingTFReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.UpdateColumnComment
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.MonitorThread
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FMClassifierWrapper.FMClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.GlobalAggregates
Error instrumenting class:org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ExecutorDecommissionSigReceived
[WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.SearchState
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.input
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.PowerIterationClustering.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RobustScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.InternalLinearRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.RowUpdater
Error instrumenting class:org.apache.spark.status.api.v1.ApiRootResource$
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleRead
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.Empty2Null
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowFrameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.UDTConverter
Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault1Context
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DynamicPartitionDataConcurrentWriter.WriterIndex
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator14$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ErrorCapturingIdentifierExtraContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.TimestampType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchedExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator9$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerDriverStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggregateFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault2Context
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.StructNullableTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReregisterWithMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.TimestampAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.logging.DriverLogger.DfsAsyncWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveSessionCatalog.ResolvedV1TableOrViewIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.SortComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.ReceiverSupervisor.ReceiverState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTablePartitionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.CryptoParams
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Stop
[WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.BloomFilter.Version
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.DataTypeJsonUtils.DataTypeJsonSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator24$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.V1Table.IdentifierHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.FixedLenByteArrayAsIntUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumberContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.MODE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.KeyWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.LongConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterViewQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator2$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.NullHypothesis
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$10
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFuncNameContext
Error instrumenting class:org.apache.spark.SparkContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.KolmogorovSmirnovTest.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NormalEquation
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.StreamFallbackChannelWrapper
Error instrumenting class:org.apache.spark.sql.execution.datasources.CodecStreams$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLContext.implicits
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator1$7
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.NonSessionCatalogAndIdentifier
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SelectClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.ConcurrentOutputWriterSpec
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.RebaseDateTime.RebaseInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.ArraySortLike.NullOrder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.FloatAsIfIntegral
Error instrumenting class:org.apache.spark.sql.execution.command.PathFilterIgnoreNonData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.RemovedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.DowncastLongUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigestSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayes.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.HashMapGrowthStrategy.Doubling
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromStmtContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.impl.RandomForest.NodeIndexInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleFunctionIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0.$typecreator1$5
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.IsExecutorAlive
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileStreamSourceCleaner
[WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.MetadataMismatchException
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.LookupCatalog.CatalogAndNamespace
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$MapIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.QuasiNewtonSolver.NormalEquationCostFun
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator16$2
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.HealthTracker.ExecutorFailureList.TaskId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Mean
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTablesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.BinaryToSQLTimestampConvertTzRebaseUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.aggregate.TungstenAggregationIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$15
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BeginRecovery
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.LongHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.MapKeyDedupPolicy
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.KryoSerializer.PoolWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestMasterState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MatchedClauseContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.UnivariateFeatureSelectorModel.UnivariateFeatureSelectorModelWriter.Data
Error instrumenting class:org.apache.spark.ui.SparkUI
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.WindowFrameCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VarianceThresholdSelectorModel.VarianceThresholdSelectorWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.DeclarativeAggregate.RichAttribute
Error instrumenting class:org.apache.spark.sql.catalyst.catalog.ExternalCatalogUtils$
Error instrumenting class:org.apache.spark.sql.execution.datasources.v2.orc.OrcWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FMClassifierWrapper.FMClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcShimUtils.VectorizedRowBatchWrap
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.$4
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV3_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator27$2
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.SetAppId
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeL1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBConf.ConfEntry
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DatasetUtils.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveUpCast
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolvePivot
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.InMemoryBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.CoarseGrainedExecutorBackend.RegisteredExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.FrequentItems.FreqItemCounter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.StringPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RemoteBlockPushResolver.AppPathsInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.InternalKMeansModelWriter.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.TableChange.ColumnPosition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GenericFileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableSerDeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.UnsignedLongUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.MyByteArrayOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowRefContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTblPropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTableColumnsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.UDTConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ShortConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.CatalogV2Implicits.NamespaceHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetStringConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.Event
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowth.FreqItemset
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.MergingSortWithSessionWindowStateIterator.SessionRowInformation
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.SelectorModel.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator17$3
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DmlStatementNoWithContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL2
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMin
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableApplicationInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeNamespaceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.ContinuousRecord
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.IntConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.resource.ResourceProfile.DefaultProfileExecutorResources
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingElementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsConflicted
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptedMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FMRegressorWrapper.FMRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.AppExecId
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IntConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.connector.catalog.NamespaceChange.SetProperty
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DecimalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator16$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator8$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.Data$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.$$typecreator1$3
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator3$2
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.StandaloneResourceUtils.MutableResourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigDecimalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RegexTokenizer.$$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.RocksDBStateStore.STATE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gamma
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.PMMLLinearRegressionModelWriter.Data
Error instrumenting class:org.apache.spark.input.WholeTextFileRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlockBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.StringUtils.StringConcat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator3$1
Error instrumenting class:org.apache.spark.internal.io.HadoopMapReduceCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.StringToColumn
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.Optimizer.UpdateCTERelationStats
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatDelimitedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.ValueAndMatchPair
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.SetState
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.ErrorHandlingReadableChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugStreamQuery
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator25$2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.continuous.ContinuousQueuedDataReader.DataReaderThread
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.LatchedTriggers
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.RandomForestModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.FloatAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.SpillableIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToValuePair
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueType
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ApplyCharTypePadding.AttrOrOuterRef
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator1$2
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpanModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.$typecreator13$3
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.LegacyBehaviorPolicy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimitiveDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Fixed
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.json.JsonFilters.JsonPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFunctionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.RowToColumnConverter.LongConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetVectorUpdaterFactory.UnsignedIntegerUpdater
Error instrumenting class:org.apache.spark.sql.execution.datasources.text.TextFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$4
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTableExtendedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ANOVATest.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowCreateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.adaptive.OptimizeShuffleWithLocalRead.BroadcastJoinWithShuffleLeft
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator5$1
Created : .generated-mima-class-excludes in current directory.
Created : .generated-mima-member-excludes in current directory.
Using /usr/java/latest as default JAVA_HOME. Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_282)
[info] loading settings for project spark-master-test-sbt-hadoop-2-7-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project
[info] resolving key references (36381 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/)
[warn] there are 210 keys that are not used by any other settings/tasks:
[warn]
[warn] The same six keys are flagged for each of 35 modules (assembly, avro, catalyst, core, docker-integration-tests, examples, ganglia-lgpl, graphx, hadoop-cloud, hive, hive-thriftserver, kubernetes, kvstore, launcher, mesos, mllib, mllib-local, network-common, network-shuffle, network-yarn, repl, sketch, spark, sql, sql-kafka-0-10, streaming, streaming-kafka-0-10, streaming-kafka-0-10-assembly, streaming-kinesis-asl, streaming-kinesis-asl-assembly, tags, token-provider-kafka-0-10, tools, unsafe, yarn); every location is in /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:
[warn] * <module> / Compile / checkstyle / javaSource +- SparkBuild.scala:1070
[warn] * <module> / M2r / publishMavenStyle +- SparkBuild.scala:299
[warn] * <module> / Sbt / publishMavenStyle +- SparkBuild.scala:300
[warn] * <module> / Test / checkstyle / javaSource +- SparkBuild.scala:1071
[warn] * <module> / scalaStyleOnCompile / logLevel +- SparkBuild.scala:191
[warn] * <module> / scalaStyleOnTest / logLevel +- SparkBuild.scala:192
[warn]
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
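The note above is sbt's own remedy for these warnings. A minimal sketch of both options, using an invented key (`demoKey` is hypothetical, purely for illustration):

    // build.sbt — hypothetical example of silencing lintUnused for one key.
    // A key that is only ever read by a custom command, so lintUnused flags it:
    val demoKey = settingKey[String]("demo setting").withRank(KeyRanks.Invisible)

    // Alternatively, keep the default rank and opt the key out explicitly:
    Global / excludeLintKeys += demoKey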
[info] spark-parent: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-tags: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-unsafe: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-kvstore: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-common: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-shuffle: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-network-yarn: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-tools: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-ganglia-lgpl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-yarn: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-catalyst: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-mesos: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-kubernetes: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-token-provider-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-streaming-kinesis-asl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-docker-integration-tests: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-avro: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-hadoop-cloud: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-sql-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-hive: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-streaming-kinesis-asl-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-hive-thriftserver: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-streaming-kafka-0-10-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] spark-examples: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-repl: mimaPreviousArtifacts not set, not analyzing binary compatibility
[info] spark-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
[success] Total time: 55 s, completed Dec 3, 2021 10:54:00 PM
[info] Building Spark assembly using SBT with these arguments: -Phadoop-2.7 -Pmesos -Phadoop-cloud -Pdocker-integration-tests -Pspark-ganglia-lgpl -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl -Pkubernetes assembly/package
Using /usr/java/latest as default JAVA_HOME. Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_282)
[info] loading settings for project spark-master-test-sbt-hadoop-2-7-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project
[info] resolving key references (36387 settings) ...
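Two recurring messages in the run above point at build settings rather than code. As a hedged, illustrative sbt fragment only (the baseline version and class name below are invented, not taken from Spark's build):

    // MiMa skips analysis until a baseline artifact is configured:
    mimaPreviousArtifacts := Set("org.apache.spark" %% "spark-core" % "3.2.0") // hypothetical baseline

    // Picking an explicit main class is one common response to the repeated
    // "multiple main classes detected" warning for run/packaging:
    Compile / mainClass := Some("org.example.Main") // hypothetical class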
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/) [warn] there are 210 keys that are not used by any other settings/tasks: [warn] [warn] * assembly / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * assembly / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * assembly / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * assembly / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * assembly / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * assembly / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * avro / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * avro / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * avro / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * avro / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * avro / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * avro / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * catalyst / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * catalyst / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * catalyst / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * catalyst / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * catalyst / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * catalyst / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * core / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * core / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * core / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * core / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * core / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * core / scalaStyleOnTest / logLevel [warn] +- 
/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * docker-integration-tests / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * docker-integration-tests / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * docker-integration-tests / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * docker-integration-tests / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * docker-integration-tests / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * docker-integration-tests / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * examples / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * examples / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * examples / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * examples / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * examples / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * examples / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * ganglia-lgpl / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * ganglia-lgpl / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * ganglia-lgpl / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * ganglia-lgpl / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * ganglia-lgpl / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * ganglia-lgpl / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * graphx / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * graphx / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * graphx / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * graphx / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * graphx / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * graphx / 
scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * hadoop-cloud / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * hadoop-cloud / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * hadoop-cloud / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * hadoop-cloud / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * hadoop-cloud / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * hadoop-cloud / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * hive / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * hive / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * hive / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * hive / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * hive / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * hive / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * hive-thriftserver / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * hive-thriftserver / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * hive-thriftserver / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * hive-thriftserver / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * hive-thriftserver / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * hive-thriftserver / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * kubernetes / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * kubernetes / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * kubernetes / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * kubernetes / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * kubernetes / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * kubernetes / 
scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * kvstore / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * kvstore / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * kvstore / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * kvstore / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * kvstore / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * kvstore / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * launcher / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * launcher / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * launcher / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * launcher / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * launcher / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * launcher / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * mesos / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * mesos / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * mesos / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * mesos / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * mesos / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * mesos / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * mllib / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * mllib / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * mllib / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * mllib / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * mllib / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * mllib / scalaStyleOnTest / logLevel [warn] +- 
/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * mllib-local / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * mllib-local / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * mllib-local / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * mllib-local / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * mllib-local / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * mllib-local / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * network-common / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * network-common / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * network-common / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * network-common / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * network-common / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * network-common / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * network-shuffle / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * network-shuffle / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * network-shuffle / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * network-shuffle / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * network-shuffle / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * network-shuffle / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * network-yarn / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * network-yarn / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * network-yarn / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * network-yarn / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * network-yarn / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * 
network-yarn / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * repl / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * repl / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * repl / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * repl / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * repl / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * repl / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * sketch / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * sketch / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * sketch / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * sketch / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * sketch / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * sketch / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * spark / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * spark / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * spark / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * spark / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * spark / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * spark / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * sql / Compile / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070 [warn] * sql / M2r / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299 [warn] * sql / Sbt / publishMavenStyle [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300 [warn] * sql / Test / checkstyle / javaSource [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071 [warn] * sql / scalaStyleOnCompile / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191 [warn] * sql / scalaStyleOnTest / logLevel [warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192 [warn] * 
sql-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * sql-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * sql-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * sql-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * sql-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * sql-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kafka-0-10-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kafka-0-10-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kafka-0-10-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kafka-0-10-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kafka-0-10-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kinesis-asl / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kinesis-asl / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * streaming-kinesis-asl-assembly / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * streaming-kinesis-asl-assembly / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * streaming-kinesis-asl-assembly / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * streaming-kinesis-asl-assembly / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * streaming-kinesis-asl-assembly / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * tags / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * tags / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * tags / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * tags / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * tags / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * tags / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * token-provider-kafka-0-10 / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * token-provider-kafka-0-10 / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * token-provider-kafka-0-10 / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * token-provider-kafka-0-10 / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * token-provider-kafka-0-10 / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * token-provider-kafka-0-10 / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * tools / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * tools / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * tools / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * tools / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * tools / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * tools / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * unsafe / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * unsafe / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * unsafe / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * unsafe / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * unsafe / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * unsafe / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn] * yarn / Compile / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1070
[warn] * yarn / M2r / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:299
[warn] * yarn / Sbt / publishMavenStyle
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:300
[warn] * yarn / Test / checkstyle / javaSource
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:1071
[warn] * yarn / scalaStyleOnCompile / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:191
[warn] * yarn / scalaStyleOnTest / logLevel
[warn] +- /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project/SparkBuild.scala:192
[warn]
[warn] note: a setting might still be used by a command; to exclude a key from this `lintUnused` check
[warn] either append it to `Global / excludeLintKeys` or call .withRank(KeyRanks.Invisible) on the key
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
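For reference, the two remedies named in the lintUnused note above, plus pinning an explicit main class to quiet the "multiple main classes detected" warnings, sketched as build.sbt fragments. This is an illustrative sketch, not Spark's actual SparkBuild.scala code; the custom task key and the class name are hypothetical.

```scala
// Hedged build.sbt sketch (sbt's auto-imports of sbt._ and Keys._ apply).

// Remedy 1: exclude an unused key from the lintUnused check build-wide.
Global / excludeLintKeys += publishMavenStyle

// Remedy 2: declare a custom key with an invisible rank so the linter skips it.
val scalaStyleOnCompile = taskKey[Unit]("Run scalastyle after compile")
  .withRank(KeyRanks.Invisible)

// For "multiple main classes detected": list the candidates with
//   sbt "show discoveredMainClasses"
// and pin one explicitly (hypothetical class name).
Compile / mainClass := Some("org.example.Main")
```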
[success] Total time: 43 s, completed Dec 3, 2021 10:54:55 PM
========================================================================
Running Java style checks
========================================================================
[info] Checking Java style using SBT with these profiles: -Phadoop-2.7 -Pmesos -Phadoop-cloud -Pdocker-integration-tests -Pspark-ganglia-lgpl -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl -Pkubernetes
Checkstyle checks passed.
========================================================================
Running Spark unit tests
========================================================================
[info] Running Spark tests using SBT with these arguments: -Phadoop-2.7 -Pmesos -Phadoop-cloud -Pspark-ganglia-lgpl -Pyarn -Pkinesis-asl -Phive -Phive-thriftserver -Pdocker-integration-tests -Pkubernetes test
Using /usr/java/latest as default JAVA_HOME. Note, this will be overridden by -java-home if it is set.
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_282)
[info] loading settings for project spark-master-test-sbt-hadoop-2-7-build from plugins.sbt ...
[info] loading project definition from /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/project
[info] resolving key references (36387 settings) ...
[info] set current project to spark-parent (in build file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/)
[warn] there are 210 keys that are not used by any other settings/tasks:
[warn]
[warn] [... the same 210 unused-key warnings and lintUnused note as in the previous sbt invocation ...]
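The "[info] Test ... started / Test run finished" lines below are sbt's JUnit reporting for the Java test suites in the kvstore, network, and launcher modules. A minimal stand-in with the same shape, assuming JUnit 4 on the test classpath; the suite and method names are hypothetical, not one of the suites in this log.

```scala
import org.junit.Assert.assertEquals
import org.junit.Test

// Hypothetical JUnit 4 suite: sbt reports each @Test method as one
// "Test <class>.<method> started" line, followed by a summary such as
// "Test run finished: 0 failed, 0 ignored, 1 total, ...".
class ExampleSuite {
  @Test
  def testRoundTrip(): Unit = {
    assertEquals(42, 6 * 7)
  }
}
```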
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBBenchmark ignored
[info] Test run finished: 0 failed, 1 ignored, 0 total, 0.011s
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthMessagesSuite.testPublicKeyEncodeDecode started
[info] Test run started
[info] Test org.apache.spark.util.kvstore.ArrayWrappersSuite.testGenericArrayKey started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.009s
[info] Test run started
[info] Test run started
[info] Test org.apache.spark.network.sasl.ShuffleSecretManagerSuite.testMultipleRegisters started
[info] Test run started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testNoRedirectToLog started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithStart started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithOutputRedirection started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.211s
[info] Test run started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectOutputToLog started
[info] Test org.apache.spark.network.ChunkFetchRequestHandlerSuite.handleChunkFetchRequest started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.testRefWithIntNaturalKey started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndex started
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.266s
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectsSimple started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorToLog started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.269s
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithLogRedirection started
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupOnlyRemovedApp started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testFailedChildProc started
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleTypesWriteReadDelete started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorTwiceFails started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testBadLogRedirect started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectLastWins started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectToLog started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.268s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testMissingArg started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupUsesExecutor started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.noCleanupAndCleanup started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupMultipleExecutors started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testReopenAndVersionCheckDb started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMetadata started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testCloseLevelDBIterator started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.463s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testAllOptions started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testEqualSeparatedOption started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testExtraOptions started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.53s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkLauncher started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testAlternateSyntaxParsing started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunner started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testSparkRShell started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testMissingAppResource started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerPrimaryResource started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testShellCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testClusterCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testDriverCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoMainClass started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.777s
[info] Test run started
[info] Test org.apache.spark.network.protocol.MergedBlockMetaSuccessSuite.testMergedBlocksMetaEncodeDecode started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliKillAndStatus started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoArg started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testIsClientMode started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkFallback started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerWithMasterNoMainClass started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliHelpAndNoArg started
[info] Test run finished: 0 failed, 0 ignored, 17 total, 0.122s
[info] Test run started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testTimeout started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testStreamFiltering started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testSparkSubmitVmShutsDown started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testLauncherServerReuse started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testAppHandleDisconnect started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testCommunication started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.038s
[info] Test run started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testKill started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testLauncher started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testErrorPropagation started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.036s
[info] Test run started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testValidOptionStrings started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testJavaMajorVersion started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testPythonArgQuoting started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.177s
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testWindowsBatchQuoting started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testInvalidOptionStrings started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.002s
[info] Test run started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.reuseClientsUpToConfigVariable started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testGoodClient started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslClient started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslServer started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.reuseClientsUpToConfigVariableConcurrent started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testRemoveAll started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testNegativeIndexValues started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 1.473s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testBasicIteration started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testDeleteParentIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMetadata started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testArrayIndices started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testRemoveAll started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.005s
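The ChildProcAppHandleSuite, LauncherServerSuite, and InProcessLauncherSuite runs above exercise Spark's public launcher API in org.apache.spark.launcher. A rough usage sketch; the jar path and main class are placeholders, not values from this build.

```scala
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

object LaunchExample {
  def main(args: Array[String]): Unit = {
    // Launch an application in a child process and watch its state; this is
    // the same handle/redirect machinery the launcher suites above test.
    val handle: SparkAppHandle = new SparkLauncher()
      .setAppResource("/path/to/app.jar")              // placeholder jar
      .setMainClass("org.example.MyApp")               // placeholder class
      .setMaster("local[*]")
      .redirectOutput(ProcessBuilder.Redirect.INHERIT) // forward child output
      .startApplication()
    // Poll until the launched application reaches a terminal state.
    while (!handle.getState.isFinal) Thread.sleep(500)
  }
}
```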
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex2 started [info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIllegalIndexName started [info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex started [info] Test run finished: 0 failed, 0 ignored, 10 total, 0.009s [info] Test run started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescending started [info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testBadClient started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithSkip started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithMax started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescending started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescending started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndex started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithSkip started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescending started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.testRefWithIntNaturalKey started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescending started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithStart started [info] Test run finished: 0 failed, 0 ignored, 4 total, 1.167s [info] Test run started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testEmptyBlockFetch started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithMax started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndex started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithSkip started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithMax started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndex started [info] Test 
org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithSkip started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithStart started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndex started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithLast started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithStart started [info] Test org.apache.spark.network.client.TransportClientFactorySuite.fastFailConnectionInTimeWindow started [info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndex started [info] Test run finished: 0 failed, 0 ignored, 38 total, 0.265s [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailure started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailureAndSuccess started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testBatchFetchThreeShuffleBlocks started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testShuffleBlockChunksFetch started [info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeFactoryBeforeCreateClient started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testShuffleBlockChunkFetchFailure started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchThreeShuffleBlocks started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testUseOldProtocol started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchThree started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testInvalidShuffleBlockIds started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testBatchFetchShuffleBlocksOrder started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchOne started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchShuffleBlocksOrder started [info] Test run finished: 0 failed, 0 ignored, 13 total, 0.181s [info] Test run started [info] Test org.apache.spark.network.shuffle.ErrorHandlerSuite.testErrorRetry started [info] Test org.apache.spark.network.shuffle.ErrorHandlerSuite.testErrorLogging started [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.004s [info] Test run started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisNetworkIssue started [info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeBlockClientsWithFactory started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleBlocks started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleChunks started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisCRC32 started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisChecksumVerifyPass started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenBlocksWithShuffleChunks started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenDiskPersistedRDDBlocksWithMissingBlock started [info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisDiskIssue started [info] 
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFinalizeShuffleMerge started
[info] BloomFilterSuite:
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testRegisterExecutor started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisUnknownIssue started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchMergedBlocksMeta started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testBadMessages started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testCompatibilityWithOldVersion started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.neverReturnInactiveClients started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testFetchShuffleBlocksInBatch started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testShuffleCorruptionDiagnosisUnSupportedAlgorithm started
[info] Test org.apache.spark.network.shuffle.ExternalBlockHandlerSuite.testOpenDiskPersistedRDDBlocks started
[info] - accuracy - Byte (37 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 17 total, 0.28s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.protocol.FetchShuffleBlockChunksSuite.testFetchShuffleBlockChunksEncodeDecode started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.001s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.protocol.FetchShuffleBlocksSuite.testFetchShuffleBlockEncodeDecode started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.001s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithoutFilesToKeep started
[info] - mergeInPlace - Byte (17 milliseconds)
[info] - intersectInPlace - Byte (12 milliseconds)
[info] - accuracy - Short (6 milliseconds)
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithFilesToKeepFetchRddEnabled started
[info] - mergeInPlace - Short (14 milliseconds)
[info] - intersectInPlace - Short (11 milliseconds)
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.closeIdleConnectionForRequestTimeOut started
[info] - accuracy - Int (39 milliseconds)
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithFilesToKeepFetchRddDisabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRegisteredExecutorWithFilesToKeepFetchRddEnabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupUsesExecutorWithoutFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnRemovedExecutorWithFilesToKeepFetchRddDisabled started
[info] - mergeInPlace - Int (129 milliseconds)
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithFilesToKeepFetchRddDisabled started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupUsesExecutorWithFilesToKeep started
[info] Test org.apache.spark.network.shuffle.CleanupNonShuffleServiceServedFilesSuite.cleanupOnlyRemovedExecutorWithFilesToKeepFetchRddEnabled started
[info] - intersectInPlace - Int (88 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.349s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testHandlingRetriableFailures started
[info] - accuracy - Long (49 milliseconds)
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testPushOne started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testServerFailures started
[info] Test org.apache.spark.network.shuffle.OneForOneBlockPusherSuite.testPushThree started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.037s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadSecret started
[info] - mergeInPlace - Long (79 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadAppId started
[info] - intersectInPlace - Long (73 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testValid started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.305s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBlockFetchWithOlderShuffleMergeId started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDuplicateBlocksAreIgnoredWhenPrevStreamIsInProgress started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRequestForAbortedShufflePartitionThrowsException started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterMultipleDataBlocks started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testOngoingMergeOfBlockFromPreviousAttemptIsAborted started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBasicBlockMerge started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCollision started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testPushBlockFromPreviousAttemptIsRejected started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBlockPushWithOlderShuffleMergeId started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterDuplicateBlockDoesNotInterfereActiveStream started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testExecutorRegisterWithInvalidJsonForPushShuffle started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDuplicateBlocksAreIgnoredWhenPrevStreamHasCompleted started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureWhileTruncatingFiles started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeShuffleMergeFromPreviousAttemptIsAborted started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCleanupOlderShuffleMergeId started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIOExceptionsExceededThreshold started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testOnFailureInvokedMoreThanOncePerBlock started
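Interleaved with the network-shuffle runs are the BloomFilterSuite checks ("accuracy", "mergeInPlace", "intersectInPlace" per element type) for org.apache.spark.util.sketch.BloomFilter. A minimal sketch of the merge path, with made-up sizes; filters are only merge-compatible when created with the same parameters, which is what the "incompatible merge" case later in the log exercises:

```scala
import org.apache.spark.util.sketch.BloomFilter

object BloomFilterSketch {
  def main(args: Array[String]): Unit = {
    // Same expectedNumItems/fpp, so the two filters are merge-compatible.
    val a = BloomFilter.create(1000, 0.03)
    val b = BloomFilter.create(1000, 0.03)
    (0L until 500L).foreach(a.putLong)
    (500L until 1000L).foreach(b.putLong)
    a.mergeInPlace(b) // union; intersectInPlace intersects instead
    assert(a.mightContainLong(42L)) // no false negatives for inserted items
    // False positives are possible but bounded by the configured fpp:
    val fp = (1000L until 2000L).count(a.mightContainLong) / 1000.0
    println(f"observed false-positive rate ~ $fp%.3f (target 0.03)")
  }
}
```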
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testBlockReceivedAfterMergeFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterData started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeWithOlderShuffleMergeId started
[info] UTF8StringPropertyCheckSuite:
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testWritingPendingBufsIsAbortedImmediatelyDuringComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeWithMultipleReducePartitions started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testUpdateLocalDirsOnlyOnce started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testNoIndexFile started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFinalizeOfDeterminateShuffle started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverIndexFileAfterIOExceptionsInFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testExecutorRegistrationFromTwoAppAttempts started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureAfterComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverMetaFileAfterIOExceptionsInFinalize started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIncompleteStreamsAreOverwritten started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testFailureInAStreamDoesNotInterfereWithStreamWhichIsWriting started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDeferredBufsAreWrittenDuringOnData started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverIndexFileAfterIOExceptions started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testCleanUpDirectory started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testIOExceptionsDuringMetaUpdateIncreasesExceptionCount started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDividingMergedBlocksIntoChunks started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testErrorLogging started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testPendingBlockIsAbortedImmediately started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testDeferredBufsAreWrittenDuringOnComplete started
[info] Test org.apache.spark.network.shuffle.RemoteBlockPushResolverSuite.testRecoverMetaFileAfterIOExceptions started
[info] - toString (92 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 40 total, 0.383s
[info] - numChars (4 milliseconds)
[info] - startsWith (22 milliseconds)
[info] - endsWith (6 milliseconds)
[info] - toUpperCase (5 milliseconds)
[info] - toLowerCase (4 milliseconds)
[info] Test run started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testRetryAndUnrecoverable started
[info] - compare (10 milliseconds)
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testSingleIOExceptionOnFirst started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testUnrecoverableFailure started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testSingleIOExceptionOnSecond started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testThreeIOExceptions started
[info] Test org.apache.spark.network.client.TransportClientFactorySuite.returnDifferentClientsForDifferentServers started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testNoFailures started
[info] Test org.apache.spark.network.shuffle.RetryingBlockTransferorSuite.testTwoIOExceptions started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.102s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.BlockTransferMessagesSuite.serializeOpenShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.BlockTransferMessagesSuite.testLocalDirsMessages started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.003s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchUnregisteredExecutor started
[info] - substring (87 milliseconds)
[info] - contains (37 milliseconds)
[info] - trim, trimLeft, trimRight (16 milliseconds)
[info] - reverse (3 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongExecutor started
[info] - indexOf (21 milliseconds)
[info] - repeat (8 milliseconds)
[info] - lpad, rpad (4 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 8 total, 2.668s
[info] Test run started
[info] Test org.apache.spark.network.util.CryptoUtilsSuite.testConfConversion started
[info] - concat (31 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.012s
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeSalt started
[info] - concatWs (19 milliseconds)
[info] - split !!! IGNORED !!!
[info] - levenshteinDistance (8 milliseconds)
[info] - hashCode (3 milliseconds)
[info] - equals (3 milliseconds)
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeAppId started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testFixedChallengeResponse started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptChallengeCiphertext started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testMismatchedSecret started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptResponseSalt started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testRegisterWithCustomShuffleManager started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessage started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchCorruptRddBlock started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessageWhenTransferringZeroBytes started
[info] Test run started
[info] Test org.apache.spark.unsafe.array.LongArraySuite.basicTest started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testFixedChallenge started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.009s
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptServerCiphertext started
[info] Test run started
[info] Test org.apache.spark.unsafe.array.ByteArraySuite.testCompareBinary started
[info] Test org.apache.spark.unsafe.array.ByteArraySuite.testGetPrefix started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.002s
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testCorruptResponseAppId started
[info] Test run started
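UTF8StringPropertyCheckSuite (the "- toString", "- numChars", "- substring", ... lines above) property-checks org.apache.spark.unsafe.types.UTF8String against java.lang.String semantics. A small sketch of the operations those properties cover; the inputs are arbitrary examples:

```scala
import org.apache.spark.unsafe.types.UTF8String

object UTF8StringSketch {
  def main(args: Array[String]): Unit = {
    val s = UTF8String.fromString("Spark SQL")
    assert(s.numChars() == 9) // counts characters, not bytes
    assert(s.startsWith(UTF8String.fromString("Spark")))
    assert(s.substring(6, 9) == UTF8String.fromString("SQL")) // char offsets
    assert(s.indexOf(UTF8String.fromString("SQL"), 0) == 6)
    println(s.reverse())                           // LQS krapS
    println(UTF8String.fromString("ab").repeat(3)) // ababab
  }
}
```

Because offsets are in characters rather than bytes, the same properties hold for multi-byte UTF-8 input, which is what the property checks sweep over.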
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOnHeapMemoryBlockResetsBaseObjectAndOffset started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.overlappingCopyMemory started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testAuthEngine started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testBadKeySize started
[info] Test run finished: 0 failed, 0 ignored, 13 total, 0.236s
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNoServer started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.memoryDebugFillEnabledInTest started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.offHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.heapMemoryReuse started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorPoolingReUsesLongArrays started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOffHeapMemoryBlockResetsOffset started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.045s
[info] Test run started
[info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testGeneralNettyMemoryMetrics started
[info] Test run started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.titleCase started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.soundex started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.basicTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamUnderflow started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToShort started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.startsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.compareTo started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.levenshteinDistance started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamOverflow started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamIntArray started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.upperAndLower started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToInt started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.createBlankString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.prefix started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatWsTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.repeat started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.contains started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.skipWrongFirstByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.emptyStringTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamSlice started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimBothWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substringSQL started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring_index started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.pad started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.split started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trims started
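The PlatformUtilSuite cases above (double-free assertions, heap reuse, off-heap offset resets) target org.apache.spark.unsafe.Platform and its MemoryAllocator implementations. A minimal off-heap round trip, assuming spark-unsafe is on the classpath and assertions are enabled:

```scala
import org.apache.spark.unsafe.Platform
import org.apache.spark.unsafe.memory.MemoryAllocator

object PlatformSketch {
  def main(args: Array[String]): Unit = {
    // Off-heap allocation via the allocator the suite exercises; the on-heap
    // variant is MemoryAllocator.HEAP.
    val block = MemoryAllocator.UNSAFE.allocate(64)
    try {
      Platform.putLong(block.getBaseObject, block.getBaseOffset, 42L)
      println(Platform.getLong(block.getBaseObject, block.getBaseOffset)) // 42
    } finally {
      // Freeing twice is the error the *ThrowsAssertionErrorOnDoubleFree
      // tests assert on, so free exactly once.
      MemoryAllocator.UNSAFE.free(block)
    }
  }
}
```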
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimRightWithTrimString started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchDeletedRddBlock started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.findInSet started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.translate started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.replace started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.reverse started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimLeftWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.endsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToLong started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStream started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.indexOf started
[info] Test run finished: 0 failed, 0 ignored, 39 total, 0.052s
[info] Test run started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownLongInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownIntegerInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTest started
[info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testAdditionalMetrics started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownBytesInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestPaddedStrings started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestBytes started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.205s
[info] Test run started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testDeallocateReleasesManagedBuffer started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchValidRddBlock started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testByteBufBody started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testShortWrite started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodySingleBuffer started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodyMultipleBuffers started
[info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testSingleWrite started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.042s
[info] Test run started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testEmptyFrame started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testNegativeFrameSize started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testSplitLengthField started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testFrameDecoding started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testRemoveRddBlocks started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testInterception started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testRetainedFrames started
[info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testConsolidationPerf started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.347s
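Murmur3_x86_32Suite above checks known hash values and randomized inputs for org.apache.spark.unsafe.hash.Murmur3_x86_32, the 32-bit Murmur3 used throughout Spark's hashing. A short sketch of its static entry points; the seed and inputs are arbitrary:

```scala
import org.apache.spark.unsafe.Platform
import org.apache.spark.unsafe.hash.Murmur3_x86_32

object Murmur3Sketch {
  def main(args: Array[String]): Unit = {
    val seed = 42
    // Fixed-input hashing, the style of testKnownIntegerInputs/testKnownLongInputs.
    println(Murmur3_x86_32.hashInt(0, seed))
    println(Murmur3_x86_32.hashLong(Long.MaxValue, seed))
    // Hashing raw bytes through Platform offsets, as randomizedStressTestBytes does.
    val bytes = "abc".getBytes("UTF-8")
    println(Murmur3_x86_32.hashUnsafeBytes(bytes, Platform.BYTE_ARRAY_OFFSET, bytes.length, seed))
  }
}
```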
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchThreeSort started
[info] Test run started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.periodAndDurationTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.equalsTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.toStringTest started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.007s
[info] Passed: Total 46, Failed 0, Errors 0, Passed 46
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongBlockId started
[info] Passed: Total 106, Failed 0, Errors 0, Passed 105, Skipped 1
[info] ScalaTest
[info] Run completed in 4 seconds, 914 milliseconds.
[info] Total number of tests run: 19
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 19, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
[info] Passed: Total 78, Failed 0, Errors 0, Passed 78, Ignored 1
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNonexistent started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchOneSort started
[info] Test run finished: 0 failed, 0 ignored, 12 total, 1.492s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testSortShuffleBlocks started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testBadRequests started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.jsonSerializationOfExecutorRegistration started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.197s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testSaslAppIsolation started
[info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testAuthEngineAppIsolation started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.627s
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - accuracy - String (3 seconds, 392 milliseconds)
[info] Passed: Total 128, Failed 0, Errors 0, Passed 128
[info] - mergeInPlace - String (1 second, 922 milliseconds)
[info] - intersectInPlace - String (2 seconds, 418 milliseconds)
[info] - incompatible merge (3 milliseconds)
[info] BitArraySuite:
[info] - error case when create BitArray (2 milliseconds)
[info] - bitSize (0 milliseconds)
[info] - set (1 millisecond)
[info] - normal operation (24 milliseconds)
[info] - merge (8 milliseconds)
[info] CountMinSketchSuite:
[info] - accuracy - Byte (281 milliseconds)
[info] DistributedSuite:
[info] - mergeInPlace - Byte (290 milliseconds)
[info] - accuracy - Short (598 milliseconds)
[info] ExternalSorterSuite:
[info] - mergeInPlace - Short (308 milliseconds)
[info] - accuracy - Int (525 milliseconds)
[info] - mergeInPlace - Int (342 milliseconds)
[info] - accuracy - Long (672 milliseconds)
[info] - mergeInPlace - Long (345 milliseconds)
[info] - empty data stream with kryo ser (2 seconds, 918 milliseconds)
[info] - empty data stream with java ser (214 milliseconds)
[info] - few elements per partition with kryo ser (183 milliseconds)
[info] - few elements per partition with java ser (230 milliseconds)
[info] - empty partitions with spilling with kryo ser (710 milliseconds)
[info] - accuracy - String (2 seconds, 439 milliseconds)
[info] - empty partitions with spilling with java ser (392 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 7 total, 13.47s
[info] Test run started
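The CountMinSketchSuite lines above ("- accuracy"/"- mergeInPlace" per element type) cover org.apache.spark.util.sketch.CountMinSketch. A minimal sketch with made-up parameters, relying on the structure's guarantee that estimates never undercount and overcount by at most eps * totalCount with the configured confidence:

```scala
import org.apache.spark.util.sketch.CountMinSketch

object CountMinSketchSketch {
  def main(args: Array[String]): Unit = {
    // Relative error eps, confidence, seed -- the accuracy tests sweep item
    // types (Byte, Short, Int, Long, String) against bounds derived from these.
    val cms = CountMinSketch.create(0.001, 0.99, 42)
    (0 until 10000).foreach(i => cms.addLong(i % 100)) // 100 distinct keys, 100 hits each
    println(cms.estimateCount(7L))        // >= 100, and close to it
    println(s"total = ${cms.totalCount()}") // 10000
  }
}
```

Two sketches built with the same parameters can be combined with mergeInPlace, which is what the per-type "- mergeInPlace" cases check.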
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnException started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnClose started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.testActiveStreams started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulFetch started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedRPC started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedFetch started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulMergedBlockMeta started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedMergedBlockMeta started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulRPC started
[info] Test org.apache.spark.network.TransportResponseHandlerSuite.clearAllOutstandingRequests started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.073s
[info] Test run started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testNonMatching started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslAuthentication started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessageChunking started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testServerAlwaysEncrypt started
[info] - mergeInPlace - String (1 second, 658 milliseconds)
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDataEncryptionIsActuallyEnabled started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDelegates started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessage started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testMatching started
[info] Test org.apache.spark.network.sasl.SparkSaslSuite.testRpcHandlerDelegate started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 1.044s
[info] Test run started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchNonExistentChunk started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchFileChunk started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBothChunks started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchChunkAndNonExistent started
[info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBufferChunk started
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.344s
[info] Test run started
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.furtherRequestsDelay started
[info] - task throws not serializable exception (8 seconds, 156 milliseconds)
[info] - local-cluster format (4 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - accuracy - Byte array (4 seconds, 341 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] - mergeInPlace - Byte array (2 seconds, 480 milliseconds)
[info] - incompatible merge (3 milliseconds)
[info] - simple groupByKey (7 seconds, 196 milliseconds)
[info] - spilling in local cluster with kryo ser (10 seconds, 26 milliseconds)
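The ResourceRequestHelperSuite block that follows reports many tests as CANCELED rather than failed: each one trips a ScalaTest assume(...) because ResourceRequestHelper.isYarnResourceTypesAvailable() returns false on this Hadoop 2.7 profile (the YARN resource-types API is a Hadoop 3.x addition). A minimal sketch of that guard pattern, with a hypothetical reflection probe standing in for the real availability check:

```scala
import org.scalatest.funsuite.AnyFunSuite

class GuardedSuite extends AnyFunSuite {
  // Hypothetical stand-in for ResourceRequestHelper.isYarnResourceTypesAvailable(),
  // probing for a class that only exists in Hadoop 3.x.
  private def yarnResourceTypesAvailable: Boolean =
    try { Class.forName("org.apache.hadoop.yarn.util.resource.ResourceUtils"); true }
    catch { case _: ClassNotFoundException => false }

  test("valid request: value with unit") {
    // assume() throws TestCanceledException when the condition is false, so
    // the test is reported as CANCELED (as in the log below) instead of failing.
    assume(yarnResourceTypesAvailable, "isYarnResourceTypesAvailable() was false")
    // ... assertions that need the YARN resource-types API would go here
  }
}
```

Cancellation signals "environment can't run this test" rather than "the code under test is broken", which is why these entries do not fail the build.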
[info] ResourceRequestHelperSuite:
[info] - empty SparkConf should be valid (273 milliseconds)
[info] - just normal resources are defined (6 milliseconds)
[info] - get yarn resources from configs (22 milliseconds)
[info] - get invalid yarn resources from configs (36 milliseconds)
[info] - valid request: value with unit !!! CANCELED !!! (31 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:105)
[info] org.scalatest.exceptions.TestCanceledException:
[info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475)
[info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474)
[info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310)
[info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$9(ResourceRequestHelperSuite.scala:105)
[info] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
[info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
[info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
[info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
[info] at scala.collection.immutable.List.foreach(List.scala:431)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
[info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
[info] at org.scalatest.Suite.run(Suite.scala:1112)
[info] at org.scalatest.Suite.run$(Suite.scala:1094)
[info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
[info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info] at java.lang.Thread.run(Thread.java:748)
[info] - valid request: value without unit !!! CANCELED !!! (3 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:105)
[info] - valid request: multiple resources !!! CANCELED !!! (2 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:105)
[info] - invalid request: value does not match pattern !!! CANCELED !!! (3 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:127)
[info] - invalid request: only unit defined !!! CANCELED !!! (2 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:127)
[info] - invalid request: invalid unit !!! CANCELED !!! (3 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:127)
[info] - disallowed resource request: spark.yarn.executor.resource.memory.amount !!! CANCELED !!! (4 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150)
[info] - disallowed resource request: spark.yarn.executor.resource.memory-mb.amount !!! CANCELED !!! (2 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150)
[info] - disallowed resource request: spark.yarn.executor.resource.mb.amount !!! CANCELED !!! (2 milliseconds)
[info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150)
[info] org.scalatest.exceptions.TestCanceledException:
[info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475)
[info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474)
[info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310)
[info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$17(ResourceRequestHelperSuite.scala:150)
[info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
[info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
[info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
[info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
[info] at scala.collection.immutable.List.foreach(List.scala:431)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
[info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
[info] at org.scalatest.Suite.run(Suite.scala:1112)
[info] at org.scalatest.Suite.run$(Suite.scala:1094)
[info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info] at
org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - disallowed resource request: spark.yarn.executor.resource.cores.amount !!! CANCELED !!! (4 milliseconds) [info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150) [info] org.scalatest.exceptions.TestCanceledException: [info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475) [info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474) [info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231) [info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310) [info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$17(ResourceRequestHelperSuite.scala:150) [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85) [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83) [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104) [info] at org.scalatest.Transformer.apply(Transformer.scala:22) [info] at org.scalatest.Transformer.apply(Transformer.scala:20) [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226) [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190) [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236) [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234) [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227) [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269) [info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413) [info] at scala.collection.immutable.List.foreach(List.scala:431) [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401) [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396) [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268) [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563) [info] at org.scalatest.Suite.run(Suite.scala:1112) [info] at org.scalatest.Suite.run$(Suite.scala:1094) [info] at 
org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273) [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213) [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210) [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208) [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - disallowed resource request: spark.yarn.executor.resource.vcores.amount !!! CANCELED !!! (3 milliseconds) [info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150) [info] org.scalatest.exceptions.TestCanceledException: [info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475) [info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474) [info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231) [info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310) [info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$17(ResourceRequestHelperSuite.scala:150) [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85) [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83) [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104) [info] at org.scalatest.Transformer.apply(Transformer.scala:22) [info] at org.scalatest.Transformer.apply(Transformer.scala:20) [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226) [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190) [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236) [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234) [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227) [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269) [info] at 
org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413) [info] at scala.collection.immutable.List.foreach(List.scala:431) [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401) [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396) [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268) [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563) [info] at org.scalatest.Suite.run(Suite.scala:1112) [info] at org.scalatest.Suite.run$(Suite.scala:1094) [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273) [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213) [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210) [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208) [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - disallowed resource request: spark.yarn.am.resource.memory.amount !!! CANCELED !!! 
(3 milliseconds) [info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150) [info] org.scalatest.exceptions.TestCanceledException: [info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475) [info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474) [info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231) [info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310) [info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$17(ResourceRequestHelperSuite.scala:150) [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85) [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83) [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104) [info] at org.scalatest.Transformer.apply(Transformer.scala:22) [info] at org.scalatest.Transformer.apply(Transformer.scala:20) [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226) [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190) [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236) [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234) [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227) [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269) [info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413) [info] at scala.collection.immutable.List.foreach(List.scala:431) [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401) [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396) [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268) [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563) [info] at org.scalatest.Suite.run(Suite.scala:1112) [info] at org.scalatest.Suite.run$(Suite.scala:1094) [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273) [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213) [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210) [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208) [info] at 
org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - disallowed resource request: spark.yarn.driver.resource.memory.amount !!! CANCELED !!! (3 milliseconds) [info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150) [info] org.scalatest.exceptions.TestCanceledException: [info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475) [info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474) [info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231) [info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310) [info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$17(ResourceRequestHelperSuite.scala:150) [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85) [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83) [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104) [info] at org.scalatest.Transformer.apply(Transformer.scala:22) [info] at org.scalatest.Transformer.apply(Transformer.scala:20) [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226) [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190) [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236) [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234) [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227) [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269) [info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413) [info] at scala.collection.immutable.List.foreach(List.scala:431) [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401) [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396) [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268) [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563) [info] at org.scalatest.Suite.run(Suite.scala:1112) [info] at org.scalatest.Suite.run$(Suite.scala:1094) [info] at 
org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273) [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213) [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210) [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208) [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - disallowed resource request: spark.yarn.am.resource.cores.amount !!! CANCELED !!! (3 milliseconds) [info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150) [info] org.scalatest.exceptions.TestCanceledException: [info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475) [info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474) [info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231) [info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310) [info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$17(ResourceRequestHelperSuite.scala:150) [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85) [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83) [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104) [info] at org.scalatest.Transformer.apply(Transformer.scala:22) [info] at org.scalatest.Transformer.apply(Transformer.scala:20) [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226) [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190) [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236) [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234) [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227) [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269) [info] at 
org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413) [info] at scala.collection.immutable.List.foreach(List.scala:431) [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401) [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396) [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268) [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563) [info] at org.scalatest.Suite.run(Suite.scala:1112) [info] at org.scalatest.Suite.run$(Suite.scala:1094) [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273) [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213) [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210) [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208) [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - disallowed resource request: spark.yarn.driver.resource.cores.amount !!! CANCELED !!! 
(3 milliseconds) [info] org.apache.spark.deploy.yarn.ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ResourceRequestHelperSuite.scala:150) [info] org.scalatest.exceptions.TestCanceledException: [info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475) [info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474) [info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231) [info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310) [info] at org.apache.spark.deploy.yarn.ResourceRequestHelperSuite.$anonfun$new$17(ResourceRequestHelperSuite.scala:150) [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85) [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83) [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104) [info] at org.scalatest.Transformer.apply(Transformer.scala:22) [info] at org.scalatest.Transformer.apply(Transformer.scala:20) [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226) [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190) [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236) [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234) [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227) [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269) [info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413) [info] at scala.collection.immutable.List.foreach(List.scala:431) [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401) [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396) [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268) [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563) [info] at org.scalatest.Suite.run(Suite.scala:1112) [info] at org.scalatest.Suite.run$(Suite.scala:1094) [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273) [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213) [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210) [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208) [info] at 
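
Note on the CANCELED results above: each of these tests guards itself with ScalaTest's assume(), and on this Hadoop 2.7 profile ResourceRequestHelper.isYarnResourceTypesAvailable() returns false because the YARN resource-types API only exists in Hadoop 3.x, so the tests cancel rather than fail. Below is a minimal sketch of that guard pattern; the reflection probe for a Hadoop 3-only class is an assumption for illustration, the real check lives in org.apache.spark.deploy.yarn.ResourceRequestHelper.

    import org.scalatest.funsuite.AnyFunSuite

    class ResourceTypeGuardSketch extends AnyFunSuite {
      // Assumed stand-in for ResourceRequestHelper.isYarnResourceTypesAvailable():
      // probe for a class that is only present in Hadoop 3.x YARN.
      private def yarnResourceTypesAvailable: Boolean =
        try {
          Class.forName("org.apache.hadoop.yarn.api.records.ResourceInformation")
          true
        } catch { case _: ClassNotFoundException => false }

      test("disallowed resource request: spark.yarn.executor.resource.memory.amount") {
        // assume() throws TestCanceledException when the condition is false,
        // so the test is reported as CANCELED (as in the log above), not failed.
        assume(yarnResourceTypesAvailable)
        // real assertions would only run against a Hadoop 3.x client
      }
    }

This matches the stack traces above, which enter through Assertions$AssertionsHelper.macroAssume before raising TestCanceledException.
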
[info] FailureTrackerSuite:
[info] - failures expire if validity interval is set (36 milliseconds)
[info] - failures never expire if validity interval is not set (-1) (5 milliseconds)
[info] YarnSparkHadoopUtilSuite:
[info] - shell script escaping (325 milliseconds)
[info] - Yarn configuration override (427 milliseconds)
[info] - test getApplicationAclsForYarn acls on (127 milliseconds)
[info] - test getApplicationAclsForYarn acls on and specify users (87 milliseconds)
[info] - SPARK-35672: test replaceEnvVars in Unix mode (32 milliseconds)
[info] - SPARK-35672: test replaceEnvVars in Windows mode (10 milliseconds)
[info] ClientSuite:
[info] - default Yarn application classpath (9 milliseconds)
[info] - default MR application classpath (1 millisecond)
[info] - resultant classpath for an application that defines a classpath for YARN (31 milliseconds)
[info] - resultant classpath for an application that defines a classpath for MR (45 milliseconds)
[info] - resultant classpath for an application that defines both classpaths, YARN and MR (27 milliseconds)
[info] - Local jar URIs (407 milliseconds)
[warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutCleanlyClosesClient started
[info] - Jar path propagation through SparkConf (1 second, 466 milliseconds)
[info] - Cluster path translation (6 milliseconds)
[info] - configuration and args propagate through createApplicationSubmissionContext (112 milliseconds)
[info] - specify a more specific type for the application (873 milliseconds)
[info] - spark.yarn.jars with multiple paths and globs (396 milliseconds)
[info] - distribute jars archive (223 milliseconds)
[info] - SPARK-37239: distribute jars archive with set STAGING_FILE_REPLICATION (141 milliseconds)
[info] - distribute archive multiple times (734 milliseconds)
[info] - distribute local spark jars (168 milliseconds)
[info] - groupByKey where map output sizes exceed maxMbInFlight (7 seconds, 171 milliseconds)
[info] - ignore same name jars (206 milliseconds)
[info] - custom resource request (client mode) !!! CANCELED !!! (2 milliseconds)
[info] ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ClientSuite.scala:471)
[info] org.scalatest.exceptions.TestCanceledException:
[info] at org.scalatest.Assertions.newTestCanceledException(Assertions.scala:475)
[info] at org.scalatest.Assertions.newTestCanceledException$(Assertions.scala:474)
[info] at org.scalatest.Assertions$.newTestCanceledException(Assertions.scala:1231)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:1310)
[info] at org.apache.spark.deploy.yarn.ClientSuite.$anonfun$new$34(ClientSuite.scala:471)
[info] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
[info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
[info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
[info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
[info] at scala.collection.immutable.List.foreach(List.scala:431)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
[info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
[info] at org.scalatest.Suite.run(Suite.scala:1112)
[info] at org.scalatest.Suite.run$(Suite.scala:1094)
[info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
[info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
[info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info] at java.lang.Thread.run(Thread.java:748)
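
The "disallowed resource request" group that canceled further up targets Spark's validation that memory and cores cannot be requested through the custom-resource namespace; those must come from the standard spark.{driver,executor}.memory and .cores settings. Below is a rough sketch of that validation; the object name and error text are assumptions, not Spark's exact code, which lives in ResourceRequestHelper.

    // Illustrative re-creation of the check the canceled tests exercise:
    // standard resources may not use the spark.yarn.*.resource.*.amount keys.
    object ResourceRequestValidationSketch {
      private val standardResources = Set("memory", "memory-mb", "mb", "cores", "vcores")
      private val Pattern = """spark\.yarn\.(executor|driver|am)\.resource\.([^.]+)\.amount""".r

      def validate(conf: Map[String, String]): Unit =
        conf.keys.foreach {
          case Pattern(role, resource) if standardResources(resource) =>
            // e.g. spark.yarn.executor.resource.memory.amount is disallowed:
            // memory must be set via spark.executor.memory instead.
            throw new IllegalArgumentException(
              s"$resource for the $role must be configured through Spark's " +
                s"standard settings, not spark.yarn.$role.resource.$resource.amount")
          case _ => // custom resources such as gpu or fpga pass through
        }
    }

Under this sketch, validate(Map("spark.yarn.executor.resource.memory.amount" -> "2g")) throws, while a custom key such as spark.yarn.executor.resource.gpu.amount passes through, which is the split the canceled tests assert once the YARN resource-types API is present.
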
[info] - custom resource request (cluster mode) !!! CANCELED !!! (2 milliseconds)
[info] ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ClientSuite.scala:471)
[info] - custom driver resource request yarn config and spark config fails !!! CANCELED !!! (3 milliseconds)
[info] ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ClientSuite.scala:496)
[info] - custom executor resource request yarn config and spark config fails !!! CANCELED !!! (2 milliseconds)
[info] ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ClientSuite.scala:519)
[info] - custom resources spark config mapped to yarn config !!! CANCELED !!! (2 milliseconds)
[info] ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ClientSuite.scala:542)
[info] - gpu/fpga spark resources mapped to custom yarn resources !!! CANCELED !!! (2 milliseconds)
[info] ResourceRequestHelper.isYarnResourceTypesAvailable() was false (ClientSuite.scala:576)
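
The last canceled case, "gpu/fpga spark resources mapped to custom yarn resources", covers the translation Spark documents for YARN: generic resource names under spark.{driver,executor}.resource.* are rewritten to YARN's built-in names, gpu to yarn.io/gpu and fpga to yarn.io/fpga. A small sketch of the mapping follows; the helper itself is illustrative, while the two name pairs follow Spark's documented behavior.

    object YarnResourceNameSketch {
      // Spark-side generic names -> YARN 3.x built-in resource types.
      private val sparkToYarn = Map("gpu" -> "yarn.io/gpu", "fpga" -> "yarn.io/fpga")

      // A request written as spark.executor.resource.gpu.amount=2 would be
      // forwarded to YARN as 2 units of yarn.io/gpu; unknown names pass through
      // unchanged so site-defined resource types still work.
      def toYarnResource(name: String): String = sparkToYarn.getOrElse(name, name)
    }

None of this is exercised on this build: Hadoop 2.7's YARN has no resource-type registry to map onto, so the whole group cancels at the same guard.
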
org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269) [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268) [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563) [info] at org.scalatest.Suite.run(Suite.scala:1112) [info] at org.scalatest.Suite.run$(Suite.scala:1094) [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563) [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273) [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273) [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272) [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62) [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213) [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210) [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208) [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62) [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318) [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513) [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413) [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [info] at java.lang.Thread.run(Thread.java:748) [info] - test yarn jars path not exists (49 milliseconds) [info] - SPARK-31582 Being able to not populate Hadoop classpath (55 milliseconds) [info] - SPARK-35672: test Client.getUserClasspathUrls (9 milliseconds) [info] - files URI match test1 (1 millisecond) [info] - files URI match test2 (1 millisecond) [info] - files URI match test3 (0 milliseconds) [info] - wasb URI match test (1 millisecond) [info] - hdfs URI match test (1 millisecond) [info] - files URI unmatch test1 (1 millisecond) [info] - files URI unmatch test2 (1 millisecond) [info] - files URI unmatch test3 (0 milliseconds) [info] - wasb URI unmatch test1 (0 milliseconds) [info] - wasb URI unmatch test2 (0 milliseconds) [info] - s3 URI unmatch test (1 millisecond) [info] - hdfs URI unmatch test1 (0 milliseconds) [info] - hdfs URI unmatch test2 (0 milliseconds) [info] ClientDistributedCacheManagerSuite: [info] - test getFileStatus empty (239 milliseconds) [info] - test getFileStatus cached (1 millisecond) [info] - test addResource (3 milliseconds) [info] - test addResource link null (2 milliseconds) [info] - test addResource appmaster only (1 millisecond) [info] - test addResource archive (2 milliseconds) [info] ContainerPlacementStrategySuite: [info] - allocate locality preferred containers with enough resource and no matched existed containers (309 milliseconds) [info] - allocate locality preferred containers with enough resource and partially matched containers (59 milliseconds) [info] - allocate locality preferred containers with limited resource and partially matched containers (46 milliseconds) [info] - allocate locality preferred containers with fully matched containers (48 milliseconds) [info] - allocate containers with no locality preference (43 milliseconds) [info] - allocate locality preferred containers by considering 
the localities of pending requests (52 milliseconds) [info] YarnClusterSuite: [info] - spilling in local cluster with java ser (9 seconds, 204 milliseconds) [info] - accumulators (5 seconds, 522 milliseconds) [info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutInactiveRequests started [info] - broadcast variables (5 seconds, 706 milliseconds) [info] - spilling in local cluster with many reduce tasks with kryo ser (12 seconds, 1 milliseconds) [info] - repeatedly failing task (5 seconds, 288 milliseconds) [info] Test run finished: 0 failed, 0 ignored, 3 total, 31.632s [info] Test run started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslClientFallback started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslServerFallback started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthReplay started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testNewAuth started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testLargeMessageEncryption started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthFailure started [info] Test run finished: 0 failed, 0 ignored, 6 total, 0.306s [info] Test run started [info] Test org.apache.spark.network.util.TimerWithCustomUnitSuite.testTimingViaContext started [info] Test org.apache.spark.network.util.TimerWithCustomUnitSuite.testTimerWithMillisecondTimeUnit started [info] Test org.apache.spark.network.util.TimerWithCustomUnitSuite.testTimerWithNanosecondTimeUnit started [info] Test run finished: 0 failed, 0 ignored, 3 total, 0.02s [info] Test run started [info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.streamStatesAreFreedWhenConnectionIsClosedEvenIfBufferIteratorThrowsException started [info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.testMissingChunk started [info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.managedBuffersAreFreedWhenConnectionIsClosed started [info] Test run finished: 0 failed, 0 ignored, 3 total, 0.028s [info] Test run started [info] Test org.apache.spark.network.TransportRequestHandlerSuite.handleMergedBlockMetaRequest started [info] Test org.apache.spark.network.TransportRequestHandlerSuite.handleStreamRequest started [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.006s [info] Test run started [info] Test org.apache.spark.network.protocol.EncodersSuite.testBitmapArraysEncodeDecode started [info] Test org.apache.spark.network.protocol.EncodersSuite.testRoaringBitmapEncodeShouldFailWhenBufferIsSmall started [info] Test org.apache.spark.network.protocol.EncodersSuite.testRoaringBitmapEncodeDecode started [info] Test run finished: 0 failed, 0 ignored, 3 total, 0.006s [info] Test run started [info] Test org.apache.spark.network.StreamSuite.testSingleStream started [info] Test org.apache.spark.network.StreamSuite.testMultipleStreams started [info] Test org.apache.spark.network.StreamSuite.testConcurrentStreams started [info] Test org.apache.spark.network.StreamSuite.testZeroLengthStream started [info] Test run finished: 0 failed, 0 ignored, 4 total, 0.311s [info] Test run started [info] Test org.apache.spark.network.crypto.TransportCipherSuite.testBufferNotLeaksOnInternalError started [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.086s [info] Test run started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamConcurrently started [info] Test 
org.apache.spark.network.RpcIntegrationSuite.sendOneWayMessage started [info] Test org.apache.spark.network.RpcIntegrationSuite.singleRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.throwErrorRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.doubleTrouble started [info] Test org.apache.spark.network.RpcIntegrationSuite.doubleRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.returnErrorRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamFailures started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamOneAtATime started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendSuccessAndFailure started [info] Test run finished: 0 failed, 0 ignored, 10 total, 0.274s [info] Test run started [info] Test org.apache.spark.network.ProtocolSuite.responses started [info] Test org.apache.spark.network.ProtocolSuite.requests started [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.006s [info] JobGeneratorSuite: [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] - spilling in local cluster with many reduce tasks with java ser (12 seconds, 496 milliseconds) [info] - cleanup of intermediate files in sorter (246 milliseconds) [info] - cleanup of intermediate files in sorter with failures (374 milliseconds) [info] - repeatedly failing task that crashes JVM (11 seconds, 149 milliseconds) [info] - cleanup of intermediate files in shuffle (1 second, 88 milliseconds) [info] - SPARK-6222: Do not clear received block data too soon (8 seconds, 28 milliseconds) [info] ReceiverInputDStreamSuite: [info] - cleanup of intermediate files in shuffle with failures (292 milliseconds) [info] - Without WAL enabled: createBlockRDD creates empty BlockRDD when no block info (233 milliseconds) [info] - no sorting or partial aggregation with kryo ser (169 milliseconds) [warn] multiple main classes detected: run 'show discoveredMainClasses' to see the list [info] - Without WAL enabled: createBlockRDD creates correct BlockRDD with block info (184 milliseconds) [info] - no sorting or partial aggregation with java ser (168 milliseconds) [info] - Without WAL enabled: createBlockRDD filters non-existent blocks before creating BlockRDD (247 milliseconds) [info] - no sorting or partial aggregation with spilling with kryo ser (175 milliseconds) [info] - With WAL enabled: createBlockRDD creates empty WALBackedBlockRDD when no block info (213 milliseconds) [info] - no sorting or partial aggregation with spilling with java ser (218 milliseconds) [info] - With WAL enabled: createBlockRDD creates correct WALBackedBlockRDD with all block info having WAL info (229 milliseconds) [info] - sorting, no partial aggregation with kryo ser (136 milliseconds) [info] - With WAL enabled: createBlockRDD creates BlockRDD when some block info don't have WAL info (251 milliseconds) [info] - sorting, no partial aggregation with java ser (210 milliseconds) [info] FileBasedWriteAheadLogWithFileCloseAfterWriteSuite: [info] - FileBasedWriteAheadLog - read all logs (144 milliseconds) [info] - sorting, no partial aggregation with spilling with kryo ser (204 milliseconds) [info] - FileBasedWriteAheadLog - write logs (98 milliseconds) [info] - sorting, no partial aggregation with spilling with java ser (272 milliseconds) [info] - FileBasedWriteAheadLog - read all logs after write (256 milliseconds) [info] - 
FileBasedWriteAheadLog - clean old logs (81 milliseconds) [info] - partial aggregation, no sorting with kryo ser (156 milliseconds) [info] - FileBasedWriteAheadLog - clean old logs synchronously (89 milliseconds) [info] - partial aggregation, no sorting with java ser (154 milliseconds) [info] - partial aggregation, no sorting with spilling with kryo ser (185 milliseconds) [info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (387 milliseconds) [info] - FileBasedWriteAheadLog - do not create directories or files unless write (5 milliseconds) [info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (14 milliseconds) [info] - FileBasedWriteAheadLog - close after write flag (4 milliseconds) [info] RateLimiterSuite: [info] - rate limiter initializes even without a maxRate set (1 millisecond) [info] - rate limiter updates when below maxRate (0 milliseconds) [info] - rate limiter stays below maxRate despite large updates (0 milliseconds) [info] - partial aggregation, no sorting with spilling with java ser (121 milliseconds) [info] ReceivedBlockTrackerSuite: [info] - block addition, and block to batch allocation (49 milliseconds) [info] - partial aggregation and sorting with kryo ser (144 milliseconds) [info] - partial aggregation and sorting with java ser (132 milliseconds) [info] - partial aggregation and sorting with spilling with kryo ser (160 milliseconds) [info] - partial aggregation and sorting with spilling with java ser (122 milliseconds) [info] - run Spark in yarn-client mode (27 seconds, 153 milliseconds) [info] - sort without breaking sorting contracts with kryo ser (2 seconds, 346 milliseconds) [info] - sort without breaking sorting contracts with java ser (2 seconds, 154 milliseconds) [info] - sort without breaking timsort contracts for large arrays !!! IGNORED !!! 
[info] - spilling with hash collisions (261 milliseconds)
[info] - spilling with many hash collisions (735 milliseconds)
[info] - spilling with hash collisions using the Int.MaxValue key (312 milliseconds)
[info] - spilling with null keys and values (289 milliseconds)
[info] - repeatedly failing task that crashes JVM with a zero exit code (SPARK-16925) (11 seconds, 563 milliseconds)
[info] - sorting updates peak execution memory (1 second, 620 milliseconds)
[info] - force to spill for external sorter (1 second, 55 milliseconds)
[info] DAGSchedulerSuite:
[info] - [SPARK-3353] parent stage should have lower stage id (1 second, 492 milliseconds)
[info] - [SPARK-13902] Ensure no duplicate stages are created (126 milliseconds)
[info] - All shuffle files on the storage endpoint should be cleaned up when it is lost (144 milliseconds)
[info] - SPARK-32003: All shuffle files for executor should be cleaned up on fetch failure (116 milliseconds)
[info] - zero split job (81 milliseconds)
[info] - run trivial job (91 milliseconds)
[info] - run trivial job w/ dependency (93 milliseconds)
[info] - equals and hashCode AccumulableInfo (1 millisecond)
[info] - cache location preferences w/ dependency (116 milliseconds)
[info] - regression test for getCacheLocs (104 milliseconds)
[info] - getMissingParentStages should consider all ancestor RDDs' cache statuses (109 milliseconds)
[info] - avoid exponential blowup when getting preferred locs list (202 milliseconds)
[info] - unserializable task (194 milliseconds)
[info] - trivial job failure (145 milliseconds)
[info] - trivial job cancellation (102 milliseconds)
[info] - job cancellation no-kill backend (81 milliseconds)
[info] - run trivial shuffle (87 milliseconds)
[info] - run trivial shuffle with fetch failure (97 milliseconds)
[info] - shuffle files not lost when executor process lost with shuffle service (78 milliseconds)
[info] - shuffle files lost when worker lost with shuffle service (86 milliseconds)
[info] - shuffle files lost when worker lost without shuffle service (97 milliseconds)
[info] - shuffle files not lost when executor failure with shuffle service (151 milliseconds)
[info] - shuffle files lost when executor failure without shuffle service (90 milliseconds)
[info] - SPARK-28967 properties must be cloned before posting to listener bus for 0 partition (102 milliseconds)
[info] - Single stage fetch failure should not abort the stage. (126 milliseconds)
[info] - Multiple consecutive stage fetch failures should lead to job being aborted. (140 milliseconds)
[info] - Failures in different stages should not trigger an overall abort (170 milliseconds)
[info] - caching (encryption = off) (6 seconds, 706 milliseconds)
[info] - Non-consecutive stage failures don't trigger abort (212 milliseconds)
[info] - trivial shuffle with multiple fetch failures (106 milliseconds)
[info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by FetchFailure (136 milliseconds)
[info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by TaskKilled (112 milliseconds)
[info] - Fail the job if a barrier ResultTask failed (88 milliseconds)
[info] - late fetch failures don't cause multiple concurrent attempts for the same map stage (91 milliseconds)
[info] - extremely late fetch failures don't cause multiple concurrent attempts for the same stage (136 milliseconds)
[info] - task events always posted in speculation / when stage is killed (110 milliseconds)
[info] - ignore late map task completions (96 milliseconds)
[info] - run shuffle with map stage failure (87 milliseconds)
[info] - shuffle fetch failure in a reused shuffle dependency (102 milliseconds)
[info] - don't submit stage until its dependencies map outputs are registered (SPARK-5259) (108 milliseconds)
[info] - register map outputs correctly after ExecutorLost and task Resubmitted (92 milliseconds)
[info] - failure of stage used by two jobs (103 milliseconds)
[info] - stage used by two jobs, the first no longer active (SPARK-6880) (100 milliseconds)
[info] - stage used by two jobs, some fetch failures, and the first job no longer active (SPARK-6880) (168 milliseconds)
[info] - run trivial shuffle with out-of-band executor failure and retry (116 milliseconds)
[info] - recursive shuffle failures (106 milliseconds)
[info] - block addition, and block to batch allocation with many blocks (17 seconds, 83 milliseconds)
[info] - cached post-shuffle (107 milliseconds)
[info] - recovery with write ahead logs should remove only allocated blocks from received queue (30 milliseconds)
[info] - SPARK-30388: shuffle fetch failed on speculative task, but original task succeed (531 milliseconds)
[info] - block allocation to batch should not loose blocks from received queue (695 milliseconds)
[info] - misbehaved accumulator should not crash DAGScheduler and SparkContext (175 milliseconds)
[info] - recovery and cleanup with write ahead logs (65 milliseconds)
[info] - disable write ahead log when checkpoint directory is not set (1 millisecond)
[info] - parallel file deletion in FileBasedWriteAheadLog is robust to deletion error (43 milliseconds)
[info] - misbehaved accumulator should not impact other accumulators (103 milliseconds)
[info] ReceivedBlockHandlerWithEncryptionSuite:
[info] - misbehaved resultHandler should not crash DAGScheduler and SparkContext (165 milliseconds)
[info] - invalid spark.job.interruptOnCancel should not crash DAGScheduler (136 milliseconds)
[info] - BlockManagerBasedBlockHandler - store blocks (316 milliseconds)
[info] - getPartitions exceptions should not crash DAGScheduler and SparkContext (SPARK-8606) (122 milliseconds)
[info] - BlockManagerBasedBlockHandler - handle errors in storing block (8 milliseconds)
[info] - getPreferredLocations errors should not crash DAGScheduler and SparkContext (SPARK-8606) (140 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - store blocks (112 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (21 milliseconds)
[info] - accumulator not calculated for resubmitted result stage (126 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - clean old blocks (39 milliseconds)
[info] - accumulator not calculated for resubmitted task in result stage (95 milliseconds)
[info] - accumulators are updated on exception failures and task killed (86 milliseconds)
[info] - Test Block - count messages (124 milliseconds)
[info] - reduce tasks should be placed locally with map output (124 milliseconds)
[info] - Test Block - isFullyConsumed (34 milliseconds)
[info] ReceiverSchedulingPolicySuite:
[info] - rescheduleReceiver: empty executors (1 millisecond)
[info] - rescheduleReceiver: receiver preferredLocation (6 milliseconds)
[info] - rescheduleReceiver: return all idle executors if there are any idle executors (6 milliseconds)
[info] - rescheduleReceiver: return all executors that have minimum weight if no idle executors (5 milliseconds)
[info] - scheduleReceivers: schedule receivers evenly when there are more receivers than executors (8 milliseconds)
[info] - scheduleReceivers: schedule receivers evenly when there are more executors than receivers (4 milliseconds)
[info] - scheduleReceivers: schedule receivers evenly when the preferredLocations are even (5 milliseconds)
[info] - scheduleReceivers: return empty if no receiver (1 millisecond)
[info] - scheduleReceivers: return empty scheduled executors if no executors (2 milliseconds)
[info] MapWithStateSuite:
[info] - reduce task locality preferences should only include machines with largest map outputs (124 milliseconds)
[info] - state - get, exists, update, remove, (6 milliseconds)
[info] - stages with both narrow and shuffle dependencies use narrow ones for locality (96 milliseconds)
[info] - Spark exceptions should include call site in stack trace (114 milliseconds)
[info] - catch errors in event loop (95 milliseconds)
[info] - simple map stage submission (169 milliseconds)
[info] - map stage submission with reduce stage also depending on the data (163 milliseconds)
[info] - map stage submission with fetch failure (161 milliseconds)
[info] - map stage submission with multiple shared stages and failures (136 milliseconds)
[info] - Trigger mapstage's job listener in submitMissingTasks (114 milliseconds)
[info] - map stage submission with executor failure late map task completions (95 milliseconds)
[info] - getShuffleDependenciesAndResourceProfiles correctly returns only direct shuffle parents (79 milliseconds)
[info] - mapWithState - basic operations with simple API (1 second, 321 milliseconds)
[info] - mapWithState - basic operations with advanced API (535 milliseconds)
[info] - mapWithState - type inferencing and class tags (10 milliseconds)
[info] - caching (encryption = on) (6 seconds, 671 milliseconds)
[info] - SPARK-17644: After one stage is aborted for too many failed attempts, subsequent stagesstill behave correctly on fetch failures (1 second, 516 milliseconds)
[info] - mapWithState - states as mapped data (484 milliseconds)
[info] - [SPARK-19263] DAGScheduler should not submit multiple active tasksets, even with late completions from earlier stage attempts (140 milliseconds)
[info] - task end event should have updated accumulators (SPARK-20342) (479 milliseconds)
[info] - mapWithState - initial states, with nothing returned as from mapping function (563 milliseconds)
[info] - Barrier task failures from the same stage attempt don't trigger multiple stage retries (145 milliseconds)
[info] - Barrier task failures from a previous stage attempt don't trigger stage retry (112 milliseconds)
[info] - SPARK-25341: abort stage while using old fetch protocol (134 milliseconds)
[info] - SPARK-25341: retry all the succeeding stages when the map stage is indeterminate (152 milliseconds)
[info] - SPARK-25341: continuous indeterminate stage roll back (171 milliseconds)
[info] - mapWithState - state removing (689 milliseconds)
[info] - SPARK-29042: Sampled RDD with unordered input should be indeterminate (133 milliseconds)
[info] - SPARK-23207: cannot rollback a result stage (131 milliseconds)
[info] - SPARK-23207: local checkpoint fail to rollback (checkpointed before) (139 milliseconds)
[info] - SPARK-23207: local checkpoint fail to rollback (checkpointing now) (134 milliseconds)
[info] - SPARK-23207: reliable checkpoint can avoid rollback (checkpointed before) (400 milliseconds)
[info] - SPARK-23207: reliable checkpoint fail to rollback (checkpointing now) (234 milliseconds)
[info] - SPARK-27164: RDD.countApprox on empty RDDs schedules jobs which never complete (214 milliseconds)
[info] - mapWithState - state timing out (1 second, 479 milliseconds)
[info] - Completions in zombie tasksets update status of non-zombie taskset (185 milliseconds)
[info] - mapWithState - checkpoint durations (77 milliseconds)
[info] - test default resource profile (130 milliseconds)
[info] - test 1 resource profile (122 milliseconds)
-------------------------------------------
Time: 1000 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,1)

[info] - test 2 resource profiles errors by default (122 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,2)
(b,1)

[info] - test 2 resource profile with merge conflict config true (114 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,2)
(b,1)

[info] - test multiple resource profiles created from merging use same rp (133 milliseconds)
[info] - test merge 2 resource profiles multiple configs (3 milliseconds)
[info] - test merge 3 resource profiles (1 millisecond)
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,3)
(b,2)
(c,1)

-------------------------------------------
Time: 5000 ms
-------------------------------------------
(c,1)
(a,4)
(b,3)

[info] - getShuffleDependenciesAndResourceProfiles returns deps and profiles correctly (132 milliseconds)
-------------------------------------------
Time: 6000 ms
-------------------------------------------
(b,3)
(c,1)
(a,5)

-------------------------------------------
Time: 7000 ms
-------------------------------------------
(a,5)
(b,3)
(c,1)

[info] - mapWithState - driver failure recovery (708 milliseconds)
[info] JavaStreamingListenerWrapperSuite:
[info] - basic (16 milliseconds)
[info] DurationSuite:
[info] - less (1 millisecond)
[info] - lessEq (1 millisecond)
[info] - greater (1 millisecond)
[info] - greaterEq (1 millisecond)
[info] - plus (0 milliseconds)
[info] - minus (0 milliseconds)
[info] - times (0 milliseconds)
[info] - div (0 milliseconds)
[info] - isMultipleOf (0 milliseconds)
[info] - min (1 millisecond)
[info] - max (1 millisecond)
[info] - isZero (0 milliseconds)
[info] - Milliseconds (1 millisecond)
[info] - Seconds (0 milliseconds)
[info] - Minutes (1 millisecond)
[info] PIDRateEstimatorSuite:
[info] - the right estimator is created (6 milliseconds)
[info] - estimator checks ranges (2 milliseconds)
[info] - first estimate is None (3 milliseconds)
[info] - second estimate is not None (1 millisecond)
[info] - no estimate when no time difference between successive calls (2 milliseconds)
[info] - no estimate when no records in previous batch (0 milliseconds)
[info] - no estimate when there is no processing delay (0 milliseconds)
[info] - estimate is never less than min rate (4 milliseconds)
[info] - with no accumulated or positive error, |I| > 0, follow the processing speed (4 milliseconds)
[info] - with no accumulated but some positive error, |I| > 0, follow the processing speed (4 milliseconds)
[info] - with some accumulated and some positive error, |I| > 0, stay below the processing speed (20 milliseconds)
[info] WindowOperationsSuite:
[info] - run Spark in yarn-cluster mode (26 seconds, 57 milliseconds)
[info] - SPARK-32920: shuffle merge finalization (665 milliseconds)
[info] - window - basic window (450 milliseconds)
[info] - SPARK-32920: merger locations not empty (164 milliseconds)
[info] - window - tumbling window (317 milliseconds)
[info] - SPARK-32920: merger locations reuse from shuffle dependency (185 milliseconds)
[info] - SPARK-32920: Disable shuffle merge due to not enough mergers available (188 milliseconds)
[info] - caching on disk (encryption = off) (5 seconds, 923 milliseconds)
[info] - window - larger window (329 milliseconds)
[info] - SPARK-32920: Ensure child stage should not start before all the parent stages are completed with shuffle merge finalized for all the parent stages (163 milliseconds)
[info] - window - non-overlapping window (269 milliseconds)
[info] - SPARK-32920: Reused ShuffleDependency with Shuffle Merge disabled for the corresponding ShuffleDependency should not cause DAGScheduler to hang (178 milliseconds)
[info] - window - persistence level (90 milliseconds)
[info] - SPARK-32920: Reused ShuffleDependency with Shuffle Merge disabled for the corresponding ShuffleDependency with shuffle data loss should recompute missing partitions (182 milliseconds)
[info] - reduceByKeyAndWindow - basic reduction (419 milliseconds)
[info] - SPARK-32920: Empty RDD should not be computed (211 milliseconds)
[info] - SPARK-32920: Merge results should be unregistered if the running stage is cancelled before shuffle merge is finalized (156 milliseconds)
[info] - SPARK-32920: SPARK-35549: Merge results should not get registered after shuffle merge finalization (139 milliseconds)
[info] - reduceByKeyAndWindow - key already in window and new value added into window (390 milliseconds)
[info] - SPARK-32920: Disable push based shuffle in the case of a barrier stage (159 milliseconds)
[info] - reduceByKeyAndWindow - new key added into window (316 milliseconds)
[info] - SPARK-32920: metadata fetch failure should not unregister map status (151 milliseconds)
[info] - SPARK-32923: handle stage failure for indeterminate map stage with push-based shuffle (312 milliseconds)
[info] - reduceByKeyAndWindow - key removed from window (498 milliseconds)
[info] FsHistoryProviderSuite:
[info] - Parse application logs (inMemory = true) (367 milliseconds)
[info] - reduceByKeyAndWindow - larger slide time (431 milliseconds)
[info] - reduceByKeyAndWindow - big test (701 milliseconds)
[info] - Parse application logs (inMemory = false) (794 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - basic reduction (332 milliseconds)
[info] - SPARK-31608: parse application logs with HybridStore (359 milliseconds)
[info] - SPARK-3697: ignore files that cannot be read. (171 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - key already in window and new value added into window (448 milliseconds)
[info] - history file is renamed from inprogress to completed (205 milliseconds)
[info] - Parse logs that application is not started (74 milliseconds)
[info] - SPARK-5582: empty log directory (152 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - new key added into window (436 milliseconds)
[info] - apps with multiple attempts with order (668 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - key removed from window (502 milliseconds)
[info] - log urls without customization (501 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - larger slide time (595 milliseconds)
[info] - custom log urls, including FILE_NAME (444 milliseconds)
[info] - caching on disk (encryption = on) (5 seconds, 800 milliseconds)
[info] - custom log urls, excluding FILE_NAME (420 milliseconds)
[info] - reduceByKeyAndWindow with inverse function - big test (783 milliseconds)
[info] - custom log urls with invalid attribute (395 milliseconds)
[info] - reduceByKeyAndWindow with inverse and filter functions - big test (779 milliseconds)
[info] - custom log urls, LOG_FILES not available while FILE_NAME is specified (428 milliseconds)
[info] - custom log urls, app not finished, applyIncompleteApplication: true (326 milliseconds)
[info] - groupByKeyAndWindow (585 milliseconds)
[info] - custom log urls, app not finished, applyIncompleteApplication: false (330 milliseconds)
[info] - log cleaner (164 milliseconds)
[info] - should not clean inprogress application with lastUpdated time less than maxTime (137 milliseconds)
[info] - countByWindow (437 milliseconds)
[info] - log cleaner for inProgress files (109 milliseconds)
[info] - Event log copy (115 milliseconds)
[info] - driver log cleaner (94 milliseconds)
[info] - countByValueAndWindow (384 milliseconds)
[info] - SPARK-8372: new logs with no app ID are ignored (62 milliseconds)
[info] TimeSuite:
[info] - less (0 milliseconds)
[info] - lessEq (0 milliseconds)
[info] - greater (0 milliseconds)
[info] - greaterEq (1 millisecond)
[info] - plus (0 milliseconds)
[info] - minus Time (0 milliseconds)
[info] - minus Duration (1 millisecond)
[info] - floor (0 milliseconds)
[info] - isMultipleOf (0 milliseconds)
[info] - min (1 millisecond)
[info] - max (0 milliseconds)
[info] - until (0 milliseconds)
[info] - to (1 millisecond)
[info] DStreamScopeSuite:
[info] - dstream without scope (1 millisecond)
[info] - input dstream without scope (3 milliseconds)
[info] - scoping simple operations (9 milliseconds)
[info] - scoping nested operations (44 milliseconds)
[info] - provider correctly checks whether fs is in safe mode (517 milliseconds)
[info] - provider waits for safe mode to finish before initializing (47 milliseconds)
[info] - transform should allow RDD operations to be captured in scopes (24 milliseconds)
[info] - provider reports error after FS leaves safe mode (76 milliseconds)
[info] - foreachRDD should allow RDD operations to be captured in scope (24 milliseconds)
[info] - ignore hidden files (95 milliseconds)
[info] StreamingContextSuite:
[info] - from no conf constructor (88 milliseconds)
[info] - from no conf + spark home (71 milliseconds)
[info] - from no conf + spark home + env (68 milliseconds)
[info] - from conf with settings (322 milliseconds)
[info] - support history server ui admin acls (633 milliseconds)
[info] - from existing SparkContext (75 milliseconds)
[info] - mismatched version discards old listing (168 milliseconds)
[info] - from existing SparkContext with settings (132 milliseconds)
[info] - from checkpoint (309 milliseconds)
[info] - invalidate cached UI (474 milliseconds)
[info] - checkPoint from conf (126 milliseconds)
[info] - state matching (1 millisecond)
[info] - start and stop state check (138 milliseconds)
[info] - clean up stale app information (344 milliseconds)
[info] - start with non-serializable DStream checkpoints (155 milliseconds)
[info] - SPARK-21571: clean up removes invalid history files (117 milliseconds)
[info] - start failure should stop internal components (131 milliseconds)
[info] - always find end event for finished apps (183 milliseconds)
[info] - parse event logs with optimizations off (148 milliseconds)
[info] - start should set local properties of streaming jobs correctly (486 milliseconds)
[info] - start multiple times (85 milliseconds)
[info] - SPARK-24948: ignore files we don't have read permission on (295 milliseconds)
[info] - stop multiple times (120 milliseconds)
[info] - stop before start (89 milliseconds)
[info] - check in-progress event logs absolute length (252 milliseconds)
[info] - caching in memory, replicated (encryption = off) (5 seconds, 900 milliseconds)
[info] - start after stop (131 milliseconds)
[info] - stop only streaming context (193 milliseconds)
[info] - stop(stopSparkContext=true) after stop(stopSparkContext=false) (111 milliseconds)
[info] - log cleaner with the maximum number of log files (695 milliseconds)
[info] - backwards compatibility with LogInfo from Spark 2.4 (7 milliseconds)
[info] - SPARK-29755 LogInfo should be serialized/deserialized by jackson properly (4 milliseconds)
[info] - SPARK-29755 AttemptInfoWrapper should be serialized/deserialized by jackson properly (47 milliseconds)
[info] - SPARK-29043: clean up specified event log (102 milliseconds)
[info] - compact event log files (225 milliseconds)
[info] - SPARK-33146: don't let one bad rolling log folder prevent loading other applications (123 milliseconds)
[info] - SPARK-36354: EventLogFileReader should skip rolling event log directories with no logs (80 milliseconds)
[info] - SPARK-33215: check ui view permissions without retrieving ui (306 milliseconds)
[info] RDDSuite:
[info] - basic operations (903 milliseconds)
[info] - serialization (3 milliseconds)
[info] - distinct with known partitioner preserves partitioning (273 milliseconds)
[info] - countApproxDistinct (208 milliseconds)
[info] - SparkContext.union (84 milliseconds)
[info] - SparkContext.union parallel partition listing (134 milliseconds)
[info] - SparkContext.union creates UnionRDD if at least one RDD has no partitioner (3 milliseconds)
[info] - SparkContext.union creates PartitionAwareUnionRDD if all RDDs have partitioners (4 milliseconds)
[info] - PartitionAwareUnionRDD raises exception if at least one RDD has no partitioner (2 milliseconds)
[info] - SPARK-23778: empty RDD in union should not produce a UnionRDD (12 milliseconds)
[info] - partitioner aware union (164 milliseconds)
[info] - UnionRDD partition serialized size should be small (10 milliseconds)
[info] - fold (20 milliseconds)
[info] - fold with op modifying first arg (30 milliseconds)
[info] - aggregate (28 milliseconds)
[info] - treeAggregate (630 milliseconds)
[info] - treeAggregate with ops modifying first args (813 milliseconds)
[info] - SPARK-36419: treeAggregate with finalAggregateOnExecutor set to true (1 second, 56 milliseconds)
[info] - caching in memory, replicated (encryption = off) (with replication as stream) (6 seconds, 184 milliseconds)
[info] - run Spark in yarn-client mode with unmanaged am (19 seconds, 49 milliseconds)
[info] - treeReduce (566 milliseconds)
[info] - basic caching (42 milliseconds)
[info] - caching with failures (20 milliseconds)
[info] - empty RDD (177 milliseconds)
[info] - repartitioned RDDs (256 milliseconds)
[info] - repartitioned RDDs perform load balancing (2 seconds, 302 milliseconds)
[info] - coalesced RDDs (170 milliseconds)
[info] - coalesced RDDs with locality (75 milliseconds)
[info] - coalesced RDDs with partial locality (44 milliseconds)
[info] - stop gracefully (9 seconds, 637 milliseconds)
[info] - coalesced RDDs with locality, large scale (10K partitions) (1 second, 47 milliseconds)
[info] - stop gracefully even if a receiver misses StopReceiver (715 milliseconds)
[info] - coalesced RDDs with partial locality, large scale (10K partitions) (506 milliseconds)
[info] - coalesced RDDs with locality, fail first pass (10 milliseconds)
[info] - zipped RDDs (69 milliseconds)
[info] - partition pruning (23 milliseconds)
[info] - caching in memory, replicated (encryption = on) (6 seconds, 159 milliseconds)
[info] - collect large number of empty partitions (2 seconds, 290 milliseconds)
[info] - take (1 second, 956 milliseconds)
[info] - top with predefined ordering (138 milliseconds)
[info] - top with custom ordering (17 milliseconds)
[info] - takeOrdered with predefined ordering (14 milliseconds)
[info] - takeOrdered with limit 0 (1 millisecond)
[info] - takeOrdered with custom ordering (14 milliseconds)
[info] - isEmpty (92 milliseconds)
[info] - sample preserves partitioner (2 milliseconds)
[info] - caching in memory, replicated (encryption = on) (with replication as stream) (5 seconds, 947 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = off) (5 seconds, 378 milliseconds)
[info] - stop slow receiver gracefully (15 seconds, 889 milliseconds)
[info] - registering and de-registering of streamingSource (141 milliseconds)
[info] - SPARK-28709 registering and de-registering of progressListener (145 milliseconds)
[info] - awaitTermination (2 seconds, 120 milliseconds)
[info] - run Spark in yarn-client mode with different configurations, ensuring redaction (23 seconds, 45 milliseconds)
[info] - awaitTermination after stop (130 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = off) (with replication as stream) (5 seconds, 495 milliseconds)
[info] - awaitTermination with error in task (247 milliseconds)
[info] - awaitTermination with error in job generation (436 milliseconds)
[info] - awaitTerminationOrTimeout (1 second, 95 milliseconds)
[info] - getOrCreate (617 milliseconds)
[info] - takeSample (15 seconds, 889 milliseconds)
[info] - takeSample from an empty rdd (13 milliseconds)
[info] - getActive and getActiveOrCreate (227 milliseconds)
[info] - randomSplit (603 milliseconds)
[info] - runJob on an invalid partition (7 milliseconds)
[info] - sort an empty RDD (20 milliseconds)
[info] - sortByKey (139 milliseconds)
[info] - sortByKey ascending parameter (108 milliseconds)
[info] - sortByKey with explicit ordering (90 milliseconds)
[info] - repartitionAndSortWithinPartitions (22 milliseconds)
[info] - SPARK-32384: repartitionAndSortWithinPartitions without shuffle (10 milliseconds)
[info] - cartesian on empty RDD (21 milliseconds)
[info] - cartesian on non-empty RDDs (57 milliseconds)
[info] - intersection (78 milliseconds)
[info] - getActiveOrCreate with checkpoint (969 milliseconds)
[info] - intersection strips duplicates in an input (72 milliseconds)
[info] - multiple streaming contexts (72 milliseconds)
[info] - zipWithIndex (29 milliseconds)
[info] - zipWithIndex with a single partition (12 milliseconds)
[info] - zipWithIndex chained with other RDDs (SPARK-4433) (32 milliseconds)
[info] - zipWithUniqueId (51 milliseconds)
[info] - retag with implicit ClassTag (23 milliseconds)
[info] - parent method (6 milliseconds)
[info] - getNarrowAncestors (32 milliseconds)
[info] - getNarrowAncestors with multiple parents (22 milliseconds)
[info] - getNarrowAncestors with cycles (26 milliseconds)
[info] - task serialization exception should not hang scheduler (37 milliseconds)
[info] - RDD.partitions() fails fast when partitions indices are incorrect (SPARK-13021) (1 millisecond)
[info] - nested RDDs are not supported (SPARK-5063) (21 milliseconds)
[info] - actions cannot be performed inside of transformations (SPARK-5063) (21 milliseconds)
[info] - DStream and generated RDD creation sites (525 milliseconds)
[info] - throw exception on using active or stopped context (139 milliseconds)
[info] - custom RDD coalescer (416 milliseconds)
[info] - SPARK-18406: race between end-of-task and completion iterator read lock release (47 milliseconds)
[info] - queueStream doesn't support checkpointing (400 milliseconds)
[info] - SPARK-27666: Do not release lock while TaskContext already completed (1 second, 50 milliseconds)
[info] - SPARK-23496: order of input partitions can result in severe skew in coalesce (4 milliseconds)
[info] - cannot run actions after SparkContext has been stopped (SPARK-5063) (72 milliseconds)
[info] - cannot call methods on a stopped SparkContext (SPARK-5063) (5 milliseconds)
[info] ExecutorSuite:
[info] - Creating an InputDStream but not using it should not crash (960 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = on) (5 seconds, 542 milliseconds)
[info] - SPARK-15963: Catch `TaskKilledException` correctly in Executor.TaskRunner (266 milliseconds)
[info] - SPARK-19276: Handle FetchFailedExceptions that are hidden by user exceptions (112 milliseconds)
[info] - Executor's worker threads should be UninterruptibleThread (82 milliseconds)
[info] - SPARK-19276: OOMs correctly handled with a FetchFailure (114 milliseconds)
[info] - SPARK-23816: interrupts are not masked by a FetchFailure (93 milliseconds)
[info] - Gracefully handle error in task deserialization (10 milliseconds)
[info] - Heartbeat should drop zero accumulator updates (86 milliseconds)
[info] - Heartbeat should not drop zero accumulator updates when the conf is disabled (5 milliseconds)
[info] - Send task executor metrics in DirectTaskResult (75 milliseconds)
[info] - Send task executor metrics in TaskKilled (83 milliseconds)
[info] - Send task executor metrics in ExceptionFailure (90 milliseconds)
[info] - SPARK-34949: do not re-register BlockManager when executor is shutting down (7 milliseconds)
[info] - SPARK-33587: isFatalError (58 milliseconds)
[info] SerDeUtilSuite:
[info] - Converting an empty pair RDD to python does not throw an exception (SPARK-5441) (44 milliseconds)
[info] - Converting an empty python RDD to pair RDD does not throw an exception (SPARK-5441) (42 milliseconds)
[info] UtilsSuite:
[info] - timeConversion (3 milliseconds)
[info] - Test byteString conversion (4 milliseconds)
[info] - bytesToString (1 millisecond)
[info] - copyStream (8 milliseconds)
[info] - copyStreamUpTo (19 milliseconds)
[info] - memoryStringToMb (1 millisecond)
[info] - splitCommandString (1 millisecond)
[info] - string formatting of time durations (1 millisecond)
[info] - reading offset bytes of a file (7 milliseconds)
[info] - reading offset bytes of a file (compressed) (6 milliseconds)
[info] - reading offset bytes across multiple files (11 milliseconds)
[info] - reading offset bytes across multiple files (compressed) (9 milliseconds)
[info] - deserialize long value (1 millisecond)
[info] - writeByteBuffer should not change ByteBuffer position (1 millisecond)
[info] - get iterator size (1 millisecond)
[info] - getIteratorZipWithIndex (1 millisecond)
[info] - SPARK-35907: createDirectory (24 milliseconds)
[info] - doesDirectoryContainFilesNewerThan (8 milliseconds)
[info] - resolveURI (1 millisecond)
[info] - resolveURIs with multiple paths (2 milliseconds)
[info] - nonLocalPaths (3 milliseconds)
[info] - isBindCollision (3 milliseconds)
[info] - log4j log level change (1 millisecond)
[info] - deleteRecursively (16 milliseconds)
[info] - loading properties from file (9 milliseconds)
[info] - timeIt with prepare (2 seconds, 3 milliseconds)
[info] - fetch hcfs dir (36 milliseconds)
[info] - shutdown hook manager (4 milliseconds)
[info] - isInDirectory (4 milliseconds)
[info] - circular buffer: if nothing was written to the buffer, display nothing (1 millisecond)
[info] - circular buffer: if the buffer isn't full, print only the contents written (1 millisecond)
[info] - circular buffer: data written == size of the buffer (0 milliseconds)
[info] - circular buffer: multiple overflow (0 milliseconds)
[info] - isDynamicAllocationEnabled (1 millisecond)
[info] - getDynamicAllocationInitialExecutors (3 milliseconds)
[info] - Set Spark CallerContext (0 milliseconds)
[info] - encodeFileNameToURIRawPath (1 millisecond)
[info] - decodeFileNameInURI (0 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = on) (with replication as stream) (5 seconds, 824 milliseconds)
[info] - Kill process (5 seconds, 36 milliseconds)
[info] - chi square test of randomizeInPlace (23 milliseconds)
[info] - redact sensitive information (2 milliseconds)
[info] - redact sensitive information in command line args (3 milliseconds)
[info] - redact sensitive information in sequence of key value pairs (1 millisecond)
[info] - tryWithSafeFinally (3 milliseconds)
[info] - tryWithSafeFinallyAndFailureCallbacks (5 milliseconds)
[info] - load extensions (4 milliseconds)
[info] - check Kubernetes master URL (3 milliseconds)
[info] - stringHalfWidth (1 millisecond)
[info] - trimExceptCRLF standalone (4 milliseconds)
[info] - pathsToMetadata (1 millisecond)
[info] - checkHost supports both IPV4 and IPV6 (2 milliseconds)
[info] - checkHostPort support IPV6 and IPV4 (1 millisecond)
[info] - parseHostPort support IPV6 and IPV4 (1 millisecond)
[info] - executorOffHeapMemorySizeAsMb when MEMORY_OFFHEAP_ENABLED is false (0 milliseconds)
[info] - executorOffHeapMemorySizeAsMb when MEMORY_OFFHEAP_ENABLED is true (1 millisecond)
[info] - executorMemoryOverhead when MEMORY_OFFHEAP_ENABLED is true, but MEMORY_OFFHEAP_SIZE not config scene (0 milliseconds)
[info] - isPushBasedShuffleEnabled when PUSH_BASED_SHUFFLE_ENABLED and SHUFFLE_SERVICE_ENABLED are both set to true in YARN mode with maxAttempts set to 1 (3 milliseconds)
[info] PagedDataSourceSuite:
[info] - basic (2 milliseconds)
[info] CheckpointStorageSuite:
[info] - checkpoint compression (283 milliseconds)
[info] - cache checkpoint preferred location (213 milliseconds)
[info] - SPARK-31484: checkpoint should not fail in retry (674 milliseconds)
[info] SortingSuite:
[info] - sortByKey (41 milliseconds)
[info] - large array (53 milliseconds)
[info] - large array with one split (45 milliseconds)
[info] - large array with many partitions (89 milliseconds)
[info] - sort descending (64 milliseconds)
[info] - sort descending with one split (44 milliseconds)
[info] - sort descending with many partitions (90 milliseconds)
[info] - more partitions than elements (86 milliseconds)
[info] - empty RDD (51 milliseconds)
[info] - partition balancing (101 milliseconds)
[info] - partition balancing for descending sort (107 milliseconds)
[info] - get a range of elements in a sorted RDD that is on one partition (78 milliseconds)
[info] - get a range of elements over multiple partitions in a descendingly sorted RDD (66 milliseconds)
[info] - get a range of elements in an array not partitioned by a range partitioner (21 milliseconds)
[info] - get a range of elements over multiple partitions but not taking up full partitions (76 milliseconds)
[info] RpcAddressSuite:
[info] - hostPort (1 millisecond)
[info] - fromSparkURL (0 milliseconds)
[info] - fromSparkURL: a typo url (0 milliseconds)
[info] - fromSparkURL: invalid scheme (1 millisecond)
[info] - toSparkURL (1 millisecond)
[info] JavaSerializerSuite:
[info] - JavaSerializer instances are serializable (1 millisecond)
[info] - Deserialize object containing a primitive Class as attribute (9 milliseconds)
[info] - SPARK-36627: Deserialize object containing a proxy Class as attribute (10 milliseconds)
[info] LocalDirsSuite:
[info] - Utils.getLocalDir() returns a valid directory, even if some local dirs are missing (7 milliseconds)
[info] - SPARK_LOCAL_DIRS override also affects driver (6 milliseconds)
[info] - Utils.getLocalDir() throws an exception if any temporary directory cannot be retrieved (9 milliseconds)
[info] TaskContextSuite:
[info] - provide metrics sources (183 milliseconds)
[info] - calls TaskCompletionListener after failure (143 milliseconds)
[info] - calls TaskFailureListeners after failure (104 milliseconds)
[info] - all TaskCompletionListeners should be called even if some fail (10 milliseconds)
[info] - all TaskFailureListeners should be called even if some fail (11 milliseconds)
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:187)
	at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:334)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:925)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2227)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2248)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2267)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2292)
	at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1021)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:1020)
	at org.apache.spark.streaming.StreamingContextSuite.$anonfun$new$133(StreamingContextSuite.scala:842)
	at org.apache.spark.streaming.StreamingContextSuite.$anonfun$new$133$adapted(StreamingContextSuite.scala:840)
	at org.apache.spark.streaming.dstream.DStream.$anonfun$foreachRDD$2(DStream.scala:629)
	at org.apache.spark.streaming.dstream.DStream.$anonfun$foreachRDD$2$adapted(DStream.scala:629)
	at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$2(ForEachDStream.scala:51)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:417)
	at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$1(ForEachDStream.scala:51)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.util.Try$.apply(Try.scala:213)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.$anonfun$run$1(JobScheduler.scala:256)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
[info] - TaskContext.attemptNumber should return attempt number, not task id (SPARK-4014) (141 milliseconds)
[info] - SPARK-18560 Receiver data should be deserialized properly. (11 seconds, 793 milliseconds)
[info] - caching on disk, replicated 2 (encryption = off) (6 seconds, 212 milliseconds)
[info] - TaskContext.stageAttemptNumber getter (555 milliseconds)
[info] - accumulators are updated on exception failures (194 milliseconds)
[info] - failed tasks collect only accumulators whose values count during failures (96 milliseconds)
[info] - only updated internal accumulators will be sent back to driver (109 milliseconds)
[info] - localProperties are propagated to executors correctly (96 milliseconds)
[info] - immediately call a completion listener if the context is completed (1 millisecond)
[info] - immediately call a failure listener if the context has failed (1 millisecond)
[info] - TaskCompletionListenerException.getMessage should include previousError (1 millisecond)
[info] - all TaskCompletionListeners should be called even if some fail or a task (3 milliseconds)
[info] - listener registers another listener (reentrancy) (1 millisecond)
[info] - listener registers another listener using a second thread (2 milliseconds)
[info] - SPARK-22955 graceful shutdown shouldn't lead to job generation error (1 second, 314 milliseconds)
[info] DStreamClosureSuite:
[info] - user provided closures are actually cleaned (115 milliseconds)
[info] - listeners registered from different threads are called sequentially (405 milliseconds)
[info] - listeners registered from same thread are called in reverse order (1 millisecond)
[info] HistoryServerSuite:
[info] BatchedWriteAheadLogSuite:
[info] - BatchedWriteAheadLog - read all logs (33 milliseconds)
[info] - BatchedWriteAheadLog - write logs (30 milliseconds)
[info] - BatchedWriteAheadLog - read all logs after write (33 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs (28 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs synchronously (26 milliseconds)
[info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (98 milliseconds)
[info] - BatchedWriteAheadLog - do not create directories or files unless write (2 milliseconds)
[info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (16 milliseconds)
[info] - BatchedWriteAheadLog - serializing and deserializing batched records (2 milliseconds)
[info] - BatchedWriteAheadLog - failures in wrappedLog get bubbled up (94 milliseconds)
[info] - BatchedWriteAheadLog - name log with the highest timestamp of aggregated entries (27 milliseconds)
[info] - BatchedWriteAheadLog - shutdown properly (2 milliseconds)
[info] - BatchedWriteAheadLog - fail everything in queue during shutdown (10 milliseconds)
[info] ReceiverTrackerSuite:
[info] - send rate update to receivers (398 milliseconds)
[info] - application list json (1 second, 413 milliseconds)
[info] - completed app list json (38 milliseconds)
[info] - running app list json (9 milliseconds)
[info] - minDate app list json (11 milliseconds)
[info] - maxDate app list json (8 milliseconds)
[info] - maxDate2 app list json (7 milliseconds)
[info] - should restart receiver after stopping it (804 milliseconds)
[info] - minEndDate app list json (9 milliseconds)
[info] - maxEndDate app list json (9 milliseconds)
[info] - minEndDate and maxEndDate app list json (6 milliseconds)
[info] - minDate and maxEndDate app list json (6 milliseconds)
[info] - limit app list json (6 milliseconds)
[info] - one app json (64 milliseconds)
[info] - one app multi-attempt json (6 milliseconds)
[info] - SPARK-11063: TaskSetManager should use Receiver RDD's preferredLocations (403 milliseconds)
[info] - get allocated executors (525 milliseconds)
[info] StateMapSuite:
[info] - EmptyStateMap (2 milliseconds)
[info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove (1 millisecond)
[info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove with copy (1 millisecond)
[info] - job list json (877 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing (57 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing with compaction (7 milliseconds)
[info] - job list from multi-attempt app json(1) (788 milliseconds)
[info] - job list from multi-attempt app json(2) (679 milliseconds)
[info] - one job json (6 milliseconds)
[info] - succeeded job list json (7 milliseconds)
[info] - succeeded&failed job list json (10 milliseconds)
[info] - executor list json (119 milliseconds)
[info] - caching on disk, replicated 2 (encryption = off) (with replication as stream) (5 seconds, 594 milliseconds)
[info] - executor list with executor metrics json (1 second, 164 milliseconds)
[info] - run Spark in yarn-cluster mode with different configurations, ensuring redaction (25 seconds, 44 milliseconds)
[info] - stage list json (417 milliseconds)
[info] - complete stage list json (14 milliseconds)
[info] - failed stage list json (8 milliseconds)
[info] - one stage json (191 milliseconds)
[info] - one stage json with details (27 milliseconds)
[info] - one stage attempt json (21 milliseconds)
[info] - one stage attempt json details with failed task (10 milliseconds)
[info] - stage task summary w shuffle write (955 milliseconds)
[info] - stage task summary w shuffle read (24 milliseconds)
[info] - stage task summary w/ custom quantiles (72 milliseconds)
[info] - stage task list (99 milliseconds)
[info] - stage task list w/ offset & length (24 milliseconds)
[info] - stage task list w/ sortBy (13 milliseconds)
[info] - stage task list w/ sortBy short names: -runtime (14 milliseconds)
[info] - stage task list w/ sortBy short names: runtime (12 milliseconds)
[info] - stage task list w/ status (843 milliseconds)
[info] - stage task list w/ status & offset & length (16 milliseconds)
[info] - stage task list w/ status & sortBy short names: runtime (20 milliseconds)
[info] - stage list with accumulable json (379 milliseconds)
[info] - stage with accumulable json (218 milliseconds)
[info] - stage task list from multi-attempt app json(1) (11 milliseconds)
[info] - stage task list from multi-attempt app json(2) (210 milliseconds)
[info] - caching on disk, replicated 2 (encryption = on) (5 seconds, 413 milliseconds)
[info] - excludeOnFailure for stage (1 second, 82 milliseconds)
[info] - OpenHashMapBasedStateMap - all possible sequences of operations with copies (7 seconds, 619 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing with KryoSerializable states (16 milliseconds)
[info] - EmptyStateMap - serializing and deserializing (15 milliseconds)
[info] - MapWithStateRDDRecord - serializing and deserializing with KryoSerializable states (8 milliseconds)
[info] RecurringTimerSuite:
[info] - basic (2 milliseconds)
[info] - SPARK-10224: call 'callback' after stopping (6 milliseconds)
[info] CheckpointSuite:
[info] - non-existent checkpoint dir (2 milliseconds)
[info] - excludeOnFailure node for stage (1 second, 67 milliseconds)
[info] - rdd list storage json (76 milliseconds)
[info] - executor node excludeOnFailure (721 milliseconds)
[info] - executor node excludeOnFailure unexcluding (9 milliseconds)
[info] - executor memory usage (9 milliseconds)
[info] - executor resource information (432 milliseconds)
[info] - multiple resource profiles (842 milliseconds)
[info] - stage list with peak metrics (1 second, 504 milliseconds)
[info] - stage with peak metrics (182 milliseconds)
[info] - stage with summaries (128 milliseconds)
[info] - app environment (64 milliseconds)
[info] - one rdd storage json (18 milliseconds)
[info] - miscellaneous process (45 milliseconds)
[info] - download all logs for app with multiple attempts (64 milliseconds)
[info] - download one log for app with multiple attempts (75 milliseconds)
[info] - response codes on bad paths (46 milliseconds)
[info] - automatically retrieve uiRoot from request through Knox (63 milliseconds)
[info] - static relative links are prefixed with uiRoot (spark.ui.proxyBase) (6 milliseconds)
[info] - /version api endpoint (7 milliseconds)
[info] - caching on disk, replicated 2 (encryption = on) (with replication as stream) (5 seconds, 384 milliseconds)
[info] - security manager starts with spark.authenticate set (34 milliseconds)
[info] - basic rdd checkpoints + dstream graph checkpoint recovery (7 seconds, 4 milliseconds)
[info] - recovery of conf through checkpoints (266 milliseconds)
[info] - get correct spark.driver.[host|port] from checkpoint (327 milliseconds)
[info] - SPARK-30199 get ui port and blockmanager port (207 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with map and reduceByKey operations (627 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 4500 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 5000 ms
-------------------------------------------
(a,4)

[info] - recovery with invertible reduceByKeyAndWindow operation (1 second, 382 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

[info] - caching on disk, replicated 3 (encryption = off) (5 seconds, 482 milliseconds)
-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with saveAsHadoopFiles operation (1 second, 147 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with saveAsNewAPIHadoopFiles operation (930 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(b,1)
(a,2)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 1500 ms
-------------------------------------------

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(b,1)
(a,2)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)

-------------------------------------------
Time: 3000 ms
-------------------------------------------

[info] - recovery with saveAsHadoopFile inside transform operation (1 second, 66 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)

-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)

-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)

-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)

-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,5)

-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,6)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)

-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)

-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,8)

-------------------------------------------
Time: 4500 ms
------------------------------------------- (a,9) ------------------------------------------- Time: 5000 ms ------------------------------------------- (a,10) [info] - recovery with updateStateByKey operation (1 second, 142 milliseconds) [info] - caching on disk, replicated 3 (encryption = off) (with replication as stream) (5 seconds, 538 milliseconds) [info] - incomplete apps get refreshed (11 seconds, 184 milliseconds) [info] - yarn-cluster should respect conf overrides in SparkHadoopUtil (SPARK-16414, SPARK-23630) (21 seconds, 50 milliseconds) [info] - recovery maintains rate controller (2 seconds, 589 milliseconds) [info] - ui and api authorization checks (1 second, 378 milliseconds) [info] - SPARK-33215: speed up event log download by skipping UI rebuild (579 milliseconds) Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.InterruptedException at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998) at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304) at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:242) at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258) at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:187) at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:334) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:925) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2227) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2248) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2267) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2292) at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1021) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:406) at org.apache.spark.rdd.RDD.collect(RDD.scala:1020) at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:99) at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:98) at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$2(ForEachDStream.scala:51) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:417) at org.apache.spark.streaming.dstream.ForEachDStream.$anonfun$generateJob$1(ForEachDStream.scala:51) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at scala.util.Try$.apply(Try.scala:213) at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.$anonfun$run$1(JobScheduler.scala:256) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ... 
2 more [info] - access history application defaults to the last attempt id (1 second, 669 milliseconds) [info] - SPARK-31697: HistoryServer should set Content-Type (3 milliseconds) [info] - Redirect to the root page when accessed to /history/ (1 millisecond) [info] NextIteratorSuite: [info] - one iteration (2 milliseconds) [info] - two iterations (0 milliseconds) [info] - empty iteration (1 millisecond) [info] - close is called once for empty iterations (0 milliseconds) [info] - close is called once for non-empty iterations (1 millisecond) [info] - recovery with file input stream (3 seconds, 115 milliseconds) [info] ParallelCollectionSplitSuite: [info] - one element per slice (1 millisecond) [info] - one slice (0 milliseconds) [info] - equal slices (0 milliseconds) [info] - non-equal slices (1 millisecond) [info] - splitting exclusive range (0 milliseconds) [info] - splitting inclusive range (1 millisecond) [info] - empty data (1 millisecond) [info] - zero slices (1 millisecond) [info] - negative number of slices (0 milliseconds) [info] - exclusive ranges sliced into ranges (1 millisecond) [info] - inclusive ranges sliced into ranges (1 millisecond) [info] - identical slice sizes between Range and NumericRange (2 milliseconds) [info] - identical slice sizes between List and NumericRange (1 millisecond) [info] - large ranges don't overflow (1 millisecond) [info] - random array tests (185 milliseconds) [info] - random exclusive range tests (17 milliseconds) [info] - random inclusive range tests (13 milliseconds) [info] - exclusive ranges of longs (2 milliseconds) [info] - inclusive ranges of longs (1 millisecond) [info] - exclusive ranges of doubles (3 milliseconds) [info] - inclusive ranges of doubles (2 milliseconds) [info] - inclusive ranges with Int.MaxValue and Int.MinValue (2 milliseconds) [info] - empty ranges with Int.MaxValue and Int.MinValue (2 milliseconds) [info] UISeleniumSuite: [info] - DStreamCheckpointData.restore invoking times (500 milliseconds) [info] - all jobs page should be rendered even though we configure the scheduling mode to fair (1 second, 134 milliseconds) [info] - recovery from checkpoint contains array object (1 second, 24 milliseconds) [info] - SPARK-11267: the race condition of two checkpoints in a batch (109 milliseconds) [info] - SPARK-28912: Fix MatchError in getCheckpointFiles (25 milliseconds) [info] - caching on disk, replicated 3 (encryption = on) (5 seconds, 556 milliseconds) [info] - SPARK-6847: stack overflow when updateStateByKey is followed by a checkpointed dstream (615 milliseconds) [info] FailureSuite: [info] - effects of unpersist() / persist() should be reflected (1 second, 714 milliseconds) [info] - failed stages should not appear to be active (1 second, 32 milliseconds) [info] - spark.ui.killEnabled should properly control kill button display (1 second, 265 milliseconds) [info] - jobs page should not display job group name unless some job was submitted in a job group (936 milliseconds) [info] - caching on disk, replicated 3 (encryption = on) (with replication as stream) (5 seconds, 188 milliseconds) [info] - job progress bars should handle stage / task failures (1 second, 253 milliseconds) [info] - job details page should display useful information for stages that haven't started (576 milliseconds) [info] - job progress bars / cells reflect skipped stages / tasks (554 milliseconds) [info] - stages that aren't run appear as 'skipped stages' after a job finishes (497 milliseconds) [info] - jobs with stages that are skipped should show correct 
link descriptions on all jobs page (477 milliseconds) [info] - attaching and detaching a new tab (714 milliseconds) [info] - kill stage POST/GET response is correct (260 milliseconds) [info] - kill job POST/GET response is correct (246 milliseconds) [info] - caching in memory and disk, replicated (encryption = off) (5 seconds, 482 milliseconds) [info] - stage & job retention (1 second, 808 milliseconds) [info] - live UI json application list (450 milliseconds) [info] - job stages should have expected dotfile under DAG visualization (267 milliseconds) [info] - stages page should show skipped stages (1 second, 161 milliseconds) [info] - Staleness of Spark UI should not last minutes or hours (502 milliseconds) [info] - description for empty jobs (291 milliseconds) [info] HadoopDelegationTokenManagerSuite: [info] - default configuration (12 milliseconds) [info] - disable hadoopfs credential provider (2 milliseconds) [info] - using deprecated configurations (2 milliseconds) [info] - caching in memory and disk, replicated (encryption = off) (with replication as stream) (5 seconds, 386 milliseconds) [info] - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'local' (22 seconds, 40 milliseconds) [info] - SPARK-29082: do not fail if current user does not have credentials (3 seconds, 702 milliseconds) [info] RollingEventLogFilesWriterSuite: [info] - create EventLogFileWriter with enable/disable rolling (105 milliseconds) [info] - initialize, write, stop - with codec None (56 milliseconds) [info] - initialize, write, stop - with codec Some(lz4) (50 milliseconds) [info] - initialize, write, stop - with codec Some(lzf) (53 milliseconds) [info] - initialize, write, stop - with codec Some(snappy) (85 milliseconds) [info] - initialize, write, stop - with codec Some(zstd) (58 milliseconds) [info] - Use the default value of spark.eventLog.compression.codec (42 milliseconds) [info] - Event log names (0 milliseconds) [info] - Log overwriting (68 milliseconds) [info] - rolling event log files - codec None (276 milliseconds) [info] - rolling event log files - codec Some(lz4) (213 milliseconds) [info] - rolling event log files - codec Some(lzf) (312 milliseconds) [info] - rolling event log files - codec Some(snappy) (357 milliseconds) [info] - rolling event log files - codec Some(zstd) (378 milliseconds) [info] - rolling event log files - the max size of event log file size less than lower limit (36 milliseconds) [info] RandomBlockReplicationPolicyBehavior: [info] - block replication - random block replication policy (7 milliseconds) [info] RDDCleanerSuite: [info] - RDD shuffle cleanup standalone (191 milliseconds) [info] LocalDiskShuffleMapOutputWriterSuite: [info] - writing to an outputstream (5 milliseconds) [info] - writing to a channel (7 milliseconds) [info] TestMemoryManagerSuite: [info] - tracks allocated execution memory by task (4 milliseconds) [info] - markconsequentOOM (0 milliseconds) [info] ResourceProfileManagerSuite: [info] - ResourceProfileManager (3 milliseconds) [info] - isSupported yarn no dynamic allocation (2 milliseconds) [info] - isSupported yarn with dynamic allocation (2 milliseconds) [info] - isSupported k8s with dynamic allocation (1 millisecond) [info] - isSupported with local mode (1 millisecond) [info] - ResourceProfileManager has equivalent profile (173 milliseconds) [info] ExecutorRunnerTest: [info] - command includes appId (52 milliseconds) [info] BlockTransferServiceSuite: [info] - fetchBlockSync should not hang when
BlockFetchingListener.onBlockFetchSuccess fails (4 milliseconds) [info] EventLoggingListenerSuite: [info] - Basic event logging with compression (441 milliseconds) [info] - caching in memory and disk, replicated (encryption = on) (5 seconds, 403 milliseconds) [info] - End-to-end event logging (4 seconds, 870 milliseconds) [info] - caching in memory and disk, replicated (encryption = on) (with replication as stream) (5 seconds, 419 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = off) (5 seconds, 392 milliseconds) [info] - multiple failures with map (35 seconds, 505 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = off) (with replication as stream) (5 seconds, 628 milliseconds) [info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' (24 seconds, 41 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = on) (5 seconds, 119 milliseconds) [info] - End-to-end event logging with compression (19 seconds, 604 milliseconds) [info] - Event logging with password redaction (36 milliseconds) [info] - Spark-33504 sensitive attributes redaction in properties (41 milliseconds) [info] - Executor metrics update (75 milliseconds) [info] - SPARK-31764: isBarrier should be logged in event log (201 milliseconds) [info] PluginContainerSuite: [info] - plugin initialization and communication (140 milliseconds) [info] - do nothing if plugins are not configured (1 millisecond) [info] - merging of config options (80 milliseconds) [info] - SPARK-33088: executor tasks trigger plugin calls (109 milliseconds) [info] - SPARK-33088: executor failed tasks trigger plugin calls (129 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = on) (with replication as stream) (5 seconds, 693 milliseconds) [info] - plugin initialization in non-local mode (3 seconds, 760 milliseconds) [info] - plugin initialization in non-local mode with resources (3 seconds, 869 milliseconds) [info] DriverRunnerTest: [info] - Process succeeds instantly (78 milliseconds) [info] - Process failing several times and then succeeding (31 milliseconds) [info] - Process doesn't restart if not supervised (27 milliseconds) [info] - Process doesn't restart if killed (31 milliseconds) [info] - Reset of backoff counter (34 milliseconds) [info] - Kill process finalized with state KILLED (40 milliseconds) [info] - Finalized with state FINISHED (40 milliseconds) [info] - Finalized with state FAILED (41 milliseconds) [info] - Handle exception starting process (50 milliseconds) [info] PrefixComparatorsSuite: [info] - String prefix comparator (50 milliseconds) [info] - Binary prefix comparator (7 milliseconds) [info] - double prefix comparator handles NaNs properly (1 millisecond) [info] - double prefix comparator handles negative NaNs properly (0 milliseconds) [info] - double prefix comparator handles other special values properly (0 milliseconds) [info] NettyBlockTransferSecuritySuite: [info] - security default off (99 milliseconds) [info] - security on same password (102 milliseconds) [info] - security on mismatch password (63 milliseconds) [info] - security mismatch auth off on server (60 milliseconds) [info] - security mismatch auth off on client (70 milliseconds) [info] - security with aes encryption (134 milliseconds) [info] CommandUtilsSuite: [info] - set libraryPath correctly (32 milliseconds) [info] - auth secret shouldn't appear in java opts (85 milliseconds) [info] 
PairRDDFunctionsSuite: [info] - aggregateByKey (102 milliseconds) [info] - groupByKey (45 milliseconds) [info] - groupByKey with duplicates (41 milliseconds) [info] - groupByKey with negative key hash codes (31 milliseconds) [info] - groupByKey with many output partitions (40 milliseconds) [info] - compute without caching when no partitions fit in memory (6 seconds, 454 milliseconds) [info] - sampleByKey (4 seconds, 781 milliseconds) [info] - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'local' and gateway-replacement path (21 seconds, 43 milliseconds) [info] - compute when only some partitions fit in memory (5 seconds, 750 milliseconds) [info] - passing environment variables to cluster (4 seconds, 562 milliseconds) [info] - sampleByKeyExact (6 seconds, 445 milliseconds) [info] - reduceByKey (30 milliseconds) [info] - reduceByKey with collectAsMap (28 milliseconds) [info] - reduceByKey with many output partitions (33 milliseconds) [info] - reduceByKey with partitioner (34 milliseconds) [info] - countApproxDistinctByKey (110 milliseconds) [info] - join (31 milliseconds) [info] - join all-to-all (31 milliseconds) [info] - leftOuterJoin (31 milliseconds) [info] - cogroup with empty RDD (25 milliseconds) [info] - cogroup with groupByed RDD having 0 partitions (37 milliseconds) [info] - cogroup between multiple RDD with an order of magnitude difference in number of partitions (7 milliseconds) [info] - cogroup between multiple RDD with number of partitions similar in order of magnitude (5 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set without proper partitioner (4 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set with proper partitioner (4 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set; with huge number of partitions in upstream RDDs (4 milliseconds) [info] - rightOuterJoin (32 milliseconds) [info] - fullOuterJoin (36 milliseconds) [info] - join with no matches (35 milliseconds) [info] - join with many output partitions (39 milliseconds) [info] - groupWith (33 milliseconds) [info] - groupWith3 (35 milliseconds) [info] - groupWith4 (39 milliseconds) [info] - zero-partition RDD (61 milliseconds) [info] - keys and values (30 milliseconds) [info] - default partitioner uses partition size (11 milliseconds) [info] - default partitioner uses largest partitioner (9 milliseconds) [info] - subtract (43 milliseconds) [info] - subtract with narrow dependency (52 milliseconds) [info] - subtractByKey (27 milliseconds) [info] - subtractByKey with narrow dependency (39 milliseconds) [info] - foldByKey (67 milliseconds) [info] - foldByKey with mutable result type (48 milliseconds) [info] - saveNewAPIHadoopFile should call setConf if format is configurable (96 milliseconds) [info] - The JobId on the driver and executors should be the same during the commit (36 milliseconds) [info] - saveAsHadoopFile should respect configured output committers (49 milliseconds) [info] - failure callbacks should be called before calling writer.close() in saveNewAPIHadoopFile (34 milliseconds) [info] - failure callbacks should be called before calling writer.close() in saveAsHadoopFile (51 milliseconds) [info] - saveAsNewAPIHadoopDataset should support invalid output paths when there are no files to be committed to an absolute output location (97 milliseconds) [info] - saveAsHadoopDataset should respect empty output directory when there are no files to be committed to an absolute output location 
(53 milliseconds) [info] - lookup (49 milliseconds) [info] - lookup with partitioner (49 milliseconds) [info] - lookup with bad partitioner (27 milliseconds) [info] RBackendSuite: [info] - close() clears jvmObjectTracker (2 milliseconds) [info] PrimitiveVectorSuite: [info] - primitive value (5 milliseconds) [info] - non-primitive value (4 milliseconds) [info] - ideal growth (4 milliseconds) [info] - ideal size (2 milliseconds) [info] - resizing (5 milliseconds) [info] MetricsConfigSuite: [info] - MetricsConfig with default properties (2 milliseconds) [info] - MetricsConfig with properties set from a file (0 milliseconds) [info] - MetricsConfig with properties set from a Spark configuration (1 millisecond) [info] - MetricsConfig with properties set from a file and a Spark configuration (1 millisecond) [info] - MetricsConfig with subProperties (0 milliseconds) [info] PartiallySerializedBlockSuite: [info] - valuesIterator() and finishWritingToStream() cannot be called after discard() is called (58 milliseconds) [info] - discard() can be called more than once (1 millisecond) [info] - cannot call valuesIterator() more than once (4 milliseconds) [info] - cannot call finishWritingToStream() more than once (4 milliseconds) [info] - cannot call finishWritingToStream() after valuesIterator() (2 milliseconds) [info] - cannot call valuesIterator() after finishWritingToStream() (3 milliseconds) [info] - buffers are deallocated in a TaskCompletionListener (3 milliseconds) [info] - basic numbers with discard() and numBuffered = 50 (7 milliseconds) [info] - basic numbers with finishWritingToStream() and numBuffered = 50 (37 milliseconds) [info] - basic numbers with valuesIterator() and numBuffered = 50 (3 milliseconds) [info] - basic numbers with discard() and numBuffered = 0 (1 millisecond) [info] - basic numbers with finishWritingToStream() and numBuffered = 0 (17 milliseconds) [info] - basic numbers with valuesIterator() and numBuffered = 0 (3 milliseconds) [info] - basic numbers with discard() and numBuffered = 1000 (24 milliseconds) [info] - basic numbers with finishWritingToStream() and numBuffered = 1000 (19 milliseconds) [info] - basic numbers with valuesIterator() and numBuffered = 1000 (31 milliseconds) [info] - case classes with discard() and numBuffered = 50 (33 milliseconds) [info] - case classes with finishWritingToStream() and numBuffered = 50 (200 milliseconds) [info] - case classes with valuesIterator() and numBuffered = 50 (19 milliseconds) [info] - case classes with discard() and numBuffered = 0 (2 milliseconds) [info] - case classes with finishWritingToStream() and numBuffered = 0 (120 milliseconds) [info] - case classes with valuesIterator() and numBuffered = 0 (2 milliseconds) [info] - case classes with discard() and numBuffered = 1000 (153 milliseconds) [info] - case classes with finishWritingToStream() and numBuffered = 1000 (156 milliseconds) [info] - case classes with valuesIterator() and numBuffered = 1000 (194 milliseconds) [info] - empty iterator with discard() and numBuffered = 0 (1 millisecond) [info] - empty iterator with finishWritingToStream() and numBuffered = 0 (2 milliseconds) [info] - empty iterator with valuesIterator() and numBuffered = 0 (2 milliseconds) [info] SparkContextSchedulerCreationSuite: [info] - bad-master (85 milliseconds) [info] - local (96 milliseconds) [info] - local-* (85 milliseconds) [info] - local-n (67 milliseconds) [info] - local-*-n-failures (68 milliseconds) [info] - local-n-failures (70 milliseconds) [info] - bad-local-n (72 milliseconds) 
[info] - bad-local-n-failures (83 milliseconds) [info] - local-default-parallelism (87 milliseconds) [info] - local-cluster (385 milliseconds) [info] SerializationDebuggerSuite: [info] - primitives, strings, and nulls (2 milliseconds) [info] - primitive arrays (0 milliseconds) [info] - non-primitive arrays (1 millisecond) [info] - serializable object (1 millisecond) [info] - nested arrays (0 milliseconds) [info] - nested objects (0 milliseconds) [info] - cycles (should not loop forever) (0 milliseconds) [info] - root object not serializable (0 milliseconds) [info] - array containing not serializable element (1 millisecond) [info] - object containing not serializable field (1 millisecond) [info] - externalizable class writing out not serializable object (0 milliseconds) [info] - externalizable class writing out serializable objects (0 milliseconds) [info] - object containing writeReplace() which returns not serializable object (1 millisecond) [info] - object containing writeReplace() which returns serializable object (0 milliseconds) [info] - no infinite loop with writeReplace() which returns class of its own type (1 millisecond) [info] - object containing writeObject() and not serializable field (2 milliseconds) [info] - object containing writeObject() and serializable field (0 milliseconds) [info] - object of serializable subclass with more fields than superclass (SPARK-7180) (1 millisecond) [info] - crazy nested objects (1 millisecond) [info] - improveException (1 millisecond) [info] - improveException with error in debugger (2 milliseconds) [info] LoggingSuite: [info] - spark-shell logging filter (1 millisecond) [info] NettyRpcHandlerSuite: [info] - receive (70 milliseconds) [info] - connectionTerminated (2 milliseconds) [info] SamplingUtilsSuite: [info] - reservoirSampleAndCount (1 millisecond) [info] - SPARK-18678 reservoirSampleAndCount with tiny input (3 milliseconds) [info] - computeFraction (3 milliseconds) [info] AppStatusListenerWithInMemoryStoreSuite: [info] - environment info (1 millisecond) [info] - scheduler events (31 milliseconds) [info] - storage events (17 milliseconds) [info] - eviction of old data (11 milliseconds) [info] - eviction should respect job completion time (1 millisecond) [info] - eviction should respect stage completion time (1 millisecond) [info] - skipped stages should be evicted before completed stages (2 milliseconds) [info] - eviction should respect task completion time (2 milliseconds) [info] - lastStageAttempt should fail when the stage doesn't exist (2 milliseconds) [info] - SPARK-24415: update metrics for tasks that finish late (3 milliseconds) [info] - Total tasks in the executor summary should match total stage tasks (live = true) (6 milliseconds) [info] - Total tasks in the executor summary should match total stage tasks (live = false) (1 millisecond) [info] - driver logs (2 milliseconds) [info] - executor metrics updates (7 milliseconds) [info] - stage executor metrics (2 milliseconds) [info] - storage information on executor lost/down (6 milliseconds) [info] - clean up used memory when BlockManager added (1 millisecond) [info] - SPARK-34877 - check YarnAmInfoEvent is populated correctly (2 milliseconds) [info] TimeStampedHashMapSuite: [info] - HashMap - basic test (5 milliseconds) [info] - TimeStampedHashMap - basic test (6 milliseconds) [info] - TimeStampedHashMap - threading safety test (124 milliseconds) [info] - TimeStampedHashMap - clearing by timestamp (33 milliseconds) [info] RandomSamplerSuite: [info] - utilities (7 milliseconds) 
[info] - sanity check medianKSD against references (91 milliseconds) [info] - bernoulli sampling (33 milliseconds) [info] - bernoulli sampling without iterator (27 milliseconds) [info] - bernoulli sampling with gap sampling optimization (103 milliseconds) [info] - bernoulli sampling (without iterator) with gap sampling optimization (67 milliseconds) [info] - bernoulli boundary cases (1 millisecond) [info] - bernoulli (without iterator) boundary cases (2 milliseconds) [info] - bernoulli data types (79 milliseconds) [info] - bernoulli clone (18 milliseconds) [info] - bernoulli set seed (24 milliseconds) [info] - replacement sampling (51 milliseconds) [info] - replacement sampling without iterator (41 milliseconds) [info] - replacement sampling with gap sampling (117 milliseconds) [info] - recover from node failures (5 seconds, 351 milliseconds) java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@69358b0a rejected from java.util.concurrent.ThreadPoolExecutor@65100c4d[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0] at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063) at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830) at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379) at java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:668) at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) at scala.concurrent.Promise.complete(Promise.scala:53) at scala.concurrent.Promise.complete$(Promise.scala:52) at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) at scala.concurrent.Promise.complete(Promise.scala:53) at scala.concurrent.Promise.complete$(Promise.scala:52) at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) at 
scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) [info] - replacement sampling (without iterator) with gap sampling (129 milliseconds) [info] - replacement boundary cases (1 millisecond) [info] - replacement (without) boundary cases (1 millisecond) [info] - replacement data types (93 milliseconds) [info] - replacement clone (24 milliseconds) [info] - replacement set seed (36 milliseconds) [info] - bernoulli partitioning sampling (15 milliseconds) [info] - bernoulli partitioning sampling without iterator (22 milliseconds) [info] - bernoulli partitioning boundary cases (1 millisecond) [info] - bernoulli partitioning (without iterator) boundary cases (3 milliseconds) [info] - bernoulli partitioning data (1 millisecond) [info] - bernoulli partitioning clone (1 millisecond) [info] ChunkedByteBufferOutputStreamSuite: [info] - empty output (1 millisecond) [info] - write a single byte (1 millisecond) [info] - write a single near boundary (1 millisecond) [info] - write a single at boundary (0 milliseconds) [info] - single chunk output (0 milliseconds) [info] - single chunk output at boundary size (1 millisecond) [info] - multiple chunk output (1 millisecond) [info] - multiple chunk output at boundary size (1 millisecond) [info] - SPARK-36464: size returns correct positive number even with over 2GB data (750 milliseconds) [info] ProcfsMetricsGetterSuite: [info] - testGetProcessInfo (2 milliseconds) [info] - SPARK-34845: partial metrics shouldn't be returned (20 milliseconds) [info] GraphiteSinkSuite: [info] - GraphiteSink with default MetricsFilter (9 milliseconds) [info] - GraphiteSink with regex MetricsFilter (1 millisecond) [info] SparkSubmitUtilsSuite: [info] - incorrect maven coordinate throws error (3 milliseconds) [info] - create repo resolvers (78 milliseconds) [info] - create additional resolvers (8 milliseconds) :: loading settings :: url = jar:file:/home/jenkins/sparkivy/per-executor-caches/6/.cache/coursier/v1/https/maven-central.storage-download.googleapis.com/maven2/org/apache/ivy/ivy/2.5.0/ivy-2.5.0.jar!/org/apache/ivy/core/settings/ivysettings.xml [info] - add dependencies works correctly (73 milliseconds) [info] - excludes works correctly (7 milliseconds) [info] - ivy path works correctly (1 second, 575 milliseconds) [info] - search for artifact at local repositories (1 second, 660 milliseconds) [info] - dependency not found throws RuntimeException (78 milliseconds) [info] - neglects Spark and Spark's dependencies (364 milliseconds) [info] - multiple failures with updateStateByKey (41 seconds, 28 milliseconds) [info] - exclude dependencies end to end (461 milliseconds) :: loading settings :: file = /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/ivy-b8135b0b-1845-418f-94d8-4f46ba6e06f1/ivysettings.xml [info] WriteAheadLogUtilsSuite: [info] - log selection and creation (22 milliseconds) [info] - wrap WriteAheadLog in BatchedWriteAheadLog when batching is enabled (2 milliseconds) [info] - batching is enabled by default in WriteAheadLog (0 milliseconds) [info] - closeFileAfterWrite is disabled by default in WriteAheadLog (0 milliseconds) [info] BlockGeneratorSuite: [info] - block generation and data callbacks (28 milliseconds) [info] - stop ensures correct shutdown (235 milliseconds) [info] - block push errors are reported (28 milliseconds) [info] 
UISeleniumSuite: [info] - load ivy settings file (501 milliseconds) [info] - SPARK-10878: test resolution files cleaned after resolving artifact (291 milliseconds) [info] - SPARK-34624: should ignore non-jar dependencies (419 milliseconds) [info] BasicEventFilterBuilderSuite: [info] - track live jobs (9 milliseconds) [info] - track live executors (2 milliseconds) [info] ImplicitOrderingSuite: [info] - basic inference of Orderings (252 milliseconds) [info] TaskMetricsSuite: [info] - mutating values (1 millisecond) [info] - mutating shuffle read metrics values (1 millisecond) [info] - mutating shuffle write metrics values (0 milliseconds) [info] - mutating input metrics values (1 millisecond) [info] - mutating output metrics values (0 milliseconds) [info] - merging multiple shuffle read metrics (1 millisecond) [info] - additional accumulables (1 millisecond) [info] ExternalShuffleServiceSuite: [info] - groupByKey without compression (200 milliseconds) [info] - recover from repeated node failures during shuffle-map (8 seconds, 32 milliseconds) [info] - attaching and detaching a Streaming tab (5 seconds, 111 milliseconds) [info] ExecutorAllocationManagerSuite: [info] - basic functionality (152 milliseconds) [info] - basic decommissioning (22 milliseconds) [info] - requestExecutors policy (25 milliseconds) [info] - killExecutor policy (14 milliseconds) [info] - parameter validation (16 milliseconds) [info] - shuffle non-zero block size (6 seconds, 815 milliseconds) [info] - enabling and disabling (1 second, 600 milliseconds) [info] StreamingJobProgressListenerSuite: [info] - onBatchSubmitted, onBatchStarted, onBatchCompleted, onReceiverStarted, onReceiverError, onReceiverStopped (79 milliseconds) [info] - Remove the old completed batches when exceeding the limit (76 milliseconds) [info] - out-of-order onJobStart and onBatchXXX (101 milliseconds) [info] - detect memory leak (101 milliseconds) [info] ReceiverSuite: [info] - receiver life cycle (331 milliseconds) [info] - block generator throttling !!! IGNORED !!! 
[info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' and gateway-replacement path (25 seconds, 42 milliseconds) [info] - shuffle serializer (6 seconds, 358 milliseconds) [info] - recover from repeated node failures during shuffle-reduce (14 seconds, 202 milliseconds) java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@28714cfd rejected from java.util.concurrent.ThreadPoolExecutor@70c16702[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 108] at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063) at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830) at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379) at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371) at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379) at scala.concurrent.impl.Promise.transform(Promise.scala:33) at scala.concurrent.impl.Promise.transform$(Promise.scala:31) at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379) at scala.concurrent.Future.map(Future.scala:292) at scala.concurrent.Future.map$(Future.scala:292) at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379) at scala.concurrent.Future$.apply(Future.scala:659) at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:190) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110) at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299) at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271) at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112) java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@5513a8fe rejected from java.util.concurrent.ThreadPoolExecutor@5e6e6b[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 108] at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063) at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830) at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379) at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371) at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379) at scala.concurrent.impl.Promise.transform(Promise.scala:33) at scala.concurrent.impl.Promise.transform$(Promise.scala:31) at 
scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379) at scala.concurrent.Future.map(Future.scala:292) at scala.concurrent.Future.map$(Future.scala:292) at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379) at scala.concurrent.Future$.apply(Future.scala:659) at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:190) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110) at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299) at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271) at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112) java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@4f4a42a1 rejected from java.util.concurrent.ThreadPoolExecutor@70c16702[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 108] at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063) at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830) at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379) at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371) at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379) at scala.concurrent.impl.Promise.transform(Promise.scala:33) at scala.concurrent.impl.Promise.transform$(Promise.scala:31) at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379) at scala.concurrent.Future.map(Future.scala:292) at scala.concurrent.Future.map$(Future.scala:292) at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379) at scala.concurrent.Future$.apply(Future.scala:659) at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:203) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110) at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299) at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271) at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112) java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@3d9a76b6 rejected from java.util.concurrent.ThreadPoolExecutor@5e6e6b[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 108] at 
java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063) at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830) at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379) at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372) at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371) at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379) at scala.concurrent.impl.Promise.transform(Promise.scala:33) at scala.concurrent.impl.Promise.transform$(Promise.scala:31) at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379) at scala.concurrent.Future.map(Future.scala:292) at scala.concurrent.Future.map$(Future.scala:292) at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379) at scala.concurrent.Future$.apply(Future.scala:659) at org.apache.spark.streaming.receiver.WriteAheadLogBasedBlockHandler.storeBlock(ReceivedBlockHandler.scala:203) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129) at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:110) at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:299) at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:271) at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:112) [info] - zero sized blocks (8 seconds, 450 milliseconds) [info] - recover from node failures with replication (12 seconds, 292 milliseconds) [info] - zero sized blocks without kryo (8 seconds, 244 milliseconds) [info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' and gateway-replacement path containing an environment variable (24 seconds, 43 milliseconds) [info] - unpersist RDDs (5 seconds, 697 milliseconds) [info] - shuffle on mutable pairs (5 seconds, 855 milliseconds) [info] - reference partitions inside a task (4 seconds, 929 milliseconds) [info] BasicDriverFeatureStepSuite: [info] - Check the pod respects all configurations from the user. 
(667 milliseconds) [info] - Check driver pod respects kubernetes driver request cores (23 milliseconds) [info] - Check appropriate entrypoint rerouting for various bindings (9 milliseconds) [info] - memory overhead factor: java (7 milliseconds) [info] - memory overhead factor: python default (5 milliseconds) [info] - memory overhead factor: python w/ override (4 milliseconds) [info] - memory overhead factor: r default (5 milliseconds) [info] - SPARK-35493: make spark.blockManager.port be able to be fallen back to in driver pod (9 milliseconds) [info] - SPARK-36075: Check driver pod respects nodeSelector/driverNodeSelector (5 milliseconds) [info] EnvSecretsFeatureStepSuite: [info] - sets up all keyRefs (9 milliseconds) [info] ExecutorPodsPollingSnapshotSourceSuite: [info] - sorting on mutable pairs (5 seconds, 596 milliseconds) [info] - Items returned by the API should be pushed to the event queue (49 milliseconds) [info] - SPARK-36334: Support pod listing with resource version (17 milliseconds) [info] ExecutorPodsSnapshotSuite: [info] - States are interpreted correctly from pod metadata. (35 milliseconds) [info] - SPARK-30821: States are interpreted correctly from pod metadata when configured to check all containers. (15 milliseconds) [info] - Updates add new pods for non-matching ids and edit existing pods for matching ids (7 milliseconds) [info] ExecutorKubernetesCredentialsFeatureStepSuite: [info] - configure spark pod with executor service account (5 milliseconds) [info] - configure spark pod with driver service account and without executor service account (2 milliseconds) [info] - configure spark pod with driver service account and with executor service account (1 millisecond) [info] DriverKubernetesCredentialsFeatureStepSuite: [info] - Don't set any credentials (54 milliseconds) [info] - Only set credentials that are manually mounted. (4 milliseconds) [info] - Mount credentials from the submission client as a secret. (63 milliseconds) [info] PodTemplateConfigMapStepSuite: [info] - Do nothing when executor template is not specified (2 milliseconds) [info] - Mounts executor template volume if config specified (127 milliseconds) [info] KubernetesExecutorBuilderSuite: [info] - use empty initial pod if template is not specified (130 milliseconds) [info] - load pod template if specified (174 milliseconds) [info] - configure a custom test step (71 milliseconds) [info] - complain about misconfigured pod template (44 milliseconds) [info] KubernetesConfSuite: [info] - Resolve driver labels, annotations, secret mount paths, envs, and memory overhead (5 milliseconds) [info] - Basic executor translated fields. (1 millisecond) [info] - resource profile not default. (1 millisecond) [info] - Image pull secrets.
(1 millisecond) [info] - Set executor labels, annotations, and secrets (4 milliseconds) [info] - Verify that executorEnv key conforms to the regular specification (2 milliseconds) [info] - SPARK-36075: Set nodeSelector, driverNodeSelector, executorNodeSelect (2 milliseconds) [info] - SPARK-36059: Set driver.scheduler and executor.scheduler (2 milliseconds) [info] - SPARK-36566: get app name label (2 milliseconds) [info] BasicExecutorFeatureStepSuite: [info] - test spark resource missing vendor (31 milliseconds) [info] - test spark resource missing amount (2 milliseconds) [info] - basic executor pod with resources (34 milliseconds) [info] - basic executor pod has reasonable defaults (33 milliseconds) [info] - executor pod hostnames get truncated to 63 characters (32 milliseconds) [info] - SPARK-35460: invalid PodNamePrefixes (3 milliseconds) [info] - hostname truncation generates valid host names (51 milliseconds) [info] - classpath and extra java options get translated into environment variables (34 milliseconds) [info] - SPARK-32655 Support appId/execId placeholder in SPARK_EXECUTOR_DIRS (28 milliseconds) [info] - test executor pyspark memory (28 milliseconds) [info] - auth secret propagation (35 milliseconds) [info] - Auth secret shouldn't propagate if files are loaded. (42 milliseconds) [info] - SPARK-32661 test executor offheap memory (30 milliseconds) [info] - basic resourceprofile (33 milliseconds) [info] - resourceprofile with gpus (29 milliseconds) [info] - Verify spark conf dir is mounted as configmap volume on executor pod's container. (37 milliseconds) [info] - SPARK-34316 Disable configmap volume on executor pod's container (36 milliseconds) [info] - SPARK-35482: use correct block manager port for executor pods (42 milliseconds) [info] - SPARK-35969: Make the pod prefix more readable and tallied with K8S DNS Label Names (70 milliseconds) [info] - SPARK-36075: Check executor pod respects nodeSelector/executorNodeSelector (34 milliseconds) [info] KubernetesVolumeUtilsSuite: [info] - Parses hostPath volumes correctly (3 milliseconds) [info] - Parses subPath correctly (1 millisecond) [info] - Parses persistentVolumeClaim volumes correctly (2 milliseconds) [info] - Parses emptyDir volumes correctly (1 millisecond) [info] - Parses emptyDir volume options can be optional (1 millisecond) [info] - Defaults optional readOnly to false (1 millisecond) [info] - Fails on missing mount key (1 millisecond) [info] - Fails on missing option key (1 millisecond) [info] - SPARK-33063: Fails on missing option key in persistentVolumeClaim (1 millisecond) [info] - Parses read-only nfs volumes correctly (1 millisecond) [info] - Parses read/write nfs volumes correctly (1 millisecond) [info] - Fails on missing path option (1 millisecond) [info] - Fails on missing server option (1 millisecond) [info] KubernetesClusterSchedulerBackendSuite: [info] - Start all components (5 milliseconds) [info] - Stop all components (14 milliseconds) [info] - Remove executor (91 milliseconds) [info] - Kill executors (197 milliseconds) [info] - SPARK-34407: CoarseGrainedSchedulerBackend.stop may throw SparkException (10 milliseconds) [info] - SPARK-34469: Ignore RegisterExecutor when SparkContext is stopped (5 milliseconds) [info] - Dynamically fetch an executor ID (1 millisecond) [info] KubernetesDriverBuilderSuite: [info] - use empty initial pod if template is not specified (107 milliseconds) [info] - load pod template if specified (65 milliseconds) [info] - configure a custom test step (65 milliseconds) [info] - complain
about misconfigured pod template (20 milliseconds) [info] LocalDirsFeatureStepSuite: [info] - Resolve to default local dir if neither env nor configuration are set (1 millisecond) [info] - Use configured local dirs split on comma if provided. (3 milliseconds) [info] - Use tmpfs to back default local dir (1 millisecond) [info] - local dir on mounted volume (4 milliseconds) [info] ExecutorPodsWatchSnapshotSourceSuite: [info] - Watch events should be pushed to the snapshots store as snapshot updates. (3 milliseconds) [info] ExecutorPodsAllocatorSuite: [info] - SPARK-36052: test splitSlots (3 milliseconds) [info] - SPARK-36052: pending pod limit with multiple resource profiles (55 milliseconds) [info] - Initially request executors in batches. Do not request another batch if the first has not finished. (8 milliseconds) [info] - Request executors in batches. Allow another batch to be requested if all pending executors start running. (25 milliseconds) [info] - When a current batch reaches error states immediately, re-request them on the next batch. (11 milliseconds) [info] - Verify stopping deletes the labeled pods (2 milliseconds) [info] - When an executor is requested but the API does not report it in a reasonable time, retry requesting that executor. (7 milliseconds) [info] - SPARK-28487: scale up and down on target executor count changes (13 milliseconds) [info] - SPARK-34334: correctly identify timed out pending pod requests as excess (6 milliseconds) [info] - SPARK-33099: Respect executor idle timeout configuration (9 milliseconds) [info] - SPARK-34361: scheduler backend known pods with multiple resource profiles at downscaling (19 milliseconds) [info] - SPARK-33288: multiple resource profiles (18 milliseconds) [info] - SPARK-33262: pod allocator does not stall with pending pods (9 milliseconds) [info] - SPARK-35416: Support PersistentVolumeClaim Reuse (26 milliseconds) [info] - print the pod name instead of Some(name) if pod is absent (3 milliseconds) [info] ExecutorPodsSnapshotsStoreSuite: [info] - Subscribers get notified of events periodically. (6 milliseconds) [info] - Even without sending events, initially receive an empty buffer. (1 millisecond) [info] - Replacing the snapshot passes the new snapshot to subscribers. (2 milliseconds) [info] ExecutorPodsLifecycleManagerSuite: [info] - When an executor reaches error states immediately, remove from the scheduler backend. (23 milliseconds) [info] - Don't remove executors twice from Spark but remove from K8s repeatedly. (4 milliseconds) [info] - When the scheduler backend lists executor ids that aren't present in the cluster, remove those executors from Spark. (5 milliseconds) [info] - Keep executor pods in k8s if configured. (4 milliseconds) [info] StatefulSetAllocatorSuite: [info] - Validate initial statefulSet creation & cleanup with two resource profiles (25 milliseconds) [info] - Validate statefulSet scale up (3 milliseconds) [info] HadoopConfDriverFeatureStepSuite: [info] - mount hadoop config map if defined (3 milliseconds) [info] - create hadoop config map if config dir is defined (6 milliseconds) [info] KubernetesClusterManagerSuite: [info] - constructing a AbstractPodsAllocator works (6 milliseconds) [info] KubernetesClientUtilsSuite: [info] - verify load files, loads only allowed files and not the disallowed files. 
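For context on the KubernetesVolumeUtilsSuite entries above: the parser works over Spark's documented volume properties of the form spark.kubernetes.{driver|executor}.volumes.[VolumeType].[VolumeName].{mount|options}.*. A minimal Scala sketch (the volume name "checkpoints" and all paths are illustrative, not taken from this build):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      // hostPath volume: mount point inside the pod plus the node path option
      .set("spark.kubernetes.executor.volumes.hostPath.checkpoints.mount.path", "/checkpoints")
      .set("spark.kubernetes.executor.volumes.hostPath.checkpoints.mount.readOnly", "false")
      .set("spark.kubernetes.executor.volumes.hostPath.checkpoints.options.path", "/mnt/ssd/checkpoints")

Omitting a mount key or an option key (for example options.path here, or options.server for an nfs volume) corresponds to the "Fails on missing ..." cases the suite verifies.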
[info] - write ahead log - generating and cleaning (39 seconds, 82 milliseconds)
[info] StreamingListenerSuite:
[info] - cogroup using mutable pairs (5 seconds, 191 milliseconds)
[info] - batch info reporting (624 milliseconds)
[info] - receiver info reporting (150 milliseconds)
[info] - output operation reporting (247 milliseconds)
[info] - don't call ssc.stop in listener (955 milliseconds)
[info] - verify load files, truncates the content to maxSize, when keys are very large in number. (3 seconds, 86 milliseconds)
[info] - verify load files, truncates the content to maxSize, when keys are equal in length. (5 milliseconds)
[info] MountVolumesFeatureStepSuite:
[info] - Mounts hostPath volumes (3 milliseconds)
[info] - Mounts persistentVolumeClaims (3 milliseconds)
[info] - SPARK-32713 Mounts parameterized persistentVolumeClaims in executors (2 milliseconds)
[info] - Create and mounts persistentVolumeClaims in driver (2 milliseconds)
[info] - Create and mount persistentVolumeClaims in executors (1 millisecond)
[info] - Mounts emptyDir (5 milliseconds)
[info] - Mounts emptyDir with no options (1 millisecond)
[info] - Mounts read/write nfs volumes (4 milliseconds)
[info] - Mounts read-only nfs volumes (2 milliseconds)
[info] - Mounts multiple volumes (2 milliseconds)
[info] - mountPath should be unique (2 milliseconds)
[info] - Mounts subpath on emptyDir (2 milliseconds)
[info] - Mounts subpath on persistentVolumeClaims (2 milliseconds)
[info] - Mounts multiple subpaths (2 milliseconds)
[info] - onBatchCompleted with successful batch (1 second, 7 milliseconds)
[info] ClientSuite:
[info] - The client should configure the pod using the builder. (11 milliseconds)
[info] - The client should create Kubernetes resources (4 milliseconds)
[info] - All files from SPARK_CONF_DIR, except templates, spark config, binary files and are within size limit, should be populated to pod's configMap. (14 milliseconds)
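The StreamingListenerSuite entries above ("batch info reporting", "onBatchCompleted ...") exercise the public listener API. A minimal sketch of a custom listener, assuming an existing StreamingContext named ssc:

    import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

    class BatchLogger extends StreamingListener {
      // Called once per completed micro-batch with its timing information.
      override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
        val info = batchCompleted.batchInfo
        println(s"batch ${info.batchTime}: processing took ${info.processingDelay.getOrElse(-1L)} ms")
      }
    }

    // ssc.addStreamingListener(new BatchLogger)  // registration point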
[info] - Waiting for app completion should stall on the watcher (2 milliseconds)
[info] K8sSubmitOpSuite:
[info] - List app status (5 milliseconds)
[info] - List status for multiple apps with glob (3 milliseconds)
[info] - Kill app (3 milliseconds)
[info] - Kill app with gracePeriod (2 milliseconds)
[info] - Kill multiple apps with glob without gracePeriod (3 milliseconds)
[info] KubernetesLocalDiskShuffleDataIOSuite:
[info] - onBatchCompleted with failed batch and one failed job (996 milliseconds)
[info] - onBatchCompleted with failed batch and multiple failed jobs (1 second, 74 milliseconds)
[info] - subtract mutable pairs (5 seconds, 271 milliseconds)
[info] - StreamingListener receives no events after stopping StreamingListenerBus (520 milliseconds)
[info] MapWithStateRDDSuite:
[info] - creation from pair RDD (156 milliseconds)
[info] - updating state and generating mapped data in MapWithStateRDDRecord (4 milliseconds)
[info] - states generated by MapWithStateRDD (1 second, 161 milliseconds)
[info] - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'file' (22 seconds, 47 milliseconds)
[info] - checkpointing (798 milliseconds)
[info] - checkpointing empty state RDD (217 milliseconds)
[info] BasicOperationsSuite:
[info] - map (345 milliseconds)
[info] - flatMap (307 milliseconds)
[info] - filter (251 milliseconds)
[info] - glom (302 milliseconds)
[info] - mapPartitions (258 milliseconds)
[info] - repartition (more partitions) (421 milliseconds)
[info] - repartition (fewer partitions) (426 milliseconds)
[info] - groupByKey (326 milliseconds)
[info] - reduceByKey (331 milliseconds)
[info] - sort with Java non serializable class - Kryo (5 seconds, 739 milliseconds)
[info] - reduce (367 milliseconds)
[info] - count (362 milliseconds)
[info] - countByValue (362 milliseconds)
[info] - mapValues (305 milliseconds)
[info] - flatMapValues (307 milliseconds)
[info] - union (292 milliseconds)
[info] - union with input stream return None (126 milliseconds)
[info] - StreamingContext.union (285 milliseconds)
[info] - transform (242 milliseconds)
[info] - transform with NULL (94 milliseconds)
[info] - transform with input stream return None (128 milliseconds)
[info] - transformWith (416 milliseconds)
[info] - transformWith with input stream return None (145 milliseconds)
[info] - StreamingContext.transform (295 milliseconds)
[info] - StreamingContext.transform with input stream return None (134 milliseconds)
[info] - cogroup (351 milliseconds)
[info] - join (365 milliseconds)
[info] - leftOuterJoin (388 milliseconds)
[info] - rightOuterJoin (378 milliseconds)
[info] - sort with Java non serializable class - Java (5 seconds, 97 milliseconds)
[info] - fullOuterJoin (383 milliseconds)
[info] - recompute is not blocked by the recovery (13 seconds, 765 milliseconds)
[info] - shuffle with different compression settings (SPARK-3426) (613 milliseconds)
[info] - updateStateByKey (428 milliseconds)
[info] - [SPARK-4085] rerun map stage if reduce stage cannot find its local shuffle file (397 milliseconds)
[info] - updateStateByKey - simple with initial value RDD (409 milliseconds)
[info] - cannot find its local shuffle file if no execution of the stage and rerun shuffle (136 milliseconds)
[info] - updateStateByKey - testing time stamps as input (396 milliseconds)
[info] - metrics for shuffle without aggregation (261 milliseconds)
[info] - updateStateByKey - with initial value RDD (343 milliseconds)
[info] - updateStateByKey - object lifecycle (355 milliseconds)
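The updateStateByKey variants above test stateful stream transformations. A minimal running word-count sketch (host, port, and checkpoint path are illustrative):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setMaster("local[2]").setAppName("UpdateStateSketch")
    val ssc = new StreamingContext(conf, Seconds(1))
    ssc.checkpoint("/tmp/spark-checkpoints") // updateStateByKey requires checkpointing

    val counts = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      // merge this batch's values into the running state for each key
      .updateStateByKey[Int]((values: Seq[Int], state: Option[Int]) =>
        Some(values.sum + state.getOrElse(0)))
    counts.print()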
[info] - metrics for shuffle with aggregation (694 milliseconds)
[info] - multiple simultaneous attempts for one task (SPARK-8029) (99 milliseconds)
[info] - SPARK-34541: shuffle can be removed (136 milliseconds)
[info] - slice (2 seconds, 185 milliseconds)
[info] - slice - has not been initialized (75 milliseconds)
[info] - rdd cleanup - map and window (344 milliseconds)
[info] - rdd cleanup - updateStateByKey (686 milliseconds)
Exception in thread "receiver-supervisor-future-0" java.lang.Error: java.lang.InterruptedException: sleep interrupted
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.$anonfun$restartReceiver$1(ReceiverSupervisor.scala:196)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
[info] - rdd cleanup - input blocks and persisted RDDs (2 seconds, 175 milliseconds)
[info] InputStreamsSuite:
Exception in thread "receiver-supervisor-future-0" java.lang.Error: java.lang.InterruptedException: sleep interrupted
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.$anonfun$restartReceiver$1(ReceiverSupervisor.scala:196)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
[info] - socket input stream (578 milliseconds)
[info] - socket input stream - no block in a batch (395 milliseconds)
[info] - SPARK-36206: shuffle checksum detect disk corruption (6 seconds, 773 milliseconds)
[info] - Partial recompute shuffle data (12 seconds, 114 milliseconds)
[info] - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'file' (23 seconds, 33 milliseconds)
[info] - binary records stream (6 seconds, 139 milliseconds)
[info] - file input stream - newFilesOnly = true (281 milliseconds)
[info] - file input stream - newFilesOnly = false (262 milliseconds)
[info] - using external shuffle service (6 seconds, 80 milliseconds)
[info] - file input stream - wildcard (432 milliseconds)
[info] - Modified files are correctly detected. (142 milliseconds)
[info] - multi-thread receiver (1 second, 761 milliseconds)
[info] - queue input stream - oneAtATime = true (1 second, 101 milliseconds)
[info] - SPARK-25888: using external shuffle service fetching disk persisted blocks (4 seconds, 381 milliseconds)
[info] ClosureCleanerSuite:
[info] - closures inside an object (99 milliseconds)
[info] - closures inside a class (93 milliseconds)
[info] - closures inside a class with no default constructor (96 milliseconds)
[info] - closures that don't use fields of the outer class (130 milliseconds)
[info] - nested closures inside an object (127 milliseconds)
[info] - nested closures inside a class (145 milliseconds)
[info] - toplevel return statements in closures are identified at cleaning time (83 milliseconds)
[info] - return statements from named functions nested in closures don't raise exceptions (88 milliseconds)
[info] - queue input stream - oneAtATime = false (2 seconds, 74 milliseconds)
[info] - test track the number of input stream (77 milliseconds)
[info] UIUtilsSuite:
[info] - shortTimeUnitString (1 millisecond)
[info] - normalizeDuration (3 milliseconds)
[info] - convertToTimeUnit (0 milliseconds)
[info] - formatBatchTime (0 milliseconds)
[info] RateLimitedOutputStreamSuite:
[info] - user provided closures are actually cleaned (144 milliseconds)
[info] - createNullValue (3 milliseconds)
[info] UnpersistSuite:
[info] - unpersist RDD (92 milliseconds)
[info] PeriodicRDDCheckpointerSuite:
[info] - Persisting (8 milliseconds)
[info] - Checkpointing (325 milliseconds)
[info] TaskSetManagerSuite:
[info] - TaskSet with no preferences (74 milliseconds)
[info] - multiple offers with no preferences (73 milliseconds)
[info] - skip unsatisfiable locality levels (87 milliseconds)
[info] - basic delay scheduling (103 milliseconds)
[info] - we do not need to delay scheduling when we only have noPref tasks in the queue (74 milliseconds)
[info] - delay scheduling with fallback (181 milliseconds)
[info] - delay scheduling with failed hosts (75 milliseconds)
[info] - task result lost (70 milliseconds)
[info] - repeated failures lead to task set abortion (65 milliseconds)
[info] - executors should be excluded after task failure, in spite of locality preferences (98 milliseconds)
[info] - new executors get added and lost (80 milliseconds)
[info] - Executors exit for reason unrelated to currently running tasks (74 milliseconds)
[info] - SPARK-31837: Shift to the new highest locality level if there is when recomputeLocality (88 milliseconds)
[info] - SPARK-32653: Decommissioned host should not be used to calculate locality levels (86 milliseconds)
[info] - SPARK-32653: Decommissioned executor should not be used to calculate locality levels (71 milliseconds)
[info] - test RACK_LOCAL tasks (72 milliseconds)
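The InputStreamsSuite entries above cover the built-in input sources. A sketch of the file- and queue-based shapes, assuming an existing StreamingContext named ssc (the watched directory is illustrative):

    import scala.collection.mutable
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.StreamingContext

    def wireInputs(ssc: StreamingContext, queue: mutable.Queue[RDD[Int]]): Unit = {
      val files = ssc.textFileStream("/tmp/watched-dir")     // the "file input stream" tests
      val queued = ssc.queueStream(queue, oneAtATime = true) // "queue input stream - oneAtATime = true"
      files.print()
      queued.print()
    }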
[info] - do not emit warning when serialized task is small (80 milliseconds)
[info] - emit warning when serialized task is large (84 milliseconds)
[info] - Not serializable exception thrown if the task cannot be serialized (95 milliseconds)
[info] - A new rdd and full recovery of old data (11 seconds, 796 milliseconds)
[info] - abort the job if total size of results is too large (1 second, 228 milliseconds)
[info] - write (4 seconds, 102 milliseconds)
[info] FileBasedWriteAheadLogSuite:
[info] - FileBasedWriteAheadLog - read all logs (23 milliseconds)
[info] - FileBasedWriteAheadLog - write logs (18 milliseconds)
[info] - FileBasedWriteAheadLog - read all logs after write (18 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs (15 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs synchronously (15 milliseconds)
[info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (56 milliseconds)
[info] - FileBasedWriteAheadLog - do not create directories or files unless write (1 millisecond)
[info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (10 milliseconds)
[info] - FileBasedWriteAheadLog - seqToParIterator (51 milliseconds)
[info] - FileBasedWriteAheadLogWriter - writing data (15 milliseconds)
[info] - FileBasedWriteAheadLogWriter - syncing of data by writing and reading immediately (16 milliseconds)
[info] - FileBasedWriteAheadLogReader - sequentially reading data (2 milliseconds)
[info] - FileBasedWriteAheadLogReader - sequentially reading data written with writer (4 milliseconds)
[info] - FileBasedWriteAheadLogReader - reading data written with writer after corrupted write (813 milliseconds)
[info] - FileBasedWriteAheadLogReader - handles errors when file doesn't exist (4 milliseconds)
[info] - FileBasedWriteAheadLogRandomReader - reading data using random reader (12 milliseconds)
[info] - FileBasedWriteAheadLogRandomReader- reading data using random reader written with writer (8 milliseconds)
[info] RateControllerSuite:
[info] - RateController - rate controller publishes updates after batches complete (453 milliseconds)
[info] - ReceiverRateController - published rates reach receivers (597 milliseconds)
[info] InputInfoTrackerSuite:
[info] - test report and get InputInfo from InputInfoTracker (1 millisecond)
[info] - test cleanup InputInfo from InputInfoTracker (1 millisecond)
[info] WriteAheadLogBackedBlockRDDSuite:
[info] - Read data available in both block manager and write ahead log (72 milliseconds)
[info] - Read data available only in block manager, not in write ahead log (52 milliseconds)
[info] - Read data available only in write ahead log, not in block manager (53 milliseconds)
[info] - Read data with partially available in block manager, and rest in write ahead log (51 milliseconds)
[info] - Test isBlockValid skips block fetching from BlockManager (96 milliseconds)
[info] - Test whether RDD is valid after removing blocks from block manager (95 milliseconds)
[info] - Test storing of blocks recovered from write ahead log back into block manager (96 milliseconds)
Exception in thread "block-manager-storage-async-thread-pool-2" Exception in thread "block-manager-storage-async-thread-pool-3" java.lang.Error: java.lang.InterruptedException
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
    at org.apache.spark.storage.BlockInfoManager.$anonfun$acquireLock$1(BlockInfoManager.scala:221)
    at org.apache.spark.storage.BlockInfoManager.$anonfun$acquireLock$1$adapted(BlockInfoManager.scala:214)
    at org.apache.spark.storage.BlockInfoWrapper.withLock(BlockInfoManager.scala:105)
    at org.apache.spark.storage.BlockInfoManager.acquireLock(BlockInfoManager.scala:214)
    at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:293)
    at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1930)
    at org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$1(BlockManagerStorageEndpoint.scala:47)
    at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
    at org.apache.spark.storage.BlockManagerStorageEndpoint.$anonfun$doAsync$1(BlockManagerStorageEndpoint.scala:89)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@166f245a rejected from java.util.concurrent.ThreadPoolExecutor@18a8d8e7[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
java.lang.Error: java.lang.InterruptedException
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
    at org.apache.spark.storage.BlockInfoManager.$anonfun$acquireLock$1(BlockInfoManager.scala:221)
    at org.apache.spark.storage.BlockInfoManager.$anonfun$acquireLock$1$adapted(BlockInfoManager.scala:214)
    at org.apache.spark.storage.BlockInfoWrapper.withLock(BlockInfoManager.scala:105)
    at org.apache.spark.storage.BlockInfoManager.acquireLock(BlockInfoManager.scala:214)
    at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:293)
    at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1930)
    at org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$1(BlockManagerStorageEndpoint.scala:47)
    at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
    at org.apache.spark.storage.BlockManagerStorageEndpoint.$anonfun$doAsync$1(BlockManagerStorageEndpoint.scala:89)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@5995d2dc rejected from java.util.concurrent.ThreadPoolExecutor@18a8d8e7[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@42226257 rejected from java.util.concurrent.ThreadPoolExecutor@18a8d8e7[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@2ce8e1cb rejected from java.util.concurrent.ThreadPoolExecutor@18a8d8e7[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@31f686c rejected from java.util.concurrent.ThreadPoolExecutor@18a8d8e7[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 3]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@6dd5ebf0 rejected from java.util.concurrent.ThreadPoolExecutor@18a8d8e7[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 3]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[info] - read data in block manager and WAL with encryption on (126 milliseconds)
[info] BatchedWriteAheadLogWithCloseFileAfterWriteSuite:
[info] - BatchedWriteAheadLog - read all logs (27 milliseconds)
[info] - BatchedWriteAheadLog - write logs (47 milliseconds)
[info] - BatchedWriteAheadLog - read all logs after write (57 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs (96 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs synchronously (39 milliseconds)
[info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (172 milliseconds)
[info] - BatchedWriteAheadLog - do not create directories or files unless write (2 milliseconds)
[info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (6 milliseconds)
[info] - BatchedWriteAheadLog - close after write flag (3 milliseconds)
[info] ReceivedBlockHandlerSuite:
[info] - BlockManagerBasedBlockHandler - store blocks (21 milliseconds)
[info] - BlockManagerBasedBlockHandler - handle errors in storing block (3 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - store blocks (57 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (18 milliseconds)
[info] - WriteAheadLogBasedBlockHandler - clean old blocks (15 milliseconds)
[info] - Test Block - count messages (32 milliseconds)
[info] - Test Block - isFullyConsumed (12 milliseconds)
[info] Test run started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testGreaterEq started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testDiv started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMinus started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testTimes started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testLess started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testPlus started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testGreater started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMinutes started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMilliseconds started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testLessEq started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testSeconds started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.009s
[info] Test run started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testStreamingContextTransform started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMapValues started
[info] - run Spark in yarn-cluster mode unsuccessfully (16 seconds, 26 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithInverse started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapPartitions started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFilter started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionFewerPartitions started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCombineByKey started
[info] - SPARK-32470: do not check total size of intermediate stages (6 seconds, 384 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextGetOrCreate started
[info] - [SPARK-13931] taskSetManager should not send Resubmitted tasks after being a zombie (86 milliseconds)
[info] - [SPARK-22074] Task killed by other attempt task should not be resubmitted (85 milliseconds)
[info] - speculative and noPref task should be scheduled after node-local (69 milliseconds)
[info] - node-local tasks should be scheduled right away when there are only node-local and no-preference tasks (73 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindowWithSlideDuration started
[info] - SPARK-4939: node-local tasks should be scheduled right after process-local tasks finished (74 milliseconds)
[info] - SPARK-4939: no-pref tasks should be scheduled after process-local tasks finished (70 milliseconds)
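The FileBasedWriteAheadLog and BatchedWriteAheadLog suites above test internal classes; from an application, the write-ahead log is driven by configuration. A sketch of the documented knobs (the checkpoint path is illustrative):

    import org.apache.spark.SparkConf

    val walConf = new SparkConf()
      .set("spark.streaming.receiver.writeAheadLog.enable", "true")
      // the closeFileAfterWrite variants tested above correspond to:
      .set("spark.streaming.receiver.writeAheadLog.closeFileAfterWrite", "true")
    // A checkpoint directory must also be set, e.g. ssc.checkpoint("hdfs:///checkpoints")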
[info] - Ensure TaskSetManager is usable after addition of levels (72 milliseconds)
[info] - Test that locations with HDFSCacheTaskLocation are treated as PROCESS_LOCAL. (77 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testQueueStream started
[info] - Test TaskLocation for different host type. (2 milliseconds)
[info] - Kill other task attempts when one attempt belonging to the same task succeeds (80 milliseconds)
[info] - Killing speculative tasks does not count towards aborting the taskset (78 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValue started
[info] - SPARK-19868: DagScheduler only notified of taskEnd when state is ready (160 milliseconds)
[info] - SPARK-17894: Verify TaskSetManagers for different stage attempts have unique names (75 milliseconds)
[info] - don't update excludelist for shuffle-fetch failures, preemption, denied commits, or killed tasks (86 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMap started
[info] - update application healthTracker for shuffle-fetch (80 milliseconds)
[info] - update healthTracker before adding pending task to avoid race condition (105 milliseconds)
[info] - SPARK-21563 context's added jars shouldn't change mid-TaskSet (71 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToNormalRDDTransform started
[info] - SPARK-24677: Avoid NoSuchElementException from MedianHeap (77 milliseconds)
[info] - SPARK-24755 Executor loss can cause task to not be resubmitted (86 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairReduceByKey started
[info] - SPARK-13343 speculative tasks that didn't commit shouldn't be marked as success (86 milliseconds)
[info] - SPARK-13704 Rack Resolution is done with a batch of de-duped hosts (104 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCount started
[info] - TaskSetManager passes task resource along (68 milliseconds)
[info] - SPARK-26755 Ensure that a speculative task is submitted only once for execution (76 milliseconds)
[info] - SPARK-26755 Ensure that a speculative task obeys original locality preferences (75 milliseconds)
[info] - SPARK-29976 when a speculation time threshold is provided, should speculative run the task even if there are not enough successful runs, total tasks: 1 (69 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery started
[info] - SPARK-29976: when the speculation time threshold is not provided,don't speculative run if there are not enough successful runs, total tasks: 1 (70 milliseconds)
[info] - SPARK-29976 when a speculation time threshold is provided, should speculative run the task even if there are not enough successful runs, total tasks: 2 (71 milliseconds)
[info] - SPARK-29976: when the speculation time threshold is not provided,don't speculative run if there are not enough successful runs, total tasks: 2 (67 milliseconds)
[info] - SPARK-29976 when a speculation time threshold is provided, should not speculative if there are too many tasks in the stage even though time threshold is provided (68 milliseconds)
[info] - SPARK-21040: Check speculative tasks are launched when an executor is decommissioned and the tasks running on it cannot finish within EXECUTOR_DECOMMISSION_KILL_INTERVAL (71 milliseconds)
[info] - SPARK-29976 Regular speculation configs should still take effect even when a threshold is provided (75 milliseconds)
[info] - SPARK-30417 when spark.task.cpus is greater than spark.executor.cores due to standalone settings, speculate if there is only one task in the stage (74 milliseconds)
[info] - TaskOutputFileAlreadyExistException lead to task set abortion (106 milliseconds)
[info] - Multi stages (11 seconds, 548 milliseconds)
[info] KerberosConfDriverFeatureStepSuite:
[info] - mount krb5 config map if defined (30 milliseconds)
[info] - create krb5.conf config map if local config provided (24 milliseconds)
[info] - create keytab secret if client keytab file used (24 milliseconds)
[info] - do nothing if container-local keytab used (16 milliseconds)
[info] - mount delegation tokens if provided (12 milliseconds)
[info] - create delegation tokens if needed (37 milliseconds)
[info] - do nothing if no config and no tokens (23 milliseconds)
[info] MountSecretsFeatureStepSuite:
[info] - mounts all given secrets (3 milliseconds)
[info] DriverServiceFeatureStepSuite:
[info] - Headless service has a port for the driver RPC, the block manager and driver ui. (5 milliseconds)
[info] - Hostname and ports are set according to the service name. (1 millisecond)
[info] - Ports should resolve to defaults in SparkConf and in the service. (1 millisecond)
[info] - Long prefixes should switch to using a generated unique name. (9 milliseconds)
[info] - Disallow bind address and driver host to be set explicitly. (2 milliseconds)
[info] DriverCommandFeatureStepSuite:
[info] - java resource (1 millisecond)
[info] - python resource (3 milliseconds)
[info] - python executable precedence (5 milliseconds)
[info] - R resource (1 millisecond)
[info] - SPARK-25355: java resource args with proxy-user (1 millisecond)
[info] - SPARK-25355: python resource args with proxy-user (1 millisecond)
[info] - SPARK-25355: R resource args with proxy-user (1 millisecond)
[info] KubernetesUtilsSuite:
[info] - Selects the given container as spark container. (2 milliseconds)
[info] - Selects the first container if no container name is given. (1 millisecond)
[info] - Falls back to the first container if given container name does not exist. (1 millisecond)
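The SPARK-29976 cases above concern speculative execution with a task-duration threshold. A sketch of the related configuration (values are illustrative):

    import org.apache.spark.SparkConf

    val specConf = new SparkConf()
      .set("spark.speculation", "true")
      .set("spark.speculation.quantile", "0.75")   // fraction of tasks that must finish first
      .set("spark.speculation.multiplier", "1.5")  // how much slower than the median counts as slow
      // SPARK-29976: in small stages, also speculate once a task exceeds this duration
      .set("spark.speculation.task.duration.threshold", "5min")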
[info] - constructs spark pod correctly with pod template with no containers (1 millisecond)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUnion started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMap started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindowWithInverse started
[info] - SPARK-30359: don't clean executorsPendingToRemove at the beginning of CoarseGrainedSchedulerBackend.reset (3 seconds, 794 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testGlom started
[info] - SPARK-33741 Test minimum amount of time a task runs before being considered for speculation (68 milliseconds)
[info] BlockManagerBasicStrategyReplicationSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testJoin started
[info] - get peers with addition and removal of block managers (22 milliseconds)
[info] MesosCoarseGrainedSchedulerBackendSuite:
[info] - block replication - 2x replication (87 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFlatMap started
[info] - block replication - 3x replication (122 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToPairFlatMapWithChangingTypes started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMapPartitions started
[info] - block replication - mixed between 1x to 5x (159 milliseconds)
[info] - block replication - off-heap (36 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionMorePartitions started
[info] - block replication - 2x replication without peers (1 millisecond)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithoutInverse started
[info] - block replication - replication failures (45 milliseconds)
[info] - test block replication failures when block is received by remote block manager but putBlock fails (stream = false) (46 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testLeftOuterJoin started
[info] - test block replication failures when block is received by remote block manager but putBlock fails (stream = true) (28 milliseconds)
[info] - block replication - addition and deletion of block managers (119 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransform started
[info] BlockManagerMasterSuite:
[info] - SPARK-31422: getMemoryStatus should not fail after BlockManagerMaster stops (4 milliseconds)
[info] - SPARK-31422: getStorageStatus should not fail after BlockManagerMaster stops (0 milliseconds)
[info] RDDOperationGraphSuite:
[info] - Test simple cluster equals (1 millisecond)
[info] ShuffleExternalSorterSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransformWith started
[info] - nested spill should be no-op (114 milliseconds)
[info] ChunkedByteBufferSuite:
[info] - no chunks (1 millisecond)
[info] - getChunks() duplicates chunks (1 millisecond)
[info] - copy() does not affect original buffer's position (1 millisecond)
[info] - writeFully() does not affect original buffer's position (0 milliseconds)
[info] - SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize (18 milliseconds)
[info] - toArray() (1 millisecond)
[info] - toArray() throws UnsupportedOperationException if size exceeds 2GB (2 milliseconds)
[info] - toInputStream() (1 millisecond)
[info] HistoryServerDiskManagerSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransformWith started
[info] - leasing space (50 milliseconds)
[info] - tracking active stores (10 milliseconds)
[info] - approximate size heuristic (1 millisecond)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTextFileStream started
[info] - SPARK-32024: update ApplicationStoreInfo.size during initializing (14 milliseconds)
[info] PythonBroadcastSuite:
[info] - PythonBroadcast can be serialized with Kryo (SPARK-4882) (17 milliseconds)
[info] KeyLockSuite:
[info] - The same key should wait when its lock is held (7 milliseconds)
[info] - A different key should not be locked (3 milliseconds)
[info] NettyBlockTransferServiceSuite:
[info] - can bind to a random port (30 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairGroupByKey started
[info] - can bind to two random ports (55 milliseconds)
[info] - can bind to a specific port (38 milliseconds)
[info] - can bind to a specific port twice and the second increments (67 milliseconds)
[info] - SPARK-27637: test fetch block with executor dead (74 milliseconds)
[info] BasicSchedulerIntegrationSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCoGroup started
[info] - super simple job (123 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testInitialization started
[info] - multi-stage job (214 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketString started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testGroupByKeyAndWindow started
[info] - mesos supports killing and limiting executors (3 seconds, 200 milliseconds)
[info] - job with fetch failure (348 milliseconds)
[info] - mesos supports killing and relaunching tasks with executors (213 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindow started
[info] - job failure after 4 attempts (107 milliseconds)
[info] - mesos supports spark.executor.cores (160 milliseconds)
[info] - SPARK-23626: RDD with expensive getPartitions() doesn't block scheduler loop (107 milliseconds)
[info] JobWaiterSuite:
[info] - call jobFailed multiple times (2 milliseconds)
[info] RDDBarrierSuite:
[info] - mesos supports unset spark.executor.cores (105 milliseconds)
[info] - create an RDDBarrier (3 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testForeachRDD started
[info] - RDDBarrier mapPartitionsWithIndex (26 milliseconds)
[info] - create an RDDBarrier in the middle of a chain of RDDs (6 milliseconds)
[info] - RDDBarrier with shuffle (4 milliseconds)
[info] UninterruptibleThreadSuite:
[info] - mesos does not acquire more than spark.cores.max (118 milliseconds)
[info] - mesos does not acquire gpus if not specified (117 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFileStream started
[info] - mesos does not acquire more than spark.mesos.gpus.max (111 milliseconds)
[info] - mesos declines offers that violate attribute constraints (106 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairTransform started
[info] - mesos declines offers with a filter when reached spark.cores.max (125 milliseconds)
[info] - mesos declines offers with a filter when maxCores not a multiple of executor.cores (88 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFilter started
[info] - mesos declines offers with a filter when reached spark.cores.max with executor.cores (110 milliseconds)
[info] - mesos assigns tasks round-robin on offers (94 milliseconds)
[info] - interrupt when runUninterruptibly is running (1 second, 2 milliseconds)
[info] - interrupt before runUninterruptibly runs (2 milliseconds)
[info] - nested runUninterruptibly (4 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap2 started
[info] - mesos creates multiple executors on a single agent (95 milliseconds)
[info] - mesos doesn't register twice with the same shuffle service (97 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapValues started
[info] - Port offer decline when there is no appropriate range (103 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduce started
[info] - Port offer accepted when ephemeral ports are used (97 milliseconds)
[info] - Port offer accepted with user defined port numbers (123 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKey started
[info] - mesos kills an executor when told (95 milliseconds)
[info] - weburi is set in created scheduler driver (109 milliseconds)
[info] - failover timeout is set in created scheduler driver (83 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransform started
[info] - honors unset spark.mesos.containerizer (139 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindow started
[info] - honors spark.mesos.containerizer="mesos" (88 milliseconds)
[info] - docker settings are reflected in created tasks (111 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValueAndWindow started
[info] - force-pull-image option is disabled by default (93 milliseconds)
[info] - mesos supports spark.executor.uri (94 milliseconds)
[info] - mesos supports setting fetcher cache (89 milliseconds)
[info] - stress test (1 second, 859 milliseconds)
[info] BlockManagerDecommissionUnitSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRawSocketStream started
[info] - mesos supports disabling fetcher cache (102 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketTextStream started
[info] - mesos sets task name to spark.app.name (82 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKeyWithInitial started
[info] - mesos sets configurable labels on tasks (90 milliseconds)
[info] - mesos supports spark.mesos.network.name and spark.mesos.network.labels (84 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextState started
[info] Test run finished: 0 failed, 0 ignored, 53 total, 16.861s
[info] Test run started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testStreamingContextTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMapValues started
[info] - SPARK-28778 '--hostname' shouldn't be set for executor when virtual network is enabled (456 milliseconds)
[info] - supports spark.scheduler.minRegisteredResourcesRatio (158 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMapPartitions started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFilter started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testCombineByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToNormalRDDTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairReduceByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindowWithInverse started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByWindow started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFlatMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToPairFlatMapWithChangingTypes started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMapPartitions started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testTransformWith started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransformWith started
[info] - test that with no blocks we finish migration (5 seconds, 35 milliseconds)
[info] - block decom manager with no migrations configured (6 milliseconds)
[info] - block decom manager with no peers (4 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindow started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFilter started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap2 started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMapValues started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduce started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testUpdateStateByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testTransform started
[info] Test run finished: 0 failed, 0 ignored, 26 total, 6.622s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaMapWithStateSuite.testBasicFunction started
[info] - supports data locality with dynamic allocation (6 seconds, 102 milliseconds)
[info] - Creates an env-based reference secrets. (85 milliseconds)
[info] - Creates an env-based value secrets. (100 milliseconds)
[info] - Creates file-based reference secrets. (88 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.442s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaWriteAheadLogSuite.testCustomWAL started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.002s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaReceiverAPISuite.testReceiver started
[info] - Creates a file-based value secrets. (100 milliseconds)
[info] MesosClusterManagerSuite:
[info] - mesos fine-grained (230 milliseconds)
[info] - mesos coarse-grained (71 milliseconds)
[info] - mesos with zookeeper (72 milliseconds)
[info] - mesos with i/o encryption throws error (127 milliseconds)
[info] MesosFineGrainedSchedulerBackendSuite:
[info] - weburi is set in created scheduler driver (307 milliseconds)
[info] - Use configured mesosExecutor.cores for ExecutorInfo (47 milliseconds)
[info] - check spark-class location correctly (4 milliseconds)
[info] - spark docker properties correctly populate the DockerInfo message (4 milliseconds)
[info] - mesos resource offers result in launching tasks (23 milliseconds)
[info] - can handle multiple roles (7 milliseconds)
[info] MesosProtoUtilsSuite:
[info] - mesosLabels (1 millisecond)
[info] MesosSchedulerUtilsSuite:
[info] - use at-least minimum overhead (19 milliseconds)
[info] - use overhead if it is greater than minimum value (1 millisecond)
[info] - use spark.mesos.executor.memoryOverhead (if set) (1 millisecond)
[info] - parse a non-empty constraint string correctly (7 milliseconds)
[info] - parse an empty constraint string correctly (1 millisecond)
[info] - throw an exception when the input is malformed (4 milliseconds)
[info] - empty values for attributes' constraints matches all values (4 milliseconds)
[info] - subset match is performed for set attributes (1 millisecond)
[info] - less than equal match is performed on scalar attributes (2 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 1.166s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testGreaterEq started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testLess started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testPlus started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusDuration started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testGreater started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testLessEq started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusTime started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.001s
[info] - contains match is performed for range attributes (29 milliseconds)
[info] - equality match is performed for text attributes (1 millisecond)
[info] - Port reservation is done correctly with user specified ports only (10 milliseconds)
[info] - Port reservation is done correctly with all random ports (1 millisecond)
[info] - Port reservation is done correctly with user specified ports only - multiple ranges (1 millisecond)
[info] - Port reservation is done correctly with all random ports - multiple ranges (2 milliseconds)
[info] - Principal specified via spark.mesos.principal (11 milliseconds)
[info] - Principal specified via spark.mesos.principal.file (15 milliseconds)
[info] - Principal specified via spark.mesos.principal.file that does not exist (1 millisecond)
[info] - Principal specified via SPARK_MESOS_PRINCIPAL (1 millisecond)
[info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE (1 millisecond)
[info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE that does not exist (1 millisecond)
[info] - Secret specified via spark.mesos.secret (1 millisecond)
[info] - Principal specified via spark.mesos.secret.file (1 millisecond)
[info] - Principal specified via spark.mesos.secret.file that does not exist (1 millisecond)
[info] - Principal specified via SPARK_MESOS_SECRET (0 milliseconds)
[info] - Principal specified via SPARK_MESOS_SECRET_FILE (1 millisecond)
[info] - Secret specified with no principal (1 millisecond)
[info] - Principal specification preference (0 milliseconds)
[info] - Secret specification preference (0 milliseconds)
[info] MesosRestServerSuite:
[info] - test default driver overhead memory (60 milliseconds)
[info] - test driver overhead memory with overhead factor (3 milliseconds)
[info] - test configured driver overhead memory (3 milliseconds)
[info] MesosClusterDispatcherArgumentsSuite:
Using host: 192.168.122.1
Using port: 7077
Using webUiPort: 8081
Framework Name: Spark Cluster
Spark Config properties set:
(spark.hadoop.hadoop.security.key.provider.path,test:///)
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.test.home,/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7)
(spark.mesos.key2,value2)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test if spark config args are passed successfully (18 milliseconds)
Using host: localhost
Using port: 1212
Using webUiPort: 2323
Framework Name: myFramework
Spark Config properties set:
(spark.hadoop.hadoop.security.key.provider.path,test:///)
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.test.home,/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test non conf settings (1 millisecond)
[info] MesosSchedulerBackendUtilSuite:
[info] - ContainerInfo fails to parse invalid docker parameters (4 milliseconds)
[info] - ContainerInfo parses docker parameters (1 millisecond)
[info] - SPARK-28778 ContainerInfo respects Docker network configuration (2 milliseconds)
[info] MesosClusterSchedulerSuite:
[info] - can queue drivers (8 milliseconds)
[info] - can kill queued drivers (5 milliseconds)
[info] - can handle multiple roles (22 milliseconds)
[info] - escapes commandline args for the shell (4 milliseconds)
[info] - SPARK-22256: supports spark.mesos.driver.memoryOverhead with 384mb default (4 milliseconds)
[info] - SPARK-22256: supports spark.mesos.driver.memoryOverhead with 10% default (4 milliseconds)
[info] - supports spark.mesos.driverEnv.* (4 milliseconds)
[info] - supports spark.mesos.network.name and spark.mesos.network.labels (3 milliseconds)
[info] - supports setting fetcher cache (3 milliseconds)
[info] - supports setting fetcher cache on the dispatcher (3 milliseconds)
[info] - supports disabling fetcher cache (3 milliseconds)
[info] - accept/decline offers with driver constraints (18 milliseconds)
[info] - supports spark.mesos.driver.labels (3 milliseconds)
[info] - can kill supervised drivers (5 milliseconds)
[info] - block decom manager with only shuffle files time moves forward (5 seconds, 32 milliseconds)
[info] BLASSuite:
[info] - SPARK-27347: do not restart outdated supervised drivers (1 second, 508 milliseconds)
[info] - Declines offer with refuse seconds = 120. (3 milliseconds)
[info] - Creates an env-based reference secrets. (4 milliseconds)
[info] - Creates an env-based value secrets. (5 milliseconds)
[info] - Creates file-based reference secrets. (4 milliseconds)
[info] - Creates a file-based value secrets. (4 milliseconds)
[info] - assembles a valid driver command, escaping all confs and args (3 milliseconds)
[info] - SPARK-23499: Test dispatcher priority queue with non float value (2 milliseconds)
[info] - SPARK-23499: Get driver priority (3 milliseconds)
[info] - SPARK-23499: Can queue drivers with priority (4 milliseconds)
[info] - SPARK-23499: Can queue drivers with negative priority (1 millisecond)
[info] MesosClusterDispatcherSuite:
[info] - prints usage on empty input (7 milliseconds)
[info] - prints usage with only --help (2 milliseconds)
[info] - prints error with unrecognized options (1 millisecond)
[info] - nativeL1Threshold (59 milliseconds)
[info] - copy (21 milliseconds)
[info] - scal (1 millisecond)
[info] - axpy (6 milliseconds)
[info] - dot (3 milliseconds)
[info] - spr (3 milliseconds)
[info] - syr (11 milliseconds)
[info] - gemm (13 milliseconds)
[info] - gemv (12 milliseconds)
[info] - spmv (1 millisecond)
[info] UtilsSuite:
[info] - EPSILON (7 milliseconds)
[info] TestingUtilsSuite:
[info] Run completed in 6 minutes, 24 seconds.
[info] Total number of tests run: 34
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 34, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] - Comparing doubles using relative error. (10 milliseconds)
[info] - Comparing doubles using absolute error. (2 milliseconds)
[info] - Comparing vectors using relative error. (5 milliseconds)
[info] - Comparing vectors using absolute error. (3 milliseconds)
[info] - Comparing Matrices using absolute error. (346 milliseconds)
[info] - Comparing Matrices using relative error. (7 milliseconds)
[info] BreezeMatrixConversionSuite:
[info] - dense matrix to breeze (0 milliseconds)
[info] - dense breeze matrix to matrix (1 millisecond)
[info] - sparse matrix to breeze (126 milliseconds)
[info] - sparse breeze matrix to sparse matrix (13 milliseconds)
[info] BreezeVectorConversionSuite:
[info] - dense to breeze (331 milliseconds)
[info] - sparse to breeze (189 milliseconds)
[info] - dense breeze to vector (1 millisecond)
[info] - sparse breeze to vector (0 milliseconds)
[info] - sparse breeze with partially-used arrays to vector (5 milliseconds)
[info] MultivariateGaussianSuite:
Dec 03, 2021 11:01:50 PM com.github.fommil.netlib.LAPACK <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
Dec 03, 2021 11:01:51 PM com.github.fommil.netlib.LAPACK <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
Dec 03, 2021 11:01:51 PM com.github.fommil.netlib.BLAS <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
Dec 03, 2021 11:01:51 PM com.github.fommil.netlib.BLAS <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
[info] - univariate (128 milliseconds)
[info] - multivariate (8 milliseconds)
[info] - multivariate degenerate (2 milliseconds)
[info] - SPARK-11302 (8 milliseconds)
[info] MatricesSuite:
[info] - dense matrix construction (0 milliseconds)
[info] - dense matrix construction with wrong dimension (1 millisecond)
[info] - sparse matrix construction (126 milliseconds)
[info] - sparse matrix construction with wrong number of elements (1 millisecond)
[info] - index in matrices incorrect input (4 milliseconds)
[info] - equals (3 milliseconds)
[info] - matrix copies are deep copies (1 millisecond)
[info] - matrix indexing and updating (1 millisecond)
[info] - dense to dense (1 millisecond)
[info] - dense to sparse (2 milliseconds)
[info] - sparse to sparse (3 milliseconds)
[info] - sparse to dense (3 milliseconds)
[info] - compressed dense (5 milliseconds)
[info] - compressed sparse (2 milliseconds)
[info] - map, update (2 milliseconds)
[info] - transpose (1 millisecond)
[info] - foreachActive (1 millisecond)
[info] - horzcat, vertcat, eye, speye (13 milliseconds)
[info] - zeros (1 millisecond)
[info] - ones (2 milliseconds)
[info] - eye (0 milliseconds)
[info] CastSuite:
[info] - rand (568 milliseconds)
[info] - randn (2 milliseconds)
[info] - diag (1 millisecond)
[info] - sprand (29 milliseconds)
[info] - sprandn (3 milliseconds)
[info] - toString (9 milliseconds)
[info] - numNonzeros and numActives (1 millisecond)
[info] - fromBreeze with sparse matrix (33 milliseconds)
[info] - row/col iterator (5 milliseconds)
[info] VectorsSuite:
[info] - dense vector construction with varargs (0 milliseconds)
[info] - dense vector construction from a double array (0 milliseconds)
[info] - sparse vector construction (0 milliseconds)
[info] - sparse vector construction with unordered elements (2 milliseconds)
[info] - sparse vector construction with mismatched indices/values array (1 millisecond)
[info] - sparse vector construction with too many indices vs size (1 millisecond)
[info] - sparse vector construction with negative indices (0 milliseconds)
[info] - dense to array (0 milliseconds)
[info] - dense argmax (1 millisecond)
[info] - sparse to array (0 milliseconds)
[info] - sparse argmax (0 milliseconds)
[info] - vector equals (3 milliseconds)
[info] - vectors equals with explicit 0 (1 millisecond)
[info] - indexing dense vectors (0 milliseconds)
[info] - indexing sparse vectors (0 milliseconds)
[info] - zeros (0 milliseconds)
[info] - Vector.copy (1 millisecond)
[info] - fromBreeze (1 millisecond)
[info] - sqdist (48 milliseconds)
[info] - foreach (10 milliseconds)
[info] - foreachActive (3 milliseconds)
[info] - foreachNonZero (2 milliseconds)
[info] - vector p-norm (5 milliseconds)
[info] - Vector numActive and numNonzeros (1 millisecond)
[info] - Vector toSparse and toDense (2 milliseconds)
[info] - Vector.compressed (1 millisecond)
[info] - SparseVector.slice (1 millisecond)
[info] - SparseVector.slice with sorted indices (0 milliseconds)
[info] - sparse vector only support non-negative length (1 millisecond)
[info] - dot product only supports vectors of same size (1 millisecond)
[info] - dense vector dot product (1 millisecond)
[info] - sparse vector dot product (0 milliseconds)
[info] - mixed sparse and dense vector dot product (0 milliseconds)
[info] - iterator (2 milliseconds)
[info] - activeIterator (3 milliseconds)
[info] - nonZeroIterator (2 milliseconds)
[info] - block decom manager does not re-add removed shuffle files (5 seconds, 6 milliseconds)
[info] - run Spark in yarn-cluster mode failure after sc initialized (32 seconds, 38 milliseconds)
[info] HashExpressionsSuite:
[info] - md5 (2 seconds, 421 milliseconds)
[info] - null cast (7 seconds, 130 milliseconds)
[info] - sha1 (202 milliseconds)
[info] - sha2 (520 milliseconds)
[info] - block decom manager handles IO failures (5 seconds, 34 milliseconds)
[info] - cast string to date (695 milliseconds)
[info] - crc32 (139 milliseconds)
[info] - hive-hash for null (4 milliseconds)
[info] - hive-hash for boolean (1 millisecond)
[info] - hive-hash for byte (2 milliseconds)
[info] - hive-hash for short (1 millisecond)
[info] - hive-hash for int (1 millisecond)
[info] - hive-hash for long (2 milliseconds)
[info] - hive-hash for float (1 millisecond)
[info] - hive-hash for double (1 millisecond)
[info] - hive-hash for string (1 millisecond)
[info] - hive-hash for date type (6 milliseconds)
[info] - hive-hash for timestamp type (40 milliseconds)
[info] - hive-hash for CalendarInterval type (17 milliseconds)
[info] - hive-hash for array (3 milliseconds)
[info] - hive-hash for map (1 millisecond)
[info] - hive-hash for struct (3 milliseconds)
[info] - murmur3/xxHash64/hive hash: struct<null:void,boolean:boolean,byte:tinyint,short:smallint,int:int,long:bigint,float:float,double:double,bigDecimal:decimal(38,18),smallDecimal:decimal(10,0),string:string,binary:binary,date:date,timestamp:timestamp,udt:examplepoint> (4 seconds, 582 milliseconds)
[info] - block decom manager short circuits removed blocks (5 seconds, 34 milliseconds)
[info] - test shuffle and cached rdd migration without any error (52 milliseconds)
[info] DriverSuite:
[info] - driver should exit after finishing without cleanup (SPARK-530) !!! IGNORED !!!
[info] CompactBufferSuite:
[info] - empty buffer (3 milliseconds)
[info] - basic inserts (5 milliseconds)
[info] - adding sequences (4 milliseconds)
[info] - adding the same buffer to itself (1 millisecond)
[info] MapStatusSuite:
[info] - compressSize (1 millisecond)
[info] - decompressSize (1 millisecond)
[info] - SPARK-30633: xxHash64 with long seed: struct<null:void,boolean:boolean,byte:tinyint,short:smallint,int:int,long:bigint,float:float,double:double,bigDecimal:decimal(38,18),smallDecimal:decimal(10,0),string:string,binary:binary,date:date,timestamp:timestamp,udt:examplepoint> (697 milliseconds)
[info] - MapStatus should never report non-empty blocks' sizes as 0 (538 milliseconds)
[info] - large tasks should use org.apache.spark.scheduler.HighlyCompressedMapStatus (2 milliseconds)
[info] - HighlyCompressedMapStatus: estimated size should be the average non-empty block size (7 milliseconds)
[info] - SPARK-22540: ensure HighlyCompressedMapStatus calculates correct avgSize (19 milliseconds)
[info] - RoaringBitmap: runOptimize succeeded (15 milliseconds)
[info] - RoaringBitmap: runOptimize failed (5 milliseconds)
[info] - Blocks which are bigger than SHUFFLE_ACCURATE_BLOCK_THRESHOLD should not be underestimated. (8 milliseconds)
[info] - cast string to timestamp (7 seconds, 18 milliseconds)
[info] - cast from boolean (143 milliseconds)
[info] - cast from int (254 milliseconds)
[info] - cast from long (195 milliseconds)
[info] - cast from float (147 milliseconds)
[info] - cast from double (146 milliseconds)
[info] - cast from string (6 milliseconds)
[info] - SPARK-21133 HighlyCompressedMapStatus#writeExternal throws NPE (5 seconds, 825 milliseconds)
[info] BlockInfoManagerSuite:
[info] - initial memory usage (0 milliseconds)
[info] - get non-existent block (1 millisecond)
[info] - basic lockNewBlockForWriting (2 milliseconds)
[info] - lockNewBlockForWriting blocks while write lock is held, then returns false after release (303 milliseconds)
[info] - lockNewBlockForWriting blocks while write lock is held, then returns true after removal (304 milliseconds)
[info] - read locks are reentrant (1 millisecond)
[info] - multiple tasks can hold read locks (2 milliseconds)
[info] - single task can hold write lock (2 milliseconds)
[info] - cannot grab a writer lock while already holding a write lock (1 millisecond)
[info] - assertBlockIsLockedForWriting throws exception if block is not locked (1 millisecond)
[info] - downgrade lock (1 millisecond)
[info] - write lock will block readers (302 milliseconds)
[info] - read locks will block writer (304 milliseconds)
[info] - removing a non-existent block throws SparkException (1 millisecond)
[info] - removing a block without holding any locks throws IllegalStateException (1 millisecond)
[info] - removing a block while holding only a read lock throws IllegalStateException (1 millisecond)
[info] - removing a block causes blocked callers to receive None (302 milliseconds)
[info] - releaseAllLocksForTask releases write locks (1 millisecond)
[info] StoragePageSuite:
[info] - rddTable (7 milliseconds)
[info] - empty rddTable (1 millisecond)
[info] - streamBlockStorageLevelDescriptionAndSize (0 milliseconds)
[info] - receiverBlockTables (10 milliseconds)
[info] - empty receiverBlockTables (0 milliseconds)
[info] TaskSchedulerImplSuite:
[info] - SPARK-32653: Decommissioned host/executor should be considered as inactive (88 milliseconds)
[info] - Scheduler does not always schedule tasks on the same workers (806 milliseconds)
[info] - Scheduler correctly accounts for multiple CPUs per task (83 milliseconds)
[info] - SPARK-18886 - partial offers (isAllFreeResources = false) reset timer before any resources have been rejected (76 milliseconds)
[info] - SPARK-18886 - delay scheduling timer is reset when it accepts all resources offered when isAllFreeResources = true (78 milliseconds)
[info] - SPARK-18886 - task set with no locality requirements should not starve one with them (80 milliseconds)
[info] - SPARK-18886 - partial resource offers (isAllFreeResources = false) reset time if last full resource offer (isAllResources = true) was accepted as well as any following partial resource offers (70 milliseconds)
[info] - SPARK-18886 - partial resource offers (isAllFreeResources = false) do not reset time if any offer was rejected since last full offer was fully accepted (76 milliseconds)
[info] - Scheduler does not crash when tasks are not serializable (76 milliseconds)
[info] - concurrent attempts for the same stage only have one active taskset (72 milliseconds)
[info] - don't schedule more tasks after a taskset is zombie (73 milliseconds)
[info] - if a zombie attempt finishes, continue scheduling tasks for non-zombie attempts (77 milliseconds)
[info] - tasks are not re-scheduled while executor loss reason is pending (73 milliseconds)
[info] - scheduled tasks obey task and stage excludelist (148 milliseconds)
[info] - scheduled tasks obey node and executor excludelists (119 milliseconds)
[info] - abort stage when all executors are excluded and we cannot acquire new executor (167 milliseconds)
[info] - SPARK-22148 abort timer should kick in when task is completely excluded & no new executor can be acquired (84 milliseconds)
[info] - SPARK-22148 try to acquire a new executor when task is unschedulable with 1 executor (77 milliseconds)
[info] - SPARK-22148 abort timer should clear unschedulableTaskSetToExpiryTime for all TaskSets (99 milliseconds)
[info] - murmur3/xxHash64/hive hash: struct<arrayOfNull:array<void>,arrayOfString:array<string>,arrayOfArrayOfString:array<array<string>>,arrayOfArrayOfInt:array<array<int>>,arrayOfStruct:array<struct<str:string>>,arrayOfUDT:array<examplepoint>> (10 seconds, 331 milliseconds)
[info] - SPARK-22148 Ensure we don't abort the taskSet if we haven't been completely excluded (95 milliseconds)
[info] - SPARK-31418 abort timer should kick in when task is completely excluded &allocation manager could not acquire a new executor before the timeout (82 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 0 (139 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 1 (149 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 2 (121 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 3 (162 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 4 (144 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 5 (139 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 6 (124 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 7 (124 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 8 (125 milliseconds)
[info] - Excluded node for entire task set prevents per-task exclusion checks: iteration 9 (128 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 0 (184 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 1 (136 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 2 (136 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 3 (121 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 4 (147 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 5 (137 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion checks: iteration 6 (121 milliseconds)
[info] - SPARK-30633: xxHash64 with long seed: struct<arrayOfNull:array<void>,arrayOfString:array<string>,arrayOfArrayOfString:array<array<string>>,arrayOfArrayOfInt:array<array<int>>,arrayOfStruct:array<struct<str:string>>,arrayOfUDT:array<examplepoint>> (2 seconds, 792 milliseconds)
[info] - Excluded executor for entire task set prevents per-task exclusion chec