Started by an SCM change
Running as SYSTEM
[EnvInject] - Loading node environment variables.
[EnvInject] - Preparing an environment for the build.
[EnvInject] - Keeping Jenkins system variables.
[EnvInject] - Keeping Jenkins build variables.
[EnvInject] - Injecting as environment variables the properties content
AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.6
SPARK_BRANCH=branch-2.4
PATH=/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/jenkins/gems/bin:/usr/local/go/bin:/home/jenkins/go-projects/bin:/home/jenkins/anaconda2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
LANG=en_US.UTF-8
SPARK_TESTING=1
JAVA_HOME=/usr/java/latest
AMPLAB_JENKINS="true"

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting contributions.
Building remotely on research-jenkins-worker-09 (ubuntu ubuntu-gpu research-09 ubuntu-avx2) in workspace /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6
The recommended git tool is: NONE
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/spark.git # timeout=10
Fetching upstream changes from https://github.com/apache/spark.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/spark.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/branch-2.4^{commit} # timeout=10
Checking out Revision e0e1e21ee84bb3cb20eefee982da59afaa250a2c (origin/branch-2.4)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0e1e21ee84bb3cb20eefee982da59afaa250a2c # timeout=10
Commit message: "[SPARK-34125][CORE][2.4] Make EventLoggingListener.codecMap thread-safe"
 > git rev-list --no-walk 7ae6c8d985b8f512555a9f373b7b445216006c53 # timeout=10
[spark-branch-2.4-test-sbt-hadoop-2.6] $ /bin/bash /tmp/jenkins8684197816929567029.sh
Removing R/SparkR.Rcheck/
Removing R/SparkR_2.4.8.tar.gz
Removing R/cran-check.out
Removing R/lib/
Removing R/pkg/man/
Removing R/pkg/tests/fulltests/Rplots.pdf
Removing R/target/
Removing R/unit-tests.out
Removing append/
Removing assembly/target/
Removing build/sbt-launch-0.13.17.jar
Removing build/scala-2.11.12/
Removing build/zinc-0.3.15/
Removing common/kvstore/target/
Removing common/network-common/target/
Removing common/network-shuffle/target/
Removing common/network-yarn/target/
Removing common/sketch/target/
Removing common/tags/target/
Removing common/unsafe/target/
Removing core/derby.log
Removing core/dummy/
Removing core/ignored/
Removing core/metastore_db/
Removing core/target/
Removing derby.log
Removing dev/__pycache__/
Removing dev/create-release/__pycache__/
Removing dev/lint-r-report.log
Removing dev/pr-deps/
Removing dev/pycodestyle-2.4.0.py
Removing dev/sparktestsupport/__init__.pyc
Removing dev/sparktestsupport/__pycache__/
Removing dev/sparktestsupport/modules.pyc
Removing dev/sparktestsupport/shellutils.pyc
Removing dev/sparktestsupport/toposort.pyc
Removing dev/target/
Removing examples/src/main/python/__pycache__/
Removing examples/src/main/python/ml/__pycache__/
Removing examples/src/main/python/mllib/__pycache__/
Removing examples/src/main/python/sql/__pycache__/
Removing examples/src/main/python/sql/streaming/__pycache__/
Removing examples/src/main/python/streaming/__pycache__/
Removing examples/target/
Removing external/avro/spark-warehouse/
Removing external/avro/target/
Removing external/flume-assembly/target/
Removing external/flume-sink/target/
Removing external/flume/checkpoint/
Removing external/flume/target/
Removing external/kafka-0-10-assembly/target/
Removing external/kafka-0-10-sql/spark-warehouse/
Removing external/kafka-0-10-sql/target/
Removing external/kafka-0-10/target/
Removing external/kafka-0-8-assembly/target/
Removing external/kafka-0-8/target/
Removing external/kinesis-asl-assembly/target/
Removing external/kinesis-asl/checkpoint/
Removing external/kinesis-asl/src/main/python/examples/streaming/__pycache__/
Removing external/kinesis-asl/target/
Removing external/spark-ganglia-lgpl/target/
Removing graphx/target/
Removing launcher/target/
Removing lib/
Removing logs/
Removing metastore_db/
Removing mllib-local/target/
Removing mllib/checkpoint/
Removing mllib/spark-warehouse/
Removing mllib/target/
Removing project/project/
Removing project/target/
Removing python/.eggs/
Removing python/__pycache__/
Removing python/dist/
Removing python/docs/__pycache__/
Removing python/docs/_build/
Removing python/docs/epytext.pyc
Removing python/lib/pyspark.zip
Removing python/pyspark.egg-info/
Removing python/pyspark/__init__.pyc
Removing python/pyspark/__pycache__/
Removing python/pyspark/_globals.pyc
Removing python/pyspark/accumulators.pyc
Removing python/pyspark/broadcast.pyc
Removing python/pyspark/cloudpickle.pyc
Removing python/pyspark/conf.pyc
Removing python/pyspark/context.pyc
Removing python/pyspark/files.pyc
Removing python/pyspark/find_spark_home.pyc
Removing python/pyspark/heapq3.pyc
Removing python/pyspark/java_gateway.pyc
Removing python/pyspark/join.pyc
Removing python/pyspark/ml/__init__.pyc
Removing python/pyspark/ml/__pycache__/
Removing python/pyspark/ml/base.pyc
Removing python/pyspark/ml/classification.pyc
Removing python/pyspark/ml/clustering.pyc
Removing python/pyspark/ml/common.pyc
Removing python/pyspark/ml/evaluation.pyc
Removing python/pyspark/ml/feature.pyc
Removing python/pyspark/ml/fpm.pyc
Removing python/pyspark/ml/image.pyc
Removing python/pyspark/ml/linalg/__init__.pyc
Removing python/pyspark/ml/linalg/__pycache__/
Removing python/pyspark/ml/param/__init__.pyc
Removing python/pyspark/ml/param/__pycache__/
Removing python/pyspark/ml/param/shared.pyc
Removing python/pyspark/ml/pipeline.pyc
Removing python/pyspark/ml/recommendation.pyc
Removing python/pyspark/ml/regression.pyc
Removing python/pyspark/ml/stat.pyc
Removing python/pyspark/ml/tests.pyc
Removing python/pyspark/ml/tuning.pyc
Removing python/pyspark/ml/util.pyc
Removing python/pyspark/ml/wrapper.pyc
Removing python/pyspark/mllib/__init__.pyc
Removing python/pyspark/mllib/__pycache__/
Removing python/pyspark/mllib/classification.pyc
Removing python/pyspark/mllib/clustering.pyc
Removing python/pyspark/mllib/common.pyc
Removing python/pyspark/mllib/evaluation.pyc
Removing python/pyspark/mllib/feature.pyc
Removing python/pyspark/mllib/fpm.pyc
Removing python/pyspark/mllib/linalg/__init__.pyc
Removing python/pyspark/mllib/linalg/__pycache__/
Removing python/pyspark/mllib/linalg/distributed.pyc
Removing python/pyspark/mllib/random.pyc
Removing python/pyspark/mllib/recommendation.pyc
Removing python/pyspark/mllib/regression.pyc
Removing python/pyspark/mllib/stat/KernelDensity.pyc
Removing python/pyspark/mllib/stat/__init__.pyc
Removing python/pyspark/mllib/stat/__pycache__/
Removing python/pyspark/mllib/stat/_statistics.pyc
Removing python/pyspark/mllib/stat/distribution.pyc
Removing python/pyspark/mllib/stat/test.pyc
Removing python/pyspark/mllib/tests.pyc
Removing python/pyspark/mllib/tree.pyc
Removing python/pyspark/mllib/util.pyc
Removing python/pyspark/profiler.pyc
Removing python/pyspark/python/
Removing python/pyspark/rdd.pyc
Removing python/pyspark/rddsampler.pyc
Removing python/pyspark/resultiterable.pyc
Removing python/pyspark/serializers.pyc
Removing python/pyspark/shuffle.pyc
Removing python/pyspark/sql/__init__.pyc
Removing python/pyspark/sql/__pycache__/
Removing python/pyspark/sql/catalog.pyc
Removing python/pyspark/sql/column.pyc
Removing python/pyspark/sql/conf.pyc
Removing python/pyspark/sql/context.pyc
Removing python/pyspark/sql/dataframe.pyc
Removing python/pyspark/sql/functions.pyc
Removing python/pyspark/sql/group.pyc
Removing python/pyspark/sql/readwriter.pyc
Removing python/pyspark/sql/session.pyc
Removing python/pyspark/sql/streaming.pyc
Removing python/pyspark/sql/tests.pyc
Removing python/pyspark/sql/types.pyc
Removing python/pyspark/sql/udf.pyc
Removing python/pyspark/sql/utils.pyc
Removing python/pyspark/sql/window.pyc
Removing python/pyspark/statcounter.pyc
Removing python/pyspark/status.pyc
Removing python/pyspark/storagelevel.pyc
Removing python/pyspark/streaming/__init__.pyc
Removing python/pyspark/streaming/__pycache__/
Removing python/pyspark/streaming/context.pyc
Removing python/pyspark/streaming/dstream.pyc
Removing python/pyspark/streaming/flume.pyc
Removing python/pyspark/streaming/kafka.pyc
Removing python/pyspark/streaming/kinesis.pyc
Removing python/pyspark/streaming/listener.pyc
Removing python/pyspark/streaming/tests.pyc
Removing python/pyspark/streaming/util.pyc
Removing python/pyspark/taskcontext.pyc
Removing python/pyspark/test_broadcast.pyc
Removing python/pyspark/test_serializers.pyc
Removing python/pyspark/tests.pyc
Removing python/pyspark/traceback_utils.pyc
Removing python/pyspark/util.pyc
Removing python/pyspark/version.pyc
Removing python/pyspark/worker.pyc
Removing python/target/
Removing python/test_coverage/__pycache__/
Removing python/test_support/__pycache__/
Removing repl/spark-warehouse/
Removing repl/target/
Removing resource-managers/kubernetes/core/target/
Removing resource-managers/kubernetes/integration-tests/tests/__pycache__/
Removing resource-managers/mesos/target/
Removing resource-managers/yarn/target/
Removing scalastyle-on-compile.generated.xml
Removing spark-warehouse/
Removing sql/__pycache__/
Removing sql/catalyst/loc/
Removing sql/catalyst/target/
Removing sql/core/loc/
Removing sql/core/paris/
Removing sql/core/spark-warehouse/
Removing sql/core/target/
Removing sql/hive-thriftserver/derby.log
Removing sql/hive-thriftserver/metastore_db/
Removing sql/hive-thriftserver/spark-warehouse/
Removing sql/hive-thriftserver/target/
Removing sql/hive/derby.log
Removing sql/hive/loc/
Removing sql/hive/metastore_db/
Removing sql/hive/src/test/resources/data/scripts/__pycache__/
Removing sql/hive/target/
Removing streaming/checkpoint/
Removing streaming/target/
Removing target/
Removing tools/target/
Removing work/
+++ dirname /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/install-dev.sh
++ cd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R
++ pwd
+ FWDIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R
+ LIB_DIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/lib
+ mkdir -p /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/lib
+ pushd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R
+ . /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/find-r.sh
++ '[' -z '' ']'
++ '[' '!' -z '' ']'
+++ command -v R
++ '[' '!' /usr/bin/R ']'
++++ which R
+++ dirname /usr/bin/R
++ R_SCRIPT_PATH=/usr/bin
++ echo 'Using R_SCRIPT_PATH = /usr/bin'
Using R_SCRIPT_PATH = /usr/bin
+ . /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/create-rd.sh
++ set -o pipefail
++ set -e
++++ dirname /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/create-rd.sh
+++ cd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R
++ pushd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R
++ . /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ /usr/bin/Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
Loading required package: usethis
Updating SparkR documentation
First time using roxygen2. Upgrading automatically...
Updating roxygen version in /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/pkg/DESCRIPTION
Loading SparkR
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
Warning: [/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/pkg/R/SQLContext.R:592] @name May only use one @name per block
Warning: [/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/pkg/R/SQLContext.R:733] @name May only use one @name per block
Writing structType.Rd
Writing print.structType.Rd
Writing structField.Rd
Writing print.structField.Rd
Writing summarize.Rd
Writing alias.Rd
Writing arrange.Rd
Writing as.data.frame.Rd
Writing cache.Rd
Writing checkpoint.Rd
Writing coalesce.Rd
Writing collect.Rd
Writing columns.Rd
Writing coltypes.Rd
Writing count.Rd
Writing cov.Rd
Writing corr.Rd
Writing createOrReplaceTempView.Rd
Writing cube.Rd
Writing dapply.Rd
Writing dapplyCollect.Rd
Writing gapply.Rd
Writing gapplyCollect.Rd
Writing describe.Rd
Writing distinct.Rd
Writing drop.Rd
Writing dropDuplicates.Rd
Writing nafunctions.Rd
Writing dtypes.Rd
Writing explain.Rd
Writing except.Rd
Writing exceptAll.Rd
Writing filter.Rd
Writing first.Rd
Writing groupBy.Rd
Writing hint.Rd
Writing insertInto.Rd
Writing intersect.Rd
Writing intersectAll.Rd
Writing isLocal.Rd
Writing isStreaming.Rd
Writing limit.Rd
Writing localCheckpoint.Rd
Writing merge.Rd
Writing mutate.Rd
Writing orderBy.Rd
Writing persist.Rd
Writing printSchema.Rd
Writing registerTempTable-deprecated.Rd
Writing rename.Rd
Writing repartition.Rd
Writing repartitionByRange.Rd
Writing sample.Rd
Writing rollup.Rd
Writing sampleBy.Rd
Writing saveAsTable.Rd
Writing take.Rd
Writing write.df.Rd
Writing write.jdbc.Rd
Writing write.json.Rd
Writing write.orc.Rd
Writing write.parquet.Rd
Writing write.stream.Rd
Writing write.text.Rd
Writing schema.Rd
Writing select.Rd
Writing selectExpr.Rd
Writing showDF.Rd
Writing subset.Rd
Writing summary.Rd
Writing union.Rd
Writing unionByName.Rd
Writing unpersist.Rd
Writing with.Rd
Writing withColumn.Rd
Writing withWatermark.Rd
Writing randomSplit.Rd
Writing broadcast.Rd
Writing columnfunctions.Rd
Writing between.Rd
Writing cast.Rd
Writing endsWith.Rd
Writing startsWith.Rd
Writing column_nonaggregate_functions.Rd
Writing otherwise.Rd
Writing over.Rd
Writing eq_null_safe.Rd
Writing partitionBy.Rd
Writing rowsBetween.Rd
Writing rangeBetween.Rd
Writing windowPartitionBy.Rd
Writing windowOrderBy.Rd
Writing column_datetime_diff_functions.Rd
Writing column_aggregate_functions.Rd
Writing column_collection_functions.Rd
Writing column_string_functions.Rd
Writing avg.Rd
Writing column_math_functions.Rd
Writing column.Rd
Writing column_misc_functions.Rd
Writing column_window_functions.Rd
Writing column_datetime_functions.Rd
Writing last.Rd
Writing not.Rd
Writing fitted.Rd
Writing predict.Rd
Writing rbind.Rd
Writing spark.als.Rd
Writing spark.bisectingKmeans.Rd
Writing spark.gaussianMixture.Rd
Writing spark.gbt.Rd
Writing spark.glm.Rd
Writing spark.isoreg.Rd
Writing spark.kmeans.Rd
Writing spark.kstest.Rd
Writing spark.lda.Rd
Writing spark.logit.Rd
Writing spark.mlp.Rd
Writing spark.naiveBayes.Rd
Writing spark.decisionTree.Rd
Writing spark.randomForest.Rd
Writing spark.survreg.Rd
Writing spark.svmLinear.Rd
Writing spark.fpGrowth.Rd
Writing write.ml.Rd
Writing awaitTermination.Rd
Writing isActive.Rd
Writing lastProgress.Rd
Writing queryName.Rd
Writing status.Rd
Writing stopQuery.Rd
Writing print.jobj.Rd
Writing show.Rd
Writing substr.Rd
Writing match.Rd
Writing GroupedData.Rd
Writing pivot.Rd
Writing SparkDataFrame.Rd
Writing storageLevel.Rd
Writing toJSON.Rd
Writing nrow.Rd
Writing ncol.Rd
Writing dim.Rd
Writing head.Rd
Writing join.Rd
Writing crossJoin.Rd
Writing attach.Rd
Writing str.Rd
Writing histogram.Rd
Writing getNumPartitions.Rd
Writing sparkR.conf.Rd
Writing sparkR.version.Rd
Writing createDataFrame.Rd
Writing read.json.Rd
Writing read.orc.Rd
Writing read.parquet.Rd
Writing read.text.Rd
Writing sql.Rd
Writing tableToDF.Rd
Writing read.df.Rd
Writing read.jdbc.Rd
Writing read.stream.Rd
Writing WindowSpec.Rd
Writing createExternalTable-deprecated.Rd
Writing createTable.Rd
Writing cacheTable.Rd
Writing uncacheTable.Rd
Writing clearCache.Rd
Writing dropTempTable-deprecated.Rd
Writing dropTempView.Rd
Writing tables.Rd
Writing tableNames.Rd
Writing currentDatabase.Rd
Writing setCurrentDatabase.Rd
Writing listDatabases.Rd
Writing listTables.Rd
Writing listColumns.Rd
Writing listFunctions.Rd
Writing recoverPartitions.Rd
Writing refreshTable.Rd
Writing refreshByPath.Rd
Writing spark.addFile.Rd
Writing spark.getSparkFilesRootDirectory.Rd
Writing spark.getSparkFiles.Rd
Writing spark.lapply.Rd
Writing setLogLevel.Rd
Writing setCheckpointDir.Rd
Writing install.spark.Rd
Writing sparkR.callJMethod.Rd
Writing sparkR.callJStatic.Rd
Writing sparkR.newJObject.Rd
Writing LinearSVCModel-class.Rd
Writing LogisticRegressionModel-class.Rd
Writing MultilayerPerceptronClassificationModel-class.Rd
Writing NaiveBayesModel-class.Rd
Writing BisectingKMeansModel-class.Rd
Writing GaussianMixtureModel-class.Rd
Writing KMeansModel-class.Rd
Writing LDAModel-class.Rd
Writing FPGrowthModel-class.Rd
Writing ALSModel-class.Rd
Writing AFTSurvivalRegressionModel-class.Rd
Writing GeneralizedLinearRegressionModel-class.Rd
Writing IsotonicRegressionModel-class.Rd
Writing glm.Rd
Writing KSTest-class.Rd
Writing GBTRegressionModel-class.Rd
Writing GBTClassificationModel-class.Rd
Writing RandomForestRegressionModel-class.Rd
Writing RandomForestClassificationModel-class.Rd
Writing DecisionTreeRegressionModel-class.Rd
Writing DecisionTreeClassificationModel-class.Rd
Writing read.ml.Rd
Writing sparkR.session.stop.Rd
Writing sparkR.init-deprecated.Rd
Writing sparkRSQL.init-deprecated.Rd
Writing sparkRHive.init-deprecated.Rd
Writing sparkR.session.Rd
Writing sparkR.uiWebUrl.Rd
Writing setJobGroup.Rd
Writing clearJobGroup.Rd
Writing cancelJobGroup.Rd
Writing setJobDescription.Rd
Writing setLocalProperty.Rd
Writing getLocalProperty.Rd
Writing crosstab.Rd
Writing freqItems.Rd
Writing approxQuantile.Rd
Writing StreamingQuery.Rd
Writing hashCode.Rd
+ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/lib /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/pkg/
* installing *source* package ‘SparkR’ ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (SparkR)
+ cd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/lib
+ jar cfM /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/R/lib/sparkr.zip SparkR
+ popd
[info] Using build tool sbt with Hadoop profile hadoop2.6 under environment amplab_jenkins
[info] Found the following changed modules: root
[info] Setup the following environment variables for tests:

========================================================================
Running Apache RAT checks
========================================================================
Attempting to fetch rat
RAT checks passed.

========================================================================
Running Scala style checks
========================================================================
Scalastyle checks passed.
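Note: the SparkR steps traced above are driven by R/install-dev.sh. A minimal sketch of reproducing them by hand from a branch-2.4 checkout follows; SPARK_HOME is an assumed variable, everything else mirrors the trace:

    # Regenerate the SparkR Rd docs and install the package, as install-dev.sh does.
    cd "$SPARK_HOME/R"
    mkdir -p lib
    # create-rd.sh step: regenerate roxygen2 docs via devtools, if it is installed
    Rscript -e 'if ("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg = "./pkg", roclets = c("rd")) }'
    # Install the package into R/lib and zip it for shipping to executors
    R CMD INSTALL --library="$PWD/lib" ./pkg/
    (cd lib && jar cfM sparkr.zip SparkR)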
========================================================================
Running Python style checks
========================================================================
pycodestyle checks passed.
rm -rf _build/*
pydoc checks passed.

========================================================================
Running R style checks
========================================================================

Attaching package: ‘SparkR’

The following objects are masked from ‘package:stats’:

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from ‘package:base’:

    as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
    rank, rbind, sample, startsWith, subset, summary, transform, union


Attaching package: ‘testthat’

The following objects are masked from ‘package:SparkR’:

    describe, not

lintr checks passed.

========================================================================
Running build tests
========================================================================
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Performing Maven install for hadoop-2.6
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Performing Maven validate for hadoop-2.6
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Generating dependency manifest for hadoop-2.6
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Performing Maven install for hadoop-2.7
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Performing Maven validate for hadoop-2.7
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Generating dependency manifest for hadoop-2.7
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Performing Maven install for hadoop-3.1
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Performing Maven validate for hadoop-3.1
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Generating dependency manifest for hadoop-3.1
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn
Using `mvn` from path: /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/mvn

========================================================================
Building Spark
========================================================================
[info] Building Spark (w/Hive 1.2.1) using SBT with these arguments: -Phadoop-2.6 -Pkubernetes -Phive-thriftserver -Pflume -Pkinesis-asl -Pyarn -Pkafka-0-8 -Pspark-ganglia-lgpl -Phive -Pmesos test:package streaming-kafka-0-8-assembly/assembly streaming-flume-assembly/assembly streaming-kinesis-asl-assembly/assembly
Using /usr/lib/jvm/java-8-openjdk-amd64/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
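Note: the sbt invocation above can be reproduced outside Jenkins with Spark's bundled launcher. A minimal sketch, assuming a branch-2.4 checkout in $SPARK_HOME; the profiles and targets are copied verbatim from the log line above:

    cd "$SPARK_HOME"
    # Same profiles and build targets as the Jenkins run
    ./build/sbt -Phadoop-2.6 -Pkubernetes -Phive-thriftserver -Pflume \
      -Pkinesis-asl -Pyarn -Pkafka-0-8 -Pspark-ganglia-lgpl -Phive -Pmesos \
      test:package \
      streaming-kafka-0-8-assembly/assembly \
      streaming-flume-assembly/assembly \
      streaming-kinesis-asl-assembly/assembly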
[info] Loading project definition from /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/project
[info] Set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/)
[info] Avro compiler using stringType=CharSequence
[info] Compiling Avro IDL /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/src/main/avro/sparkflume.avdl
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}tags...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}tools...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}spark...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/launcher/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/spark-ganglia-lgpl/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/unsafe/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/kvstore/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/launcher/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-yarn/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-shuffle/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/tags/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/tools/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-yarn/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl-assembly/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/kvstore/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-shuffle/target
[info] Done updating.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/target/scala-2.11/spark-parent_2.11-2.4.8-SNAPSHOT.jar ...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/tags/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/tools/target
[info] Done packaging.
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/spark-ganglia-lgpl/target
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/target/scala-2.11/spark-parent_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/unsafe/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/target
[info] Done updating.
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/target
[info] Compiling 2 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/tags/target/scala-2.11/classes...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/target
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}mllib-local...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-flume-sink...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}unsafe...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}sketch...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}network-common...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}kvstore...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}launcher...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/target
[info] Compiling 1 Scala source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/tools/target/scala-2.11/classes...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib-local/target
[info] Done updating.
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/target
[info] Done updating.
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/kubernetes/core/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/kubernetes/core/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/graphx/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/yarn/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/yarn/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib-local/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/graphx/target
[info] Done updating.
[info] Done updating.
[info] Done updating.
[info] Compiling 78 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/target/scala-2.11/classes...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}network-shuffle...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive/target
[info] Done updating.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/tags/target/scala-2.11/spark-tags_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 6 Scala sources and 3 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/target/scala-2.11/classes...
[info] Compiling 4 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/tags/target/scala-2.11/test-classes...
[info] Compiling 16 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/unsafe/target/scala-2.11/classes...
[info] Compiling 5 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib-local/target/scala-2.11/classes...
[info] Done updating.
[info] Compiling 12 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/kvstore/target/scala-2.11/classes...
[info] Compiling 20 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/launcher/target/scala-2.11/classes...
[info] Compiling 9 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/target/scala-2.11/classes...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}core...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}network-yarn...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/tags/target/scala-2.11/spark-tags_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:22: Unsafe is internal proprietary API and may be removed in a future release
[warn] import sun.misc.Unsafe;
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:28: Unsafe is internal proprietary API and may be removed in a future release
[warn] private static final Unsafe _UNSAFE;
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:150: Unsafe is internal proprietary API and may be removed in a future release
[warn] sun.misc.Unsafe unsafe;
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:152: Unsafe is internal proprietary API and may be removed in a future release
[warn] Field unsafeField = Unsafe.class.getDeclaredField("theUnsafe");
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:154: Unsafe is internal proprietary API and may be removed in a future release
[warn] unsafe = (sun.misc.Unsafe) unsafeField.get(null);
[warn] ^
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/streaming/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/streaming/target
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/target/scala-2.11/spark-sketch_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 3 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/target/scala-2.11/test-classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/kvstore/target/scala-2.11/spark-kvstore_2.11-2.4.8-SNAPSHOT.jar ...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/unsafe/target/scala-2.11/spark-unsafe_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Done packaging.
[info] Compiling 1 Scala source and 5 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/unsafe/target/scala-2.11/test-classes...
[info] Compiling 10 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/kvstore/target/scala-2.11/test-classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/launcher/target/scala-2.11/spark-launcher_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 7 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/launcher/target/scala-2.11/test-classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/tools/target/scala-2.11/spark-tools_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/tools/target/scala-2.11/spark-tools_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/target/scala-2.11/spark-network-common_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 24 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-shuffle/target/scala-2.11/classes...
[info] Compiling 21 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/target/scala-2.11/test-classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/launcher/target/scala-2.11/spark-launcher_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/kvstore/target/scala-2.11/spark-kvstore_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-shuffle/target/scala-2.11/spark-network-shuffle_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:61: [unchecked] unchecked generic array creation for varargs parameter of type Class[]
[warn] Mockito.when(buffers.next()).thenThrow(RuntimeException.class);
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:68: [unchecked] unchecked generic array creation for varargs parameter of type Class[]
[warn] Mockito.when(buffers2.next()).thenReturn(mockManagedBuffer).thenThrow(RuntimeException.class);
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/target/scala-2.11/src_managed/main/compiled_avro/org/apache/spark/streaming/flume/sink/EventBatch.java:243: [unchecked] unchecked cast
[warn] record.events = fieldSetFlags()[2] ? this.events : (java.util.List) defaultValue(fields()[2]);
[warn] ^
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/target/scala-2.11/spark-streaming-flume-sink_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 1 Scala source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/target/scala-2.11/test-classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-common/target/scala-2.11/spark-network-common_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Compiling 13 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-shuffle/target/scala-2.11/test-classes...
[info] Done packaging.
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive/target
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/sketch/target/scala-2.11/spark-sketch_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}catalyst...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}mesos...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}ganglia-lgpl...
[info] Compiling 495 Scala sources and 81 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-sink/target/scala-2.11/spark-streaming-flume-sink_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-shuffle/target/scala-2.11/spark-network-shuffle_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Done updating.
[info] Compiling 2 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-yarn/target/scala-2.11/classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/unsafe/target/scala-2.11/spark-unsafe_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-yarn/target/scala-2.11/spark-network-yarn_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/common/network-yarn/target/scala-2.11/spark-network-yarn_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib-local/target/scala-2.11/spark-mllib-local_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 10 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib-local/target/scala-2.11/test-classes...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/catalyst/target
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/catalyst/target
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib-local/target/scala-2.11/spark-mllib-local_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}kubernetes...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}graphx...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-kafka-0-8...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-kinesis-asl...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-kafka-0-10...
[success] created output: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/target
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}sql...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false
[warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information.
[warn] val sslContextFactory = new SslContextFactory()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information.
[warn] if (bootstrap != null && bootstrap.childGroup() != null) {
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
[warn] def attemptNumber(): Int = attemptId
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn] ^
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-flume...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9
[warn]     +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2)
[warn]     +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2)
[warn]     +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9)
[warn]     +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9)
[warn]     +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9)
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9)
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-kinesis-asl-assembly...
[info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-kafka-0-8-assembly...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.5.12.Final, 3.6.2.Final, 3.7.0.Final}
[warn]     +- org.apache.flume:flume-ng-core:1.6.0 (depends on 3.5.12.Final)
[warn]     +- org.apache.flume:flume-ng-sdk:1.6.0 (depends on 3.5.12.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.5.12.Final)
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.5.12.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.5.12.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * io.netty:netty:3.9.9.Final is selected over {3.7.0.Final, 3.6.2.Final}
[warn]     +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final)
[warn]     +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn]     +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final)
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Done updating.
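Note: the recurring netty eviction warnings can be inspected in detail with the 'evicted' task the log itself points to. A sketch; the 'core' project name is an assumption about Spark's sbt build:

    cd "$SPARK_HOME"
    # Detailed eviction report for one module (project name assumed)
    ./build/sbt -Phadoop-2.6 "core/evicted"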
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}yarn... [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}hive... [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}sql-kafka-0-10... [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}avro... [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-kafka-0-10-assembly... [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}streaming-flume-assembly... [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}mllib... [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [warn] 5 warnings found [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.5.12.Final, 3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.flume:flume-ng-core:1.6.0 (depends on 3.5.12.Final) [warn] +- org.apache.flume:flume-ng-sdk:1.6.0 (depends on 3.5.12.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information.
[warn] val sslContextFactory = new SslContextFactory() [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2 [warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false [warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally() [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead [warn] def attemptNumber(): Int = attemptId [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information. [warn] if (bootstrap != null && bootstrap.childGroup() != null) { [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 37 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/kubernetes/core/target/scala-2.11/classes... [info] Compiling 1 Scala source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/spark-ganglia-lgpl/target/scala-2.11/classes... [info] Compiling 38 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/graphx/target/scala-2.11/classes... [info] Compiling 20 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/target/scala-2.11/classes... [info] Compiling 26 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/yarn/target/scala-2.11/classes... [info] Compiling 103 Scala sources and 6 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/streaming/target/scala-2.11/classes... [info] Compiling 240 Scala sources and 31 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/catalyst/target/scala-2.11/classes... [info] Compiling 240 Scala sources and 26 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/test-classes... [info] Done updating.
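The AccumulableParam deprecation replayed above points at Spark's AccumulatorV2 API. A minimal sketch of the replacement for a set-valued accumulator (the class name and the usage line are illustrative; the overridden methods are the actual AccumulatorV2 contract):

    import scala.collection.mutable
    import org.apache.spark.util.AccumulatorV2

    // A set-valued accumulator in the AccumulatorV2 style, replacing an
    // AccumulableParam[mutable.Set[A], A]
    class SetAccumulator[A] extends AccumulatorV2[A, mutable.Set[A]] {
      private val set = mutable.Set.empty[A]
      override def isZero: Boolean = set.isEmpty
      override def copy(): SetAccumulator[A] = { val c = new SetAccumulator[A]; c.set ++= set; c }
      override def reset(): Unit = set.clear()
      override def add(v: A): Unit = set += v
      override def merge(other: AccumulatorV2[A, mutable.Set[A]]): Unit = set ++= other.value
      override def value: mutable.Set[A] = set
    }
    // usage sketch: val acc = new SetAccumulator[String]; sc.register(acc, "names")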
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/spark-ganglia-lgpl/target/scala-2.11/spark-ganglia-lgpl_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/spark-ganglia-lgpl/target/scala-2.11/spark-ganglia-lgpl_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:87: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] fwInfoBuilder.setRole(role) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:224: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:260: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] Option(r.getRole), reservation) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:263: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] Option(r.getRole), reservation) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:521: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:537: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] (RoleResourceInfo(resource.getRole, reservation), [warn] ^ [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/kubernetes/core/target/scala-2.11/spark-kubernetes_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging. [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/yarn/target/scala-2.11/spark-yarn_2.11-2.4.8-SNAPSHOT.jar ... [warn] 6 warnings found [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:87: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] fwInfoBuilder.setRole(role) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:224: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:260: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] Option(r.getRole), reservation) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:263: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] Option(r.getRole), reservation) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:521: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:537: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] (RoleResourceInfo(resource.getRole, reservation), [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/target/scala-2.11/spark-mesos_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Done updating. 
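The recurring "Multiple main classes detected" warning appears whenever a module packages more than one object with a main method; it is informational, and a build can silence it by naming an explicit entry point. A sketch in the sbt 0.13 syntax this build uses (the class name is only an example):

    // build.sbt sketch: choose one entry point so packaging stops warning
    mainClass in Compile := Some("org.apache.spark.examples.SparkPi")
    // 'show discoveredMainClasses' lists the candidates, as the warning suggests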
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/graphx/target/scala-2.11/spark-graphx_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-hive_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/streaming/target/scala-2.11/spark-streaming_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 11 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/target/scala-2.11/classes... [info] Compiling 10 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/target/scala-2.11/classes... [info] Compiling 10 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/target/scala-2.11/classes... [info] Compiling 11 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/target/scala-2.11/classes...
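For the ServerBootstrap.childGroup deprecation flagged in RBackend.scala earlier in the log: Netty 4.1 moved that accessor onto the bootstrap's config object. A hedged sketch of the newer call shape (the helper object and null handling are illustrative):

    import io.netty.bootstrap.ServerBootstrap
    import io.netty.channel.EventLoopGroup

    object NettyShutdownHelper {
      // config().childGroup() replaces the deprecated bootstrap.childGroup()
      def childGroupOf(bootstrap: ServerBootstrap): Option[EventLoopGroup] =
        Option(bootstrap).flatMap(b => Option(b.config().childGroup()))
    }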
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumeEventCount.scala:59: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val stream = FlumeUtils.createStream(ssc, host, port, StorageLevel.MEMORY_ONLY_SER_2) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val stream = FlumeUtils.createPollingStream(ssc, host, port) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:259: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val dstream = FlumeUtils.createStream(jssc, hostname, port, storageLevel, enableDecompression) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:275: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val dstream = FlumeUtils.createPollingStream( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val kinesisClient = new AmazonKinesisClient(credentials) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpointUrl) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpoint) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:140: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] private val client = new AmazonKinesisClient(credentials) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:150: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. 
[warn] client.setEndpoint(endpointUrl) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:219: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. [warn] getRecordsRequest.setRequestCredentials(credentials) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:240: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. [warn] getShardIteratorRequest.setRequestCredentials(credentials) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:187: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information. [warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] client.setEndpoint(endpointUrl) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information. [warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. 
[warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:606: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc.ssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:613: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:616: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:76: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .withLongLivedCredentialsProvider(longLivedCreds.provider) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:100: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:153: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/DirectKafkaInputDStream.scala:172: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] val msgs = c.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/KafkaDataConsumer.scala:200: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. 
[warn] val p = consumer.poll(timeout) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:89: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] protected val kc = new KafkaCluster(kafkaParams) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:172: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:217: object KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val leaders = KafkaCluster.checkErrors(kc.findLeaders(topics)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:222: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] context.sparkContext, kafkaParams, b.map(OffsetRange(_)), leaders, messageHandler) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:58: trait HasOffsetRanges in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ) extends RDD[R](sc, Nil) with Logging with HasOffsetRanges { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val offsetRanges: Array[OffsetRange], [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:152: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:268: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:633: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:647: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:648: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[(Array[Byte], Array[Byte])] = { [warn] ^ [warn] 
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:657: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:658: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[Array[Byte]] = { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:670: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:671: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker], [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:673: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createRDD[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:676: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges.toArray(new Array[OffsetRange](offsetRanges.size())), [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:720: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(Map(kafkaParams.asScala.toSeq: _*)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:721: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.getFromOffsets( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:725: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset) [warn] ^ [warn] 
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: object Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:740: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def offsetRangesOfKafkaRDD(rdd: RDD[_]): JList[OffsetRange] = { [warn] ^ [warn] four warnings found [warn] 17 warnings found [warn] four warnings found [warn] 25 warnings found [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:259: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val dstream = FlumeUtils.createStream(jssc, hostname, port, storageLevel, enableDecompression) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:275: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val dstream = FlumeUtils.createPollingStream( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val stream = FlumeUtils.createPollingStream(ssc, host, port) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumeEventCount.scala:59: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val stream = FlumeUtils.createStream(ssc, host, port, StorageLevel.MEMORY_ONLY_SER_2) [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/target/scala-2.11/spark-streaming-flume_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-assembly/target/scala-2.11/spark-streaming-flume-assembly_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-assembly/target/scala-2.11/spark-streaming-flume-assembly_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Done updating.
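Every kafka-0-8 deprecation above carries the same advice, "Update to Kafka 0.10 integration". A minimal sketch of the 0.10-style direct stream those messages point to, against the spark-streaming-kafka-0-10 module being built here (the broker list, group id, and topic name are placeholders):

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.streaming.StreamingContext
    import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

    object Kafka010Example {
      // 0.10-style direct stream, replacing the deprecated kafka-0-8 createDirectStream
      def directStream(ssc: StreamingContext) = {
        val kafkaParams = Map[String, Object](
          "bootstrap.servers" -> "localhost:9092",      // placeholder broker list
          "key.deserializer" -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id" -> "example-group")                // placeholder group id
        KafkaUtils.createDirectStream[String, String](
          ssc,
          LocationStrategies.PreferConsistent,
          ConsumerStrategies.Subscribe[String, String](Seq("events"), kafkaParams))
      }
    }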
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}hive-thriftserver... [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}examples... [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}repl... [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/DirectKafkaInputDStream.scala:172: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] val msgs = c.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:100: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:153: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/KafkaDataConsumer.scala:200: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] val p = consumer.poll(timeout) [warn] [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/target/scala-2.11/spark-streaming-kafka-0-10_2.11-2.4.8-SNAPSHOT.jar ... [info] Note: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java uses or overrides a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-assembly/target/scala-2.11/spark-streaming-kafka-0-10-assembly_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-assembly/target/scala-2.11/spark-streaming-kafka-0-10-assembly_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging.
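The poll(0) deprecations replayed above were resolved upstream by KIP-266, which added a java.time.Duration overload to the Kafka consumer. A sketch of the newer call (the helper object and type parameters are illustrative):

    import java.time.Duration
    import org.apache.kafka.clients.consumer.{ConsumerRecords, KafkaConsumer}

    object PollExample {
      // Duration overload replacing the deprecated consumer.poll(0)
      def pollOnce(consumer: KafkaConsumer[String, String]): ConsumerRecords[String, String] =
        consumer.poll(Duration.ZERO)
    }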
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:140: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] private val client = new AmazonKinesisClient(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:150: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] client.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:219: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. [warn] getRecordsRequest.setRequestCredentials(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:240: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. [warn] getShardIteratorRequest.setRequestCredentials(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:606: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc.ssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:613: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:616: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val kinesisClient = new AmazonKinesisClient(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. 
[warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpoint) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:76: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .withLongLivedCredentialsProvider(longLivedCreds.provider) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] client.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information. [warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName)) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:187: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information. [warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration) [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/target/scala-2.11/spark-streaming-kinesis-asl_2.11-2.4.8-SNAPSHOT.jar ... 
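The AmazonKinesisClient constructor and setEndpoint/setRegion deprecations above correspond to the AWS SDK's move to client builders. A hedged sketch of the builder equivalents (endpoint and region strings are placeholders):

    import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder
    import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder

    object AwsClientsSketch {
      // replaces: new AmazonKinesisClient(credentials) followed by setEndpoint(endpointUrl)
      val kinesis = AmazonKinesisClientBuilder.standard()
        .withEndpointConfiguration(
          new EndpointConfiguration("https://kinesis.us-west-2.amazonaws.com", "us-west-2"))
        .build()
      // replaces: new AmazonDynamoDBClient(...) followed by setRegion(...)
      val dynamoDB = AmazonDynamoDBClientBuilder.standard().withRegion("us-west-2").build()
    }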
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:633: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder]( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:647: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:648: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[(Array[Byte], Array[Byte])] = { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:657: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:658: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[Array[Byte]] = { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:670: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:671: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:673: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createRDD[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V]( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:676: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges.toArray(new Array[OffsetRange](offsetRanges.size())), [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:720: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(Map(kafkaParams.asScala.toSeq: _*)) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:721: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.getFromOffsets( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:725: 
object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V]( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: object Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:740: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def offsetRangesOfKafkaRDD(rdd: RDD[_]): JList[OffsetRange] = { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:89: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] protected val kc = new KafkaCluster(kafkaParams) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:172: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:217: object KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val leaders = KafkaCluster.checkErrors(kc.findLeaders(topics)) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:222: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] context.sparkContext, kafkaParams, b.map(OffsetRange(_)), leaders, messageHandler) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:58: trait HasOffsetRanges in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ) extends RDD[R](sc, Nil) with Logging with HasOffsetRanges { [warn] [warn] 
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val offsetRanges: Array[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:152: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:268: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset) [warn] [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/target/scala-2.11/spark-streaming-kafka-0-8_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl-assembly/target/scala-2.11/spark-streaming-kinesis-asl-assembly_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl-assembly/target/scala-2.11/spark-streaming-kinesis-asl-assembly_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8-assembly/target/scala-2.11/spark-streaming-kafka-0-8-assembly_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8-assembly/target/scala-2.11/spark-streaming-kafka-0-8-assembly_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-hive_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Done updating.
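The KinesisUtils.createStream deprecations earlier in the log name their replacement directly: KinesisInputDStream.builder. A minimal sketch against the Spark 2.4 kinesis module being packaged here (stream, app, region, and interval values are placeholders):

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kinesis.{KinesisInitialPositions, KinesisInputDStream}

    object KinesisBuilderExample {
      // builder-style replacement for the deprecated KinesisUtils.createStream
      def kinesisStream(ssc: StreamingContext) =
        KinesisInputDStream.builder
          .streamingContext(ssc)
          .streamName("myStream")                                  // placeholder
          .checkpointAppName("myApp")                              // placeholder
          .regionName("us-west-2")
          .endpointUrl("https://kinesis.us-west-2.amazonaws.com")
          .initialPosition(new KinesisInitialPositions.Latest)
          .checkpointInterval(Seconds(10))
          .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
          .build()
    }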
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-hive_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala:461: postfix operator seconds should be enabled [warn] by making the implicit value scala.language.postfixOps visible. [warn] This can be achieved by adding the import clause 'import scala.language.postfixOps' [warn] or by setting the compiler option -language:postfixOps. [warn] See the Scaladoc for value scala.language.postfixOps for a discussion [warn] why the feature should be explicitly enabled. [warn] eventually(timeout(5 seconds), interval(200 milliseconds)) { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala:461: postfix operator milliseconds should be enabled [warn] by making the implicit value scala.language.postfixOps visible. [warn] This can be achieved by adding the import clause 'import scala.language.postfixOps' [warn] or by setting the compiler option -language:postfixOps. [warn] See the Scaladoc for value scala.language.postfixOps for a discussion [warn] why the feature should be explicitly enabled. [warn] eventually(timeout(5 seconds), interval(200 milliseconds)) { [warn] ^ [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * org.scala-lang.modules:scala-parser-combinators_2.11:1.1.0 is selected over 1.0.4 [warn] +- org.apache.spark:spark-mllib_2.11:2.4.8-SNAPSHOT (depends on 1.1.0) [warn] +- org.apache.spark:spark-catalyst_2.11:2.4.8-SNAPSHOT (depends on 1.1.0) [warn] +- org.scala-lang:scala-compiler:2.11.12 (depends on 1.0.4) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.7.0.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}assembly... [info] Done updating.
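The postfixOps warnings above suggest the language import; dotted call syntax avoids the feature flag entirely. A sketch of both options for the eventually call in ExternalAppendOnlyMapSuite (the wrapper method and assertion body are illustrative):

    import org.scalatest.concurrent.Eventually._
    import org.scalatest.time.SpanSugar._

    // option 1: keep `5 seconds` and add: import scala.language.postfixOps
    // option 2 (below): dotted syntax compiles without any language import
    def waitForCondition(check: => Unit): Unit =
      eventually(timeout(5.seconds), interval(200.milliseconds)) { check }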
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.google.code.findbugs:jsr305:3.0.2 is selected over 1.3.9 [warn] +- org.apache.arrow:arrow-vector:0.10.0 (depends on 3.0.2) [warn] +- org.apache.arrow:arrow-memory:0.10.0 (depends on 3.0.2) [warn] +- org.apache.spark:spark-unsafe_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-network-common_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.spark:spark-hive_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 1.3.9) [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 1.3.9) [warn] [warn] * org.scala-lang.modules:scala-parser-combinators_2.11:1.1.0 is selected over 1.0.4 [warn] +- org.scala-lang:scala-compiler:2.11.12 (depends on 1.0.4) [warn] +- org.apache.spark:spark-mllib_2.11:2.4.8-SNAPSHOT (depends on 1.0.4) [warn] +- org.apache.spark:spark-catalyst_2.11:2.4.8-SNAPSHOT (depends on 1.0.4) [warn] [warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final} [warn] +- org.apache.spark:spark-core_2.11:2.4.8-SNAPSHOT (depends on 3.9.9.Final) [warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final) [warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final) [warn] [warn] Run 'evicted' to see detailed eviction warnings [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:48: trait AccumulableParam in package spark is deprecated: use AccumulatorV2 [warn] implicit def setAccum[A]: AccumulableParam[mutable.Set[A], A] = [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:49: trait AccumulableParam in package spark is deprecated: use AccumulatorV2 [warn] new AccumulableParam[mutable.Set[A], A] { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:86: class Accumulator in package spark is deprecated: use AccumulatorV2 [warn] val acc: Accumulator[Int] = sc.accumulator(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:86: method accumulator in class SparkContext is deprecated: use AccumulatorV2 [warn] val acc: Accumulator[Int] = sc.accumulator(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:86: object IntAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] val acc: Accumulator[Int] = sc.accumulator(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:92: method accumulator in class SparkContext is deprecated: use AccumulatorV2 [warn] val longAcc = sc.accumulator(0L) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:92: object LongAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] val longAcc = sc.accumulator(0L) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:100: class Accumulator in package spark is deprecated: use AccumulatorV2 [warn] val acc: Accumulator[Int] = sc.accumulator(0) [warn] ^ [warn] 
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:100: method accumulator in class SparkContext is deprecated: use AccumulatorV2 [warn] val acc: Accumulator[Int] = sc.accumulator(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:100: object IntAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] val acc: Accumulator[Int] = sc.accumulator(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:112: class Accumulable in package spark is deprecated: use AccumulatorV2 [warn] val acc: Accumulable[mutable.Set[Any], Any] = sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:112: method accumulable in class SparkContext is deprecated: use AccumulatorV2 [warn] val acc: Accumulable[mutable.Set[Any], Any] = sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:129: class Accumulable in package spark is deprecated: use AccumulatorV2 [warn] val acc: Accumulable[mutable.Set[Any], Any] = sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:129: method accumulable in class SparkContext is deprecated: use AccumulatorV2 [warn] val acc: Accumulable[mutable.Set[Any], Any] = sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:145: method accumulableCollection in class SparkContext is deprecated: use AccumulatorV2 [warn] val setAcc = sc.accumulableCollection(mutable.HashSet[Int]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:146: method accumulableCollection in class SparkContext is deprecated: use AccumulatorV2 [warn] val bufferAcc = sc.accumulableCollection(mutable.ArrayBuffer[Int]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:147: method accumulableCollection in class SparkContext is deprecated: use AccumulatorV2 [warn] val mapAcc = sc.accumulableCollection(mutable.HashMap[Int, String]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:170: class Accumulable in package spark is deprecated: use AccumulatorV2 [warn] val acc: Accumulable[mutable.Set[Any], Any] = sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:170: method accumulable in class SparkContext is deprecated: use AccumulatorV2 [warn] val acc: Accumulable[mutable.Set[Any], Any] = sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:184: class Accumulable in package spark is deprecated: use AccumulatorV2 [warn] var acc: Accumulable[mutable.Set[Any], Any] = 
sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:184: method accumulable in class SparkContext is deprecated: use AccumulatorV2 [warn] var acc: Accumulable[mutable.Set[Any], Any] = sc.accumulable(new mutable.HashSet[Any]()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:225: class Accumulator in package spark is deprecated: use AccumulatorV2 [warn] val acc = new Accumulator("", StringAccumulatorParam, Some("darkness")) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala:225: object StringAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] val acc = new Accumulator("", StringAccumulatorParam, Some("darkness")) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:1194: value attemptId in class StageInfo is deprecated: Use attemptNumber instead [warn] SparkListenerTaskEnd(stage1.stageId, stage1.attemptId, "taskType", Success, tasks(1), null)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:1198: value attemptId in class StageInfo is deprecated: Use attemptNumber instead [warn] SparkListenerTaskEnd(stage1.stageId, stage1.attemptId, "taskType", Success, tasks(0), null)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:1264: value attemptId in class StageInfo is deprecated: Use attemptNumber instead [warn] SparkListenerTaskEnd(stage1.stageId, stage1.attemptId, "taskType", Success, tasks(0), null)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:1278: value attemptId in class StageInfo is deprecated: Use attemptNumber instead [warn] SparkListenerTaskEnd(stage1.stageId, stage1.attemptId, "taskType", [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/util/AccumulatorV2Suite.scala:132: object StringAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] val acc = new LegacyAccumulatorWrapper("default", AccumulatorParam.StringAccumulatorParam) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/util/AccumulatorV2Suite.scala:132: object AccumulatorParam in package spark is deprecated: use AccumulatorV2 [warn] val acc = new LegacyAccumulatorWrapper("default", AccumulatorParam.StringAccumulatorParam) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/spark/util/AccumulatorV2Suite.scala:168: trait AccumulatorParam in package spark is deprecated: use AccumulatorV2 [warn] val param = new AccumulatorParam[MyData] { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:79: method accumulator in class SparkContext is deprecated: use AccumulatorV2 [warn] sc.accumulator(123.4) [warn] ^ [warn] 
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:79: object DoubleAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] sc.accumulator(123.4) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:84: method accumulator in class SparkContext is deprecated: use AccumulatorV2 [warn] sc.accumulator(123) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:84: object IntAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] sc.accumulator(123) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:89: method accumulator in class SparkContext is deprecated: use AccumulatorV2 [warn] sc.accumulator(123L) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:89: object LongAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] sc.accumulator(123L) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:94: method accumulator in class SparkContext is deprecated: use AccumulatorV2 [warn] sc.accumulator(123F) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/scala/org/apache/sparktest/ImplicitSuite.scala:94: object FloatAccumulatorParam in object AccumulatorParam is deprecated: use AccumulatorV2 [warn] sc.accumulator(123F) [warn] ^ [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/catalyst/target/scala-2.11/spark-catalyst_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 347 Scala sources and 93 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/target/scala-2.11/classes... [warn] 40 warnings found [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:128: value ENABLE_JOB_SUMMARY in object ParquetOutputFormat is deprecated: see corresponding Javadoc for more information. [warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) { [warn] ^ [info] Note: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/test/java/test/org/apache/spark/JavaAPISuite.java uses or overrides a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Compiling 10 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/target/scala-2.11/test-classes... [info] Compiling 5 Scala sources and 3 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/target/scala-2.11/test-classes... [info] Compiling 23 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/yarn/target/scala-2.11/test-classes... [info] Compiling 28 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/kubernetes/core/target/scala-2.11/test-classes... 
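Note on the AccumulatorSuite and ImplicitSuite deprecations above: they all point at one migration. `sc.accumulator(...)`, `Accumulable`, and the `AccumulatorParam` hierarchy are superseded by AccumulatorV2 and the built-in helpers on SparkContext. A minimal sketch of the replacement calls (hypothetical standalone job; names invented):

    import org.apache.spark.sql.SparkSession

    object AccumulatorV2Demo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[*]").appName("acc-demo").getOrCreate()
        val sc = spark.sparkContext

        val count = sc.longAccumulator("count")           // was: sc.accumulator(0L)
        val sum   = sc.doubleAccumulator("sum")           // was: sc.accumulator(123.4)
        val seen  = sc.collectionAccumulator[Int]("seen") // was: sc.accumulableCollection(...)

        sc.parallelize(1 to 10).foreach { x =>
          count.add(1); sum.add(x.toDouble); seen.add(x)
        }
        println(s"count=${count.value} sum=${sum.value} collected=${seen.value.size}")
        spark.stop()
      }
    }

(The `stage1.attemptId` warnings in AppStatusListenerSuite are a one-word rename: read `attemptNumber` instead.)
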
[info] Compiling 40 Scala sources and 9 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/streaming/target/scala-2.11/test-classes... [info] Compiling 3 Scala sources and 3 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/target/scala-2.11/test-classes... [info] Compiling 6 Scala sources and 4 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/target/scala-2.11/test-classes... [info] Compiling 201 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/catalyst/target/scala-2.11/test-classes... [info] Compiling 20 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/graphx/target/scala-2.11/test-classes... [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT-tests.jar ... [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/test/scala/org/apache/spark/streaming/flume/FlumePollingStreamSuite.scala:117: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] FlumeUtils.createPollingStream(ssc, addresses, StorageLevel.MEMORY_AND_DISK, [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/test/scala/org/apache/spark/streaming/flume/FlumeStreamSuite.scala:83: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val flumeStream = FlumeUtils.createStream( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:96: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:103: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] var offsetRanges = Array[OffsetRange]() [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:107: trait HasOffsetRanges in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:148: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:163: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:194: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] ^ [warn] 
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:209: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder, String]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:251: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder]( [warn] ^ [info] Done packaging. [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:340: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:414: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:494: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/DirectKafkaStreamSuite.scala:565: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] kafkaStream: DStream[(K, V)]): Seq[(Time, Array[OffsetRange])] = { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaClusterSuite.scala:30: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] private var kc: KafkaCluster = null [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaClusterSuite.scala:40: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] kc = new KafkaCluster(Map("metadata.broker.list" -> kafkaTestUtils.brokerAddress)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:64: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val offsetRanges = Array(OffsetRange(topic, 0, 0, messages.size)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:66: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val rdd = KafkaUtils.createRDD[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:80: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 
integration [warn] val emptyRdd = KafkaUtils.createRDD[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:81: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] sc, kafkaParams, Array(OffsetRange(topic, 0, 0, 0))) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:86: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val badRanges = Array(OffsetRange(topic, 0, 0, messages.size + 1)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:88: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createRDD[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:102: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:113: trait HasOffsetRanges in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val ranges = rdd.get.asInstanceOf[HasOffsetRanges].offsetRanges [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:148: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] private def getRdd(kc: KafkaCluster, topics: Set[String]) = { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:161: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] OffsetRange(tp.topic, tp.partition, fromOffset, until(tp).offset) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:165: object Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] tp -> Broker(lo.host, lo.port) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaRDDSuite.scala:168: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createRDD[String, String, StringDecoder, StringDecoder, String]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/KafkaStreamSuite.scala:66: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val stream = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/ReliableKafkaStreamSuite.scala:96: object KafkaUtils in package kafka is deprecated: Update to Kafka 
0.10 integration [warn] val stream = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/test/scala/org/apache/spark/streaming/kafka/ReliableKafkaStreamSuite.scala:130: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val stream = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder]( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/kubernetes/core/src/test/scala/org/apache/spark/scheduler/cluster/k8s/ExecutorPodsAllocatorSuite.scala:168: non-variable type argument org.apache.spark.deploy.k8s.KubernetesExecutorSpecificConf in type org.apache.spark.deploy.k8s.KubernetesConf[org.apache.spark.deploy.k8s.KubernetesExecutorSpecificConf] is unchecked since it is eliminated by erasure [warn] if (!argument.isInstanceOf[KubernetesConf[KubernetesExecutorSpecificConf]]) { [warn] ^ [warn] two warnings found [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/DirectKafkaStreamSuite.scala:253: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] s.consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/DirectKafkaStreamSuite.scala:309: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] s.consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/DirectKafkaStreamSuite.scala:473: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaTestUtils.scala:60: class ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead. [warn] private var zkUtils: ZkUtils = _ [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaTestUtils.scala:88: class ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead. [warn] def zookeeperClient: ZkUtils = { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaTestUtils.scala:100: object ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead. [warn] zkUtils = ZkUtils(s"$zkHost:$zkPort", zkSessionTimeout, zkConnectionTimeout, false) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaTestUtils.scala:178: method createTopic in object AdminUtils is deprecated: This method is deprecated and will be replaced by kafka.zk.AdminZkClient. 
[warn] AdminUtils.createTopic(zkUtils, topic, partitions, 1, config) [warn] ^ [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/target/scala-2.11/spark-streaming-flume_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:113: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("*") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:116: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("*") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:121: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("role2") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:124: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("role2") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:138: method valueOf in object Status is deprecated: see corresponding Javadoc for more information. [warn] ).thenReturn(Status.valueOf(1)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:151: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] assert(cpus.exists(_.getRole() == "role2")) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:152: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] assert(cpus.exists(_.getRole() == "*")) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:155: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] assert(mem.exists(_.getRole() == "role2")) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:156: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. 
[warn] assert(mem.exists(_.getRole() == "*")) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:417: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("*") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:420: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("*") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:271: method valueOf in object Status is deprecated: see corresponding Javadoc for more information. [warn] ).thenReturn(Status.valueOf(1)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:272: method valueOf in object Status is deprecated: see corresponding Javadoc for more information. [warn] when(driver.declineOffer(mesosOffers.get(1).getId)).thenReturn(Status.valueOf(1)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:273: method valueOf in object Status is deprecated: see corresponding Javadoc for more information. [warn] when(driver.declineOffer(mesosOffers.get(2).getId)).thenReturn(Status.valueOf(1)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:299: method valueOf in object Status is deprecated: see corresponding Javadoc for more information. [warn] when(driver.declineOffer(mesosOffers2.get(0).getId)).thenReturn(Status.valueOf(1)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:325: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .setRole("prod") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:329: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .setRole("prod") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:334: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .setRole("dev") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:339: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. 
[warn] .setRole("dev") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:380: method valueOf in object Status is deprecated: see corresponding Javadoc for more information. [warn] ).thenReturn(Status.valueOf(1)) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:397: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] assert(cpusDev.getRole.equals("dev")) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:400: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] r.getName.equals("mem") && r.getScalar.getValue.equals(484.0) && r.getRole.equals("prod") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:403: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] r.getName.equals("cpus") && r.getScalar.getValue.equals(1.0) && r.getRole.equals("prod") [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtilsSuite.scala:54: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] ^ [warn] 29 warnings found [info] Note: Some input files use or override a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [warn] 7 warnings found [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/target/scala-2.11/spark-streaming-kafka-0-8_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/target/scala-2.11/spark-streaming-kafka-0-10_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] 24 warnings found [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/target/scala-2.11/spark-mesos_2.11-2.4.8-SNAPSHOT-tests.jar ... [warn] one warning found [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/kubernetes/core/target/scala-2.11/spark-kubernetes_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/yarn/target/scala-2.11/spark-yarn_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/graphx/target/scala-2.11/spark-graphx_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. 
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:360: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information. [warn] new org.apache.parquet.hadoop.ParquetInputSplit( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:371: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. [warn] ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:544: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. [warn] ParquetFileReader.readFooter( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] ^ [info] Compiling 8 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/target/scala-2.11/test-classes... [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/streaming/target/scala-2.11/spark-streaming_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. 
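Note on the TriggerExecutor warnings above: they flag the old `ProcessingTime` class; on the user-facing side the replacement is `Trigger.ProcessingTime`. A minimal sketch (assumes an active SparkSession `spark`; the built-in "rate" source is used only for illustration):

    import org.apache.spark.sql.streaming.Trigger

    // was: .trigger(ProcessingTime("10 seconds"))
    val query = spark.readStream
      .format("rate")
      .load()
      .writeStream
      .format("console")
      .trigger(Trigger.ProcessingTime("10 seconds"))   // or Trigger.ProcessingTime(10000L)
      .start()
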
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/test/scala/org/apache/spark/streaming/kinesis/KinesisInputDStreamBuilderSuite.scala:163: method initialPositionInStream in class Builder is deprecated: use initialPosition(initialPosition: KinesisInitialPosition) [warn] .initialPositionInStream(InitialPositionInStream.AT_TIMESTAMP) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/test/scala/org/apache/spark/streaming/kinesis/KinesisStreamSuite.scala:103: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] val kinesisStream1 = KinesisUtils.createStream(ssc, "myAppName", "mySparkStream", [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/test/scala/org/apache/spark/streaming/kinesis/KinesisStreamSuite.scala:106: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] val kinesisStream2 = KinesisUtils.createStream(ssc, "myAppName", "mySparkStream", [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/test/scala/org/apache/spark/streaming/kinesis/KinesisStreamSuite.scala:113: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] val inputStream = KinesisUtils.createStream(ssc, appName, "dummyStream", [warn] ^ [warn] four warnings found [info] Note: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/test/java/org/apache/spark/streaming/kinesis/JavaKinesisInputDStreamBuilderSuite.java uses or overrides a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/target/scala-2.11/spark-streaming-kinesis-asl_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] 6 warnings found [info] Note: Some input files use or override a deprecated API. [info] Note: Recompile with -Xlint:deprecation for details. [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:128: value ENABLE_JOB_SUMMARY in object ParquetOutputFormat is deprecated: see corresponding Javadoc for more information. [warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:360: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information. [warn] new org.apache.parquet.hadoop.ParquetInputSplit( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:371: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. [warn] ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:544: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. 
[warn] ParquetFileReader.readFooter( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/target/scala-2.11/spark-sql_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 29 Scala sources and 2 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive/target/scala-2.11/classes... [info] Compiling 20 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/target/scala-2.11/classes... [info] Compiling 304 Scala sources and 5 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/target/scala-2.11/classes... [info] Compiling 10 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/target/scala-2.11/classes... [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaDataConsumer.scala:470: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] val p = consumer.poll(pollTimeoutMs) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:119: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. 
[warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:139: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:187: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:217: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:293: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:55: [unchecked] unchecked call to SparkAvroKeyRecordWriter(Schema,GenericData,CodecFactory,OutputStream,int,Map) as a member of the raw type SparkAvroKeyRecordWriter [warn] return new SparkAvroKeyRecordWriter( [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:55: [unchecked] unchecked conversion [warn] return new SparkAvroKeyRecordWriter( [warn] ^ [warn] 6 warnings found [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/target/scala-2.11/spark-avro_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:119: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:139: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:187: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:217: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:293: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. 
[warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaDataConsumer.scala:470: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] val p = consumer.poll(pollTimeoutMs) [warn] [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/target/scala-2.11/spark-sql-kafka-0-10_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [warn] there were 16 deprecation warnings; re-run with -deprecation for details [warn] one warning found [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive/target/scala-2.11/spark-hive_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 12 Scala sources and 171 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/target/scala-2.11/classes... [info] Compiling 292 Scala sources and 33 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/target/scala-2.11/test-classes... [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/catalyst/target/scala-2.11/spark-catalyst_2.11-2.4.8-SNAPSHOT-tests.jar ... [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/src/gen/java/org/apache/hive/service/cli/thrift/TArrayTypeEntry.java:266: [unchecked] unchecked call to read(TProtocol,T) as a member of the raw type IScheme [warn] schemes.get(iprot.getScheme()).getScheme().read(iprot, this); [warn] ^ [warn] where T is a type-variable: T extends TBase declared in interface IScheme [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/src/gen/java/org/apache/hive/service/cli/thrift/TArrayTypeEntry.java:270: [unchecked] unchecked call to write(TProtocol,T) as a member of the raw type IScheme [warn] schemes.get(oprot.getScheme()).getScheme().write(oprot, this); [warn] ^ [warn] where T is a type-variable: T extends TBase declared in interface IScheme [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/src/gen/java/org/apache/hive/service/cli/thrift/TArrayTypeEntry.java:313: [unchecked] getScheme() in TArrayTypeEntryStandardSchemeFactory implements getScheme() in SchemeFactory [warn] public TArrayTypeEntryStandardScheme getScheme() { [warn] ^ [warn] return type requires unchecked conversion from TArrayTypeEntryStandardScheme to S [warn] where S is a type-variable: S extends IScheme declared in method getScheme() [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/src/gen/java/org/apache/hive/service/cli/thrift/TArrayTypeEntry.java:361: [unchecked] getScheme() in TArrayTypeEntryTupleSchemeFactory implements getScheme() in SchemeFactory [warn] public TArrayTypeEntryTupleScheme getScheme() { [warn] ^ [warn] return type requires unchecked conversion from TArrayTypeEntryTupleScheme to S [warn] where S is a type-variable: S extends IScheme declared in method getScheme() [warn] 
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/src/gen/java/org/apache/hive/service/cli/thrift/TBinaryColumn.java:240: [unchecked] unchecked cast [warn] setValues((List)value); [warn] ^ [info] Done packaging. [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/target/scala-2.11/spark-hive-thriftserver_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:138: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0. [warn] object OneHotEncoder extends DefaultParamsReadable[OneHotEncoder] { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:141: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0. [warn] override def load(path: String): OneHotEncoder = super.load(path) [warn] ^ [warn] two warnings found [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:138: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0. [warn] object OneHotEncoder extends DefaultParamsReadable[OneHotEncoder] { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:141: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0. [warn] override def load(path: String): OneHotEncoder = super.load(path) [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/target/scala-2.11/spark-mllib_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 6 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/target/scala-2.11/classes... [info] Compiling 191 Scala sources and 128 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target/scala-2.11/classes... 
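Note on the OneHotEncoder deprecation above: 2.4-era user code moves to OneHotEncoderEstimator (which 3.0 renames back to OneHotEncoder). A minimal sketch, assuming an active SparkSession `spark` and invented data:

    import org.apache.spark.ml.feature.OneHotEncoderEstimator

    val df = spark.createDataFrame(Seq((0.0, 1.0), (1.0, 0.0), (2.0, 1.0))).toDF("cat1", "cat2")

    // was: new OneHotEncoder().setInputCol("cat1").setOutputCol("cat1Vec")
    val encoder = new OneHotEncoderEstimator()
      .setInputCols(Array("cat1", "cat2"))
      .setOutputCols(Array("cat1Vec", "cat2Vec"))

    // It is an Estimator now, so fit before transform.
    encoder.fit(df).transform(df).show()
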
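Note on the Kinesis deprecations further up (KinesisUtils.createStream, initialPositionInStream): both point at the builder API, whose `initialPosition(initialPosition: KinesisInitialPosition)` signature the warning itself names. A minimal sketch, assuming a StreamingContext `ssc` and hypothetical stream, region, and app names:

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.Seconds
    import org.apache.spark.streaming.kinesis.KinesisInputDStream
    import org.apache.spark.streaming.kinesis.KinesisInitialPositions.Latest

    val kinesisStream = KinesisInputDStream.builder
      .streamingContext(ssc)
      .streamName("myStream")                     // hypothetical
      .endpointUrl("https://kinesis.us-east-1.amazonaws.com")
      .regionName("us-east-1")
      .initialPosition(new Latest())              // replaces initialPositionInStream(...)
      .checkpointAppName("myApp")
      .checkpointInterval(Seconds(10))
      .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
      .build()
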
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:53: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path [warn] if (addedClasspath != "") { [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:54: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path [warn] settings.classpath append addedClasspath [warn] ^ [warn] two warnings found [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:53: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path [warn] if (addedClasspath != "") { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:54: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path [warn] settings.classpath append addedClasspath [warn] [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/target/scala-2.11/spark-repl_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 3 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/target/scala-2.11/test-classes... [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/assembly/target/scala-2.11/jars/spark-assembly_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/assembly/target/scala-2.11/jars/spark-assembly_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/target/scala-2.11/spark-repl_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target/scala-2.11/jars/spark-examples_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target/scala-2.11/jars/spark-examples_2.11-2.4.8-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/sources/TextSocketStreamSuite.scala:393: non-variable type argument String in type (String, java.sql.Timestamp) is unchecked since it is eliminated by erasure [warn] .isInstanceOf[(String, Timestamp)]) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/sources/TextSocketStreamSuite.scala:392: non-variable type argument String in type (String, java.sql.Timestamp) is unchecked since it is eliminated by erasure [warn] assert(r.get().get(0, TextSocketReader.SCHEMA_TIMESTAMP) [warn] ^ [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamingQueryStatusAndProgressSuite.scala:204: postfix operator minute should be enabled [warn] by making the implicit value scala.language.postfixOps visible. 
[warn] This can be achieved by adding the import clause 'import scala.language.postfixOps'
[warn] or by setting the compiler option -language:postfixOps.
[warn] See the Scaladoc for value scala.language.postfixOps for a discussion
[warn] why the feature should be explicitly enabled.
[warn] eventually(timeout(1 minute)) {
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamingQuerySuite.scala:693: a pure expression does nothing in statement position; you may be omitting necessary parentheses
[warn] q1
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:230: method explode in class Dataset is deprecated: use flatMap() or select() with functions.explode() instead
[warn] df.explode("words", "word") { word: String => word.split(" ").toSeq }.select('word),
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:238: method explode in class Dataset is deprecated: use flatMap() or select() with functions.explode() instead
[warn] df.explode('letters) {
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:288: method explode in class Dataset is deprecated: use flatMap() or select() with functions.explode() instead
[warn] df.explode($"*") { case Row(prefix: String, csv: String) =>
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:295: method explode in class Dataset is deprecated: use flatMap() or select() with functions.explode() instead
[warn] df.explode('prefix, 'csv) { case Row(prefix: String, csv: String) =>
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameWindowFramesSuite.scala:228: method rangeBetween in class WindowSpec is deprecated: Use the version with Long parameter types
[warn] val window = Window.partitionBy($"value").orderBy($"key").rangeBetween(lit(0), lit(2))
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameWindowFramesSuite.scala:242: method rangeBetween in class WindowSpec is deprecated: Use the version with Long parameter types
[warn] val window = Window.partitionBy($"value").orderBy($"key").rangeBetween(currentRow, lit(2.5D))
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameWindowFramesSuite.scala:242: method currentRow in object functions is deprecated: Use Window.currentRow
[warn] val window = Window.partitionBy($"value").orderBy($"key").rangeBetween(currentRow, lit(2.5D))
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameWindowFramesSuite.scala:259: method rangeBetween in class WindowSpec is deprecated: Use the version with Long parameter types
[warn] .rangeBetween(currentRow, lit(CalendarInterval.fromString("interval 23 days 4 hours")))
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/DataFrameWindowFramesSuite.scala:259: method currentRow in object functions is deprecated: Use Window.currentRow
[warn] .rangeBetween(currentRow, lit(CalendarInterval.fromString("interval 23 days 4 hours")))
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/ProcessingTimeSuite.scala:30: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs)
[warn] def getIntervalMs(trigger: Trigger): Long = trigger.asInstanceOf[ProcessingTime].intervalMs
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetCompatibilityTest.scala:49: method readAllFootersInParallel in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readAllFootersInParallel(hadoopConf, parquetFiles, true)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetInteroperabilitySuite.scala:178: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(hadoopConf, part.getPath, NO_FILTER)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:133: method writeMetadataFile in object ParquetFileWriter is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileWriter.writeMetadataFile(configuration, path, Seq(footer).asJava)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:148: method writeMetadataFile in object ParquetFileWriter is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileWriter.writeMetadataFile(configuration, path, Seq(footer).asJava)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:154: method readAllFootersInParallel in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readAllFootersInParallel(configuration, fs.getFileStatus(path)).asScala.toSeq
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:158: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/ProcessingTimeExecutorSuite.scala:51: class ConcurrentHashSet in package util is deprecated: see corresponding Javadoc for more information.
[warn] val triggerTimes = new ConcurrentHashSet[Int]
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/ProcessingTimeExecutorSuite.scala:55: method apply in object ProcessingTime is deprecated: use Trigger.ProcessingTime(interval)
[warn] val executor = ProcessingTimeExecutor(ProcessingTime("1000 milliseconds"), clock)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamSuite.scala:316: method apply in object ProcessingTime is deprecated: use Trigger.ProcessingTime(interval)
[warn] StartStream(ProcessingTime("10 seconds"), new StreamManualClock),
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamSuite.scala:357: method apply in object ProcessingTime is deprecated: use Trigger.ProcessingTime(interval)
[warn] StartStream(ProcessingTime("10 seconds"), new StreamManualClock(60 * 1000)),
[warn] ^
[warn] 24 warnings found
[info] Note: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/test/java/test/org/apache/spark/sql/JavaDataFrameSuite.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Compiling 5 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/target/scala-2.11/test-classes...
[info] Compiling 9 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/target/scala-2.11/test-classes...
[info] Compiling 14 Scala sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/target/scala-2.11/test-classes...
[info] Compiling 193 Scala sources and 66 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/target/scala-2.11/test-classes...
[info] Compiling 88 Scala sources and 17 Java sources to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive/target/scala-2.11/test-classes...
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/target/scala-2.11/spark-sql_2.11-2.4.8-SNAPSHOT-tests.jar ...
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/HiveCliSessionStateSuite.scala:31: a pure expression does nothing in statement position; you may be omitting necessary parentheses
[warn] try f finally SessionState.detachSession()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaContinuousTest.scala:76: reflective access of structural type member value activeTaskIdCount should be enabled
[warn] by making the implicit value scala.language.reflectiveCalls visible.
[warn] This can be achieved by adding the import clause 'import scala.language.reflectiveCalls'
[warn] or by setting the compiler option -language:reflectiveCalls.
[warn] See the Scaladoc for value scala.language.reflectiveCalls for a discussion
[warn] why the feature should be explicitly enabled.
[warn] assert(tasksEndedListener.activeTaskIdCount.get() == 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:141: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] Seq(new Field("null", Schema.create(Type.NULL), "doc", null)).asJava
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:164: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] val fields = Seq(new Field("field1", union, "doc", null)).asJava
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:192: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] val fields = Seq(new Field("field1", union, "doc", null)).asJava
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:224: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] val fields = Seq(new Field("field1", union, "doc", null)).asJava
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:250: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] val fields = Seq(new Field("field1", UnionOfOne, "doc", null)).asJava
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:303: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] new Field("field1", complexUnionType, "doc", null),
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:304: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] new Field("field2", complexUnionType, "doc", null),
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:305: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] new Field("field3", complexUnionType, "doc", null),
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:306: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] new Field("field4", complexUnionType, "doc", null)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala:970: constructor Field in class Field is deprecated: see corresponding Javadoc for more information.
[warn] val avroField = new Field(name, avroType, "", null)
[warn] ^
[warn] one warning found
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive-thriftserver/target/scala-2.11/spark-hive-thriftserver_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Done packaging.
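The Dataset.explode deprecations a little further up name the replacement as well. A minimal sketch using functions.explode, assuming a hypothetical DataFrame `df` with a string column "words":

  import org.apache.spark.sql.functions.{col, explode, split}

  // split the string into an array, then explode one row per element
  val exploded = df.select(explode(split(col("words"), " ")).as("word"))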
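Likewise, the ProcessingTime deprecations above point at Trigger.ProcessingTime. A minimal sketch, assuming a hypothetical streaming DataFrame `streamingDF`:

  import org.apache.spark.sql.streaming.Trigger

  // the non-deprecated trigger API
  val query = streamingDF.writeStream
    .format("console")
    .trigger(Trigger.ProcessingTime("10 seconds"))
    .start()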
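The deprecated `new Field(name, schema, doc, null)` constructor flagged in AvroSuite has a builder-style alternative; a minimal sketch, assuming `union: Schema` is the union schema from the test:

  import org.apache.avro.{Schema, SchemaBuilder}

  // build the record schema fluently instead of constructing Fields directly
  val record: Schema = SchemaBuilder.record("topLevelRecord").fields()
    .name("field1").doc("doc").`type`(union).noDefault()
    .endRecord()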
[warn] 10 warnings found
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/avro/target/scala-2.11/spark-avro_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:66: class ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] private var zkUtils: ZkUtils = _
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:95: class ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] def zookeeperClient: ZkUtils = {
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:107: object ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] zkUtils = ZkUtils(s"$zkHost:$zkPort", zkSessionTimeout, zkConnectionTimeout, false)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:198: method createTopic in object AdminUtils is deprecated: This method is deprecated and will be replaced by kafka.zk.AdminZkClient.
[warn] AdminUtils.createTopic(zkUtils, topic, partitions, 1)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:225: method deleteTopic in object AdminUtils is deprecated: This method is deprecated and will be replaced by kafka.zk.AdminZkClient.
[warn] AdminUtils.deleteTopic(zkUtils, topic)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:290: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] kc.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:304: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] kc.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:383: object ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] !zkUtils.pathExists(getDeleteTopicPath(topic)),
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:384: object ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] s"${getDeleteTopicPath(topic)} still exists")
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:385: object ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] assert(!zkUtils.pathExists(getTopicPath(topic)), s"${getTopicPath(topic)} still exists")
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:385: object ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] assert(!zkUtils.pathExists(getTopicPath(topic)), s"${getTopicPath(topic)} still exists")
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:409: class ZkUtils in package utils is deprecated: This is an internal class that is no longer used by Kafka and will be removed in a future release. Please use org.apache.kafka.clients.admin.AdminClient instead.
[warn] zkUtils: ZkUtils,
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:421: method deleteTopic in object AdminUtils is deprecated: This method is deprecated and will be replaced by kafka.zk.AdminZkClient.
[warn] AdminUtils.deleteTopic(zkUtils, topic)
[warn] ^
[warn] 14 warnings found
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/target/scala-2.11/spark-sql-kafka-0-10_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] there were 25 deprecation warnings; re-run with -deprecation for details
[warn] one warning found
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:464: [unchecked] unchecked cast
[warn] setLint((List)value);
[warn] ^
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/hive/target/scala-2.11/spark-hive_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/clustering/KMeansSuite.scala:120: method computeCost in class KMeansModel is deprecated: This method is deprecated and will be removed in 3.0.0. Use ClusteringEvaluator instead. You can also get the cost on the training dataset in the summary.
[warn] assert(model.computeCost(dataset) < 0.1)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/clustering/KMeansSuite.scala:135: method computeCost in class KMeansModel is deprecated: This method is deprecated and will be removed in 3.0.0. Use ClusteringEvaluator instead. You can also get the cost on the training dataset in the summary.
[warn] assert(model.computeCost(dataset) == summary.trainingCost)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/clustering/KMeansSuite.scala:206: method computeCost in class KMeansModel is deprecated: This method is deprecated and will be removed in 3.0.0. Use ClusteringEvaluator instead. You can also get the cost on the training dataset in the summary.
[warn] model.computeCost(dataset)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/feature/OneHotEncoderSuite.scala:46: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] ParamsSuite.checkParams(new OneHotEncoder)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/feature/OneHotEncoderSuite.scala:51: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] val encoder = new OneHotEncoder()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/feature/OneHotEncoderSuite.scala:74: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] val encoder = new OneHotEncoder()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/feature/OneHotEncoderSuite.scala:96: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] val encoder = new OneHotEncoder()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/feature/OneHotEncoderSuite.scala:110: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] val encoder = new OneHotEncoder()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/feature/OneHotEncoderSuite.scala:121: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] val t = new OneHotEncoder()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/feature/OneHotEncoderSuite.scala:156: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] val encoder = new OneHotEncoder()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:52: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] var df = readImages(imagePath)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:55: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] df = readImages(imagePath, null, true, -1, false, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:58: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] df = readImages(imagePath, null, true, -1, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:62: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] df = readImages(imagePath, null, true, -1, true, 0.5, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:69: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath, null, false, 3, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:74: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath + "/kittens/DP153539.jpg", null, false, 3, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:79: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath + "/multi-channel/BGRA.png", null, false, 3, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:84: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath + "/kittens/not-image.txt", null, false, 3, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:90: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath + "/kittens/not-image.txt", null, false, 3, false, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:96: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] readImages(imagePath, null, true, 3, true, 1.1, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:103: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] readImages(imagePath, null, true, 3, true, -0.1, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:109: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath, null, true, 3, true, 0.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:114: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath, sparkSession = spark, true, 3, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:119: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath, null, true, 3, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:124: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath, null, true, -3, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:129: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val df = readImages(imagePath, null, true, 0, true, 1.0, 0)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/ml/image/ImageSchemaSuite.scala:136: method readImages in object ImageSchema is deprecated: use `spark.read.format("image").load(path)` and this `readImages` will be removed in 3.0.0.
[warn] val images = readImages(imagePath + "/multi-channel/").collect
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/classification/LogisticRegressionSuite.scala:227: constructor LogisticRegressionWithSGD in class LogisticRegressionWithSGD is deprecated: Use ml.classification.LogisticRegression or LogisticRegressionWithLBFGS
[warn] val lr = new LogisticRegressionWithSGD().setIntercept(true)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/classification/LogisticRegressionSuite.scala:303: constructor LogisticRegressionWithSGD in class LogisticRegressionWithSGD is deprecated: Use ml.classification.LogisticRegression or LogisticRegressionWithLBFGS
[warn] val lr = new LogisticRegressionWithSGD().setIntercept(true)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/classification/LogisticRegressionSuite.scala:338: constructor LogisticRegressionWithSGD in class LogisticRegressionWithSGD is deprecated: Use ml.classification.LogisticRegression or LogisticRegressionWithLBFGS
[warn] val lr = new LogisticRegressionWithSGD().setIntercept(true)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/classification/LogisticRegressionSuite.scala:919: object LogisticRegressionWithSGD in package classification is deprecated: Use ml.classification.LogisticRegression or LogisticRegressionWithLBFGS
[warn] val model = LogisticRegressionWithSGD.train(points, 2)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/clustering/KMeansSuite.scala:369: method train in object KMeans is deprecated: Use train method without 'runs'
[warn] val model = KMeans.train(points, 2, 2, 1, initMode)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/evaluation/MulticlassMetricsSuite.scala:80: value precision in class MulticlassMetrics is deprecated: Use accuracy.
[warn] assert(math.abs(metrics.accuracy - metrics.precision) < delta)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/evaluation/MulticlassMetricsSuite.scala:81: value recall in class MulticlassMetrics is deprecated: Use accuracy.
[warn] assert(math.abs(metrics.accuracy - metrics.recall) < delta)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/evaluation/MulticlassMetricsSuite.scala:82: value fMeasure in class MulticlassMetrics is deprecated: Use accuracy.
[warn] assert(math.abs(metrics.accuracy - metrics.fMeasure) < delta)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/LassoSuite.scala:58: constructor LassoWithSGD in class LassoWithSGD is deprecated: Use ml.regression.LinearRegression with elasticNetParam = 1.0. Note the default regParam is 0.01 for LassoWithSGD, but is 0.0 for LinearRegression.
[warn] val ls = new LassoWithSGD()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/LassoSuite.scala:102: constructor LassoWithSGD in class LassoWithSGD is deprecated: Use ml.regression.LinearRegression with elasticNetParam = 1.0. Note the default regParam is 0.01 for LassoWithSGD, but is 0.0 for LinearRegression.
[warn] val ls = new LassoWithSGD()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/LassoSuite.scala:156: object LassoWithSGD in package regression is deprecated: Use ml.regression.LinearRegression with elasticNetParam = 1.0. Note the default regParam is 0.01 for LassoWithSGD, but is 0.0 for LinearRegression.
[warn] val model = LassoWithSGD.train(points, 2)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/LinearRegressionSuite.scala:49: constructor LinearRegressionWithSGD in class LinearRegressionWithSGD is deprecated: Use ml.regression.LinearRegression or LBFGS
[warn] val linReg = new LinearRegressionWithSGD().setIntercept(true)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/LinearRegressionSuite.scala:75: constructor LinearRegressionWithSGD in class LinearRegressionWithSGD is deprecated: Use ml.regression.LinearRegression or LBFGS
[warn] val linReg = new LinearRegressionWithSGD().setIntercept(false)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/LinearRegressionSuite.scala:106: constructor LinearRegressionWithSGD in class LinearRegressionWithSGD is deprecated: Use ml.regression.LinearRegression or LBFGS
[warn] val linReg = new LinearRegressionWithSGD().setIntercept(false)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/LinearRegressionSuite.scala:163: object LinearRegressionWithSGD in package regression is deprecated: Use ml.regression.LinearRegression or LBFGS
[warn] val model = LinearRegressionWithSGD.train(points, 2)
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/RidgeRegressionSuite.scala:63: constructor LinearRegressionWithSGD in class LinearRegressionWithSGD is deprecated: Use ml.regression.LinearRegression or LBFGS
[warn] val linearReg = new LinearRegressionWithSGD()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/RidgeRegressionSuite.scala:71: constructor RidgeRegressionWithSGD in class RidgeRegressionWithSGD is deprecated: Use ml.regression.LinearRegression with elasticNetParam = 0.0. Note the default regParam is 0.01 for RidgeRegressionWithSGD, but is 0.0 for LinearRegression.
[warn] val ridgeReg = new RidgeRegressionWithSGD()
[warn] ^
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/test/scala/org/apache/spark/mllib/regression/RidgeRegressionSuite.scala:113: object RidgeRegressionWithSGD in package regression is deprecated: Use ml.regression.LinearRegression with elasticNetParam = 0.0. Note the default regParam is 0.01 for RidgeRegressionWithSGD, but is 0.0 for LinearRegression.
[warn] val model = RidgeRegressionWithSGD.train(points, 2)
[warn] ^
[warn] 45 warnings found
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/target/scala-2.11/spark-mllib_2.11-2.4.8-SNAPSHOT-tests.jar ...
[info] Done packaging.
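The ZkUtils and AdminUtils deprecations in KafkaTestUtils further up both point at org.apache.kafka.clients.admin.AdminClient. A minimal sketch of topic management with it, assuming a hypothetical broker at localhost:9092:

  import java.util.{Collections, Properties}
  import org.apache.kafka.clients.admin.{AdminClient, AdminClientConfig, NewTopic}

  val props = new Properties()
  props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
  val admin = AdminClient.create(props)
  // create a topic with 1 partition and replication factor 1, then delete it
  admin.createTopics(Collections.singletonList(new NewTopic("topic", 1, 1.toShort))).all().get()
  admin.deleteTopics(Collections.singletonList("topic")).all().get()
  admin.close()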
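The computeCost deprecations in KMeansSuite name ClusteringEvaluator as the replacement. A minimal sketch, assuming a hypothetical fitted `model: KMeansModel` and its training `dataset`:

  import org.apache.spark.ml.evaluation.ClusteringEvaluator

  // silhouette score on the clustered data (ClusteringEvaluator's default metric)
  val predictions = model.transform(dataset)
  val silhouette = new ClusteringEvaluator().evaluate(predictions)
  // per the warning text, the training cost itself is still available as model.summary.trainingCost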
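The readImages deprecations in ImageSchemaSuite quote their replacement verbatim; with a SparkSession `spark` and a hypothetical `imagePath`:

  // the image data source replaces ImageSchema.readImages
  val images = spark.read.format("image").load(imagePath)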
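Finally, the LassoWithSGD messages spell out the migration, including the regParam caveat. A minimal sketch, assuming a hypothetical training DataFrame `trainingDF` with "features" and "label" columns:

  import org.apache.spark.ml.regression.LinearRegression

  // elasticNetParam = 1.0 gives the L1 (lasso) penalty; set regParam explicitly,
  // since LassoWithSGD defaulted to 0.01 while LinearRegression defaults to 0.0
  val lasso = new LinearRegression()
    .setElasticNetParam(1.0)
    .setRegParam(0.01)
  val lassoModel = lasso.fit(trainingDF)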
[success] Total time: 580 s, completed Jan 17, 2021 5:16:38 PM
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information.
[warn] val sslContextFactory = new SslContextFactory()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false
[warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
[warn] def attemptNumber(): Int = attemptId
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information.
[warn] if (bootstrap != null && bootstrap.childGroup() != null) {
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:633: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:647: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges: JList[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:648: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[(Array[Byte], Array[Byte])] = {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:657: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges: JList[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:658: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[Array[Byte]] = {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:670: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges: JList[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:671: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] leaders: JMap[TopicAndPartition, Broker],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:673: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.createRDD[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V](
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:676: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges.toArray(new Array[OffsetRange](offsetRanges.size())),
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:720: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val kc = new KafkaCluster(Map(kafkaParams.asScala.toSeq: _*))
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:721: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.getFromOffsets(
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:725: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V](
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: object Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:740: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] def offsetRangesOfKafkaRDD(rdd: RDD[_]): JList[OffsetRange] = {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:89: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] protected val kc = new KafkaCluster(kafkaParams)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:172: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:217: object KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val leaders = KafkaCluster.checkErrors(kc.findLeaders(topics))
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:222: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] context.sparkContext, kafkaParams, b.map(OffsetRange(_)), leaders, messageHandler)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:58: trait HasOffsetRanges in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] ) extends RDD[R](sc, Nil) with Logging with HasOffsetRanges {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val offsetRanges: Array[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val offsetRanges: Array[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:152: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val kc = new KafkaCluster(kafkaParams)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:268: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset)
[warn]
[info] Checking every *.class/*.jar file's SHA-1.
[warn] Strategy 'discard' was applied to a file
[warn] Strategy 'filterDistinctLines' was applied to 7 files
[warn] Strategy 'first' was applied to 95 files
[info] SHA-1: 177496adeb4b784b30fce8614a984fff59e4b551
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8-assembly/target/scala-2.11/spark-streaming-kafka-0-8-assembly-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 17 s, completed Jan 17, 2021 5:16:55 PM
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information.
[warn] val sslContextFactory = new SslContextFactory()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false
[warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
[warn] def attemptNumber(): Int = attemptId
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information.
[warn] if (bootstrap != null && bootstrap.childGroup() != null) {
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:259: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val dstream = FlumeUtils.createStream(jssc, hostname, port, storageLevel, enableDecompression)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:275: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val dstream = FlumeUtils.createPollingStream(
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val stream = FlumeUtils.createPollingStream(ssc, host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val stream = FlumeUtils.createPollingStream(ssc, host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumeEventCount.scala:59: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val stream = FlumeUtils.createStream(ssc, host, port, StorageLevel.MEMORY_ONLY_SER_2)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Checking every *.class/*.jar file's SHA-1.
[warn] Strategy 'discard' was applied to a file
[warn] Strategy 'filterDistinctLines' was applied to 7 files
[warn] Strategy 'first' was applied to 88 files
[info] SHA-1: 06f676839096880cf7d44ef6f12cafcbc4120dea
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume-assembly/target/scala-2.11/spark-streaming-flume-assembly-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 16 s, completed Jan 17, 2021 5:17:11 PM
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information.
[warn] val sslContextFactory = new SslContextFactory()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false
[warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
[warn] def attemptNumber(): Int = attemptId
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information.
[warn] if (bootstrap != null && bootstrap.childGroup() != null) {
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:140: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] private val client = new AmazonKinesisClient(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:150: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:219: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getRecordsRequest.setRequestCredentials(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:240: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getShardIteratorRequest.setRequestCredentials(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:606: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead
[warn] KinesisUtils.createStream(jssc.ssc, kinesisAppName, streamName, endpointUrl, regionName,
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:613: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead
[warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName,
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:616: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead
[warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName,
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpoint)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:76: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] .withLongLivedCredentialsProvider(longLivedCreds.provider)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information.
[warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName))
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:187: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information.
[warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Checking every *.class/*.jar file's SHA-1.
[warn] Strategy 'discard' was applied to 2 files
[warn] Strategy 'filterDistinctLines' was applied to 8 files
[warn] Strategy 'first' was applied to 50 files
[info] SHA-1: f43f2d192c98ad103d1a490b1b3bc57922fc7337
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl-assembly/target/scala-2.11/spark-streaming-kinesis-asl-assembly-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
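The "Strategy ... was applied to N files" lines come from sbt-assembly's merge step, which decides what to do when several jars contribute the same path to the fat jar. A hypothetical build.sbt fragment (not Spark's actual build definition) showing how paths map to the three strategies reported above:

  // sbt-assembly merge rules; evaluated top to bottom for each duplicated path.
  assemblyMergeStrategy in assembly := {
    case PathList("META-INF", "services", xs @ _*) => MergeStrategy.filterDistinctLines // merge service registries line-wise
    case PathList("META-INF", xs @ _*)             => MergeStrategy.discard             // drop signatures/manifests
    case _                                         => MergeStrategy.first               // otherwise keep the first copy seen
  }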
[success] Total time: 17 s, completed Jan 17, 2021 5:17:28 PM
========================================================================
Detecting binary incompatibilities with MiMa
========================================================================
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Strategy
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.MainClassOptionParser
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SetupDriver
Error instrumenting class:org.apache.spark.mapred.SparkHadoopMapRedUtil$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Hello
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.CryptoHelperChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.SignalUtils.ActionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuerySpecificationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator17$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.Benchmark.Result
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DateAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.ImplicitTypeCasts
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableIdentifierContext
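For context on this phase: MiMa (the Migration Manager) compares the freshly built classfiles against a previously released artifact and reports binary-incompatible changes. Spark's own build drives this through the dev/mima script with exclusions kept in project/MimaExcludes.scala; an illustrative stand-alone sbt-mima-plugin setup (a sketch, not Spark's build) looks like:

  // Compare the current spark-core classfiles against the last release.
  mimaPreviousArtifacts := Set("org.apache.spark" %% "spark-core" % "2.4.7")
  // Intentional breaks would be filtered here with ProblemFilters entries.
  mimaBinaryIssueFilters ++= Seq()

  // Run with: sbt mimaReportBinaryIssues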
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.ZooKeeperLeaderElectionAgent.LeadershipStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnsetTablePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AggregationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.DefaultPartitionCoalescer.PartitionLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.Listener
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.ArrayConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestWorkerState
Error instrumenting class:org.apache.spark.sql.execution.streaming.HDFSMetadataLog$FileSystemManager
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.Accessor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillMerger.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FromClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColumnReferenceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMatchingBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryTermContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.Data
Error instrumenting class:org.apache.spark.input.StreamInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.CaseWhenCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockFetcher.BlockFetchStarter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveSparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator16$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.$$typecreator1$1
Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableBlockLocation
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractWindowExpressions
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator8$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.ChiSquareResult
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.PrefixCache
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedRelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.FreqSequence
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.CubeType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator9$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGroupingAnalytics
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Error instrumenting class:org.apache.spark.deploy.SparkSubmit$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineReader
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableObjectArray
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.Division
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.TrackerState
21/01/17 17:18:12 WARN Utils: Your hostname, research-jenkins-worker-09 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface enp4s0f0)
21/01/17 17:18:12 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryTerminatedEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ParenthesizedExpressionContext
Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableProviderContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.ResolveBroadcastHints
Error instrumenting class:org.apache.spark.sql.execution.command.DDLUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBytesContext
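The two Utils warnings and the log4j hint interleaved above each point at a one-line remedy. A small sketch (sc is an existing SparkContext; the address shown is this worker's, used purely as an example):

  // Quiet the console below WARN for subsequent output on this context.
  sc.setLogLevel("ERROR")

  // Bind explicitly instead of letting Spark guess past the loopback address,
  // either via the environment (export SPARK_LOCAL_IP=192.168.10.31) or:
  // spark-submit --conf spark.driver.bindAddress=192.168.10.31 ...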
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.expressions
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFunctionContext
Error instrumenting class:org.apache.spark.sql.execution.streaming.CommitLog
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClient.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LocationSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStatusResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.optimization.LBFGS.CostFun
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyKeyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.COMMITTED
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.ClientPool
Error instrumenting class:org.apache.spark.scheduler.SplitInfo$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkBuildInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTablePartitionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SmallIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseMatrixPickler
Error instrumenting class:org.apache.spark.api.python.DoubleArrayWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplitModel.TrainValidationSplitModelReader
Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.api.java.JavaUtils.SerializableMapWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyToNumValuesStore
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeRegressorWrapper.DecisionTreeRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.AnalysisErrorAt
[WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Once
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RepairTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigest
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowConstructorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.OptimizeMetadataOnlyQuery.PartitionedRelation
Error instrumenting class:org.apache.spark.deploy.SparkHadoopUtil$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.AssociationRules.Rule
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedGroupConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummaryAggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.CheckpointWriter.CheckpointWriteHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowDefContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillReader
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.Dispatcher.EndpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNewInstance
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DateConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Aggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.IteratorForPartition
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetDatabasePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetQuantifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.MemorySink.AddedData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.client.StandaloneAppClient.ClientEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CtesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.plans.DslLogicalPlan
[WARN] Unable to detect inner functions for class:org.apache.spark.annotation.InterfaceStability.Evolving
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.ReceiverTracker.ReceiverTrackerEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperWriter
Error instrumenting class:org.apache.spark.sql.execution.streaming.HDFSMetadataLog$FileContextManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.IdentityProjection
Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.$SortedIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.fpm.FPGrowthModel.FPGrowthModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.streaming.OffsetSeqLog
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.WidenSetOperationTypes
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator2$1
Error instrumenting class:org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LogisticRegressionWrapper.LogisticRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.DateTimeOperations
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SetupDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMatchingBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.output
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.CatalystTypeConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StrictIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComparisonOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.EltCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.$ClientCallbackHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStatusResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutorResponse
Error instrumenting class:org.apache.spark.launcher.InProcessLauncher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.BlacklistTracker.ExecutorFailureList
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveTableValuedFunctions.ArgumentList
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySinkV2.AddedData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportServer.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Count
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RetrieveLastAllocatedExecutorId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator11$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.FloatConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.BinaryPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.InputFileBlockHolder.FileBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FailNativeCommandContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleTableSchemaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator12$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.Benchmark.Timer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Cholesky
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.LDAWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.KolmogorovSmirnovTest.NullHypothesis
Error instrumenting class:org.apache.spark.deploy.master.ui.MasterWebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableAttemptInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter.$$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.DataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.RollupType
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.mesos.MesosExternalShuffleClient.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableValuedFunctionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.StopBlockManagerMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeFixedWidthAggregationMap.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Min
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStoreProvider$
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.EmptyDirectoryWriteTask
21/01/17 17:18:14 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
21/01/17 17:18:14 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec
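The two BLAS warnings above mean MLlib could not load a native BLAS through netlib-java and fell back to the pure-JVM F2J implementation. A hypothetical build.sbt fragment for projects that want the native path (a system BLAS such as OpenBLAS must also be installed on the host):

  // Pull in netlib-java's native-system proxy artifacts (assumption: version 1.1.2,
  // the release contemporary with Spark 2.4's MLlib).
  libraryDependencies += "com.github.fommil.netlib" % "all" % "1.1.2" pomOnly()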
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.linalg.distributed.RowMatrix.$SVDMode$2$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.PromoteStrings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingDeduplicationStrategy
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableLongArray
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumericLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionValContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockManagerHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator3$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.TextBasedFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowDatabasesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.TransportFrameDecoder.Interceptor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.FixedLengthRowBasedKeyValueBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Bucketizer.BucketizerWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.LabeledPointPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.LevelDBLogger
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerSlave
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidatorModel.CrossValidatorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter.ElementConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleInsertQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.SuccessFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.client.TransportClientFactory.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.errors.TreeNodeException
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator9$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Index
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreType
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkAppHandle.State
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDescNullsFirst
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.MapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.CTESubstitution
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TypeConstructorContext
Error instrumenting class:org.apache.spark.SSLOptions
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Append
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.ArrayDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkSubmitCommandBuilder.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.ALSModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableAliasContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Logit
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ImplicitOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelReader
Error instrumenting class:org.apache.spark.sql.execution.streaming.HDFSMetadataLog$FileManager
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator2$1
Error instrumenting class:org.apache.spark.input.WholeTextFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveRdd
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveMissingReferences
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UnquotedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.PrimitiveConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveNaturalAndUsingJoin
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver
Error instrumenting class:org.apache.spark.deploy.history.HistoryServer
Error instrumenting class:org.apache.spark.sql.execution.streaming.ManifestFileCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
Error instrumenting class:org.apache.spark.api.python.TestOutputKeyConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.optimization.NNLS.Workspace
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0
Error instrumenting class:org.apache.spark.api.python.TestWritable
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.SparkAppConfig
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.RawStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SendHeartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerRemoved
Error instrumenting class:org.apache.spark.deploy.FaultToleranceTest$delayedInit$body
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalAsIfIntegral
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMax
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.LeftSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.Optimizer.OptimizeSubqueries
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.StatFunctions.CovarianceCounter
[WARN] Unable to detect inner functions for class:org.apache.spark.annotation.InterfaceStability.Unstable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RecoverPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.ParquetOutputTimestampType
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetReadSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.Hasher
Error instrumenting class:org.apache.spark.input.StreamBasedRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskReaper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.STATE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LocalIndexEncoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.SharedReadWrite
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Replaced
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.StringType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableFileStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterInStandby
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.DatabaseDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.ChainedIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.FloatType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator18$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.SparseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DereferenceContext
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.MultiLineCSVDataSource$
Error instrumenting class:org.apache.spark.deploy.security.HBaseDelegationTokenProvider
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.DriverEndpoint
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.CachedKafkaConsumer.CacheKey
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.TaskCommitMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntegerLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.LookupFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BooleanAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Complete
Error instrumenting class:org.apache.spark.input.StreamFileInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator10$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AnalyzeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.ChiSquareResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.TimestampConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowFunctionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DoubleAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StructContext
[WARN] Unable to detect inner functions for class:org.apache.spark.MapOutputTrackerMaster.MessageLoop
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateDatabaseContext
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicatedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.server.RpcHandler.OneWayRpcCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationPrimaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteDirContext
Error instrumenting class:org.apache.spark.input.FixedLengthBinaryInputFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase$NullIntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SortItemContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubquery
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Numeric$2$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.ReceiverInputDStream.ReceiverRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Key
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.util.BytecodeUtils.MethodInvocationFinder
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator30$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.BooleanEquality
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ByteConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslServer.$DigestCallbackHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinExec.OneSideHashJoiner
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.IntHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.ABORTED
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ByteType.$$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitioningUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.trees.TreeNodeRef
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.MutableProjection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveDeserializer
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.ByteArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.SummarizerBuffer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UncacheTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslSymbol
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Max
21/01/17 17:18:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform...
using builtin-java classes where applicable Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.VectorizedParquetRecordReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.StateStoreAwareZipPartitionsRDD [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingJoinStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.ShuffleMetricsSource [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexDataTypeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.InMemoryScans [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.FixNullability [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QuotedIdentifierAlternativeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.FunctionArgumentConversion [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator10$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableFileStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StructAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.Aggregation [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.sources.MemorySinkV2.AddedData [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetMemoryStatus Error instrumenting class:org.apache.spark.ml.source.libsvm.LibSVMFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.util.Benchmark.Case [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.AttributeSeq [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator2$1 Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$ [WARN] Unable to detect inner functions 
for class:org.apache.spark.sql.Encoders.$typecreator10$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.HDFSBackedStateStore.UPDATING [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowsContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidator.CrossValidatorReader [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0.$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestExecutors [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocations [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorDesc [WARN] Unable to detect inner functions for class:org.apache.spark.ui.JettyUtils.ServletParams [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggAliasInGroupBy [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.QuantileDiscretizer.QuantileDiscretizerWriter Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$FileTypes Error instrumenting class:org.apache.spark.deploy.rest.RestSubmissionServer [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDeBase.BasePickler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.Cluster [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.Cluster [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.SpecialLimits [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2n [WARN] Unable to detect inner functions for class:org.apache.spark.util.Benchmark.Case [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.PartitionOverwriteMode [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.DeprecatedConfig [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChanged [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestClassifierWrapper.RandomForestClassifierWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelReader [WARN] Unable to detect inner functions for 
class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.ScalaReflection.Schema [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrowVectorAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GBTRegressionModel.GBTRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedWindowContext Error instrumenting class:org.apache.spark.sql.execution.command.CommandUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CacheTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.CachedKafkaConsumer.CacheKey [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisteredExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.Shutdown [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.streaming.InternalOutputModes.Update [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.SpillableArrayIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetMapConverter.$KeyValueConverter [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.StringArrays [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.TypedAverage.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockTempFileManager [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StorageHandlerContext Error instrumenting class:org.apache.spark.input.Configurable [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelReader [WARN] Unable to 
detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DataType.JSortedObject [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsFractional [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Expression [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$ChunkCallback [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FrameBoundContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeDatabaseContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.NodeData [WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplit.TrainValidationSplitWriter [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.UnsignedPrefixComparatorNullsLast Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.TextInputCSVDataSource$ [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator8$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$2 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateViewContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveFunctions [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.SortComparator [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.$$typecreator2$1 [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.execution.SparkStrategies.StatefulAggregationStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableIdentifierContext Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$FileTypes$ [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.RatingPickler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetOperationContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.MessageDecoder.1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStore.MaintenanceTask [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerLatestState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeColNameContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinTypeContext Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.joins.BuildSide [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorAdded [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.ALSWrapper.ALSWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelWriter.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticBinaryContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableFileFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.joins.BuildRight [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.InConversion [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExplainContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.FlattenStyle [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.Data [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.SaveLoadV1_0.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.Data Error instrumenting class:org.apache.spark.ui.JettyUtils$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.UseContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.$KVSorterIterator [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpillableIterator [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator6$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.implicits [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveOrdinalInOrderByAndGroupBy [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Binary$2$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator2$1 Error instrumenting class:org.apache.spark.input.FixedLengthBinaryRecordReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.Pipeline.PipelineWriter Error instrumenting class:org.apache.spark.internal.io.HadoopMapRedWriteConfigUtil [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator8$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryPrimaryContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.StopAppClient [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.LongAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyPartition [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.NaiveBayesWrapper.NaiveBayesWrapperReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Identity [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.QueryExecution.debug [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GroupingSetContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ShortAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlock [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator10$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryNoWithContext Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionPath$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResourceContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.$Index [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslExpression [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ArrayAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Subscript [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveReferences [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.CoalesceExec.EmptyRDDWithPartitions [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorDescNullsFirst [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.Data Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile$ [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PredicateContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data Error instrumenting class:org.apache.spark.sql.catalyst.parser.ParserUtils$EnhancedLogicalPlan$ [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.ShuffleInMemorySorter.ShuffleSorterIterator [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.HashComparator 
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.ConcatCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestSubmitDriver [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.JoinReorderDP.JoinPlan Error instrumenting class:org.apache.spark.deploy.worker.ui.WorkerWebUI [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.GeneratorState [WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleWrite [WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseMatrixPickler Error instrumenting class:org.apache.spark.metrics.MetricsSystem [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseVectorPickler [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RegisterBlockManager [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.util.DefaultParamsReader.Metadata [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.xml.UDFXPathUtil.ReusableStringReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StringLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeansModel.Data [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.ImplicitAttribute [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.util.Utils.Lock [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.BisectingKMeansModel.BisectingKMeansModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanValueContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.mesos.MesosExternalShuffleClient.$RegisterDriverCallback [WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryStartedEvent [WARN] Unable to detect inner functions for 
class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.VertexData [WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.TASK_END_REASON_FORMATTED_CLASS_NAMES [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.FlatMapGroupsWithStateStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.EltCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Type [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CompleteRecovery [WARN] Unable to detect inner functions for class:org.apache.spark.graphx.impl.ShippableVertexPartition.ShippableVertexPartitionOpsConstructor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator22$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.FPGrowthWrapper.FPGrowthWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.EquivalentExpressions.Expr [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.LimitMarker [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByPercentileContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StringIndexerModel.StringIndexerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeListContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator2$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.json.JsonFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveSubqueryColumnAliases [WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchResult [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BinaryType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gaussian [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowColumnsContext [WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.StoreVersion [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockSort [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.clustering.LDA.LDAReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.EquivalentExpressions.Expr [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Log [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveHints.RemoveAllHints Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSinkLog [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Tweedie [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalNotContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.ClassInfo [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedConverter [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.HasCachedBlocks [WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.RandomVertexCut [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.StreamingListenerBus.WrappedStreamingListenerEvent [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClustering.Assignment [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.types.UTF8String.IntWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.OneForOneStreamManager.StreamState [WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.GroupByType [WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.types.UTF8String.LongWrapper [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FileFormatContext [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.HasCachedBlocks [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.StreamingRelationStrategy [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.ParserUtils.EnhancedLogicalPlan [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerFailed [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierListContext Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex.SerializableBlockLocation [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueStore [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColTypeListContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableIntArray [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateKeyWatermarkPredicate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SortPrefixUtils.NoOpPrefixComparator Error instrumenting class:org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.CheckForWorkerTimeOut [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetDecimalConverter [WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$MethodAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.ReviveOffers [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Decoder [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorker [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetLongDictionaryAwareDecimalConverter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.OutputCommitCoordinatorEndpoint [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.EnsembleNodeData [WARN] Unable to detect inner functions for 
class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DoubleConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.LeastSquaresNESolver [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.MetricsAggregate [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.ExecutorAllocationListener Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat Error instrumenting class:org.apache.spark.metrics.sink.MetricsServlet [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyListContext [WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.SPARK_LISTENER_EVENT_FORMATTED_CLASS_NAMES [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Message [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestDriverStatus [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator11$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NumNonZeros [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.NullIntolerant [WARN] Unable to detect inner functions for class:org.apache.spark.sql.RelationalGroupedDataset.PivotType [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.CrossValidatorModel.CrossValidatorModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.$ManualCloseBufferedOutputStream$1 [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.Main.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.IntAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.VariableLengthRowBasedKeyValueBatch.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.IntegerType.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.BooleanBitSet.Encoder [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Poisson [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelWriter.$$typecreator3$1 [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.optim.WeightedLeastSquares.QuasiNewton [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.PrefixComputer.Prefix [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.KMeansWrapper.KMeansWrapperWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.StringAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator3$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$ Error instrumenting class:org.apache.spark.sql.execution.streaming.state.StateStore$ [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter Error instrumenting class:org.apache.spark.sql.execution.streaming.HDFSMetadataLog [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchExecutor [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockFetcher.1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RelationContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator9$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Probit [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleMethodContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.SubmitDriverResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.BinaryAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PartitionSpecLocationContext Error instrumenting class:org.apache.spark.sql.execution.streaming.StreamMetadata$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.IntDelta.Encoder [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LDAModel.$$typecreator3$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierCommentContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.RightSide Error instrumenting class:org.apache.spark.ui.ServerInfo [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StopWordsRemover.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.Data [WARN] Unable 
to detect inner functions for class:org.apache.spark.streaming.util.FileBasedWriteAheadLog.LogInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.TriggerThreadDump
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Encoders.Strings
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.WindowsSubstitution
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.DiskMapIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.GradientBoostedTreesModel.SaveLoadV1_0
Error instrumenting class:org.apache.spark.sql.execution.datasources.NoopCache$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.ChiSquareTest.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutorsOnHost
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.CholeskySolver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper.GeneralizedLinearRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StopDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus
Error instrumenting class:org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.JoinSelection
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BucketSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.IDF.DocumentFrequencyAggregator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DoubleLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutorFailed
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.SerializationDebugger
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.BlockGenerator.Block
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionCallContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSlicer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeM2
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.MemorySink.AddedData
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubqueryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassReflection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugQuery
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ImputerModel.ImputerReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveShuffle
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.EncryptedMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.HashingTF.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.LongDelta.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.optimizer.StarSchemaDetection.TableAccessCardinality
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaRateController
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$Location
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.IntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.BisectingKMeansWrapper.BisectingKMeansWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RowUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Decoder
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.ArrayWrappers.ComparableByteArray
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.InBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.OldData
[WARN] Unable to detect inner functions for class:org.apache.spark.AccumulatorParam.FloatAccumulatorParam
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LateralViewContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ComplexColTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryView
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.RepeatedPrimitiveConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.ShortType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Binarizer.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.StandardScalerModel.StandardScalerModelWriter.$$typecreator3$1
Error instrumenting class:org.apache.spark.mllib.regression.IsotonicRegressionModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Inverse
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.Method
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ClearCacheContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpilledFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.StructTypePickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoder.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Sqrt
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.DirectKafkaInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.plans.logical.statsEstimation.EstimationUtils.OverlappedRange
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LastContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SparkSaslClient.1
Error instrumenting class:org.apache.spark.ml.image.SamplePathFilter$
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition1D
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.UpdateDelegationTokens
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.HistoryServerDiskManager.Lease
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.AddWebUIFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReconnectWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.SQLConf.Deprecated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ResetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator13$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IdentifierSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator8$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.BucketedRandomProjectionLSHModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveRelations
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropDatabaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.UpdateBlockInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.ProgressReporter.ExecutionStats
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.RunLengthEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.PartitionStrategy.EdgePartition2D
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.OldData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TinyIntLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StructConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.$SortState
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ColumnarBatch.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetBinaryDictionaryAwareDecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.FixedPoint
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadChannel
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexAndValue
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.BooleanConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.FamilyAndLink
[WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.CountMinSketch.Version
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$ColumnMetrics
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tuning.TrainValidationSplitModel.TrainValidationSplitModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Postfix
Error instrumenting class:org.apache.spark.sql.catalyst.util.CompressionCodecs$
[WARN] Unable to detect inner functions for class:org.apache.spark.executor.Executor.TaskRunner
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.RLEIntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableHeaderContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.QueryProgressEvent
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.HadoopRDD.HadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.io.ReadAheadInputStream.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ObjectStreamClassMethods
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.DoublePrefixComparator
Error instrumenting class:org.apache.spark.sql.execution.datasources.orc.OrcColumnarBatchReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LoadDataContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.WriteStyle.QuotedStyle
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Auto
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeNNZ
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DateType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WhenClauseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionSummary.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.Heartbeat
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ChangeColumnContext
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.SparkSubmitCommandBuilder.$OptionParser
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator25$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ManageResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ExecutorAllocationManager.ExecutorAllocationManagerSource
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.Metadata$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.IntervalFieldContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticOperatorContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.MultiInsertQueryBodyContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.crypto.TransportCipher.DecryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslAttribute
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.SeenFilesMap
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatSerdeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.debug.DebugExec.$SetAccumulator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinCriteriaContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ValueExpressionDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LDAWrapper.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.MultilayerPerceptronClassifierWrapper.MultilayerPerceptronClassifierWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator19$1
Error instrumenting class:org.apache.spark.mllib.tree.model.TreeEnsembleModel$SaveLoadV1_0$Metadata
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.UnregisterApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Link
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDBTypeInfo.1
[WARN] Unable to detect inner functions for class:org.apache.spark.SparkConf.AlternateConfig
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSourceLog
[WARN] Unable to detect inner functions for class:org.apache.spark.util.random.StratifiedSamplingUtils.RandomDataGenerator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.LongType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.AutoBatchedPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.param.shared.SharedParamsCodeGen.ParamDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.$$typecreator3$1
Error instrumenting class:org.apache.spark.ui.ServerInfo$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.IsotonicRegressionWrapper.IsotonicRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator38$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.GetExecutorLossReason
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.NonRddStorageInfo
Error instrumenting class:org.apache.spark.sql.execution.streaming.FileStreamSink$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMean
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.$$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.JsonProtocol.JOB_RESULT_FORMATTED_CLASS_NAMES
Error instrumenting class:org.apache.spark.ui.WebUI
[WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.KVStoreScalaSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimaryExpressionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IdentityConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ArithmeticUnaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.CLogLog
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckSuccess
Error instrumenting class:org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.evaluation.SquaredEuclideanSilhouette.ClusterStats
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexer.CategoryStats
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowthModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalSorter.SpilledFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.ValuesReaderIntIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.PCAModel.PCAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByBucketContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SearchedCaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetConfigurationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.RevokedLeadership
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.RFormulaModel.RFormulaModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ColPositionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.lib.SVDPlusPlus.Conf
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.RetryingBlockFetcher.$RetryingBlockFetchListener
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DoubleType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleByRowsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.$CloseAndFlushShieldOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.CatalystDataUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NonReservedContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.ElectedLeader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ExistsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlock
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.CommandBuilderUtils.JavaVendor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTRegressorWrapper.GBTRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Interaction.$$typecreator2$1
Error instrumenting class:org.apache.spark.executor.ExecutorSource
Error instrumenting class:org.apache.spark.sql.execution.datasources.FileFormatWriter$
[WARN] Unable to detect inner functions for class:org.apache.spark.TestUtils.JavaSourceFromString
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedWriter
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.MultiLineJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NullLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TruncateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.AccumulatorParam.DoubleAccumulatorParam
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StarContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowPartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.AccumulatorParam.LongAccumulatorParam
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Nominal$2$
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.ReaderIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestKillDriver
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SparkSession.Builder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.python.EvaluatePython.RowPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.TimSort.1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayes.$$typecreator9$1
Error instrumenting class:org.apache.spark.input.StreamRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeExternalRowSorter.RowComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.PassThrough.Decoder
Error instrumenting class:org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.SparkStrategies.BasicOperators
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.DistributedLDAModel.DistributedLDAModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.LaunchTask
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.v2.PushDownOperatorsToDataSource.FilterAndProject
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.DataSource.SourceInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.AbstractLauncher.ArgumentValidator
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SimpleCaseContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.$typecreator11$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowFrame
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NestedConstantListContext
Error instrumenting class:org.apache.spark.api.python.JavaToWritableConverter
Error instrumenting class:org.apache.spark.sql.execution.datasources.PartitionDirectory$
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterClusterManager
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Family
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryOrganizationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.FamilyAndLink
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkDirCleanup
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.$SpillableIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.ExternalIterator.$StreamBuffer
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.WriterThread
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.MaxAbsScalerModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.RpcEndpointVerifier.CheckExistence
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.mesos.MesosExternalShuffleClient.$Heartbeater
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel.MultilayerPerceptronClassificationModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.$ManagedBufferIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.ProbabilisticClassificationModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.internal.CatalogImpl.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.NaiveBayesModel.NaiveBayesModelWriter.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NNLSSolver
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.BisectingKMeans.ClusterSummary
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.DecisionTreeRegressionModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.NettyUtils.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableLikeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.python.MLSerDe.DenseVectorPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InMemoryIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.SerDeUtil.ByteArrayConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.LocalLDAModel.SaveLoadV1_0.$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinSide
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.columnar.compression.DictionaryEncoding.Encoder
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.TableDesc
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.dstream.FileInputDStream.FileInputDStreamCheckpointData
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SkewSpecContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.GetExecutorLossReason
Error instrumenting class:org.apache.spark.mllib.clustering.GaussianMixtureModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproxCountDistinctForIntervals.LongArrayInternalRow
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanDefaultContext
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$StoreFile
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.Projection
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.AccumulatorParam.IntAccumulatorParam
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherBackend.BackendConnection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetIntDictionaryAwareDecimalConverter
Error instrumenting class:org.apache.spark.mllib.clustering.DistributedLDAModel$SaveLoadV1_0$EdgeData
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpan.Prefix
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.NodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.BooleanType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetExecutorEndpointRef
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.HintStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.LinearSVCWrapper.LinearSVCWrapperWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.JdbcRDD.ConnectionFactory
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.OrderedIdentifierListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.UnsafeKVExternalSorter.KVComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAssembler.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.annotation.InterfaceStability.Stable
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateValueWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveWindowOrder
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.BlacklistTracker.ExecutorFailureList.TaskId
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.TreeEnsembleModel.SaveLoadV1_0.$typecreator1$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.WeightedLeastSquares.Solver
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MaxAbsScalerModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TablePropertyValueContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NamedExpressionSeqContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator11$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.network.util.LevelDBProvider.1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator5$1
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.AddWebUIFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AddTableColumnsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleDataTypeContext
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutorFailed
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.joins.BuildLeft
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Wildcard
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTablePropertiesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator12$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionBase.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FailureFetchResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.WindowFrameCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.SignedPrefixComparatorNullsLast
Error instrumenting class:org.apache.spark.sql.execution.datasources.InMemoryFileIndex$
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.ListObjectOutput
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherServer.$ServerConnection
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Node
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.DslString
Error instrumenting class:org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$HDFSBackedStateStore
Error instrumenting class:org.apache.spark.api.python.TestOutputValueConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.RadixSortSupport
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.PartitioningUtils.PartitionValues
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ExtractGenerator.AliasedGenerator$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.PipelineModel.PipelineModelReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.AFTSurvivalRegressionWrapper.AFTSurvivalRegressionWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.KMeansModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator15$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementDefaultContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IndexToString.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveGenerate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FlatMapGroupsWithStateExec.StateStoreUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReconnectWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Binomial
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.ExternalAppendOnlyUnsafeRowArrayIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.BlockLocationsAndStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QualifiedNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.dsl.ExpressionConversions.StringToAttributeConversionHelper
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.PullOutNondeterministic
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RegisterExecutor
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.DriverStateChanged
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.JoinRelationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FirstContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.KMeansModelReader.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertIntoTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableLocationContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.LogicalBinaryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.ByteAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.WriteTaskResult
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.rules.RuleExecutor.Batch
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetStorageStatus
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.NewHadoopRDD.NewHadoopMapPartitionsWithSplitRDD
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.StateStoreOps
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.util.QuantileSummaries.Stats
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.$typecreator7$1
[WARN] Unable to detect inner functions for class:org.apache.spark.AccumulatorParam.StringAccumulatorParam
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorSizeHint.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.DistributedLDAModel.SaveLoadV1_0.EdgeData$
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.DenseMatrixPickler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SubscriptContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.api.r.SQLUtils.RegexContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.MasterChangeAcknowledged
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.PythonWorkerFactory.MonitorThread
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.ResolveTableValuedFunctions.ArgumentList
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.FunctionTableContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.QueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.IsotonicRegressionModel.IsotonicRegressionModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.IsotonicRegressionModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.DiskBlockObjectWriter.ManualCloseOutputStream
Error instrumenting class:org.apache.spark.streaming.StreamingContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.FeatureHasher.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeWeightSum
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterWorkerResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.DecimalAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.regression.impl.GLMRegressionModel.SaveLoadV1_0.Data$
Error instrumenting class:org.apache.spark.sql.catalyst.catalog.InMemoryCatalog$
Error instrumenting class:org.apache.spark.streaming.api.java.JavaStreamingContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinConditionSplitPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.recommendation.MatrixFactorizationModel.SaveLoadV1_0
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.DecimalConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.DecryptionHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.InMemoryStore.InstanceList
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Metric
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.OutputSpec
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.SerializationDebugger.NullOutputStream
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.KillDriverResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.MutableStateArrays
[WARN] Unable to detect inner functions for class:org.apache.spark.rdd.PipedRDD.NotEqualsFileNameFilter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableNameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.codegen.DumpByteCode
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropTablePartitionsContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Variance
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.kafka010.KafkaRDD.KafkaRDDIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.SparkSubmitUtils.MavenCoordinate
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateTempViewUsingContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BeginRecovery
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.PythonMLLibAPI.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.NettyRpcEnv.FileDownloadCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.StringConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.$$typecreator30$1
Error instrumenting class:org.apache.spark.sql.execution.datasources.csv.CSVFileFormat
[WARN] Unable to detect inner functions for class:org.apache.spark.util.Benchmark.Result
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicates
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.KVTypeInfo.$FieldAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Power
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator14$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.GBTClassificationModel.GBTClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.serializer.DummySerializerInstance.$1
Error instrumenting class:org.apache.spark.input.ConfigurableCombineFileRecordReader
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.ShuffleBlockFetcherIterator.FetchRequest
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPTree.Summary
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFileFormatContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorAttributeRewriter.VectorAttributeRewriterWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.scheduler.JobScheduler.JobHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.PathInstruction.Named
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator38$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.RandomForestRegressionModel.RandomForestRegressionModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.HandleNullInputsForUDF
[WARN] Unable to detect inner functions for class:org.apache.spark.network.protocol.Message.Type
[WARN] Unable to detect inner functions for class:org.apache.spark.graphx.impl.VertexPartition.VertexPartitionOpsConstructor
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.WriteTaskResult
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.OneForOneBlockFetcher.$DownloadCallback
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.GeneralizedLinearRegressionModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.internal.io.FileCommitProtocol.EmptyTaskCommitMessage
[WARN] Unable to detect inner functions for class:org.apache.spark.util.kvstore.LevelDB.TypeAliases
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.DecisionTreeRegressionModel.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.ExecuteWriteTask
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMetric
Error instrumenting class:org.apache.spark.sql.execution.streaming.SinkFileStatus$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetArrayConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ChiSqSelectorModel.ChiSqSelectorModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinHashLSHModel.MinHashLSHModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.StackCoercion
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.api.python.BasePythonRunner.MonitorThread
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.util.MLUtils.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.VectorIndexerModel.VectorIndexerModelWriter.$$typecreator3$1
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.PowerIterationClusteringModel.SaveLoadV1_0.$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.GlobalAggregates
Error instrumenting class:org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$
[WARN] Unable to detect inner functions for class:org.apache.spark.util.SizeEstimator.SearchState
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.input
[WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.$ShuffleMetrics
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper.JoinStateWatermarkPredicate
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LinearSVCModel.LinearSVCReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.orc.OrcDeserializer.RowUpdater
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.FileFormatWriter.WriteJobDescription
Error instrumenting class:org.apache.spark.status.api.v1.ApiRootResource$
[WARN] Unable to detect inner functions for class:org.apache.spark.InternalAccumulator.shuffleRead
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowFrameContext
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorUpdated
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.UDTConverter
Error instrumenting class:org.apache.spark.mllib.clustering.LocalLDAModel$SaveLoadV1_0$Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.RandomForestRegressorWrapper.RandomForestRegressorWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault1Context
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.aggregate.HashMapGenerator.Buffer
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator10$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.TimestampType.$$typecreator1$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.LogisticRegressionModel.LogisticRegressionModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyAndNumValues
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.EnsembleModelReadWrite.EnsembleNodeData
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveAggregateFunctions
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.InlineTableDefault2Context
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.DecisionTreeClassifierWrapper.DecisionTreeClassifierWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ReregisterWithMaster
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.TimestampAccessor
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.SortComparator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.Rating
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.receiver.ReceiverSupervisor.ReceiverState
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RenameTablePartitionContext
[WARN] Unable to detect inner functions for class:org.apache.spark.security.CryptoStreamUtils.CryptoParams
[WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.Stop
[WARN] Unable to detect inner functions for class:org.apache.spark.util.sketch.BloomFilter.Version
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.NumberContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedRleValuesReader.MODE
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.KeyWrapper
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.LongConverter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AlterViewQueryContext
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.stat.test.ChiSqTest.NullHypothesis
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFuncNameContext
Error instrumenting class:org.apache.spark.SparkContext$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GaussianMixtureWrapper.GaussianMixtureWrapperReader
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.NormalEquation
Error instrumenting class:org.apache.spark.sql.execution.datasources.CodecStreams$
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLContext.implicits
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.AFTSurvivalRegressionModel.AFTSurvivalRegressionModelWriter.$$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.OpenHashMapBasedStateMap.StateInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleStatementContext
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BooleanLiteralContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.UncompressedInBlockBuilder
[WARN] Unable to detect inner functions for class:org.apache.spark.status.ElementTrackingStore.Trigger
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisteredApplication
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.BlacklistTracker.ExecutorFailureList.TaskId
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRest.OneVsRestWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.ApproximatePercentile.PercentileDigestSerializer
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RefreshResourceContext
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.HashMapGrowthStrategy.Doubling
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.BinaryLogisticRegressionSummary.$$typecreator6$1
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.impl.RandomForest.NodeIndexInfo
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.OneHotEncoderModelWriter.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV1_0.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SingleFunctionIdentifierContext
[WARN] Unable to detect inner functions for class:org.apache.spark.status.KVUtils.MetadataMismatchException
[WARN] Unable to detect inner functions for class:org.apache.spark.unsafe.map.BytesToBytesMap.$MapIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.optim.QuasiNewtonSolver.NormalEquationCostFun
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.Mean
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTablesContext
[WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetLocationsMultipleBlockIds
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.CountVectorizerModel.CountVectorizerModelWriter.Data
Error instrumenting class:org.apache.spark.sql.execution.aggregate.TungstenAggregationIterator
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RequestExecutors
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BeginRecovery
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationRemoved
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerWriter.$$typecreator2$1
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.StateStoreHandler
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.DecisionTreeClassificationModel.DecisionTreeClassificationModelWriter
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.$typecreator4$1
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.WorkerSchedulerStateResponse
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.OpenHashSet.LongHasher
[WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RequestMasterState
Error instrumenting class:org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport
Error instrumenting class:org.apache.spark.ui.SparkUI
[WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.RemoveWorker
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.expressions.aggregate.DeclarativeAggregate.RichAttribute
Error instrumenting class:org.apache.spark.sql.catalyst.catalog.ExternalCatalogUtils$
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.Data
[WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.SizeTracker.Sample
[WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ConstantListContext
[WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.KMeansModel.$$typecreator1$1
[WARN] Unable to
detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.KillTask [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.SetAppId [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ToBlockManagerMaster [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.OneVsRestModel.OneVsRestModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeL1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.ConcatCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.ColumnPruner.ColumnPrunerReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolveUpCast [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.Analyzer.ResolvePivot [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArray.InMemoryBufferIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.stat.FrequentItems.FreqItemCounter [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.unsafe.sort.PrefixComparators.StringPrefixComparator [WARN] Unable to detect inner functions for class:org.apache.spark.streaming.util.BatchedWriteAheadLog.Record [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.Word2VecModel.SaveLoadV1_0.$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.GenericFileFormatContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SetTableSerDeContext [WARN] Unable to detect inner functions for class:org.apache.spark.shuffle.sort.UnsafeShuffleWriter.MyByteArrayOutputStream [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.WindowRefContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowTblPropertiesContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.TableContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.UDTConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.ShortConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.analysis.TypeCoercion.IfCoercion [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.ParquetStringConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.streaming.StreamingQueryListener.Event [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.FPGrowth.FreqItemset [WARN] Unable to detect inner functions for class:org.apache.spark.ml.classification.RandomForestClassificationModel.RandomForestClassificationModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.RegisterApplication [WARN] Unable to detect inner functions for 
class:org.apache.spark.mllib.classification.impl.GLMClassificationModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.NormL2 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.stat.SummaryBuilderImpl.ComputeMin [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.history.AppListingListener.MutableApplicationInfo [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.StatementContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.AliasedQueryContext [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.LaunchDriver [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.Decimal.DecimalIsConflicted [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.$$typecreator5$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.IDFModel.IDFModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManager.RemoteBlockTempFileManager.$ReferenceWithCleanup [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.GeneralizedLinearRegressionModel.$$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.sasl.SaslEncryption.EncryptedMessage [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.OutputCommitCoordinator.StageState [WARN] Unable to detect inner functions for class:org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.AppExecId [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetBlockStatus [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.SampleContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.$$typecreator14$1 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PositionContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.CatalystTypeConverters.IntConverter [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DecimalLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.api.python.SerDe.SparseVectorPickler [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelReader.$$typecreator17$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.attribute.AttributeType.Unresolved$2$ [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.clustering.GaussianMixtureModel.SaveLoadV1_0.Data$ [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.FileStreamSource.FileEntry [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.BigDecimalLiteralContext [WARN] Unable to detect inner functions for class:org.apache.spark.rpc.netty.Dispatcher.MessageLoop [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.GetPeers [WARN] Unable to detect inner functions for 
class:org.apache.spark.ml.regression.GeneralizedLinearRegression.Gamma Error instrumenting class:org.apache.spark.input.WholeTextFileRecordReader [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALS.RatingBlockBuilder Error instrumenting class:org.apache.spark.internal.io.HadoopMapReduceCommitProtocol [WARN] Unable to detect inner functions for class:org.apache.spark.sql.SQLImplicits.StringToColumn [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.RemoveBroadcast [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.DecisionTreeModel.SaveLoadV1_0.PredictData [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.LocalLDAModel.LocalLDAModelWriter.$$typecreator4$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.MinMaxScalerModel.MinMaxScalerModelWriter [WARN] Unable to detect inner functions for class:org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages.StatusUpdate [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.RowFormatDelimitedContext [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.feature.ChiSqSelectorModel.SaveLoadV1_0.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.launcher.LauncherProtocol.SetState [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.LocalPrefixSpan.ReversedPrefix [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.OneHotEncoderModel.$$typecreator1$1 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.master.MasterMessages.BoundPortsResponse [WARN] Unable to detect inner functions for class:org.apache.spark.sql.Encoders.$typecreator11$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.NaiveBayesModel.SaveLoadV2_0.$typecreator2$1 [WARN] Unable to detect inner functions for class:org.apache.spark.ml.clustering.GaussianMixtureModel.GaussianMixtureModelWriter.Data [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.tree.model.RandomForestModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.sql.vectorized.ArrowColumnVector.FloatAccessor [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ExecutorStateChanged [WARN] Unable to detect inner functions for class:org.apache.spark.ml.regression.LinearRegressionModel.LinearRegressionModelReader [WARN] Unable to detect inner functions for class:org.apache.spark.util.collection.ExternalAppendOnlyMap.SpillableIterator [WARN] Unable to detect inner functions for class:org.apache.spark.sql.execution.streaming.state.SymmetricHashJoinStateManager.KeyWithIndexToValueType [WARN] Unable to detect inner functions for class:org.apache.spark.storage.BlockManagerMessages.ReplicateBlock [WARN] Unable to detect inner functions for class:org.apache.spark.ml.recommendation.ALSModel.$$typecreator15$1 [WARN] Unable to detect inner functions for class:org.apache.spark.network.server.TransportRequestHandler.$1 [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.fpm.PrefixSpanModel.SaveLoadV1_0 [WARN] Unable to detect inner functions for class:org.apache.spark.deploy.DeployMessages.ApplicationFinished [WARN] Unable to detect inner functions for class:org.apache.spark.ml.r.GBTClassifierWrapper.GBTClassifierWrapperReader [WARN] Unable to detect 
inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.PrimitiveDataTypeContext [WARN] Unable to detect inner functions for class:org.apache.spark.sql.types.DecimalType.Fixed [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.DescribeFunctionContext [WARN] Unable to detect inner functions for class:org.apache.spark.storage.StorageStatus.RddStorageInfo [WARN] Unable to detect inner functions for class:org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS.$$typecreator1$1 Error instrumenting class:org.apache.spark.sql.execution.datasources.text.TextFileFormat [WARN] Unable to detect inner functions for class:org.apache.spark.ml.tree.DecisionTreeModelReadWrite.SplitData [WARN] Unable to detect inner functions for class:org.apache.spark.sql.catalyst.parser.SqlBaseParser.ShowCreateTableContext [WARN] Unable to detect inner functions for class:org.apache.spark.ml.feature.Word2VecModel.Word2VecModelWriter.$$typecreator9$1 Created : .generated-mima-class-excludes in current directory. Created : .generated-mima-member-excludes in current directory. Using /usr/lib/jvm/java-8-openjdk-amd64/ as default JAVA_HOME. Note, this will be overridden by -java-home if it is set. [info] Loading project definition from /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/project [info] Set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/) [info] Updating {file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/}tools... [info] spark-parent: previous-artifact not set, not analyzing binary compatibility [info] spark-tags: previous-artifact not set, not analyzing binary compatibility [info] Done updating. [info] spark-kvstore: previous-artifact not set, not analyzing binary compatibility [info] spark-tools: previous-artifact not set, not analyzing binary compatibility [info] spark-unsafe: previous-artifact not set, not analyzing binary compatibility [info] spark-streaming-flume-sink: previous-artifact not set, not analyzing binary compatibility [info] spark-network-common: previous-artifact not set, not analyzing binary compatibility [info] spark-network-shuffle: previous-artifact not set, not analyzing binary compatibility [info] spark-network-yarn: previous-artifact not set, not analyzing binary compatibility [info] spark-launcher: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-launcher_2.11:2.3.0 (filtered 1) [info] spark-sketch: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-sketch_2.11:2.3.0 (filtered 1) [info] spark-mllib-local: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-mllib-local_2.11:2.3.0 (filtered 1) [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information. 
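The SslContextFactory warning above comes from Jetty, which split the deprecated no-arg constructor into server- and client-side variants. A minimal sketch of the server-side form, assuming Jetty 9.4+ on the classpath; the path and password handling are placeholders, not Spark's actual fix:

    import org.eclipse.jetty.util.ssl.SslContextFactory

    // Server-side variant replacing the deprecated `new SslContextFactory()`.
    val sslFactory = new SslContextFactory.Server()
    sslFactory.setKeyStorePath("/path/to/keystore")                          // placeholder
    sslFactory.setKeyStorePassword(sys.env.getOrElse("KEYSTORE_PASSWORD", ""))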
[warn] val sslContextFactory = new SslContextFactory() [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2 [warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2 [warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false [warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally() [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead [warn] def attemptNumber(): Int = attemptId [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information. [warn] if (bootstrap != null && bootstrap.childGroup() != null) { [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] spark-ganglia-lgpl: previous-artifact not set, not analyzing binary compatibility [info] spark-kubernetes: previous-artifact not set, not analyzing binary compatibility [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] spark-yarn: previous-artifact not set, not analyzing binary compatibility [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:87: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] fwInfoBuilder.setRole(role) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:224: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:260: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] Option(r.getRole), reservation) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:263: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. 
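The AccumulableParam deprecations flagged above name AccumulatorV2 as the replacement. A minimal sketch of a custom accumulator against that API; the class and registration name are hypothetical:

    import org.apache.spark.util.AccumulatorV2

    // Tracks the maximum Long seen across tasks.
    class MaxAccumulator extends AccumulatorV2[Long, Long] {
      private var _max = Long.MinValue
      override def isZero: Boolean = _max == Long.MinValue
      override def copy(): MaxAccumulator = {
        val c = new MaxAccumulator
        c._max = _max
        c
      }
      override def reset(): Unit = { _max = Long.MinValue }
      override def add(v: Long): Unit = { _max = math.max(_max, v) }
      override def merge(other: AccumulatorV2[Long, Long]): Unit = add(other.value)
      override def value: Long = _max
    }

    // Registration on an existing SparkContext `sc`:
    //   sc.register(new MaxAccumulator, "maxSeen")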
[warn] Option(r.getRole), reservation) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:521: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:537: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] (RoleResourceInfo(resource.getRole, reservation), [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] spark-mesos: previous-artifact not set, not analyzing binary compatibility [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] spark-catalyst: previous-artifact not set, not analyzing binary compatibility [info] spark-streaming: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-streaming_2.11:2.3.0 (filtered 3) [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:259: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val dstream = FlumeUtils.createStream(jssc, hostname, port, storageLevel, enableDecompression) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:275: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val dstream = FlumeUtils.createPollingStream( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val stream = FlumeUtils.createPollingStream(ssc, host, port) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val stream = FlumeUtils.createPollingStream(ssc, host, port) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumeEventCount.scala:59: object FlumeUtils in package flume is deprecated: Deprecated without replacement [warn] val stream = FlumeUtils.createStream(ssc, host, port, StorageLevel.MEMORY_ONLY_SER_2) [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] spark-streaming-flume: previous-artifact not set, not analyzing binary compatibility [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/DirectKafkaInputDStream.scala:172: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] val msgs = c.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:100: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. 
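The poll(0) deprecations here (and the kafka-0-10-sql ones later in the log) reflect Kafka's move from poll(long) to poll(java.time.Duration) in kafka-clients 2.0. A sketch of the replacement call, not Spark's actual fix:

    import java.time.Duration
    import org.apache.kafka.clients.consumer.KafkaConsumer
    import scala.collection.JavaConverters._

    def pollOnce[K, V](consumer: KafkaConsumer[K, V]): Unit = {
      // Replaces the deprecated consumer.poll(0).
      val records = consumer.poll(Duration.ZERO)
      for (r <- records.asScala) {
        println(s"${r.topic}-${r.partition}@${r.offset}")
      }
    }

One caveat worth noting: unlike poll(0), the Duration overload does not block waiting for partition metadata, so callers that used poll(0) purely to force an assignment need extra care when migrating.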
[warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:153: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/KafkaDataConsumer.scala:200: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] val p = consumer.poll(timeout) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:140: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] private val client = new AmazonKinesisClient(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:150: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] client.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:219: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. [warn] getRecordsRequest.setRequestCredentials(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:240: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. [warn] getShardIteratorRequest.setRequestCredentials(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:606: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc.ssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:613: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:616: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead [warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName, [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. 
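Several Kinesis warnings above name KinesisInputDStream.builder as the replacement for KinesisUtils.createStream. A sketch following the Spark 2.4 Kinesis integration guide; the stream, endpoint, region, and app names are placeholders:

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kinesis.{KinesisInitialPositions, KinesisInputDStream}

    def kinesisStream(ssc: StreamingContext) =
      KinesisInputDStream.builder
        .streamingContext(ssc)
        .streamName("myKinesisStream")                          // placeholder
        .endpointUrl("https://kinesis.us-west-2.amazonaws.com") // placeholder
        .regionName("us-west-2")                                // placeholder
        .initialPosition(new KinesisInitialPositions.Latest())
        .checkpointAppName("myKinesisApp")                      // placeholder
        .checkpointInterval(Seconds(10))
        .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
        .build()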
[warn] val kinesisClient = new AmazonKinesisClient(credentials) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpoint) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:76: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .withLongLivedCredentialsProvider(longLivedCreds.provider) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] client.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information. [warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName)) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:187: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information. [warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration) [warn] [warn] Multiple main classes detected. 
Run 'show discoveredMainClasses' to see the list [info] spark-streaming-kinesis-asl: previous-artifact not set, not analyzing binary compatibility [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:633: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder]( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:647: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:648: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[(Array[Byte], Array[Byte])] = { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:657: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:658: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[Array[Byte]] = { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:670: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges: JList[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:671: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] leaders: JMap[TopicAndPartition, Broker], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:673: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createRDD[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V]( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:676: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] offsetRanges.toArray(new Array[OffsetRange](offsetRanges.size())), [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:720: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(Map(kafkaParams.asScala.toSeq: _*)) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:721: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.getFromOffsets( [warn] [warn] 
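The kafka-0-8 warnings in this stretch all carry the same advice, "Update to Kafka 0.10 integration". A sketch of the spark-streaming-kafka-0-10 direct stream that replaces the 0.8 API, per the Spark 2.4 integration guide; broker list, group id, and topic are placeholders:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.streaming.StreamingContext
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    def directStream(ssc: StreamingContext) = {
      val kafkaParams = Map[String, Object](
        "bootstrap.servers" -> "localhost:9092",          // placeholder
        "key.deserializer" -> classOf[StringDeserializer],
        "value.deserializer" -> classOf[StringDeserializer],
        "group.id" -> "example-group",                    // placeholder
        "auto.offset.reset" -> "latest",
        "enable.auto.commit" -> (false: java.lang.Boolean))
      KafkaUtils.createDirectStream[String, String](
        ssc, PreferConsistent, Subscribe[String, String](Seq("topicA"), kafkaParams))
    }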
/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:725: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration [warn] KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V]( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: object Broker in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:740: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] def offsetRangesOfKafkaRDD(rdd: RDD[_]): JList[OffsetRange] = { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:89: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] protected val kc = new KafkaCluster(kafkaParams) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:172: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:217: object KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val leaders = KafkaCluster.checkErrors(kc.findLeaders(topics)) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:222: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] context.sparkContext, kafkaParams, b.map(OffsetRange(_)), leaders, messageHandler) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:58: trait HasOffsetRanges in package kafka is deprecated: Update to Kafka 0.10 integration [warn] ) extends RDD[R](sc, 
Nil) with Logging with HasOffsetRanges { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val offsetRanges: Array[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val offsetRanges: Array[OffsetRange], [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:152: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration [warn] val kc = new KafkaCluster(kafkaParams) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:268: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration [warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset) [warn] [info] spark-streaming-kafka-0-8: previous-artifact not set, not analyzing binary compatibility [info] spark-graphx: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-graphx_2.11:2.3.0 (filtered 3) [info] spark-streaming-kafka-0-8-assembly: previous-artifact not set, not analyzing binary compatibility [info] spark-streaming-kafka-0-10-assembly: previous-artifact not set, not analyzing binary compatibility [info] spark-streaming-kinesis-asl-assembly: previous-artifact not set, not analyzing binary compatibility [info] spark-streaming-flume-assembly: previous-artifact not set, not analyzing binary compatibility [info] spark-streaming-kafka-0-10: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0 (filtered 6) [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:128: value ENABLE_JOB_SUMMARY in object ParquetOutputFormat is deprecated: see corresponding Javadoc for more information. [warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:360: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information. [warn] new org.apache.parquet.hadoop.ParquetInputSplit( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:371: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. [warn] ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:544: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. 
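The readFooter deprecations above come from parquet-mr. A hedged sketch of the InputFile-based path that supersedes the static readFooter, assuming parquet-mr 1.10 (the version on Spark 2.4's classpath):

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.Path
    import org.apache.parquet.hadoop.ParquetFileReader
    import org.apache.parquet.hadoop.util.HadoopInputFile

    def footerMetadata(conf: Configuration, file: Path) = {
      // Open via InputFile and read the footer from the reader itself.
      val reader = ParquetFileReader.open(HadoopInputFile.fromPath(file, conf))
      try reader.getFooter.getFileMetaData
      finally reader.close()
    }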
[warn] ParquetFileReader.readFooter( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs) [warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock()) [warn] [info] spark-avro: previous-artifact not set, not analyzing binary compatibility [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:119: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:139: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:187: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:217: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:293: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. 
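The repeated ProcessingTime deprecations above name Trigger.ProcessingTime as the replacement. A minimal sketch against a hypothetical streaming DataFrame `df`:

    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.streaming.{StreamingQuery, Trigger}

    def startQuery(df: DataFrame): StreamingQuery =
      df.writeStream
        .format("console")
        .trigger(Trigger.ProcessingTime("10 seconds")) // replaces the ProcessingTime class
        .start()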
[warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaDataConsumer.scala:470: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] val p = consumer.poll(pollTimeoutMs) [warn] [info] spark-sql-kafka-0-10: previous-artifact not set, not analyzing binary compatibility [info] spark-core: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-core_2.11:2.3.0 (filtered 909) [info] spark-hive: previous-artifact not set, not analyzing binary compatibility [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:138: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0. [warn] object OneHotEncoder extends DefaultParamsReadable[OneHotEncoder] { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:141: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0. [warn] override def load(path: String): OneHotEncoder = super.load(path) [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:53: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path [warn] if (addedClasspath != "") { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:54: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path [warn] settings.classpath append addedClasspath [warn] [info] spark-repl: previous-artifact not set, not analyzing binary compatibility [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] spark-hive-thriftserver: previous-artifact not set, not analyzing binary compatibility [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/assembly/target/scala-2.11/spark-assembly_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [info] spark-assembly: previous-artifact not set, not analyzing binary compatibility [info] Compiling 1 Scala source to /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target/scala-2.11/classes... [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target/scala-2.11/spark-examples_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. 
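The OneHotEncoder deprecation above points at OneHotEncoderEstimator, which in 2.4 takes multiple columns and must be fitted before transforming. A sketch with hypothetical column names:

    import org.apache.spark.ml.feature.OneHotEncoderEstimator

    val encoder = new OneHotEncoderEstimator()
      .setInputCols(Array("categoryIndex"))   // hypothetical input column
      .setOutputCols(Array("categoryVec"))    // hypothetical output column
    // Given some DataFrame `df`:
    //   val encoded = encoder.fit(df).transform(df)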
[info] spark-examples: previous-artifact not set, not analyzing binary compatibility [info] spark-mllib: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-mllib_2.11:2.3.0 (filtered 514) [info] spark-sql: found 0 potential binary incompatibilities while checking against org.apache.spark:spark-sql_2.11:2.3.0 (filtered 294) [success] Total time: 28 s, completed Jan 17, 2021 5:19:00 PM [info] Building Spark assembly (w/Hive 1.2.1) using SBT with these arguments: -Phadoop-2.6 -Pkubernetes -Phive-thriftserver -Pflume -Pkinesis-asl -Pyarn -Pkafka-0-8 -Pspark-ganglia-lgpl -Phive -Pmesos assembly/package Using /usr/lib/jvm/java-8-openjdk-amd64/ as default JAVA_HOME. Note, this will be overridden by -java-home if it is set. [info] Loading project definition from /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/project [info] Set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/) [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information. [warn] val sslContextFactory = new SslContextFactory() [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2 [warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2 [warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false [warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally() [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead [warn] def attemptNumber(): Int = attemptId [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information. [warn] if (bootstrap != null && bootstrap.childGroup() != null) { [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT.jar ... [info] Done packaging. [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:87: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. 
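The "previous-artifact not set" and "found 0 potential binary incompatibilities" lines above are emitted by sbt-mima-plugin: modules with no reference artifact configured are skipped, while the rest are diffed against the listed 2.3.0 release. Spark wires this up through its own MimaBuild helpers, so the fragment below is only an illustrative sketch of the underlying setting:

    // project/plugins.sbt (illustrative version):
    //   addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.3.0")

    // build.sbt fragment: an empty mimaPreviousArtifacts produces the
    // "previous-artifact not set" message; a non-empty set enables the check.
    mimaPreviousArtifacts := Set("org.apache.spark" %% "spark-core" % "2.3.0")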
[warn] fwInfoBuilder.setRole(role) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:224: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:260: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] Option(r.getRole), reservation) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:263: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] Option(r.getRole), reservation) [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:521: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] role.foreach { r => builder.setRole(r) } [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:537: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. [warn] (RoleResourceInfo(resource.getRole, reservation), [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:128: value ENABLE_JOB_SUMMARY in object ParquetOutputFormat is deprecated: see corresponding Javadoc for more information. [warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) { [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:360: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information. [warn] new org.apache.parquet.hadoop.ParquetInputSplit( [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:371: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. [warn] ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData [warn] [warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:544: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information. 
[warn] ParquetFileReader.readFooter(
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs)
[warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs)
[warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs)
[warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs)
[warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs)
[warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:138: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] object OneHotEncoder extends DefaultParamsReadable[OneHotEncoder] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:141: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] override def load(path: String): OneHotEncoder = super.load(path)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:53: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path
[warn] if (addedClasspath != "") {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:54: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path
[warn] settings.classpath append addedClasspath
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/assembly/target/scala-2.11/jars/spark-assembly_2.11-2.4.8-SNAPSHOT.jar ...
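The repeated TriggerExecutor warnings cite the deprecated ProcessingTime class and name its replacement, Trigger.ProcessingTime(intervalMs). A minimal, self-contained sketch of the suggested form (the app name and the built-in "rate" test source are illustrative choices, not anything this build uses):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.streaming.Trigger

    object TriggerProcessingTimeExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder
          .appName("trigger-example")
          .master("local[2]")
          .getOrCreate()
        // The "rate" source generates rows continuously, useful for demos.
        val df = spark.readStream.format("rate").load()
        val query = df.writeStream
          .format("console")
          .trigger(Trigger.ProcessingTime("10 seconds")) // or Trigger.ProcessingTime(10000L)
          .start()
        query.awaitTermination()
      }
    }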
[info] Done packaging.
[success] Total time: 15 s, completed Jan 17, 2021 5:19:25 PM
========================================================================
Running Java style checks
========================================================================
Checkstyle checks passed.
========================================================================
Running Spark unit tests
========================================================================
[info] Running Spark tests using SBT with these arguments: -Phadoop-2.6 -Pkubernetes -Pflume -Phive-thriftserver -Pyarn -Pkafka-0-8 -Pspark-ganglia-lgpl -Pkinesis-asl -Phive -Pmesos test
Using /usr/lib/jvm/java-8-openjdk-amd64/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] Loading project definition from /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/project
[info] Set current project to spark-parent (in build file:/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/)
[info] ScalaTest
[info] ScalaTest
[info] ScalaTest
[info] Run completed in 114 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Run completed in 120 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Run completed in 168 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 22 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
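The "[info] SuiteName:" and "[info] - test name (time)" lines that follow are standard ScalaTest console output. A minimal sketch of a suite that would print in this style (suite and test names here are hypothetical, not suites from this build):

    import org.scalatest.FunSuite

    class ExampleSuite extends FunSuite {
      // Printed by the runner as: [info] ExampleSuite:
      //                           [info] - normal operation (N milliseconds)
      test("normal operation") {
        assert(1 + 1 === 2)
      }
    }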
[info] BitArraySuite:
[info] SparkSinkSuite:
[info] Test run started
[info] - error case when create BitArray (14 milliseconds)
[info] - bitSize (3 milliseconds)
[info] - set (2 milliseconds)
[info] - normal operation (10 milliseconds)
[info] - merge (11 milliseconds)
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithStart started
[info] BloomFilterSuite:
[info] - accuracy - Byte (8 milliseconds)
[info] - mergeInPlace - Byte (5 milliseconds)
[info] - accuracy - Short (6 milliseconds)
[info] - mergeInPlace - Short (6 milliseconds)
[info] UTF8StringPropertyCheckSuite:
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithLast started
[info] - accuracy - Int (40 milliseconds)
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.testRefWithIntNaturalKey started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescending started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndexWithMax started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.refIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.childIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.copyIndex started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.naturalIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndexWithStart started
[info] Test org.apache.spark.util.kvstore.InMemoryIteratorSuite.numericIndex started
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.174s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testDuplicateIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testEmptyIndexName started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIndexAnnotation started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNumEncoding started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIllegalIndexMethod started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testKeyClashes started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testArrayIndices started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex2 started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testIllegalIndexName started
[info] Test org.apache.spark.util.kvstore.LevelDBTypeInfoSuite.testNoNaturalIndex started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.014s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithStart started
[info] - toString (93 milliseconds)
[info] - numChars (15 milliseconds)
[info] - startsWith (14 milliseconds)
[info] - mergeInPlace - Int (134 milliseconds)
[info] - endsWith (9 milliseconds)
[info] - toUpperCase (6 milliseconds)
[info] - toLowerCase (4 milliseconds)
[info] - compare (8 milliseconds)
[info] - accuracy - Long (46 milliseconds)
[info] - substring (50 milliseconds)
[info] - contains (17 milliseconds)
[info] - trim, trimLeft, trimRight (13 milliseconds)
[info] - reverse (4 milliseconds)
[info] - indexOf (18 milliseconds)
[info] - repeat (8 milliseconds)
[info] - lpad, rpad (4 milliseconds)
[info] - mergeInPlace - Long (101 milliseconds)
[info] - concat (57 milliseconds)
[info] - concatWs (44 milliseconds)
[info] - split !!! IGNORED !!!
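The BloomFilterSuite results interleaved above ("accuracy - Byte", "mergeInPlace - Byte", ...) exercise the public sketch API in the common/sketch module. A rough usage sketch, with the item count and false-positive rate chosen arbitrarily:

    import org.apache.spark.util.sketch.BloomFilter

    object BloomFilterExample {
      def main(args: Array[String]): Unit = {
        // Size the filter for ~1000 items at a 3% false-positive rate.
        val filter = BloomFilter.create(1000, 0.03)
        (0L until 1000L).foreach(filter.putLong)
        assert(filter.mightContainLong(42L)) // inserted items are always found

        // mergeInPlace combines two compatible filters, as the suite checks.
        val other = BloomFilter.create(1000, 0.03)
        other.putLong(5000L)
        filter.mergeInPlace(other)
        assert(filter.mightContainLong(5000L))
      }
    }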
[info] - levenshteinDistance (7 milliseconds)
[info] - hashCode (2 milliseconds)
[info] - equals (1 millisecond)
[info] Test run started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.addTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.fromStringTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.equalsTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.fromYearMonthStringTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.toStringTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.subtractTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.fromCaseInsensitiveStringTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.fromSingleUnitStringTest started
[info] Test org.apache.spark.unsafe.types.CalendarIntervalSuite.fromDayTimeStringTest started
[info] Test run finished: 0 failed, 0 ignored, 9 total, 0.013s
[info] Test run started
[info] Test org.apache.spark.unsafe.array.LongArraySuite.basicTest started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.001s
[info] Test run started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOnHeapMemoryBlockResetsBaseObjectAndOffset started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.overlappingCopyMemory started
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/SSLOptions.scala:71: constructor SslContextFactory in class SslContextFactory is deprecated: see corresponding Javadoc for more information.
[warn] val sslContextFactory = new SslContextFactory()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:492: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
[warn] param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:161: method isRunningLocally in class TaskContext is deprecated: Local execution was removed, so this always returns false
[warn] override def isRunningLocally(): Boolean = taskContext.isRunningLocally()
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/scheduler/StageInfo.scala:59: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
[warn] def attemptNumber(): Int = attemptId
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information.
[warn] if (bootstrap != null && bootstrap.childGroup() != null) {
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithStart started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.memoryDebugFillEnabledInTest started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.offHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.heapMemoryReuse started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorPoolingReUsesLongArrays started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.onHeapMemoryAllocatorThrowsAssertionErrorOnDoubleFree started
[info] Test org.apache.spark.unsafe.PlatformUtilSuite.freeingOffHeapMemoryBlockResetsOffset started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.06s
[info] Test run started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownLongInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownIntegerInputs started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTest started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescending started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.testKnownBytesInputs started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithStart started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestPaddedStrings started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndex started
[info] Test org.apache.spark.unsafe.hash.Murmur3_x86_32Suite.randomizedStressTestBytes started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.testRefWithIntNaturalKey started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescending started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndexWithMax started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.refIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.childIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithSkip started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.copyIndex started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.naturalIndexDescendingWithLast started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndexWithStart started
[info] Test org.apache.spark.util.kvstore.LevelDBIteratorSuite.numericIndex started
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.756s
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/core/target/scala-2.11/spark-core_2.11-2.4.8-SNAPSHOT.jar ...
[info] Test run started
[info] Test org.apache.spark.util.kvstore.ArrayWrappersSuite.testGenericArrayKey started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.001s
[info] Test run started
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.304s
[info] Test org.apache.spark.util.kvstore.LevelDBBenchmark ignored
[info] Test run finished: 0 failed, 1 ignored, 0 total, 0.0s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testBasicIteration started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testMetadata started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testArrayIndices started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.InMemoryStoreSuite.testRemoveAll started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.015s
[info] Test run started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleTypesWriteReadDelete started
[info] Test run started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.titleCase started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.soundex started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.basicTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamUnderflow started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToShort started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.startsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.compareTo started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.levenshteinDistance started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamOverflow started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testObjectWriteReadDelete started
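The kvstore suites above (InMemoryStoreSuite, LevelDBSuite and the iterator suites) cover the org.apache.spark.util.kvstore developer API. A rough sketch of the in-memory store, assuming the @KVIndex natural-key convention these suites rely on (the Note type and its fields are hypothetical):

    import org.apache.spark.util.kvstore.{InMemoryStore, KVIndex}

    // A bean with a natural-key index; the annotated method supplies the key.
    class Note(noteId: String, val text: String) {
      @KVIndex def id: String = noteId
    }

    object KVStoreExample {
      def main(args: Array[String]): Unit = {
        val store = new InMemoryStore()
        store.write(new Note("n1", "hello"))
        // read() looks the object up by its natural key.
        assert(store.read(classOf[Note], "n1").text == "hello")
        assert(store.count(classOf[Note]) == 1L)
      }
    }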
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamIntArray started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.upperAndLower started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testSkip started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToInt started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.createBlankString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.prefix started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.concatWsTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.repeat started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.contains started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.skipWrongFirstByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.emptyStringTest started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStreamSlice started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimBothWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substringSQL started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring_index started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.pad started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMultipleObjectWriteReadDelete started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.split started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trims started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimRightWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.findInSet started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.substring started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.translate started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.reverse started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.trimLeftWithTrimString started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.endsWith started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToByte started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.testToLong started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.writeToOutputStream started
[info] Test org.apache.spark.unsafe.types.UTF8StringSuite.indexOf started
[info] Test run finished: 0 failed, 0 ignored, 38 total, 0.045s
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testReopenAndVersionCheckDb started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testMetadata started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testUpdate started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testRemoveAll started
[info] Test org.apache.spark.util.kvstore.LevelDBSuite.testNegativeIndexValues started
[info] Test run finished: 0 failed, 0 ignored, 9 total, 0.319s
[info] - Success with ack (1 second, 691 milliseconds)
[info] Test run started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testKill started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testLauncher started
[info] Test org.apache.spark.launcher.InProcessLauncherSuite.testErrorPropagation started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.158s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testMissingArg started
[info] TestingUtilsSuite:
[info] - Comparing doubles using relative error. (39 milliseconds)
[info] - Comparing doubles using absolute error. (5 milliseconds)
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testAllOptions started
[info] - Comparing vectors using relative error. (17 milliseconds)
[info] - Comparing vectors using absolute error. (7 milliseconds)
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testEqualSeparatedOption started
[info] Test org.apache.spark.launcher.SparkSubmitOptionParserSuite.testExtraOptions started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.162s
[info] Test run started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkLauncher started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testAlternateSyntaxParsing started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunner started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testSparkRShell started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testMissingAppResource started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testShellCliParser started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testClusterCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testDriverCmdBuilder started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoMainClass started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliKillAndStatus started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerNoArg started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testPySparkFallback started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testExamplesRunnerWithMasterNoMainClass started
[info] Test org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliHelpAndNoArg started
[info] Test run finished: 0 failed, 0 ignored, 15 total, 0.066s
[info] Test run started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testNoRedirectToLog started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithOutputRedirection started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectOutputToLog started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectsSimple started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorToLog started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testProcMonitorWithLogRedirection started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testFailedChildProc started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectErrorTwiceFails started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testBadLogRedirect started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectLastWins started
[info] Test org.apache.spark.launcher.ChildProcAppHandleSuite.testRedirectToLog started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.128s
[info] Test run started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testValidOptionStrings started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testJavaMajorVersion started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testPythonArgQuoting started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testWindowsBatchQuoting started
[info] Test org.apache.spark.launcher.CommandBuilderUtilsSuite.testInvalidOptionStrings started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.004s
[info] Test run started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testTimeout started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testStreamFiltering started
[info] Done packaging.
[info] - Comparing Matrices using absolute error. (289 milliseconds)
[info] - Comparing Matrices using relative error. (10 milliseconds)
[info] Test org.apache.spark.launcher.LauncherServerSuite.testSparkSubmitVmShutsDown started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testLauncherServerReuse started
[info] UtilsSuite:
[info] - EPSILON (3 milliseconds)
[info] Test org.apache.spark.launcher.LauncherServerSuite.testAppHandleDisconnect started
[info] Test org.apache.spark.launcher.LauncherServerSuite.testCommunication started
[info] MatricesSuite:
[info] - dense matrix construction (0 milliseconds)
[info] - dense matrix construction with wrong dimension (1 millisecond)
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.14s
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] - sparse matrix construction (195 milliseconds)
[info] - sparse matrix construction with wrong number of elements (2 milliseconds)
[info] - index in matrices incorrect input (3 milliseconds)
[info] - equals (16 milliseconds)
[info] - matrix copies are deep copies (0 milliseconds)
[info] - matrix indexing and updating (1 millisecond)
[info] - dense to dense (2 milliseconds)
[info] - dense to sparse (2 milliseconds)
[info] - sparse to sparse (4 milliseconds)
[info] - sparse to dense (2 milliseconds)
[info] - compressed dense (4 milliseconds)
[info] - compressed sparse (2 milliseconds)
[info] - map, update (2 milliseconds)
[info] - transpose (1 millisecond)
[info] - foreachActive (1 millisecond)
[info] - horzcat, vertcat, eye, speye (13 milliseconds)
[info] - zeros (1 millisecond)
[info] - ones (2 milliseconds)
[info] - eye (1 millisecond)
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:87: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] fwInfoBuilder.setRole(role)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:224: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] role.foreach { r => builder.setRole(r) }
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:260: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] Option(r.getRole), reservation)
[warn]
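The launcher suites above exercise the public org.apache.spark.launcher API. A rough sketch of launching an application programmatically through that API (all paths and class names below are hypothetical placeholders):

    import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

    object LauncherExample {
      def main(args: Array[String]): Unit = {
        val handle: SparkAppHandle = new SparkLauncher()
          .setSparkHome("/path/to/spark")      // hypothetical
          .setAppResource("/path/to/app.jar")  // hypothetical
          .setMainClass("com.example.Main")    // hypothetical
          .setMaster("local[2]")
          .setConf(SparkLauncher.DRIVER_MEMORY, "1g")
          .startApplication()
        // Poll the handle until the application reaches a terminal state.
        while (!handle.getState.isFinal) Thread.sleep(1000)
        println(s"final state: ${handle.getState}")
      }
    }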
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:263: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] Option(r.getRole), reservation)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:521: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] role.foreach { r => builder.setRole(r) }
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:537: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] (RoleResourceInfo(resource.getRole, reservation),
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] - Failure with nack (1 second, 147 milliseconds)
[info] - rand (142 milliseconds)
[info] - randn (2 milliseconds)
[info] - diag (1 millisecond)
[info] - sprand (9 milliseconds)
[info] - sprandn (2 milliseconds)
[info] - toString (13 milliseconds)
[info] - numNonzeros and numActives (1 millisecond)
[info] - fromBreeze with sparse matrix (15 milliseconds)
Jan 17, 2021 5:19:52 PM com.github.fommil.netlib.BLAS
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
Jan 17, 2021 5:19:52 PM com.github.fommil.netlib.BLAS
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
[info] - row/col iterator (45 milliseconds)
[info] BreezeMatrixConversionSuite:
[info] - dense matrix to breeze (0 milliseconds)
[info] - dense breeze matrix to matrix (1 millisecond)
[info] - sparse matrix to breeze (1 millisecond)
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] - sparse breeze matrix to sparse matrix (1 millisecond)
[info] MultivariateGaussianSuite:
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:259: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val dstream = FlumeUtils.createStream(jssc, hostname, port, storageLevel, enableDecompression)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala:275: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val dstream = FlumeUtils.createPollingStream(
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val stream = FlumeUtils.createPollingStream(ssc, host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumePollingEventCount.scala:56: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val stream = FlumeUtils.createPollingStream(ssc, host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/flume/src/main/scala/org/apache/spark/examples/FlumeEventCount.scala:59: object FlumeUtils in package flume is deprecated: Deprecated without replacement
[warn] val stream = FlumeUtils.createStream(ssc, host, port, StorageLevel.MEMORY_ONLY_SER_2)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/DirectKafkaInputDStream.scala:172: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] val msgs = c.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:100: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:153: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/KafkaDataConsumer.scala:200: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] val p = consumer.poll(timeout)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:633: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:647: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges: JList[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:648: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[(Array[Byte], Array[Byte])] = {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:657: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges: JList[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:658: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] leaders: JMap[TopicAndPartition, Broker]): JavaRDD[Array[Byte]] = {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:670: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges: JList[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:671: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] leaders: JMap[TopicAndPartition, Broker],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:673: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.createRDD[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V](
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:676: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] offsetRanges.toArray(new Array[OffsetRange](offsetRanges.size())),
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:720: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val kc = new KafkaCluster(Map(kafkaParams.asScala.toSeq: _*))
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:721: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.getFromOffsets(
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:725: object KafkaUtils in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] KafkaUtils.createDirectStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder, V](
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:733: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] ): OffsetRange = OffsetRange.create(topic, partition, fromOffset, untilOffset)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: class Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:738: object Broker in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] def createBroker(host: String, port: JInt): Broker = Broker(host, port)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala:740: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] def offsetRangesOfKafkaRDD(rdd: RDD[_]): JList[OffsetRange] = {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:89: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] protected val kc = new KafkaCluster(kafkaParams)
[warn]
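Every kafka-0-8 warning above and below carries the same advice: "Update to Kafka 0.10 integration". A rough sketch of the 0.10-style direct stream that replaces the deprecated KafkaUtils/OffsetRange/Broker API (the broker address, group id and topic below are placeholders):

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

    object Kafka010Example {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kafka-0-10-example").setMaster("local[2]")
        val ssc = new StreamingContext(conf, Seconds(5))
        val kafkaParams = Map[String, Object](
          "bootstrap.servers" -> "localhost:9092", // placeholder
          "key.deserializer" -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id" -> "example-group",           // placeholder
          "auto.offset.reset" -> "latest"
        )
        // LocationStrategies/ConsumerStrategies replace the 0.8-era leader maps.
        val stream = KafkaUtils.createDirectStream[String, String](
          ssc,
          LocationStrategies.PreferConsistent,
          ConsumerStrategies.Subscribe[String, String](Seq("topicA"), kafkaParams))
        stream.map(record => (record.key, record.value)).print()
        ssc.start()
        ssc.awaitTermination()
      }
    }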
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:172: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:217: object KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val leaders = KafkaCluster.checkErrors(kc.findLeaders(topics))
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/DirectKafkaInputDStream.scala:222: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] context.sparkContext, kafkaParams, b.map(OffsetRange(_)), leaders, messageHandler)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:58: trait HasOffsetRanges in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] ) extends RDD[R](sc, Nil) with Logging with HasOffsetRanges {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val offsetRanges: Array[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:55: class OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val offsetRanges: Array[OffsetRange],
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:152: class KafkaCluster in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] val kc = new KafkaCluster(kafkaParams)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-8/src/main/scala/org/apache/spark/streaming/kafka/KafkaRDD.scala:268: object OffsetRange in package kafka is deprecated: Update to Kafka 0.10 integration
[warn] OffsetRange(tp.topic, tp.partition, fo, uo.offset)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:140: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] private val client = new AmazonKinesisClient(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:150: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:219: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getRecordsRequest.setRequestCredentials(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:240: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getShardIteratorRequest.setRequestCredentials(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:606: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead
[warn] KinesisUtils.createStream(jssc.ssc, kinesisAppName, streamName, endpointUrl, regionName,
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:613: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead
[warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName,
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisUtils.scala:616: method createStream in object KinesisUtils is deprecated: Use KinesisInputDStream.builder instead
[warn] KinesisUtils.createStream(jssc, kinesisAppName, streamName, endpointUrl, regionName,
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(credentials)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpoint)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:76: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] .withLongLivedCredentialsProvider(longLivedCreds.provider)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information.
[warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName))
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:187: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information.
[warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] - accuracy - String (2 seconds, 955 milliseconds)
Jan 17, 2021 5:19:53 PM com.github.fommil.netlib.LAPACK
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
Jan 17, 2021 5:19:53 PM com.github.fommil.netlib.LAPACK
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
[info] ScalaTest
[info] Run completed in 18 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Test run started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testBadChallenge started
[info] ScalaTest
[info] Run completed in 14 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 13 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 17 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
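The KinesisUtils.createStream deprecations above point at KinesisInputDStream.builder. A rough sketch of the builder form (stream, region, endpoint and checkpoint app names are placeholders):

    import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kinesis.KinesisInputDStream

    object KinesisBuilderExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kinesis-example").setMaster("local[2]")
        val ssc = new StreamingContext(conf, Seconds(10))
        val stream = KinesisInputDStream.builder
          .streamingContext(ssc)
          .streamName("myStream")                                 // placeholder
          .endpointUrl("https://kinesis.us-west-2.amazonaws.com") // placeholder
          .regionName("us-west-2")                                // placeholder
          .initialPositionInStream(InitialPositionInStream.LATEST)
          .checkpointAppName("myKinesisApp")                      // placeholder
          .checkpointInterval(Seconds(10))
          .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
          .build()
        // By default each record arrives as a raw Array[Byte].
        stream.map(bytes => bytes.length).print()
        ssc.start()
        ssc.awaitTermination()
      }
    }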
[info] - univariate (427 milliseconds)
[info] - multivariate (8 milliseconds)
[info] - multivariate degenerate (0 milliseconds)
[info] - SPARK-11302 (5 milliseconds)
[info] BreezeVectorConversionSuite:
[info] - dense to breeze (1 millisecond)
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testWrongAppId started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testWrongNonce started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testMismatchedSecret started
[info] - sparse to breeze (110 milliseconds)
[info] - dense breeze to vector (1 millisecond)
[info] - sparse breeze to vector (0 milliseconds)
[info] - sparse breeze with partially-used arrays to vector (0 milliseconds)
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessage started
[info] VectorsSuite:
[info] - dense vector construction with varargs (0 milliseconds)
[info] - dense vector construction from a double array (0 milliseconds)
[info] - sparse vector construction (1 millisecond)
[info] - sparse vector construction with unordered elements (2 milliseconds)
[info] - sparse vector construction with mismatched indices/values array (1 millisecond)
[info] - sparse vector construction with too many indices vs size (1 millisecond)
[info] - sparse vector construction with negative indices (0 milliseconds)
[info] - dense to array (0 milliseconds)
[info] - dense argmax (1 millisecond)
[info] - sparse to array (0 milliseconds)
[info] - sparse argmax (0 milliseconds)
[info] - vector equals (2 milliseconds)
[info] - vectors equals with explicit 0 (1 millisecond)
[info] - indexing dense vectors (1 millisecond)
[info] - indexing sparse vectors (0 milliseconds)
[info] - zeros (0 milliseconds)
[info] - Vector.copy (1 millisecond)
[info] - fromBreeze (1 millisecond)
[info] - sqdist (49 milliseconds)
[info] - foreachActive (3 milliseconds)
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testEncryptedMessageWhenTransferringZeroBytes started
[info] - vector p-norm (5 milliseconds)
[info] - Vector numActive and numNonzeros (2 milliseconds)
[info] - Vector toSparse and toDense (1 millisecond)
[info] - Vector.compressed (1 millisecond)
[info] - SparseVector.slice (1 millisecond)
[info] - sparse vector only support non-negative length (1 millisecond)
[info] BLASSuite:
[info] - copy (4 milliseconds)
[info] - scal (0 milliseconds)
[info] - axpy (1 millisecond)
[info] - dot (1 millisecond)
[info] - spr (2 milliseconds)
[info] - syr (4 milliseconds)
[info] - gemm (3 milliseconds)
[info] - gemv (3 milliseconds)
[info] - spmv (2 milliseconds)
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testAuthEngine started
[info] Test org.apache.spark.network.crypto.AuthEngineSuite.testBadKeySize started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.442s
[info] Test run started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.streamStatesAreFreedWhenConnectionIsClosedEvenIfBufferIteratorThrowsException started
[info] Test org.apache.spark.network.server.OneForOneStreamManagerSuite.managedBuffersAreFeedWhenConnectionIsClosed started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.062s
[info] Test run started
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.furtherRequestsDelay started
[info] - Failure with timeout (1 second, 120 milliseconds)
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchUnregisteredExecutor started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongExecutor started
[info] - Multiple consumers (1 second, 511 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNoServer started
[info] - mergeInPlace - String (2 seconds, 217 milliseconds)
[info] - incompatible merge (2 milliseconds)
[info] CountMinSketchSuite:
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testRegisterInvalidExecutor started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchThreeSort started
[info] - accuracy - Byte (261 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchWrongBlockId started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchNonexistent started
[info] - mergeInPlace - Byte (241 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleIntegrationSuite.testFetchOneSort started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 1.703s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testRetryAndUnrecoverable started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testSingleIOExceptionOnFirst started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testUnrecoverableFailure started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testSingleIOExceptionOnSecond started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testThreeIOExceptions started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testNoFailures started
[info] Test org.apache.spark.network.shuffle.RetryingBlockFetcherSuite.testTwoIOExceptions started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.205s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadSecret started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testBadAppId started
[info] - accuracy - Short (586 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testValid started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption started
[info] - Multiple consumers with some failures (1 second, 400 milliseconds)
[info] - mergeInPlace - Short (229 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.435s
[info] Test run started
[info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testGoodClient started
[info] ScalaTest
[info] Run completed in 8 seconds, 65 milliseconds.
[info] Total number of tests run: 19
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 19, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
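The CountMinSketchSuite results above cover the other half of the common/sketch module. A rough usage sketch, with eps, confidence and seed chosen arbitrarily:

    import org.apache.spark.util.sketch.CountMinSketch

    object CountMinSketchExample {
      def main(args: Array[String]): Unit = {
        // create(eps, confidence, seed): relative error eps at the given confidence.
        val sketch = CountMinSketch.create(0.001, 0.99, 42)
        (0 until 10000).foreach(i => sketch.addLong(i % 100))
        // Each residue 0..99 was added ~100 times; estimates can overcount, never undercount.
        println(sketch.estimateCount(7L))

        // mergeInPlace combines sketches built with identical parameters, as the suite checks.
        val other = CountMinSketch.create(0.001, 0.99, 42)
        other.addLong(7L)
        sketch.mergeInPlace(other)
      }
    }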
[info] Passed: Total 81, Failed 0, Errors 0, Passed 81, Ignored 1 [info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslClient started [info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testNoSaslServer started [info] Test org.apache.spark.network.sasl.SaslIntegrationSuite.testBadClient started [info] Test run finished: 0 failed, 0 ignored, 4 total, 0.384s [info] Test run started [info] Test org.apache.spark.network.sasl.ShuffleSecretManagerSuite.testMultipleRegisters started [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.002s [info] - accuracy - Int (463 milliseconds) [info] Test run started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupUsesExecutorWithShuffleFiles started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupOnlyRegisteredExecutorWithShuffleFiles started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupOnRemovedExecutorWithShuffleFiles started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupOnlyRemovedExecutorWithoutShuffleFiles started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupOnRemovedExecutorWithoutShuffleFiles started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupOnlyRemovedExecutorWithShuffleFiles started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupOnlyRegisteredExecutorWithoutShuffleFiles started [info] Test org.apache.spark.network.shuffle.NonShuffleFilesCleanupSuite.cleanupUsesExecutorWithoutShuffleFiles started [info] Test run finished: 0 failed, 0 ignored, 8 total, 0.1s [info] Test run started [info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testSaslAppIsolation started [info] - mergeInPlace - Int (199 milliseconds) [info] Test org.apache.spark.network.shuffle.AppIsolationSuite.testAuthEngineAppIsolation started [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.378s [info] Test run started [info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testNormalizeAndInternPathname started [info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testSortShuffleBlocks started [info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.testBadRequests started [info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockResolverSuite.jsonSerializationOfExecutorRegistration started [info] Test run finished: 0 failed, 0 ignored, 4 total, 0.214s [info] Test run started [info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockHandlerSuite.testOpenShuffleBlocks started [info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockHandlerSuite.testRegisterExecutor started [info] Test org.apache.spark.network.shuffle.ExternalShuffleBlockHandlerSuite.testBadMessages started [info] Test run finished: 0 failed, 0 ignored, 3 total, 0.023s [info] Test run started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testEmptyBlockFetch started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailure started [info] - accuracy - Long (535 milliseconds) [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFailureAndSuccess started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchThree started [info] Test org.apache.spark.network.shuffle.OneForOneBlockFetcherSuite.testFetchOne started [info] Test run finished: 0 failed, 0 
ignored, 5 total, 0.009s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.BlockTransferMessagesSuite.serializeOpenShuffleBlocks started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.001s
[info] Test run started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupOnlyRemovedApp started
[info] JdbcRDDSuite:
[info] - mergeInPlace - Long (225 milliseconds)
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupUsesExecutor started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.noCleanupAndCleanup started
[info] Test org.apache.spark.network.shuffle.ExternalShuffleCleanupSuite.cleanupMultipleExecutors started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 1.998s
[info] - accuracy - String (1 second, 976 milliseconds)
[info] - basic functionality (2 seconds, 417 milliseconds)
[info] DistributedSuite:
[info] - mergeInPlace - String (2 seconds, 14 milliseconds)
[info] - large id overflow (481 milliseconds)
[info] SparkUncaughtExceptionHandlerSuite:
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:128: value ENABLE_JOB_SUMMARY in object ParquetOutputFormat is deprecated: see corresponding Javadoc for more information.
[warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:360: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] new org.apache.parquet.hadoop.ParquetInputSplit(
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:371: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:544: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/TriggerExecutor.scala:46: class ProcessingTime in package streaming is deprecated: use Trigger.ProcessingTime(intervalMs)
[warn] case class ProcessingTimeExecutor(processingTime: ProcessingTime, clock: Clock = new SystemClock())
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:119: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:139: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:187: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:217: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:293: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaDataConsumer.scala:470: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] val p = consumer.poll(pollTimeoutMs)
[warn]
[info] - SPARK-30310: Test uncaught RuntimeException, exitOnUncaughtException = true (1 second, 378 milliseconds)
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:138: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] object OneHotEncoder extends DefaultParamsReadable[OneHotEncoder] {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/mllib/src/main/scala/org/apache/spark/ml/feature/OneHotEncoder.scala:141: class OneHotEncoder in package feature is deprecated: `OneHotEncoderEstimator` will be renamed `OneHotEncoder` and this `OneHotEncoder` will be removed in 3.0.0.
[warn] override def load(path: String): OneHotEncoder = super.load(path)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:53: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path
[warn] if (addedClasspath != "") {
[warn]
[warn] /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:54: method addedClasspath in class ILoop is deprecated: Use reset, replay or require to update class path
[warn] settings.classpath append addedClasspath
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutCleanlyClosesClient started
[info] - SPARK-30310: Test uncaught RuntimeException, exitOnUncaughtException = false (1 second, 990 milliseconds)
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/examples/target/scala-2.11/jars/spark-examples_2.11-2.4.8-SNAPSHOT.jar ...
[info] Done packaging.
[info] ScalaTest
[info] Run completed in 15 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
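Each [warn] block above names its own replacement. A minimal migration sketch following those hints, assuming Spark 2.4's Trigger.ProcessingTime and OneHotEncoderEstimator, the Duration-based KafkaConsumer.poll added in kafka-clients 2.0, and parquet-hadoop 1.10's InputFile-based footer reader; the method names (startWithTrigger, pollOnce, footerMetaData) are illustrative, not Spark APIs:

    import java.time.Duration
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.Path
    import org.apache.kafka.clients.consumer.KafkaConsumer
    import org.apache.parquet.hadoop.ParquetFileReader
    import org.apache.parquet.hadoop.util.HadoopInputFile
    import org.apache.spark.ml.feature.OneHotEncoderEstimator
    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.streaming.Trigger

    object DeprecationMigrations {
      // ProcessingTime -> Trigger.ProcessingTime, per the TriggerExecutor warning.
      def startWithTrigger(df: DataFrame) =
        df.writeStream
          .format("console")
          .trigger(Trigger.ProcessingTime("10 seconds"))
          .start()

      // consumer.poll(0) -> poll(Duration), per the KafkaOffsetReader warnings.
      def pollOnce(consumer: KafkaConsumer[String, String]) =
        consumer.poll(Duration.ZERO)

      // ParquetFileReader.readFooter -> the InputFile-based reader.
      def footerMetaData(conf: Configuration, path: Path) = {
        val reader = ParquetFileReader.open(HadoopInputFile.fromPath(path, conf))
        try reader.getFooter.getFileMetaData finally reader.close()
      }

      // OneHotEncoder -> OneHotEncoderEstimator (renamed back to OneHotEncoder in 3.0).
      val encoder = new OneHotEncoderEstimator()
        .setInputCols(Array("categoryIndex"))
        .setOutputCols(Array("categoryVec"))
    }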
[info] - task throws not serializable exception (5 seconds, 978 milliseconds) [info] - local-cluster format (4 milliseconds) [info] - SPARK-30310: Test uncaught OutOfMemoryError, exitOnUncaughtException = true (1 second, 696 milliseconds) [info] - accuracy - Byte array (6 seconds, 47 milliseconds) [info] - SPARK-30310: Test uncaught OutOfMemoryError, exitOnUncaughtException = false (1 second, 355 milliseconds) [info] - mergeInPlace - Byte array (1 second, 959 milliseconds) [info] - incompatible merge (1 millisecond) [info] - SPARK-30310: Test uncaught SparkFatalException(RuntimeException), exitOnUncaughtException = true (1 second, 265 milliseconds) [info] FlumePollingStreamSuite: [info] - simple groupByKey (3 seconds, 758 milliseconds) [info] - SPARK-30310: Test uncaught SparkFatalException(RuntimeException), exitOnUncaughtException = false (1 second, 249 milliseconds) [info] - SPARK-30310: Test uncaught SparkFatalException(OutOfMemoryError), exitOnUncaughtException = true (1 second, 328 milliseconds) [info] - SPARK-30310: Test uncaught SparkFatalException(OutOfMemoryError), exitOnUncaughtException = false (1 second, 303 milliseconds) [info] PipedRDDSuite: [info] - basic pipe (111 milliseconds) [info] - basic pipe with tokenization (90 milliseconds) [info] - failure in iterating over pipe input (82 milliseconds) [info] - stdin writer thread should be exited when task is finished (56 milliseconds) [info] - groupByKey where map output sizes exceed maxMbInFlight (3 seconds, 906 milliseconds) [info] Test org.apache.spark.network.RequestTimeoutIntegrationSuite.timeoutInactiveRequests started [info] - advanced pipe (708 milliseconds) [info] - pipe with empty partition (117 milliseconds) [info] - pipe with env variable (40 milliseconds) [info] - pipe with process which cannot be launched due to bad command (30 milliseconds) cat: nonexistent_file: No such file or directory cat: nonexistent_file: No such file or directory [info] - pipe with process which is launched but fails with non-zero exit status (35 milliseconds) [info] - basic pipe with separate working directory (119 milliseconds) [info] - test pipe exports map_input_file (73 milliseconds) [info] - test pipe exports mapreduce_map_input_file (30 milliseconds) [info] AccumulatorV2Suite: [info] - LongAccumulator add/avg/sum/count/isZero (1 millisecond) [info] - DoubleAccumulator add/avg/sum/count/isZero (1 millisecond) [info] - ListAccumulator (1 millisecond) [info] - LegacyAccumulatorWrapper (1 millisecond) [info] - LegacyAccumulatorWrapper with AccumulatorParam that has no equals/hashCode (3 milliseconds) [info] FileSuite: [info] - text files (429 milliseconds) [info] - text files (compressed) (538 milliseconds) [info] - SequenceFiles (313 milliseconds) [info] - SequenceFile (compressed) (393 milliseconds) [info] - SequenceFile with writable key (237 milliseconds) [info] - SequenceFile with writable value (237 milliseconds) [info] - SequenceFile with writable key and value (226 milliseconds) [info] - accumulators (3 seconds, 240 milliseconds) [info] - implicit conversions in reading SequenceFiles (345 milliseconds) [info] - object files of ints (218 milliseconds) [info] - object files of complex types (235 milliseconds) [info] - object files of classes from a JAR (1 second, 31 milliseconds) [info] - write SequenceFile using new Hadoop API (231 milliseconds) [info] - read SequenceFile using new Hadoop API (267 milliseconds) [info] - binary file input as byte array (190 milliseconds) [info] - portabledatastream caching tests (180 milliseconds) 
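The AccumulatorV2Suite entries above ("LongAccumulator add/avg/sum/count/isZero", ...) cover the built-in numeric accumulators. A minimal sketch, assuming Spark 2.4's SparkContext.longAccumulator; the local[2] master and accumulator name are illustrative:

    import org.apache.spark.sql.SparkSession

    object AccumulatorExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[2]").appName("acc").getOrCreate()
        val sc = spark.sparkContext

        // LongAccumulator is the built-in AccumulatorV2 exercised by the
        // "LongAccumulator add/avg/sum/count/isZero" test above.
        val acc = sc.longAccumulator("events")
        sc.parallelize(1 to 100).foreach(n => acc.add(n)) // adds happen on executors

        // Values are merged back and read on the driver.
        println((acc.sum, acc.count, acc.avg, acc.isZero)) // (5050, 100, 50.5, false)
        spark.stop()
      }
    }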
[info] - portabledatastream persist disk storage (211 milliseconds) [info] - broadcast variables (3 seconds, 72 milliseconds) [info] - portabledatastream flatmap tests (153 milliseconds) [info] - SPARK-22357 test binaryFiles minPartitions (443 milliseconds) [info] - minimum split size per node and per rack should be less than or equal to maxSplitSize (126 milliseconds) [info] - fixed record length binary file as byte array (128 milliseconds) [info] - negative binary record length should raise an exception (110 milliseconds) [info] - file caching (140 milliseconds) [info] - prevent user from overwriting the empty directory (old Hadoop API) (81 milliseconds) [info] - prevent user from overwriting the non-empty directory (old Hadoop API) (145 milliseconds) [info] - allow user to disable the output directory existence checking (old Hadoop API) (205 milliseconds) [info] - prevent user from overwriting the empty directory (new Hadoop API) (67 milliseconds) [info] - prevent user from overwriting the non-empty directory (new Hadoop API) (169 milliseconds) [info] - allow user to disable the output directory existence checking (new Hadoop API (205 milliseconds) [info] - save Hadoop Dataset through old Hadoop API (125 milliseconds) [info] - save Hadoop Dataset through new Hadoop API (136 milliseconds) [info] - Get input files via old Hadoop API (216 milliseconds) [info] - Get input files via new Hadoop API (221 milliseconds) [info] - spark.files.ignoreCorruptFiles should work both HadoopRDD and NewHadoopRDD (313 milliseconds) [info] - repeatedly failing task (3 seconds, 358 milliseconds) [info] - spark.hadoopRDD.ignoreEmptySplits work correctly (old Hadoop API) (516 milliseconds) [info] Test run finished: 0 failed, 0 ignored, 3 total, 31.875s [info] Test run started [info] Test org.apache.spark.network.ProtocolSuite.responses started [info] Test org.apache.spark.network.ProtocolSuite.requests started [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.029s [info] Test run started [info] Test org.apache.spark.network.TransportClientFactorySuite.reuseClientsUpToConfigVariable started [info] Test org.apache.spark.network.TransportClientFactorySuite.reuseClientsUpToConfigVariableConcurrent started [info] - spark.hadoopRDD.ignoreEmptySplits work correctly (new Hadoop API) (445 milliseconds) [info] Test org.apache.spark.network.TransportClientFactorySuite.closeFactoryBeforeCreateClient started [info] Test org.apache.spark.network.TransportClientFactorySuite.closeBlockClientsWithFactory started [info] - spark.files.ignoreMissingFiles should work both HadoopRDD and NewHadoopRDD (417 milliseconds) [info] LogPageSuite: [info] Test org.apache.spark.network.TransportClientFactorySuite.neverReturnInactiveClients started [info] Test org.apache.spark.network.TransportClientFactorySuite.closeIdleConnectionForRequestTimeOut started [info] - flume polling test (13 seconds, 842 milliseconds) [info] - get logs simple (237 milliseconds) [info] PartiallyUnrolledIteratorSuite: [info] - join two iterators (47 milliseconds) [info] HistoryServerDiskManagerSuite: [info] - leasing space (130 milliseconds) [info] - tracking active stores (23 milliseconds) [info] - approximate size heuristic (1 millisecond) [info] - SPARK-32024: update ApplicationStoreInfo.size during initializing (32 milliseconds) [info] ExternalShuffleServiceSuite: [info] - groupByKey without compression (247 milliseconds) [info] Test org.apache.spark.network.TransportClientFactorySuite.returnDifferentClientsForDifferentServers started [info] Test run 
finished: 0 failed, 0 ignored, 7 total, 2.324s [info] Test run started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslClientFallback started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testSaslServerFallback started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthReplay started [info] - shuffle non-zero block size (3 seconds, 945 milliseconds) [info] - repeatedly failing task that crashes JVM (7 seconds, 150 milliseconds) [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testNewAuth started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testLargeMessageEncryption started [info] Test org.apache.spark.network.crypto.AuthIntegrationSuite.testAuthFailure started [info] Test run finished: 0 failed, 0 ignored, 6 total, 5.662s [info] Test run started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamConcurrently started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendOneWayMessage started [info] Test org.apache.spark.network.RpcIntegrationSuite.singleRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.throwErrorRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.doubleTrouble started [info] Test org.apache.spark.network.RpcIntegrationSuite.doubleRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.returnErrorRPC started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamFailures started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendRpcWithStreamOneAtATime started [info] Test org.apache.spark.network.RpcIntegrationSuite.sendSuccessAndFailure started [info] Test run finished: 0 failed, 0 ignored, 10 total, 0.907s [info] Test run started [info] Test org.apache.spark.network.crypto.TransportCipherSuite.testBufferNotLeaksOnInternalError started [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.014s [info] Test run started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnException started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.failOutstandingStreamCallbackOnClose started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.testActiveStreams started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulFetch started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedRPC started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleFailedFetch started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.handleSuccessfulRPC started [info] Test org.apache.spark.network.TransportResponseHandlerSuite.clearAllOutstandingRequests started [info] Test run finished: 0 failed, 0 ignored, 8 total, 0.022s [info] Test run started [info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testEmptyFrame started [info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testNegativeFrameSize started [info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testSplitLengthField started [info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testFrameDecoding started [info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testInterception started [info] Test org.apache.spark.network.util.TransportFrameDecoderSuite.testRetainedFrames started [info] Test run finished: 0 failed, 0 ignored, 6 total, 0.086s [info] Test run started [info] Test 
org.apache.spark.network.protocol.MessageWithHeaderSuite.testDeallocateReleasesManagedBuffer started [info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testByteBufBody started [info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testShortWrite started [info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodySingleBuffer started [info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testCompositeByteBufBodyMultipleBuffers started [info] Test org.apache.spark.network.protocol.MessageWithHeaderSuite.testSingleWrite started [info] Test run finished: 0 failed, 0 ignored, 6 total, 0.003s [info] Test run started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testNonMatching started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslAuthentication started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessageChunking started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testServerAlwaysEncrypt started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDataEncryptionIsActuallyEnabled started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption started [info] - shuffle serializer (3 seconds, 566 milliseconds) [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testDelegates started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testEncryptedMessage started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testMatching started [info] Test org.apache.spark.network.sasl.SparkSaslSuite.testRpcHandlerDelegate started [info] Test run finished: 0 failed, 0 ignored, 11 total, 0.505s [info] Test run started [info] Test org.apache.spark.network.util.CryptoUtilsSuite.testConfConversion started [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.0s [info] Test run started [info] Test org.apache.spark.network.TransportRequestHandlerSuite.handleFetchRequestAndStreamRequest started [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.017s [info] Test run started [info] Test org.apache.spark.network.crypto.AuthMessagesSuite.testServerResponse started [info] Test org.apache.spark.network.crypto.AuthMessagesSuite.testClientChallenge started [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.001s [info] Test run started [info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testGeneralNettyMemoryMetrics started [info] Test org.apache.spark.network.util.NettyMemoryMetricsSuite.testAdditionalMetrics started [info] Test run finished: 0 failed, 0 ignored, 2 total, 0.129s [info] Test run started [info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchNonExistentChunk started [info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchFileChunk started [info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBothChunks started [info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchChunkAndNonExistent started [info] Test org.apache.spark.network.ChunkFetchIntegrationSuite.fetchBufferChunk started [info] Test run finished: 0 failed, 0 ignored, 5 total, 0.441s [info] Test run started [info] Test org.apache.spark.network.StreamSuite.testSingleStream started [info] Test org.apache.spark.network.StreamSuite.testMultipleStreams started [info] Test org.apache.spark.network.StreamSuite.testConcurrentStreams started [info] Test 
org.apache.spark.network.StreamSuite.testZeroLengthStream started [info] Test run finished: 0 failed, 0 ignored, 4 total, 0.292s [info] MesosSchedulerUtilsSuite: [info] - use at-least minimum overhead (332 milliseconds) [info] - use overhead if it is greater than minimum value (7 milliseconds) [info] - use spark.mesos.executor.memoryOverhead (if set) (3 milliseconds) [info] - parse a non-empty constraint string correctly (28 milliseconds) [info] - parse an empty constraint string correctly (1 millisecond) [info] - throw an exception when the input is malformed (5 milliseconds) [info] - empty values for attributes' constraints matches all values (40 milliseconds) [info] - subset match is performed for set attributes (7 milliseconds) [info] - less than equal match is performed on scalar attributes (5 milliseconds) [info] - contains match is performed for range attributes (44 milliseconds) [info] - equality match is performed for text attributes (2 milliseconds) [info] - Port reservation is done correctly with user specified ports only (40 milliseconds) [info] - Port reservation is done correctly with all random ports (2 milliseconds) [info] - Port reservation is done correctly with user specified ports only - multiple ranges (3 milliseconds) [info] - Port reservation is done correctly with all random ports - multiple ranges (2 milliseconds) [info] - Principal specified via spark.mesos.principal (21 milliseconds) [info] - Principal specified via spark.mesos.principal.file (26 milliseconds) [info] - Principal specified via spark.mesos.principal.file that does not exist (6 milliseconds) [info] - Principal specified via SPARK_MESOS_PRINCIPAL (7 milliseconds) [info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE (2 milliseconds) [info] - Principal specified via SPARK_MESOS_PRINCIPAL_FILE that does not exist (2 milliseconds) [info] - Secret specified via spark.mesos.secret (2 milliseconds) [info] - Principal specified via spark.mesos.secret.file (2 milliseconds) [info] - Principal specified via spark.mesos.secret.file that does not exist (2 milliseconds) [info] - Principal specified via SPARK_MESOS_SECRET (1 millisecond) [info] - Principal specified via SPARK_MESOS_SECRET_FILE (1 millisecond) [info] - Secret specified with no principal (2 milliseconds) [info] - Principal specification preference (1 millisecond) [info] - Secret specification preference (1 millisecond) [info] MesosSchedulerBackendUtilSuite: [info] - ContainerInfo fails to parse invalid docker parameters (163 milliseconds) [info] - ContainerInfo parses docker parameters (3 milliseconds) [info] - SPARK-28778 ContainerInfo respects Docker network configuration (29 milliseconds) [info] MesosFineGrainedSchedulerBackendSuite: [info] - weburi is set in created scheduler driver (61 milliseconds) [info] - Use configured mesosExecutor.cores for ExecutorInfo (68 milliseconds) [info] - check spark-class location correctly (9 milliseconds) [info] - spark docker properties correctly populate the DockerInfo message (20 milliseconds) [info] - mesos resource offers result in launching tasks (81 milliseconds) [info] - can handle multiple roles (10 milliseconds) [info] MesosCoarseGrainedSchedulerBackendSuite: [info] - flume polling test multiple hosts (13 seconds, 840 milliseconds) [info] - mesos supports killing and limiting executors (1 second, 718 milliseconds) [info] FlumeStreamSuite: [info] - mesos supports killing and relaunching tasks with executors (204 milliseconds) [info] - mesos supports spark.executor.cores (136 milliseconds) [info] 
- mesos supports unset spark.executor.cores (147 milliseconds) [info] - zero sized blocks (6 seconds, 411 milliseconds) [info] - mesos does not acquire more than spark.cores.max (100 milliseconds) [info] - mesos does not acquire gpus if not specified (109 milliseconds) [info] - mesos does not acquire more than spark.mesos.gpus.max (93 milliseconds) [info] - mesos declines offers that violate attribute constraints (128 milliseconds) [info] - flume input stream (1 second, 160 milliseconds) [info] - mesos declines offers with a filter when reached spark.cores.max (89 milliseconds) [info] - mesos declines offers with a filter when maxCores not a multiple of executor.cores (92 milliseconds) [info] - mesos declines offers with a filter when reached spark.cores.max with executor.cores (118 milliseconds) [info] - mesos assigns tasks round-robin on offers (111 milliseconds) [info] - mesos creates multiple executors on a single slave (143 milliseconds) [info] - mesos doesn't register twice with the same shuffle service (80 milliseconds) [info] - flume input compressed stream (998 milliseconds) [info] - Port offer decline when there is no appropriate range (117 milliseconds) [info] Test run started [info] Test org.apache.spark.streaming.flume.JavaFlumeStreamSuite.testFlumeStream started [info] - Port offer accepted when ephemeral ports are used (85 milliseconds) [info] - Port offer accepted with user defined port numbers (97 milliseconds) [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.198s [info] Test run started [info] Test org.apache.spark.streaming.flume.JavaFlumePollingStreamSuite.testFlumeStream started [info] - mesos kills an executor when told (98 milliseconds) [info] - repeatedly failing task that crashes JVM with a zero exit code (SPARK-16925) (11 seconds, 14 milliseconds) [info] Test run finished: 0 failed, 0 ignored, 1 total, 0.14s [info] - weburi is set in created scheduler driver (97 milliseconds) [info] - failover timeout is set in created scheduler driver (93 milliseconds) [info] - honors unset spark.mesos.containerizer (228 milliseconds) [info] - honors spark.mesos.containerizer="mesos" (183 milliseconds) [info] - docker settings are reflected in created tasks (198 milliseconds) [info] - force-pull-image option is disabled by default (163 milliseconds) [info] LabelPropagationSuite: [info] - mesos supports spark.executor.uri (144 milliseconds) [info] - mesos supports setting fetcher cache (157 milliseconds) [info] - mesos supports disabling fetcher cache (125 milliseconds) [info] - mesos sets task name to spark.app.name (120 milliseconds) [info] - mesos sets configurable labels on tasks (120 milliseconds) [info] - mesos supports spark.mesos.network.name and spark.mesos.network.labels (89 milliseconds) [info] - SPARK-28778 '--hostname' shouldn't be set for executor when virtual network is enabled (733 milliseconds) [info] - supports spark.scheduler.minRegisteredResourcesRatio (212 milliseconds) [info] - zero sized blocks without kryo (6 seconds, 130 milliseconds) [info] - caching (encryption = off) (5 seconds, 234 milliseconds) [info] - shuffle on mutable pairs (3 seconds, 538 milliseconds) [info] - caching (encryption = on) (4 seconds, 266 milliseconds) [info] - Label Propagation (8 seconds, 83 milliseconds) [info] BytecodeUtilsSuite: [info] - closure invokes a method (7 milliseconds) [info] - closure inside a closure invokes a method (2 milliseconds) [info] - closure inside a closure inside a closure invokes a method (3 milliseconds) [info] - closure calling a function that 
invokes a method (2 milliseconds) [info] - closure calling a function that invokes a method which uses another closure (3 milliseconds) [info] - nested closure (3 milliseconds) [info] PregelSuite: [info] - supports data locality with dynamic allocation (6 seconds, 229 milliseconds) [info] - Creates an env-based reference secrets. (157 milliseconds) [info] - Creates an env-based value secrets. (81 milliseconds) [info] - Creates file-based reference secrets. (79 milliseconds) [info] - Creates a file-based value secrets. (89 milliseconds) [info] - 1 iteration (748 milliseconds) [info] MesosClusterSchedulerSuite: [info] - can queue drivers (44 milliseconds) [info] - can kill queued drivers (27 milliseconds) [info] - can handle multiple roles (44 milliseconds) [info] - escapes commandline args for the shell (51 milliseconds) [info] - supports spark.mesos.driverEnv.* (26 milliseconds) [info] - supports spark.mesos.network.name and spark.mesos.network.labels (26 milliseconds) [info] - supports setting fetcher cache on the dispatcher (27 milliseconds) [info] - supports setting fetcher cache in the submission (26 milliseconds) [info] - supports disabling fetcher cache (30 milliseconds) [info] - accept/decline offers with driver constraints (49 milliseconds) [info] - supports spark.mesos.driver.labels (41 milliseconds) [info] - can kill supervised drivers (42 milliseconds) [info] - sorting on mutable pairs (3 seconds, 424 milliseconds) [info] - SPARK-27347: do not restart outdated supervised drivers (1 second, 542 milliseconds) [info] - Declines offer with refuse seconds = 120. (25 milliseconds) [info] - Creates an env-based reference secrets. (23 milliseconds) [info] - Creates an env-based value secrets. (30 milliseconds) [info] - Creates file-based reference secrets. (29 milliseconds) [info] - Creates a file-based value secrets. 
(37 milliseconds)
[info] MesosClusterDispatcherSuite:
[info] - prints usage on empty input (14 milliseconds)
[info] - prints usage with only --help (2 milliseconds)
[info] - prints error with unrecognized options (2 milliseconds)
[info] MesosClusterManagerSuite:
[info] - mesos fine-grained (59 milliseconds)
[info] - mesos coarse-grained (63 milliseconds)
[info] - chain propagation (2 seconds, 388 milliseconds)
[info] PeriodicGraphCheckpointerSuite:
[info] - mesos with zookeeper (69 milliseconds)
[info] - Persisting (137 milliseconds)
[info] - mesos with i/o encryption throws error (133 milliseconds)
[info] MesosClusterDispatcherArgumentsSuite:
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.test.home,/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.mesos.key2,value2)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test if spark config args are passed successfully (15 milliseconds)
(spark.testing,true)
(spark.ui.showConsoleProgress,false)
(spark.master.rest.enabled,false)
(spark.test.home,/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6)
(spark.ui.enabled,false)
(spark.unsafe.exceptionOnMemoryLeak,true)
(spark.memory.debugFill,true)
(spark.port.maxRetries,100)
[info] - test non conf settings (2 milliseconds)
[info] MesosProtoUtilsSuite:
[info] - mesosLabels (1 millisecond)
[info] - caching on disk (encryption = off) (3 seconds, 747 milliseconds)
[info] ReliableKafkaStreamSuite:
[info] - cogroup using mutable pairs (3 seconds, 283 milliseconds)
[info] - Checkpointing (2 seconds, 835 milliseconds)
[info] ConnectedComponentsSuite:
[info] - caching on disk (encryption = on) (3 seconds, 950 milliseconds)
[info] - subtract mutable pairs (3 seconds, 690 milliseconds)
[info] - Reliable Kafka input stream with single topic (2 seconds, 229 milliseconds)
[info] - Grid Connected Components (3 seconds, 272 milliseconds)
[info] - Reliable Kafka input stream with multiple topics (869 milliseconds)
[info] KafkaStreamSuite:
[info] - caching in memory, replicated (encryption = off) (4 seconds, 309 milliseconds)
[info] - sort with Java non serializable class - Kryo (3 seconds, 804 milliseconds)
[info] - Reverse Grid Connected Components (3 seconds, 147 milliseconds)
[info] - Kafka input stream (1 second, 357 milliseconds)
[info] DirectKafkaStreamSuite:
[info] - basic stream receiving with multiple topics and smallest starting offset (940 milliseconds)
[info] - receiving from largest starting offset (250 milliseconds)
[info] - creating stream by offset (306 milliseconds)
[info] - sort with Java non serializable class - Java (2 seconds, 960 milliseconds)
[info] - caching in memory, replicated (encryption = off) (with replication as stream) (3 seconds, 871 milliseconds)
[info] - shuffle with different compression settings (SPARK-3426) (451 milliseconds)
[info] - [SPARK-4085] rerun map stage if reduce stage cannot find its local shuffle file (448 milliseconds)
[info] - cannot find its local shuffle file if no execution of the stage and rerun shuffle (133 milliseconds)
[info] - metrics for shuffle without aggregation (502 milliseconds)
[info] - offset recovery (2 seconds, 135 milliseconds)
[info] - Chain Connected Components (4 seconds, 877 milliseconds)
[info] - metrics for shuffle with aggregation (660 milliseconds)
[info] - multiple simultaneous attempts for one task (SPARK-8029) (87 milliseconds)
[info] - Direct Kafka stream report input
information (657 milliseconds) [info] - maxMessagesPerPartition with backpressure disabled (68 milliseconds) [info] - maxMessagesPerPartition with no lag (103 milliseconds) [info] - maxMessagesPerPartition respects max rate (95 milliseconds) [info] - using rate controller (588 milliseconds) [info] - use backpressure.initialRate with backpressure (266 milliseconds) [info] - backpressure.initialRate should honor maxRatePerPartition (220 milliseconds) [info] - maxMessagesPerPartition with zero offset and rate equal to one (85 milliseconds) [info] KafkaRDDSuite: [info] - caching in memory, replicated (encryption = on) (4 seconds, 61 milliseconds) [info] - basic usage (215 milliseconds) [info] - iterator boundary conditions (214 milliseconds) [info] KafkaClusterSuite: [info] - metadata apis (12 milliseconds) [info] - leader offset apis (5 milliseconds) [info] - consumer offset apis (16 milliseconds) [info] Test run started [info] Test org.apache.spark.streaming.kafka.JavaKafkaStreamSuite.testKafkaStream started [info] - using external shuffle service (4 seconds, 128 milliseconds) [info] ConfigEntrySuite: [info] - conf entry: int (1 millisecond) [info] - conf entry: long (1 millisecond) [info] - conf entry: double (0 milliseconds) [info] - conf entry: boolean (0 milliseconds) [info] - conf entry: optional (1 millisecond) [info] - conf entry: fallback (0 milliseconds) [info] - conf entry: time (1 millisecond) [info] - conf entry: bytes (1 millisecond) [info] - Reverse Chain Connected Components (4 seconds, 752 milliseconds) [info] - conf entry: regex (1 millisecond) [info] - conf entry: string seq (1 millisecond) [info] - conf entry: int seq (0 milliseconds) [info] - conf entry: transformation (1 millisecond) [info] - conf entry: checkValue() (2 milliseconds) [info] - conf entry: valid values check (1 millisecond) [info] - conf entry: conversion error (1 millisecond) [info] - default value handling is null-safe (0 milliseconds) [info] - variable expansion of spark config entries (5 milliseconds) [info] - conf entry : default function (1 millisecond) [info] - conf entry: alternative keys (0 milliseconds) [info] - onCreate (1 millisecond) [info] InputOutputMetricsSuite: [info] - input metrics for old hadoop with coalesce (182 milliseconds) [info] - input metrics with cache and coalesce (138 milliseconds) [info] - input metrics for new Hadoop API with coalesce (91 milliseconds) [info] - input metrics when reading text file (41 milliseconds) [info] - Connected Components on a Toy Connected Graph (775 milliseconds) [info] VertexRDDSuite: [info] - input metrics on records read - simple (112 milliseconds) [info] - input metrics on records read - more stages (116 milliseconds) [info] - input metrics on records - New Hadoop API (29 milliseconds) [info] Test run finished: 0 failed, 0 ignored, 1 total, 1.719s [info] Test run started [info] Test org.apache.spark.streaming.kafka.JavaKafkaRDDSuite.testKafkaRDD started [info] - input metrics on records read with cache (118 milliseconds) [info] - filter (454 milliseconds) [info] - input read/write and shuffle read/write metrics all line up (173 milliseconds) [info] - input metrics with interleaved reads (298 milliseconds) [info] - mapValues (458 milliseconds) [info] - output metrics on records written (85 milliseconds) [info] - output metrics on records written - new Hadoop API (100 milliseconds) [info] - caching in memory, replicated (encryption = on) (with replication as stream) (3 seconds, 914 milliseconds) [info] - minus (249 milliseconds) [info] Test run 
finished: 0 failed, 0 ignored, 1 total, 0.961s
[info] Test run started
[info] Test org.apache.spark.streaming.kafka.JavaDirectKafkaStreamSuite.testKafkaStream started
[info] - output metrics when writing text file (68 milliseconds)
[info] - input metrics with old CombineFileInputFormat (41 milliseconds)
[info] - input metrics with new CombineFileInputFormat (57 milliseconds)
[info] - minus with RDD[(VertexId, VD)] (250 milliseconds)
[info] - input metrics with old Hadoop API in different thread (77 milliseconds)
[info] - input metrics with new Hadoop API in different thread (109 milliseconds)
[info] AppStatusStoreSuite:
[info] - quantile calculation: 1 task (28 milliseconds)
[info] - quantile calculation: few tasks (5 milliseconds)
[info] - quantile calculation: more tasks (19 milliseconds)
[info] - quantile calculation: lots of tasks (85 milliseconds)
[info] - quantile calculation: custom quantiles (43 milliseconds)
[info] - minus with non-equal number of partitions (485 milliseconds)
[info] - quantile cache (113 milliseconds)
[info] - SPARK-28638: only successful tasks have taskSummary when with in memory kvstore (1 millisecond)
[info] - SPARK-28638: summary should contain successful tasks only when with in memory kvstore (10 milliseconds)
[info] CountEvaluatorSuite:
[info] - test count 0 (1 millisecond)
[info] - test count >= 1 (30 milliseconds)
[info] TaskResultGetterSuite:
[info] - handling results smaller than max RPC message size (57 milliseconds)
[info] - diff (408 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 1.164s
[info] - handling results larger than max RPC message size (365 milliseconds)
[info] - handling total size of results larger than maxResultSize (116 milliseconds)
[info] - diff with RDD[(VertexId, VD)] (440 milliseconds)
[info] - task retried if result missing from block manager (321 milliseconds)
[info] - diff vertices with non-equal number of partitions (368 milliseconds)
[info] - failed task deserialized with the correct classloader (SPARK-11195) (299 milliseconds)
[info] - task result size is set on the driver, not the executors (117 milliseconds)
Exception in thread "task-result-getter-0" java.lang.NoClassDefFoundError
	at org.apache.spark.scheduler.UndeserializableException.readObject(TaskResultGetterSuite.scala:304)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
	at org.apache.spark.ThrowableSerializationWrapper.readObject(TaskEndReason.scala:193)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
[info] - failed task is handled when error occurs deserializing the reason (63 milliseconds)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$4$$anonfun$run$2.apply$mcV$sp(TaskResultGetter.scala:142)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$4$$anonfun$run$2.apply(TaskResultGetter.scala:138)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$4$$anonfun$run$2.apply(TaskResultGetter.scala:138)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$4.run(TaskResultGetter.scala:138)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[info] SerializerPropertiesSuite:
[info] - JavaSerializer does not support relocation (3 milliseconds)
[info] KafkaRDDSuite:
[info] - KryoSerializer supports relocation when auto-reset is enabled (107 milliseconds)
[info] - KryoSerializer does not support relocation when auto-reset is disabled (12 milliseconds)
[info] DriverRunnerTest:
[info] - Process succeeds instantly (48 milliseconds)
[info] - Process failing several times and then succeeding (25 milliseconds)
[info] - leftJoin (539 milliseconds)
[info] - Process doesn't restart if not supervised (22 milliseconds)
[info] - Process doesn't restart if killed (23 milliseconds)
[info] - Reset of backoff counter (24 milliseconds)
[info] - Kill process finalized with state KILLED (33 milliseconds)
[info] - Finalized with state FINISHED (34 milliseconds)
[info] - Finalized with state FAILED (51 milliseconds)
[info] - Handle exception starting process (34 milliseconds)
[info] MapOutputTrackerSuite:
[info] - master start and stop (70 milliseconds)
[info] - master register shuffle and fetch (74 milliseconds)
[info] - leftJoin vertices with non-equal number of partitions (424 milliseconds)
[info] - master register and unregister shuffle (57 milliseconds)
[info] - master register shuffle and unregister map output and fetch (81 milliseconds)
[info] - remote fetch (181 milliseconds)
[info] - remote fetch below max RPC message size (78 milliseconds)
[info] - min broadcast size exceeds max RPC message size (27 milliseconds)
[info] - getLocationsWithLargestOutputs with multiple outputs in same machine (88 milliseconds)
[info] - innerJoin (564 milliseconds)
[info] - caching in memory, serialized, replicated (encryption = off) (3 seconds, 512 milliseconds)
[info] - innerJoin vertices with the non-equal number of partitions (295 milliseconds)
[info] - aggregateUsingIndex (425 milliseconds)
[info] -
mergeFunc (170 milliseconds) [info] - cache, getStorageLevel (76 milliseconds) [info] - checkpoint (666 milliseconds) [info] - count (439 milliseconds) [info] EdgePartitionSuite: [info] - reverse (8 milliseconds) [info] - map (1 millisecond) [info] - filter (4 milliseconds) [info] - groupEdges (2 milliseconds) [info] - innerJoin (2 milliseconds) [info] - isActive, numActives, replaceActives (0 milliseconds) [info] - tripletIterator (0 milliseconds) [info] - serialization (16 milliseconds) [info] EdgeSuite: [info] - compare (1 millisecond) [info] PageRankSuite: [info] - caching in memory, serialized, replicated (encryption = off) (with replication as stream) (3 seconds, 610 milliseconds) [info] - Star PageRank (1 second, 838 milliseconds) [info] - caching in memory, serialized, replicated (encryption = on) (3 seconds, 542 milliseconds) [info] - Star PersonalPageRank (3 seconds, 697 milliseconds) [info] - caching in memory, serialized, replicated (encryption = on) (with replication as stream) (4 seconds, 303 milliseconds) [info] - caching on disk, replicated (encryption = off) (4 seconds, 419 milliseconds) [info] - remote fetch using broadcast (17 seconds, 826 milliseconds) [info] - equally divide map statistics tasks (40 milliseconds) [info] - zero-sized blocks should be excluded when getMapSizesByExecutorId (87 milliseconds) [info] PythonBroadcastSuite: [info] - PythonBroadcast can be serialized with Kryo (SPARK-4882) (18 milliseconds) [info] ExecutorRunnerTest: [info] - command includes appId (26 milliseconds) [info] CompressionCodecSuite: [info] - default compression codec (4 milliseconds) [info] - lz4 compression codec (1 millisecond) [info] - lz4 compression codec short form (1 millisecond) [info] - lz4 supports concatenation of serialized streams (2 milliseconds) [info] - lzf compression codec (11 milliseconds) [info] - lzf compression codec short form (1 millisecond) [info] - lzf supports concatenation of serialized streams (1 millisecond) [info] - snappy compression codec (32 milliseconds) [info] - snappy compression codec short form (2 milliseconds) [info] - snappy supports concatenation of serialized streams (1 millisecond) [info] - zstd compression codec (22 milliseconds) [info] - zstd compression codec short form (1 millisecond) [info] - zstd supports concatenation of serialized zstd (0 milliseconds) [info] - bad compression codec (1 millisecond) [info] MetricsSystemSuite: [info] - MetricsSystem with default config (2 milliseconds) [info] - MetricsSystem with sources add (7 milliseconds) [info] - MetricsSystem with Driver instance (1 millisecond) [info] - MetricsSystem with Driver instance and spark.app.id is not set (2 milliseconds) [info] - MetricsSystem with Driver instance and spark.executor.id is not set (2 milliseconds) [info] - MetricsSystem with Executor instance (2 milliseconds) [info] - MetricsSystem with Executor instance and spark.app.id is not set (1 millisecond) [info] - MetricsSystem with Executor instance and spark.executor.id is not set (1 millisecond) [info] - MetricsSystem with instance which is neither Driver nor Executor (2 milliseconds) [info] - MetricsSystem with Executor instance, with custom namespace (1 millisecond) [info] - MetricsSystem with Executor instance, custom namespace which is not set (1 millisecond) [info] - MetricsSystem with Executor instance, custom namespace, spark.executor.id not set (1 millisecond) [info] - MetricsSystem with non-driver, non-executor instance with custom namespace (2 milliseconds) [info] ConfigReaderSuite: [info] - 
variable expansion (1 millisecond) [info] - circular references (1 millisecond) [info] - spark conf provider filters config keys (0 milliseconds) [info] DoubleRDDSuite: [info] - sum (44 milliseconds) [info] - WorksOnEmpty (43 milliseconds) [info] - WorksWithOutOfRangeWithOneBucket (33 milliseconds) [info] - WorksInRangeWithOneBucket (33 milliseconds) [info] - WorksInRangeWithOneBucketExactMatch (32 milliseconds) [info] - WorksWithOutOfRangeWithTwoBuckets (25 milliseconds) [info] - WorksWithOutOfRangeWithTwoUnEvenBuckets (14 milliseconds) [info] - WorksInRangeWithTwoBuckets (25 milliseconds) [info] - WorksInRangeWithTwoBucketsAndNaN (27 milliseconds) [info] - WorksInRangeWithTwoUnevenBuckets (34 milliseconds) [info] - WorksMixedRangeWithTwoUnevenBuckets (15 milliseconds) [info] - WorksMixedRangeWithFourUnevenBuckets (16 milliseconds) [info] - WorksMixedRangeWithUnevenBucketsAndNaN (15 milliseconds) [info] - WorksMixedRangeWithUnevenBucketsAndNaNAndNaNRange (18 milliseconds) [info] - WorksMixedRangeWithUnevenBucketsAndNaNAndNaNRangeAndInfinity (16 milliseconds) [info] - WorksWithOutOfRangeWithInfiniteBuckets (15 milliseconds) [info] - ThrowsExceptionOnInvalidBucketArray (2 milliseconds) [info] - WorksWithoutBucketsBasic (48 milliseconds) [info] - WorksWithoutBucketsBasicSingleElement (27 milliseconds) [info] - WorksWithoutBucketsBasicNoRange (29 milliseconds) [info] - WorksWithoutBucketsBasicTwo (28 milliseconds) [info] - WorksWithDoubleValuesAtMinMax (69 milliseconds) [info] - WorksWithoutBucketsWithMoreRequestedThanElements (29 milliseconds) [info] - WorksWithoutBucketsForLargerDatasets (42 milliseconds) [info] - WorksWithoutBucketsWithNonIntegralBucketEdges (34 milliseconds) [info] - Grid PageRank (11 seconds, 761 milliseconds) [info] - WorksWithHugeRange (406 milliseconds) [info] - caching on disk, replicated (encryption = off) (with replication as stream) (3 seconds, 539 milliseconds) [info] - ThrowsExceptionOnInvalidRDDs (40 milliseconds) [info] NextIteratorSuite: [info] - one iteration (4 milliseconds) [info] - two iterations (1 millisecond) [info] - empty iteration (1 millisecond) [info] - close is called once for empty iterations (0 milliseconds) [info] - close is called once for non-empty iterations (0 milliseconds) [info] SparkSubmitSuite: [info] - prints usage on empty input (21 milliseconds) [info] - prints usage with only --help (3 milliseconds) [info] - prints error with unrecognized options (1 millisecond) [info] - handle binary specified but not class (115 milliseconds) [info] - handles arguments with --key=val (4 milliseconds) [info] - handles arguments to user program (1 millisecond) [info] - handles arguments to user program with name collision (1 millisecond) [info] - print the right queue name (10 milliseconds) [info] - SPARK-24241: do not fail fast if executor num is 0 when dynamic allocation is enabled (2 milliseconds) [info] - specify deploy mode through configuration (253 milliseconds) [info] - handles YARN cluster mode (20 milliseconds) [info] - handles YARN client mode (41 milliseconds) [info] - handles standalone cluster mode (15 milliseconds) [info] - handles legacy standalone cluster mode (16 milliseconds) [info] - handles standalone client mode (35 milliseconds) [info] - handles mesos client mode (40 milliseconds) [info] - handles k8s cluster mode (22 milliseconds) [info] - handles confs with flag equivalents (19 milliseconds) [info] - SPARK-21568 ConsoleProgressBar should be enabled only in shells (72 milliseconds) [info] - Chain PageRank (2 seconds, 842 
milliseconds) [info] - caching on disk, replicated (encryption = on) (3 seconds, 601 milliseconds) [info] - launch simple application with spark-submit (5 seconds, 143 milliseconds) [info] - Chain PersonalizedPageRank (3 seconds, 659 milliseconds) [info] - caching on disk, replicated (encryption = on) (with replication as stream) (3 seconds, 672 milliseconds) [info] - caching in memory and disk, replicated (encryption = off) (3 seconds, 546 milliseconds) [info] - launch simple application with spark-submit with redaction (5 seconds, 141 milliseconds) [info] - caching in memory and disk, replicated (encryption = off) (with replication as stream) (3 seconds, 852 milliseconds) [info] - caching in memory and disk, replicated (encryption = on) (4 seconds, 37 milliseconds) [info] - includes jars passed in through --jars (9 seconds, 980 milliseconds) [info] - Loop with source PageRank (15 seconds, 219 milliseconds) [info] - caching in memory and disk, replicated (encryption = on) (with replication as stream) (4 seconds, 90 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = off) (3 seconds, 640 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = off) (with replication as stream) (3 seconds, 727 milliseconds) [info] - includes jars passed in through --packages (9 seconds, 560 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = on) (3 seconds, 517 milliseconds) [info] - Loop with sink PageRank (13 seconds, 980 milliseconds) [info] EdgeRDDSuite: [info] - cache, getStorageLevel (81 milliseconds) [info] - checkpointing (226 milliseconds) [info] - count (116 milliseconds) [info] GraphSuite: [info] - Graph.fromEdgeTuples (240 milliseconds) [info] - Graph.fromEdges (111 milliseconds) [info] - Graph.apply (294 milliseconds) [info] - triplets (342 milliseconds) [info] - caching in memory and disk, serialized, replicated (encryption = on) (with replication as stream) (3 seconds, 727 milliseconds) [info] - includes jars passed through spark.jars.packages and spark.jars.repositories (9 seconds, 814 milliseconds) [info] - correctly builds R packages included in a jar with --packages !!! IGNORED !!! 
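The GraphSuite and PageRank entries above map onto GraphX's public operators. A minimal sketch, assuming Spark 2.4's GraphX API (Graph.fromEdgeTuples, aggregateMessages, staticPageRank); the toy edge list and local[2] master are illustrative:

    import org.apache.spark.graphx.Graph
    import org.apache.spark.sql.SparkSession

    object GraphXExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[2]").appName("graphx").getOrCreate()
        val sc = spark.sparkContext

        // Graph.fromEdgeTuples builds a graph from raw (src, dst) pairs,
        // as in the "Graph.fromEdgeTuples" entry above.
        val rawEdges = sc.parallelize(Seq((1L, 2L), (2L, 3L), (3L, 1L), (3L, 4L)))
        val graph = Graph.fromEdgeTuples(rawEdges, defaultValue = 1)

        // aggregateMessages: count each vertex's in-degree.
        val inDeg = graph.aggregateMessages[Int](ctx => ctx.sendToDst(1), _ + _)
        inDeg.collect().foreach(println)

        // Static PageRank, the operation behind the "* PageRank" entries above.
        val ranks = graph.staticPageRank(numIter = 10).vertices
        ranks.collect().foreach(println)
        spark.stop()
      }
    }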
[info] - compute without caching when no partitions fit in memory (3 seconds, 404 milliseconds) [info] - partitionBy (7 seconds, 741 milliseconds) [info] - mapVertices (276 milliseconds) [info] - compute when only some partitions fit in memory (3 seconds, 632 milliseconds) [info] - mapVertices changing type with same erased type (293 milliseconds) [info] - mapEdges (166 milliseconds) [info] - mapTriplets (328 milliseconds) [info] - reverse (304 milliseconds) [info] - reverse with join elimination (256 milliseconds) [info] - subgraph (369 milliseconds) [info] - mask (285 milliseconds) [info] - groupEdges (392 milliseconds) [info] - aggregateMessages (389 milliseconds) [info] - passing environment variables to cluster (2 seconds, 902 milliseconds) [info] - outerJoinVertices (558 milliseconds) [info] - more edge partitions than vertex partitions (249 milliseconds) [info] - checkpoint (376 milliseconds) [info] - cache, getStorageLevel (59 milliseconds) [info] - non-default number of edge partitions (273 milliseconds) [info] - unpersist graph RDD (502 milliseconds) [info] - SPARK-14219: pickRandomVertex (187 milliseconds) [info] ShortestPathsSuite: [info] - include an external JAR in SparkR (10 seconds, 74 milliseconds) [info] - Shortest Path Computations (664 milliseconds) [info] GraphOpsSuite: [info] - resolves command line argument paths correctly (118 milliseconds) [info] - ambiguous archive mapping results in error message (22 milliseconds) [info] - joinVertices (283 milliseconds) [info] - resolves config paths correctly (175 milliseconds) [info] - collectNeighborIds (465 milliseconds) [info] - removeSelfEdges (188 milliseconds) [info] - filter (283 milliseconds) [info] - convertToCanonicalEdges (199 milliseconds) [info] - collectEdgesCycleDirectionOut (398 milliseconds) [info] - collectEdgesCycleDirectionIn (464 milliseconds) [info] - collectEdgesCycleDirectionEither (467 milliseconds) [info] - user classpath first in driver (2 seconds, 481 milliseconds) [info] - SPARK_CONF_DIR overrides spark-defaults.conf (7 milliseconds) [info] - support glob path (40 milliseconds) [info] - SPARK-27575: yarn confs should merge new value with existing value (51 milliseconds) [info] - downloadFile - invalid url (35 milliseconds) [info] - downloadFile - file doesn't exist (32 milliseconds) [info] - downloadFile does not download local file (23 milliseconds) [info] - download one file to local (35 milliseconds) [info] - download list of files to local (33 milliseconds) [info] - remove copies of application jar from classpath (35 milliseconds) [info] - Avoid re-upload remote resources in yarn client mode (38 milliseconds) [info] - download remote resource if it is not supported by yarn service (40 milliseconds) [info] - collectEdgesChainDirectionOut (415 milliseconds) [info] - avoid downloading remote resource if it is supported by yarn service (38 milliseconds) [info] - recover from node failures (5 seconds, 866 milliseconds) [info] - force download from blacklisted schemes (37 milliseconds) [info] - force download for all the schemes (38 milliseconds) [info] - start SparkApplication without modifying system properties (39 milliseconds) [info] - support --py-files/spark.submit.pyFiles in non pyspark application (104 milliseconds) [info] - handles natural line delimiters in --properties-file and --conf uniformly (37 milliseconds) [info] NettyRpcEnvSuite: [info] - send a message locally (9 milliseconds) [info] - collectEdgesChainDirectionIn (381 milliseconds) [info] - send a message remotely (47 
milliseconds) [info] - send a RpcEndpointRef (2 milliseconds) [info] - ask a message locally (2 milliseconds) [info] - ask a message remotely (60 milliseconds) [info] - ask a message timeout (50 milliseconds) [info] - onStart and onStop (1 millisecond) [info] - onError: error in onStart (1 millisecond) [info] - onError: error in onStop (1 millisecond) [info] - onError: error in receive (1 millisecond) [info] - self: call in onStart (1 millisecond) [info] - self: call in receive (6 milliseconds) [info] - self: call in onStop (1 millisecond) [info] - collectEdgesChainDirectionEither (406 milliseconds) [info] StronglyConnectedComponentsSuite: [info] - call receive in sequence (275 milliseconds) [info] - stop(RpcEndpointRef) reentrant (2 milliseconds) [info] - sendWithReply (1 millisecond) [info] - sendWithReply: remotely (50 milliseconds) [info] - sendWithReply: error (2 milliseconds) [info] - sendWithReply: remotely error (63 milliseconds) [info] - network events in sever RpcEnv when another RpcEnv is in server mode (116 milliseconds) [info] - network events in sever RpcEnv when another RpcEnv is in client mode (102 milliseconds) [info] - network events in client RpcEnv when another RpcEnv is in server mode (114 milliseconds) [info] - sendWithReply: unserializable error (68 milliseconds) [info] - port conflict (60 milliseconds) [info] - Island Strongly Connected Components (865 milliseconds) [info] - send with authentication (473 milliseconds) [info] - send with SASL encryption (128 milliseconds) [info] - send with AES encryption (154 milliseconds) [info] - ask with authentication (165 milliseconds) [info] - ask with SASL encryption (101 milliseconds) [info] - ask with AES encryption (102 milliseconds) [info] - construct RpcTimeout with conf property (2 milliseconds) [info] - ask a message timeout on Future using RpcTimeout (24 milliseconds) [info] - file server (119 milliseconds) [info] - SPARK-14699: RpcEnv.shutdown should not fire onDisconnected events (121 milliseconds) [info] - non-existent endpoint (2 milliseconds) [info] - advertise address different from bind address (53 milliseconds) [info] - RequestMessage serialization (10 milliseconds) Exception in thread "dispatcher-event-loop-0" java.lang.StackOverflowError at org.apache.spark.rpc.netty.NettyRpcEnvSuite$$anonfun$5$$anon$1$$anonfun$receiveAndReply$1.applyOrElse(NettyRpcEnvSuite.scala:113) at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:105) at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:215) at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101) at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:221) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Exception in thread "dispatcher-event-loop-1" java.lang.StackOverflowError at org.apache.spark.rpc.netty.NettyRpcEnvSuite$$anonfun$5$$anon$1$$anonfun$receiveAndReply$1.applyOrElse(NettyRpcEnvSuite.scala:113) at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:105) at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:215) at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101) at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:221) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at 
java.lang.Thread.run(Thread.java:748) [info] - StackOverflowError should be sent back and Dispatcher should survive (95 milliseconds) [info] JsonProtocolSuite: [info] - SparkListenerEvent (249 milliseconds) [info] - Dependent Classes (20 milliseconds) [info] - ExceptionFailure backward compatibility: full stack trace (2 milliseconds) [info] - StageInfo backward compatibility (details, accumulables) (1 millisecond) [info] - InputMetrics backward compatibility (2 milliseconds) [info] - Input/Output records backwards compatibility (1 millisecond) [info] - Shuffle Read/Write records backwards compatibility (1 millisecond) [info] - OutputMetrics backward compatibility (2 milliseconds) [info] - BlockManager events backward compatibility (1 millisecond) [info] - FetchFailed backwards compatibility (1 millisecond) [info] - ShuffleReadMetrics: Local bytes read backwards compatibility (1 millisecond) [info] - SparkListenerApplicationStart backwards compatibility (1 millisecond) [info] - ExecutorLostFailure backward compatibility (1 millisecond) [info] - SparkListenerJobStart backward compatibility (3 milliseconds) [info] - SparkListenerJobStart and SparkListenerJobEnd backward compatibility (3 milliseconds) [info] - RDDInfo backward compatibility (scope, parent IDs, callsite) (1 millisecond) [info] - StageInfo backward compatibility (parent IDs) (1 millisecond) [info] - TaskCommitDenied backward compatibility (1 millisecond) [info] - AccumulableInfo backward compatibility (1 millisecond) [info] - ExceptionFailure backward compatibility: accumulator updates (9 milliseconds) [info] - AccumulableInfo value de/serialization (2 milliseconds) [info] - SPARK-31923: unexpected value type of internal accumulator (1 millisecond) [info] RPackageUtilsSuite: [info] - pick which jars to unpack using the manifest (297 milliseconds) [info] - Cycle Strongly Connected Components (2 seconds, 862 milliseconds) [info] - build an R package from a jar end to end (2 seconds, 967 milliseconds) [info] - 2 Cycle Strongly Connected Components (2 seconds, 389 milliseconds) [info] VertexPartitionSuite: [info] - isDefined, filter (5 milliseconds) [info] - map (1 millisecond) [info] - diff (1 millisecond) [info] - leftJoin (2 milliseconds) [info] - innerJoin (3 milliseconds) [info] - createUsingIndex (0 milliseconds) [info] - innerJoinKeepLeft (1 millisecond) [info] - aggregateUsingIndex (1 millisecond) [info] - jars that don't exist are skipped and print warning (356 milliseconds) [info] - reindex (2 milliseconds) [info] - serialization (10 milliseconds) [info] GraphLoaderSuite: [info] - faulty R package shows documentation (363 milliseconds) [info] - GraphLoader.edgeListFile (359 milliseconds) [info] TriangleCountSuite: [info] - jars without manifest return false (113 milliseconds) [info] - SparkR zipping works properly (7 milliseconds) [info] TopologyMapperSuite: [info] - File based Topology Mapper (6 milliseconds) [info] EventLoggingListenerSuite: [info] - Verify log file exist (43 milliseconds) [info] - Basic event logging (84 milliseconds) [info] - Basic event logging with compression (219 milliseconds) [info] - Count a single triangle (522 milliseconds) [info] - recover from repeated node failures during shuffle-map (8 seconds, 14 milliseconds) [info] - Count two triangles (581 milliseconds) [info] - Count two triangles with bi-directed edges (529 milliseconds) [info] - Count a single triangle with duplicate edges (654 milliseconds) [info] GraphGeneratorsSuite: [info] - GraphGenerators.generateRandomEdges (3 milliseconds) 
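The GraphX suites above (GraphSuite, TriangleCountSuite, GraphGeneratorsSuite) run against small synthetic graphs. A minimal sketch of the kind of check TriangleCountSuite performs, assuming a local SparkContext: a 3-cycle contains exactly one triangle, so each vertex should report a count of 1:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.graphx.Graph

    val sc = new SparkContext(new SparkConf().setAppName("tri").setMaster("local[2]"))
    // A directed 3-cycle: 0 -> 1 -> 2 -> 0.
    val edges = sc.parallelize(Seq((0L, 1L), (1L, 2L), (2L, 0L)))
    val graph = Graph.fromEdgeTuples(edges, defaultValue = 1)
    // triangleCount() counts the triangles passing through each vertex.
    graph.triangleCount().vertices.collect().foreach {
      case (id, n) => println(s"vertex $id participates in $n triangle(s)")
    }
    sc.stop()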
[info] - GraphGenerators.sampleLogNormal (6 milliseconds) [info] - GraphGenerators.logNormalGraph (408 milliseconds) [info] - SPARK-5064 GraphGenerators.rmatGraph numEdges upper bound (152 milliseconds) [info] SVDPlusPlusSuite: [info] - End-to-end event logging (3 seconds, 94 milliseconds) [info] - Test SVD++ with mean square error on training set (1 second, 112 milliseconds) [info] - Test SVD++ with no edges (184 milliseconds) [info] FailureTrackerSuite: [info] - failures expire if validity interval is set (223 milliseconds) [info] - failures never expire if validity interval is not set (-1) (4 milliseconds) [info] ClientSuite: [info] - default Yarn application classpath (51 milliseconds) [info] - default MR application classpath (1 millisecond) [info] - resultant classpath for an application that defines a classpath for YARN (316 milliseconds) [info] - resultant classpath for an application that defines a classpath for MR (23 milliseconds) [info] - resultant classpath for an application that defines both classpaths, YARN and MR (16 milliseconds) [info] - Local jar URIs (313 milliseconds) [info] - Jar path propagation through SparkConf (454 milliseconds) [info] - Cluster path translation (43 milliseconds) [info] - configuration and args propagate through createApplicationSubmissionContext (103 milliseconds) [info] - spark.yarn.jars with multiple paths and globs (239 milliseconds) [info] - distribute jars archive (131 milliseconds) [info] - distribute archive multiple times (524 milliseconds) [info] - distribute local spark jars (127 milliseconds) [info] - ignore same name jars (134 milliseconds) [info] - SPARK-31582 Being able to not populate Hadoop classpath (53 milliseconds) [info] - files URI match test1 (1 millisecond) [info] - files URI match test2 (1 millisecond) [info] - files URI match test3 (1 millisecond) [info] - wasb URI match test (1 millisecond) [info] - hdfs URI match test (0 milliseconds) [info] - files URI unmatch test1 (1 millisecond) [info] - files URI unmatch test2 (0 milliseconds) [info] - files URI unmatch test3 (1 millisecond) [info] - wasb URI unmatch test1 (1 millisecond) [info] - wasb URI unmatch test2 (1 millisecond) [info] - s3 URI unmatch test (1 millisecond) [info] - hdfs URI unmatch test1 (0 milliseconds) [info] - hdfs URI unmatch test2 (0 milliseconds) [info] YarnAllocatorSuite: [info] - single container allocated (135 milliseconds) [info] - container should not be created if requested number if met (46 milliseconds) [info] - some containers allocated (37 milliseconds) [info] - receive more containers than requested (47 milliseconds) [info] - decrease total requested executors (38 milliseconds) [info] - decrease total requested executors to less than currently running (46 milliseconds) [info] - kill executors (58 milliseconds) [info] - kill same executor multiple times (32 milliseconds) [info] - process same completed container multiple times (37 milliseconds) [info] - lost executor removed from backend (57 milliseconds) [info] - blacklisted nodes reflected in amClient requests (45 milliseconds) [info] - memory exceeded diagnostic regexes (1 millisecond) [info] - window based failure executor counting (38 milliseconds) [info] - SPARK-26269: YarnAllocator should have same blacklist behaviour with YARN (68 milliseconds) [info] ClientDistributedCacheManagerSuite: [info] - test getFileStatus empty (22 milliseconds) [info] - test getFileStatus cached (1 millisecond) [info] - test addResource (3 milliseconds) [info] - test addResource link null (1 millisecond) 
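ClientSuite above is checking YARN classpath assembly and jar distribution. A minimal sketch of the configuration surface those cases touch; the HDFS location is a hypothetical placeholder:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setMaster("yarn")
      .set("spark.submit.deployMode", "cluster")
      // Reuse a pre-staged set of Spark jars instead of uploading them for every app.
      .set("spark.yarn.jars", "hdfs:///spark/jars/*.jar")   // hypothetical path
      // SPARK-31582: opt out of adding the cluster's Hadoop classpath to the application.
      .set("spark.yarn.populateHadoopClasspath", "false")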
[info] - test addResource appmaster only (1 millisecond) [info] - test addResource archive (1 millisecond) [info] ExtensionServiceIntegrationSuite: [info] - Instantiate (9 milliseconds) [info] - Contains SimpleExtensionService Service (3 milliseconds) [info] YarnAllocatorBlacklistTrackerSuite: [info] - expiring its own blacklisted nodes (2 milliseconds) [info] - not handling the expiry of scheduler blacklisted nodes (1 millisecond) [info] - combining scheduler and allocation blacklist (2 milliseconds) [info] - blacklist all available nodes (2 milliseconds) [info] YarnClusterSuite: [info] - End-to-end event logging with compression (11 seconds, 869 milliseconds) [info] - Event logging with password redaction (32 milliseconds) [info] - Log overwriting (103 milliseconds) [info] - Event log name (1 millisecond) [info] FileCommitProtocolInstantiationSuite: [info] - Dynamic partitions require appropriate constructor (1 millisecond) [info] - Standard partitions work with classic constructor (1 millisecond) [info] - Three arg constructors have priority (1 millisecond) [info] - Three arg constructors have priority when dynamic (0 milliseconds) [info] - The protocol must be of the correct class (1 millisecond) [info] - If there is no matching constructor, class hierarchy is irrelevant (1 millisecond) [info] JobCancellationSuite: [info] - local mode, FIFO scheduler (107 milliseconds) [info] - local mode, fair scheduler (141 milliseconds) [info] - cluster mode, FIFO scheduler (3 seconds, 167 milliseconds) [info] - cluster mode, fair scheduler (3 seconds, 10 milliseconds) [info] - do not put partially executed partitions into cache (77 milliseconds) [info] - job group (90 milliseconds) [info] - inherited job group (SPARK-6629) (85 milliseconds) [info] - job group with interruption (91 milliseconds) [info] - run Spark in yarn-client mode (19 seconds, 131 milliseconds) [info] - recover from repeated node failures during shuffle-reduce (39 seconds, 503 milliseconds) [info] - task reaper kills JVM if killed tasks keep running for too long (17 seconds, 470 milliseconds) [info] - basic usage (2 minutes, 4 seconds) [info] - recover from node failures with replication (10 seconds, 305 milliseconds) [info] - task reaper will not kill JVM if spark.task.killTimeout == -1 (14 seconds, 32 milliseconds) [info] - two jobs sharing the same stage (118 milliseconds) [info] - unpersist RDDs (4 seconds, 115 milliseconds) [info] - interruptible iterator of shuffle reader (211 milliseconds) [info] TaskContextSuite: [info] - run Spark in yarn-cluster mode (23 seconds, 31 milliseconds) [info] - provide metrics sources (141 milliseconds) [info] - calls TaskCompletionListener after failure (178 milliseconds) [info] - calls TaskFailureListeners after failure (63 milliseconds) [info] - all TaskCompletionListeners should be called even if some fail (6 milliseconds) [info] - all TaskFailureListeners should be called even if some fail (5 milliseconds) [info] - TaskContext.attemptNumber should return attempt number, not task id (SPARK-4014) (73 milliseconds) [info] - TaskContext.stageAttemptNumber getter (548 milliseconds) [info] - accumulators are updated on exception failures (122 milliseconds) [info] - failed tasks collect only accumulators whose values count during failures (58 milliseconds) [info] - only updated internal accumulators will be sent back to driver (66 milliseconds) [info] - localProperties are propagated to executors correctly (63 milliseconds) [info] - immediately call a completion listener if the context is 
completed (1 millisecond) [info] - immediately call a failure listener if the context has failed (1 millisecond) [info] - TaskCompletionListenerException.getMessage should include previousError (1 millisecond) [info] - all TaskCompletionListeners should be called even if some fail or a task (6 milliseconds) [info] DAGSchedulerSuite: [info] - [SPARK-3353] parent stage should have lower stage id (68 milliseconds) [info] - [SPARK-13902] Ensure no duplicate stages are created (44 milliseconds) [info] - All shuffle files on the slave should be cleaned up when slave lost (118 milliseconds) [info] - SPARK-32003: All shuffle files for executor should be cleaned up on fetch failure (120 milliseconds) [info] - zero split job (4 milliseconds) [info] - run trivial job (4 milliseconds) [info] - run trivial job w/ dependency (3 milliseconds) [info] - equals and hashCode AccumulableInfo (1 millisecond) [info] - cache location preferences w/ dependency (24 milliseconds) [info] - regression test for getCacheLocs (2 milliseconds) [info] - reference partitions inside a task (2 seconds, 979 milliseconds) [info] - getMissingParentStages should consider all ancestor RDDs' cache statuses (4 milliseconds) [info] - avoid exponential blowup when getting preferred locs list (51 milliseconds) [info] - unserializable task (17 milliseconds) [info] - trivial job failure (14 milliseconds) [info] - trivial job cancellation (4 milliseconds) [info] - job cancellation no-kill backend (6 milliseconds) [info] - run trivial shuffle (9 milliseconds) [info] - run trivial shuffle with fetch failure (20 milliseconds) [info] - shuffle files not lost when slave lost with shuffle service (124 milliseconds) [info] - shuffle files lost when worker lost with shuffle service (134 milliseconds) [info] ExecutorPodsSnapshotSuite: [info] - shuffle files lost when worker lost without shuffle service (100 milliseconds) [info] - States are interpreted correctly from pod metadata. (208 milliseconds) [info] - Updates add new pods for non-matching ids and edit existing pods for matching ids (5 milliseconds) [info] EnvSecretsFeatureStepSuite: [info] - sets up all keyRefs (22 milliseconds) [info] RDriverFeatureStepSuite: [info] - shuffle files not lost when executor failure with shuffle service (100 milliseconds) [info] - R Step modifies container correctly (83 milliseconds) [info] ExecutorPodsPollingSnapshotSourceSuite: [info] - shuffle files lost when executor failure without shuffle service (106 milliseconds) [info] - Items returned by the API should be pushed to the event queue (15 milliseconds) [info] BasicExecutorFeatureStepSuite: [info] - basic executor pod has reasonable defaults (34 milliseconds) [info] - Single stage fetch failure should not abort the stage. (33 milliseconds) [info] - executor pod hostnames get truncated to 63 characters (2 milliseconds) [info] - classpath and extra java options get translated into environment variables (4 milliseconds) [info] - test executor pyspark memory (4 milliseconds) [info] DriverKubernetesCredentialsFeatureStepSuite: [info] - Don't set any credentials (10 milliseconds) [info] - Only set credentials that are manually mounted. (3 milliseconds) [info] - Multiple consecutive stage fetch failures should lead to job being aborted. (27 milliseconds) [info] - Mount credentials from the submission client as a secret. (71 milliseconds) [info] ClientSuite: [info] - The client should configure the pod using the builder. 
(101 milliseconds) [info] - Failures in different stages should not trigger an overall abort (48 milliseconds) [info] - The client should create Kubernetes resources (4 milliseconds) [info] - Waiting for app completion should stall on the watcher (3 milliseconds) [info] DriverServiceFeatureStepSuite: [info] - Headless service has a port for the driver RPC and the block manager. (18 milliseconds) [info] - Hostname and ports are set according to the service name. (1 millisecond) [info] - Ports should resolve to defaults in SparkConf and in the service. (1 millisecond) [info] - Long prefixes should switch to using a generated name. (2 milliseconds) [info] - Disallow bind address and driver host to be set explicitly. (0 milliseconds) [info] KubernetesDriverBuilderSuite: [info] - Apply fundamental steps all the time. (9 milliseconds) [info] - Apply secrets step if secrets are present. (3 milliseconds) [info] - Apply Java step if main resource is none. (3 milliseconds) [info] - Apply Python step if main resource is python. (4 milliseconds) [info] - Apply volumes step if mounts are present. (4 milliseconds) [info] - Apply R step if main resource is R. (2 milliseconds) [info] ExecutorPodsAllocatorSuite: [info] - Initially request executors in batches. Do not request another batch if the first has not finished. (29 milliseconds) [info] - Non-consecutive stage failures don't trigger abort (82 milliseconds) [info] - Request executors in batches. Allow another batch to be requested if all pending executors start running. (14 milliseconds) [info] - When a current batch reaches error states immediately, re-request them on the next batch. (11 milliseconds) [info] - When an executor is requested but the API does not report it in a reasonable time, retry requesting that executor. (5 milliseconds) [info] KubernetesClusterSchedulerBackendSuite: [info] - trivial shuffle with multiple fetch failures (9 milliseconds) [info] - Start all components (4 milliseconds) [info] - Stop all components (9 milliseconds) [info] - Remove executor (2 milliseconds) [info] - Kill executors (7 milliseconds) [info] - Request total executors (2 milliseconds) [info] KubernetesConfSuite: [info] - Basic driver translated fields. (4 milliseconds) [info] - Creating driver conf with and without the main app jar influences spark.jars (5 milliseconds) [info] - Creating driver conf with a python primary file (2 milliseconds) [info] - Creating driver conf with a r primary file (1 millisecond) [info] - Testing explicit setting of memory overhead on non-JVM tasks (2 milliseconds) [info] - Resolve driver labels, annotations, secret mount paths, envs, and memory overhead (4 milliseconds) [info] - Basic executor translated fields. (0 milliseconds) [info] - Image pull secrets. 
(1 millisecond) [info] - Set executor labels, annotations, and secrets (2 milliseconds) [info] KubernetesVolumeUtilsSuite: [info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by FetchFailure (24 milliseconds) [info] - Parses hostPath volumes correctly (4 milliseconds) [info] - Parses persistentVolumeClaim volumes correctly (4 milliseconds) [info] - Parses emptyDir volumes correctly (1 millisecond) [info] - Parses emptyDir volume options can be optional (0 milliseconds) [info] - Defaults optional readOnly to false (1 millisecond) [info] - Gracefully fails on missing mount key (1 millisecond) [info] - Gracefully fails on missing option key (1 millisecond) [info] BasicDriverFeatureStepSuite: [info] - Check the pod respects all configurations from the user. (9 milliseconds) [info] - Check appropriate entrypoint rerouting for various bindings (2 milliseconds) [info] - Additional system properties resolve jars and set cluster-mode confs. (2 milliseconds) [info] ExecutorPodsSnapshotsStoreSuite: [info] - Subscribers get notified of events periodically. (7 milliseconds) [info] - Even without sending events, initially receive an empty buffer. (1 millisecond) [info] - Replacing the snapshot passes the new snapshot to subscribers. (2 milliseconds) [info] MountVolumesFeatureStepSuite: [info] - Mounts hostPath volumes (5 milliseconds) [info] - Mounts pesistentVolumeClaims (3 milliseconds) [info] - Mounts emptyDir (3 milliseconds) [info] - Mounts emptyDir with no options (1 millisecond) [info] - Mounts multiple volumes (1 millisecond) [info] MountSecretsFeatureStepSuite: [info] - mounts all given secrets (4 milliseconds) [info] ExecutorPodsLifecycleManagerSuite: [info] - Retry all the tasks on a resubmitted attempt of a barrier stage caused by TaskKilled (25 milliseconds) [info] - When an executor reaches error states immediately, remove from the scheduler backend. (10 milliseconds) [info] - Don't remove executors twice from Spark but remove from K8s repeatedly. (4 milliseconds) [info] - When the scheduler backend lists executor ids that aren't present in the cluster, remove those executors from Spark. (2 milliseconds) [info] JavaDriverFeatureStepSuite: [info] - Java Step modifies container correctly (2 milliseconds) [info] ExecutorPodsWatchSnapshotSourceSuite: [info] - Watch events should be pushed to the snapshots store as snapshot updates. (2 milliseconds) [info] LocalDirsFeatureStepSuite: [info] - Fail the job if a barrier ResultTask failed (17 milliseconds) [info] - Resolve to default local dir if neither env nor configuration are set (36 milliseconds) [info] - Use configured local dirs split on comma if provided. (1 millisecond) [info] PythonDriverFeatureStepSuite: [info] - Python Step modifies container correctly (8 milliseconds) [info] - Python Step testing empty pyfiles (2 milliseconds) [info] KubernetesExecutorBuilderSuite: [info] - Basic steps are consistently applied. (3 milliseconds) [info] - Apply secrets step if secrets are present. (1 millisecond) [info] - Apply volumes step if mounts are present. 
(1 millisecond) [info] - late fetch failures don't cause multiple concurrent attempts for the same map stage (12 milliseconds) [info] - extremely late fetch failures don't cause multiple concurrent attempts for the same stage (38 milliseconds) [info] - task events always posted in speculation / when stage is killed (25 milliseconds) [info] - ignore late map task completions (9 milliseconds) [info] - run shuffle with map stage failure (4 milliseconds) [info] - shuffle fetch failure in a reused shuffle dependency (17 milliseconds) [info] - don't submit stage until its dependencies map outputs are registered (SPARK-5259) (21 milliseconds) [info] - register map outputs correctly after ExecutorLost and task Resubmitted (9 milliseconds) [info] - failure of stage used by two jobs (9 milliseconds) [info] - stage used by two jobs, the first no longer active (SPARK-6880) (12 milliseconds) [info] ReceiverTrackerSuite: [info] - stage used by two jobs, some fetch failures, and the first job no longer active (SPARK-6880) (17 milliseconds) [info] - run trivial shuffle with out-of-band executor failure and retry (11 milliseconds) [info] - recursive shuffle failures (23 milliseconds) [info] - cached post-shuffle (18 milliseconds) [info] - misbehaved accumulator should not crash DAGScheduler and SparkContext (25 milliseconds) [info] - misbehaved accumulator should not impact other accumulators (14 milliseconds) [info] - misbehaved resultHandler should not crash DAGScheduler and SparkContext (19 milliseconds) [info] - getPartitions exceptions should not crash DAGScheduler and SparkContext (SPARK-8606) (14 milliseconds) [info] - getPreferredLocations errors should not crash DAGScheduler and SparkContext (SPARK-8606) (13 milliseconds) [info] - accumulator not calculated for resubmitted result stage (4 milliseconds) [info] - accumulator not calculated for resubmitted task in result stage (4 milliseconds) [info] - accumulators are updated on exception failures and task killed (3 milliseconds) [info] - reduce tasks should be placed locally with map output (8 milliseconds) [info] - reduce task locality preferences should only include machines with largest map outputs (13 milliseconds) [info] - stages with both narrow and shuffle dependencies use narrow ones for locality (8 milliseconds) [info] - Spark exceptions should include call site in stack trace (34 milliseconds) [info] - catch errors in event loop (4 milliseconds) [info] - simple map stage submission (15 milliseconds) [info] - map stage submission with reduce stage also depending on the data (9 milliseconds) [info] - map stage submission with fetch failure (20 milliseconds) [info] - map stage submission with multiple shared stages and failures (34 milliseconds) [info] - Trigger mapstage's job listener in submitMissingTasks (14 milliseconds) [info] - send rate update to receivers (2 seconds, 512 milliseconds) [info] - map stage submission with executor failure late map task completions (13 milliseconds) [info] - getShuffleDependencies correctly returns only direct shuffle parents (1 millisecond) [info] - should restart receiver after stopping it (927 milliseconds) [info] - SPARK-11063: TaskSetManager should use Receiver RDD's preferredLocations (594 milliseconds) [info] - SPARK-17644: After one stage is aborted for too many failed attempts, subsequent stagesstill behave correctly on fetch failures (1 second, 436 milliseconds) [info] - [SPARK-19263] DAGScheduler should not submit multiple active tasksets, even with late completions from earlier stage attempts 
(17 milliseconds) [info] - task end event should have updated accumulators (SPARK-20342) (148 milliseconds) [info] - get allocated executors (499 milliseconds) [info] RateLimitedOutputStreamSuite: [info] - Barrier task failures from the same stage attempt don't trigger multiple stage retries (10 milliseconds) [info] - Barrier task failures from a previous stage attempt don't trigger stage retry (9 milliseconds) [info] - SPARK-23207: retry all the succeeding stages when the map stage is indeterminate (14 milliseconds) [info] - SPARK-29042: Sampled RDD with unordered input should be indeterminate (6 milliseconds) [info] - SPARK-23207: cannot rollback a result stage (8 milliseconds) [info] - SPARK-23207: local checkpoint fail to rollback (checkpointed before) (35 milliseconds) [info] - SPARK-23207: local checkpoint fail to rollback (checkpointing now) (13 milliseconds) [info] - SPARK-23207: reliable checkpoint can avoid rollback (checkpointed before) (81 milliseconds) [info] - SPARK-23207: reliable checkpoint fail to rollback (checkpointing now) (21 milliseconds) [info] - SPARK-28699: abort stage if parent stage is indeterminate stage (9 milliseconds) [info] PrefixComparatorsSuite: [info] - String prefix comparator (134 milliseconds) [info] - Binary prefix comparator (9 milliseconds) [info] - double prefix comparator handles NaNs properly (0 milliseconds) [info] - double prefix comparator handles negative NaNs properly (1 millisecond) [info] - double prefix comparator handles other special values properly (1 millisecond) [info] MasterWebUISuite: [info] - kill application (285 milliseconds) [info] - kill driver (112 milliseconds) [info] SorterSuite: [info] - equivalent to Arrays.sort (52 milliseconds) [info] - KVArraySorter (101 milliseconds) [info] - write (4 seconds, 166 milliseconds) [info] RecurringTimerSuite: [info] - basic (5 milliseconds) [info] - SPARK-10224: call 'callback' after stopping (8 milliseconds) [info] InputStreamsSuite: Exception in thread "receiver-supervisor-future-0" java.lang.Error: java.lang.InterruptedException: sleep interrupted at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.InterruptedException: sleep interrupted at java.lang.Thread.sleep(Native Method) at org.apache.spark.streaming.receiver.ReceiverSupervisor$$anonfun$restartReceiver$1.apply$mcV$sp(ReceiverSupervisor.scala:196) at org.apache.spark.streaming.receiver.ReceiverSupervisor$$anonfun$restartReceiver$1.apply(ReceiverSupervisor.scala:189) at org.apache.spark.streaming.receiver.ReceiverSupervisor$$anonfun$restartReceiver$1.apply(ReceiverSupervisor.scala:189) at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ... 
2 more [info] - socket input stream (700 milliseconds) [info] - socket input stream - no block in a batch (369 milliseconds) [info] - run Spark in yarn-client mode with different configurations, ensuring redaction (19 seconds, 29 milliseconds) [info] - binary records stream (6 seconds, 218 milliseconds) [info] - file input stream - newFilesOnly = true (508 milliseconds) [info] - file input stream - newFilesOnly = false (404 milliseconds) [info] - file input stream - wildcard (662 milliseconds) [info] - multi-thread receiver (2 seconds, 13 milliseconds) [info] - queue input stream - oneAtATime = true (1 second, 102 milliseconds) [info] - queue input stream - oneAtATime = false (2 seconds, 90 milliseconds) [info] - test track the number of input stream (100 milliseconds) [info] WriteAheadLogUtilsSuite: [info] - log selection and creation (50 milliseconds) [info] - wrap WriteAheadLog in BatchedWriteAheadLog when batching is enabled (6 milliseconds) [info] - batching is enabled by default in WriteAheadLog (1 millisecond) [info] - closeFileAfterWrite is disabled by default in WriteAheadLog (1 millisecond) [info] ReceiverSchedulingPolicySuite: [info] - rescheduleReceiver: empty executors (0 milliseconds) [info] - rescheduleReceiver: receiver preferredLocation (1 millisecond) [info] - rescheduleReceiver: return all idle executors if there are any idle executors (6 milliseconds) [info] - rescheduleReceiver: return all executors that have minimum weight if no idle executors (4 milliseconds) [info] - scheduleReceivers: schedule receivers evenly when there are more receivers than executors (3 milliseconds) [info] - scheduleReceivers: schedule receivers evenly when there are more executors than receivers (4 milliseconds) [info] - scheduleReceivers: schedule receivers evenly when the preferredLocations are even (7 milliseconds) [info] - scheduleReceivers: return empty if no receiver (1 millisecond) [info] - scheduleReceivers: return empty scheduled executors if no executors (1 millisecond) [info] PIDRateEstimatorSuite: [info] - the right estimator is created (13 milliseconds) [info] - estimator checks ranges (2 milliseconds) [info] - first estimate is None (2 milliseconds) [info] - second estimate is not None (2 milliseconds) [info] - no estimate when no time difference between successive calls (2 milliseconds) [info] - no estimate when no records in previous batch (1 millisecond) [info] - no estimate when there is no processing delay (0 milliseconds) [info] - estimate is never less than min rate (20 milliseconds) [info] - with no accumulated or positive error, |I| > 0, follow the processing speed (4 milliseconds) [info] - with no accumulated but some positive error, |I| > 0, follow the processing speed (3 milliseconds) [info] - with some accumulated and some positive error, |I| > 0, stay below the processing speed (19 milliseconds) [info] ReceivedBlockHandlerSuite: [info] - BlockManagerBasedBlockHandler - store blocks (728 milliseconds) [info] - BlockManagerBasedBlockHandler - handle errors in storing block (16 milliseconds) [info] - WriteAheadLogBasedBlockHandler - store blocks (786 milliseconds) [info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (36 milliseconds) [info] - WriteAheadLogBasedBlockHandler - clean old blocks (139 milliseconds) [info] - Test Block - count messages (193 milliseconds) [info] - Test Block - isFullyConsumed (39 milliseconds) [info] - SPARK-5984 TimSort bug (18 seconds, 972 milliseconds) [info] - Sorter benchmark for key-value pairs !!! IGNORED !!! 
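InputStreamsSuite above covers the receiver-based streaming sources. A minimal sketch of the shape behind the "socket input stream" cases, assuming a local master; the host and port are hypothetical, and a real test pairs this with a server pushing lines at that endpoint:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val ssc = new StreamingContext(
      new SparkConf().setAppName("socket").setMaster("local[2]"), Seconds(1))
    val lines = ssc.socketTextStream("localhost", 9999)   // hypothetical endpoint
    lines.count().print()                                 // one count per 1-second batch
    ssc.start()
    ssc.awaitTerminationOrTimeout(10000)                  // run for ~10s, then tear down
    ssc.stop(stopSparkContext = true)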
[info] - Sorter benchmark for primitive int array !!! IGNORED !!! [info] RandomSamplerSuite: [info] - utilities (8 milliseconds) [info] ReceivedBlockHandlerWithEncryptionSuite: [info] - sanity check medianKSD against references (97 milliseconds) [info] - bernoulli sampling (38 milliseconds) [info] - bernoulli sampling without iterator (37 milliseconds) [info] - bernoulli sampling with gap sampling optimization (76 milliseconds) [info] - bernoulli sampling (without iterator) with gap sampling optimization (83 milliseconds) [info] - bernoulli boundary cases (1 millisecond) [info] - bernoulli (without iterator) boundary cases (2 milliseconds) [info] - bernoulli data types (99 milliseconds) [info] - bernoulli clone (18 milliseconds) [info] - bernoulli set seed (34 milliseconds) [info] - BlockManagerBasedBlockHandler - store blocks (449 milliseconds) [info] - replacement sampling (53 milliseconds) [info] - replacement sampling without iterator (45 milliseconds) [info] - BlockManagerBasedBlockHandler - handle errors in storing block (25 milliseconds) [info] - replacement sampling with gap sampling (142 milliseconds) [info] - replacement sampling (without iterator) with gap sampling (168 milliseconds) [info] - replacement boundary cases (1 millisecond) [info] - replacement (without) boundary cases (1 millisecond) [info] - replacement data types (104 milliseconds) [info] - replacement clone (30 milliseconds) [info] - replacement set seed (73 milliseconds) [info] - bernoulli partitioning sampling (27 milliseconds) [info] - bernoulli partitioning sampling without iterator (26 milliseconds) [info] - bernoulli partitioning boundary cases (0 milliseconds) [info] - bernoulli partitioning (without iterator) boundary cases (3 milliseconds) [info] - bernoulli partitioning data (0 milliseconds) [info] - bernoulli partitioning clone (0 milliseconds) [info] PoolSuite: [info] - FIFO Scheduler Test (64 milliseconds) [info] - Fair Scheduler Test (86 milliseconds) [info] - Nested Pool Test (62 milliseconds) [info] - SPARK-17663: FairSchedulableBuilder sets default values for blank or invalid datas (7 milliseconds) [info] - FIFO scheduler uses root pool and not spark.scheduler.pool property (80 milliseconds) [info] - WriteAheadLogBasedBlockHandler - store blocks (897 milliseconds) [info] - FAIR Scheduler uses default pool when spark.scheduler.pool property is not set (144 milliseconds) [info] - WriteAheadLogBasedBlockHandler - handle errors in storing block (39 milliseconds) [info] - FAIR Scheduler creates a new pool when spark.scheduler.pool property points to a non-existent pool (110 milliseconds) [info] - Pool should throw IllegalArgumentException when schedulingMode is not supported (1 millisecond) [info] - Fair Scheduler should build fair scheduler when valid spark.scheduler.allocation.file property is set (69 milliseconds) [info] - WriteAheadLogBasedBlockHandler - clean old blocks (99 milliseconds) [info] - Fair Scheduler should use default file(fairscheduler.xml) if it exists in classpath and spark.scheduler.allocation.file property is not set (52 milliseconds) [info] - Fair Scheduler should throw FileNotFoundException when invalid spark.scheduler.allocation.file property is set (60 milliseconds) [info] DiskStoreSuite: [info] - reads of memory-mapped and non memory-mapped files are equivalent (36 milliseconds) [info] - block size tracking (32 milliseconds) [info] - blocks larger than 2gb (33 milliseconds) [info] - block data encryption (56 milliseconds) [info] BlockManagerReplicationSuite: [info] - Test 
Block - count messages (227 milliseconds) [info] - Test Block - isFullyConsumed (46 milliseconds) [info] InputInfoTrackerSuite: [info] - get peers with addition and removal of block managers (33 milliseconds) [info] - test report and get InputInfo from InputInfoTracker (1 millisecond) [info] - test cleanup InputInfo from InputInfoTracker (2 milliseconds) [info] JobGeneratorSuite: [info] - block replication - 2x replication (673 milliseconds) [info] - block replication - 3x replication (1 second, 355 milliseconds) [info] - run Spark in yarn-cluster mode with different configurations, ensuring redaction (19 seconds, 35 milliseconds) [info] - SPARK-6222: Do not clear received block data too soon (2 seconds, 720 milliseconds) [info] ReceivedBlockTrackerSuite: [info] - block addition, and block to batch allocation (5 milliseconds) [info] - block replication - mixed between 1x to 5x (2 seconds, 126 milliseconds) [info] - block replication - off-heap (371 milliseconds) [info] - block replication - 2x replication without peers (1 millisecond) [info] - block replication - replication failures (90 milliseconds) [info] - block replication - addition and deletion of block managers (325 milliseconds) [info] BlockManagerProactiveReplicationSuite: [info] - get peers with addition and removal of block managers (44 milliseconds) [info] - block replication - 2x replication (625 milliseconds) [info] - block replication - 3x replication (1 second, 296 milliseconds) [info] - block replication - mixed between 1x to 5x (2 seconds, 14 milliseconds) [info] - block replication - off-heap (300 milliseconds) [info] - block replication - 2x replication without peers (1 millisecond) [info] - block replication - replication failures (44 milliseconds) [info] - block replication - addition and deletion of block managers (241 milliseconds) [info] - proactive block replication - 2 replicas - 1 block manager deletions (121 milliseconds) [info] - proactive block replication - 3 replicas - 2 block manager deletions (163 milliseconds) [info] - proactive block replication - 4 replicas - 3 block manager deletions (145 milliseconds) [info] - proactive block replication - 5 replicas - 4 block manager deletions (481 milliseconds) [info] BlockManagerBasicStrategyReplicationSuite: [info] - get peers with addition and removal of block managers (24 milliseconds) [info] - block replication - 2x replication (547 milliseconds) [info] - block addition, and block to batch allocation with many blocks (12 seconds, 514 milliseconds) [info] - recovery with write ahead logs should remove only allocated blocks from received queue (19 milliseconds) [info] - block replication - 3x replication (959 milliseconds) [info] - block allocation to batch should not loose blocks from received queue (214 milliseconds) [info] - recovery and cleanup with write ahead logs (41 milliseconds) [info] - disable write ahead log when checkpoint directory is not set (1 millisecond) [info] - parallel file deletion in FileBasedWriteAheadLog is robust to deletion error (22 milliseconds) [info] WindowOperationsSuite: [info] - window - basic window (509 milliseconds) [info] - window - tumbling window (356 milliseconds) [info] - window - larger window (580 milliseconds) [info] - block replication - mixed between 1x to 5x (1 second, 807 milliseconds) [info] - window - non-overlapping window (363 milliseconds) [info] - window - persistence level (92 milliseconds) [info] - block replication - off-heap (260 milliseconds) [info] - block replication - 2x replication without peers (0 
milliseconds) [info] - reduceByKeyAndWindow - basic reduction (335 milliseconds) [info] - block replication - replication failures (50 milliseconds) [info] - reduceByKeyAndWindow - key already in window and new value added into window (305 milliseconds) [info] - block replication - addition and deletion of block managers (267 milliseconds) [info] FlatmapIteratorSuite: [info] - reduceByKeyAndWindow - new key added into window (303 milliseconds) [info] - Flatmap Iterator to Disk (99 milliseconds) [info] - Flatmap Iterator to Memory (72 milliseconds) [info] - Serializer Reset (104 milliseconds) [info] RDDSuite: [info] - reduceByKeyAndWindow - key removed from window (535 milliseconds) [info] - reduceByKeyAndWindow - larger slide time (419 milliseconds) [info] - basic operations (491 milliseconds) [info] - serialization (2 milliseconds) [info] - countApproxDistinct (73 milliseconds) [info] - SparkContext.union (34 milliseconds) [info] - SparkContext.union parallel partition listing (68 milliseconds) [info] - SparkContext.union creates UnionRDD if at least one RDD has no partitioner (3 milliseconds) [info] - SparkContext.union creates PartitionAwareUnionRDD if all RDDs have partitioners (4 milliseconds) [info] - PartitionAwareUnionRDD raises exception if at least one RDD has no partitioner (2 milliseconds) [info] - SPARK-23778: empty RDD in union should not produce a UnionRDD (5 milliseconds) [info] - partitioner aware union (211 milliseconds) [info] - UnionRDD partition serialized size should be small (4 milliseconds) [info] - fold (10 milliseconds) [info] - fold with op modifying first arg (10 milliseconds) [info] - aggregate (13 milliseconds) [info] - reduceByKeyAndWindow - big test (634 milliseconds) [info] - reduceByKeyAndWindow with inverse function - basic reduction (318 milliseconds) [info] - treeAggregate (544 milliseconds) [info] - reduceByKeyAndWindow with inverse function - key already in window and new value added into window (316 milliseconds) [info] - treeAggregate with ops modifying first args (475 milliseconds) [info] - reduceByKeyAndWindow with inverse function - new key added into window (463 milliseconds) [info] - treeReduce (265 milliseconds) [info] - basic caching (28 milliseconds) [info] - caching with failures (16 milliseconds) [info] - empty RDD (141 milliseconds) [info] - reduceByKeyAndWindow with inverse function - key removed from window (350 milliseconds) [info] - yarn-cluster should respect conf overrides in SparkHadoopUtil (SPARK-16414, SPARK-23630) (19 seconds, 34 milliseconds) [info] - reduceByKeyAndWindow with inverse function - larger slide time (340 milliseconds) [info] - repartitioned RDDs (654 milliseconds) [info] - reduceByKeyAndWindow with inverse function - big test (503 milliseconds) [info] - reduceByKeyAndWindow with inverse and filter functions - big test (507 milliseconds) [info] - groupByKeyAndWindow (517 milliseconds) [info] - countByWindow (373 milliseconds) [info] - countByValueAndWindow (310 milliseconds) [info] StreamingListenerSuite: [info] - batch info reporting (609 milliseconds) [info] - receiver info reporting (168 milliseconds) [info] - output operation reporting (1 second, 71 milliseconds) [info] - don't call ssc.stop in listener (985 milliseconds) [info] - onBatchCompleted with successful batch (1 second, 1 millisecond) [info] - onBatchCompleted with failed batch and one failed job (1 second, 1 millisecond) [info] - repartitioned RDDs perform load balancing (7 seconds, 525 milliseconds) [info] - coalesced RDDs (149 milliseconds) [info] 
- coalesced RDDs with locality (42 milliseconds) [info] - coalesced RDDs with partial locality (31 milliseconds) [info] - onBatchCompleted with failed batch and multiple failed jobs (1 second, 1 millisecond) [info] - StreamingListener receives no events after stopping StreamingListenerBus (399 milliseconds) [info] ReceiverInputDStreamSuite: [info] - Without WAL enabled: createBlockRDD creates empty BlockRDD when no block info (61 milliseconds) [info] - Without WAL enabled: createBlockRDD creates correct BlockRDD with block info (77 milliseconds) [info] - Without WAL enabled: createBlockRDD filters non-existent blocks before creating BlockRDD (68 milliseconds) [info] - With WAL enabled: createBlockRDD creates empty WALBackedBlockRDD when no block info (69 milliseconds) [info] - With WAL enabled: createBlockRDD creates correct WALBackedBlockRDD with all block info having WAL info (65 milliseconds) [info] - With WAL enabled: createBlockRDD creates BlockRDD when some block info don't have WAL info (75 milliseconds) [info] WriteAheadLogBackedBlockRDDSuite: [info] - coalesced RDDs with locality, large scale (10K partitions) (1 second, 135 milliseconds) [info] - Read data available in both block manager and write ahead log (86 milliseconds) [info] - Read data available only in block manager, not in write ahead log (48 milliseconds) [info] - Read data available only in write ahead log, not in block manager (51 milliseconds) [info] - Read data with partially available in block manager, and rest in write ahead log (52 milliseconds) [info] - Test isBlockValid skips block fetching from BlockManager (110 milliseconds) [info] - Test whether RDD is valid after removing blocks from block manager (105 milliseconds) [info] - coalesced RDDs with partial locality, large scale (10K partitions) (511 milliseconds) [info] - coalesced RDDs with locality, fail first pass (16 milliseconds) [info] - Test storing of blocks recovered from write ahead log back into block manager (100 milliseconds) [info] - zipped RDDs (31 milliseconds) Exception in thread "block-manager-slave-async-thread-pool-3" Exception in thread "block-manager-slave-async-thread-pool-4" java.lang.Error: java.lang.InterruptedException at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Object.wait(Object.java:502) at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:236) at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1571) at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply$mcZ$sp(BlockManagerSlaveEndpoint.scala:47) at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(BlockManagerSlaveEndpoint.scala:46) at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(BlockManagerSlaveEndpoint.scala:46) at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$1.apply(BlockManagerSlaveEndpoint.scala:86) at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ... 
2 more
java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Object.wait(Object.java:502)
	at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:236)
	at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1571)
	at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply$mcZ$sp(BlockManagerSlaveEndpoint.scala:47)
	at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(BlockManagerSlaveEndpoint.scala:46)
	at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(BlockManagerSlaveEndpoint.scala:46)
	at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$1.apply(BlockManagerSlaveEndpoint.scala:86)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@34f4e8a6 rejected from java.util.concurrent.ThreadPoolExecutor@3142f551[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.Promise$class.failure(Promise.scala:104)
	at scala.concurrent.impl.Promise$DefaultPromise.failure(Promise.scala:157)
	at scala.concurrent.Future$$anonfun$failed$1.apply(Future.scala:194)
	at scala.concurrent.Future$$anonfun$failed$1.apply(Future.scala:192)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:63)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:78)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:54)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
	at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:106)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@3f851cd1 rejected from java.util.concurrent.ThreadPoolExecutor@3142f551[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.Promise$class.failure(Promise.scala:104)
	at scala.concurrent.impl.Promise$DefaultPromise.failure(Promise.scala:157)
	at scala.concurrent.Future$$anonfun$failed$1.apply(Future.scala:194)
	at scala.concurrent.Future$$anonfun$failed$1.apply(Future.scala:192)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:63)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:78)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:54)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
	at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:106)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@190e3da3 rejected from java.util.concurrent.ThreadPoolExecutor@3142f551[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.Promise$class.failure(Promise.scala:104)
	at scala.concurrent.impl.Promise$DefaultPromise.failure(Promise.scala:157)
	at scala.concurrent.Future$$anonfun$failed$1.apply(Future.scala:194)
	at scala.concurrent.Future$$anonfun$failed$1.apply(Future.scala:192)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:63)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:78)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
	at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:54)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
	at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:106)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@707fff6a rejected from java.util.concurrent.ThreadPoolExecutor@3142f551[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@3e0c83d6 rejected from java.util.concurrent.ThreadPoolExecutor@3142f551[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@7ded357a rejected from java.util.concurrent.ThreadPoolExecutor@3142f551[Shutting down, pool size = 3, active threads = 3, queued tasks = 0, completed tasks = 2]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at scala.concurrent.Promise$class.complete(Promise.scala:55)
	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[info] - partition pruning (13 milliseconds)
[info] - read data in block manager and WAL with encryption on (143 milliseconds)
[info] TimeSuite:
[info] - less (0 milliseconds)
[info] - lessEq (0 milliseconds)
[info] - greater (0 milliseconds)
[info] - greaterEq (1 millisecond)
[info] - plus (1 millisecond)
[info] - minus Time (1 millisecond)
[info] - minus Duration (0 milliseconds)
[info] - floor (0 milliseconds)
[info] - isMultipleOf (1 millisecond)
[info] - min (0 milliseconds)
[info] - max (0 milliseconds)
[info] - until (2 milliseconds)
[info] - to (1 millisecond)
[info] DStreamScopeSuite:
[info] - dstream without scope (1 millisecond)
[info] - input dstream without scope (3 milliseconds)
[info] - scoping simple operations (10 milliseconds)
[info] - scoping nested operations (21 milliseconds)
[info] - transform should allow RDD operations to be captured in scopes (13 milliseconds)
[info] - foreachRDD should allow RDD operations to be captured in scope (21 milliseconds)
[info] ReceiverSuite:
[info] - receiver life cycle (342 milliseconds)
[info] - block generator throttling !!! IGNORED !!!
[info] - collect large number of empty partitions (6 seconds, 515 milliseconds)
[info] - take (1 second, 274 milliseconds)
[info] - top with predefined ordering (69 milliseconds)
[info] - top with custom ordering (8 milliseconds)
[info] - takeOrdered with predefined ordering (7 milliseconds)
[info] - takeOrdered with limit 0 (0 milliseconds)
[info] - takeOrdered with custom ordering (7 milliseconds)
[info] - isEmpty (56 milliseconds)
[info] - sample preserves partitioner (2 milliseconds)
[info] - run Spark in yarn-client mode with additional jar (18 seconds, 32 milliseconds)
[info] - write ahead log - generating and cleaning (14 seconds, 321 milliseconds)
[info] StateMapSuite:
[info] - EmptyStateMap (1 millisecond)
[info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove (10 milliseconds)
[info] - OpenHashMapBasedStateMap - put, get, getByTime, getAll, remove with copy (1 millisecond)
[info] - OpenHashMapBasedStateMap - serializing and deserializing (62 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing with compaction (8 milliseconds)
[info] - OpenHashMapBasedStateMap - all possible sequences of operations with copies (6 seconds, 689 milliseconds)
[info] - OpenHashMapBasedStateMap - serializing and deserializing with KryoSerializable states (14 milliseconds)
[info] - EmptyStateMap - serializing and deserializing (19 milliseconds)
[info] - MapWithStateRDDRecord - serializing and deserializing with KryoSerializable states (17 milliseconds)
[info] UIUtilsSuite:
[info] - shortTimeUnitString (0 milliseconds)
[info] - normalizeDuration (4 milliseconds)
[info] - convertToTimeUnit (1 millisecond)
[info] - formatBatchTime (1 millisecond)
[info] DurationSuite:
[info] - less (0 milliseconds)
[info] - lessEq (0 milliseconds)
[info] - greater (1 millisecond)
[info] - greaterEq (1 millisecond)
[info] - plus (1 millisecond)
[info] - minus (1 millisecond)
[info] - times (0 milliseconds)
[info] - div (0 milliseconds)
[info] - isMultipleOf (1 millisecond)
[info] - min (0 milliseconds)
[info] - max (0 milliseconds)
[info] - isZero (0 milliseconds)
[info] - Milliseconds (0 milliseconds)
[info] - Seconds (1 millisecond)
[info] - Minutes (1 millisecond)
[info] MapWithStateRDDSuite:
[info] - creation from pair RDD (355 milliseconds)
[info] - updating state and generating mapped data in MapWithStateRDDRecord (5 milliseconds)
[info] - states generated by MapWithStateRDD (1 second, 114 milliseconds)
[info] - checkpointing (1 second, 255 milliseconds)
[info] - takeSample (17 seconds, 28 milliseconds)
[info] - takeSample from an empty rdd (7 milliseconds)
[info] - checkpointing empty state RDD (437 milliseconds)
[info] DStreamClosureSuite:
[info] - randomSplit (232 milliseconds)
[info] - runJob on an invalid partition (4 milliseconds)
[info] - sort an empty RDD (32 milliseconds)
[info] - user provided closures are actually cleaned (46 milliseconds)
[info] UISeleniumSuite:
[info] - sortByKey (149 milliseconds)
[info] - sortByKey ascending parameter (84 milliseconds)
[info] - sortByKey with explicit ordering (59 milliseconds)
[info] - repartitionAndSortWithinPartitions (49 milliseconds)
[info] - cartesian on empty RDD (10 milliseconds)
[info] - cartesian on non-empty RDDs (35 milliseconds)
[info] - intersection (52 milliseconds)
[info] - intersection strips duplicates in an input (45 milliseconds)
[info] - zipWithIndex (26 milliseconds)
[info] - zipWithIndex with a single partition (8 milliseconds)
[info] - zipWithIndex chained with other RDDs (SPARK-4433) (20 milliseconds)
[info] - zipWithUniqueId (36 milliseconds)
[info] - retag with implicit ClassTag (20 milliseconds)
[info] - parent method (3 milliseconds)
[info] - getNarrowAncestors (11 milliseconds)
[info] - getNarrowAncestors with multiple parents (12 milliseconds)
[info] - getNarrowAncestors with cycles (12 milliseconds)
[info] - task serialization exception should not hang scheduler (22 milliseconds)
[info] - RDD.partitions() fails fast when partitions indicies are incorrect (SPARK-13021) (2 milliseconds)
[info] - nested RDDs are not supported (SPARK-5063) (16 milliseconds)
[info] - actions cannot be performed inside of transformations (SPARK-5063) (15 milliseconds)
[info] - custom RDD coalescer (206 milliseconds)
[info] - SPARK-18406: race between end-of-task and completion iterator read lock release (17 milliseconds)
[info] - SPARK-23496: order of input partitions can result in severe skew in coalesce (4 milliseconds)
[info] - cannot run actions after SparkContext has been stopped (SPARK-5063) (133 milliseconds)
[info] - cannot call methods on a stopped SparkContext (SPARK-5063) (1 millisecond)
[info] - run Spark in yarn-cluster mode with additional jar (18 seconds, 27 milliseconds)
[info] BasicSchedulerIntegrationSuite:
[info] - super simple job (137 milliseconds)
[info] - multi-stage job (155 milliseconds)
[info] - job with fetch failure (317 milliseconds)
[info] - job failure after 4 attempts (103 milliseconds)
[info] OutputCommitCoordinatorSuite:
[info] - Only one of two duplicate commit tasks should commit (55 milliseconds)
[info] - If commit fails, if task is retried it should not be locked, and will succeed. (49 milliseconds)
[info] - attaching and detaching a Streaming tab (1 second, 969 milliseconds)
[info] FileBasedWriteAheadLogSuite:
[info] - FileBasedWriteAheadLog - read all logs (33 milliseconds)
[info] - FileBasedWriteAheadLog - write logs (17 milliseconds)
[info] - FileBasedWriteAheadLog - read all logs after write (19 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs (15 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs synchronously (14 milliseconds)
[info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (49 milliseconds)
[info] - FileBasedWriteAheadLog - do not create directories or files unless write (2 milliseconds)
[info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (13 milliseconds)
[info] - FileBasedWriteAheadLog - seqToParIterator (96 milliseconds)
[info] - FileBasedWriteAheadLogWriter - writing data (19 milliseconds)
[info] - FileBasedWriteAheadLogWriter - syncing of data by writing and reading immediately (17 milliseconds)
[info] - FileBasedWriteAheadLogReader - sequentially reading data (3 milliseconds)
[info] - FileBasedWriteAheadLogReader - sequentially reading data written with writer (2 milliseconds)
[info] - FileBasedWriteAheadLogReader - reading data written with writer after corrupted write (974 milliseconds)
[info] - FileBasedWriteAheadLogReader - handles errors when file doesn't exist (2 milliseconds)
[info] - FileBasedWriteAheadLogRandomReader - reading data using random reader (8 milliseconds)
[info] - FileBasedWriteAheadLogRandomReader- reading data using random reader written with writer (5 milliseconds)
[info] FileBasedWriteAheadLogWithFileCloseAfterWriteSuite:
[info] - FileBasedWriteAheadLog - read all logs (29 milliseconds)
[info] - FileBasedWriteAheadLog - write logs (27 milliseconds)
[info] - FileBasedWriteAheadLog - read all logs after write (35 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs (33 milliseconds)
[info] - FileBasedWriteAheadLog - clean old logs synchronously (38 milliseconds)
[info] - FileBasedWriteAheadLog - handling file errors while reading rotating logs (111 milliseconds)
[info] - FileBasedWriteAheadLog - do not create directories or files unless write (1 millisecond)
[info] - FileBasedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (6 milliseconds)
[info] - FileBasedWriteAheadLog - close after write flag (3 milliseconds)
[info] BatchedWriteAheadLogSuite:
[info] - BatchedWriteAheadLog - read all logs (33 milliseconds)
[info] - BatchedWriteAheadLog - write logs (23 milliseconds)
[info] - BatchedWriteAheadLog - read all logs after write (27 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs (14 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs synchronously (17 milliseconds)
[info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (89 milliseconds)
[info] - BatchedWriteAheadLog - do not create directories or files unless write (2 milliseconds)
[info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (11 milliseconds)
[info] - BatchedWriteAheadLog - serializing and deserializing batched records (2 milliseconds)
[info] - BatchedWriteAheadLog - failures in wrappedLog get bubbled up (22 milliseconds)
[info] - BatchedWriteAheadLog - name log with the highest timestamp of aggregated entries (19 milliseconds)
[info] - BatchedWriteAheadLog - shutdown properly (1 millisecond)
[info] - BatchedWriteAheadLog - fail everything in queue during shutdown (5 milliseconds)
[info] BatchedWriteAheadLogWithCloseFileAfterWriteSuite:
[info] - BatchedWriteAheadLog - read all logs (72 milliseconds)
[info] - BatchedWriteAheadLog - write logs (33 milliseconds)
[info] - BatchedWriteAheadLog - read all logs after write (50 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs (38 milliseconds)
[info] - BatchedWriteAheadLog - clean old logs synchronously (35 milliseconds)
[info] - BatchedWriteAheadLog - handling file errors while reading rotating logs (130 milliseconds)
[info] - BatchedWriteAheadLog - do not create directories or files unless write (2 milliseconds)
[info] - BatchedWriteAheadLog - parallel recovery not enabled if closeFileAfterWrite = false (7 milliseconds)
[info] - BatchedWriteAheadLog - close after write flag (3 milliseconds)
[info] CheckpointSuite:
[info] - non-existent checkpoint dir (3 milliseconds)
[info] - Job should not complete if all commits are denied (5 seconds, 6 milliseconds)
[info] - Only authorized committer failures can clear the authorized committer lock (SPARK-6614) (7 milliseconds)
[info] - SPARK-19631: Do not allow failed attempts to be authorized for committing (4 milliseconds)
[info] - SPARK-24589: Differentiate tasks from different stage attempts (4 milliseconds)
[info] - SPARK-24589: Make sure stage state is cleaned up (853 milliseconds)
[info] TaskMetricsSuite:
[info] - mutating values (1 millisecond)
[info] - mutating shuffle read metrics values (0 milliseconds)
[info] - mutating shuffle write metrics values (1 millisecond)
[info] - mutating input metrics values (0 milliseconds)
[info] - mutating output metrics values (0 milliseconds)
[info] - merging multiple shuffle read metrics (1 millisecond)
[info] - additional accumulables (1 millisecond)
[info] OutputCommitCoordinatorIntegrationSuite:
[info] - exception thrown in OutputCommitter.commitTask() (98 milliseconds)
[info] UIUtilsSuite:
[info] - makeDescription(plainText = false) (22 milliseconds)
[info] - makeDescription(plainText = true) (7 milliseconds)
[info] - SPARK-11906: Progress bar should not overflow because of speculative tasks (2 milliseconds)
[info] - decodeURLParameter (SPARK-12708: Sorting task error in Stages Page when yarn mode.) (0 milliseconds)
[info] - SPARK-20393: Prevent newline characters in parameters. (1 millisecond)
[info] - SPARK-20393: Prevent script from parameters running on page. (0 milliseconds)
[info] - SPARK-20393: Prevent javascript from parameters running on page. (1 millisecond)
[info] - SPARK-20393: Prevent links from parameters on page. (0 milliseconds)
[info] - SPARK-20393: Prevent popups from parameters on page. (0 milliseconds)
[info] SumEvaluatorSuite:
[info] - correct handling of count 1 (2 milliseconds)
[info] - correct handling of count 0 (0 milliseconds)
[info] - correct handling of NaN (1 millisecond)
[info] - correct handling of > 1 values (9 milliseconds)
[info] - test count > 1 (1 millisecond)
[info] ApplicationCacheSuite:
[info] - Completed UI get (34 milliseconds)
[info] - Test that if an attempt ID is set, it must be used in lookups (3 milliseconds)
[info] - Incomplete apps refreshed (9 milliseconds)
[info] - Large Scale Application Eviction (213 milliseconds)
[info] - Attempts are Evicted (13 milliseconds)
[info] - redirect includes query params (26 milliseconds)
[info] RpcAddressSuite:
[info] - hostPort (0 milliseconds)
[info] - fromSparkURL (0 milliseconds)
[info] - fromSparkURL: a typo url (0 milliseconds)
[info] - fromSparkURL: invalid scheme (1 millisecond)
[info] - toSparkURL (0 milliseconds)
[info] HistoryServerSuite:
[info] - application list json (1 second, 248 milliseconds)
[info] - completed app list json (22 milliseconds)
[info] - running app list json (9 milliseconds)
[info] - minDate app list json (10 milliseconds)
[info] - maxDate app list json (8 milliseconds)
[info] - maxDate2 app list json (11 milliseconds)
[info] - minEndDate app list json (9 milliseconds)
[info] - maxEndDate app list json (9 milliseconds)
[info] - minEndDate and maxEndDate app list json (7 milliseconds)
[info] - minDate and maxEndDate app list json (8 milliseconds)
[info] - limit app list json (9 milliseconds)
[info] - one app json (70 milliseconds)
[info] - one app multi-attempt json (6 milliseconds)
[info] - job list json (348 milliseconds)
[info] - job list from multi-attempt app json(1) (255 milliseconds)
[info] - job list from multi-attempt app json(2) (195 milliseconds)
[info] - one job json (7 milliseconds)
[info] - succeeded job list json (7 milliseconds)
[info] - succeeded&failed job list json (10 milliseconds)
[info] - executor list json (12 milliseconds)
[info] - stage list json (68 milliseconds)
[info] - complete stage list json (10 milliseconds)
[info] - failed stage list json (8 milliseconds)
[info] - one stage json (33 milliseconds)
[info] - one stage attempt json (39 milliseconds)
[info] - stage task summary w shuffle write (377 milliseconds)
[info] - stage task summary w shuffle read (23 milliseconds)
[info] - stage task summary w/ custom quantiles (29 milliseconds)
[info] - stage task list (16 milliseconds)
[info] - stage task list w/ offset & length (34 milliseconds)
[info] - stage task list w/ sortBy (21 milliseconds)
[info] - stage task list w/ sortBy short names: -runtime (19 milliseconds)
[info] - stage task list w/ sortBy short names: runtime (17 milliseconds)
[info] - stage list with accumulable json (23 milliseconds)
[info] - stage with accumulable json (28 milliseconds)
[info] - stage task list from multi-attempt app json(1) (12 milliseconds)
[info] - stage task list from multi-attempt app json(2) (19 milliseconds)
[info] - blacklisting for stage (226 milliseconds)
[info] - blacklisting node for stage (207 milliseconds)
[info] - rdd list storage json (16 milliseconds)
[info] - executor node blacklisting (163 milliseconds)
[info] - executor node blacklisting unblacklisting (166 milliseconds)
[info] - executor memory usage (8 milliseconds)
[info] - app environment (53 milliseconds)
[info] - download all logs for app with multiple attempts (123 milliseconds)
[info] - download one log for app with multiple attempts (129 milliseconds)
[info] - response codes on bad paths (26 milliseconds)
[info] - automatically retrieve uiRoot from request through Knox (38 milliseconds)
[info] - static relative links are prefixed with uiRoot (spark.ui.proxyBase) (8 milliseconds)
[info] - /version api endpoint (7 milliseconds)
[info] - basic rdd checkpoints + dstream graph checkpoint recovery (8 seconds, 919 milliseconds)
[info] - recovery of conf through checkpoints (215 milliseconds)
[info] - get correct spark.driver.[host|port] from checkpoint (178 milliseconds)
[info] - SPARK-30199 get ui port and blockmanager port (112 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - recovery with map and reduceByKey operations (465 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 4500 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 5000 ms
-------------------------------------------
(a,4)
[info] - recovery with invertible reduceByKeyAndWindow operation (946 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)
[info] - run Spark in yarn-cluster mode unsuccessfully (16 seconds, 34 milliseconds)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - recovery with saveAsHadoopFiles operation (871 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,2)
(b,1)
[info] - ajax rendered relative links are prefixed with uiRoot (spark.ui.proxyBase) (4 seconds, 298 milliseconds)
[info] - security manager starts with spark.authenticate set (45 milliseconds)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - recovery with saveAsNewAPIHadoopFiles operation (840 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(b,1)
(a,2)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 1500 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(b,1)
(a,2)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(,2)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
[info] - recovery with saveAsHadoopFile inside transform operation (1 second, 33 milliseconds)
-------------------------------------------
Time: 500 ms
-------------------------------------------
(a,1)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
(a,2)
-------------------------------------------
Time: 1500 ms
-------------------------------------------
(a,3)
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,4)
-------------------------------------------
Time: 2500 ms
-------------------------------------------
(a,5)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,6)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)
-------------------------------------------
Time: 3500 ms
-------------------------------------------
(a,7)
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,8)
-------------------------------------------
Time: 4500 ms
-------------------------------------------
(a,9)
-------------------------------------------
Time: 5000 ms
-------------------------------------------
(a,10)
[info] - recovery with updateStateByKey operation (808 milliseconds)
[info] - incomplete apps get refreshed (4 seconds, 334 milliseconds)
[info] - recovery maintains rate controller (2 seconds, 657 milliseconds)
[info] - ui and api authorization checks (939 milliseconds)
[info] - access history application defaults to the last attempt id (253 milliseconds)
[info] JVMObjectTrackerSuite:
[info] - JVMObjectId does not take null IDs (2 milliseconds)
[info] - JVMObjectTracker (2 milliseconds)
[info] BlockManagerSuite:
[info] - StorageLevel object caching (0 milliseconds)
[info] - BlockManagerId object caching (1 millisecond)
[info] - compacted topic (2 minutes, 5 seconds)
[info] - BlockManagerId.isDriver() backwards-compatibility with legacy driver ids (SPARK-6716) (0 milliseconds)
[info] - master + 1 manager interaction (40 milliseconds)
[info] - master + 2 managers interaction (113 milliseconds)
[info] - iterator boundary conditions (280 milliseconds)
[info] - executor sorting (11 milliseconds)
[info] - removing block (115 milliseconds)
[info] - removing rdd (43 milliseconds)
[info] - removing broadcast (248 milliseconds)
[info] - reregistration on heart beat (34 milliseconds)
[info] - reregistration on block update (36 milliseconds)
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:206)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:222)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:157)
	at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:243)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:750)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:989)
	at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:100)
	at org.apache.spark.streaming.TestOutputStream$$anonfun$$lessinit$greater$1.apply(TestSuiteBase.scala:99)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
[info] - reregistration doesn't dead lock (409 milliseconds)
[info] - correct BlockResult returned from get() calls (41 milliseconds)
[info] - optimize a location order of blocks without topology information (29 milliseconds)
[info] - optimize a location order of blocks with topology information (30 milliseconds)
[info] - SPARK-9591: getRemoteBytes from another location when Exception throw (138 milliseconds)
[info] - SPARK-14252: getOrElseUpdate should still read from remote storage (76 milliseconds)
[info] - in-memory LRU storage (31 milliseconds)
[info] - recovery with file input stream (3 seconds, 11 milliseconds)
[info] - in-memory LRU storage with serialization (73 milliseconds)
[info] - in-memory LRU storage with off-heap (124 milliseconds)
[info] - in-memory LRU for partitions of same RDD (28 milliseconds)
[info] - DStreamCheckpointData.restore invoking times (342 milliseconds)
[info] - in-memory LRU for partitions of multiple RDDs (26 milliseconds)
[info] DirectKafkaStreamSuite:
[info] - on-disk storage (encryption = off) (201 milliseconds)
[info] - on-disk storage (encryption = on) (82 milliseconds)
[info] - disk and memory storage (encryption = off) (72 milliseconds)
[info] - disk and memory storage (encryption = on) (52 milliseconds)
[info] - disk and memory storage with getLocalBytes (encryption = off) (49 milliseconds)
[info] - recovery from checkpoint contains array object (798 milliseconds)
[info] - disk and memory storage with getLocalBytes (encryption = on) (57 milliseconds)
[info] - SPARK-11267: the race condition of two checkpoints in a batch (52 milliseconds)
[info] - SPARK-28912: Fix MatchError in getCheckpointFiles (23 milliseconds)
[info] - disk and memory storage with serialization (encryption = off) (81 milliseconds)
[info] - disk and memory storage with serialization (encryption = on) (79 milliseconds)
[info] - disk and memory storage with serialization and getLocalBytes (encryption = off) (59 milliseconds)
[info] - disk and memory storage with serialization and getLocalBytes (encryption = on) (70 milliseconds)
[info] - SPARK-6847: stack overflow when updateStateByKey is followed by a checkpointed dstream (442 milliseconds)
[info] MapWithStateSuite:
[info] - state - get, exists, update, remove, (3 milliseconds)
[info] - disk and off-heap memory storage (encryption = off) (91 milliseconds)
[info] - disk and off-heap memory storage (encryption = on) (109 milliseconds)
[info] - disk and off-heap memory storage with getLocalBytes (encryption = off) (72 milliseconds)
[info] - disk and off-heap memory storage with getLocalBytes (encryption = on) (55 milliseconds)
[info] - mapWithState - basic operations with simple API (420 milliseconds)
[info] - LRU with mixed storage levels (encryption = off) (124 milliseconds)
[info] - basic stream receiving with multiple topics and smallest starting offset (1 second, 690 milliseconds)
[info] - LRU with mixed storage levels (encryption = on) (84 milliseconds)
[info] - in-memory LRU with streams (encryption = off) (43 milliseconds)
[info] - mapWithState - basic operations with advanced API (369 milliseconds)
[info] - mapWithState - type inferencing and class tags (8 milliseconds)
[info] - in-memory LRU with streams (encryption = on) (36 milliseconds)
[info] - LRU with mixed storage levels and streams (encryption = off) (171 milliseconds)
[info] - mapWithState - states as mapped data (391 milliseconds)
[info] - LRU with mixed storage levels and streams (encryption = on) (212 milliseconds)
[info] - negative byte values in ByteBufferInputStream (1 millisecond)
[info] - overly large block (52 milliseconds)
[info] - mapWithState - initial states, with nothing returned as from mapping function (357 milliseconds)
[info] - block compression (409 milliseconds)
[info] - mapWithState - state removing (426 milliseconds)
[info] - block store put failure (13 milliseconds)
[info] - turn off updated block statuses (32 milliseconds)
[info] - updated block statuses (51 milliseconds)
[info] - query block statuses (52 milliseconds)
[info] - get matching blocks (40 milliseconds)
[info] - SPARK-1194 regression: fix the same-RDD rule for cache replacement (32 milliseconds)
[info] - safely unroll blocks through putIterator (disk) (44 milliseconds)
[info] - read-locked blocks cannot be evicted from memory (124 milliseconds)
[info] - remove block if a read fails due to missing DiskStore files (SPARK-15736) (206 milliseconds)
[info] - mapWithState - state timing out (1 second, 79 milliseconds)
[info] - SPARK-13328: refresh block locations (fetch should fail after hitting a threshold) (38 milliseconds)
[info] - mapWithState - checkpoint durations (59 milliseconds)
[info] - SPARK-13328: refresh block locations (fetch should succeed after location refresh) (31 milliseconds)
-------------------------------------------
Time: 1000 ms
-------------------------------------------
-------------------------------------------
Time: 2000 ms
-------------------------------------------
(a,1)
[info] - SPARK-17484: block status is properly updated following an exception in put() (100 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,2)
(b,1)
[info] - pattern based subscription (2 seconds, 756 milliseconds)
[info] - SPARK-17484: master block locations are updated following an invalid remote block fetch (90 milliseconds)
-------------------------------------------
Time: 3000 ms
-------------------------------------------
(a,2)
(b,1)
-------------------------------------------
Time: 4000 ms
-------------------------------------------
(a,3)
(b,2)
(c,1)
-------------------------------------------
Time: 5000 ms
-------------------------------------------
(c,1)
(a,4)
(b,3)
-------------------------------------------
Time: 6000 ms
-------------------------------------------
(b,3)
(c,1)
(a,5)
[info] - receiving from largest starting offset (316 milliseconds)
-------------------------------------------
Time: 7000 ms
-------------------------------------------
(a,5)
(b,3)
(c,1)
[info] - mapWithState - driver failure recovery (618 milliseconds)
[info] BlockGeneratorSuite:
[info] - block generation and data callbacks (34 milliseconds)
[info] - stop ensures correct shutdown (233 milliseconds)
[info] - creating stream by offset (357 milliseconds)
[info] - block push errors are reported (32 milliseconds)
[info] StreamingJobProgressListenerSuite:
[info] - onBatchSubmitted, onBatchStarted, onBatchCompleted, onReceiverStarted, onReceiverError, onReceiverStopped (87 milliseconds)
[info] - Remove the old completed batches when exceeding the limit (80 milliseconds)
[info] - out-of-order onJobStart and onBatchXXX (149 milliseconds)
[info] - detect memory leak (120 milliseconds)
[info] ExecutorAllocationManagerSuite:
[info] - basic functionality (64 milliseconds)
[info] - requestExecutors policy (15 milliseconds)
[info] - killExecutor policy (6 milliseconds)
[info] - parameter validation (18 milliseconds)
[info] - enabling and disabling (406 milliseconds)
[info] RateLimiterSuite:
[info] - rate limiter initializes even without a maxRate set (1 millisecond)
[info] - rate limiter updates when below maxRate (1 millisecond)
[info] - rate limiter stays below maxRate despite large updates (0 milliseconds)
[info] StreamingContextSuite:
[info] - from no conf constructor (74 milliseconds)
[info] - from no conf + spark home (76 milliseconds)
[info] - from no conf + spark home + env (64 milliseconds)
[info] - from conf with settings (152 milliseconds)
[info] - from existing SparkContext (80 milliseconds)
[info] - from existing SparkContext with settings (108 milliseconds)
[info] - from checkpoint (205 milliseconds)
[info] - checkPoint from conf (94 milliseconds)
[info] - state matching (0 milliseconds)
[info] - start and stop state check (80 milliseconds)
[info] - start with non-serializable DStream checkpoints (134 milliseconds)
[info] - start failure should stop internal components (63 milliseconds)
[info] - start should set local properties of streaming jobs correctly (276 milliseconds)
[info] - start multiple times (99 milliseconds)
[info] - stop multiple times (155 milliseconds)
[info] - offset recovery (3 seconds, 55 milliseconds)
[info] - stop before start (102 milliseconds)
[info] - start after stop (59 milliseconds)
[info] - stop only streaming context (157 milliseconds)
[info] - stop(stopSparkContext=true) after stop(stopSparkContext=false) (70 milliseconds)
[info] - offset recovery from kafka (637 milliseconds)
[info] - Direct Kafka stream report input information (681 milliseconds)
[info] - maxMessagesPerPartition with backpressure disabled (102 milliseconds)
[info] - maxMessagesPerPartition with no lag (81 milliseconds)
[info] - SPARK-20640: Shuffle registration timeout and maxAttempts conf are working (5 seconds, 246 milliseconds)
[info] - maxMessagesPerPartition respects max rate (98 milliseconds)
[info] - fetch remote block to local disk if block size is larger than threshold (34 milliseconds)
[info] - query locations of blockIds (6 milliseconds)
[info] CompactBufferSuite:
[info] - empty buffer (1 millisecond)
[info] - basic inserts (6 milliseconds)
[info] - adding sequences (2 milliseconds)
[info] - adding the same buffer to itself (2 milliseconds)
[info] MasterSuite:
[info] - can use a custom recovery mode factory (84 milliseconds)
[info] - master correctly recover the application (93 milliseconds)
[info] - master/worker web ui available (271 milliseconds)
[info] - using rate controller (2 seconds, 44 milliseconds)
[info] - backpressure.initialRate should honor maxRatePerPartition (382 milliseconds)
[info] - use backpressure.initialRate with backpressure (345 milliseconds)
[info] - maxMessagesPerPartition with zero offset and rate equal to the specified minimum with default 1 (59 milliseconds)
[info] - stop gracefully (6 seconds, 175 milliseconds)
[info] KafkaDataConsumerSuite:
[info] - KafkaDataConsumer reuse in case of same groupId and TopicPartition (4 milliseconds)
[info] - stop gracefully even if a receiver misses StopReceiver (787 milliseconds)
[info] - concurrent use of KafkaDataConsumer (1 second, 251 milliseconds)
[info] Test run started
[info] Test org.apache.spark.streaming.kafka010.JavaLocationStrategySuite.testLocationStrategyConstructors started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.007s
[info] Test run started
[info] Test org.apache.spark.streaming.kafka010.JavaKafkaRDDSuite.testKafkaRDD started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 2.07s
[info] Test run started
[info] Test org.apache.spark.streaming.kafka010.JavaConsumerStrategySuite.testConsumerStrategyConstructors started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.002s
[info] Test run started
[info] Test org.apache.spark.streaming.kafka010.JavaDirectKafkaStreamSuite.testKafkaStream started
[info] - run Spark in yarn-cluster mode failure after sc initialized (30 seconds, 33 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 2.074s
[info] SparkAWSCredentialsBuilderSuite:
[info] - should build DefaultCredentials when given no params (21 milliseconds)
[info] - should build BasicCredentials (2 milliseconds)
[info] - should build STSCredentials (1 millisecond)
[info] - SparkAWSCredentials classes should be serializable (4 milliseconds)
[info] KinesisCheckpointerSuite:
[info] - checkpoint is not called twice for the same sequence number (38 milliseconds)
[info] - checkpoint is called after sequence number increases (2 milliseconds)
[info] - should checkpoint if we have exceeded the checkpoint interval (13 milliseconds)
[info] - shouldn't checkpoint if we have not exceeded the checkpoint interval (1 millisecond)
[info] - should not checkpoint for the same sequence number (2 milliseconds)
[info] - removing checkpointer checkpoints one last time (1 millisecond)
[info] - if checkpointing is going on, wait until finished before removing and checkpointing (91 milliseconds)
[info] KinesisInputDStreamBuilderSuite:
[info] - should raise an exception if the StreamingContext is missing (4 milliseconds)
[info] - should raise an exception if the stream name is missing (4 milliseconds)
[info] - should raise an exception if the checkpoint app name is missing (1 millisecond)
[info] - should propagate required values to KinesisInputDStream (306 milliseconds)
[info] - should propagate default values to KinesisInputDStream (5 milliseconds)
[info] - should propagate custom non-auth values to KinesisInputDStream (16 milliseconds)
[info] - old Api should throw UnsupportedOperationExceptionexception with AT_TIMESTAMP (2 milliseconds)
[info] KinesisReceiverSuite:
[info] - process records including store and set checkpointer (5 milliseconds)
[info] - split into multiple processes if a limitation is set (2 milliseconds)
[info] - shouldn't store and update checkpointer when receiver is stopped (2 milliseconds)
[info] - shouldn't update checkpointer when exception occurs during store (6 milliseconds)
[info] - shutdown should checkpoint if the reason is TERMINATE (6 milliseconds)
[info] - shutdown should not checkpoint if the reason is something other than TERMINATE (1 millisecond)
[info] - retry success on first attempt (1 millisecond)
[info] - retry success on second attempt after a Kinesis throttling exception (12 milliseconds)
[info] - retry success on second attempt after a Kinesis dependency exception (61 milliseconds)
[info] - retry failed after a shutdown exception (3 milliseconds)
[info] - retry failed after an invalid state exception (3 milliseconds)
[info] - retry failed after unexpected exception (4 milliseconds)
[info] - retry failed after exhausting all retries (66 milliseconds)
[info] WithAggregationKinesisBackedBlockRDDSuite:
[info] - Basic reading from Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available in both block manager and Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in block manager, not in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in Kinesis, not in block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available partially in block manager, rest in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test isBlockValid skips block fetching from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test whether RDD is valid after removing blocks from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] WithoutAggregationKinesisBackedBlockRDDSuite:
[info] - Basic reading from Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available in both block manager and Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in block manager, not in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available only in Kinesis, not in block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Read data available partially in block manager, rest in Kinesis [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test isBlockValid skips block fetching from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Test whether RDD is valid after removing blocks from block manager [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] WithAggregationKinesisStreamSuite:
[info] - KinesisUtils API (21 milliseconds)
[info] - RDD generation (39 milliseconds)
[info] - basic operation [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - custom message handling [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Kinesis read with custom configurations (4 milliseconds)
[info] - split and merge shards in a stream [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - failure recovery [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Prepare KinesisTestUtils [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] WithoutAggregationKinesisStreamSuite:
[info] - KinesisUtils API (4 milliseconds)
[info] - RDD generation (6 milliseconds)
[info] - basic operation [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - custom message handling [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Kinesis read with custom configurations (4 milliseconds)
[info] - split and merge shards in a stream [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - failure recovery [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] - Prepare KinesisTestUtils [enable by setting env var ENABLE_KINESIS_TESTS=1] !!! IGNORED !!!
[info] Test run started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisStreamSuite.testCustomHandlerAwsStsCreds started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisStreamSuite.testCustomHandlerAwsCreds started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisStreamSuite.testCustomHandler started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisStreamSuite.testAwsCreds started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisStreamSuite.testKinesisStream started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.822s
[info] Test run started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilderOldApi started
[info] Test org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilder started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.281s
[info] ScalaTest
[info] Run completed in 6 minutes, 14 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 104, Failed 0, Errors 0, Passed 103, Skipped 1
[info] ScalaTest
[info] Run completed in 6 minutes, 14 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 44, Failed 0, Errors 0, Passed 44
[info] ScalaTest
[info] Run completed in 6 minutes, 13 seconds.
[info] Total number of tests run: 85
[info] Suites: completed 8, aborted 0
[info] Tests: succeeded 85, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 85, Failed 0, Errors 0, Passed 85
[info] ScalaTest
[info] Run completed in 6 minutes, 14 seconds.
[info] Total number of tests run: 5
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 5, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 5, Failed 0, Errors 0, Passed 5
[info] ScalaTest
[info] Run completed in 6 minutes, 10 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ExecutorClassLoaderSuite:
[info] - stop slow receiver gracefully (15 seconds, 889 milliseconds)
[info] - registering and de-registering of streamingSource (62 milliseconds)
[info] - SPARK-28709 registering and de-registering of progressListener (87 milliseconds)
[info] - child over system classloader (1 second, 21 milliseconds)
[info] - child first (48 milliseconds)
[info] - parent first (52 milliseconds)
[info] - child first can fall back (58 milliseconds)
[info] - child first can fail (73 milliseconds)
[info] - resource from parent (64 milliseconds)
[info] - resources from parent (54 milliseconds)
[info] - fetch classes using Spark's RpcEnv (298 milliseconds)
[info] ReplSuite:
[info] - awaitTermination (2 seconds, 70 milliseconds)
[info] - awaitTermination after stop (70 milliseconds)
[info] - awaitTermination with error in task (105 milliseconds)
[info] - awaitTermination with error in job generation (477 milliseconds)
[info] - awaitTerminationOrTimeout (1 second, 71 milliseconds)
[info] - getOrCreate (628 milliseconds)
[info] - getActive and getActiveOrCreate (135 milliseconds)
[info] - getActiveOrCreate with checkpoint (481 milliseconds)
[info] - multiple streaming contexts (56 milliseconds)
[info] - DStream and generated RDD creation sites (520 milliseconds)
[info] - throw exception on using active or stopped context (98 milliseconds)
[info] - propagation of local properties (5 seconds, 28 milliseconds)
[info] - queueStream doesn't support checkpointing (538 milliseconds)
[info] - Creating an InputDStream but not using it should not crash (964 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933175413).
Spark session available as 'spark'.
[info] - run Python application in yarn-client mode (20 seconds, 31 milliseconds)
[info] - master/worker web ui available with reverseProxy (30 seconds, 265 milliseconds)
[info] - basic scheduling - spread out (61 milliseconds)
[info] - basic scheduling - no spread out (61 milliseconds)
[info] - basic scheduling with more memory - spread out (54 milliseconds)
[info] - basic scheduling with more memory - no spread out (59 milliseconds)
[info] - scheduling with max cores - spread out (67 milliseconds)
[info] - scheduling with max cores - no spread out (63 milliseconds)
[info] - scheduling with cores per executor - spread out (65 milliseconds)
[info] - scheduling with cores per executor - no spread out (58 milliseconds)
[info] - scheduling with cores per executor AND max cores - spread out (95 milliseconds)
[info] - scheduling with cores per executor AND max cores - no spread out (113 milliseconds)
[info] - scheduling with executor limit - spread out (61 milliseconds)
[info] - scheduling with executor limit - no spread out (72 milliseconds)
[info] - scheduling with executor limit AND max cores - spread out (58 milliseconds)
[info] - scheduling with executor limit AND max cores - no spread out (52 milliseconds)
[info] - scheduling with executor limit AND cores per executor - spread out (187 milliseconds)
[info] - scheduling with executor limit AND cores per executor - no spread out (57 milliseconds)
[info] - scheduling with executor limit AND cores per executor AND max cores - spread out (86 milliseconds)
[info] - scheduling with executor limit AND cores per executor AND max cores - no spread out (56 milliseconds)
[info] - SPARK-13604: Master should ask Worker kill unknown executors and drivers (216 milliseconds)
[info] - SPARK-20529: Master should reply the address received from worker (90 milliseconds)
[info] - SPARK-15236: use Hive catalog (6 seconds, 495 milliseconds)
[info] - SPARK-19900: there should be a corresponding driver for the app after relaunching driver (2 seconds, 147 milliseconds)
[info] CompletionIteratorSuite:
[info] - basic test (1 millisecond)
[info] - reference to sub iterator should not be available after completion (829 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933180712).
Spark session available as 'spark'.
[info] SparkListenerSuite:
[info] - don't call sc.stop in listener (61 milliseconds)
[info] - basic creation and shutdown of LiveListenerBus (5 milliseconds)
[info] - bus.stop() waits for the event queue to completely drain (3 milliseconds)
[info] - metrics for dropped listener events (3 milliseconds)
[info] - basic creation of StageInfo (73 milliseconds)
[info] - basic creation of StageInfo with shuffle (168 milliseconds)
[info] - StageInfo with fewer tasks than partitions (60 milliseconds)
[info] - SPARK-15236: use in-memory catalog (2 seconds, 861 milliseconds)
[info] - local metrics (1 second, 296 milliseconds)
[info] - onTaskGettingResult() called when result fetched remotely (402 milliseconds)
[info] - onTaskGettingResult() not called when result sent directly (104 milliseconds)
[info] - onTaskEnd() should be called for all started tasks, even after job has been killed (136 milliseconds)
[info] - SparkListener moves on if a listener throws an exception (21 milliseconds)
[info] - registering listeners via spark.extraListeners (197 milliseconds)
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:206)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:222)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:157)
	at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:243)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:750)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:989)
	at org.apache.spark.streaming.StreamingContextSuite$$anonfun$57$$anonfun$apply$21.apply(StreamingContextSuite.scala:850)
	at org.apache.spark.streaming.StreamingContextSuite$$anonfun$57$$anonfun$apply$21.apply(StreamingContextSuite.scala:848)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
[info] - add and remove listeners to/from LiveListenerBus queues (6 milliseconds)
[info] - interrupt within listener is handled correctly: throw interrupt (42 milliseconds)
[info] - interrupt within listener is handled correctly: set Thread interrupted (40 milliseconds)
[info] - SPARK-30285: Fix deadlock in AsyncEventQueue.removeListenerOnError: throw interrupt (30 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933183522).
Spark session available as 'spark'.
[info] - SPARK-30285: Fix deadlock in AsyncEventQueue.removeListenerOnError: set Thread interrupted (25 milliseconds)
[info] SortShuffleSuite:
[info] - SPARK-18560 Receiver data should be deserialized properly. (9 seconds, 685 milliseconds)
[info] - groupByKey without compression (249 milliseconds)
[info] - SPARK-22955 graceful shutdown shouldn't lead to job generation error (478 milliseconds)
[info] RateControllerSuite:
[info] - RateController - rate controller publishes updates after batches complete (439 milliseconds)
[info] - ReceiverRateController - published rates reach receivers (582 milliseconds)
[info] FailureSuite:
[info] - broadcast vars (4 seconds, 451 milliseconds)
[info] - shuffle non-zero block size (3 seconds, 444 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933187559).
Spark session available as 'spark'.
[info] - line wrapper only initialized once when used as encoder outer scope (3 seconds, 157 milliseconds)
[info] - shuffle serializer (3 seconds, 581 milliseconds)
Spark context available as 'sc' (master = local-cluster[1,1,1024], app id = app-20210117172632-0000).
Spark session available as 'spark'.
// Exiting paste mode, now interpreting.
[info] - define case class and create Dataset together with paste mode (5 seconds, 654 milliseconds)
[info] - zero sized blocks (5 seconds, 530 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933196523).
Spark session available as 'spark'.
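The java.lang.Error from thread "streaming-job-executor-0" above wraps an InterruptedException: a streaming job was still blocked in RDD.collect() under DStream.foreachRDD (StreamingContextSuite.scala:848-850) when the StreamingContext was stopped, so the job-executor thread was interrupted while waiting on the DAGScheduler. A minimal sketch of that pattern, with a hypothetical queue-backed input standing in for the suite's receiver:

    import scala.collection.mutable
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setMaster("local[2]").setAppName("graceful-shutdown-sketch")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Hypothetical input source; the suite drives a real receiver, but any DStream works here.
    val queue = mutable.Queue(ssc.sparkContext.makeRDD(1 to 1000))
    val stream = ssc.queueStream(queue)

    // The stack trace shows a batch job blocked inside collect() under foreachRDD.
    stream.foreachRDD { rdd =>
      rdd.collect() // DAGScheduler.runJob blocks here until the batch finishes
    }

    ssc.start()
    // A graceful stop waits for in-flight batches; interrupting the
    // streaming-job-executor thread mid-collect() surfaces as the Error above.
    ssc.stop(stopSparkContext = true, stopGracefully = true)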
[info] - :replay should work correctly (3 seconds, 163 milliseconds)
[info] - run Python application in yarn-cluster mode (23 seconds, 32 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933199515).
Spark session available as 'spark'.
[info] - spark-shell should find imported types in class constructors and extends clause (2 seconds, 62 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933201674).
Spark session available as 'spark'.
[info] - zero sized blocks without kryo (6 seconds, 15 milliseconds)
Spark context available as 'sc' (master = local, app id = local-1610933203764).
Spark session available as 'spark'.
[info] - spark-shell should shadow val/def definitions correctly (4 seconds, 445 milliseconds)
Spark context available as 'sc' (master = local-cluster[1,1,1024], app id = app-20210117172646-0000).
Spark session available as 'spark'.
[info] - shuffle on mutable pairs (3 seconds, 670 milliseconds)
// Exiting paste mode, now interpreting.
[info] - SPARK-26633: ExecutorClassLoader.getResourceAsStream find REPL classes (4 seconds, 732 milliseconds)
[info] SingletonReplSuite:
[info] - sorting on mutable pairs (3 seconds, 880 milliseconds)
Spark context available as 'sc' (master = local-cluster[2,1,1024], app id = app-20210117172650-0000).
Spark session available as 'spark'.
[info] - cogroup using mutable pairs (3 seconds, 709 milliseconds)
[info] - simple foreach with accumulator (2 seconds, 508 milliseconds)
[info] - external vars (1 second, 508 milliseconds)
[info] - external classes (454 milliseconds)
[info] - external functions (504 milliseconds)
[info] - subtract mutable pairs (3 seconds, 629 milliseconds)
[info] - external functions that access vars (2 seconds, 931 milliseconds)
[info] - broadcast vars (1 second, 4 milliseconds)
[info] - sort with Java non serializable class - Kryo (4 seconds, 10 milliseconds)
[info] - run Python application in yarn-cluster mode using spark.yarn.appMasterEnv to override local envvar (23 seconds, 27 milliseconds)
[info] - interacting with files (1 second, 528 milliseconds)
[info] - local-cluster mode (2 seconds, 5 milliseconds)
[info] - sort with Java non serializable class - Java (3 seconds, 196 milliseconds)
[info] - shuffle with different compression settings (SPARK-3426) (478 milliseconds)
[info] - SPARK-1199 two instances of same class don't type check. (1 second, 8 milliseconds)
[info] - [SPARK-4085] rerun map stage if reduce stage cannot find its local shuffle file (374 milliseconds)
[info] - SPARK-2452 compound statements. (302 milliseconds)
[info] - cannot find its local shuffle file if no execution of the stage and rerun shuffle (110 milliseconds)
[info] - metrics for shuffle without aggregation (295 milliseconds)
[info] - metrics for shuffle with aggregation (622 milliseconds)
[info] - multiple simultaneous attempts for one task (SPARK-8029) (95 milliseconds)
[info] - SortShuffleManager properly cleans up files for shuffles that use the serialized path (226 milliseconds)
[info] - SortShuffleManager properly cleans up files for shuffles that use the deserialized path (89 milliseconds)
[info] TaskSchedulerImplSuite:
[info] - Scheduler does not always schedule tasks on the same workers (849 milliseconds)
[info] - Scheduler correctly accounts for multiple CPUs per task (45 milliseconds)
[info] - Scheduler does not crash when tasks are not serializable (41 milliseconds)
[info] - SPARK-2576 importing implicits (2 seconds, 505 milliseconds)
[info] - concurrent attempts for the same stage only have one active taskset (56 milliseconds)
[info] - don't schedule more tasks after a taskset is zombie (48 milliseconds)
[info] - if a zombie attempt finishes, continue scheduling tasks for non-zombie attempts (55 milliseconds)
[info] - tasks are not re-scheduled while executor loss reason is pending (48 milliseconds)
[info] - scheduled tasks obey task and stage blacklists (156 milliseconds)
[info] - scheduled tasks obey node and executor blacklists (99 milliseconds)
[info] - abort stage when all executors are blacklisted and we cannot acquire new executor (57 milliseconds)
[info] - SPARK-22148 abort timer should kick in when task is completely blacklisted & no new executor can be acquired (68 milliseconds)
[info] - SPARK-22148 try to acquire a new executor when task is unschedulable with 1 executor (61 milliseconds)
[info] - multiple failures with map (44 seconds, 68 milliseconds)
[info] - SPARK-22148 abort timer should clear unschedulableTaskSetToExpiryTime for all TaskSets (74 milliseconds)
[info] - SPARK-22148 Ensure we don't abort the taskSet if we haven't been completely blacklisted (58 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 0 (97 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 1 (99 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 2 (88 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 3 (84 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 4 (81 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 5 (87 milliseconds)
[info] - Datasets and encoders (1 second, 504 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 6 (83 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 7 (82 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 8 (83 milliseconds)
[info] - Blacklisted node for entire task set prevents per-task blacklist checks: iteration 9 (94 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 0 (134 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 1 (92 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 2 (106 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 3 (98 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 4 (99 milliseconds)
[info] - SPARK-2632 importing a method from non serializable class and not using it. (1 second, 4 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 5 (96 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 6 (101 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 7 (102 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 8 (97 milliseconds)
[info] - collecting objects of class defined in repl (503 milliseconds)
[info] - Blacklisted executor for entire task set prevents per-task blacklist checks: iteration 9 (139 milliseconds)
[info] - abort stage if executor loss results in unschedulability from previously failed tasks (51 milliseconds)
[info] - don't abort if there is an executor available, though it hasn't had scheduled tasks yet (48 milliseconds)
[info] - SPARK-16106 locality levels updated if executor added to existing host (52 milliseconds)
[info] - scheduler checks for executors that can be expired from blacklist (54 milliseconds)
[info] - if an executor is lost then the state for its running tasks is cleaned up (SPARK-18553) (49 milliseconds)
[info] - if a task finishes with TaskState.LOST its executor is marked as dead (62 milliseconds)
[info] - Locality should be used for bulk offers even with delay scheduling off (51 milliseconds)
[info] - With delay scheduling off, tasks can be run at any locality level immediately (71 milliseconds)
[info] - TaskScheduler should throw IllegalArgumentException when schedulingMode is not supported (60 milliseconds)
[info] - Completions in zombie tasksets update status of non-zombie taskset (112 milliseconds)
[info] - don't schedule for a barrier taskSet if available slots are less than pending tasks (47 milliseconds)
[info] - schedule tasks for a barrier taskSet if all tasks can be launched together (59 milliseconds)
[info] - SPARK-29263: barrier TaskSet can't schedule when higher prio taskset takes the slots (48 milliseconds)
[info] - collecting objects of class defined in repl - shuffling (1 second, 6 milliseconds)
[info] - cancelTasks shall kill all the running tasks and fail the stage (57 milliseconds)
[info] - killAllTaskAttempts shall kill all the running tasks and not fail the stage (77 milliseconds)
[info] - mark taskset for a barrier stage as zombie in case a task fails (47 milliseconds)
[info] ChunkedByteBufferFileRegionSuite:
[info] - transferTo can stop and resume correctly (2 milliseconds)
[info] - transfer to with random limits (272 milliseconds)
[info] CryptoStreamUtilsSuite:
[info] - crypto configuration conversion (0 milliseconds)
[info] - shuffle encryption key length should be 128 by default (1 millisecond)
[info] - create 256-bit key (1 millisecond)
[info] - create key with invalid length (0 milliseconds)
[info] - serializer manager integration (4 milliseconds)
[info] - replicating blocks of object with class defined in repl (1 second, 505 milliseconds)
[info] - encryption key propagation to executors (2 seconds, 907 milliseconds)
[info] - crypto stream wrappers (5 milliseconds)
[info] - error handling wrapper (5 milliseconds)
[info] RBackendSuite:
[info] - close() clears jvmObjectTracker (1 millisecond)
[info] NettyRpcAddressSuite:
[info] - toString (1 millisecond)
[info] - toString for client mode (0 milliseconds)
[info] SparkContextSchedulerCreationSuite:
[info] - bad-master (49 milliseconds)
[info] - local (65 milliseconds)
[info] - local-* (49 milliseconds)
[info] - local-n (50 milliseconds)
[info] - local-*-n-failures (50 milliseconds)
[info] - local-n-failures (49 milliseconds)
[info] - bad-local-n (48 milliseconds)
[info] - bad-local-n-failures (46 milliseconds)
[info] - local-default-parallelism (49 milliseconds)
[info] - should clone and clean line object in ClosureCleaner (2 seconds, 505 milliseconds)
[info] - local-cluster (258 milliseconds)
[info] CommandUtilsSuite:
[info] - set libraryPath correctly (29 milliseconds)
[info] - auth secret shouldn't appear in java opts (95 milliseconds)
[info] ImplicitOrderingSuite:
[info] - basic inference of Orderings (83 milliseconds)
[info] AsyncRDDActionsSuite:
[info] - countAsync (15 milliseconds)
[info] - collectAsync (15 milliseconds)
[info] - foreachAsync (12 milliseconds)
[info] - foreachPartitionAsync (15 milliseconds)
[info] - SPARK-31399: should clone+clean line object w/ non-serializable state in ClosureCleaner (1 second, 17 milliseconds)
[info] - SPARK-31399: ClosureCleaner should discover indirectly nested closure in inner class (1 second, 3 milliseconds)
[info] - takeAsync (1 second, 432 milliseconds)
[info] - async success handling (9 milliseconds)
[info] - async failure handling (12 milliseconds)
[info] - FutureAction result, infinite wait (8 milliseconds)
[info] - FutureAction result, finite wait (7 milliseconds)
[info] - FutureAction result, timeout (23 milliseconds)
[info] - SimpleFutureAction callback must not consume a thread while waiting (34 milliseconds)
[info] - ComplexFutureAction callback must not consume a thread while waiting (23 milliseconds)
[info] JsonProtocolSuite:
[info] - writeApplicationInfo (6 milliseconds)
[info] - writeWorkerInfo (0 milliseconds)
[info] - writeApplicationDescription (3 milliseconds)
[info] - writeExecutorRunner (1 millisecond)
[info] - writeDriverInfo (4 milliseconds)
[info] - writeMasterState (1 millisecond)
[info] - writeWorkerState (72 milliseconds)
[info] VersionUtilsSuite:
[info] - Parse Spark major version (2 milliseconds)
[info] - Parse Spark minor version (1 millisecond)
[info] - Parse Spark major and minor versions (2 milliseconds)
[info] - Return short version number (2 milliseconds)
[info] ShuffleBlockFetcherIteratorSuite:
[info] - successful 3 local reads + 2 remote reads (53 milliseconds)
[info] - release current unexhausted buffer in case the task completes early (14 milliseconds)
[info] - fail all blocks if any of the remote request fails (23 milliseconds)
[info] - newProductSeqEncoder with REPL defined class (453 milliseconds)
[info] - retry corrupt blocks (20 milliseconds)
[info] - big blocks are not checked for corruption (7 milliseconds)
[info] - retry corrupt blocks (disabled) (14 milliseconds)
[info] - Blocks should be shuffled to disk when size of the request is above the threshold(maxReqSizeShuffleToMem). (17 milliseconds)
[info] - fail zero-size blocks (18 milliseconds)
[info] BlockManagerMasterSuite:
[info] - SPARK-31422: getMemoryStatus should not fail after BlockManagerMaster stops (3 milliseconds)
[info] - SPARK-31422: getStorageStatus should not fail after BlockManagerMaster stops (1 millisecond)
[info] DiskBlockManagerSuite:
[info] - basic block creation (1 millisecond)
[info] - enumerating blocks (13 milliseconds)
[info] - SPARK-22227: non-block files are skipped (1 millisecond)
[info] TaskSetManagerSuite:
[info] - TaskSet with no preferences (53 milliseconds)
[info] - multiple offers with no preferences (55 milliseconds)
[info] - skip unsatisfiable locality levels (47 milliseconds)
[info] - basic delay scheduling (50 milliseconds)
[info] - we do not need to delay scheduling when we only have noPref tasks in the queue (48 milliseconds)
[info] - delay scheduling with fallback (54 milliseconds)
[info] - delay scheduling with failed hosts (48 milliseconds)
[info] - task result lost (49 milliseconds)
[info] - repeated failures lead to task set abortion (51 milliseconds)
[info] - executors should be blacklisted after task failure, in spite of locality preferences (65 milliseconds)
[info] - new executors get added and lost (47 milliseconds)
[info] - Executors exit for reason unrelated to currently running tasks (44 milliseconds)
[info] - test RACK_LOCAL tasks (46 milliseconds)
[info] - do not emit warning when serialized task is small (43 milliseconds)
[info] - user class path first in client mode (18 seconds, 33 milliseconds)
[info] - emit warning when serialized task is large (54 milliseconds)
[info] - Not serializable exception thrown if the task cannot be serialized (50 milliseconds)
[info] GenerateUnsafeRowJoinerBitsetSuite:
[info] - bitset concat: boundary size 0, 0 (705 milliseconds)
[info] + num fields: 0 and 0
[info] + num fields: 0 and 0
[info] + num fields: 0 and 0
[info] + num fields: 0 and 0
[info] + num fields: 0 and 0
[info] - bitset concat: boundary size 0, 64 (62 milliseconds)
[info] + num fields: 0 and 64
[info] + num fields: 0 and 64
[info] + num fields: 0 and 64
[info] + num fields: 0 and 64
[info] + num fields: 0 and 64
[info] - bitset concat: boundary size 64, 0 (30 milliseconds)
[info] + num fields: 64 and 0
[info] + num fields: 64 and 0
[info] + num fields: 64 and 0
[info] + num fields: 64 and 0
[info] + num fields: 64 and 0
[info] - bitset concat: boundary size 64, 64 (45 milliseconds)
[info] + num fields: 64 and 64
[info] + num fields: 64 and 64
[info] + num fields: 64 and 64
[info] + num fields: 64 and 64
[info] + num fields: 64 and 64
[info] - bitset concat: boundary size 0, 128 (31 milliseconds)
[info] + num fields: 0 and 128
[info] + num fields: 0 and 128
[info] + num fields: 0 and 128
[info] + num fields: 0 and 128
[info] + num fields: 0 and 128
[info] - bitset concat: boundary size 128, 0 (33 milliseconds)
[info] + num fields: 128 and 0
[info] + num fields: 128 and 0
[info] + num fields: 128 and 0
[info] + num fields: 128 and 0
[info] + num fields: 128 and 0
[info] - bitset concat: boundary size 128, 128 (118 milliseconds)
[info] + num fields: 128 and 128
[info] + num fields: 128 and 128
[info] + num fields: 128 and 128
[info] + num fields: 128 and 128
[info] + num fields: 128 and 128
[info] - bitset concat: single word bitsets (20 milliseconds)
[info] + num fields: 10 and 5
[info] + num fields: 10 and 5
[info] + num fields: 10 and 5
[info] + num fields: 10 and 5
[info] + num fields: 10 and 5
[info] - bitset concat: first bitset larger than a word (24 milliseconds)
[info] + num fields: 67 and 5
[info] + num fields: 67 and 5
[info] + num fields: 67 and 5
[info] + num fields: 67 and 5
[info] + num fields: 67 and 5
[info] - bitset concat: second bitset larger than a word (25 milliseconds)
[info] + num fields: 6 and 67
[info] + num fields: 6 and 67
[info] + num fields: 6 and 67
[info] + num fields: 6 and 67
[info] + num fields: 6 and 67
[info] - bitset concat: no reduction in bitset size (29 milliseconds)
[info] + num fields: 33 and 34
[info] + num fields: 33 and 34
[info] + num fields: 33 and 34
[info] + num fields: 33 and 34
[info] + num fields: 33 and 34
[info] - abort the job if total size of results is too large (1 second, 532 milliseconds)
Exception in thread "task-result-getter-3" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:206)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:222)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:227)
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:220)
	at org.apache.spark.network.BlockTransferService.fetchBlockSync(BlockTransferService.scala:121)
	at org.apache.spark.storage.BlockManager.getRemoteBytes(BlockManager.scala:757)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$1.apply$mcV$sp(TaskResultGetter.scala:88)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$1.apply(TaskResultGetter.scala:63)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$1.apply(TaskResultGetter.scala:63)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
	at org.apache.spark.scheduler.TaskResultGetter$$anon$3.run(TaskResultGetter.scala:62)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
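The interrupted task-result-getter thread above is a side effect of the "abort the job if total size of results is too large" test: when the serialized task results exceed spark.driver.maxResultSize, the job is aborted while result blocks are still being fetched. A standalone sketch of the same limit (master URL and sizes are arbitrary choices for illustration, not taken from the suite):

    import org.apache.spark.{SparkConf, SparkContext, SparkException}

    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("max-result-size-sketch")
      .set("spark.driver.maxResultSize", "1m") // deliberately tiny cap

    val sc = new SparkContext(conf)
    try {
      // Roughly 16 MB of task results, far above the 1 MB cap, so the job aborts.
      sc.parallelize(1 to 4, 4).map(_ => new Array[Byte](4 * 1024 * 1024)).collect()
    } catch {
      case e: SparkException =>
        // The message names spark.driver.maxResultSize as the reason for the abort.
        println(e.getMessage)
    } finally {
      sc.stop()
    }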
[info] - bitset concat: two words (78 milliseconds)
[info] + num fields: 120 and 95
[info] + num fields: 120 and 95
[info] + num fields: 120 and 95
[info] + num fields: 120 and 95
[info] + num fields: 120 and 95
[info] - bitset concat: bitset 65, 128 (58 milliseconds)
[info] + num fields: 65 and 128
[info] + num fields: 65 and 128
[info] + num fields: 65 and 128
[info] + num fields: 65 and 128
[info] + num fields: 65 and 128
[info] - [SPARK-13931] taskSetManager should not send Resubmitted tasks after being a zombie (68 milliseconds)
[info] - [SPARK-22074] Task killed by other attempt task should not be resubmitted (58 milliseconds)
[info] - speculative and noPref task should be scheduled after node-local (50 milliseconds)
[info] - node-local tasks should be scheduled right away when there are only node-local and no-preference tasks (46 milliseconds)
[info] - SPARK-4939: node-local tasks should be scheduled right after process-local tasks finished (51 milliseconds)
[info] - SPARK-4939: no-pref tasks should be scheduled after process-local tasks finished (53 milliseconds)
[info] - Ensure TaskSetManager is usable after addition of levels (64 milliseconds)
[info] - Test that locations with HDFSCacheTaskLocation are treated as PROCESS_LOCAL. (46 milliseconds)
[info] - Test TaskLocation for different host type. (0 milliseconds)
[info] - Kill other task attempts when one attempt belonging to the same task succeeds (62 milliseconds)
[info] - Killing speculative tasks does not count towards aborting the taskset (69 milliseconds)
[info] - SPARK-19868: DagScheduler only notified of taskEnd when state is ready (62 milliseconds)
[info] - SPARK-17894: Verify TaskSetManagers for different stage attempts have unique names (49 milliseconds)
[info] - don't update blacklist for shuffle-fetch failures, preemption, denied commits, or killed tasks (73 milliseconds)
[info] - update application blacklist for shuffle-fetch (48 milliseconds)
[info] - update blacklist before adding pending task to avoid race condition (51 milliseconds)
[info] - SPARK-21563 context's added jars shouldn't change mid-TaskSet (49 milliseconds)
[info] - [SPARK-24677] Avoid NoSuchElementException from MedianHeap (72 milliseconds)
[info] - SPARK-24755 Executor loss can cause task to not be resubmitted (55 milliseconds)
[info] - SPARK-13343 speculative tasks that didn't commit shouldn't be marked as success (67 milliseconds)
[info] DiskBlockObjectWriterSuite:
[info] - verify write metrics (54 milliseconds)
[info] - verify write metrics on revert (49 milliseconds)
[info] - Reopening a closed block writer (2 milliseconds)
[info] - calling revertPartialWritesAndClose() on a partial write should truncate up to commit (5 milliseconds)
[info] - calling revertPartialWritesAndClose() after commit() should have no effect (3 milliseconds)
[info] - calling revertPartialWritesAndClose() on a closed block writer should have no effect (4 milliseconds)
[info] - commit() and close() should be idempotent (5 milliseconds)
[info] - revertPartialWritesAndClose() should be idempotent (2 milliseconds)
[info] - commit() and close() without ever opening or writing (1 millisecond)
[info] PartitioningSuite:
[info] - HashPartitioner equality (0 milliseconds)
[info] - RangePartitioner equality (71 milliseconds)
[info] - RangePartitioner getPartition (142 milliseconds)
[info] - RangePartitioner for keys that are not Comparable (but with Ordering) (18 milliseconds)
[info] - RangPartitioner.sketch (35 milliseconds)
[info] - RangePartitioner.determineBounds (0 milliseconds)
[info] - RangePartitioner should run only one job if data is roughly balanced (1 second, 407 milliseconds)
[info] - bitset concat: randomized tests (4 seconds, 106 milliseconds)
[info] + num fields: 538 and 301
[info] + num fields: 682 and 487
[info] + num fields: 145 and 809
[info] + num fields: 901 and 758
[info] + num fields: 744 and 576
[info] + num fields: 162 and 148
[info] + num fields: 448 and 832
[info] + num fields: 118 and 807
[info] + num fields: 984 and 11
[info] + num fields: 638 and 543
[info] + num fields: 984 and 334
[info] + num fields: 706 and 152
[info] + num fields: 53 and 900
[info] + num fields: 270 and 284
[info] + num fields: 778 and 658
[info] + num fields: 297 and 81
[info] + num fields: 54 and 574
[info] + num fields: 331 and 553
[info] + num fields: 400 and 560
[info] ConstraintPropagationSuite:
[info] - RangePartitioner should work well on unbalanced data (1 second, 126 milliseconds)
[info] - RangePartitioner should return a single partition for empty RDDs (17 milliseconds)
[info] - HashPartitioner not equal to RangePartitioner (12 milliseconds)
[info] - partitioner preservation (58 milliseconds)
[info] - partitioning Java arrays should fail (16 milliseconds)
[info] - zero-length partitions should be correctly handled (101 milliseconds)
[info] - Number of elements in RDD is less than number of partitions (11 milliseconds)
[info] - defaultPartitioner (4 milliseconds)
[info] - defaultPartitioner when defaultParallelism is set (4 milliseconds)
[info] - propagating constraints in filters (384 milliseconds)
[info] PartitionwiseSampledRDDSuite:
[info] - propagating constraints in aggregate (57 milliseconds)
[info] - propagating constraints in expand (49 milliseconds)
[info] - seed distribution (61 milliseconds)
[info] - concurrency (26 milliseconds)
[info] - propagating constraints in aliases (86 milliseconds)
[info] PartitionPruningRDDSuite:
[info] - propagating constraints in union (47 milliseconds)
[info] - propagating constraints in intersect (9 milliseconds)
[info] - propagating constraints in except (5 milliseconds)
[info] - Pruned Partitions inherit locality prefs correctly (2 milliseconds)
[info] - propagating constraints in inner join (18 milliseconds)
[info] - propagating constraints in left-semi join (7 milliseconds)
[info] - propagating constraints in left-outer join (8 milliseconds)
[info] - propagating constraints in right-outer join (7 milliseconds)
[info] - Pruned Partitions can be unioned (33 milliseconds)
[info] - propagating constraints in full-outer join (10 milliseconds)
[info] - infer additional constraints in filters (7 milliseconds)
[info] ClosureCleanerSuite2:
[info] - infer constraints on cast (61 milliseconds)
[info] - get inner closure classes (5 milliseconds)
[info] - get outer classes and objects (2 milliseconds)
[info] - get outer classes and objects with nesting (3 milliseconds)
[info] - find accessed fields (4 milliseconds)
[info] - find accessed fields with nesting (5 milliseconds)
[info] - clean basic serializable closures (6 milliseconds)
[info] - clean basic non-serializable closures (33 milliseconds)
[info] - clean basic nested serializable closures (6 milliseconds)
[info] - infer isnotnull constraints from compound expressions (78 milliseconds)
[info] - infer IsNotNull constraints from non-nullable attributes (3 milliseconds)
[info] - clean basic nested non-serializable closures (31 milliseconds)
[info] - not infer non-deterministic constraints (13 milliseconds)
[info] - clean complicated nested serializable closures (4 milliseconds)
[info] - enable/disable constraint propagation (18 milliseconds)
[info] - clean complicated nested non-serializable closures (36 milliseconds)
[info] - verify nested LMF closures !!! CANCELED !!! (1 millisecond)
[info]   ClosureCleanerSuite2.supportsLMFs was false (ClosureCleanerSuite2.scala:579)
[info]   org.scalatest.exceptions.TestCanceledException:
[info]   at org.scalatest.Assertions$class.newTestCanceledException(Assertions.scala:531)
[info]   at org.scalatest.FunSuite.newTestCanceledException(FunSuite.scala:1560)
[info]   at org.scalatest.Assertions$AssertionsHelper.macroAssume(Assertions.scala:516)
[info]   at org.apache.spark.util.ClosureCleanerSuite2$$anonfun$31.apply$mcV$sp(ClosureCleanerSuite2.scala:579)
[info]   at org.apache.spark.util.ClosureCleanerSuite2$$anonfun$31.apply(ClosureCleanerSuite2.scala:578)
[info]   at org.apache.spark.util.ClosureCleanerSuite2$$anonfun$31.apply(ClosureCleanerSuite2.scala:578)
[info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
[info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
[info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:147)
[info]   at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
[info]   at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:54)
[info]   at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:221)
[info]   at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:54)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
[info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
[info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
[info]   at scala.collection.immutable.List.foreach(List.scala:392)
[info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
[info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
[info]   at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
[info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
[info]   at org.scalatest.Suite$class.run(Suite.scala:1147)
[info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
[info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
[info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
[info]   at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:54)
[info]   at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
[info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:54)
[info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
[info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info] BufferHolderSparkSubmitSuite:
[info] PrimitiveKeyOpenHashMapSuite:
[info] - size for specialized, primitive key, value (int, int) (4 milliseconds)
[info] - initialization (1 millisecond)
[info] - basic operations (27 milliseconds)
[info] - null values (1 millisecond)
[info] - changeValue (5 milliseconds)
[info] - inserting in capacity-1 map (2 milliseconds)
[info] - contains (0 milliseconds)
[info] StaticMemoryManagerSuite:
[info] - single task requesting on-heap execution memory (3 milliseconds)
[info] - two tasks requesting full on-heap execution memory (2 milliseconds)
[info] - two tasks cannot grow past 1 / N of on-heap execution memory (2 milliseconds)
[info] - tasks can block to get at least 1 / 2N of on-heap execution memory (303 milliseconds)
[info] - TaskMemoryManager.cleanUpAllAllocatedMemory (304 milliseconds)
[info] - tasks should not be granted a negative amount of execution memory (2 milliseconds)
[info] - off-heap execution allocations cannot exceed limit (3 milliseconds)
[info] - basic execution memory (6 milliseconds)
[info] - basic storage memory (4 milliseconds)
[info] - execution and storage isolation (2 milliseconds)
[info] - unroll memory (2 milliseconds)
[info] MutableURLClassLoaderSuite:
[info] - child first (2 milliseconds)
[info] - parent first (1 millisecond)
[info] - child first can fall back (1 millisecond)
[info] - child first can fail (1 millisecond)
[info] - default JDK classloader get resources (0 milliseconds)
[info] - parent first get resources (1 millisecond)
[info] - child first get resources (5 milliseconds)
[info] - driver sets context class loader in local mode (193 milliseconds)
[info] SortingSuite:
[info] - sortByKey (63 milliseconds)
[info] - large array (69 milliseconds)
[info] - large array with one split (53 milliseconds)
[info] - large array with many partitions (276 milliseconds)
[info] - sort descending (76 milliseconds)
[info] - sort descending with one split (53 milliseconds)
[info] - sort descending with many partitions (273 milliseconds)
[info] - more partitions than elements (114 milliseconds)
[info] - empty RDD (40 milliseconds)
[info] - partition balancing (116 milliseconds)
[info] - partition balancing for descending sort (101 milliseconds)
[info] - get a range of elements in a sorted RDD that is on one partition (103 milliseconds)
[info] - get a range of elements over multiple partitions in a descendingly sorted RDD (107 milliseconds)
[info] - get a range of elements in an array not partitioned by a range partitioner (21 milliseconds)
[info] - get a range of elements over multiple partitions but not taking up full partitions (124 milliseconds)
[info] LocalCheckpointSuite:
[info] - transform storage level (1 millisecond)
[info] - basic lineage truncation (45 milliseconds)
[info] - basic lineage truncation - caching before checkpointing (40 milliseconds)
[info] - basic lineage truncation - caching after checkpointing (35 milliseconds)
[info] - indirect lineage truncation (40 milliseconds)
[info] - indirect lineage truncation - caching before checkpointing (40 milliseconds)
[info] - indirect lineage truncation - caching after checkpointing (33 milliseconds)
[info] - SPARK-22222: Buffer holder should be able to allocate memory larger than 1GB (4 seconds, 308 milliseconds)
[info] CombiningLimitsSuite:
[info] - limits: combines two limits (44 milliseconds)
[info] - limits: combines three limits (10 milliseconds)
[info] - limits: combines two limits after ColumnPruning (11 milliseconds)
[info] EliminateSerializationSuite:
[info] - back to back serialization (1 second, 476 milliseconds)
[info] - back to back serialization with object change (50 milliseconds)
[info] - back to back serialization in AppendColumns (81 milliseconds)
[info] - back to back serialization in AppendColumns with object change (42 milliseconds)
[info] InferFiltersFromConstraintsSuite:
[info] - filter: filter out constraints in condition (21 milliseconds)
[info] - single inner join: filter out values on either side on equi-join keys (38 milliseconds)
[info] - single inner join: filter out nulls on either side on non equal keys (22 milliseconds)
[info] - single inner join with pre-existing filters: filter out values on either side (24 milliseconds)
[info] - single outer join: no null filters are generated (6 milliseconds)
[info] - multiple inner joins: filter out values on all sides on equi-join keys (46 milliseconds)
[info] - inner join with filter: filter out values on all sides on equi-join keys (15 milliseconds)
[info] - inner join with alias: alias contains multiple attributes (35 milliseconds)
[info] - inner join with alias: alias contains single attributes (30 milliseconds)
[info] - generate correct filters for alias that don't produce recursive constraints (10 milliseconds)
[info] - No inferred filter when constraint propagation is disabled (4 milliseconds)
[info] - constraints should be inferred from aliased literals (13 milliseconds)
[info] - SPARK-23405: left-semi equal-join should filter out null join keys on both sides (11 milliseconds)
[info] - SPARK-21479: Outer join after-join filters push down to null-supplying side (16 milliseconds)
[info] - SPARK-21479: Outer join pre-existing filters push down to null-supplying side (17 milliseconds)
[info] - SPARK-21479: Outer join no filter push down to preserved side (12 milliseconds)
[info] - SPARK-23564: left anti join should filter out null join keys on right side (9 milliseconds)
[info] - SPARK-23564: left outer join should filter out null join keys on right side (8 milliseconds)
[info] - SPARK-23564: right outer join should filter out null join keys on left side (8 milliseconds)
[info] RewriteDistinctAggregatesSuite:
[info] - single distinct group (13 milliseconds)
[info] - single distinct group with partial aggregates (5 milliseconds)
[info] - multiple distinct groups (14 milliseconds)
[info] - multiple distinct groups with partial aggregates (13 milliseconds)
[info] - multiple distinct groups with non-partial aggregates (10 milliseconds)
[info] NullExpressionsSuite:
[info] - isnull and isnotnull (827 milliseconds)
[info] - AssertNotNUll (1 millisecond)
[info] - IsNaN (205 milliseconds)
[info] - nanvl (207 milliseconds)
[info] - coalesce (1 second, 977 milliseconds)
[info] - SPARK-16602 Nvl should support numeric-string cases (32 milliseconds)
[info] - AtLeastNNonNulls (247 milliseconds)
[info] - Coalesce should not throw 64kb exception (1 second, 436 milliseconds)
[info] - SPARK-22705: Coalesce should use less global variables (2 milliseconds)
[info] - AtLeastNNonNulls should not throw 64kb exception (1 second, 31 milliseconds)
[info] AttributeSetSuite:
[info] - sanity check (1 millisecond)
[info] - checks by id not name (1 millisecond)
[info] - ++ preserves AttributeSet (0 milliseconds)
[info] - extracts all references (1 millisecond)
[info] - dedups attributes (1 millisecond)
[info] - subset (0 milliseconds)
[info] - equality (1 millisecond)
[info] - SPARK-18394 keep a deterministic output order along with attribute names and exprIds (1 millisecond)
[info] - user class path first in cluster mode (20 seconds, 29 milliseconds)
[info] ResolveInlineTablesSuite:
[info] - validate inputs are foldable (5 milliseconds)
[info] - validate input dimensions (1 millisecond)
[info] - do not fire the rule if not all expressions are resolved (1 millisecond)
[info] - convert (5 milliseconds)
[info] - convert TimeZoneAwareExpression (2 milliseconds)
[info] - nullability inference in convert (2 milliseconds)
[info] SortOrderExpressionsSuite:
[info] - checkpoint without draining iterator (10 seconds, 13 milliseconds)
[info] - SortPrefix (533 milliseconds)
[info] CheckCartesianProductsSuite:
[info] - CheckCartesianProducts doesn't throw an exception if cross joins are enabled) (15 milliseconds)
[info] - CheckCartesianProducts throws an exception for join types that require a join condition (11 milliseconds)
[info] - CheckCartesianProducts doesn't throw an exception if a join condition is present (10 milliseconds)
[info] - CheckCartesianProducts doesn't throw an exception if join types don't require conditions (5 milliseconds)
[info] RuleExecutorSuite:
[info] - only once (2 milliseconds)
[info] - to fixed point (1 millisecond)
[info] - to maxIterations (2 milliseconds)
[info] - structural integrity checker (1 millisecond)
[info] TypeCoercionSuite:
[info] - implicit type cast - ByteType (3 milliseconds)
[info] - implicit type cast - ShortType (1 millisecond)
[info] - implicit type cast - IntegerType (0 milliseconds)
[info] - implicit type cast - LongType (0 milliseconds)
[info] - implicit type cast - FloatType (1 millisecond)
[info] - implicit type cast - DoubleType (1 millisecond)
[info] - implicit type cast - DecimalType(10, 2) (1 millisecond)
[info] - implicit type cast - BinaryType (0 milliseconds)
[info] - implicit type cast - BooleanType (1 millisecond)
[info] - implicit type cast - StringType (2 milliseconds)
[info] - implicit type cast - DateType (1 millisecond)
[info] - implicit type cast - TimestampType (1 millisecond)
[info] - implicit type cast - ArrayType(StringType) (8 milliseconds)
[info] - implicit type cast between two Map types (11 milliseconds)
[info] - implicit type cast - StructType().add("a1", StringType) (9 milliseconds)
[info] - implicit type cast - NullType (1 millisecond)
[info] - implicit type cast - CalendarIntervalType (1 millisecond)
[info] - eligible implicit type cast - TypeCollection (2 milliseconds)
[info] - ineligible implicit type cast - TypeCollection (0 milliseconds)
[info] - tightest common bound for types (6 milliseconds)
[info] - wider common type for decimal and array (9 milliseconds)
[info] - cast NullType for expressions that implement ExpectsInputTypes (3 milliseconds)
[info] - cast NullType for binary operators (3 milliseconds)
[info] - coalesce casts (12 milliseconds)
[info] - CreateArray casts (4 milliseconds)
[info] - CreateMap casts (9 milliseconds)
[info] - greatest/least cast (16 milliseconds)
[info] - nanvl casts (5 milliseconds)
[info] - type coercion for If (9 milliseconds)
[info] - type coercion for CaseKeyWhen (9 milliseconds)
[info] - type coercion for Stack (10 milliseconds)
[info] - type coercion for Concat (11 milliseconds)
[info] - type coercion for Elt (11 milliseconds)
[info] - BooleanEquality type cast (4 milliseconds)
[info] - BooleanEquality simplification (5 milliseconds)
[info] - WidenSetOperationTypes for except and intersect (12 milliseconds)
[info] - WidenSetOperationTypes for union (3 milliseconds)
[info] - Transform Decimal precision/scale for union except and intersect (10 milliseconds)
[info] - rule for date/timestamp operations (10 milliseconds)
[info] - make sure rules do not fire early (4 milliseconds)
[info] - SPARK-15776 Divide expression's dataType should be casted to Double or Decimal in aggregation function like sum (4 milliseconds)
[info] - SPARK-17117 null type coercion in divide (2 milliseconds)
[info] - binary comparison with string promotion (7 milliseconds)
[info] - cast WindowFrame boundaries to the type they operate upon (5 milliseconds)
[info] EncoderErrorMessageSuite:
[info] - primitive types in encoders using Kryo serialization (5 milliseconds)
[info] - primitive types in encoders using Java serialization (1 millisecond)
[info] - nice error message for missing encoder (56 milliseconds)
[info] RegexpExpressionsSuite:
[info] - LIKE Pattern (1 second, 349 milliseconds)
[info] - RLIKE Regular Expression (601 milliseconds)
[info] - RegexReplace (159 milliseconds)
[info] - SPARK-22570: RegExpReplace should not create a lot of global variables (0 milliseconds)
[info] - RegexExtract (193 milliseconds)
[info] - SPLIT (65 milliseconds)
[info] - SPARK-30759: cache initialization for literal patterns (1 millisecond)
[info] NondeterministicSuite:
[info] - MonotonicallyIncreasingID (17 milliseconds)
[info] - SparkPartitionID (20 milliseconds)
[info] - InputFileName (21 milliseconds)
[info] EncoderResolutionSuite:
[info] - real type doesn't match encoder schema but they are compatible: product (44 milliseconds)
[info] - real type doesn't match encoder schema but they are compatible: nested product (52 milliseconds)
[info] - real type doesn't match encoder schema but they are compatible: tupled encoder (26 milliseconds)
[info] - real type doesn't match encoder schema but they are compatible: primitive array (32 milliseconds)
[info] - the real type is not compatible with encoder schema: primitive array (15 milliseconds)
[info] - real type doesn't match encoder schema but they are compatible: array (243 milliseconds)
[info] - real type doesn't match encoder schema but they are compatible: nested array (83 milliseconds)
[info] - the real type is not compatible with encoder schema: non-array field (22 milliseconds)
[info] - the real type is not compatible with encoder schema: array element type (36 milliseconds)
[info] - the real type is not compatible with encoder schema: nested array element type (98 milliseconds)
[info] - nullability of array type element should not fail analysis (26 milliseconds)
[info] - the real number of fields doesn't match encoder schema: tuple encoder (14 milliseconds)
[info] - the real number of fields doesn't match encoder schema: nested tuple encoder (28 milliseconds)
[info] - nested case class can have different number of fields from the real schema (26 milliseconds)
[info] - throw exception if real type is not compatible with encoder schema (36 milliseconds)
[info] - cast from int to Long should success (1 millisecond)
[info] - cast from date to java.sql.Timestamp should success (1 millisecond)
[info] - cast from bigint to String should success (2 milliseconds)
[info] - cast from int to java.math.BigDecimal should success (2 milliseconds)
[info] - cast from bigint to java.math.BigDecimal should success (2 milliseconds)
[info] - cast from bigint to Int should fail (1 millisecond)
[info] - cast from timestamp to java.sql.Date should fail (1 millisecond)
[info] - cast from decimal(38,18) to Double should fail (1 millisecond)
[info] - cast from double to java.math.BigDecimal should fail (1 millisecond)
[info] - cast from decimal(38,18) to Int should fail (1 millisecond)
[info] - cast from string to Long should fail (1 millisecond)
[info] ExpressionEncoderSuite:
[info] - encode/decode for primitive boolean: false (codegen path) (39 milliseconds)
[info] - encode/decode for primitive boolean: false (interpreted path) (4 milliseconds)
[info] - encode/decode for primitive byte: -3 (codegen path) (13 milliseconds)
[info] - encode/decode for primitive byte: -3 (interpreted path) (3 milliseconds)
[info] - encode/decode for primitive short: -3 (codegen path) (11 milliseconds)
[info] - encode/decode for primitive short: -3 (interpreted path) (3 milliseconds)
[info] - encode/decode for primitive int: -3 (codegen path) (15 milliseconds)
[info] - encode/decode for primitive int: -3 (interpreted path) (3 milliseconds)
[info] - encode/decode for primitive long: -3 (codegen path) (12 milliseconds)
[info] - encode/decode for primitive long: -3 (interpreted path) (4 milliseconds)
[info] - encode/decode for primitive float: -3.7 (codegen path) (11 milliseconds)
[info] - encode/decode for primitive float: -3.7 (interpreted path) (3 milliseconds)
[info] - encode/decode for primitive double: -3.7 (codegen path) (12 milliseconds)
[info] - encode/decode for primitive double: -3.7 (interpreted path) (3 milliseconds)
[info] - encode/decode for boxed boolean: false (codegen path) (15 milliseconds)
[info] - encode/decode for boxed boolean: false (interpreted path) (4 milliseconds)
[info] - encode/decode for boxed byte: -3 (codegen path) (15 milliseconds)
[info] - encode/decode for boxed byte: -3 (interpreted path) (3 milliseconds)
[info] - encode/decode for boxed short: -3 (codegen path) (13 milliseconds)
[info] - encode/decode for boxed short: -3 (interpreted path) (4 milliseconds)
[info] - encode/decode for boxed int: -3 (codegen path) (19 milliseconds)
[info] - encode/decode for boxed int: -3 (interpreted path) (4 milliseconds)
[info] - encode/decode for boxed long: -3 (codegen path) (14 milliseconds)
[info] - encode/decode for boxed long: -3 (interpreted path) (4 milliseconds)
[info] - encode/decode for boxed float: -3.7 (codegen path) (14 milliseconds)
[info] - encode/decode for boxed float: -3.7 (interpreted path) (4 milliseconds)
[info] - encode/decode for boxed double: -3.7 (codegen path) (18 milliseconds)
[info] - encode/decode for boxed double: -3.7 (interpreted path) (4 milliseconds)
[info] - encode/decode for scala decimal: 32131413.211321313 (codegen path) (20 milliseconds)
[info] - encode/decode for scala decimal: 32131413.211321313 (interpreted path) (4 milliseconds)
[info] - encode/decode for java decimal: 231341.23123 (codegen path) (16 milliseconds)
[info] - encode/decode for java decimal: 231341.23123 (interpreted path) (4 milliseconds)
[info] - encode/decode for scala biginteger: 23134123123 (codegen path) (16 milliseconds)
[info] - encode/decode for scala biginteger: 23134123123 (interpreted path) (5 milliseconds)
[info] - encode/decode for java BigInteger: 23134123123 (codegen path) (17 milliseconds)
[info] - encode/decode for java BigInteger: 23134123123 (interpreted path) (3 milliseconds)
[info] - encode/decode for catalyst decimal: 32131413.211321313 (codegen path) (11 milliseconds)
[info] - encode/decode for catalyst decimal: 32131413.211321313 (interpreted path) (3 milliseconds)
[info] - encode/decode for string: hello (codegen path) (16 milliseconds)
[info] - encode/decode for string: hello (interpreted path) (4 milliseconds)
[info] - encode/decode for date: 2012-12-23 (codegen path) (18 milliseconds)
[info] - encode/decode for date: 2012-12-23 (interpreted path) (4 milliseconds)
[info] - encode/decode for timestamp: 2016-01-29 10:00:00.0 (codegen path) (20 milliseconds)
[info] - encode/decode for timestamp: 2016-01-29 10:00:00.0 (interpreted path) (4 milliseconds)
[info] - encode/decode for array of timestamp: [Ljava.sql.Timestamp;@2362fa04 (codegen path) (26 milliseconds)
[info] - encode/decode for array of timestamp: [Ljava.sql.Timestamp;@2362fa04 (interpreted path) (17 milliseconds)
[info] - encode/decode for binary: [B@4e4f0a75 (codegen path) (12 milliseconds)
[info] - encode/decode for binary: [B@4e4f0a75 (interpreted path) (3 milliseconds)
[info] - encode/decode for seq of int: List(31, -123, 4) (codegen path) (23 milliseconds)
[info] - encode/decode for seq of int: List(31, -123, 4) (interpreted path) (13 milliseconds)
[info] - encode/decode for seq of string: List(abc, xyz) (codegen path) (25 milliseconds)
[info] - encode/decode for seq of string: List(abc, xyz) (interpreted path) (13 milliseconds)
[info] - encode/decode for seq of string with null: List(abc, null, xyz) (codegen path) (25 milliseconds)
[info] - encode/decode for seq of string with null: List(abc, null, xyz) (interpreted path) (16 milliseconds)
[info] - encode/decode for empty seq of int: List() (codegen path) (16 milliseconds)
[info] - encode/decode for empty seq of int: List() (interpreted path) (17 milliseconds)
[info] - encode/decode for empty seq of string: List() (codegen path) (38 milliseconds)
[info] - encode/decode for empty seq of string: List() (interpreted path) (19 milliseconds)
[info] - encode/decode for seq of seq of int: List(List(31, -123), null, List(4, 67)) (codegen path) (31 milliseconds)
[info] - encode/decode for seq of seq of int: List(List(31, -123), null, List(4, 67)) (interpreted path) (30 milliseconds)
[info] - encode/decode for seq of seq of string: List(List(abc, xyz), List(null), null, List(1, null, 2)) (codegen path) (32 milliseconds)
[info] - encode/decode for seq of seq of string: List(List(abc, xyz), List(null), null, List(1, null, 2)) (interpreted path) (18 milliseconds)
[info] - encode/decode for array of int: [I@411708b5 (codegen path) (24 milliseconds)
[info] - encode/decode for array of int: [I@411708b5 (interpreted path) (11 milliseconds)
[info] - encode/decode for array of string: [Ljava.lang.String;@6d421023 (codegen path) (21 milliseconds)
[info] - encode/decode for array of string: [Ljava.lang.String;@6d421023 (interpreted path) (13 milliseconds)
[info] - encode/decode for array of string with null: [Ljava.lang.String;@6377a22c (codegen path) (22 milliseconds)
[info] - encode/decode for array of string with null: [Ljava.lang.String;@6377a22c (interpreted path) (12 milliseconds)
[info] - encode/decode for empty array of int: [I@2e397eb5 (codegen path) (14 milliseconds)
[info] - encode/decode for empty array of int: [I@2e397eb5 (interpreted path) (13 milliseconds)
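Each ExpressionEncoderSuite entry above checks one round trip, value to InternalRow and back, once through the codegen path and once through the interpreted path. A minimal sketch of that round trip against the Spark 2.4 encoder API (the tuple value is an arbitrary choice for illustration):

    import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

    // Derive an encoder for a simple product type.
    val enc = ExpressionEncoder[(Int, String)]()

    // Serialize to Catalyst's internal row format ...
    val row = enc.toRow((42, "hello"))

    // ... and deserialize with a resolved and bound copy of the encoder.
    val decoded = enc.resolveAndBind().fromRow(row)
    assert(decoded == ((42, "hello")))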
for empty array of int: [I@2e397eb5 (interpreted path) (13 milliseconds) [info] - encode/decode for empty array of string: [Ljava.lang.String;@2a8347a2 (codegen path) (23 milliseconds) [info] - encode/decode for empty array of string: [Ljava.lang.String;@2a8347a2 (interpreted path) (12 milliseconds) [info] - encode/decode for array of array of int: [[I@27d90629 (codegen path) (36 milliseconds) [info] - encode/decode for array of array of int: [[I@27d90629 (interpreted path) (18 milliseconds) [info] - encode/decode for array of array of string: [[Ljava.lang.String;@76234fb1 (codegen path) (28 milliseconds) [info] - encode/decode for array of array of string: [[Ljava.lang.String;@76234fb1 (interpreted path) (17 milliseconds) [info] - encode/decode for map: Map(1 -> a, 2 -> b) (codegen path) (33 milliseconds) [info] - encode/decode for map: Map(1 -> a, 2 -> b) (interpreted path) (5 milliseconds) [info] - encode/decode for map with null: Map(1 -> a, 2 -> null) (codegen path) (28 milliseconds) [info] - encode/decode for map with null: Map(1 -> a, 2 -> null) (interpreted path) (6 milliseconds) [info] - encode/decode for map of map: Map(1 -> Map(a -> 1), 2 -> Map(b -> 2)) (codegen path) (39 milliseconds) [info] - encode/decode for map of map: Map(1 -> Map(a -> 1), 2 -> Map(b -> 2)) (interpreted path) (5 milliseconds) [info] - encode/decode for null seq in tuple: (null) (codegen path) (22 milliseconds) [info] - encode/decode for null seq in tuple: (null) (interpreted path) (21 milliseconds) [info] - encode/decode for null map in tuple: (null) (codegen path) (36 milliseconds) [info] - encode/decode for null map in tuple: (null) (interpreted path) (6 milliseconds) [info] - encode/decode for list of int: List(1, 2) (codegen path) (22 milliseconds) [info] - encode/decode for list of int: List(1, 2) (interpreted path) (13 milliseconds) [info] - encode/decode for list with String and null: List(a, null) (codegen path) (27 milliseconds) [info] - encode/decode for list with String and null: List(a, null) (interpreted path) (14 milliseconds) [info] - encode/decode for udt with case class: UDTCaseClass(http://spark.apache.org/) (codegen path) (19 milliseconds) [info] - encode/decode for udt with case class: UDTCaseClass(http://spark.apache.org/) (interpreted path) (7 milliseconds) [info] - encode/decode for kryo string: hello (codegen path) (239 milliseconds) [info] - encode/decode for kryo string: hello (interpreted path) (13 milliseconds) [info] - encode/decode for kryo object: org.apache.spark.sql.catalyst.encoders.KryoSerializable@f (codegen path) (39 milliseconds) [info] - encode/decode for kryo object: org.apache.spark.sql.catalyst.encoders.KryoSerializable@f (interpreted path) (13 milliseconds) [info] - encode/decode for java string: hello (codegen path) (21 milliseconds) [info] - encode/decode for java string: hello (interpreted path) (3 milliseconds) [info] - encode/decode for java object: org.apache.spark.sql.catalyst.encoders.JavaSerializable@f (codegen path) (15 milliseconds) [info] - encode/decode for java object: org.apache.spark.sql.catalyst.encoders.JavaSerializable@f (interpreted path) (4 milliseconds) [info] - encode/decode for InnerClass: InnerClass(1) (codegen path) (18 milliseconds) [info] - encode/decode for InnerClass: InnerClass(1) (interpreted path) (7 milliseconds) [info] - encode/decode for array of inner class: [Lorg.apache.spark.sql.catalyst.encoders.ExpressionEncoderSuite$InnerClass;@17f36a30 (codegen path) (46 milliseconds) [info] - encode/decode for array of inner class: 
[info] - encode/decode for array of inner class: [Lorg.apache.spark.sql.catalyst.encoders.ExpressionEncoderSuite$InnerClass;@17f36a30 (interpreted path) (24 milliseconds)
[info] - encode/decode for array of optional inner class: [Lscala.Option;@46be6d4 (codegen path) (50 milliseconds)
[info] - encode/decode for array of optional inner class: [Lscala.Option;@46be6d4 (interpreted path) (33 milliseconds)
[info] - multiple failures with updateStateByKey (38 seconds, 485 milliseconds)
[info] - encode/decode for PrimitiveData: PrimitiveData(1,1,1.0,1.0,1,1,true) (codegen path) (38 milliseconds)
[info] - encode/decode for PrimitiveData: PrimitiveData(1,1,1.0,1.0,1,1,true) (interpreted path) (11 milliseconds)
[info] BasicOperationsSuite:
[info] - encode/decode for OptionalData: OptionalData(Some(2),Some(2),Some(2.0),Some(2.0),Some(2),Some(2),Some(true),Some(PrimitiveData(1,1,1.0,1.0,1,1,true))) (codegen path) (99 milliseconds)
[info] - encode/decode for OptionalData: OptionalData(Some(2),Some(2),Some(2.0),Some(2.0),Some(2),Some(2),Some(true),Some(PrimitiveData(1,1,1.0,1.0,1,1,true))) (interpreted path) (31 milliseconds)
[info] - encode/decode for OptionalData: OptionalData(None,None,None,None,None,None,None,None) (codegen path) (43 milliseconds)
[info] - encode/decode for OptionalData: OptionalData(None,None,None,None,None,None,None,None) (interpreted path) (29 milliseconds)
[info] - encode/decode for Option in array: List(Some(1), None) (codegen path) (35 milliseconds)
[info] - encode/decode for Option in array: List(Some(1), None) (interpreted path) (23 milliseconds)
[info] - encode/decode for Option in map: Map(1 -> Some(10), 2 -> Some(20), 3 -> None) (codegen path) (36 milliseconds)
[info] - encode/decode for Option in map: Map(1 -> Some(10), 2 -> Some(20), 3 -> None) (interpreted path) (7 milliseconds)
[info] - map (254 milliseconds)
[info] - encode/decode for BoxedData: BoxedData(1,1,1.0,1.0,1,1,true) (codegen path) (48 milliseconds)
[info] - encode/decode for BoxedData: BoxedData(1,1,1.0,1.0,1,1,true) (interpreted path) (16 milliseconds)
[info] - encode/decode for BoxedData: BoxedData(null,null,null,null,null,null,null) (codegen path) (19 milliseconds)
[info] - encode/decode for BoxedData: BoxedData(null,null,null,null,null,null,null) (interpreted path) (14 milliseconds)
[info] - encode/decode for RepeatedStruct: RepeatedStruct(List(PrimitiveData(1,1,1.0,1.0,1,1,true))) (codegen path) (71 milliseconds)
[info] - encode/decode for RepeatedStruct: RepeatedStruct(List(PrimitiveData(1,1,1.0,1.0,1,1,true))) (interpreted path) (47 milliseconds)
[info] - encode/decode for Tuple3: (1,test,PrimitiveData(1,1,1.0,1.0,1,1,true)) (codegen path) (43 milliseconds)
[info] - encode/decode for Tuple3: (1,test,PrimitiveData(1,1,1.0,1.0,1,1,true)) (interpreted path) (13 milliseconds)
[info] - flatMap (248 milliseconds)
[info] - encode/decode for RepeatedData: RepeatedData(List(1, 2),List(1, null, 2),Map(1 -> 2),Map(1 -> null),PrimitiveData(1,1,1.0,1.0,1,1,true)) (codegen path) (78 milliseconds)
[info] - encode/decode for RepeatedData: RepeatedData(List(1, 2),List(1, null, 2),Map(1 -> 2),Map(1 -> null),PrimitiveData(1,1,1.0,1.0,1,1,true)) (interpreted path) (45 milliseconds)
[info] - encode/decode for NestedArray: NestedArray([[I@1956fd26) (codegen path) (29 milliseconds)
[info] - encode/decode for NestedArray: NestedArray([[I@1956fd26) (interpreted path) (18 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(String, String)],List((a,b))) (codegen path) (44 milliseconds)
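The Option-related tests above (OptionalData, Option in array, Option in map) pin down how None is represented; the same behaviour is visible from the public Dataset API, where None is stored as a SQL NULL. A small sketch (master and app name are illustrative):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().master("local[2]").appName("options").getOrCreate()
  import spark.implicits._

  val ds = Seq(Some(1), None, Some(3)).toDS() // Dataset[Option[Int]]
  ds.show()                                   // None surfaces as null
  assert(ds.collect().toSeq == Seq(Some(1), None, Some(3)))
  spark.stop()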
[info] - encode/decode for Tuple2: (Seq[(String, String)],List((a,b))) (interpreted path) (29 milliseconds)
[info] - filter (258 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Int, Int)],List((1,2))) (codegen path) (42 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Int, Int)],List((1,2))) (interpreted path) (31 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Long, Long)],List((1,2))) (codegen path) (55 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Long, Long)],List((1,2))) (interpreted path) (37 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Float, Float)],List((1.0,2.0))) (codegen path) (47 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Float, Float)],List((1.0,2.0))) (interpreted path) (31 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Double, Double)],List((1.0,2.0))) (codegen path) (58 milliseconds)
[info] - glom (287 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Double, Double)],List((1.0,2.0))) (interpreted path) (27 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Short, Short)],List((1,2))) (codegen path) (46 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Short, Short)],List((1,2))) (interpreted path) (28 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Byte, Byte)],List((1,2))) (codegen path) (43 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Byte, Byte)],List((1,2))) (interpreted path) (26 milliseconds)
[info] - mapPartitions (244 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Boolean, Boolean)],List((true,false))) (codegen path) (80 milliseconds)
[info] - encode/decode for Tuple2: (Seq[(Boolean, Boolean)],List((true,false))) (interpreted path) (27 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(String, String)],ArrayBuffer((a,b))) (codegen path) (50 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(String, String)],ArrayBuffer((a,b))) (interpreted path) (26 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Int, Int)],ArrayBuffer((1,2))) (codegen path) (45 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Int, Int)],ArrayBuffer((1,2))) (interpreted path) (26 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Long, Long)],ArrayBuffer((1,2))) (codegen path) (43 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Long, Long)],ArrayBuffer((1,2))) (interpreted path) (26 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Float, Float)],ArrayBuffer((1.0,2.0))) (codegen path) (52 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Float, Float)],ArrayBuffer((1.0,2.0))) (interpreted path) (29 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Double, Double)],ArrayBuffer((1.0,2.0))) (codegen path) (45 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Double, Double)],ArrayBuffer((1.0,2.0))) (interpreted path) (28 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Short, Short)],ArrayBuffer((1,2))) (codegen path) (43 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Short, Short)],ArrayBuffer((1,2))) (interpreted path) (26 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Byte, Byte)],ArrayBuffer((1,2))) (codegen path) (43 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Byte, Byte)],ArrayBuffer((1,2))) (interpreted path) (31 milliseconds)
[info] - repartition (more partitions) (542 milliseconds)
[info] - encode/decode for Tuple2: (ArrayBuffer[(Boolean, Boolean)],ArrayBuffer((true,false))) (codegen path) (50 milliseconds)
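Interleaved with the encoder tests, BasicOperationsSuite is exercising core DStream transformations (map, filter, glom, mapPartitions, repartition). A self-contained sketch of the same operations on a queue-backed test stream (batch interval and data are arbitrary):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}
  import scala.collection.mutable

  val conf = new SparkConf().setMaster("local[2]").setAppName("dstream-ops")
  val ssc = new StreamingContext(conf, Seconds(1))
  val queue = mutable.Queue(ssc.sparkContext.makeRDD(1 to 4, 2))
  val stream = ssc.queueStream(queue)

  stream.glom().map(_.mkString(",")).print()           // glom: one array per partition
  stream.mapPartitions(it => Iterator(it.sum)).print() // mapPartitions: per-partition sums

  ssc.start()
  ssc.awaitTerminationOrTimeout(3000)
  ssc.stop(stopSparkContext = true)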
[info] - encode/decode for Tuple2: (ArrayBuffer[(Boolean, Boolean)],ArrayBuffer((true,false))) (interpreted path) (29 milliseconds)
[info] - encode/decode for Tuple2: (Seq[Seq[(Int, Int)]],List(List((1,2)))) (codegen path) (64 milliseconds)
[info] - encode/decode for Tuple2: (Seq[Seq[(Int, Int)]],List(List((1,2)))) (interpreted path) (39 milliseconds)
[info] - encode/decode for tuple with 2 flat encoders: (1,10) (codegen path) (16 milliseconds)
[info] - encode/decode for tuple with 2 flat encoders: (1,10) (interpreted path) (5 milliseconds)
[info] - encode/decode for tuple with 2 product encoders: (PrimitiveData(1,1,1.0,1.0,1,1,true),(3,30)) (codegen path) (69 milliseconds)
[info] - encode/decode for tuple with 2 product encoders: (PrimitiveData(1,1,1.0,1.0,1,1,true),(3,30)) (interpreted path) (22 milliseconds)
[info] - encode/decode for tuple with flat encoder and product encoder: (PrimitiveData(1,1,1.0,1.0,1,1,true),3) (codegen path) (40 milliseconds)
[info] - encode/decode for tuple with flat encoder and product encoder: (PrimitiveData(1,1,1.0,1.0,1,1,true),3) (interpreted path) (12 milliseconds)
[info] - encode/decode for tuple with product encoder and flat encoder: (3,PrimitiveData(1,1,1.0,1.0,1,1,true)) (codegen path) (33 milliseconds)
[info] - encode/decode for tuple with product encoder and flat encoder: (3,PrimitiveData(1,1,1.0,1.0,1,1,true)) (interpreted path) (13 milliseconds)
[info] - encode/decode for nested tuple encoder: (1,(10,100)) (codegen path) (23 milliseconds)
[info] - encode/decode for nested tuple encoder: (1,(10,100)) (interpreted path) (7 milliseconds)
[info] - repartition (fewer partitions) (441 milliseconds)
[info] - encode/decode for primitive value class: PrimitiveValueClass(42) (codegen path) (15 milliseconds)
[info] - encode/decode for primitive value class: PrimitiveValueClass(42) (interpreted path) (5 milliseconds)
[info] - encode/decode for reference value class: ReferenceValueClass(Container(1)) (codegen path) (21 milliseconds)
[info] - encode/decode for reference value class: ReferenceValueClass(Container(1)) (interpreted path) (5 milliseconds)
[info] - encode/decode for option of int: Some(31) (codegen path) (14 milliseconds)
[info] - encode/decode for option of int: Some(31) (interpreted path) (3 milliseconds)
[info] - encode/decode for empty option of int: None (codegen path) (4 milliseconds)
[info] - encode/decode for empty option of int: None (interpreted path) (3 milliseconds)
[info] - checkpoint without draining iterator - caching before checkpointing (9 seconds, 616 milliseconds)
[info] - encode/decode for option of string: Some(abc) (codegen path) (17 milliseconds)
[info] - encode/decode for option of string: Some(abc) (interpreted path) (3 milliseconds)
[info] - encode/decode for empty option of string: None (codegen path) (4 milliseconds)
[info] - encode/decode for empty option of string: None (interpreted path) (3 milliseconds)
[info] - encode/decode for Tuple2: (UDT,org.apache.spark.sql.catalyst.encoders.ExamplePoint@691) (codegen path) (20 milliseconds)
[info] - encode/decode for Tuple2: (UDT,org.apache.spark.sql.catalyst.encoders.ExamplePoint@691) (interpreted path) (6 milliseconds)
[info] - nullable of encoder schema (codegen path) (43 milliseconds)
[info] - nullable of encoder schema (interpreted path) (41 milliseconds)
[info] - nullable of encoder serializer (codegen path) (7 milliseconds)
[info] - nullable of encoder serializer (interpreted path) (6 milliseconds)
[info] - null check for map key: String (codegen path) (21 milliseconds)
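The "tuple with ... encoders" tests above compose flat encoders (Int, String) with product encoders (case classes) into tuple encoders. The public equivalent is Encoders.tuple; a sketch:

  import org.apache.spark.sql.{Encoder, Encoders}

  val pair: Encoder[(Int, String)] =
    Encoders.tuple(Encoders.scalaInt, Encoders.STRING)
  val nested: Encoder[(Int, (Int, Int))] =
    Encoders.tuple(Encoders.scalaInt, Encoders.tuple(Encoders.scalaInt, Encoders.scalaInt))

  println(pair.schema.simpleString)   // struct<_1:int,_2:string>
  println(nested.schema.simpleString) // the nested tuple becomes a nested struct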
[info] - null check for map key: String (interpreted path) (18 milliseconds)
[info] - null check for map key: Integer (codegen path) (19 milliseconds)
[info] - null check for map key: Integer (interpreted path) (18 milliseconds)
[info] - groupByKey (291 milliseconds)
[info] UnsafeRowConverterSuite:
[info] - basic conversion with only primitive types with CODEGEN_ONLY (6 milliseconds)
[info] - basic conversion with only primitive types with NO_CODEGEN (0 milliseconds)
[info] - basic conversion with primitive, string and binary types with CODEGEN_ONLY (7 milliseconds)
[info] - basic conversion with primitive, string and binary types with NO_CODEGEN (1 millisecond)
[info] - basic conversion with primitive, string, date and timestamp types with CODEGEN_ONLY (8 milliseconds)
[info] - basic conversion with primitive, string, date and timestamp types with NO_CODEGEN (0 milliseconds)
[info] - null handling with CODEGEN_ONLY (10 milliseconds)
[info] - null handling with NO_CODEGEN (2 milliseconds)
[info] - NaN canonicalization with CODEGEN_ONLY (7 milliseconds)
[info] - NaN canonicalization with NO_CODEGEN (1 millisecond)
[info] - basic conversion with struct type with CODEGEN_ONLY (11 milliseconds)
[info] - basic conversion with struct type with NO_CODEGEN (1 millisecond)
[info] - basic conversion with array type with CODEGEN_ONLY (9 milliseconds)
[info] - basic conversion with array type with NO_CODEGEN (1 millisecond)
[info] - basic conversion with map type with CODEGEN_ONLY (11 milliseconds)
[info] - basic conversion with map type with NO_CODEGEN (1 millisecond)
[info] - basic conversion with struct and array with CODEGEN_ONLY (9 milliseconds)
[info] - basic conversion with struct and array with NO_CODEGEN (1 millisecond)
[info] - basic conversion with struct and map with CODEGEN_ONLY (10 milliseconds)
[info] - basic conversion with struct and map with NO_CODEGEN (1 millisecond)
[info] - basic conversion with array and map with CODEGEN_ONLY (13 milliseconds)
[info] - basic conversion with array and map with NO_CODEGEN (1 millisecond)
[info] FilterPushdownSuite:
[info] - eliminate subqueries (4 milliseconds)
[info] - simple push down (4 milliseconds)
[info] - combine redundant filters (5 milliseconds)
[info] - do not combine non-deterministic filters even if they are identical (4 milliseconds)
[info] - SPARK-16164: Filter pushdown should keep the ordering in the logical plan (4 milliseconds)
[info] - SPARK-16994: filter should not be pushed through limit (3 milliseconds)
[info] - can't push without rewrite (6 milliseconds)
[info] - nondeterministic: can always push down filter through project with deterministic field (9 milliseconds)
[info] - nondeterministic: can't push down filter through project with nondeterministic field (3 milliseconds)
[info] - nondeterministic: can't push down filter through aggregate with nondeterministic field (7 milliseconds)
[info] - reduceByKey (237 milliseconds)
[info] - nondeterministic: push down part of filter through aggregate with deterministic field (21 milliseconds)
[info] - filters: combines filters (5 milliseconds)
[info] - joins: push to either side (8 milliseconds)
[info] - joins: push to one side (5 milliseconds)
[info] - joins: do not push down non-deterministic filters into join condition (5 milliseconds)
[info] - joins: push to one side after transformCondition (12 milliseconds)
[info] - joins: rewrite filter to push to either side (10 milliseconds)
[info] - joins: push down left semi join (11 milliseconds)
[info] - joins: push down left outer join #1 (12 milliseconds)
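UnsafeRowConverterSuite (above) converts generic rows into the compact UnsafeRow binary format under both the CODEGEN_ONLY and NO_CODEGEN factory modes. A sketch of that conversion using Spark 2.4's internal Catalyst API (internal, subject to change):

  import org.apache.spark.sql.catalyst.InternalRow
  import org.apache.spark.sql.catalyst.expressions.UnsafeProjection
  import org.apache.spark.sql.types.{DataType, IntegerType, StringType}
  import org.apache.spark.unsafe.types.UTF8String

  val proj = UnsafeProjection.create(Array[DataType](IntegerType, StringType))
  val row  = proj(InternalRow(42, UTF8String.fromString("abc")))
  println(row.getInt(0))        // 42
  println(row.getUTF8String(1)) // abc
  println(row.getSizeInBytes)   // fixed-width region plus variable-length section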
[info] - joins: push down right outer join #1 (14 milliseconds)
[info] - joins: push down left outer join #2 (14 milliseconds)
[info] - joins: push down right outer join #2 (14 milliseconds)
[info] - joins: push down left outer join #3 (14 milliseconds)
[info] - joins: push down right outer join #3 (12 milliseconds)
[info] - joins: push down left outer join #4 (11 milliseconds)
[info] - joins: push down right outer join #4 (10 milliseconds)
[info] - joins: push down left outer join #5 (11 milliseconds)
[info] - joins: push down right outer join #5 (13 milliseconds)
[info] - joins: can't push down (5 milliseconds)
[info] - joins: conjunctive predicates (8 milliseconds)
[info] - joins: conjunctive predicates #2 (8 milliseconds)
[info] - joins: conjunctive predicates #3 (14 milliseconds)
[info] - joins: push down where clause into left anti join (8 milliseconds)
[info] - joins: only push down join conditions to the right of a left anti join (8 milliseconds)
[info] - joins: only push down join conditions to the right of an existence join (7 milliseconds)
[info] - generate: predicate referenced no generated column (14 milliseconds)
[info] - generate: non-deterministic predicate referenced no generated column (16 milliseconds)
[info] - generate: part of conjuncts referenced generated column (10 milliseconds)
[info] - generate: all conjuncts referenced generated column (4 milliseconds)
[info] - reduce (302 milliseconds)
[info] - aggregate: push down filter when filter on group by expression (9 milliseconds)
[info] - aggregate: don't push down filter when filter not on group by expression (8 milliseconds)
[info] - aggregate: push down filters partially which are subset of group by expressions (10 milliseconds)
[info] - aggregate: push down filters with alias (14 milliseconds)
[info] - aggregate: push down filters with literal (11 milliseconds)
[info] - aggregate: don't push down filters that are nondeterministic (30 milliseconds)
[info] - SPARK-17712: aggregate: don't push down filters that are data-independent (9 milliseconds)
[info] - aggregate: don't push filters if the aggregate has no grouping expressions (10 milliseconds)
[info] - broadcast hint (14 milliseconds)
[info] - union (23 milliseconds)
[info] - expand (21 milliseconds)
[info] - predicate subquery: push down simple (18 milliseconds)
[info] - predicate subquery: push down complex (17 milliseconds)
[info] - SPARK-20094: don't push predicate with IN subquery into join condition (17 milliseconds)
[info] - Window: predicate push down -- basic (29 milliseconds)
[info] - Window: predicate push down -- predicates with compound predicate using only one column (13 milliseconds)
[info] - Window: predicate push down -- multi window expressions with the same window spec (16 milliseconds)
[info] - Window: predicate push down -- multi window specification - 1 (26 milliseconds)
[info] - Window: predicate push down -- multi window specification - 2 (29 milliseconds)
[info] - Window: predicate push down -- predicates with multiple partitioning columns (15 milliseconds)
[info] - Window: predicate push down -- complex predicate with the same expressions !!! IGNORED !!!
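FilterPushdownSuite checks the optimizer rule directly on logical plans; the same effect is observable end-to-end by filtering a file-based source and inspecting the plan. A sketch (the /tmp path is illustrative):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().master("local[2]").appName("pushdown").getOrCreate()
  import spark.implicits._

  spark.range(100).toDF("id").write.mode("overwrite").parquet("/tmp/pushdown_demo")
  val q = spark.read.parquet("/tmp/pushdown_demo").where($"id" > 10).select("id")
  q.explain(true) // scan node should report PushedFilters: [IsNotNull(id), GreaterThan(id,10)]
  spark.stop()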
[info] - count (354 milliseconds)
[info] - Window: no predicate push down -- predicates are not from partitioning keys (12 milliseconds)
[info] - Window: no predicate push down -- partial compound partition key (13 milliseconds)
[info] - monitor app using launcher library (11 seconds, 322 milliseconds)
[info] - Window: no predicate push down -- complex predicates containing non partitioning columns (14 milliseconds)
[info] - Window: no predicate push down -- complex predicate with different expressions (19 milliseconds)
[info] - join condition pushdown: deterministic and non-deterministic (12 milliseconds)
[info] - watermark pushdown: no pushdown on watermark attribute #1 (11 milliseconds)
[info] - watermark pushdown: no pushdown for nondeterministic filter (8 milliseconds)
[info] - watermark pushdown: full pushdown (5 milliseconds)
[info] - watermark pushdown: no pushdown on watermark attribute #2 (6 milliseconds)
[info] UnsafeArraySuite:
[info] - countByValue (312 milliseconds)
[info] - read array (245 milliseconds)
[info] - from primitive array (1 millisecond)
[info] - to primitive array (10 milliseconds)
[info] - unsafe java serialization (1 millisecond)
[info] - unsafe Kryo serialization (20 milliseconds)
[info] DecimalAggregatesSuite:
[info] - Decimal Sum Aggregation: Optimized (9 milliseconds)
[info] - Decimal Sum Aggregation: Not Optimized (2 milliseconds)
[info] - Decimal Average Aggregation: Optimized (5 milliseconds)
[info] - Decimal Average Aggregation: Not Optimized (2 milliseconds)
[info] - Decimal Sum Aggregation over Window: Optimized (8 milliseconds)
[info] - Decimal Sum Aggregation over Window: Not Optimized (9 milliseconds)
[info] - Decimal Average Aggregation over Window: Optimized (9 milliseconds)
[info] - Decimal Average Aggregation over Window: Not Optimized (11 milliseconds)
[info] DecimalExpressionSuite:
[info] - UnscaledValue (76 milliseconds)
[info] - MakeDecimal (53 milliseconds)
[info] - mapValues (297 milliseconds)
[info] - PromotePrecision (69 milliseconds)
[info] - CheckOverflow (200 milliseconds)
[info] TableIdentifierParserSuite:
[info] - table identifier (19 milliseconds)
[info] - quoted identifiers (5 milliseconds)
[info] - flatMapValues (298 milliseconds)
[info] - table identifier - strict keywords (44 milliseconds)
[info] - table identifier - non reserved keywords (118 milliseconds)
[info] - SPARK-17364 table identifier - contains number (2 milliseconds)
[info] - SPARK-17832 table identifier - contains backtick (3 milliseconds)
[info] GeneratedProjectionSuite:
[info] - union (258 milliseconds)
[info] - union with input stream return None (140 milliseconds)
[info] - StreamingContext.union (309 milliseconds)
[info] - transform (250 milliseconds)
[info] - transform with NULL (103 milliseconds)
[info] - transform with input stream return None (119 milliseconds)
[info] - transformWith (350 milliseconds)
[info] - transformWith with input stream return None (113 milliseconds)
[info] - StreamingContext.transform (231 milliseconds)
[info] - StreamingContext.transform with input stream return None (125 milliseconds)
[info] - generated projections on wider table (2 seconds, 117 milliseconds)
[info] - cogroup (301 milliseconds)
[info] - join (309 milliseconds)
[info] - leftOuterJoin (292 milliseconds)
[info] - rightOuterJoin (312 milliseconds)
[info] - fullOuterJoin (303 milliseconds)
[info] - updateStateByKey (333 milliseconds)
[info] - updateStateByKey - simple with initial value RDD (336 milliseconds)
[info] - updateStateByKey - testing time stamps as input (335 milliseconds)
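TableIdentifierParserSuite (above) feeds identifier strings to the Catalyst SQL parser; the entry point is internal but easy to poke at directly. A sketch (internal API; identifiers are illustrative):

  import org.apache.spark.sql.catalyst.parser.CatalystSqlParser

  val t1 = CatalystSqlParser.parseTableIdentifier("db.tbl")
  val t2 = CatalystSqlParser.parseTableIdentifier("`my-db`.`my.table`") // backticked identifiers
  println(t1) // `db`.`tbl`
  println(t2) // `my-db`.`my.table`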
[info] - updateStateByKey - with initial value RDD (331 milliseconds)
[info] - updateStateByKey - object lifecycle (335 milliseconds)
[info] - slice (2 seconds, 115 milliseconds)
[info] - slice - has not been initialized (60 milliseconds)
[info] - checkpoint without draining iterator - caching after checkpointing (9 seconds, 396 milliseconds)
[info] - checkpoint blocks exist (17 milliseconds)
[info] - checkpoint blocks exist - caching before checkpointing (16 milliseconds)
[info] - rdd cleanup - map and window (327 milliseconds)
[info] - checkpoint blocks exist - caching after checkpointing (17 milliseconds)
[info] - missing checkpoint block fails with informative message (32 milliseconds)
[info] BoundedPriorityQueueSuite:
[info] - BoundedPriorityQueue poll test (1 millisecond)
[info] SparkSubmitUtilsSuite:
[info] - incorrect maven coordinate throws error (2 milliseconds)
[info] - create repo resolvers (3 milliseconds)
[info] - create additional resolvers (4 milliseconds)
:: loading settings :: url = jar:file:/home/jenkins/sparkivy/per-executor-caches/8/.ivy2/cache/org.apache.ivy/ivy/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
[info] - add dependencies works correctly (107 milliseconds)
[info] - excludes works correctly (6 milliseconds)
[info] - rdd cleanup - updateStateByKey (591 milliseconds)
[info] - ivy path works correctly (661 milliseconds)
[info] - search for artifact at local repositories (975 milliseconds)
[info] - dependency not found throws RuntimeException (67 milliseconds)
[info] - neglects Spark and Spark's dependencies (321 milliseconds)
[info] - exclude dependencies end to end (354 milliseconds)
:: loading settings :: file = /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.6/target/tmp/ivy-7bfb9f71-475f-4efc-b246-e450aac70bb3/ivysettings.xml
Exception in thread "receiver-supervisor-future-0" java.lang.Error: java.lang.InterruptedException: sleep interrupted
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor$$anonfun$restartReceiver$1.apply$mcV$sp(ReceiverSupervisor.scala:196)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor$$anonfun$restartReceiver$1.apply(ReceiverSupervisor.scala:189)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor$$anonfun$restartReceiver$1.apply(ReceiverSupervisor.scala:189)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    ... 2 more
[info] - rdd cleanup - input blocks and persisted RDDs (2 seconds, 213 milliseconds)
[info] JavaStreamingListenerWrapperSuite:
[info] - basic (11 milliseconds)
[info] - load ivy settings file (215 milliseconds)
[info] Test run started
[info] Test org.apache.spark.streaming.JavaWriteAheadLogSuite.testCustomWAL started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.008s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaMapWithStateSuite.testBasicFunction started
[info] - SPARK-10878: test resolution files cleaned after resolving artifact (255 milliseconds)
[info] ChunkedByteBufferSuite:
[info] - no chunks (1 millisecond)
[info] - getChunks() duplicates chunks (0 milliseconds)
[info] - copy() does not affect original buffer's position (0 milliseconds)
[info] - writeFully() does not affect original buffer's position (1 millisecond)
[info] - SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize (22 milliseconds)
[info] - toArray() (1 millisecond)
[info] - toArray() throws UnsupportedOperationException if size exceeds 2GB (3 milliseconds)
[info] - toInputStream() (2 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.353s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testGreaterEq started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testDiv started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMinus started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testTimes started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testLess started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testPlus started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testGreater started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMinutes started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testMilliseconds started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testLessEq started
[info] Test org.apache.spark.streaming.JavaDurationSuite.testSeconds started
[info] Test run finished: 0 failed, 0 ignored, 11 total, 0.006s
[info] Test run started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testStreamingContextTransform started
[info] BypassMergeSortShuffleWriterSuite:
[info] - write empty iterator (9 milliseconds)
[info] - SPARK-18016: generated projections on wider table requiring class-splitting (8 seconds, 976 milliseconds)
[info] - generated unsafe projection with array of binary (18 milliseconds)
[info] - padding bytes should be zeroed out (9 milliseconds)
[info] - write with some empty partitions (74 milliseconds)
[info] - MutableProjection should not cache content from the input row (12 milliseconds)
[info] - SafeProjection should not cache content from the input row (6 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMapValues started
[info] - SPARK-22699: GenerateSafeProjection should not use global variables for struct (4 milliseconds)
[info] - only generate temp shuffle file for non-empty partition (51 milliseconds)
[info] - cleanup of intermediate files after errors (119 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithInverse started
[info] ContextCleanerSuite:
[info] - cleanup RDD (39 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapPartitions started
[info] - cleanup shuffle (94 milliseconds)
[info] - cleanup broadcast (8 milliseconds)
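SparkSubmitUtilsSuite (above) covers the Maven-coordinate resolution behind spark-submit --packages; coordinates use the groupId:artifactId:version form and resolve through Ivy. A configuration sketch (the coordinate and ivy path are only examples):

  import org.apache.spark.SparkConf

  val conf = new SparkConf()
    .set("spark.jars.packages", "org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.8")
    .set("spark.jars.ivy", "/tmp/.ivy2") // custom resolution cache, cf. "ivy path works correctly"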
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFilter started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionFewerPartitions started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCombineByKey started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextGetOrCreate started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindowWithSlideDuration started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testQueueStream started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValue started
[info] - automatically cleanup RDD (2 seconds, 188 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMap started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToNormalRDDTransform started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairReduceByKey started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCount started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery started
[info] - automatically cleanup shuffle (1 second, 457 milliseconds)
[info] - timeout to get SparkContext in cluster mode triggers failure (18 seconds, 22 milliseconds)
[info] - automatically cleanup broadcast (1 second, 386 milliseconds)
[info] - automatically cleanup normal checkpoint (681 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUnion started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFlatMap started
[info] - automatically clean up local checkpoint (1 second, 393 milliseconds)
[info] - SPARK-18016: generated projections on wider table requiring state compaction (8 seconds, 88 milliseconds)
[info] AggregateOptimizeSuite:
[info] - remove literals in grouping expression (10 milliseconds)
[info] - do not remove all grouping expressions if they are all literals (5 milliseconds)
[info] - Remove aliased literals (9 milliseconds)
[info] - remove repetition in grouping expression (7 milliseconds)
[info] InternalOutputModesSuite:
[info] - supported strings (1 millisecond)
[info] - unsupported strings (1 millisecond)
[info] ExpressionTypeCheckingSuite:
[info] - check types for unary arithmetic (4 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindowWithInverse started
[info] - check types for binary arithmetic (40 milliseconds)
[info] - check types for predicates (57 milliseconds)
[info] - check types for aggregates (13 milliseconds)
[info] - check types for others (10 milliseconds)
[info] - check types for CreateNamedStruct (6 milliseconds)
[info] - check types for CreateMap (6 milliseconds)
[info] - check types for ROUND/BROUND (23 milliseconds)
[info] - check types for Greatest/Least (11 milliseconds)
[info] SimplifyConditionalSuite:
[info] - simplify if (9 milliseconds)
[info] - remove unnecessary if when the outputs are semantic equivalence (7 milliseconds)
[info] - remove unreachable branches (8 milliseconds)
[info] - remove entire CaseWhen if only the else branch is reachable (8 milliseconds)
[info] - remove entire CaseWhen if the first branch is always true (13 milliseconds)
[info] - simplify CaseWhen, prune branches following a definite true (5 milliseconds)
[info] - simplify CaseWhen if all the outputs are semantic equivalence (20 milliseconds)
[info] LimitPushdownSuite:
[info] - Union: limit to each side (8 milliseconds)
[info] - Union: limit to each side with constant-foldable limit expressions (6 milliseconds)
[info] - Union: limit to each side with the new limit number (5 milliseconds)
[info] - Union: no limit to both sides if children having smaller limit values (8 milliseconds)
[info] - Union: limit to each sides if children having larger limit values (11 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testGlom started
[info] - left outer join (10 milliseconds)
[info] - left outer join and left sides are limited (7 milliseconds)
[info] - left outer join and right sides are limited (8 milliseconds)
[info] - right outer join (6 milliseconds)
[info] - right outer join and right sides are limited (7 milliseconds)
[info] - right outer join and left sides are limited (12 milliseconds)
[info] - larger limits are not pushed on top of smaller ones in right outer join (7 milliseconds)
[info] - full outer join where neither side is limited and both sides have same statistics (14 milliseconds)
[info] - full outer join where neither side is limited and left side has larger statistics (5 milliseconds)
[info] - full outer join where neither side is limited and right side has larger statistics (4 milliseconds)
[info] - full outer join where both sides are limited (7 milliseconds)
[info] StarJoinCostBasedReorderSuite:
[info] - Test 1: Star query with two dimensions and two regular tables (79 milliseconds)
[info] - Test 2: Star with a linear branch (44 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testJoin started
[info] - Test 3: Star with derived branches (66 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairFlatMap started
[info] - Test 4: Star with several branches (207 milliseconds)
[info] - Test 5: RI star only (20 milliseconds)
[info] - Test 6: No RI star (21 milliseconds)
[info] CodeGeneratorWithInterpretedFallbackSuite:
[info] - UnsafeProjection with codegen factory mode (10 milliseconds)
[info] - fallback to the interpreter mode (13 milliseconds)
[info] - codegen failures in the CODEGEN_ONLY mode (3 milliseconds)
[info] ComplexTypeSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairToPairFlatMapWithChangingTypes started
[info] - GetArrayItem (266 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMapPartitions started
[info] - GetMapValue (125 milliseconds)
[info] - GetStructField (75 milliseconds)
[info] - GetArrayStructFields (51 milliseconds)
[info] - SPARK-32167: nullability of GetArrayStructFields (2 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRepartitionMorePartitions started
[info] - CreateArray (310 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByWindowWithoutInverse started
[info] - CreateMap (289 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testLeftOuterJoin started
[info] - MapFromArrays (200 milliseconds)
[info] - CreateStruct (64 milliseconds)
[info] - CreateNamedStruct (101 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransform started
[info] - test dsl for complex type (69 milliseconds)
[info] - error message of ExtractValue (1 millisecond)
[info] - ensure to preserve metadata (1 millisecond)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransformWith started
[info] - StringToMap (90 milliseconds)
[info] - SPARK-22693: CreateNamedStruct should not use global variables (1 millisecond)
[info] - SPARK-33338: semanticEquals should handle static GetMapValue correctly (0 milliseconds)
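LimitPushdownSuite (above) verifies that a limit over a union is pushed into both children so each side can stop early. The same rewrite is observable from the public API (sketch):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().master("local[2]").appName("limit").getOrCreate()
  val q = spark.range(1000).union(spark.range(1000)).limit(5)
  q.explain(true) // optimized plan: GlobalLimit 5 over a Union of two LocalLimit 5 children
  spark.stop()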
[info] JoinReorderSuite:
[info] - reorder 3 tables (8 milliseconds)
[info] - put unjoinable item at the end and reorder 3 joinable tables (9 milliseconds)
[info] - reorder 3 tables with pure-attribute project (17 milliseconds)
[info] - reorder 3 tables - one of the leaf items is a project (12 milliseconds)
[info] - don't reorder if project contains non-attribute (10 milliseconds)
[info] - reorder 4 tables (bushy tree) (15 milliseconds)
[info] - keep the order of attributes in the final output (42 milliseconds)
[info] - SPARK-26352: join reordering should not change the order of attributes (8 milliseconds)
[info] - reorder recursively (10 milliseconds)
[info] - SPARK-33935: betterThan should be consistent (1 millisecond)
[info] CanonicalizeSuite:
[info] - SPARK-24276: IN expression with different order are semantically equal (3 milliseconds)
[info] - SPARK-26402: accessing nested fields with different cases in case insensitive mode (1 millisecond)
[info] MetadataSuite:
[info] - metadata builder and getters (0 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testVariousTransformWith started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTextFileStream started
[info] - metadata json conversion (219 milliseconds)
[info] ComputeCurrentTimeSuite:
[info] - analyzer should replace current_timestamp with literals (2 milliseconds)
[info] - analyzer should replace current_date with literals (3 milliseconds)
[info] BitwiseExpressionsSuite:
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairGroupByKey started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCoGroup started
[info] - BitwiseNOT (508 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testInitialization started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketString started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testGroupByKeyAndWindow started
[info] - BitwiseAnd (417 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindow started
[info] - BitwiseOr (368 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testForeachRDD started
[info] - BitwiseXor (388 milliseconds)
[info] RowEncoderSuite:
[info] - automatically cleanup RDD + shuffle + broadcast (4 seconds, 680 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFileStream started
[info] - encode/decode: struct (codegen path) (81 milliseconds)
[info] - encode/decode: struct (interpreted path) (15 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairTransform started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testFilter started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testPairMap2 started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testMapValues started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testReduce started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKey started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testTransform started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testWindow started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testCountByValueAndWindow started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testRawSocketStream started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testSocketTextStream started
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKeyWithInitial started
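JoinReorderSuite and StarJoinCostBasedReorderSuite (above) exercise cost-based join reordering, which in Spark 2.4 is opt-in and depends on collected table/column statistics. A configuration sketch (table name is illustrative):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().master("local[2]").appName("cbo").getOrCreate()
  spark.conf.set("spark.sql.cbo.enabled", "true")
  spark.conf.set("spark.sql.cbo.joinReorder.enabled", "true")
  // statistics are collected beforehand with, e.g.:
  // spark.sql("ANALYZE TABLE t COMPUTE STATISTICS FOR COLUMNS id, name")
  spark.stop()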
[info] - encode/decode: struct,arrayOfString:array,arrayOfArrayOfString:array>,arrayOfArrayOfInt:array>,arrayOfMap:array>,arrayOfStruct:array>,arrayOfUDT:array> (codegen path) (2 seconds, 763 milliseconds)
[info] Test test.org.apache.spark.streaming.JavaAPISuite.testContextState started
[info] Test run finished: 0 failed, 0 ignored, 53 total, 16.316s
[info] Test run started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testStreamingContextTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMapValues started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMapPartitions started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFilter started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testCombineByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToNormalRDDTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairReduceByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFlatMap started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindowWithInverse started
[info] - encode/decode: struct,arrayOfString:array,arrayOfArrayOfString:array>,arrayOfArrayOfInt:array>,arrayOfMap:array>,arrayOfStruct:array>,arrayOfUDT:array> (interpreted path) (3 seconds, 129 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByWindow started
[info] - automatically cleanup RDD + shuffle + broadcast in distributed mode (6 seconds, 683 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairFlatMap started
[info] StorageSuite:
[info] - storage status add non-RDD blocks (7 milliseconds)
[info] - storage status add RDD blocks (3 milliseconds)
[info] - storage status getBlock (0 milliseconds)
[info] - storage status memUsed, diskUsed, externalBlockStoreUsed (1 millisecond)
[info] - storage memUsed, diskUsed with on-heap and off-heap blocks (3 milliseconds)
[info] - old SparkListenerBlockManagerAdded event compatible (0 milliseconds)
[info] AppStatusListenerSuite:
[info] - environment info (12 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairToPairFlatMapWithChangingTypes started
[info] - scheduler events (139 milliseconds)
[info] - storage events (47 milliseconds)
[info] - eviction of old data (56 milliseconds)
[info] - eviction should respect job completion time (9 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMapPartitions started
[info] - eviction should respect stage completion time (23 milliseconds)
[info] - skipped stages should be evicted before completed stages (27 milliseconds)
[info] - eviction should respect task completion time (26 milliseconds)
[info] - lastStageAttempt should fail when the stage doesn't exist (23 milliseconds)
[info] - SPARK-24415: update metrics for tasks that finish late (32 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransform started
[info] - Total tasks in the executor summary should match total stage tasks (live = true) (37 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testTransformWith started
[info] - Total tasks in the executor summary should match total stage tasks (live = false) (28 milliseconds)
[info] - driver logs (9 milliseconds)
[info] InboxSuite:
[info] - post (21 milliseconds)
[info] - post: with reply (3 milliseconds)
[info] - post: multiple threads (10 milliseconds)
[info] - post: Associated (3 milliseconds)
[info] - post: Disassociated (3 milliseconds)
[info] - post: AssociationError (3 milliseconds)
[info] - SPARK-32738: should reduce the number of active threads when fatal error happens (8 milliseconds)
[info] MapStatusSuite:
[info] - compressSize (1 millisecond)
[info] - decompressSize (0 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testVariousTransformWith started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduceByKeyAndWindow started
[info] - MapStatus should never report non-empty blocks' sizes as 0 (325 milliseconds)
[info] - large tasks should use org.apache.spark.scheduler.HighlyCompressedMapStatus (1 millisecond)
[info] - HighlyCompressedMapStatus: estimated size should be the average non-empty block size (6 milliseconds)
[info] - SPARK-22540: ensure HighlyCompressedMapStatus calculates correct avgSize (11 milliseconds)
[info] - RoaringBitmap: runOptimize succeeded (18 milliseconds)
[info] - RoaringBitmap: runOptimize failed (6 milliseconds)
[info] - Blocks which are bigger than SHUFFLE_ACCURATE_BLOCK_THRESHOLD should not be underestimated. (8 milliseconds)
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairTransform started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testFilter started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testPairMap2 started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testMapValues started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testReduce started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testUpdateStateByKey started
[info] Test test.org.apache.spark.streaming.Java8APISuite.testTransform started
[info] - encode/decode: struct,mapOfStringAndArray:map>,mapOfArrayAndInt:map,int>,mapOfArray:map,array>,mapOfStringAndStruct:map>,mapOfStructAndString:map,string>,mapOfStruct:map,struct>> (codegen path) (3 seconds, 798 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 26 total, 6.733s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testGreaterEq started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testLess started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testPlus started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusDuration started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testGreater started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testLessEq started
[info] Test org.apache.spark.streaming.JavaTimeSuite.testMinusTime started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.001s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaReceiverAPISuite.testReceiver started
[info] - executor env overwrite AM env in client mode (18 seconds, 22 milliseconds)
[info] Test run finished: 0 failed, 0 ignored, 1 total, 1.25s
[info] MathExpressionsSuite:
[info] - encode/decode: struct,mapOfStringAndArray:map>,mapOfArrayAndInt:map,int>,mapOfArray:map,array>,mapOfStringAndStruct:map>,mapOfStructAndString:map,string>,mapOfStruct:map,struct>> (interpreted path) (3 seconds, 80 milliseconds)
[info] - encode/decode: struct,structOfStructOfString:struct>,structOfArray:struct>,structOfMap:struct>,structOfArrayAndMap:struct,map:map>,structOfUDT:struct> (codegen path) (152 milliseconds)
[info] - encode/decode: struct,structOfStructOfString:struct>,structOfArray:struct>,structOfMap:struct>,structOfArrayAndMap:struct,map:map>,structOfUDT:struct> (interpreted path) (150 milliseconds)
[info] - encode/decode decimal type (codegen path) (24 milliseconds)
[info] - encode/decode decimal type (interpreted path) (6 milliseconds)
[info] - RowEncoder should preserve decimal precision and scale (codegen path) (9 milliseconds)
[info] - RowEncoder should preserve decimal precision and scale (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve schema nullability (codegen path) (1 millisecond)
[info] - RowEncoder should preserve schema nullability (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve nested column name (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve nested column name (interpreted path) (2 milliseconds)
[info] - RowEncoder should support primitive arrays (codegen path) (34 milliseconds)
[info] - RowEncoder should support primitive arrays (interpreted path) (22 milliseconds)
[info] - RowEncoder should support array as the external type for ArrayType (codegen path) (32 milliseconds)
[info] - RowEncoder should support array as the external type for ArrayType (interpreted path) (40 milliseconds)
[info] - RowEncoder should throw RuntimeException if input row object is null (codegen path) (6 milliseconds)
[info] - RowEncoder should throw RuntimeException if input row object is null (interpreted path) (1 millisecond)
[info] - RowEncoder should validate external type (codegen path) (33 milliseconds)
[info] - RowEncoder should validate external type (interpreted path) (15 milliseconds)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = true), nullable = true (codegen path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = true), nullable = true (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = true), nullable = false (codegen path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = true), nullable = false (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = false), nullable = true (codegen path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = false), nullable = true (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = false), nullable = false (codegen path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(IntegerType, containsNull = false), nullable = false (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = true), nullable = true (codegen path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = true), nullable = true (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = true), nullable = false (codegen path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = true), nullable = false (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = false), nullable = true (codegen path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = false), nullable = true (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = false), nullable = false (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve array nullability: ArrayType(StringType, containsNull = false), nullable = false (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = true), nullable = true (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = true), nullable = true (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = true), nullable = false (codegen path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = true), nullable = false (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = false), nullable = true (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = false), nullable = true (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = false), nullable = false (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, IntegerType, valueContainsNull = false), nullable = false (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = true), nullable = true (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = true), nullable = true (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = true), nullable = false (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = true), nullable = false (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = false), nullable = true (codegen path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = false), nullable = true (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = false), nullable = false (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(IntegerType, StringType, valueContainsNull = false), nullable = false (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = true), nullable = true (codegen path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = true), nullable = true (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = true), nullable = false (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = true), nullable = false (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = false), nullable = true (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = false), nullable = true (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = false), nullable = false (codegen path) (3 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, IntegerType, valueContainsNull = false), nullable = false (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = true), nullable = true (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = true), nullable = true (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = true), nullable = false (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = true), nullable = false (interpreted path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = false), nullable = true (codegen path) (1 millisecond)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = false), nullable = true (interpreted path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = false), nullable = false (codegen path) (2 milliseconds)
[info] - RowEncoder should preserve map nullability: MapType(StringType, StringType, valueContainsNull = false), nullable = false (interpreted path) (2 milliseconds)
[info] BooleanSimplificationSuite:
[info] - a && a => a (8 milliseconds)
[info] - a || a => a (5 milliseconds)
[info] - (a && b && c && ...) || (a && b && d && ...) || (a && b && e && ...) ... (15 milliseconds)
[info] - (a || b || c || ...) && (a || b || d || ...) && (a || b || e || ...) ... (18 milliseconds)
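The long run of "RowEncoder should preserve ... nullability" tests above asserts that declared containsNull/valueContainsNull flags survive encoding. A sketch against the internal RowEncoder API (internal; the schema is illustrative, and the equality check mirrors what the suite asserts):

  import org.apache.spark.sql.Row
  import org.apache.spark.sql.catalyst.encoders.RowEncoder
  import org.apache.spark.sql.types._

  val schema = StructType(Seq(
    StructField("arr", ArrayType(IntegerType, containsNull = false)),
    StructField("m", MapType(StringType, IntegerType, valueContainsNull = true))))

  val enc = RowEncoder(schema).resolveAndBind()
  val row = enc.toRow(Row(Seq(1, 2, 3), Map("a" -> 1)))
  assert(enc.schema == schema) // nullability flags preserved end-to-end
  println(enc.fromRow(row))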
[info] - e && (!e || f) - not nullable (11 milliseconds)
[info] - e && (!e || f) - nullable (18 milliseconds)
[info] - a < 1 && (!(a < 1) || f) - not nullable (23 milliseconds)
[info] - a < 1 && ((a >= 1) || f) - not nullable (22 milliseconds)
[info] - DeMorgan's law (12 milliseconds)
[info] - (a && b) || (a && c) => a && (b || c) when case insensitive (4 milliseconds)
[info] - (a || b) && (a || c) => a || (b && c) when case insensitive (3 milliseconds)
[info] - Complementation Laws (8 milliseconds)
[info] - Complementation Laws - null handling (12 milliseconds)
[info] - Complementation Laws - negative case (7 milliseconds)
[info] - filter reduction - positive cases (420 milliseconds)
[info] DSLHintSuite:
[info] - various hint parameters (2 milliseconds)
[info] LikeSimplificationSuite:
[info] - simplify Like into StartsWith (5 milliseconds)
[info] - simplify Like into EndsWith (3 milliseconds)
[info] - simplify Like into startsWith and EndsWith (4 milliseconds)
[info] - simplify Like into Contains (3 milliseconds)
[info] - simplify Like into EqualTo (2 milliseconds)
[info] - null pattern (2 milliseconds)
[info] EliminateSortsSuite:
[info] - Empty order by clause (4 milliseconds)
[info] - All the SortOrder are no-op (2 milliseconds)
[info] - Partial order-by clauses contain no-op SortOrder (3 milliseconds)
[info] - Remove no-op alias (7 milliseconds)
[info] - SPARK-32318: should not remove orderBy in distribute statement (7 milliseconds)
[info] - SPARK-33183: remove consecutive no-op sorts (1 millisecond)
[info] DecimalPrecisionSuite:
[info] - basic operations (36 milliseconds)
[info] - Comparison operations (13 milliseconds)
[info] - decimal precision for union (24 milliseconds)
[info] - bringing in primitive types (15 milliseconds)
[info] - maximum decimals (51 milliseconds)
[info] - DecimalType.isWiderThan (0 milliseconds)
[info] - strength reduction for integer/decimal comparisons - basic test (22 milliseconds)
[info] - strength reduction for integer/decimal comparisons - overflow test (42 milliseconds)
[info] - SPARK-24468: operations on decimals with negative scale (4 milliseconds)
[info] SubstituteUnresolvedOrdinalsSuite:
[info] - unresolved ordinal should not be unresolved (0 milliseconds)
[info] - order by ordinal (5 milliseconds)
[info] - group by ordinal (3 milliseconds)
[info] StructTypeSuite:
[info] - lookup a single missing field should output existing fields (2 milliseconds)
[info] - lookup a set of missing fields should output existing fields (1 millisecond)
[info] - lookup fieldIndex for missing field should output existing fields (1 millisecond)
[info] - SPARK-24849: toDDL - simple struct (2 milliseconds)
[info] - SPARK-24849: round trip toDDL - fromDDL (1 millisecond)
[info] - SPARK-24849: round trip fromDDL - toDDL (3 milliseconds)
[info] - SPARK-24849: toDDL must take into account case of fields. (1 millisecond)
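LikeSimplificationSuite (above) rewrites LIKE patterns with a single wildcard at the edge into cheaper string predicates. From the public API the rewrite shows up in the optimized plan (sketch):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().master("local[2]").appName("like").getOrCreate()
  import spark.implicits._

  val ds = Seq("abcdef", "xyz").toDS()
  ds.where($"value".like("abc%")).explain(true) // optimized to StartsWith(value, abc)
  ds.where($"value".like("%def")).explain(true) // optimized to EndsWith(value, def)
  spark.stop()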
[info] - SPARK-24849: toDDL should output field's comment (1 millisecond) [info] ReorderAssociativeOperatorSuite: [info] - Reorder associative operators (27 milliseconds) [info] - nested expression with aggregate operator (9 milliseconds) [info] LiteralExpressionSuite: [info] - conv (1 second, 912 milliseconds) [info] - e (86 milliseconds) [info] - pi (38 milliseconds) [info] - null (240 milliseconds) [info] - default (293 milliseconds) [info] - boolean literals (51 milliseconds) [info] - int literals (386 milliseconds) [info] - double literals (282 milliseconds) [info] - string literals (59 milliseconds) [info] - sum two literals (30 milliseconds) [info] - binary literals (35 milliseconds) [info] - sin (1 second, 643 milliseconds) [info] - decimal (739 milliseconds) [info] - array (148 milliseconds) [info] - seq (51 milliseconds) [info] - map (61 milliseconds) [info] - asin (590 milliseconds) [info] - struct (97 milliseconds) [info] - unsupported types (map and struct) in Literal.apply (1 millisecond) [info] - SPARK-24571: char literals (37 milliseconds) [info] PullOutPythonUDFInJoinConditionSuite: [info] - inner join condition with python udf (10 milliseconds) [info] - unevaluable python udf and common condition (4 milliseconds) [info] - unevaluable python udf or common condition (5 milliseconds) [info] - pull out whole complex condition with multiple unevaluable python udf (6 milliseconds) [info] - partial pull out complex condition with multiple unevaluable python udf (4 milliseconds) [info] - pull out unevaluable python udf when it's mixed with evaluable one (3 milliseconds) [info] - throw an exception for not supported join types (7 milliseconds) [info] AnalysisErrorSuite: [info] - scalar subquery with 2 columns (4 milliseconds) [info] - scalar subquery with no column (0 milliseconds) [info] - single invalid type, single arg (1 millisecond) [info] - single invalid type, second arg (1 millisecond) [info] - multiple invalid type (1 millisecond) [info] - invalid window function (4 milliseconds) [info] - distinct aggregate function in window (3 milliseconds) [info] - distinct function (6 milliseconds) [info] - distinct window function (5 milliseconds) [info] - nested aggregate functions (2 milliseconds) [info] - offset window function (2 milliseconds) [info] - too many generators (2 milliseconds) [info] - unresolved attributes (2 milliseconds) [info] - unresolved attributes with a generated name (21 milliseconds) [info] - unresolved star expansion in max (2 milliseconds) [info] - bad casts (1 millisecond) [info] - sorting by unsupported column types (2 milliseconds) [info] - sorting by attributes are not from grouping expressions (7 milliseconds) [info] - non-boolean filters (2 milliseconds) [info] - non-boolean join conditions (1 millisecond) [info] - missing group by (1 millisecond) [info] - ambiguous field (2 milliseconds) [info] - ambiguous field due to case insensitivity (1 millisecond) [info] - missing field (0 milliseconds) [info] - catch all unresolved plan (1 millisecond) [info] - union with unequal number of columns (1 millisecond) [info] - intersect with unequal number of columns (0 milliseconds) [info] - except with unequal number of columns (1 millisecond) [info] - union with incompatible column types (1 millisecond) [info] - union with an incompatible column type and compatible column types (1 millisecond) [info] - intersect with incompatible column types (1 millisecond) [info] - intersect with an incompatible column type and compatible column types (1 millisecond)
[info] - except with incompatible column types (1 millisecond) [info] - except with an incompatible column type and compatible column types (1 millisecond) [info] - SPARK-9955: correct error message for aggregate (2 milliseconds) [info] - slide duration greater than window in time window (3 milliseconds) [info] - start time greater than slide duration in time window (2 milliseconds) [info] - start time equal to slide duration in time window (1 millisecond) [info] - SPARK-21590: absolute value of start time greater than slide duration in time window (2 milliseconds) [info] - SPARK-21590: absolute value of start time equal to slide duration in time window (2 milliseconds) [info] - negative window duration in time window (1 millisecond) [info] - zero window duration in time window (2 milliseconds) [info] - negative slide duration in time window (2 milliseconds) [info] - zero slide duration in time window (2 milliseconds) [info] - generator nested in expressions (1 millisecond) [info] - SPARK-30998: unsupported nested inner generators (1 millisecond) [info] - SPARK-30998: unsupported nested inner generators for aggregates (1 millisecond) [info] - generator appears in operator which is not Project (1 millisecond) [info] - an evaluated limit class must not be null (1 millisecond) [info] - num_rows in limit clause must be equal to or greater than 0 (1 millisecond) [info] - more than one generators in SELECT (0 milliseconds) [info] - SPARK-6452 regression test (4 milliseconds) [info] - error test for self-join (2 milliseconds) [info] - check grouping expression data types (36 milliseconds) [info] - we should fail analysis when we find nested aggregate functions (3 milliseconds) [info] - Join can work on binary types but can't work on map types (2 milliseconds) [info] - PredicateSubQuery is used outside of a filter (3 milliseconds) [info] - PredicateSubQuery is used in a nested condition (4 milliseconds) [info] - PredicateSubQuery correlated predicate is nested in an illegal plan (15 milliseconds) [info] - SPARK-30811: CTE should not cause stack overflow when it refers to non-existent table with same name (5 milliseconds) [info] CallMethodViaReflectionSuite: [info] - findMethod via reflection for static methods (3 milliseconds) [info] - findMethod for a JDK library (0 milliseconds) [info] - class not found (1 millisecond) [info] - method not found because name does not match (1 millisecond) [info] - method not found because there is no static method (1 millisecond) [info] - input type checking (0 milliseconds) [info] - unsupported type checking (1 millisecond) [info] - invoking methods using acceptable types (51 milliseconds) [info] PullOutNondeterministicSuite: [info] - no-op on filter (2 milliseconds) [info] - sort (5 milliseconds) [info] - aggregate (3 milliseconds) [info] ArithmeticExpressionSuite: [info] - sinh (957 milliseconds) [info] - + (Add) (643 milliseconds) [info] - cos (732 milliseconds) [info] - - (UnaryMinus) (573 milliseconds) [info] - acos (709 milliseconds) [info] - - (Minus) (676 milliseconds) [info] - * (Multiply) (634 milliseconds) [info] - cosh (807 milliseconds) [info] - / (Divide) basic (210 milliseconds) [info] - / (Divide) for integral type !!! IGNORED !!!
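The CallMethodViaReflectionSuite results above exercise the expression behind the SQL reflect/java_method functions, which call a static Java method chosen by name at runtime. A minimal sketch of the public surface (assuming a running SparkSession named spark):

    // Dispatches via reflection to the static method java.util.UUID.randomUUID
    spark.sql("SELECT reflect('java.util.UUID', 'randomUUID')").show()
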
[info] - % (Remainder) (963 milliseconds) [info] - tan (1 second, 58 milliseconds) [info] - SPARK-17617: % (Remainder) double % double on super big double (53 milliseconds) [info] - Abs (954 milliseconds) [info] - cot (1 second, 562 milliseconds) [info] - pmod (862 milliseconds) [info] - atan (1 second, 29 milliseconds) [info] - function least (1 second, 514 milliseconds) [info] - tanh (885 milliseconds) [info] - toDegrees (835 milliseconds) [info] - function greatest (1 second, 336 milliseconds) [info] - toRadians (654 milliseconds) [info] - cbrt (751 milliseconds) [info] - ceil (1 second, 828 milliseconds) [info] - executor env overwrite AM env in cluster mode (19 seconds, 26 milliseconds) [info] - floor (1 second, 574 milliseconds) [info] - factorial (566 milliseconds) [info] - SPARK-22499: Least and greatest should not generate codes beyond 64KB (5 seconds, 362 milliseconds) [info] - SPARK-22704: Least and greatest use less global variables (1 millisecond) [info] ComplexTypesSuite: [info] - explicit get from namedStruct (4 milliseconds) [info] - explicit get from named_struct- expression maintains original deduced alias (3 milliseconds) [info] - collapsed getStructField ontop of namedStruct (3 milliseconds) [info] - collapse multiple CreateNamedStruct/GetStructField pairs (4 milliseconds) [info] - collapsed2 - deduced names (3 milliseconds) [info] - simplified array ops (11 milliseconds) [info] - SPARK-22570: CreateArray should not create a lot of global variables (1 millisecond) [info] - rint (827 milliseconds) [info] - SPARK-23208: Test code splitting for create array related methods (590 milliseconds) [info] - simplify map ops (7 milliseconds) [info] - simplify map ops, constant lookup, dynamic keys (8 milliseconds) [info] - simplify map ops, dynamic lookup, dynamic keys, lookup is equivalent to one of the keys (8 milliseconds) [info] - simplify map ops, no positive match (7 milliseconds) [info] - simplify map ops, constant lookup, mixed keys, eliminated constants (7 milliseconds) [info] - simplify map ops, potential dynamic match with null value + an absolute constant match (6 milliseconds) [info] - SPARK-23500: Simplify array ops that are not at the top node (11 milliseconds) [info] - SPARK-23500: Simplify map ops that are not top nodes (9 milliseconds) [info] - SPARK-23500: Simplify complex ops that aren't at the plan root (13 milliseconds) [info] - SPARK-23500: Ensure that aggregation expressions are not simplified (10 milliseconds) [info] - SPARK-23500: namedStruct and getField in the same Project #1 (7 milliseconds) [info] - SPARK-23500: namedStruct and getField in the same Project #2 (4 milliseconds) [info] - SPARK-24313: support binary type as map keys in GetMapValue (86 milliseconds) [info] DataTypeSuite: [info] - construct an ArrayType (1 millisecond) [info] - construct an MapType (0 milliseconds) [info] - construct with add (0 milliseconds) [info] - construct with add from StructField (1 millisecond) [info] - construct with add from StructField with comments (1 millisecond) [info] - construct with String DataType (0 milliseconds) [info] - extract fields from a StructType (2 milliseconds) [info] - extract field index from a StructType (0 milliseconds) [info] - fieldsMap returns map of name to StructField (1 millisecond) [info] - fieldNames and names returns field names (1 millisecond) [info] - merge where right contains type conflict (2 milliseconds) [info] - existsRecursively (4 milliseconds) [info] - from Json - NullType (4 milliseconds) [info] - from Json - BooleanType 
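(0 milliseconds)

The long "from Json"/"from DDL" run that follows comes from DataTypeSuite parsing serialized type descriptions back into DataType instances. A minimal sketch of the two entry points (assuming Spark 2.4's DataType.fromJson and DataType.fromDDL helpers):

    import org.apache.spark.sql.types._

    DataType.fromJson("\"integer\"")      // IntegerType, from the JSON type name
    DataType.fromDDL("map<int, string>")  // MapType(IntegerType, StringType, true)
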
[info] - from DDL - BooleanType (1 millisecond) [info] - from Json - ByteType (0 milliseconds) [info] - from DDL - ByteType (0 milliseconds) [info] - from Json - ShortType (1 millisecond) [info] - from DDL - ShortType (0 milliseconds) [info] - from Json - IntegerType (0 milliseconds) [info] - from DDL - IntegerType (0 milliseconds) [info] - from Json - LongType (0 milliseconds) [info] - from DDL - LongType (0 milliseconds) [info] - from Json - FloatType (0 milliseconds) [info] - from DDL - FloatType (1 millisecond) [info] - from Json - DoubleType (0 milliseconds) [info] - from DDL - DoubleType (0 milliseconds) [info] - from Json - DecimalType(10,5) (1 millisecond) [info] - from DDL - DecimalType(10,5) (0 milliseconds) [info] - from Json - DecimalType(38,18) (0 milliseconds) [info] - from DDL - DecimalType(38,18) (0 milliseconds) [info] - from Json - DateType (1 millisecond) [info] - from DDL - DateType (0 milliseconds) [info] - from Json - TimestampType (0 milliseconds) [info] - from DDL - TimestampType (1 millisecond) [info] - from Json - StringType (0 milliseconds) [info] - from DDL - StringType (0 milliseconds) [info] - from Json - BinaryType (0 milliseconds) [info] - from DDL - BinaryType (1 millisecond) [info] - from Json - ArrayType(DoubleType,true) (2 milliseconds) [info] - from DDL - ArrayType(DoubleType,true) (1 millisecond) [info] - from Json - ArrayType(StringType,false) (1 millisecond) [info] - from DDL - ArrayType(StringType,false) (0 milliseconds) [info] - from Json - MapType(IntegerType,StringType,true) (1 millisecond) [info] - from DDL - MapType(IntegerType,StringType,true) (0 milliseconds) [info] - from Json - MapType(IntegerType,ArrayType(DoubleType,true),false) (1 millisecond) [info] - from DDL - MapType(IntegerType,ArrayType(DoubleType,true),false) (0 milliseconds) [info] - from Json - StructType(StructField(a,IntegerType,true), StructField(b,ArrayType(DoubleType,true),false), StructField(c,DoubleType,false)) (3 milliseconds) [info] - from DDL - StructType(StructField(a,IntegerType,true), StructField(b,ArrayType(DoubleType,true),false), StructField(c,DoubleType,false)) (3 milliseconds) [info] - fromJson throws an exception when given type string is invalid (3 milliseconds) [info] - Check the default size of NullType (0 milliseconds) [info] - Check the default size of BooleanType (0 milliseconds) [info] - Check the default size of ByteType (0 milliseconds) [info] - Check the default size of ShortType (0 milliseconds) [info] - Check the default size of IntegerType (0 milliseconds) [info] - Check the default size of LongType (0 milliseconds) [info] - Check the default size of FloatType (0 milliseconds) [info] - Check the default size of DoubleType (0 milliseconds) [info] - Check the default size of DecimalType(10,5) (0 milliseconds) [info] - Check the default size of DecimalType(38,18) (0 milliseconds) [info] - Check the default size of DateType (0 milliseconds) [info] - Check the default size of TimestampType (0 milliseconds) [info] - Check the default size of StringType (0 milliseconds) [info] - Check the default size of BinaryType (0 milliseconds) [info] - Check the default size of ArrayType(DoubleType,true) (0 milliseconds) [info] - Check the default size of ArrayType(StringType,false) (0 milliseconds) [info] - Check the default size of MapType(IntegerType,StringType,true) (0 milliseconds) [info] - Check the default size of MapType(IntegerType,ArrayType(DoubleType,true),false) (0 milliseconds) [info] - Check the default size of
StructType(StructField(a,IntegerType,true), StructField(b,ArrayType(DoubleType,true),false), StructField(c,DoubleType,false)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: ArrayType(DoubleType,true), to: ArrayType(DoubleType,true)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: ArrayType(DoubleType,false), to: ArrayType(DoubleType,false)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: ArrayType(DoubleType,false), to: ArrayType(DoubleType,true)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: ArrayType(DoubleType,true), to: ArrayType(DoubleType,false)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: ArrayType(DoubleType,false), to: ArrayType(StringType,false)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: MapType(StringType,DoubleType,true), to: MapType(StringType,DoubleType,true)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: MapType(StringType,DoubleType,false), to: MapType(StringType,DoubleType,false)) (1 millisecond) [info] - equalsIgnoreCompatibleNullability: (from: MapType(StringType,DoubleType,false), to: MapType(StringType,DoubleType,true)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: MapType(StringType,DoubleType,true), to: MapType(StringType,DoubleType,false)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: MapType(StringType,ArrayType(IntegerType,true),true), to: MapType(StringType,ArrayType(IntegerType,false),true)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: MapType(StringType,ArrayType(IntegerType,false),true), to: MapType(StringType,ArrayType(IntegerType,true),true)) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: StructType(StructField(a,StringType,true)), to: StructType(StructField(a,StringType,true))) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: StructType(StructField(a,StringType,false)), to: StructType(StructField(a,StringType,false))) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: StructType(StructField(a,StringType,false)), to: StructType(StructField(a,StringType,true))) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: StructType(StructField(a,StringType,true)), to: StructType(StructField(a,StringType,false))) (0 milliseconds) [info] - equalsIgnoreCompatibleNullability: (from: StructType(StructField(a,StringType,false), StructField(b,StringType,true)), to: StructType(StructField(a,StringType,false), StructField(b,StringType,false))) (0 milliseconds) [info] - catalogString: BooleanType (1 millisecond) [info] - catalogString: ByteType (0 milliseconds) [info] - catalogString: ShortType (0 milliseconds) [info] - catalogString: IntegerType (0 milliseconds) [info] - catalogString: LongType (0 milliseconds) [info] - catalogString: FloatType (0 milliseconds) [info] - catalogString: DoubleType (0 milliseconds) [info] - catalogString: DecimalType(10,5) (1 millisecond) [info] - catalogString: BinaryType (0 milliseconds) [info] - catalogString: StringType (0 milliseconds) [info] - catalogString: DateType (0 milliseconds) [info] - catalogString: TimestampType (0 milliseconds) [info] - catalogString: StructType(StructField(col0,IntegerType,true), StructField(col1,IntegerType,true), StructField(col2,IntegerType,true), StructField(col3,IntegerType,true)) (1 millisecond) [info] - catalogString: StructType(StructField(col0,IntegerType,true), StructField(col1,IntegerType,true), 
StructField(col2,IntegerType,true), StructField(col3,IntegerType,true), StructField(col4,IntegerType,true), StructField(col5,IntegerType,true), StructField(col6,IntegerType,true), StructField(col7,IntegerType,true), StructField(col8,IntegerType,true), StructField(col9,IntegerType,true), StructField(col10,IntegerType,true), StructField(col11,IntegerType,true), StructField(col12,IntegerType,true), StructField(col13,IntegerType,true), StructField(col14,IntegerType,true), StructField(col15,IntegerType,true), StructField(col16,IntegerType,true), StructField(col17,IntegerType,true), StructField(col18,IntegerType,true), StructField(col19,IntegerType,true), StructField(col20,IntegerType,true), StructField(col21,IntegerType,true), StructField(col22,IntegerType,true), StructField(col23,IntegerType,true), StructField(col24,IntegerType,true), StructField(col25,IntegerType,true), StructField(col26,IntegerType,true), StructField(col27,IntegerType,true), StructField(col28,IntegerType,true), StructField(col29,IntegerType,true), StructField(col30,IntegerType,true), StructField(col31,IntegerType,true), StructField(col32,IntegerType,true), StructField(col33,IntegerType,true), StructField(col34,IntegerType,true), StructField(col35,IntegerType,true), StructField(col36,IntegerType,true), StructField(col37,IntegerType,true), StructField(col38,IntegerType,true), StructField(col39,IntegerType,true)) (2 milliseconds) [info] - catalogString: ArrayType(IntegerType,true) (0 milliseconds) [info] - catalogString: ArrayType(StructType(StructField(col0,IntegerType,true), StructField(col1,IntegerType,true), StructField(col2,IntegerType,true), StructField(col3,IntegerType,true), StructField(col4,IntegerType,true), StructField(col5,IntegerType,true), StructField(col6,IntegerType,true), StructField(col7,IntegerType,true), StructField(col8,IntegerType,true), StructField(col9,IntegerType,true), StructField(col10,IntegerType,true), StructField(col11,IntegerType,true), StructField(col12,IntegerType,true), StructField(col13,IntegerType,true), StructField(col14,IntegerType,true), StructField(col15,IntegerType,true), StructField(col16,IntegerType,true), StructField(col17,IntegerType,true), StructField(col18,IntegerType,true), StructField(col19,IntegerType,true), StructField(col20,IntegerType,true), StructField(col21,IntegerType,true), StructField(col22,IntegerType,true), StructField(col23,IntegerType,true), StructField(col24,IntegerType,true), StructField(col25,IntegerType,true), StructField(col26,IntegerType,true), StructField(col27,IntegerType,true), StructField(col28,IntegerType,true), StructField(col29,IntegerType,true), StructField(col30,IntegerType,true), StructField(col31,IntegerType,true), StructField(col32,IntegerType,true), StructField(col33,IntegerType,true), StructField(col34,IntegerType,true), StructField(col35,IntegerType,true), StructField(col36,IntegerType,true), StructField(col37,IntegerType,true), StructField(col38,IntegerType,true), StructField(col39,IntegerType,true)),true) (2 milliseconds) [info] - catalogString: MapType(IntegerType,StringType,true) (0 milliseconds) [info] - catalogString: MapType(IntegerType,StructType(StructField(col0,IntegerType,true), StructField(col1,IntegerType,true), StructField(col2,IntegerType,true), StructField(col3,IntegerType,true), StructField(col4,IntegerType,true), StructField(col5,IntegerType,true), StructField(col6,IntegerType,true), StructField(col7,IntegerType,true), StructField(col8,IntegerType,true), StructField(col9,IntegerType,true), StructField(col10,IntegerType,true), 
StructField(col11,IntegerType,true), StructField(col12,IntegerType,true), StructField(col13,IntegerType,true), StructField(col14,IntegerType,true), StructField(col15,IntegerType,true), StructField(col16,IntegerType,true), StructField(col17,IntegerType,true), StructField(col18,IntegerType,true), StructField(col19,IntegerType,true), StructField(col20,IntegerType,true), StructField(col21,IntegerType,true), StructField(col22,IntegerType,true), StructField(col23,IntegerType,true), StructField(col24,IntegerType,true), StructField(col25,IntegerType,true), StructField(col26,IntegerType,true), StructField(col27,IntegerType,true), StructField(col28,IntegerType,true), StructField(col29,IntegerType,true), StructField(col30,IntegerType,true), StructField(col31,IntegerType,true), StructField(col32,IntegerType,true), StructField(col33,IntegerType,true), StructField(col34,IntegerType,true), StructField(col35,IntegerType,true), StructField(col36,IntegerType,true), StructField(col37,IntegerType,true), StructField(col38,IntegerType,true), StructField(col39,IntegerType,true)),true) (2 milliseconds) [info] - equalsStructurally: (from: BooleanType, to: BooleanType) (0 milliseconds) [info] - equalsStructurally: (from: IntegerType, to: IntegerType) (0 milliseconds) [info] - equalsStructurally: (from: IntegerType, to: LongType) (1 millisecond) [info] - equalsStructurally: (from: ArrayType(IntegerType,true), to: ArrayType(IntegerType,true)) (0 milliseconds) [info] - equalsStructurally: (from: ArrayType(IntegerType,true), to: ArrayType(IntegerType,false)) (0 milliseconds) [info] - equalsStructurally: (from: StructType(StructField(f1,IntegerType,true)), to: StructType(StructField(f2,IntegerType,true))) (1 millisecond) [info] - equalsStructurally: (from: StructType(StructField(f1,IntegerType,true)), to: StructType(StructField(f2,IntegerType,false))) (0 milliseconds) [info] - equalsStructurally: (from: StructType(StructField(f1,IntegerType,true), StructField(f,StructType(StructField(f2,StringType,true)),true)), to: StructType(StructField(f2,IntegerType,true), StructField(g,StructType(StructField(f1,StringType,true)),true))) (0 milliseconds) [info] - equalsStructurally: (from: StructType(StructField(f1,IntegerType,true), StructField(f,StructType(StructField(f2,StringType,false)),true)), to: StructType(StructField(f2,IntegerType,true), StructField(g,StructType(StructField(f1,StringType,true)),true))) (0 milliseconds) [info] - SPARK-25031: MapType should produce current formatted string for complex types (1 millisecond) [info] OptimizerRuleExclusionSuite: [info] - Exclude a single rule from multiple batches (9 milliseconds) [info] - Exclude multiple rules from single or multiple batches (6 milliseconds) [info] - Exclude non-existent rule with other valid rules (6 milliseconds) [info] - Try to exclude some non-excludable rules (2 milliseconds) [info] - Custom optimizer (1 millisecond) [info] - Verify optimized plan after excluding CombineUnions rule (17 milliseconds) [info] UnsafeMapSuite: [info] - unsafe java serialization (1 millisecond) [info] - unsafe Kryo serialization (10 milliseconds) [info] ColumnPruningSuite: [info] - Column pruning for Generate when Generate.unrequiredChildIndex = child.output (10 milliseconds) [info] - Fill Generate.unrequiredChildIndex if possible (8 milliseconds) [info] - Another fill Generate.unrequiredChildIndex if possible (8 milliseconds) [info] - Column pruning for Project on Sort (8 milliseconds) [info] - Column pruning for Expand (9 milliseconds) [info] - Column pruning for 
ScriptTransformation (5 milliseconds) [info] - Column pruning on Filter (13 milliseconds) [info] - Column pruning on except/intersect/distinct (9 milliseconds) [info] - Column pruning on Project (4 milliseconds) [info] - Eliminate the Project with an empty projectList (3 milliseconds) [info] - column pruning for group (5 milliseconds) [info] - column pruning for group with alias (4 milliseconds) [info] - column pruning for Project(ne, Limit) (5 milliseconds) [info] - push down project past sort (10 milliseconds) [info] - Column pruning on Window with useless aggregate functions (15 milliseconds) [info] - Column pruning on Window with selected agg expressions (19 milliseconds) [info] - Column pruning on Window in select (7 milliseconds) [info] - Column pruning on Union (6 milliseconds) [info] - Remove redundant projects in column pruning rule (9 milliseconds) [info] - Column pruning on MapPartitions (32 milliseconds) [info] - push project down into sample (7 milliseconds) [info] - SPARK-24696 ColumnPruning rule fails to remove extra Project (6 milliseconds) [info] PlanParserSuite: [info] - case insensitive (4 milliseconds) [info] - explain (2 milliseconds) [info] - set operations (16 milliseconds) [info] - common table expressions (8 milliseconds) [info] - simple select query (16 milliseconds) [info] - exp (839 milliseconds) [info] - reverse select query (18 milliseconds) [info] - multi select query (17 milliseconds) [info] - SPARK-21133 HighlyCompressedMapStatus#writeExternal throws NPE (24 seconds, 562 milliseconds) [info] - query organization (26 milliseconds) [info] - insert into (11 milliseconds) [info] - insert with if not exists (2 milliseconds) [info] PairRDDFunctionsSuite: [info] - aggregation (19 milliseconds) [info] - limit (6 milliseconds) [info] - window spec (12 milliseconds) [info] - lateral view (18 milliseconds) [info] - joins (60 milliseconds) [info] - sampled relations (11 milliseconds) [info] - aggregateByKey (32 milliseconds) [info] - sub-query (13 milliseconds) [info] - scalar sub-query (9 milliseconds) [info] - table reference (1 millisecond) [info] - table valued function (4 milliseconds) [info] - SPARK-20311 range(N) as alias (4 milliseconds) [info] - SPARK-20841 Support table column aliases in FROM clause (2 milliseconds) [info] - SPARK-20962 Support subquery column aliases in FROM clause (4 milliseconds) [info] - SPARK-20963 Support aliases for join relations in FROM clause (4 milliseconds) [info] - groupByKey (41 milliseconds) [info] - inline table (7 milliseconds) [info] - simple select query with !> and !< (3 milliseconds) [info] - select hint syntax (15 milliseconds) [info] - SPARK-20854: select hint syntax with expressions (3 milliseconds) [info] - SPARK-20854: multiple hints (3 milliseconds) [info] - TRIM function (4 milliseconds) [info] - groupByKey with duplicates (48 milliseconds) [info] - precedence of set operations (14 milliseconds) [info] JoinOptimizationSuite: [info] - extract filters and joins (2 milliseconds) [info] - groupByKey with negative key hash codes (38 milliseconds) [info] - reorder inner joins (49 milliseconds) [info] - broadcasthint sets relation statistics to smallest value (12 milliseconds) [info] ReplaceOperatorSuite: [info] - groupByKey with many output partitions (53 milliseconds) [info] - replace Intersect with Left-semi Join (11 milliseconds) [info] - replace Except with Filter while both the nodes are of type Filter (16 milliseconds) [info] - replace Except with Filter while only right node is of type Filter (5 milliseconds) 
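ReplaceOperatorSuite, now underway, checks set-operator rewrites such as "replace Intersect with Left-semi Join". The rewrite is observable from the public API; a minimal sketch (assuming a SparkSession named spark):

    import spark.implicits._

    val left  = Seq(1, 2, 3).toDF("a")
    val right = Seq(2, 3, 4).toDF("a")
    // The optimizer plans INTERSECT as a distinct aggregate over a left-semi join
    left.intersect(right).explain(true)
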
[info] - replace Except with Filter while both the nodes are of type Project (6 milliseconds) [info] - replace Except with Filter while only right node is of type Project (8 milliseconds) [info] - replace Except with Filter while left node is Project and right node is Filter (8 milliseconds) [info] - replace Except with Left-anti Join (6 milliseconds) [info] - replace Except with Filter when only right filter can be applied to the left (12 milliseconds) [info] - replace Distinct with Aggregate (2 milliseconds) [info] - replace batch Deduplicate with Aggregate (4 milliseconds) [info] - add one grouping key if necessary when replace Deduplicate with Aggregate (2 milliseconds) [info] - don't replace streaming Deduplicate (2 milliseconds) [info] - SPARK-26366: ReplaceExceptWithFilter should handle properly NULL (9 milliseconds) [info] - SPARK-26366: ReplaceExceptWithFilter should not transform non-deterministic (11 milliseconds) [info] DateFormatterSuite: [info] - parsing dates (36 milliseconds) [info] - format dates (2 milliseconds) [info] - roundtrip date -> days -> date (11 milliseconds) [info] - roundtrip days -> date -> days (12 milliseconds) [info] - parsing date without explicit day (7 milliseconds) [info] ConditionalExpressionSuite: [info] - expm1 (922 milliseconds) [info] - signum (756 milliseconds) [info] - if (1 second, 606 milliseconds) [info] - log (572 milliseconds) [info] - case when (206 milliseconds) [info] - log10 (519 milliseconds) [info] - if/case when - null flags of non-primitive types (420 milliseconds) [info] - case key when (241 milliseconds) [info] - case key when - internal pattern matching expects a List while apply takes a Seq (1 millisecond) [info] - SPARK-22705: case when should use less global variables (0 milliseconds) [info] - SPARK-27917 test semantic equals of CaseWhen (1 millisecond) [info] MiscExpressionsSuite: [info] - assert_true (47 milliseconds) [info] - uuid (57 milliseconds) [info] - PrintToStderr (8 milliseconds) [info] ProjectEstimationSuite: [info] - project with alias (2 milliseconds) [info] - project on empty table (1 millisecond) [info] - test row size estimation (4 milliseconds) [info] TimestampFormatterSuite: [info] - parsing timestamps using time zones (11 milliseconds) [info] - format timestamps using time zones (1 millisecond) [info] - roundtrip micros -> timestamp -> micros using timezones (17 milliseconds) [info] - roundtrip timestamp -> micros -> timestamp using timezones (7 milliseconds) [info] - case insensitive parsing of am and pm (0 milliseconds) [info] FoldablePropagationSuite: [info] - Propagate from subquery (4 milliseconds) [info] - Propagate to select clause (3 milliseconds) [info] - Propagate to where clause (3 milliseconds) [info] - Propagate to orderBy clause (5 milliseconds) [info] - Propagate to groupBy clause (10 milliseconds) [info] - Propagate in a complex query (29 milliseconds) [info] - Propagate in subqueries of Union queries (8 milliseconds) [info] - Propagate in inner join (13 milliseconds) [info] - Propagate in expand (6 milliseconds) [info] - Propagate above outer join (6 milliseconds) [info] - SPARK-32635: Replace references with foldables coming only from the node's children (3 milliseconds) [info] RemoveRedundantSortsSuite: [info] - SPARK-33183: remove redundant sort by (9 milliseconds) [info] - SPARK-33183: remove all redundant local sorts (3 milliseconds) [info] - SPARK-33183: should not remove global sort (7 milliseconds) [info] - do not remove sort if the order is different (7 milliseconds) [info] - 
SPARK-33183: remove top level local sort with filter operators (5 milliseconds) [info] - SPARK-33183: keep top level global sort with filter operators (8 milliseconds) [info] - SPARK-33183: limits should not affect order for local sort (7 milliseconds) [info] - SPARK-33183: should not remove global sort with limit operators (9 milliseconds) [info] - log1p (597 milliseconds) [info] - different sorts are not simplified if limit is in between (7 milliseconds) [info] - SPARK-33183: should not remove global sort with range operator (8 milliseconds) [info] - SPARK-33183: remove local sort with range operator (3 milliseconds) [info] - sort should not be removed when there is a node which doesn't guarantee any order (12 milliseconds) [info] - remove two consecutive sorts (3 milliseconds) [info] - remove sorts separated by Filter/Project operators (14 milliseconds) [info] - SPARK-33183: remove consecutive global sorts with the same ordering (5 milliseconds) [info] - SPARK-33183: remove consecutive local sorts with the same ordering (3 milliseconds) [info] - SPARK-33183: remove consecutive local sorts with different ordering (3 milliseconds) [info] - SPARK-33183: should keep global sort when child is a local sort with the same ordering (4 milliseconds) [info] FirstLastTestSuite: [info] - empty buffer (7 milliseconds) [info] - update (8 milliseconds) [info] - update - ignore nulls (8 milliseconds) [info] - merge (9 milliseconds) [info] - merge - ignore nulls (1 millisecond) [info] - eval (6 milliseconds) [info] - eval - ignore nulls (1 millisecond) [info] - SPARK-32344: correct error handling for a type mismatch (1 millisecond) [info] ResolveGroupingAnalyticsSuite: [info] - rollupExprs (6 milliseconds) [info] - cubeExprs (2 milliseconds) [info] - grouping sets (11 milliseconds) [info] - grouping sets with no explicit group by expressions (10 milliseconds) [info] - cube (8 milliseconds) [info] - rollup (9 milliseconds) [info] - grouping function (18 milliseconds) [info] - grouping_id (20 milliseconds) [info] - filter with grouping function (53 milliseconds) [info] - sort with grouping function (46 milliseconds) [info] ReusableStringReaderSuite: [info] - empty reader (2 milliseconds) [info] - mark reset (0 milliseconds) [info] - skip (1 millisecond) [info] BufferHolderSuite: [info] - SPARK-16071 Check the size limit to avoid integer overflow (1 millisecond) [info] SameResultSuite: [info] - relations (1 millisecond) [info] - projections (8 milliseconds) [info] - filters (2 milliseconds) [info] - sorts (2 milliseconds) [info] - union (1 millisecond) [info] - hint (3 milliseconds) [info] StarJoinReorderSuite: [info] - Test 1: Selective star-join on all dimensions (32 milliseconds) [info] - Test 2: Star join on a subset of dimensions due to inequality joins (47 milliseconds) [info] - Test 3: Star join on a subset of dimensions since join column is not unique (19 milliseconds) [info] - Test 4: Star join on a subset of dimensions since join column is nullable (18 milliseconds) [info] - Test 5: Table stats not available for some of the joined tables (12 milliseconds) [info] - Test 6: Join with complex plans (19 milliseconds) [info] - Test 7: Comparable fact table sizes (12 milliseconds) [info] - Test 8: No RI joins (18 milliseconds) [info] - Test 9: Complex join predicates (17 milliseconds) [info] - Test 10: Less than two dimensions (19 milliseconds) [info] - Test 11: Expanding star join (16 milliseconds) [info] - Test 12: Non selective star join (16 milliseconds) [info] PruneFiltersSuite: [info] - 
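Constraints of isNull + LeftOuter (8 milliseconds)

The RemoveRedundantSortsSuite results above pin down when a Sort can be dropped: consecutive sorts with the same ordering collapse into one, while global sorts under limit or range operators must stay. A minimal sketch of the collapsing case (assuming a SparkSession named spark):

    val df = spark.range(10).toDF("a")
    // With the SPARK-33183 rules the optimized plan keeps only a single Sort
    df.orderBy("a").orderBy("a").explain()
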
[info] - Constraints of unionall (8 milliseconds) [info] - Pruning multiple constraints in the same run (7 milliseconds) [info] - Partial pruning (7 milliseconds) [info] - No predicate is pruned (4 milliseconds) [info] - Nondeterministic predicate is not pruned (5 milliseconds) [info] - No pruning when constraint propagation is disabled (8 milliseconds) [info] NumberConverterSuite: [info] - convert (2 milliseconds) [info] TypedFilterOptimizationSuite: [info] - filter after serialize with the same object type (37 milliseconds) [info] - filter after serialize with different object types (19 milliseconds) [info] - filter before deserialize with the same object type (33 milliseconds) [info] - filter before deserialize with different object types (20 milliseconds) [info] - back to back filter with the same object type (13 milliseconds) [info] - back to back filter with different object types (12 milliseconds) [info] - back to back FilterFunction with the same object type (13 milliseconds) [info] - back to back FilterFunction with different object types (13 milliseconds) [info] - FilterFunction and filter with the same object type (13 milliseconds) [info] - FilterFunction and filter with different object types (12 milliseconds) [info] - filter and FilterFunction with the same object type (13 milliseconds) [info] - filter and FilterFunction with different object types (13 milliseconds) [info] CodeFormatterSuite: [info] - removing overlapping comments (1 millisecond) [info] - removing extra new lines and comments (0 milliseconds) [info] - basic example (1 millisecond) [info] - nested example (0 milliseconds) [info] - single line (0 milliseconds) [info] - if else on the same line (1 millisecond) [info] - function calls (0 milliseconds) [info] - function calls with maxLines=0 (0 milliseconds) [info] - function calls with maxLines=2 (0 milliseconds) [info] - single line comments (0 milliseconds) [info] - single line comments /* */ (4 milliseconds) [info] - multi-line comments (1 millisecond) [info] - reduce empty lines (0 milliseconds) [info] - comment place holder (1 millisecond) [info] EliminateSubqueryAliasesSuite: [info] - eliminate top level subquery (1 millisecond) [info] - eliminate mid-tree subquery (1 millisecond) [info] - eliminate multiple subqueries (1 millisecond) [info] - bin (994 milliseconds) [info] - sampleByKey (4 seconds, 505 milliseconds) [info] ArrayDataIndexedSeqSuite: [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(BooleanType,false) (7 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(BooleanType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(BooleanType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(BooleanType,true) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(ByteType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(ByteType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(ByteType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(ByteType,true) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(ShortType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(ShortType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(ShortType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - 
ArrayType(ShortType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(IntegerType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(IntegerType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(IntegerType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(IntegerType,true) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(LongType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(LongType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(LongType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(LongType,true) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(FloatType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(FloatType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(FloatType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(FloatType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(DoubleType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(DoubleType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(DoubleType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(DoubleType,true) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(DecimalType(10,0),false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(DecimalType(10,0),false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(DecimalType(10,0),true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(DecimalType(10,0),true) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(StringType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(StringType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(StringType,true) (1 millisecond) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(StringType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(BinaryType,false) (1 millisecond) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(BinaryType,false) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(BinaryType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(BinaryType,true) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(DateType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(DateType,false) (1 millisecond) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(DateType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(DateType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(TimestampType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(TimestampType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(TimestampType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(TimestampType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(CalendarIntervalType,false) (0 milliseconds) [info] 
- ArrayDataIndexedSeq - GenericArrayData - ArrayType(CalendarIntervalType,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(CalendarIntervalType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(CalendarIntervalType,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(org.apache.spark.sql.catalyst.encoders.ExamplePointUDT@4cc31766,false) (1 millisecond) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(org.apache.spark.sql.catalyst.encoders.ExamplePointUDT@4cc31766,false) (0 milliseconds) [info] - ArrayDataIndexedSeq - UnsafeArrayData - ArrayType(org.apache.spark.sql.catalyst.encoders.ExamplePointUDT@4cc31766,true) (0 milliseconds) [info] - ArrayDataIndexedSeq - GenericArrayData - ArrayType(org.apache.spark.sql.catalyst.encoders.ExamplePointUDT@4cc31766,true) (0 milliseconds) [info] - log2 (606 milliseconds) [info] ResolveHintsSuite: [info] - invalid hints should be ignored (2 milliseconds) [info] - case-sensitive or insensitive parameters (7 milliseconds) [info] - multiple broadcast hint aliases (2 milliseconds) [info] - do not traverse past existing broadcast hints (2 milliseconds) [info] - should work for subqueries (2 milliseconds) [info] - do not traverse past subquery alias (2 milliseconds) [info] - should work for CTE (7 milliseconds) [info] - should not traverse down CTE (6 milliseconds) [info] - coalesce and repartition hint (8 milliseconds) [info] TreeNodeSuite: [info] - top node changed (1 millisecond) [info] - one child changed (1 millisecond) [info] - no change (2 milliseconds) [info] - collect (1 millisecond) [info] - pre-order transform (0 milliseconds) [info] - post-order transform (0 milliseconds) [info] - transform works on nodes with Option children (2 milliseconds) [info] - mapChildren should only works on children (2 milliseconds) [info] - preserves origin (1 millisecond) [info] - foreach up (1 millisecond) [info] - find (1 millisecond) [info] - collectFirst (1 millisecond) [info] - transformExpressions on nested expression sequence (1 millisecond) [info] - expressions inside a map (3 milliseconds) [info] - toJSON (85 milliseconds) [info] - toJSON should not throws java.lang.StackOverflowError (20 milliseconds) [info] - transform works on stream of children (2 milliseconds) [info] - withNewChildren on stream of children (1 millisecond) [info] - SPARK-32999: TreeNode.nodeName should not throw malformed class name error (2 milliseconds) [info] UpdateNullabilityInAttributeReferencesSuite: [info] - update nullability in AttributeReference (8 milliseconds) [info] QuantileSummariesSuite: [info] - Extremas with epsi=0.1 and seq=increasing, compression=1000 (5 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=increasing, compression=1000 (2 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=increasing, compression=1000 (interleaved) (5 milliseconds) [info] - Tests on empty data with epsi=0.1 and seq=increasing, compression=1000 (0 milliseconds) [info] - Extremas with epsi=0.1 and seq=increasing, compression=10 (1 millisecond) [info] - Some quantile values with epsi=0.1 and seq=increasing, compression=10 (1 millisecond) [info] - Some quantile values with epsi=0.1 and seq=increasing, compression=10 (interleaved) (3 milliseconds) [info] - Tests on empty data with epsi=0.1 and seq=increasing, compression=10 (0 milliseconds) [info] - Extremas with epsi=1.0E-4 and seq=increasing, compression=1000 (1 millisecond) [info] - Some quantile values with 
epsi=1.0E-4 and seq=increasing, compression=1000 (1 millisecond) [info] - Some quantile values with epsi=1.0E-4 and seq=increasing, compression=1000 (interleaved) (7 milliseconds) [info] - Tests on empty data with epsi=1.0E-4 and seq=increasing, compression=1000 (0 milliseconds) [info] - Extremas with epsi=1.0E-4 and seq=increasing, compression=10 (0 milliseconds) [info] - Some quantile values with epsi=1.0E-4 and seq=increasing, compression=10 (1 millisecond) [info] - Some quantile values with epsi=1.0E-4 and seq=increasing, compression=10 (interleaved) (6 milliseconds) [info] - Tests on empty data with epsi=1.0E-4 and seq=increasing, compression=10 (0 milliseconds) [info] - Extremas with epsi=0.1 and seq=decreasing, compression=1000 (1 millisecond) [info] - Some quantile values with epsi=0.1 and seq=decreasing, compression=1000 (0 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=decreasing, compression=1000 (interleaved) (1 millisecond) [info] - Tests on empty data with epsi=0.1 and seq=decreasing, compression=1000 (0 milliseconds) [info] - Extremas with epsi=0.1 and seq=decreasing, compression=10 (0 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=decreasing, compression=10 (0 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=decreasing, compression=10 (interleaved) (2 milliseconds) [info] - Tests on empty data with epsi=0.1 and seq=decreasing, compression=10 (0 milliseconds) [info] - Extremas with epsi=1.0E-4 and seq=decreasing, compression=1000 (0 milliseconds) [info] - Some quantile values with epsi=1.0E-4 and seq=decreasing, compression=1000 (1 millisecond) [info] - Some quantile values with epsi=1.0E-4 and seq=decreasing, compression=1000 (interleaved) (4 milliseconds) [info] - Tests on empty data with epsi=1.0E-4 and seq=decreasing, compression=1000 (0 milliseconds) [info] - Extremas with epsi=1.0E-4 and seq=decreasing, compression=10 (1 millisecond) [info] - Some quantile values with epsi=1.0E-4 and seq=decreasing, compression=10 (0 milliseconds) [info] - Some quantile values with epsi=1.0E-4 and seq=decreasing, compression=10 (interleaved) (3 milliseconds) [info] - Tests on empty data with epsi=1.0E-4 and seq=decreasing, compression=10 (0 milliseconds) [info] - Extremas with epsi=0.1 and seq=random, compression=1000 (1 millisecond) [info] - Some quantile values with epsi=0.1 and seq=random, compression=1000 (0 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=random, compression=1000 (interleaved) (1 millisecond) [info] - Tests on empty data with epsi=0.1 and seq=random, compression=1000 (1 millisecond) [info] - Extremas with epsi=0.1 and seq=random, compression=10 (0 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=random, compression=10 (0 milliseconds) [info] - Some quantile values with epsi=0.1 and seq=random, compression=10 (interleaved) (1 millisecond) [info] - Tests on empty data with epsi=0.1 and seq=random, compression=10 (0 milliseconds) [info] - Extremas with epsi=1.0E-4 and seq=random, compression=1000 (0 milliseconds) [info] - Some quantile values with epsi=1.0E-4 and seq=random, compression=1000 (0 milliseconds) [info] - Some quantile values with epsi=1.0E-4 and seq=random, compression=1000 (interleaved) (3 milliseconds) [info] - Tests on empty data with epsi=1.0E-4 and seq=random, compression=1000 (0 milliseconds) [info] - Extremas with epsi=1.0E-4 and seq=random, compression=10 (0 milliseconds) [info] - Some quantile values with epsi=1.0E-4 and seq=random, compression=10 (0 
milliseconds) [info] - Some quantile values with epsi=1.0E-4 and seq=random, compression=10 (interleaved) (3 milliseconds) [info] - Tests on empty data with epsi=1.0E-4 and seq=random, compression=10 (0 milliseconds) [info] RowTest: [info] Row (without schema) [info] - throws an exception when accessing by fieldName (4 milliseconds) [info] Row (with schema) [info] - fieldIndex(name) returns field index (1 millisecond) [info] - getAs[T] retrieves a value by fieldname (1 millisecond) [info] - Accessing non existent field throws an exception (1 millisecond) [info] - getValuesMap() retrieves values of multiple fields as a Map(field -> value) (1 millisecond) [info] - getValuesMap() retrieves null value on non AnyVal Type (0 milliseconds) [info] - getAs() on type extending AnyVal throws an exception when accessing field that is null (1 millisecond) [info] - getAs() on type extending AnyVal does not throw exception when value is null (0 milliseconds) [info] row equals [info] - equality check for external rows (0 milliseconds) [info] - equality check for internal rows (0 milliseconds) [info] row immutability [info] - copy should return same ref for external rows (1 millisecond) [info] - toSeq should not expose internal state for external rows (4 milliseconds) [info] - toSeq should not expose internal state for internal rows (0 milliseconds) [info] PushProjectThroughUnionSuite: [info] - SPARK-25450 PushProjectThroughUnion rule uses the same exprId for project expressions in each Union child, causing mistakes in constant propagation (12 milliseconds) [info] LookupFunctionsSuite: [info] - SPARK-23486: the functionExists for the Persistent function check (31 milliseconds) [info] - SPARK-23486: the functionExists for the Registered function check (32 milliseconds) [info] OptimizeInSuite: [info] - OptimizedIn test: Remove deterministic repetitions (13 milliseconds) [info] - OptimizedIn test: In clause not optimized to InSet when less than 10 items (2 milliseconds) [info] - OptimizedIn test: In clause optimized to InSet when more than 10 items (5 milliseconds) [info] - OptimizedIn test: In clause not optimized in case filter has attributes (2 milliseconds) [info] - OptimizedIn test: NULL IN (expr1, ..., exprN) gets transformed to Filter(null) (3 milliseconds) [info] - OptimizedIn test: NULL IN (subquery) gets transformed to Filter(null) (7 milliseconds) [info] - OptimizedIn test: Inset optimization disabled as list expression contains attribute) (3 milliseconds) [info] - OptimizedIn test: Inset optimization disabled as list expression contains attribute - select) (3 milliseconds) [info] - OptimizedIn test: Setting the threshold for turning Set into InSet. (2 milliseconds) [info] - OptimizedIn test: one element in list gets transformed to EqualTo. 
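(2 milliseconds)

The OptimizeIn cases above describe how IN lists are planned: a single-element list degenerates to an equality, and lists longer than spark.sql.optimizer.inSetConversionThreshold (10 by default) become an InSet lookup. A minimal sketch (assuming a SparkSession named spark):

    val df = spark.range(100).toDF("x")
    df.filter("x IN (1)").explain(true)  // planned as (x = 1)
    // More than ten items: the optimized plan shows InSet instead of In
    df.filter("x IN (1,2,3,4,5,6,7,8,9,10,11)").explain(true)
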
[info] - OptimizedIn test: In empty list gets transformed to FalseLiteral when value is not nullable (2 milliseconds) [info] - OptimizedIn test: In empty list gets transformed to `If` expression when value is nullable (3 milliseconds) [info] ScalaUDFSuite: [info] - basic (84 milliseconds) [info] - sqrt (586 milliseconds) [info] - better error message for NPE (10 milliseconds) [info] - SPARK-22695: ScalaUDF should not use global variables (1 millisecond) [info] StreamingJoinHelperSuite: [info] - pow (537 milliseconds) [info] - extract watermark from time condition (563 milliseconds) [info] TimeWindowSuite: [info] - time window is unevaluable (1 millisecond) [info] - blank intervals throw exception (2 milliseconds) [info] - invalid intervals throw exception (1 millisecond) [info] - intervals greater than a month throws exception (0 milliseconds) [info] - interval strings work with and without 'interval' prefix and return microseconds (1 millisecond) [info] - SPARK-21590: Start time works with negative values and return microseconds (2 milliseconds) [info] - parse sql expression for duration in microseconds - string (6 milliseconds) [info] - parse sql expression for duration in microseconds - integer (1 millisecond) [info] - parse sql expression for duration in microseconds - long (0 milliseconds) [info] - parse sql expression for duration in microseconds - invalid interval (1 millisecond) [info] - parse sql expression for duration in microseconds - invalid expression (1 millisecond) [info] - SPARK-16837: TimeWindow.apply equivalent to TimeWindow constructor (1 millisecond) [info] InMemoryCatalogSuite: [info] - basic create and list databases (14 milliseconds) [info] - get database when a database exists (26 milliseconds) [info] - get database should throw exception when the database does not exist (14 milliseconds) [info] - list databases without pattern (13 milliseconds) [info] - list databases with pattern (23 milliseconds) [info] - drop database (22 milliseconds) [info] - drop database when the database is not empty (44 milliseconds) [info] - drop database when the database does not exist (15 milliseconds) [info] - alter database (18 milliseconds) [info] - alter database should throw exception when the database does not exist (17 milliseconds) [info] - the table type of an external table should be EXTERNAL_TABLE (14 milliseconds) [info] - create table when the table already exists (21 milliseconds) [info] - drop table (18 milliseconds) [info] - drop table when database/table does not exist (16 milliseconds) [info] - shift left (438 milliseconds) [info] - rename table (66 milliseconds) [info] - rename table when database/table does not exist (22 milliseconds) [info] - rename table when destination table already exists (21 milliseconds) [info] - alter table (21 milliseconds) [info] - alter table when database/table does not exist (21 milliseconds) [info] - alter table schema (20 milliseconds) [info] - alter table stats (21 milliseconds) [info] - get table (18 milliseconds) [info] - get table when database/table does not exist (16 milliseconds) [info] - list tables without pattern (15 milliseconds) [info] - list tables with pattern (16 milliseconds) [info] - column names should be case-preserving and column nullability should be retained (15 milliseconds) [info] - basic create and list partitions (17 milliseconds) [info] - create partitions when database/table does not exist (18 milliseconds) [info] - create partitions that already exist (17 milliseconds) [info] - create 
partitions without location (20 milliseconds) [info] - create/drop partitions in managed tables with location (98 milliseconds) [info] - list partition names (24 milliseconds) [info] - list partition names with partial partition spec (24 milliseconds) [info] - list partitions with partial partition spec (25 milliseconds) [info] - SPARK-21457: list partitions with special chars (17 milliseconds) [info] - list partitions by filter (26 milliseconds) [info] - shift right (566 milliseconds) [info] - drop partitions (30 milliseconds) [info] - drop partitions when database/table does not exist (15 milliseconds) [info] - drop partitions that do not exist (14 milliseconds) [info] - get partition (14 milliseconds) [info] - get partition when database/table does not exist (13 milliseconds) [info] - rename partitions (17 milliseconds) [info] - rename partitions should update the location for managed table (21 milliseconds) [info] - rename partitions when database/table does not exist (24 milliseconds) [info] - rename partitions when the new partition already exists (20 milliseconds) [info] - alter partitions (15 milliseconds) [info] - alter partitions when database/table does not exist (13 milliseconds) [info] - basic create and list functions (11 milliseconds) [info] - create function when database does not exist (12 milliseconds) [info] - create function that already exists (12 milliseconds) [info] - drop function (12 milliseconds) [info] - drop function when database does not exist (12 milliseconds) [info] - drop function that does not exist (13 milliseconds) [info] - get function (12 milliseconds) [info] - get function when database does not exist (13 milliseconds) [info] - rename function (13 milliseconds) [info] - rename function when database does not exist (12 milliseconds) [info] - rename function when new function already exists (14 milliseconds) [info] - alter function (12 milliseconds) [info] - list functions (11 milliseconds) [info] - create/drop database should create/delete the directory (43 milliseconds) [info] - create/drop/rename table should create/delete/rename the directory (81 milliseconds) [info] - shift right unsigned (467 milliseconds) [info] - create/drop/rename partitions should create/delete/rename the directory (78 milliseconds) [info] - drop partition from external table should not delete the directory (22 milliseconds) [info] ExpressionEvalHelperSuite: [info] - SPARK-16489 checkEvaluation should fail if expression reuses variable names (20 milliseconds) [info] CatalystTypeConvertersSuite: [info] - null handling in rows (2 milliseconds) [info] - null handling for individual values (1 millisecond) [info] - option handling in convertToCatalyst (0 milliseconds) [info] - option handling in createToCatalystConverter (0 milliseconds) [info] - primitive array handling (0 milliseconds) [info] - An array with null handling (0 milliseconds) [info] - converting a wrong value to the struct type (1 millisecond) [info] - converting a wrong value to the map type (0 milliseconds) [info] - converting a wrong value to the array type (0 milliseconds) [info] - converting a wrong value to the decimal type (0 milliseconds) [info] - converting a wrong value to the string type (0 milliseconds) [info] - SPARK-24571: convert Char to String (0 milliseconds) [info] SimplifyStringCaseConversionSuite: [info] - simplify UPPER(UPPER(str)) (5 milliseconds) [info] - simplify UPPER(LOWER(str)) (2 milliseconds) [info] - simplify LOWER(UPPER(str)) (2 milliseconds) [info] - simplify LOWER(LOWER(str)) (3 
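milliseconds)

SimplifyStringCaseConversionSuite, just finished, covers the rule that drops an inner case conversion when an outer one makes it redundant. A minimal sketch of the observable effect (assuming a SparkSession named spark):

    import org.apache.spark.sql.functions._

    val df = spark.range(1).selectExpr("'aBc' AS s")
    // UPPER(LOWER(s)) optimizes to UPPER(s); the inner LOWER drops out of the plan
    df.select(upper(lower(col("s")))).explain(true)
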
milliseconds) [info] GeneratorExpressionSuite: [info] - explode (2 milliseconds) [info] - posexplode (2 milliseconds) [info] - inline (2 milliseconds) [info] - stack (7 milliseconds) [info] CollapseRepartitionSuite: [info] - collapse two adjacent coalesces into one (3 milliseconds) [info] - collapse two adjacent repartitions into one (3 milliseconds) [info] - coalesce above repartition (3 milliseconds) [info] - repartition above coalesce (2 milliseconds) [info] - distribute above repartition (4 milliseconds) [info] - distribute above coalesce (3 milliseconds) [info] - repartition above distribute (3 milliseconds) [info] - coalesce above distribute (10 milliseconds) [info] - collapse two adjacent distributes into one (5 milliseconds) [info] SchemaUtilsSuite: [info] - Check column name duplication in case-sensitive cases (6 milliseconds) [info] - Check column name duplication in case-insensitive cases (1 millisecond) [info] - Check no exception thrown for valid schemas (1 millisecond) [info] DateTimeUtilsSuite: [info] - nanoseconds truncation (2 milliseconds) [info] - timestamp and us (1 millisecond) [info] - us and julian day (2 milliseconds) [info] - SPARK-6785: java date conversion before and after epoch (15 milliseconds) [info] - string to date (7 milliseconds) [info] - string to time (7 milliseconds) [info] - hex (485 milliseconds) [info] - string to timestamp (245 milliseconds) [info] - SPARK-15379: special invalid date string (1 millisecond) [info] - hours (0 milliseconds) [info] - minutes (0 milliseconds) [info] - seconds (0 milliseconds) [info] - hours / minutes / seconds (2 milliseconds) [info] - get day in year (0 milliseconds) [info] - get year (0 milliseconds) [info] - get quarter (0 milliseconds) [info] - get month (0 milliseconds) [info] - get day of month (1 millisecond) [info] - date add months (0 milliseconds) [info] - timestamp add months (0 milliseconds) [info] - monthsBetween (1 millisecond) [info] - from UTC timestamp (30 milliseconds) [info] - to UTC timestamp (17 milliseconds) [info] - trailing characters while converting string to timestamp (0 milliseconds) [info] - truncTimestamp (41 milliseconds) [info] - unhex (188 milliseconds) [info] - hypot (1 second, 391 milliseconds) [info] - atan2 (1 second, 548 milliseconds) [info] - sampleByKeyExact (6 seconds, 826 milliseconds) [info] - reduceByKey (36 milliseconds) [info] - reduceByKey with collectAsMap (31 milliseconds) [info] - reduceByKey with many output partitions (42 milliseconds) [info] - reduceByKey with partitioner (30 milliseconds) [info] - countApproxDistinctByKey (111 milliseconds) [info] - join (18 milliseconds) [info] - join all-to-all (21 milliseconds) [info] - leftOuterJoin (18 milliseconds) [info] - cogroup with empty RDD (14 milliseconds) [info] - cogroup with groupByed RDD having 0 partitions (20 milliseconds) [info] - cogroup between multiple RDD with an order of magnitude difference in number of partitions (6 milliseconds) [info] - cogroup between multiple RDD with number of partitions similar in order of magnitude (5 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set without proper partitioner (3 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set with proper partitioner (4 milliseconds) [info] - cogroup between multiple RDD when defaultParallelism is set; with huge number of partitions in upstream RDDs (5 milliseconds) [info] - rightOuterJoin (24 milliseconds) [info] - fullOuterJoin (19 milliseconds) [info] - join with no matches (20 
milliseconds) [info] - join with many output partitions (22 milliseconds) [info] - groupWith (19 milliseconds) [info] - groupWith3 (28 milliseconds) [info] - groupWith4 (24 milliseconds) [info] - zero-partition RDD (37 milliseconds) [info] - keys and values (14 milliseconds) [info] - default partitioner uses partition size (6 milliseconds) [info] - default partitioner uses largest partitioner (7 milliseconds) [info] - subtract (24 milliseconds) [info] - subtract with narrow dependency (49 milliseconds) [info] - subtractByKey (14 milliseconds) [info] - subtractByKey with narrow dependency (54 milliseconds) [info] - foldByKey (37 milliseconds) [info] - foldByKey with mutable result type (28 milliseconds) [info] - saveNewAPIHadoopFile should call setConf if format is configurable (75 milliseconds) [info] - The JobId on the driver and executors should be the same during the commit (37 milliseconds) [info] - saveAsHadoopFile should respect configured output committers (46 milliseconds) [info] - binary log (1 second, 493 milliseconds) [info] - failure callbacks should be called before calling writer.close() in saveNewAPIHadoopFile (37 milliseconds) [info] - failure callbacks should be called before calling writer.close() in saveAsHadoopFile (65 milliseconds) [info] - saveAsNewAPIHadoopDataset should support invalid output paths when there are no files to be committed to an absolute output location (93 milliseconds) [info] - saveAsHadoopDataset should respect empty output directory when there are no files to be committed to an absolute output location (38 milliseconds) [info] - lookup (37 milliseconds) [info] - lookup with partitioner (60 milliseconds) [info] - lookup with bad partitioner (39 milliseconds) [info] ThreadUtilsSuite: [info] - newDaemonSingleThreadExecutor (1 millisecond) [info] - newDaemonSingleThreadScheduledExecutor (2 milliseconds) [info] - newDaemonCachedThreadPool (2 seconds, 17 milliseconds) [info] - sameThread (1 millisecond) [info] - runInNewThread (7 milliseconds)
Exception in thread "test-ForkJoinPool-3-worker-1" Exception in thread "test-ForkJoinPool-3-worker-3" java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.spark.util.ThreadUtilsSuite$$anonfun$11$$anon$1$$anonfun$run$1.apply(ThreadUtilsSuite.scala:152)
    at org.apache.spark.util.ThreadUtilsSuite$$anonfun$11$$anon$1$$anonfun$run$1.apply(ThreadUtilsSuite.scala:151)
    at org.apache.spark.util.ThreadUtils$$anonfun$3$$anonfun$apply$1.apply(ThreadUtils.scala:287)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.spark.util.ThreadUtilsSuite$$anonfun$11$$anon$1$$anonfun$run$1.apply(ThreadUtilsSuite.scala:152)
    at org.apache.spark.util.ThreadUtilsSuite$$anonfun$11$$anon$1$$anonfun$run$1.apply(ThreadUtilsSuite.scala:151)
    at org.apache.spark.util.ThreadUtils$$anonfun$3$$anonfun$apply$1.apply(ThreadUtils.scala:287)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
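The two InterruptedException traces above are the expected output of the "parmap should be interruptible" test that passes just below, not a failure: the suite deliberately interrupts pool threads that are sleeping inside a parallel map, and each worker reports "sleep interrupted" on its way out. A minimal sketch of the pattern being exercised, using plain java.lang.Thread with hypothetical names rather than Spark's ThreadUtils:

    // Illustrative only: interrupt a thread blocked in sleep and observe
    // the InterruptedException, as the parmap test does to its workers.
    object InterruptDemo {
      def main(args: Array[String]): Unit = {
        val worker = new Thread(new Runnable {
          override def run(): Unit =
            try {
              Thread.sleep(60000) // stands in for a blocking parallel-map task
            } catch {
              case _: InterruptedException =>
                println("sleep interrupted") // mirrors the traces in the log above
            }
        })
        worker.start()
        Thread.sleep(100)  // give the worker time to reach its blocking call
        worker.interrupt() // what the test does to its pool threads
        worker.join()
      }
    }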
[info] - parmap should be interruptible (7 milliseconds) [info] LauncherBackendSuite: [info] - round/bround (3 seconds, 653 milliseconds) [info] YarnSchedulerBackendSuite: [info] - RequestExecutors reflects node blacklist and is serializable (111 milliseconds) [info] ApplicationMasterSuite: [info] - history url with hadoop and spark substitutions (27 milliseconds) [info] ContainerPlacementStrategySuite: [info] - allocate locality preferred containers with enough resource and no matched existed containers (35 milliseconds) [info] - allocate locality preferred containers with enough resource and partially matched containers (30 milliseconds) [info] - allocate locality preferred containers with limited resource and partially matched containers (30 milliseconds) [info] - allocate locality preferred containers with fully matched containers (31 milliseconds) [info] - allocate containers with no locality preference (30 milliseconds) [info] - allocate locality preferred containers by considering the localities of pending requests (33 milliseconds) [info] YARNHadoopDelegationTokenManagerSuite: [info] - Correctly loads credential providers (15 milliseconds) [info] YarnProxyRedirectFilterSuite: [info] - redirect proxied requests, pass-through others (22 milliseconds) [info] YarnShuffleServiceSuite: [info] - local: launcher handle (2 seconds, 481 milliseconds) [info] - executor state kept across NM restart (510 milliseconds) [info] - removed applications should not be in registered executor file (37 milliseconds) [info] - shuffle service should be robust to corrupt registered executor file (217 milliseconds) [info] - get correct recovery path (69 milliseconds) [info] - moving recovery file from NM local dir to recovery path (217 milliseconds) [info] - service throws error if cannot start (8 milliseconds) [info] - recovery db should not be created if NM recovery is not enabled (16 milliseconds) [info] LocalityPlacementStrategySuite: [info] HashExpressionsSuite: [info] - handle large number of containers and tasks (SPARK-18750) (283 milliseconds) [info] YarnShuffleIntegrationSuite: [info] - md5 (1 second, 468 milliseconds) [info] - sha1 (198 milliseconds) [info] - sha2 (534 milliseconds) [info] - standalone/client: launcher handle (3 seconds, 617 milliseconds) [info] ExecutorSuite: [info] - SPARK-15963: Catch `TaskKilledException` correctly in Executor.TaskRunner (31 milliseconds) [info] - crc32 (141 milliseconds) [info] - hive-hash for null (6 milliseconds) [info] - hive-hash for boolean (1 millisecond) [info] - hive-hash for byte (1 millisecond) [info] - hive-hash for short (1 millisecond) [info] - hive-hash for int (0 milliseconds) [info] - hive-hash for long (1 millisecond) [info] - hive-hash for float (0 milliseconds) [info] - hive-hash for double (1 millisecond) [info] - hive-hash for string (1 millisecond) [info] - hive-hash for date type (4 milliseconds) [info] - hive-hash for timestamp type (45 milliseconds) [info] - 
hive-hash for CalendarInterval type (20 milliseconds) [info] - hive-hash for array (11 milliseconds) [info] - hive-hash for map (5 milliseconds) [info] - SPARK-19276: Handle FetchFailedExceptions that are hidden by user exceptions (120 milliseconds) [info] - hive-hash for struct (6 milliseconds) [info] - Executor's worker threads should be UninterruptibleThread (63 milliseconds) [info] - SPARK-19276: OOMs correctly handled with a FetchFailure (65 milliseconds) [info] - SPARK-23816: interrupts are not masked by a FetchFailure (60 milliseconds) [info] - Gracefully handle error in task deserialization (18 milliseconds) [info] UninterruptibleThreadSuite: [info] - interrupt when runUninterruptibly is running (1 second, 2 milliseconds) [info] - interrupt before runUninterruptibly runs (2 milliseconds) [info] - nested runUninterruptibly (5 milliseconds) [info] - murmur3/xxHash64/hive hash: struct (2 seconds, 397 milliseconds) [info] - SPARK-30633: xxHash64 with long seed: struct (543 milliseconds) [info] - stress test (1 second, 831 milliseconds) [info] InternalAccumulatorSuite: [info] - internal accumulators in TaskContext (1 millisecond) [info] - internal accumulators in a stage (73 milliseconds) [info] - internal accumulators in multiple stages (389 milliseconds) [info] - internal accumulators in resubmitted stages (1 second, 824 milliseconds) [info] - internal accumulators are registered for cleanups (110 milliseconds) [info] JobWaiterSuite: [info] - call jobFailed multiple times (1 millisecond) [info] DriverSuite: [info] - driver should exit after finishing without cleanup (SPARK-530) !!! IGNORED !!! [info] ElementTrackingStoreSuite: [info] - asynchronous tracking single-fire (8 milliseconds) [info] - tracking for multiple types (3 milliseconds) [info] ExternalAppendOnlyMapSuite: [info] - single insert (80 milliseconds) [info] - multiple insert (77 milliseconds) [info] - insert with collision (70 milliseconds) [info] - ordering (71 milliseconds) [info] - null keys and values (72 milliseconds) [info] - simple aggregator (149 milliseconds) [info] - simple cogroup (84 milliseconds) [info] - spilling (4 seconds, 512 milliseconds) [info] - external shuffle service (22 seconds, 29 milliseconds) [info] - spilling with compression (18 seconds, 376 milliseconds) [info] - murmur3/xxHash64/hive hash: struct,arrayOfString:array,arrayOfArrayOfString:array>,arrayOfArrayOfInt:array>,arrayOfMap:array>,arrayOfStruct:array>,arrayOfUDT:array> (26 seconds, 834 milliseconds) [info] - daysToMillis and millisToDays (43 seconds, 980 milliseconds) [info] - parsing timestamp strings up to microsecond precision (34 milliseconds) [info] - formatting timestamp strings up to microsecond precision (7 milliseconds) [info] - toMillis (1 millisecond) [info] TableSchemaParserSuite: [info] - parse a int (1 millisecond) [info] - parse A int (0 milliseconds) [info] - parse a INT (0 milliseconds) [info] - parse `!@#$%.^&*()` string (1 millisecond) [info] - parse a int, b long (0 milliseconds) [info] - parse a STRUCT (1 millisecond) [info] - parse a int comment 'test' (1 millisecond) [info] - complex hive type (4 milliseconds) [info] - Negative cases (2 milliseconds) [info] StringExpressionsSuite: [info] - concat (221 milliseconds) [info] - SPARK-22498: Concat should not generate codes beyond 64KB (953 milliseconds) [info] - SPARK-22771 Check Concat.checkInputDataTypes results (13 milliseconds) [info] - concat_ws (304 milliseconds) [info] - spilling with compression and encryption (4 seconds, 532 milliseconds) [info] - 
ExternalAppendOnlyMap shouldn't fail when forced to spill before calling its iterator (334 milliseconds) [info] - spilling with hash collisions (303 milliseconds) [info] - spilling with many hash collisions (664 milliseconds) [info] - spilling with hash collisions using the Int.MaxValue key (276 milliseconds) [info] - spilling with null keys and values (421 milliseconds) [info] - SPARK-22713 spill during iteration leaks internal map (1 second, 46 milliseconds) [info] YarnShuffleAuthSuite: [info] - drop all references to the underlying map once the iterator is exhausted (968 milliseconds) [info] - SPARK-22713 external aggregation updates peak execution memory (335 milliseconds) [info] - SPARK-30633: xxHash64 with long seed: struct,arrayOfString:array,arrayOfArrayOfString:array>,arrayOfArrayOfInt:array>,arrayOfMap:array>,arrayOfStruct:array>,arrayOfUDT:array> (8 seconds, 698 milliseconds) [info] - SPARK-22549: ConcatWs should not generate codes beyond 64KB (6 seconds, 190 milliseconds) [info] - elt (210 milliseconds) [info] - SPARK-22550: Elt should not generate codes beyond 64KB (3 seconds, 847 milliseconds) [info] - StringComparison (240 milliseconds) [info] - Substring (735 milliseconds) [info] - string substring_index function (221 milliseconds) [info] - ascii for string (110 milliseconds) [info] - string for ascii (133 milliseconds) [info] - base64/unbase64 for string (271 milliseconds) [info] - encode/decode for string (254 milliseconds) [info] - initcap unit test (84 milliseconds) [info] - Levenshtein distance (119 milliseconds) [info] - soundex unit test (236 milliseconds) [info] - replace (190 milliseconds) [info] - translate (86 milliseconds) [info] - TRIM (244 milliseconds) [info] - LTRIM (252 milliseconds) [info] - RTRIM (274 milliseconds) [info] - FORMAT (270 milliseconds) [info] - SPARK-22603: FormatString should not generate codes beyond 64KB (1 second, 87 milliseconds) [info] - INSTR (164 milliseconds) [info] - LOCATE (289 milliseconds) [info] - LPAD/RPAD (226 milliseconds) [info] - REPEAT (86 milliseconds) [info] - REVERSE (80 milliseconds) [info] - SPACE (100 milliseconds) [info] - length for string / binary (474 milliseconds) [info] - format_number / FormatNumber (749 milliseconds) [info] - find in set (113 milliseconds) [info] - ParseUrl (164 milliseconds) [info] - Sentences (136 milliseconds) [info] CatalogSuite: [info] - desc table when owner is set to null (9 milliseconds) [info] AggregateExpressionSuite: [info] - test references from unresolved aggregate functions (2 milliseconds) [info] ApproxCountDistinctForIntervalsSuite: [info] - fails analysis if parameters are invalid (11 milliseconds) [info] - merging ApproxCountDistinctForIntervals instances (257 milliseconds) [info] - test findHllppIndex(value) for values in the range (2 milliseconds) [info] - round trip serialization (15 milliseconds) [info] - basic operations: update, merge, eval... 
(47 milliseconds) [info] - test for different input types: numeric/date/timestamp (5 milliseconds) [info] CountMinSketchAggSuite: [info] - test data type ByteType (15 milliseconds) [info] - test data type ShortType (3 milliseconds) [info] - test data type IntegerType (3 milliseconds) [info] - test data type LongType (3 milliseconds) [info] - test data type StringType (5 milliseconds) [info] - test data type BinaryType (2 milliseconds) [info] - serialize and de-serialize (36 milliseconds) [info] - fails analysis if eps, confidence or seed provided is not foldable (2 milliseconds) [info] - fails analysis if parameters are invalid (2 milliseconds) [info] - null handling (23 milliseconds) [info] DistributionSuite: [info] - UnspecifiedDistribution and AllTuples (7 milliseconds) [info] - SinglePartition is the output partitioning (7 milliseconds) [info] - HashPartitioning is the output partitioning (2 milliseconds) [info] - RangePartitioning is the output partitioning (2 milliseconds) [info] - Partitioning.numPartitions must match Distribution.requiredNumPartitions to satisfy it (0 milliseconds) [info] StringUtilsSuite: [info] - escapeLikeRegex (0 milliseconds) [info] - filter pattern (2 milliseconds) [info] QueryPlanSuite: [info] - origin remains the same after mapExpressions (SPARK-23823) (1 millisecond) [info] ExprValueSuite: [info] - TrueLiteral and FalseLiteral should be LiteralValue (1 millisecond) [info] ExprIdSuite: [info] - hashcode independent of jvmId (0 milliseconds) [info] - equality should depend on both id and jvmId (0 milliseconds) [info] BinaryComparisonSimplificationSuite: [info] - Preserve nullable exprs in general (8 milliseconds) [info] - Preserve non-deterministic exprs (2 milliseconds) [info] - Nullable Simplification Primitive: <=> (2 milliseconds) [info] - Non-Nullable Simplification Primitive (4 milliseconds) [info] - Expression Normalization (6 milliseconds) [info] - SPARK-26402: accessing nested fields with different cases in case insensitive mode (2 milliseconds) [info] EliminateMapObjectsSuite: [info] - SPARK-20254: Remove unnecessary data conversion for primitive array (23 milliseconds) [info] OptimizerExtendableSuite: [info] - Extending batches possible (2 milliseconds) [info] ObjectExpressionsSuite: [info] - SPARK-16622: The returned value of the called method in Invoke can be null (9 milliseconds) [info] - MapObjects should make copies of unsafe-backed data (125 milliseconds) [info] - force to spill for external aggregation (13 seconds, 90 milliseconds) [info] FileAppenderSuite: [info] - basic file appender (2 milliseconds) [info] - SPARK-23582: StaticInvoke should support interpreted execution (521 milliseconds) [info] - SPARK-23583: Invoke should support interpreted execution (115 milliseconds) [info] - SPARK-23593: InitializeJavaBean should support interpreted execution (42 milliseconds) [info] - InitializeJavaBean doesn't call setters if input in null (14 milliseconds) [info] - SPARK-23585: UnwrapOption should support interpreted execution (45 milliseconds) [info] - SPARK-23586: WrapOption should support interpreted execution (29 milliseconds) [info] - SPARK-23590: CreateExternalRow should support interpreted execution (17 milliseconds) [info] - SPARK-23594 GetExternalRowField should support interpreted execution (25 milliseconds) [info] - SPARK-23591: EncodeUsingSerializer should support interpreted execution (191 milliseconds) [info] - SPARK-23587: MapObjects should support interpreted execution (9 milliseconds) [info] - SPARK-23592: DecodeUsingSerializer 
should support interpreted execution (87 milliseconds) [info] - SPARK-23584 NewInstance should support interpreted execution (51 milliseconds) [info] - murmur3/xxHash64/hive hash: struct,mapOfStringAndArray:map>,mapOfArrayAndInt:map,int>,mapOfArray:map,array>,mapOfStringAndStruct:map>,mapOfStructAndString:map,string>,mapOfStruct:map,struct>> (13 seconds, 742 milliseconds) [info] - rolling file appender - time-based rolling (1 second, 11 milliseconds) [info] - rolling file appender - time-based rolling (compressed) (1 second, 7 milliseconds) [info] - rolling file appender - size-based rolling (10 milliseconds) [info] - rolling file appender - size-based rolling (compressed) (5 milliseconds) [info] - rolling file appender - cleaning (115 milliseconds) [info] - file appender selection (11 milliseconds) [info] - file appender async close stream abruptly (15 milliseconds) [info] - file appender async close stream gracefully (9 milliseconds) [info] PagedDataSourceSuite: [info] - basic (2 milliseconds) [info] PagedTableSuite: [info] - pageNavigation (8 milliseconds) [info] OpenHashMapSuite: [info] - size for specialized, primitive value (int) (5 milliseconds) [info] - initialization (1 millisecond) [info] - primitive value (18 milliseconds) [info] - non-primitive value (13 milliseconds) [info] - null keys (1 millisecond) [info] - null values (1 millisecond) [info] - changeValue (7 milliseconds) [info] - inserting in capacity-1 map (1 millisecond) [info] - contains (1 millisecond) [info] - distinguish between the 0/0.0/0L and null (1 millisecond) [info] PythonRDDSuite: [info] - Writing large strings to the worker (6 milliseconds) [info] - Handle nulls gracefully (1 millisecond) [info] - python server error handling (5 milliseconds) [info] BlockTransferServiceSuite: [info] - fetchBlockSync should not hang when BlockFetchingListener.onBlockFetchSuccess fails (3 milliseconds) [info] AppClientSuite: [info] - interface methods of AppClient using local Master (72 milliseconds) [info] - request from AppClient before initialized with master (34 milliseconds) [info] MetricsConfigSuite: [info] - MetricsConfig with default properties (2 milliseconds) [info] - MetricsConfig with properties set from a file (1 millisecond) [info] - MetricsConfig with properties set from a Spark configuration (1 millisecond) [info] - MetricsConfig with properties set from a file and a Spark configuration (1 millisecond) [info] - MetricsConfig with subProperties (0 milliseconds) [info] SizeTrackerSuite: [info] - vector fixed size insertions (1 second, 148 milliseconds) [info] - LambdaVariable should support interpreted execution (3 seconds, 233 milliseconds) [info] - SPARK-23588 CatalystToExternalMap should support interpreted execution (45 milliseconds) [info] - SPARK-23595 ValidateExternalType should support interpreted execution (384 milliseconds) [info] - SPARK-23589 ExternalMapToCatalyst should support interpreted execution (183 milliseconds) [info] UnsafeRowWriterSuite: [info] - SPARK-25538: zero-out all bits for decimals (0 milliseconds) [info] ResolveNaturalJoinSuite: [info] - natural/using inner join (12 milliseconds) [info] - natural/using left join (5 milliseconds) [info] - natural/using right join (5 milliseconds) [info] - natural/using full outer join (6 milliseconds) [info] - natural/using inner join with no nullability (4 milliseconds) [info] - natural/using left join with no nullability (3 milliseconds) [info] - natural/using right join with no nullability (4 milliseconds) [info] - natural/using full outer join 
with no nullability (5 milliseconds) [info] - using unresolved attribute (2 milliseconds) [info] - using join with a case sensitive analyzer (2 milliseconds) [info] - using join on nested fields (0 milliseconds) [info] - using join with a case insensitive analyzer (4 milliseconds) [info] TypeUtilsSuite: [info] - checkForSameTypeInputExpr (0 milliseconds) [info] DataSourceV2AnalysisSuite: [info] - Append.byName: basic behavior (4 milliseconds) [info] - Append.byName: does not match by position (2 milliseconds) [info] - Append.byName: case sensitive column resolution (2 milliseconds) [info] - Append.byName: case insensitive column resolution (3 milliseconds) [info] - Append.byName: data columns are reordered by name (2 milliseconds) [info] - Append.byName: fail nullable data written to required columns (1 millisecond) [info] - Append.byName: allow required data written to nullable columns (1 millisecond) [info] - Append.byName: missing required columns cause failure and are identified by name (1 millisecond) [info] - Append.byName: missing optional columns cause failure and are identified by name (1 millisecond) [info] - Append.byName: fail canWrite check (0 milliseconds) [info] - Append.byName: insert safe cast (3 milliseconds) [info] - Append.byName: fail extra data fields (1 millisecond) [info] - Append.byName: multiple field errors are reported (1 millisecond) [info] - Append.byPosition: basic behavior (4 milliseconds) [info] - Append.byPosition: data columns are not reordered (2 milliseconds) [info] - Append.byPosition: fail nullable data written to required columns (0 milliseconds) [info] - Append.byPosition: allow required data written to nullable columns (2 milliseconds) [info] - Append.byPosition: missing required columns cause failure (1 millisecond) [info] - Append.byPosition: missing optional columns cause failure (1 millisecond) [info] - Append.byPosition: fail canWrite check (1 millisecond) [info] - Append.byPosition: insert safe cast (2 milliseconds) [info] - Append.byPosition: fail extra data fields (1 millisecond) [info] - Append.byPosition: multiple field errors are reported (1 millisecond) [info] GenerateUnsafeRowJoinerSuite: [info] - simple fixed width types (293 milliseconds) [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 
[info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] - rows with all empty strings (24 milliseconds) [info] - rows with all empty int arrays (40 milliseconds) [info] - alternating empty and non-empty strings (1 millisecond) [info] - vector variable size insertions (1 second, 891 milliseconds) [info] - randomized fix width types (1 second, 28 milliseconds) [info] + schema size 83, 31 [info] + schema size 47, 97 [info] + schema size 56, 97 [info] + schema size 30, 87 [info] + schema size 25, 8 [info] + schema size 41, 8 [info] + schema size 2, 58 [info] + schema size 51, 16 [info] + schema size 37, 73 [info] + schema size 36, 57 [info] + schema size 81, 27 [info] + schema size 46, 79 [info] + schema size 37, 68 [info] + schema size 54, 24 [info] + schema size 48, 2 [info] + schema size 57, 93 [info] + schema size 65, 89 [info] + schema size 15, 98 [info] + schema size 69, 99 [info] + schema size 19, 82 [info] - map fixed size insertions (1 second, 103 milliseconds) [info] - simple variable width types (1 second, 979 milliseconds) [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 0 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 0, 1 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 1, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 64, 0 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 0, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] + schema size 64, 64 [info] - map variable size insertions (1 second, 315 milliseconds) [info] - SPARK-30633: xxHash64 with long seed: struct,mapOfStringAndArray:map>,mapOfArrayAndInt:map,int>,mapOfArray:map,array>,mapOfStringAndStruct:map>,mapOfStructAndString:map,string>,mapOfStruct:map,struct>> (7 seconds, 817 milliseconds) [info] - randomized variable width types (954 milliseconds) [info] + schema size 73, 23 [info] + schema size 59, 37 [info] + schema size 99, 42 [info] + schema size 56, 75 [info] + schema size 9, 39 [info] + schema size 67, 32 [info] + schema size 26, 92 [info] + schema size 31, 99 [info] + schema size 48, 51 [info] + schema size 87, 57 [info] - external shuffle 
service (21 seconds, 31 milliseconds) [info] - map updates (1 second, 293 milliseconds) [info] JavaUtilsSuite: [info] - containsKey implementation without iteratively entrySet call (15 milliseconds) [info] ProactiveClosureSerializationSuite: [info] - throws expected serialization exceptions on actions (8 milliseconds) [info] - mapPartitions transformations throw proactive serialization exceptions (4 milliseconds) [info] - map transformations throw proactive serialization exceptions (2 milliseconds) [info] - filter transformations throw proactive serialization exceptions (3 milliseconds) [info] - flatMap transformations throw proactive serialization exceptions (3 milliseconds) [info] - mapPartitionsWithIndex transformations throw proactive serialization exceptions (4 milliseconds) [info] IndexShuffleBlockResolverSuite: [info] - commit shuffle files multiple times (6 milliseconds) [info] SizeEstimatorSuite: [info] - simple classes (2 milliseconds) [info] - primitive wrapper objects (0 milliseconds) [info] - class field blocks rounding (1 millisecond) [info] - strings (1 millisecond) [info] - primitive arrays (0 milliseconds) [info] - object arrays (9 milliseconds) [info] - 32-bit arch (1 millisecond) [info] - 64-bit arch with no compressed oops (1 millisecond) [info] - class field blocks rounding on 64-bit VM without useCompressedOops (1 millisecond) [info] - check 64-bit detection for s390x arch (1 millisecond) [info] - SizeEstimation can provide the estimated size (1 millisecond) [info] GenericAvroSerializerSuite: [info] - schema compression and decompression (52 milliseconds) [info] - record serialization and deserialization (71 milliseconds) [info] - uses schema fingerprint to decrease message size (2 milliseconds) [info] - caches previously seen schemas (0 milliseconds) [info] PythonRunnerSuite: [info] - format path (3 milliseconds) [info] - format paths (2 milliseconds) [info] KryoSerializerSuite: [info] - SPARK-7392 configuration limits (34 milliseconds) [info] - basic types (20 milliseconds) [info] - pairs (8 milliseconds) [info] - Scala data structures (8 milliseconds) [info] - Bug: SPARK-10251 (9 milliseconds) [info] - ranges (22 milliseconds) [info] - asJavaIterable (14 milliseconds) [info] - custom registrator (12 milliseconds) [info] - kryo with collect (78 milliseconds) [info] - kryo with parallelize (72 milliseconds) [info] - kryo with parallelize for specialized tuples (68 milliseconds) [info] - kryo with parallelize for primitive arrays (63 milliseconds) [info] - kryo with collect for specialized tuples (72 milliseconds) [info] - kryo with SerializableHyperLogLog (77 milliseconds) [info] - kryo with reduce (49 milliseconds) [info] - kryo with fold (49 milliseconds) [info] - kryo with nonexistent custom registrator should fail (2 milliseconds) [info] - default class loader can be set by a different thread (16 milliseconds) [info] - registration of HighlyCompressedMapStatus (7 milliseconds) [info] - serialization buffer overflow reporting (67 milliseconds) [info] - KryoOutputObjectOutputBridge.writeObject and KryoInputObjectInputBridge.readObject (6 milliseconds) [info] - getAutoReset (11 milliseconds) [info] - SPARK-25176 ClassCastException when writing a Map after previously reading a Map with different generic type (8 milliseconds) [info] - instance reuse with autoReset = true, referenceTracking = true (7 milliseconds) [info] - instance reuse with autoReset = false, referenceTracking = true (6 milliseconds) [info] - instance reuse with autoReset = true, referenceTracking = 
false (6 milliseconds) [info] - instance reuse with autoReset = false, referenceTracking = false (6 milliseconds) [info] - SPARK-27216: test RoaringBitmap ser/dser with Kryo (6 milliseconds) [info] KryoSerializerAutoResetDisabledSuite: [info] - sort-shuffle with bypassMergeSort (SPARK-7873) (126 milliseconds) [info] - calling deserialize() after deserializeStream() (8 milliseconds) [info] - SPARK-25786: ByteBuffer.array -- UnsupportedOperationException (5 milliseconds) [info] ExternalSorterSuite: [info] - empty data stream with kryo ser (76 milliseconds) [info] - empty data stream with java ser (70 milliseconds) [info] - few elements per partition with kryo ser (100 milliseconds) [info] - few elements per partition with java ser (58 milliseconds) [info] - empty partitions with spilling with kryo ser (120 milliseconds) [info] - empty partitions with spilling with java ser (100 milliseconds) [info] - murmur3/xxHash64/hive hash: struct,structOfStructOfString:struct>,structOfArray:struct>,structOfMap:struct>,structOfArrayAndMap:struct,map:map>,structOfUDT:struct> (3 seconds, 511 milliseconds) [info] - SPARK-22508: GenerateUnsafeRowJoiner.create should not generate codes beyond 64KB (3 seconds, 235 milliseconds) [info] + schema size 3000, 3000 [info] - SPARK-30993: UserDefinedType matched to fixed length SQL type shouldn't be corrupted (36 milliseconds) [info] ComplexDataSuite: [info] - inequality tests for MapData (14 milliseconds) [info] - GenericInternalRow.copy return a new instance that is independent from the old one (10 milliseconds) [info] - SpecificMutableRow.copy return a new instance that is independent from the old one (1 millisecond) [info] - GenericArrayData.copy return a new instance that is independent from the old one (1 millisecond) [info] - copy on nested complex type (1 millisecond) [info] - SPARK-24659: GenericArrayData.equals should respect element type differences (1 millisecond) [info] ConvertToLocalRelationSuite: [info] - Project on LocalRelation should be turned into a single LocalRelation (8 milliseconds) [info] - Filter on LocalRelation should be turned into a single LocalRelation (6 milliseconds) [info] - SPARK-27798: Expression reusing output shouldn't override values in local relation (3 milliseconds) [info] SetOperationSuite: [info] - union: combine unions into one unions (13 milliseconds) [info] - union: filter to each side (6 milliseconds) [info] - union: project to each side (4 milliseconds) [info] - Remove unnecessary distincts in multiple unions (11 milliseconds) [info] - Keep necessary distincts in multiple unions (14 milliseconds) [info] - EXCEPT ALL rewrite (14 milliseconds) [info] - INTERSECT ALL rewrite (13 milliseconds) [info] CaseInsensitiveMapSuite: [info] - SPARK-32377: CaseInsensitiveMap should be deterministic for addition (2 milliseconds) line 1:0 mismatched input '' expecting {'(', 'SELECT', 'FROM', 'ADD', 'DESC', 'WITH', 'VALUES', 'CREATE', 'TABLE', 'INSERT', 'DELETE', 'DESCRIBE', 'EXPLAIN', 'SHOW', 'USE', 'DROP', 'ALTER', 'MAP', 'SET', 'RESET', 'START', 'COMMIT', 'ROLLBACK', 'REDUCE', 'REFRESH', 'CLEAR', 'CACHE', 'UNCACHE', 'DFS', 'TRUNCATE', 'ANALYZE', 'LIST', 'REVOKE', 'GRANT', 'LOCK', 'UNLOCK', 'MSCK', 'EXPORT', 'IMPORT', 'LOAD'} [info] ParserUtilsSuite: [info] - unescapeSQLString (0 milliseconds) [info] - command (1 millisecond) [info] - operationNotAllowed (1 millisecond) [info] - checkDuplicateKeys (1 millisecond) [info] - source (0 milliseconds) [info] - remainder (1 millisecond) [info] - string (0 milliseconds) [info] - position (0 
milliseconds) [info] - validate (2 milliseconds) [info] - withOrigin (1 millisecond) [info] GenerateUnsafeProjectionSuite: [info] - Test unsafe projection string access pattern (10 milliseconds) [info] - Test unsafe projection for array/map/struct (26 milliseconds) [info] JacksonGeneratorSuite: [info] - initial with StructType and write out a row (26 milliseconds) [info] - initial with StructType and write out rows (3 milliseconds) [info] - initial with StructType and write out an array with single empty row (1 millisecond) [info] - initial with StructType and write out an empty array (0 milliseconds) [info] - initial with Map and write out a map data (1 millisecond) [info] - initial with Map and write out an array of maps (2 milliseconds) [info] - error handling: initial with StructType but error calling write a map (1 millisecond) [info] - error handling: initial with MapType and write out a row (1 millisecond) [info] PullupCorrelatedPredicatesSuite: [info] - PullupCorrelatedPredicates should not produce unresolved plan (10 milliseconds) [info] RewriteSubquerySuite: [info] - Column pruning after rewriting predicate subquery (11 milliseconds) [info] ConstantPropagationSuite: [info] - basic test (11 milliseconds) [info] - with combination of AND and OR predicates (9 milliseconds) [info] - equality predicates outside a `NOT` can be propagated within a `NOT` (7 milliseconds) [info] - equality predicates inside a `NOT` should not be picked for propagation (4 milliseconds) [info] - equality predicates outside a `OR` can be propagated within a `OR` (7 milliseconds) [info] - equality predicates inside a `OR` should not be picked for propagation (4 milliseconds) [info] - equality operator not immediate child of root `AND` should not be used for propagation (6 milliseconds) [info] - conflicting equality predicates (5 milliseconds) [info] - SPARK-30447: take nullability into account (8 milliseconds) [info] ScalaReflectionSuite: [info] - isSubtype (8 milliseconds) [info] - SQLUserDefinedType annotation on Scala structure (3 milliseconds) [info] - primitive data (5 milliseconds) [info] - nullable data (9 milliseconds) [info] - optional data (14 milliseconds) [info] - complex data (14 milliseconds) [info] - generic data (4 milliseconds) [info] - tuple data (4 milliseconds) [info] - type-aliased data (4 milliseconds) [info] - convert PrimitiveData to catalyst (5 milliseconds) [info] - convert Option[Product] to catalyst (10 milliseconds) [info] - infer schema from case class with multiple constructors (5 milliseconds) [info] - SPARK-15062: Get correct serializer for List[_] (2 milliseconds) [info] - SPARK 16792: Get correct deserializer for List[_] (4 milliseconds) [info] - serialize and deserialize arbitrary sequence types (13 milliseconds) [info] - serialize and deserialize arbitrary map types (40 milliseconds) [info] - SPARK-22442: Generate correct field names for special characters (9 milliseconds) [info] - SPARK-22472: add null check for top-level primitive values (3 milliseconds) [info] - SPARK-23025: schemaFor should support Null type (3 milliseconds) [info] - SPARK-23835: add null check to non-nullable types in Tuples (21 milliseconds) [info] CollapseWindowSuite: [info] - collapse two adjacent windows with the same partition/order (7 milliseconds) [info] - Don't collapse adjacent windows with different partitions or orders (10 milliseconds) [info] - Don't collapse adjacent windows with dependent columns (7 milliseconds) [info] CombineConcatsSuite: [info] - combine nested Concat exprs (8 
milliseconds) [info] - combine string and binary exprs (6 milliseconds) [info] MetadataSuite: [info] - String Metadata (4 milliseconds) [info] - Long Metadata (1 millisecond) [info] - Double Metadata (1 millisecond) [info] - Boolean Metadata (2 milliseconds) [info] - Null Metadata (1 millisecond) [info] OrderingSuite: [info] - compare two arrays: a = List(), b = List() (90 milliseconds) [info] - compare two arrays: a = List(1), b = List(1) (41 milliseconds) [info] - compare two arrays: a = List(1, 2), b = List(1, 2) (44 milliseconds) [info] - compare two arrays: a = List(1, 2, 2), b = List(1, 2, 3) (49 milliseconds) [info] - compare two arrays: a = List(), b = List(1) (35 milliseconds) [info] - SPARK-30633: xxHash64 with long seed: struct,structOfStructOfString:struct>,structOfArray:struct>,structOfMap:struct>,structOfArrayAndMap:struct,map:map>,structOfUDT:struct> (997 milliseconds) [info] - hive-hash for decimal (3 milliseconds) [info] - compare two arrays: a = List(1, 2, 3), b = List(1, 2, 3, 4) (42 milliseconds) [info] - compare two arrays: a = List(1, 2, 3), b = List(1, 2, 3, 2) (32 milliseconds) [info] - compare two arrays: a = List(1, 2, 3), b = List(1, 2, 2, 2) (48 milliseconds) [info] - compare two arrays: a = List(1, 2, 3), b = List(1, 2, 3, null) (50 milliseconds) [info] - compare two arrays: a = List(), b = List(null) (43 milliseconds) [info] - compare two arrays: a = List(null), b = List(null) (35 milliseconds) [info] - compare two arrays: a = List(null, null), b = List(null, null) (30 milliseconds) [info] - compare two arrays: a = List(null), b = List(null, null) (34 milliseconds) [info] - compare two arrays: a = List(null), b = List(1) (53 milliseconds) [info] - compare two arrays: a = List(null), b = List(null, 1) (49 milliseconds) [info] - compare two arrays: a = List(null, 1), b = List(1, 1) (38 milliseconds) [info] - compare two arrays: a = List(1, null, 1), b = List(1, null, 1) (35 milliseconds) [info] - compare two arrays: a = List(1, null, 1), b = List(1, null, 2) (31 milliseconds) [info] - GenerateOrdering with StringType (28 milliseconds) [info] - GenerateOrdering with NullType (10 milliseconds) [info] - GenerateOrdering with ArrayType(IntegerType,true) (18 milliseconds) [info] - GenerateOrdering with LongType (11 milliseconds) [info] - GenerateOrdering with IntegerType (11 milliseconds) [info] - GenerateOrdering with DecimalType(20,5) (13 milliseconds) [info] - GenerateOrdering with TimestampType (3 milliseconds) [info] - GenerateOrdering with DoubleType (17 milliseconds) [info] - GenerateOrdering with DateType (2 milliseconds) [info] - GenerateOrdering with StructType(StructField(f1,FloatType,true), StructField(f2,ArrayType(BooleanType,true),true)) (26 milliseconds) [info] - SPARK-18207: Compute hash for a lot of expressions (737 milliseconds) [info] - GenerateOrdering with ArrayType(StructType(StructField(f1,FloatType,true), StructField(f2,ArrayType(BooleanType,true),true)),true) (278 milliseconds) [info] - GenerateOrdering with DecimalType(10,0) (15 milliseconds) [info] - GenerateOrdering with BinaryType (30 milliseconds) [info] - GenerateOrdering with BooleanType (14 milliseconds) [info] - GenerateOrdering with DecimalType(38,18) (14 milliseconds) [info] - GenerateOrdering with ByteType (15 milliseconds) [info] - GenerateOrdering with FloatType (18 milliseconds) [info] - GenerateOrdering with ShortType (14 milliseconds) [info] - SPARK-16845: GeneratedClass$SpecificOrdering grows beyond 64 KB (2 seconds, 301 milliseconds) [info] - SPARK-21344: BinaryType 
comparison does signed byte array comparison (17 milliseconds) [info] - SPARK-22591: GenerateOrdering shouldn't change ctx.INPUT_ROW (1 millisecond) [info] HigherOrderFunctionsSuite: [info] - ArrayTransform (303 milliseconds) [info] - ArrayFilter (145 milliseconds) [info] - ArrayExists (164 milliseconds) [info] - ArrayAggregate (181 milliseconds) [info] - ZipWith (132 milliseconds) [info] ApproximatePercentileSuite: [info] - serialize and de-serialize (16 milliseconds) [info] - spilling in local cluster with kryo ser (5 seconds, 833 milliseconds) [info] - class PercentileDigest, basic operations (2 seconds, 904 milliseconds) [info] - class PercentileDigest, makes sure the memory foot print is bounded (1 second, 374 milliseconds) [info] - class ApproximatePercentile, high level interface, update, merge, eval... (24 milliseconds) [info] - class ApproximatePercentile, low level interface, update, merge, eval... (2 milliseconds) [info] - class ApproximatePercentile, sql string (2 milliseconds) [info] - class ApproximatePercentile, fails analysis if percentage or accuracy is not a constant (0 milliseconds) [info] - class ApproximatePercentile, fails analysis if parameters are invalid (3 milliseconds) [info] - class ApproximatePercentile, automatically add type casting for parameters (22 milliseconds) [info] - class ApproximatePercentile, null handling (0 milliseconds) [info] OptimizerStructuralIntegrityCheckerSuite: [info] - check for invalid plan after execution of rule (7 milliseconds) [info] SubexpressionEliminationSuite: [info] - Semantic equals and hash (0 milliseconds) [info] - Expression Equivalence - basic (4 milliseconds) [info] - Expression Equivalence - Trees (3 milliseconds) [info] - Expression equivalence - non deterministic (1 millisecond) [info] - Children of CodegenFallback (1 millisecond) [info] - Children of conditional expressions (1 millisecond) [info] DataTypeWriteCompatibilitySuite: [info] - Check NullType is incompatible with all other types (2 milliseconds) [info] - Check each type with itself (2 milliseconds) [info] - Check atomic types: write allowed only when casting is safe (4 milliseconds) [info] - Check struct types: missing required field (1 millisecond) [info] - Check struct types: missing starting field, matched by position (1 millisecond) [info] - Check struct types: missing middle field, matched by position (0 milliseconds) [info] - Check struct types: generic colN names are ignored (0 milliseconds) [info] - Check struct types: required field is optional (1 millisecond) [info] - Check struct types: data field would be dropped (1 millisecond) [info] - Check struct types: unsafe casts are not allowed (1 millisecond) [info] - Check struct types: type promotion is allowed (0 milliseconds) [info] - Check struct types: missing optional field is allowed !!! IGNORED !!! 
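The DataTypeWriteCompatibilitySuite results above and below exercise the rule that a write is accepted only when the incoming value can be stored without losing information: exact type matches and widening promotions pass, while narrowing casts and writing optional values into required columns fail. A minimal sketch of that idea using only the public org.apache.spark.sql.types classes; the widening table and the canWrite helper are illustrative assumptions for this sketch, not Spark's actual analyzer implementation:

    import org.apache.spark.sql.types._

    // Simplified stand-in for the write-compatibility checks named in the
    // test output; the promotion chain below is an assumption, not Spark's rule.
    object SafeWriteSketch {
      // Numeric promotions that can never lose information.
      private val widens: Map[DataType, Set[DataType]] = Map(
        ByteType    -> Set(ShortType, IntegerType, LongType),
        ShortType   -> Set(IntegerType, LongType),
        IntegerType -> Set(LongType),
        FloatType   -> Set(DoubleType)
      )

      // A write is safe when types match exactly or the value only widens.
      def canWrite(from: DataType, to: DataType): Boolean =
        from == to || widens.get(from).exists(_.contains(to))
    }

    // SafeWriteSketch.canWrite(IntegerType, LongType)  // true: type promotion is allowed
    // SafeWriteSketch.canWrite(LongType, IntegerType)  // false: unsafe casts are not allowed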
[info] - Check array types: unsafe casts are not allowed (1 millisecond) [info] - Check array types: type promotion is allowed (0 milliseconds) [info] - Check array types: cannot write optional to required elements (1 millisecond) [info] - Check array types: writing required to optional elements is allowed (0 milliseconds) [info] - Check map value types: unsafe casts are not allowed (1 millisecond) [info] - Check map value types: type promotion is allowed (0 milliseconds) [info] - Check map value types: cannot write optional to required values (0 milliseconds) [info] - Check map value types: writing required to optional values is allowed (0 milliseconds) [info] - Check map key types: unsafe casts are not allowed (0 milliseconds) [info] - Check map key types: type promotion is allowed (0 milliseconds) [info] - Check types with multiple errors (2 milliseconds) [info] CodeGenerationSuite: [info] - multithreaded eval (77 milliseconds) [info] - metrics are recorded on compile (8 milliseconds) [info] - SPARK-8443: split wide projections into blocks due to JVM code size limit (940 milliseconds) [info] YarnSparkHadoopUtilSuite: [info] - shell script escaping (8 milliseconds) [info] - Yarn configuration override (28 milliseconds) [info] - test getApplicationAclsForYarn acls on (24 milliseconds) [info] - test getApplicationAclsForYarn acls on and specify users (27 milliseconds) [info] - spilling in local cluster with java ser (5 seconds, 762 milliseconds) [info] - SPARK-24149: retrieve all namenodes from HDFS (557 milliseconds) [info] - SPARK-13242: case-when expression with large number of branches (or cases) (1 second, 474 milliseconds) [info] - SPARK-22543: split large if expressions into blocks due to JVM code size limit (237 milliseconds) [info] - SPARK-14793: split wide array creation into blocks due to JVM code size limit (478 milliseconds) [info] CastSuite: [info] - SPARK-22284: Compute hash for nested structs (12 seconds, 103 milliseconds) [info] - SPARK-30633: xxHash with different type seeds (263 milliseconds) [info] - SPARK-14793: split wide map creation into blocks due to JVM code size limit (1 second, 461 milliseconds) [info] - SPARK-14793: split wide struct creation into blocks due to JVM code size limit (400 milliseconds) [info] - SPARK-14793: split wide named struct creation into blocks due to JVM code size limit (232 milliseconds) [info] - SPARK-14224: split wide external row creation into blocks due to JVM code size limit (364 milliseconds) [info] DateExpressionsSuite: [info] - datetime function current_date (156 milliseconds) [info] - datetime function current_timestamp (2 milliseconds) [info] - SPARK-17702: split wide constructor into blocks due to JVM code size limit (2 seconds, 611 milliseconds) [info] - DayOfYear (3 seconds, 400 milliseconds) [info] - spilling in local cluster with many reduce tasks with kryo ser (9 seconds, 934 milliseconds) [info] - null cast (8 seconds, 152 milliseconds) [info] - cast string to date (238 milliseconds) [info] - SPARK-22226: group splitted expressions into one method per nested class (4 seconds, 469 milliseconds) [info] - test generated safe and unsafe projection (28 milliseconds) [info] - */ in the data (29 milliseconds) [info] - \u in the data (10 milliseconds) [info] - check compilation error doesn't occur caused by specific literal (9 milliseconds) [info] - SPARK-17160: field names are properly escaped by GetExternalRowField (10 milliseconds) [info] - SPARK-17160: field names are properly escaped by AssertTrue (16 milliseconds) [info] - 
should not apply common subexpression elimination on conditional expressions (12 milliseconds) [info] - SPARK-22543: split large predicates into blocks due to JVM code size limit (435 milliseconds) [info] - SPARK-22696: CreateExternalRow should not use global variables (1 millisecond) [info] - SPARK-22696: InitializeJavaBean should not use global variables (1 millisecond) [info] - SPARK-22716: addReferenceObj should not add mutable states (0 milliseconds) [info] - SPARK-18016: define mutable states by using an array (95 milliseconds) [info] - SPARK-22750: addImmutableStateIfNotExists (0 milliseconds) [info] - SPARK-23628: calculateParamLength should compute properly the param length (3 milliseconds) [info] - SPARK-23760: CodegenContext.withSubExprEliminationExprs should save/restore correctly (6 milliseconds) [info] - SPARK-23986: freshName can generate duplicated names (0 milliseconds) [info] - SPARK-25113: should log when there exists generated methods above HugeMethodLimit (726 milliseconds) [info] PercentileSuite: [info] - serialize and de-serialize (75 milliseconds) [info] - class Percentile, high level interface, update, merge, eval... (561 milliseconds) [info] - class Percentile, low level interface, update, merge, eval... (10 milliseconds) [info] - fail analysis if childExpression is invalid (7 milliseconds) [info] - fails analysis if percentage(s) are invalid (5 milliseconds) [info] - null handling (1 millisecond) [info] - negatives frequency column handling (1 millisecond) [info] UDFXPathUtilSuite: [info] - illegal arguments (101 milliseconds) [info] - generic eval (82 milliseconds) [info] - boolean eval (3 milliseconds) [info] - string eval (5 milliseconds) [info] - embedFailure (31 milliseconds) [info] - number eval (3 milliseconds) [info] - node eval (6 milliseconds) [info] - node list eval (3 milliseconds) [info] AnalysisSuite: [info] - union project * (139 milliseconds) [info] - check project's resolved (1 millisecond) [info] - analyze project (5 milliseconds) [info] - resolve sort references - filter/limit (9 milliseconds) [info] - resolve sort references - join (4 milliseconds) [info] - resolve sort references - aggregate (10 milliseconds) [info] - resolve relations (2 milliseconds) [info] - divide should be casted into fractional types (6 milliseconds) [info] - pull out nondeterministic expressions from RepartitionByExpression (2 milliseconds) [info] - pull out nondeterministic expressions from Sort (1 millisecond) [info] - SPARK-9634: cleanup unnecessary Aliases in LogicalPlan (3 milliseconds) [info] - Analysis may leave unnecessary aliases (5 milliseconds) [info] - SPARK-10534: resolve attribute references in order by clause (5 milliseconds) [info] - self intersect should resolve duplicate expression IDs (1 millisecond) [info] - SPARK-8654: invalid CAST in NULL IN(...) 
expression (2 milliseconds) [info] - SPARK-8654: different types in inlist but can be converted to a common type (2 milliseconds) [info] - SPARK-8654: check type compatibility error (2 milliseconds) [info] - SPARK-11725: correctly handle null inputs for ScalaUDF (11 milliseconds) [info] - SPARK-24891 Fix HandleNullInputsForUDF rule (6 milliseconds) [info] - SPARK-11863 mixture of aliases and real columns in order by clause - tpcds 19,55,71 (8 milliseconds) [info] - Eliminate the unnecessary union (1 millisecond) [info] - SPARK-12102: Ignore nullablity when comparing two sides of case (3 milliseconds) [info] - Keep attribute qualifiers after dedup (4 milliseconds) [info] - SPARK-15776: test whether Divide expression's data type can be deduced correctly by analyzer (22 milliseconds) [info] - SPARK-18058: union and set operations shall not care about the nullability when comparing column types (3 milliseconds) [info] - resolve as with an already existed alias (3 milliseconds) [info] - SPARK-20311 range(N) as alias (14 milliseconds) [info] - SPARK-20841 Support table column aliases in FROM clause (7 milliseconds) [info] - SPARK-20962 Support subquery column aliases in FROM clause (4 milliseconds) [info] - SPARK-20963 Support aliases for join relations in FROM clause (8 milliseconds) [info] - SPARK-22614 RepartitionByExpression partitioning (4 milliseconds) [info] - SPARK-24208: analysis fails on self-join with FlatMapGroupsInPandas (7 milliseconds) [info] - SPARK-24488 Generator with multiple aliases (6 milliseconds) [info] - SPARK-24151: CURRENT_DATE, CURRENT_TIMESTAMP should be case insensitive (4 milliseconds) [info] - SPARK-32131: Fix wrong column index when we have more than two columns during union and set operations (5 milliseconds) [info] - SPARK-33733: PullOutNondeterministic should check and collect deterministic field (8 milliseconds) [info] OuterJoinEliminationSuite: [info] - joins: full outer to inner (14 milliseconds) [info] - joins: full outer to right (5 milliseconds) [info] - joins: full outer to left (5 milliseconds) [info] - joins: right to inner (4 milliseconds) [info] - joins: left to inner (4 milliseconds) [info] - joins: left to inner with complicated filter predicates #1 (6 milliseconds) [info] - joins: left to inner with complicated filter predicates #2 (6 milliseconds) [info] - joins: left to inner with complicated filter predicates #3 (6 milliseconds) [info] - joins: left to inner with complicated filter predicates #4 (5 milliseconds) [info] - joins: no outer join elimination if the filter is not NULL eliminated (5 milliseconds) [info] - joins: no outer join elimination if the filter's constraints are not NULL eliminated (7 milliseconds) [info] - no outer join elimination if constraint propagation is disabled (9 milliseconds) [info] ResolveLambdaVariablesSuite: [info] - resolution - no op (2 milliseconds) [info] - resolution - simple (9 milliseconds) [info] - resolution - nested (2 milliseconds) [info] - resolution - hidden (3 milliseconds) [info] - fail - name collisions (2 milliseconds) [info] - fail - lambda arguments (1 millisecond) [info] InMemorySessionCatalogSuite: [info] - basic create and list databases (44 milliseconds) [info] - create databases using invalid names (48 milliseconds) [info] - get database when a database exists (25 milliseconds) [info] - get database should throw exception when the database does not exist (32 milliseconds) [info] - list databases without pattern (22 milliseconds) [info] - list databases with pattern (21 milliseconds) [info] 
- drop database (23 milliseconds) [info] - drop database when the database is not empty (67 milliseconds) [info] - drop database when the database does not exist (19 milliseconds) [info] - drop current database and drop default database (19 milliseconds) [info] - alter database (21 milliseconds) [info] - alter database should throw exception when the database does not exist (19 milliseconds) [info] - get/set current database (35 milliseconds) [info] - create table (18 milliseconds) [info] - create tables using invalid names (26 milliseconds) [info] - create table when database does not exist (26 milliseconds) [info] - create temp view (25 milliseconds) [info] - drop table (24 milliseconds) [info] - drop table when database/table does not exist (25 milliseconds) [info] - drop temp table (18 milliseconds) [info] - rename table (21 milliseconds) [info] - rename tables to an invalid name (18 milliseconds) [info] - rename table when database/table does not exist (19 milliseconds) [info] - rename temp table (18 milliseconds) [info] - alter table (18 milliseconds) [info] - alter table when database/table does not exist (19 milliseconds) [info] - alter table stats (24 milliseconds) [info] - alter table add columns (20 milliseconds) [info] - alter table drop columns (19 milliseconds) [info] - get table (17 milliseconds) [info] - get table when database/table does not exist (17 milliseconds) [info] - lookup table relation (20 milliseconds) [info] - look up view relation (20 milliseconds) [info] - table exists (23 milliseconds) [info] - getTempViewOrPermanentTableMetadata on temporary views (21 milliseconds) [info] - list tables without pattern (18 milliseconds) [info] - list tables with pattern (20 milliseconds) [info] - basic create and list partitions (36 milliseconds) [info] - create partitions when database/table does not exist (17 milliseconds) [info] - create partitions that already exist (18 milliseconds) [info] - create partitions with invalid part spec (31 milliseconds) [info] - drop partitions (27 milliseconds) [info] - drop partitions when database/table does not exist (24 milliseconds) [info] - drop partitions that do not exist (16 milliseconds) [info] - drop partitions with invalid partition spec (19 milliseconds) [info] - get partition (21 milliseconds) [info] - get partition when database/table does not exist (17 milliseconds) [info] - get partition with invalid partition spec (21 milliseconds) [info] - rename partitions (21 milliseconds) [info] - rename partitions when database/table does not exist (16 milliseconds) [info] - rename partition with invalid partition spec (16 milliseconds) [info] - alter partitions (18 milliseconds) [info] - alter partitions when database/table does not exist (107 milliseconds) [info] - alter partition with invalid partition spec (16 milliseconds) [info] - list partition names (16 milliseconds) [info] - list partition names with partial partition spec (14 milliseconds) [info] - list partition names with invalid partial partition spec (16 milliseconds) [info] - list partitions (16 milliseconds) [info] - list partitions with partial partition spec (14 milliseconds) [info] - list partitions with invalid partial partition spec (15 milliseconds) [info] - list partitions when database/table does not exist (15 milliseconds) [info] - basic create and list functions (28 milliseconds) [info] - create function when database does not exist (19 milliseconds) [info] - create function that already exists (17 milliseconds) [info] - create temp function (21 
milliseconds) [info] - isTemporaryFunction (16 milliseconds) [info] - isRegisteredFunction (14 milliseconds) [info] - isPersistentFunction (14 milliseconds) [info] - drop function (18 milliseconds) [info] - drop function when database/function does not exist (15 milliseconds) [info] - drop temp function (22 milliseconds) [info] - get function (20 milliseconds) [info] - get function when database/function does not exist (20 milliseconds) [info] - lookup temp function (20 milliseconds) [info] - list functions (25 milliseconds) [info] - list functions when database does not exist (23 milliseconds) [info] - copy SessionCatalog state - temp views (29 milliseconds) [info] - copy SessionCatalog state - current db (35 milliseconds) [info] - SPARK-19737: detect undefined functions without triggering relation resolution (53 milliseconds) [info] RandomDataGeneratorSuite: [info] - StringType (nullable=true) (6 milliseconds) [info] - StringType (nullable=false) (3 milliseconds) [info] - LongType (nullable=true) (0 milliseconds) [info] - LongType (nullable=false) (0 milliseconds) [info] - IntegerType (nullable=true) (0 milliseconds) [info] - IntegerType (nullable=false) (0 milliseconds) [info] - TimestampType (nullable=true) (0 milliseconds) [info] - TimestampType (nullable=false) (0 milliseconds) [info] - DoubleType (nullable=true) (1 millisecond) [info] - DoubleType (nullable=false) (0 milliseconds) [info] - DateType (nullable=true) (0 milliseconds) [info] - DateType (nullable=false) (1 millisecond) [info] - BinaryType (nullable=true) (0 milliseconds) [info] - BinaryType (nullable=false) (0 milliseconds) [info] - BooleanType (nullable=true) (1 millisecond) [info] - BooleanType (nullable=false) (0 milliseconds) [info] - ByteType (nullable=true) (0 milliseconds) [info] - ByteType (nullable=false) (1 millisecond) [info] - FloatType (nullable=true) (0 milliseconds) [info] - FloatType (nullable=false) (0 milliseconds) [info] - ShortType (nullable=true) (0 milliseconds) [info] - ShortType (nullable=false) (0 milliseconds) [info] - ArrayType(FloatType,true) (0 milliseconds) [info] - ArrayType(LongType,true) (1 millisecond) [info] - ArrayType(IntegerType,true) (0 milliseconds) [info] - ArrayType(TimestampType,true) (1 millisecond) [info] - ArrayType(ByteType,true) (1 millisecond) [info] - ArrayType(ShortType,true) (0 milliseconds) [info] - ArrayType(DecimalType(20,5),true) (1 millisecond) [info] - ArrayType(DateType,true) (7 milliseconds) [info] - ArrayType(BooleanType,true) (1 millisecond) [info] - ArrayType(BinaryType,true) (3 milliseconds) [info] - ArrayType(DecimalType(10,0),true) (1 millisecond) [info] - ArrayType(StringType,true) (134 milliseconds) [info] - ArrayType(DoubleType,true) (1 millisecond) [info] - ArrayType(DecimalType(38,18),true) (1 millisecond) [info] - MapType(StringType,StringType,true) (83 milliseconds) [info] - MapType(StringType,LongType,true) (63 milliseconds) [info] - MapType(StringType,IntegerType,true) (131 milliseconds) [info] - MapType(StringType,DecimalType(20,5),true) (41 milliseconds) [info] - MapType(StringType,TimestampType,true) (54 milliseconds) [info] - MapType(StringType,DoubleType,true) (34 milliseconds) [info] - MapType(StringType,DateType,true) (57 milliseconds) [info] - MapType(StringType,DecimalType(10,0),true) (49 milliseconds) [info] - MapType(StringType,BinaryType,true) (102 milliseconds) [info] - MapType(StringType,BooleanType,true) (34 milliseconds) [info] - MapType(StringType,DecimalType(38,18),true) (42 milliseconds) [info] - 
MapType(StringType,ByteType,true) (52 milliseconds) [info] - MapType(StringType,FloatType,true) (59 milliseconds) [info] - MapType(StringType,ShortType,true) (43 milliseconds) [info] - MapType(LongType,StringType,true) (47 milliseconds) [info] - MapType(LongType,LongType,true) (2 milliseconds) [info] - MapType(LongType,IntegerType,true) (1 millisecond) [info] - MapType(LongType,DecimalType(20,5),true) (1 millisecond) [info] - MapType(LongType,TimestampType,true) (1 millisecond) [info] - MapType(LongType,DoubleType,true) (2 milliseconds) [info] - MapType(LongType,DateType,true) (2 milliseconds) [info] - MapType(LongType,DecimalType(10,0),true) (2 milliseconds) [info] - MapType(LongType,BinaryType,true) (9 milliseconds) [info] - MapType(LongType,BooleanType,true) (2 milliseconds) [info] - MapType(LongType,DecimalType(38,18),true) (1 millisecond) [info] - MapType(LongType,ByteType,true) (1 millisecond) [info] - MapType(LongType,FloatType,true) (1 millisecond) [info] - MapType(LongType,ShortType,true) (1 millisecond) [info] - MapType(IntegerType,StringType,true) (88 milliseconds) [info] - MapType(IntegerType,LongType,true) (1 millisecond) [info] - MapType(IntegerType,IntegerType,true) (1 millisecond) [info] - MapType(IntegerType,DecimalType(20,5),true) (1 millisecond) [info] - MapType(IntegerType,TimestampType,true) (1 millisecond) [info] - MapType(IntegerType,DoubleType,true) (1 millisecond) [info] - MapType(IntegerType,DateType,true) (2 milliseconds) [info] - MapType(IntegerType,DecimalType(10,0),true) (1 millisecond) [info] - MapType(IntegerType,BinaryType,true) (3 milliseconds) [info] - MapType(IntegerType,BooleanType,true) (1 millisecond) [info] - MapType(IntegerType,DecimalType(38,18),true) (1 millisecond) [info] - MapType(IntegerType,ByteType,true) (1 millisecond) [info] - MapType(IntegerType,FloatType,true) (1 millisecond) [info] - MapType(IntegerType,ShortType,true) (0 milliseconds) [info] - MapType(TimestampType,StringType,true) (29 milliseconds) [info] - MapType(TimestampType,LongType,true) (1 millisecond) [info] - MapType(TimestampType,IntegerType,true) (2 milliseconds) [info] - MapType(TimestampType,DecimalType(20,5),true) (3 milliseconds) [info] - MapType(TimestampType,TimestampType,true) (1 millisecond) [info] - MapType(TimestampType,DoubleType,true) (0 milliseconds) [info] - MapType(TimestampType,DateType,true) (2 milliseconds) [info] - MapType(TimestampType,DecimalType(10,0),true) (1 millisecond) [info] - MapType(TimestampType,BinaryType,true) (3 milliseconds) [info] - MapType(TimestampType,BooleanType,true) (0 milliseconds) [info] - MapType(TimestampType,DecimalType(38,18),true) (2 milliseconds) [info] - MapType(TimestampType,ByteType,true) (1 millisecond) [info] - MapType(TimestampType,FloatType,true) (1 millisecond) [info] - MapType(TimestampType,ShortType,true) (2 milliseconds) [info] - MapType(DoubleType,StringType,true) (43 milliseconds) [info] - MapType(DoubleType,LongType,true) (2 milliseconds) [info] - MapType(DoubleType,IntegerType,true) (1 millisecond) [info] - MapType(DoubleType,DecimalType(20,5),true) (2 milliseconds) [info] - MapType(DoubleType,TimestampType,true) (1 millisecond) [info] - MapType(DoubleType,DoubleType,true) (1 millisecond) [info] - MapType(DoubleType,DateType,true) (3 milliseconds) [info] - MapType(DoubleType,DecimalType(10,0),true) (1 millisecond) [info] - MapType(DoubleType,BinaryType,true) (2 milliseconds) [info] - MapType(DoubleType,BooleanType,true) (2 milliseconds) [info] - MapType(DoubleType,DecimalType(38,18),true) (2 milliseconds) [info] 
- MapType(DoubleType,ByteType,true) (1 millisecond) [info] - MapType(DoubleType,FloatType,true) (1 millisecond) [info] - MapType(DoubleType,ShortType,true) (2 milliseconds) [info] - MapType(DateType,StringType,true) (28 milliseconds) [info] - MapType(DateType,LongType,true) (2 milliseconds) [info] - MapType(DateType,IntegerType,true) (3 milliseconds) [info] - MapType(DateType,DecimalType(20,5),true) (4 milliseconds) [info] - MapType(DateType,TimestampType,true) (4 milliseconds) [info] - MapType(DateType,DoubleType,true) (1 millisecond) [info] - MapType(DateType,DateType,true) (4 milliseconds) [info] - MapType(DateType,DecimalType(10,0),true) (2 milliseconds) [info] - MapType(DateType,BinaryType,true) (4 milliseconds) [info] - MapType(DateType,BooleanType,true) (1 millisecond) [info] - MapType(DateType,DecimalType(38,18),true) (2 milliseconds) [info] - MapType(DateType,ByteType,true) (2 milliseconds) [info] - MapType(DateType,FloatType,true) (2 milliseconds) [info] - MapType(DateType,ShortType,true) (2 milliseconds) [info] - MapType(BinaryType,StringType,true) (21 milliseconds) [info] - MapType(BinaryType,LongType,true) (3 milliseconds) [info] - MapType(BinaryType,IntegerType,true) (2 milliseconds) [info] - MapType(BinaryType,DecimalType(20,5),true) (2 milliseconds) [info] - MapType(BinaryType,TimestampType,true) (2 milliseconds) [info] - MapType(BinaryType,DoubleType,true) (2 milliseconds) [info] - MapType(BinaryType,DateType,true) (2 milliseconds) [info] - MapType(BinaryType,DecimalType(10,0),true) (3 milliseconds) [info] - MapType(BinaryType,BinaryType,true) (5 milliseconds) [info] - MapType(BinaryType,BooleanType,true) (6 milliseconds) [info] - MapType(BinaryType,DecimalType(38,18),true) (2 milliseconds) [info] - MapType(BinaryType,ByteType,true) (2 milliseconds) [info] - MapType(BinaryType,FloatType,true) (2 milliseconds) [info] - MapType(BinaryType,ShortType,true) (2 milliseconds) [info] - MapType(BooleanType,StringType,true) (2 milliseconds) [info] - MapType(BooleanType,LongType,true) (0 milliseconds) [info] - MapType(BooleanType,IntegerType,true) (0 milliseconds) [info] - MapType(BooleanType,DecimalType(20,5),true) (0 milliseconds) [info] - MapType(BooleanType,TimestampType,true) (1 millisecond) [info] - MapType(BooleanType,DoubleType,true) (0 milliseconds) [info] - MapType(BooleanType,DateType,true) (0 milliseconds) [info] - MapType(BooleanType,DecimalType(10,0),true) (0 milliseconds) [info] - MapType(BooleanType,BinaryType,true) (0 milliseconds) [info] - MapType(BooleanType,BooleanType,true) (1 millisecond) [info] - MapType(BooleanType,DecimalType(38,18),true) (0 milliseconds) [info] - MapType(BooleanType,ByteType,true) (1 millisecond) [info] - MapType(BooleanType,FloatType,true) (0 milliseconds) [info] - MapType(BooleanType,ShortType,true) (0 milliseconds) [info] - MapType(ByteType,StringType,true) (26 milliseconds) [info] - MapType(ByteType,LongType,true) (1 millisecond) [info] - MapType(ByteType,IntegerType,true) (1 millisecond) [info] - MapType(ByteType,DecimalType(20,5),true) (1 millisecond) [info] - MapType(ByteType,TimestampType,true) (0 milliseconds) [info] - MapType(ByteType,DoubleType,true) (1 millisecond) [info] - MapType(ByteType,DateType,true) (1 millisecond) [info] - MapType(ByteType,DecimalType(10,0),true) (1 millisecond) [info] - MapType(ByteType,BinaryType,true) (3 milliseconds) [info] - MapType(ByteType,BooleanType,true) (0 milliseconds) [info] - MapType(ByteType,DecimalType(38,18),true) (1 millisecond) [info] - MapType(ByteType,ByteType,true) (1 millisecond) 
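
Note: the long MapType/StructType listing above and below comes from RandomDataGeneratorSuite, which generates random values for every (data type, nullable) combination and checks that they are valid Catalyst inputs. The real generator is a test-only helper inside Spark; the following is a hypothetical hand-rolled sketch of the same idea (helper name and type coverage are mine, not Spark's):

    import scala.util.Random
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.types._

    // Recursively produce a random external-type value for a Catalyst DataType.
    def randomValue(dt: DataType, rng: Random): Any = dt match {
      case IntegerType        => rng.nextInt()
      case LongType           => rng.nextLong()
      case DoubleType         => rng.nextDouble()
      case BooleanType        => rng.nextBoolean()
      case StringType         => rng.alphanumeric.take(8).mkString
      case ArrayType(et, _)   => Seq.fill(3)(randomValue(et, rng))
      case MapType(kt, vt, _) => Map(randomValue(kt, rng) -> randomValue(vt, rng))
      case StructType(fields) => Row(fields.map(f => randomValue(f.dataType, rng)): _*)
      case _                  => null // types not sketched here fall back to null
    }
    println(randomValue(MapType(StringType, IntegerType), new Random(42)))
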
[info] - MapType(ByteType,FloatType,true) (1 millisecond) [info] - MapType(ByteType,ShortType,true) (1 millisecond) [info] - MapType(FloatType,StringType,true) (62 milliseconds) [info] - MapType(FloatType,LongType,true) (2 milliseconds) [info] - MapType(FloatType,IntegerType,true) (1 millisecond) [info] - MapType(FloatType,DecimalType(20,5),true) (2 milliseconds) [info] - MapType(FloatType,TimestampType,true) (0 milliseconds) [info] - MapType(FloatType,DoubleType,true) (3 milliseconds) [info] - MapType(FloatType,DateType,true) (2 milliseconds) [info] - MapType(FloatType,DecimalType(10,0),true) (2 milliseconds) [info] - MapType(FloatType,BinaryType,true) (4 milliseconds) [info] - MapType(FloatType,BooleanType,true) (1 millisecond) [info] - MapType(FloatType,DecimalType(38,18),true) (1 millisecond) [info] - Year (9 seconds, 732 milliseconds) [info] - MapType(FloatType,ByteType,true) (1 millisecond) [info] - MapType(FloatType,FloatType,true) (1 millisecond) [info] - MapType(FloatType,ShortType,true) (1 millisecond) [info] - MapType(ShortType,StringType,true) (78 milliseconds) [info] - MapType(ShortType,LongType,true) (1 millisecond) [info] - MapType(ShortType,IntegerType,true) (0 milliseconds) [info] - MapType(ShortType,DecimalType(20,5),true) (1 millisecond) [info] - MapType(ShortType,TimestampType,true) (1 millisecond) [info] - MapType(ShortType,DoubleType,true) (2 milliseconds) [info] - MapType(ShortType,DateType,true) (1 millisecond) [info] - MapType(ShortType,DecimalType(10,0),true) (1 millisecond) [info] - MapType(ShortType,BinaryType,true) (2 milliseconds) [info] - MapType(ShortType,BooleanType,true) (1 millisecond) [info] - MapType(ShortType,DecimalType(38,18),true) (1 millisecond) [info] - MapType(ShortType,ByteType,true) (1 millisecond) [info] - MapType(ShortType,FloatType,true) (1 millisecond) [info] - MapType(ShortType,ShortType,true) (1 millisecond) [info] - StructType(StructField(a,StringType,true), StructField(b,StringType,true)) (3 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,IntegerType,true)) (1 millisecond) [info] - StructType(StructField(a,StringType,true), StructField(b,DecimalType(20,5),true)) (1 millisecond) [info] - StructType(StructField(a,StringType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,DoubleType,true)) (1 millisecond) [info] - StructType(StructField(a,StringType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,BinaryType,true)) (1 millisecond) [info] - StructType(StructField(a,StringType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,ByteType,true)) (1 millisecond) [info] - StructType(StructField(a,StringType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,StringType,true), StructField(b,ShortType,true)) (1 millisecond) [info] - StructType(StructField(a,LongType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,LongType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - 
StructType(StructField(a,LongType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,DoubleType,true)) (1 millisecond) [info] - StructType(StructField(a,LongType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,DecimalType(38,18),true)) (1 millisecond) [info] - StructType(StructField(a,LongType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,LongType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,IntegerType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,DecimalType(20,5),true)) (1 millisecond) [info] - StructType(StructField(a,IntegerType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,BinaryType,true)) (1 millisecond) [info] - StructType(StructField(a,IntegerType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,IntegerType,true), StructField(b,ShortType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,StringType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,LongType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,DateType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - 
StructType(StructField(a,DecimalType(20,5),true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(20,5),true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,TimestampType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,BinaryType,true)) (1 millisecond) [info] - StructType(StructField(a,TimestampType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,TimestampType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,DoubleType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,IntegerType,true)) (1 millisecond) [info] - StructType(StructField(a,DoubleType,true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,DateType,true)) (1 millisecond) [info] - StructType(StructField(a,DoubleType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,DoubleType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] 
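
Note: each StructType(StructField(a,...), StructField(b,...)) entry in this listing is a generated two-field schema. For reference, this is how one of those schemas is declared and used against a DataFrame (values invented; assumes a spark-shell session):

    import org.apache.spark.sql.Row
    import org.apache.spark.sql.types._

    // One of the two-field schemas enumerated above.
    val schema = StructType(Seq(
      StructField("a", StringType, nullable = true),
      StructField("b", TimestampType, nullable = true)))
    val rows = spark.sparkContext.parallelize(Seq(Row("x", new java.sql.Timestamp(0L))))
    spark.createDataFrame(rows, schema).printSchema()
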
- StructType(StructField(a,DateType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,DateType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,DecimalType(20,5),true)) (1 millisecond) [info] - StructType(StructField(a,DateType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,DecimalType(10,0),true)) (1 millisecond) [info] - StructType(StructField(a,DateType,true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,DateType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,DecimalType(10,0),true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,ByteType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(10,0),true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,BinaryType,true), StructField(b,LongType,true)) (1 millisecond) [info] - StructType(StructField(a,BinaryType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,DoubleType,true)) (1 millisecond) [info] - 
StructType(StructField(a,BinaryType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,BinaryType,true)) (1 millisecond) [info] - StructType(StructField(a,BinaryType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,ByteType,true)) (1 millisecond) [info] - StructType(StructField(a,BinaryType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,BinaryType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,StringType,true)) (2 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,DecimalType(38,18),true)) (1 millisecond) [info] - StructType(StructField(a,BooleanType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,BooleanType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,TimestampType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), 
StructField(b,ByteType,true)) (1 millisecond) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,DecimalType(38,18),true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,ByteType,true), StructField(b,LongType,true)) (1 millisecond) [info] - StructType(StructField(a,ByteType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,BinaryType,true)) (1 millisecond) [info] - StructType(StructField(a,ByteType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,ByteType,true), StructField(b,FloatType,true)) (1 millisecond) [info] - StructType(StructField(a,ByteType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,FloatType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,IntegerType,true)) (1 millisecond) [info] - StructType(StructField(a,FloatType,true), StructField(b,DecimalType(20,5),true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,DateType,true)) (1 millisecond) [info] - StructType(StructField(a,FloatType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,BooleanType,true)) (1 millisecond) [info] - StructType(StructField(a,FloatType,true), StructField(b,DecimalType(38,18),true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,FloatType,true), StructField(b,ShortType,true)) (1 millisecond) [info] - StructType(StructField(a,ShortType,true), StructField(b,StringType,true)) (1 millisecond) [info] - StructType(StructField(a,ShortType,true), StructField(b,LongType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,IntegerType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,DecimalType(20,5),true)) (1 millisecond) [info] - StructType(StructField(a,ShortType,true), 
StructField(b,TimestampType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,DoubleType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,DateType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,DecimalType(10,0),true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,BinaryType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,BooleanType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,DecimalType(38,18),true)) (1 millisecond) [info] - StructType(StructField(a,ShortType,true), StructField(b,ByteType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,FloatType,true)) (0 milliseconds) [info] - StructType(StructField(a,ShortType,true), StructField(b,ShortType,true)) (0 milliseconds) [info] - spilling in local cluster with many reduce tasks with java ser (8 seconds, 677 milliseconds) [info] - cleanup of intermediate files in sorter (101 milliseconds) [info] - cleanup of intermediate files in sorter with failures (108 milliseconds) [info] - cleanup of intermediate files in shuffle (202 milliseconds) [info] - cleanup of intermediate files in shuffle with failures (143 milliseconds) [info] - no sorting or partial aggregation with kryo ser (85 milliseconds) [info] - no sorting or partial aggregation with java ser (81 milliseconds) [info] - no sorting or partial aggregation with spilling with kryo ser (78 milliseconds) [info] - no sorting or partial aggregation with spilling with java ser (173 milliseconds) [info] - sorting, no partial aggregation with kryo ser (77 milliseconds) [info] - sorting, no partial aggregation with java ser (73 milliseconds) [info] - sorting, no partial aggregation with spilling with kryo ser (107 milliseconds) [info] - sorting, no partial aggregation with spilling with java ser (85 milliseconds) [info] - partial aggregation, no sorting with kryo ser (93 milliseconds) [info] - partial aggregation, no sorting with java ser (74 milliseconds) [info] - partial aggregation, no sorting with spilling with kryo ser (79 milliseconds) [info] - partial aggregation, no sorting with spilling with java ser (70 milliseconds) [info] - partial aggregation and sorting with kryo ser (71 milliseconds) [info] - partial aggregation and sorting with java ser (92 milliseconds) [info] - partial aggregation and sorting with spilling with kryo ser (191 milliseconds) [info] - partial aggregation and sorting with spilling with java ser (138 milliseconds) [info] - check size of generated map (4 seconds, 615 milliseconds) [info] - Use Float.NaN for all NaN values (1 millisecond) [info] - Use Double.NaN for all NaN values (1 millisecond) [info] ExpressionSetSuite: [info] - expect 1: (A#1 + 1), (a#1 + 1) (0 milliseconds) [info] - expect 2: (A#1 + 1), (a#1 + 2) (1 millisecond) [info] - expect 2: (A#1 + 1), (a#3 + 1) (0 milliseconds) [info] - expect 2: (A#1 + 1), (B#2 + 1) (0 milliseconds) [info] - expect 1: (A#1 + a#1), (a#1 + A#1) (1 millisecond) [info] - expect 1: (A#1 + B#2), (B#2 + A#1) (0 milliseconds) [info] - expect 1: ((A#1 + B#2) + 3), ((B#2 + 3) + A#1), ((B#2 + A#1) + 3), ((3 + A#1) + B#2) (0 milliseconds) [info] - expect 1: ((A#1 * B#2) * 3), ((B#2 * 3) * A#1), ((B#2 * A#1) * 3), ((3 * A#1) * B#2) (1 millisecond) [info] - expect 1: (A#1 = B#2), (B#2 = A#1) (0 milliseconds) [info] - expect 1: ((A#1 + 1) = B#2), (B#2 = (1 + A#1)) 
(0 milliseconds) [info] - expect 2: (A#1 - B#2), (B#2 - A#1) (0 milliseconds) [info] - expect 1: (A#1 > B#2), (B#2 < A#1) (0 milliseconds) [info] - expect 1: (A#1 >= B#2), (B#2 <= A#1) (0 milliseconds) [info] - expect 1: NOT (none#4 > 1), (none#4 <= 1), NOT (1 < none#4), (1 >= none#4) (1 millisecond) [info] - expect 1: NOT (none#5 > 1), (none#5 <= 1), NOT (1 < none#5), (1 >= none#5) (1 millisecond) [info] - expect 1: NOT (none#4 < 1), (none#4 >= 1), NOT (1 > none#4), (1 <= none#4) (0 milliseconds) [info] - expect 1: NOT (none#5 < 1), (none#5 >= 1), NOT (1 > none#5), (1 <= none#5) (0 milliseconds) [info] - expect 1: NOT (none#4 >= 1), (none#4 < 1), NOT (1 <= none#4), (1 > none#4) (1 millisecond) [info] - expect 1: NOT (none#5 >= 1), (none#5 < 1), NOT (1 <= none#5), (1 > none#5) (0 milliseconds) [info] - expect 1: NOT (none#4 <= 1), (none#4 > 1), NOT (1 >= none#4), (1 < none#4) (1 millisecond) [info] - expect 1: NOT (none#5 <= 1), (none#5 > 1), NOT (1 >= none#5), (1 < none#5) (0 milliseconds) [info] - expect 1: ((A#1 > B#2) && (A#1 <= 10)), ((A#1 <= 10) && (A#1 > B#2)) (1 millisecond) [info] - expect 1: (((A#1 > B#2) && (B#2 > 100)) && (A#1 <= 10)), (((B#2 > 100) && (A#1 <= 10)) && (A#1 > B#2)) (0 milliseconds) [info] - expect 1: ((A#1 > B#2) || (A#1 <= 10)), ((A#1 <= 10) || (A#1 > B#2)) (1 millisecond) [info] - expect 1: (((A#1 > B#2) || (B#2 > 100)) || (A#1 <= 10)), (((B#2 > 100) || (A#1 <= 10)) || (A#1 > B#2)) (0 milliseconds) [info] - expect 1: (((A#1 <= 10) && (A#1 > B#2)) || (B#2 > 100)), ((B#2 > 100) || ((A#1 <= 10) && (A#1 > B#2))) (1 millisecond) [info] - expect 1: ((A#1 >= B#2) || ((A#1 > 10) && (B#2 < 10))), (((B#2 < 10) && (A#1 > 10)) || (A#1 >= B#2)) (0 milliseconds) [info] - expect 1: (((B#2 > 100) || ((A#1 < 100) && (B#2 <= A#1))) || ((A#1 >= 10) && (B#2 >= 50))), ((((A#1 >= 10) && (B#2 >= 50)) || (B#2 > 100)) || ((A#1 < 100) && (B#2 <= A#1))), ((((B#2 >= 50) && (A#1 >= 10)) || ((B#2 <= A#1) && (A#1 < 100))) || (B#2 > 100)) (1 millisecond) [info] - expect 1: ((((B#2 > 100) && (A#1 < 100)) && (B#2 <= A#1)) || ((A#1 >= 10) && (B#2 >= 50))), (((A#1 >= 10) && (B#2 >= 50)) || (((A#1 < 100) && (B#2 > 100)) && (B#2 <= A#1))), (((B#2 >= 50) && (A#1 >= 10)) || (((B#2 <= A#1) && (A#1 < 100)) && (B#2 > 100))) (1 millisecond) [info] - expect 1: (((A#1 >= 10) || (((B#2 <= 10) && (A#1 = B#2)) && (A#1 < 100))) || (B#2 >= 100)), (((((A#1 = B#2) && (A#1 < 100)) && (B#2 <= 10)) || (B#2 >= 100)) || (A#1 >= 10)), (((((A#1 < 100) && (B#2 <= 10)) && (A#1 = B#2)) || (A#1 >= 10)) || (B#2 >= 100)), ((((B#2 <= 10) && (A#1 = B#2)) && (A#1 < 100)) || ((A#1 >= 10) || (B#2 >= 100))) (1 millisecond) [info] - expect 2: ((rand(1) > A#1) && (A#1 <= 10)), ((A#1 <= 10) && (rand(1) > A#1)) (0 milliseconds) [info] - expect 2: (((A#1 > B#2) && (B#2 > 100)) && (rand(1) > A#1)), (((B#2 > 100) && (rand(1) > A#1)) && (A#1 > B#2)) (1 millisecond) [info] - expect 2: ((rand(1) > A#1) || (A#1 <= 10)), ((A#1 <= 10) || (rand(1) > A#1)) (0 milliseconds) [info] - expect 2: (((A#1 > B#2) || (A#1 <= rand(1))) || (A#1 <= 10)), (((A#1 <= rand(1)) || (A#1 <= 10)) || (A#1 > B#2)) (0 milliseconds) [info] - expect 2: rand(1), rand(1) (0 milliseconds) [info] - expect 2: (((A#1 > B#2) || (B#2 > 100)) && (A#1 = rand(1))), (((B#2 > 100) || (A#1 > B#2)) && (A#1 = rand(1))) (0 milliseconds) [info] - expect 2: (((rand(1) > A#1) || ((A#1 <= rand(1)) && (A#1 > B#2))) || ((A#1 > 10) && (B#2 > 10))), (((rand(1) > A#1) || ((A#1 <= rand(1)) && (A#1 > B#2))) || ((B#2 > 10) && (A#1 > 10))) (0 milliseconds) [info] - expect 2: (((rand(1) > A#1) || 
((A#1 <= rand(1)) && (A#1 > B#2))) || ((A#1 > 10) && (B#2 > 10))), (((rand(1) > A#1) || ((A#1 > B#2) && (A#1 <= rand(1)))) || ((A#1 > 10) && (B#2 > 10))) (0 milliseconds) [info] - add to / remove from set (1 millisecond) [info] - add multiple elements to set (1 millisecond) [info] - add single element to set with non-deterministic expressions (1 millisecond) [info] - remove single element to set with non-deterministic expressions (0 milliseconds) [info] - add multiple elements to set with non-deterministic expressions (1 millisecond) [info] - remove multiple elements to set with non-deterministic expressions (0 milliseconds) [info] DataTypeParserSuite: [info] - parse int (0 milliseconds) [info] - parse integer (1 millisecond) [info] - parse BooLean (0 milliseconds) [info] - parse tinYint (0 milliseconds) [info] - parse smallINT (1 millisecond) [info] - parse INT (0 milliseconds) [info] - parse INTEGER (0 milliseconds) [info] - parse bigint (1 millisecond) [info] - parse float (0 milliseconds) [info] - parse dOUBle (0 milliseconds) [info] - parse decimal(10, 5) (0 milliseconds) [info] - parse decimal (0 milliseconds) [info] - parse DATE (0 milliseconds) [info] - parse timestamp (0 milliseconds) [info] - parse string (1 millisecond) [info] - parse ChaR(5) (0 milliseconds) [info] - parse varchAr(20) (0 milliseconds) [info] - parse cHaR(27) (1 millisecond) [info] - parse BINARY (0 milliseconds) [info] - parse array (0 milliseconds) [info] - parse Array> (1 millisecond) [info] - parse array> (0 milliseconds) [info] - parse MAP (0 milliseconds) [info] - parse MAp> (0 milliseconds) [info] - parse MAP> (0 milliseconds) [info] - parse struct (1 millisecond) [info] - parse Struct (0 milliseconds) [info] - parse struct< struct:struct, MAP:Map, arrAy:Array, anotherArray:Array> (1 millisecond) [info] - parse struct<`x+y`:int, `!@#$%^&*()`:string, `1_2.345<>:"`:varchar(20)> (0 milliseconds) [info] - parse strUCt<> (1 millisecond) [info] - it is not a data type is not supported (1 millisecond) [info] - struct is not supported (0 milliseconds) [info] - struct is not supported (0 milliseconds) [info] - Do not print empty parentheses for no params (1 millisecond) [info] - parse Struct (1 millisecond) [info] - parse struct (0 milliseconds) [info] - parse Struct (1 millisecond) [info] LogicalPlanSuite: [info] - transformUp runs on operators (1 millisecond) [info] - transformUp runs on operators recursively (0 milliseconds) [info] - isStreaming (3 milliseconds) [info] - transformExpressions works with a Stream (2 milliseconds) [info] AnalysisHelperSuite: [info] - setAnalyze is recursive (1 millisecond) [info] - resolveOperator runs on operators recursively (0 milliseconds) [info] - resolveOperatorsDown runs on operators recursively (1 millisecond) [info] - resolveExpressions runs on operators recursively (0 milliseconds) [info] - resolveOperator skips all ready resolved plans (0 milliseconds) [info] - resolveOperatorsDown skips all ready resolved plans (1 millisecond) [info] - resolveExpressions skips all ready resolved plans (0 milliseconds) [info] - resolveOperator skips partially resolved plans (0 milliseconds) [info] - resolveOperatorsDown skips partially resolved plans (1 millisecond) [info] - resolveExpressions skips partially resolved plans (0 milliseconds) [info] - do not allow transform in analyzer (5 milliseconds) [info] - allow transform in resolveOperators in the analyzer (2 milliseconds) [info] - allow transform with allowInvokingTransformsInAnalyzer in the analyzer (2 milliseconds) [info] 
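
Note: the DataTypeParserSuite test names above lost their angle-bracketed type arguments when the log was rendered as HTML (e.g. "parse MAp>" and "parse array" were originally names like "parse MAp<...>" and "parse array<...>"; the exact arguments are not recoverable from this log). The parser under test is reachable from the public DDL entry points; a small sketch, assuming spark-shell (CatalystSqlParser is an internal API):

    import org.apache.spark.sql.types.StructType
    import org.apache.spark.sql.catalyst.parser.CatalystSqlParser

    // Parse a struct schema from DDL and a nested type from its SQL form.
    println(StructType.fromDDL("a INT, b STRING, c DECIMAL(10,5)").treeString)
    println(CatalystSqlParser.parseDataType("map<int, array<double>>").simpleString)
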
EliminateDistinctSuite: [info] - Eliminate Distinct in Max (5 milliseconds) [info] - Eliminate Distinct in Min (3 milliseconds) [info] CollectionExpressionsSuite: [info] - Array and Map Size - legacy (137 milliseconds) [info] - Array and Map Size (130 milliseconds) [info] - MapKeys/MapValues (120 milliseconds) [info] - sort without breaking sorting contracts with kryo ser (1 second, 980 milliseconds) [info] - Map Concat (679 milliseconds) [info] - MapFromEntries (318 milliseconds) [info] - sort without breaking sorting contracts with java ser (1 second, 543 milliseconds) [info] - sort without breaking timsort contracts for large arrays !!! IGNORED !!! [info] - Sort Array (1 second, 66 milliseconds) [info] - spilling with hash collisions (284 milliseconds) [info] - Array contains (500 milliseconds) [info] - spilling with many hash collisions (643 milliseconds) [info] - ArraysOverlap (449 milliseconds) [info] - spilling with hash collisions using the Int.MaxValue key (268 milliseconds) [info] - Slice (483 milliseconds) [info] - spilling with null keys and values (247 milliseconds) [info] - ArrayJoin (283 milliseconds) [info] - Quarter (9 seconds, 648 milliseconds) [info] - ArraysZip (1 second, 667 milliseconds) [info] - sorting updates peak execution memory (1 second, 999 milliseconds) [info] - Array Min (166 milliseconds) [info] - Array max (149 milliseconds) [info] - Sequence of numbers (591 milliseconds) [info] - Sequence of timestamps (425 milliseconds) [info] - Sequence on DST boundaries (72 milliseconds) [info] - Sequence of dates (197 milliseconds) [info] - Sequence with default step (213 milliseconds) [info] - Reverse (274 milliseconds) [info] - Month (3 seconds, 322 milliseconds) [info] - Array Position (284 milliseconds) [info] - elementAt (553 milliseconds) [info] - Concat (532 milliseconds) [info] - force to spill for external sorter (3 seconds, 660 milliseconds) [info] SparkContextSuite: [info] - Only one SparkContext may be active at a time (110 milliseconds) [info] - Flatten (513 milliseconds) [info] - Can still construct a new SparkContext after failing to construct a previous one (47 milliseconds) [info] - Check for multiple SparkContexts can be disabled via undocumented debug option (107 milliseconds) [info] - Test getOrCreate (104 milliseconds) [info] - BytesWritable implicit conversion is correct (2 milliseconds) [info] - ArrayRepeat (396 milliseconds) [info] - basic case for addFile and listFiles (134 milliseconds) [info] - add and list jar files (67 milliseconds) [info] - SPARK-17650: malformed url's throw exceptions before bricking Executors (77 milliseconds) [info] - addFile recursive works (116 milliseconds) [info] - addFile recursive can't add directories by default (75 milliseconds) [info] - Array remove (627 milliseconds) [info] - Array Distinct (442 milliseconds) [info] - Array Union (844 milliseconds) [info] - Shuffle (424 milliseconds) [info] - Array Except (1 second, 37 milliseconds) [info] - cannot call addFile with different paths that have the same filename (3 seconds, 70 milliseconds) [info] - addJar can be called twice with same file in local-mode (SPARK-16787) (55 milliseconds) [info] - addFile can be called twice with same file in local-mode (SPARK-16787) (84 milliseconds) [info] - addJar can be called twice with same file in non-local-mode (SPARK-16787) (206 milliseconds) [info] - addFile can be called twice with same file in non-local-mode (SPARK-16787) (259 milliseconds) [info] - add jar with invalid path (50 milliseconds) [info] - SPARK-22585 addJar 
argument without scheme is interpreted literally without url decoding (55 milliseconds) [info] - Array Intersect (1 second, 123 milliseconds) [info] - SPARK-31980: Start and end equal in month range (60 milliseconds) [info] RemoveRedundantAliasAndProjectSuite: [info] - all expressions in project list are aliased child output (5 milliseconds) [info] - all expressions in project list are aliased child output but with different order (2 milliseconds) [info] - some expressions in project list are aliased child output (1 millisecond) [info] - some expressions in project list are aliased child output but with different order (2 milliseconds) [info] - some expressions in project list are not Alias or Attribute (3 milliseconds) [info] - some expressions in project list are aliased child output but with metadata (2 milliseconds) [info] - retain deduplicating alias in self-join (5 milliseconds) [info] - alias removal should not break after push project through union (3 milliseconds) [info] - remove redundant alias from aggregate (5 milliseconds) [info] - remove redundant alias from window (2 milliseconds) [info] - do not remove output attributes from a subquery (5 milliseconds) [info] JoinTypesTest: [info] - construct an Inner type (1 millisecond) [info] - construct a FullOuter type (0 milliseconds) [info] - construct a LeftOuter type (0 milliseconds) [info] - construct a RightOuter type (0 milliseconds) [info] - construct a LeftSemi type (0 milliseconds) [info] - construct a LeftAnti type (0 milliseconds) [info] - construct a Cross type (1 millisecond) [info] ConstantFoldingSuite: [info] - eliminate subqueries (2 milliseconds) [info] - Constant folding test: expressions only have literals (8 milliseconds) [info] - Constant folding test: expressions have attribute references and literals in arithmetic operations (4 milliseconds) [info] - Constant folding test: expressions have attribute references and literals in predicates (7 milliseconds) [info] - Constant folding test: expressions have foldable functions (5 milliseconds) [info] - Constant folding test: expressions have nonfoldable functions (6 milliseconds) [info] - Constant folding test: expressions have null literals (10 milliseconds) [info] - Constant folding test: Fold In(v, list) into true or false (4 milliseconds) [info] ResolvedUuidExpressionsSuite: [info] - analyzed plan sets random seed for Uuid expression (3 milliseconds) [info] - Uuid expressions should have different random seeds (3 milliseconds) [info] - Different analyzed plans should have different random seeds in Uuids (4 milliseconds) [info] PredicateSuite: [info] - 3VL Not (53 milliseconds) [info] - AND, OR, EqualTo, EqualNullSafe consistency check (591 milliseconds) [info] - Cancelling job group should not cause SparkContext to shutdown (SPARK-6414) (1 second, 90 milliseconds) [info] - 3VL AND (209 milliseconds) [info] - 3VL OR (151 milliseconds) [info] - 3VL = (174 milliseconds) [info] - Comma separated paths for newAPIHadoopFile/wholeTextFiles/binaryFiles (SPARK-7155) (668 milliseconds) [info] - Default path for file based RDDs is properly set (SPARK-12517) (147 milliseconds) [info] - basic IN predicate test (305 milliseconds) [info] - calling multiple sc.stop() must not throw any exception (83 milliseconds) [info] - No exception when both num-executors and dynamic allocation set. (61 milliseconds) [info] - localProperties are inherited by spawned threads. (66 milliseconds) [info] - localProperties do not cross-talk between threads. 
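
Note: the Array contains / Slice / Array Union / Shuffle / Array Intersect entries nearby come from CollectionExpressionsSuite, which tests the collection expressions behind the DataFrame functions added in 2.4. A short sketch of a few of them, assuming a spark-shell session (data invented):

    import org.apache.spark.sql.functions._
    import spark.implicits._

    val df = Seq((Seq(1, 2, 3, 4), Seq("b", "a"))).toDF("nums", "strs")
    df.select(
      array_contains($"nums", 3),   // true
      slice($"nums", 2, 2),         // [2, 3]
      element_at($"nums", -1),      // 4
      sort_array($"strs"),          // [a, b]
      sequence(lit(1), lit(5))      // [1, 2, 3, 4, 5]
    ).show()
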
(68 milliseconds)
[info] - log level case-insensitive and reset log level (63 milliseconds)
[info] - register and deregister Spark listener from SparkContext (49 milliseconds)
[info] - Cancelling stages/jobs with custom reasons. (96 milliseconds)
[info] - client mode with a k8s master url (80 milliseconds)
[info] - Killing tasks that raise interrupted exception on cancel (71 milliseconds)
[info] - Killing tasks that raise runtime exception on cancel (68 milliseconds)
java.lang.Throwable
    at org.apache.spark.DebugFilesystem$.addOpenStream(DebugFilesystem.scala:36)
    at org.apache.spark.DebugFilesystem.open(DebugFilesystem.scala:70)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:766)
    at org.apache.spark.SparkContextSuite$$anonfun$12.apply$mcV$sp(SparkContextSuite.scala:622)
    at org.apache.spark.SparkContextSuite$$anonfun$12.apply(SparkContextSuite.scala:615)
    at org.apache.spark.SparkContextSuite$$anonfun$12.apply(SparkContextSuite.scala:615)
    at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    at org.scalatest.Transformer.apply(Transformer.scala:22)
    at org.scalatest.Transformer.apply(Transformer.scala:20)
    at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
    at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:147)
    at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
    at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
    at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
    at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:54)
    at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:221)
    at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:54)
    at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
    at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
    at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
    at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
    at org.scalatest.Suite$class.run(Suite.scala:1147)
    at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
    at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
    at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
    at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:54)
    at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
    at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:54)
    at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
    at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
    at sbt.ForkMain$Run$2.call(ForkMain.java:296)
    at sbt.ForkMain$Run$2.call(ForkMain.java:286)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[info] - SPARK-19446: DebugFilesystem.assertNoOpenStreams should report open streams to help debugging (27 milliseconds)
[info] - support barrier execution mode under local mode (89 milliseconds)
[info] - Day / DayOfMonth (17 seconds, 368 milliseconds)
[info] - Seconds (1 second, 385 milliseconds)
[info] - DayOfWeek (292 milliseconds)
[info] - WeekDay (299 milliseconds)
[info] - WeekOfYear (274 milliseconds)
[info] - DateFormat (321 milliseconds)
[info] - IN with different types (14 seconds, 160 milliseconds)
[info] - SPARK-22501: In should not generate codes beyond 64KB (948 milliseconds)
[info] - SPARK-22705: In should use less global variables (1 millisecond)
[info] - INSET (387 milliseconds)
[info] - INSET: binary (129 milliseconds)
[info] - INSET: struct (137 milliseconds)
[info] - INSET: array (121 milliseconds)
[info] - support barrier execution mode under local-cluster mode (16 seconds, 190 milliseconds)
[info] BarrierTaskContextSuite:
[info] - BinaryComparison consistency check (1 second, 492 milliseconds)
[info] - BinaryComparison: lessThan (1 second, 115 milliseconds)
[info] - BinaryComparison: LessThanOrEqual (1 second, 329 milliseconds)
[info] - BinaryComparison: GreaterThan (1 second, 203 milliseconds)
[info] - BinaryComparison: GreaterThanOrEqual (993 milliseconds)
[info] - BinaryComparison: EqualTo (1 second, 84 milliseconds)
[info] - BinaryComparison: EqualNullSafe (1 second, 37 milliseconds)
[info] - BinaryComparison: null test (432 milliseconds)
[info] - EqualTo on complex type (59 milliseconds)
[info] - EqualTo double/float infinity (23 milliseconds)
[info] - SPARK-22693: InSet should not use global variables (1 millisecond)
[info] - SPARK-24007: EqualNullSafe for FloatType and DoubleType might generate a wrong result (93 milliseconds)
[info] - Interpreted Predicate should initialize nondeterministic expressions (2 milliseconds)
[info] ResolveSubquerySuite:
[info] - SPARK-17251 Improve `OuterReference` to be `NamedExpression` (4 milliseconds)
[info] XPathExpressionSuite:
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
[info] - xpath_boolean (268 milliseconds)
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
[info] - xpath_short (157 milliseconds)
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
[info] - xpath_int (136 milliseconds)
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
[info] - xpath_long (153 milliseconds)
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
[info] - xpath_float (132 milliseconds)
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
[info] - xpath_double (152 milliseconds)
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
[info] - xpath_string (208 milliseconds)
[Fatal Error] :1:7: XML document structures must start and end within the same entity.
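
Note: the interleaved [Fatal Error] lines are Xerces stderr output; they evidently come from XPathExpressionSuite tests that feed truncated XML to the xpath_* built-ins on purpose, so they are expected noise rather than failures. For reference, a sketch of two of these SQL-only built-ins, assuming a spark-shell session (documents invented):

    spark.sql("SELECT xpath_int('<a><b>1</b><b>2</b></a>', 'sum(a/b)')").show()  // 3
    spark.sql("SELECT xpath('<a><b>x</b><b>y</b></a>', 'a/b/text()')").show()    // [x, y]
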
[info] ExpressionParserSuite:
[info] - star expressions (3 milliseconds)
[info] - named expressions (3 milliseconds)
[info] - binary logical expressions (4 milliseconds)
[info] - long binary logical expressions (51 milliseconds)
[info] - not expressions (4 milliseconds)
[info] - exists expression (3 milliseconds)
[info] - comparison expressions (4 milliseconds)
[info] - between expressions (4 milliseconds)
[info] - in expressions (3 milliseconds)
[info] - in sub-query (5 milliseconds)
[info] - like expressions (4 milliseconds)
[info] - like expressions with ESCAPED_STRING_LITERALS = true (2 milliseconds)
[info] - is null expressions (2 milliseconds)
[info] - is distinct expressions (1 millisecond)
[info] - binary arithmetic expressions (9 milliseconds)
[info] - unary arithmetic expressions (2 milliseconds)
[info] - cast expressions (2 milliseconds)
[info] - function expressions (5 milliseconds)
[info] - lambda functions (3 milliseconds)
[info] - window function expressions (7 milliseconds)
[info] - range/rows window function expressions (30 milliseconds)
[info] - row constructor (1 millisecond)
[info] - scalar sub-query (1 millisecond)
[info] - case when (15 milliseconds)
[info] - dereference (4 milliseconds)
[info] - reference (2 milliseconds)
[info] - subscript (2 milliseconds)
[info] - parenthesis (1 millisecond)
[info] - type constructors (4 milliseconds)
[info] - literals (13 milliseconds)
[info] - strings (11 milliseconds)
[info] - intervals (53 milliseconds)
[info] - composed expressions (4 milliseconds)
[info] - SPARK-17364, fully qualified column name which starts with number (3 milliseconds)
[info] - SPARK-17832 function identifier contains backtick (2 milliseconds)
[info] - SPARK-19526 Support ignore nulls keywords for first and last (7 milliseconds)
[info] ErrorParserSuite:
[info] - no viable input (2 milliseconds)
[info] - extraneous input (2 milliseconds)
[info] - mismatched input (0 milliseconds)
[info] - semantic errors (1 millisecond)
[info] JsonExpressionsSuite:
[info] - $.store.bicycle (48 milliseconds)
[info] - $['store'].bicycle (14 milliseconds)
[info] - $.store['bicycle'] (15 milliseconds)
[info] - $['store']['bicycle'] (16 milliseconds)
[info] - $['key with spaces'] (12 milliseconds)
[info] - $.store.book (19 milliseconds)
[info] - $.store.book[0] (31 milliseconds)
[info] - $.store.book[*] (27 milliseconds)
[info] - $ (23 milliseconds)
[info] - $.store.book[0].category (23 milliseconds)
[info] - $.store.book[*].category (16 milliseconds)
[info] - $.store.book[*].isbn (13 milliseconds)
[info] - $.store.book[*].reader (13 milliseconds)
[info] - $.store.basket[0][1] (13 milliseconds)
[info] - $.store.basket[*] (13 milliseconds)
[info] - $.store.basket[*][0] (13 milliseconds)
[info] - $.store.basket[0][*] (16 milliseconds)
[info] - $.store.basket[*][*] (12 milliseconds)
[info] - $.store.basket[0][2].b (13 milliseconds)
[info] - $.store.basket[0][*].b (13 milliseconds)
[info] - $.zip code (12 milliseconds)
[info] - $.fb:testid (12 milliseconds)
[info] - preserve newlines (12 milliseconds)
[info] - escape (12 milliseconds)
[info] - $.non_exist_key (11 milliseconds)
[info] - $..no_recursive (14 milliseconds)
[info] - $.store.book[10] (12 milliseconds)
[info] - $.store.book[0].non_exist_key (16 milliseconds)
[info] - $.store.basket[*].non_exist_key (11 milliseconds)
[info] - SPARK-16548: character conversion (10 milliseconds)
[info] - non foldable literal (10 milliseconds)
[info] - json_tuple - hive key 1 (5 milliseconds)
[info] - json_tuple - hive key 2 (1 millisecond)
[info] - json_tuple - hive key 2 (mix of foldable fields) (1 millisecond)
[info] - json_tuple - hive key 3 (0 milliseconds)
[info] - json_tuple - hive key 3 (nonfoldable json) (1 millisecond)
[info] - json_tuple - hive key 3 (nonfoldable fields) (1 millisecond)
[info] - json_tuple - hive key 4 - null json (0 milliseconds)
[info] - json_tuple - hive key 5 - null and empty fields (1 millisecond)
[info] - json_tuple - hive key 6 - invalid json (array) (0 milliseconds)
[info] - json_tuple - invalid json (object start only) (0 milliseconds)
[info] - json_tuple - invalid json (no object end) (1 millisecond)
[info] - json_tuple - invalid json (invalid json) (0 milliseconds)
[info] - SPARK-16548: json_tuple - invalid json with leading nulls (0 milliseconds)
[info] - json_tuple - preserve newlines (1 millisecond)
[info] - SPARK-21677: json_tuple throws NullPointException when column is null as string type (0 milliseconds)
[info] - SPARK-21804: json_tuple returns null values within repeated columns except the first one (0 milliseconds)
[info] - from_json (41 milliseconds)
[info] - from_json - invalid data (26 milliseconds)
[info] - from_json - input=array, schema=array, output=array (29 milliseconds)
[info] - from_json - input=object, schema=array, output=array of single row (15 milliseconds)
[info] - from_json - input=empty array, schema=array, output=empty array (12 milliseconds)
[info] - from_json - input=empty object, schema=array, output=array of single row with null (15 milliseconds)
[info] - from_json - input=array of single object, schema=struct, output=single row (12 milliseconds)
[info] - from_json - input=array, schema=struct, output=null (11 milliseconds)
[info] - from_json - input=empty array, schema=struct, output=null (11 milliseconds)
[info] - from_json - input=empty object, schema=struct, output=single row with null (11 milliseconds)
[info] - from_json null input column (11 milliseconds)
[info] - SPARK-20549: from_json bad UTF-8 (12 milliseconds)
[info] - global sync by barrier() call (16 seconds, 249 milliseconds)
[info] - from_json with timestamp (15 seconds, 528 milliseconds)
[info] - SPARK-19543: from_json empty input column (14 milliseconds)
[info] - to_json - struct (16 milliseconds)
[info] - to_json - array (12 milliseconds)
[info] - to_json - array with single empty row (12 milliseconds)
[info] - to_json - empty array (11 milliseconds)
[info] - to_json null input column (10 milliseconds)
[info] - to_json with timestamp (46 milliseconds)
[info] - SPARK-21513: to_json support map[string, struct] to json (13 milliseconds)
[info] - SPARK-21513: to_json support map[struct, struct] to json (12 milliseconds)
[info] - SPARK-21513: to_json support map[string, integer] to json (11 milliseconds)
[info] - to_json - array with maps (11 milliseconds)
[info] - to_json - array with single map (12 milliseconds)
[info] - to_json: verify MapType's value type instead of key type (15 milliseconds)
[info] - from_json missing fields (59 milliseconds)
[info] - SPARK-24709: infer schema of json strings (73 milliseconds)
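The "global sync by barrier() call" line interleaved above belongs to BarrierTaskContextSuite, which runs in parallel with this suite; sbt multiplexes their output. As for the from_json/to_json round trip these tests exercise, a hedged sketch (schema and data are illustrative, spark-shell style):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{from_json, to_json, col}
import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}

val spark = SparkSession.builder().master("local[*]").appName("json-demo").getOrCreate()
import spark.implicits._

val schema = StructType(Seq(
  StructField("name", StringType),
  StructField("age", IntegerType)))

val df = Seq("""{"name":"a","age":1}""", """not json""").toDF("raw")

// from_json yields NULL for malformed input instead of failing the job,
// which is the behaviour the "from_json - invalid data" test pins down.
val parsed = df.select(from_json(col("raw"), schema).as("parsed"))
parsed.show(false)

// to_json serialises the struct column back to a JSON string.
parsed.select(to_json(col("parsed")).as("json")).show(false)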
[info] RandomUUIDGeneratorSuite:
[info] - RandomUUIDGenerator should generate version 4, variant 2 UUIDs (1 millisecond)
[info] - UUID from RandomUUIDGenerator should be deterministic (1 millisecond)
[info] - Get UTF8String UUID (1 millisecond)
[info] CollapseProjectSuite:
[info] - collapse two deterministic, independent projects into one (5 milliseconds)
[info] - collapse two deterministic, dependent projects into one (3 milliseconds)
[info] - do not collapse nondeterministic projects (6 milliseconds)
[info] - collapse two nondeterministic, independent projects into one (1 millisecond)
[info] - collapse one nondeterministic, one deterministic, independent projects into one (4 milliseconds)
[info] - collapse project into aggregate (5 milliseconds)
[info] - do not collapse common nondeterministic project and aggregate (9 milliseconds)
[info] - preserve top-level alias metadata while collapsing projects (3 milliseconds)
[info] AggregateEstimationSuite:
[info] - set an upper bound if the product of ndv's of group-by columns is too large (3 milliseconds)
[info] - data contains all combinations of distinct values of group-by columns. (1 millisecond)
[info] - empty group-by column (0 milliseconds)
[info] - aggregate on empty table - with or without group-by column (0 milliseconds)
[info] - group-by column with only null value (0 milliseconds)
[info] - group-by column with null value (0 milliseconds)
[info] - non-cbo estimation (3 milliseconds)
[info] HyperLogLogPlusPlusSuite:
[info] - test invalid parameter relativeSD (1 millisecond)
[info] - add nulls (2 milliseconds)
[info] - deterministic cardinality estimation (5 seconds, 480 milliseconds)
[info] - random cardinality estimation (277 milliseconds)
[info] - merging HLL instances (123 milliseconds)
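HyperLogLogPlusPlus is the aggregate implementation behind SQL's approx_count_distinct, and relativeSD is the same accuracy knob the "test invalid parameter relativeSD" entry probes (out-of-range values are rejected). A hedged sketch of the user-facing API (spark-shell style):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.approx_count_distinct

val spark = SparkSession.builder().master("local[*]").appName("hll-demo").getOrCreate()
import spark.implicits._

// 100k rows with 5,000 distinct values.
val df = (1 to 100000).map(_ % 5000).toDF("id")

// The second argument is the maximum relative standard deviation of the
// estimate; smaller values cost more memory in the HLL++ sketch.
df.select(approx_count_distinct($"id", 0.05)).show()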
[info] SimplifyCastsSuite:
[info] - non-nullable element array to nullable element array cast (3 milliseconds)
[info] - nullable element to non-nullable element array cast (2 milliseconds)
[info] - non-nullable value map to nullable value map cast (3 milliseconds)
[info] - nullable value map to non-nullable value map cast (1 millisecond)
[info] PropagateEmptyRelationSuite:
[info] - propagate empty relation through Union (3 milliseconds)
[info] - propagate empty relation through Join (47 milliseconds)
[info] - propagate empty relation through UnaryNode (5 milliseconds)
[info] - propagate empty streaming relation through multiple UnaryNode (4 milliseconds)
[info] - don't propagate empty streaming relation through agg (3 milliseconds)
[info] - don't propagate non-empty local relation (7 milliseconds)
[info] - propagate empty relation through Aggregate with grouping expressions (2 milliseconds)
[info] - don't propagate empty relation through Aggregate without grouping expressions (1 millisecond)
[info] - propagate empty relation keeps the plan resolved (2 milliseconds)
[info] FilterEstimationSuite:
[info] - true (10 milliseconds)
[info] - false (1 millisecond)
[info] - null (0 milliseconds)
[info] - Not(null) (0 milliseconds)
[info] - Not(Not(null)) (1 millisecond)
[info] - cint < 3 AND null (2 milliseconds)
[info] - cint < 3 OR null (1 millisecond)
[info] - Not(cint < 3 AND null) (1 millisecond)
[info] - Not(cint < 3 OR null) (0 milliseconds)
[info] - Not(cint < 3 AND Not(null)) (1 millisecond)
[info] - cint = 2 (0 milliseconds)
[info] - cint <=> 2 (1 millisecond)
[info] - cint = 0 (0 milliseconds)
[info] - cint < 3 (1 millisecond)
[info] - cint < 0 (0 milliseconds)
[info] - cint <= 3 (1 millisecond)
[info] - cint > 6 (0 milliseconds)
[info] - cint > 10 (0 milliseconds)
[info] - cint >= 6 (0 milliseconds)
[info] - cint IS NULL (1 millisecond)
[info] - cint IS NOT NULL (0 milliseconds)
[info] - cint IS NOT NULL && null (0 milliseconds)
[info] - cint > 3 AND cint <= 6 (1 millisecond)
[info] - cint = 3 OR cint = 6 (0 milliseconds)
[info] - Not(cint > 3 AND cint <= 6) (1 millisecond)
[info] - Not(cint <= 3 OR cint > 6) (1 millisecond)
[info] - Not(cint = 3 AND cstring < 'A8') (1 millisecond)
[info] - Not(cint = 3 OR cstring < 'A8') (0 milliseconds)
[info] - cint IN (3, 4, 5) (2 milliseconds)
[info] - evaluateInSet with all zeros (0 milliseconds)
[info] - evaluateInSet with string (1 millisecond)
[info] - cint NOT IN (3, 4, 5) (0 milliseconds)
[info] - cbool IN (true) (0 milliseconds)
[info] - cbool = true (0 milliseconds)
[info] - cbool > false (1 millisecond)
[info] - cdate = cast('2017-01-02' AS DATE) (0 milliseconds)
[info] - cdate < cast('2017-01-03' AS DATE) (1 millisecond)
[info] - cdate IN ( cast('2017-01-03' AS DATE), cast('2017-01-04' AS DATE), cast('2017-01-05' AS DATE) ) (1 millisecond)
[info] - cdecimal = 0.400000000000000000 (1 millisecond)
[info] - cdecimal < 0.60 (0 milliseconds)
[info] - cdouble < 3.0 (1 millisecond)
[info] - cstring = 'A2' (0 milliseconds)
[info] - cstring < 'A2' - unsupported condition (0 milliseconds)
[info] - cint IN (1, 2, 3, 4, 5) (1 millisecond)
[info] - don't estimate IsNull or IsNotNull if the child is a non-leaf node (2 milliseconds)
[info] - cint = cint2 (1 millisecond)
[info] - cint > cint2 (1 millisecond)
[info] - cint < cint2 (0 milliseconds)
[info] - cint = cint4 (1 millisecond)
[info] - cint < cint4 (0 milliseconds)
[info] - cint = cint3 (1 millisecond)
[info] - cint < cint3 (0 milliseconds)
[info] - cint > cint3 (1 millisecond)
[info] - update ndv for columns based on overall selectivity (0 milliseconds)
[info] - Not(cintHgm < 3 AND null) (2 milliseconds)
[info] - cintHgm = 5 (1 millisecond)
[info] - cintHgm = 0 (0 milliseconds)
[info] - cintHgm < 3 (1 millisecond)
[info] - cintHgm < 0 (0 milliseconds)
[info] - cintHgm <= 3 (1 millisecond)
[info] - cintHgm > 6 (0 milliseconds)
[info] - cintHgm > 10 (1 millisecond)
[info] - cintHgm >= 6 (0 milliseconds)
[info] - cintHgm > 3 AND cintHgm <= 6 (1 millisecond)
[info] - cintHgm = 3 OR cintHgm = 6 (0 milliseconds)
[info] - Not(cintSkewHgm < 3 AND null) (1 millisecond)
[info] - cintSkewHgm = 5 (1 millisecond)
[info] - cintSkewHgm = 0 (0 milliseconds)
[info] - cintSkewHgm < 3 (0 milliseconds)
[info] - cintSkewHgm < 0 (0 milliseconds)
[info] - cintSkewHgm <= 3 (0 milliseconds)
[info] - cintSkewHgm > 6 (1 millisecond)
[info] - cintSkewHgm > 10 (0 milliseconds)
[info] - cintSkewHgm >= 6 (1 millisecond)
[info] - cintSkewHgm > 3 AND cintSkewHgm <= 6 (0 milliseconds)
[info] - cintSkewHgm = 3 OR cintSkewHgm = 6 (1 millisecond)
[info] - ColumnStatsMap tests (1 millisecond)
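FilterEstimationSuite exercises the cost-based optimizer's selectivity math, which only has something to work with once column statistics have been collected. A hedged sketch of feeding it those statistics (the table name `events` is illustrative; a SparkSession `spark` is assumed):

// Enable cost-based optimization and collect per-column stats
// (ndv, min/max, null count; histograms additionally depend on
// spark.sql.statistics.histogram.enabled).
spark.conf.set("spark.sql.cbo.enabled", "true")
spark.sql("ANALYZE TABLE events COMPUTE STATISTICS FOR COLUMNS cint, cdate")

// The filter estimate then surfaces in the optimized plan's statistics.
val plan = spark.sql("SELECT * FROM events WHERE cint < 3")
  .queryExecution.optimizedPlan
println(plan.stats) // sizeInBytes / rowCount reflect the estimated selectivity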
[info] ExternalCatalogEventSuite:
[info] - database (34 milliseconds)
[info] - table (20 milliseconds)
[info] - function (16 milliseconds)
[info] DecimalSuite:
[info] - creating decimals (3 milliseconds)
[info] - creating decimals with negative scale (0 milliseconds)
[info] - double and long values (1 millisecond)
[info] - small decimals represented as unscaled long (3 milliseconds)
[info] - hash code (1 millisecond)
[info] - equals (1 millisecond)
[info] - isZero (0 milliseconds)
[info] - arithmetic (1 millisecond)
[info] - accurate precision after multiplication (0 milliseconds)
[info] - fix non-terminating decimal expansion problem (1 millisecond)
[info] - fix loss of precision/scale when doing division operation (0 milliseconds)
[info] - set/setOrNull (1 millisecond)
[info] - changePrecision/toPrecision on compact decimal should respect rounding mode (4 milliseconds)
[info] - SPARK-20341: support BigInt's value does not fit in long value range (1 millisecond)
[info] - SPARK-26038: toScalaBigInt/toJavaBigInteger (0 milliseconds)
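The "fix non-terminating decimal expansion problem" entry refers to the classic BigDecimal pitfall: an exact quotient like 1/3 has no terminating decimal representation, so a plain divide throws unless a precision and rounding mode are supplied, which is what a decimal type must do internally. A minimal illustration in plain Scala (not Spark's internal Decimal):

import java.math.{BigDecimal => JBigDecimal, MathContext}

val one   = new JBigDecimal(1)
val three = new JBigDecimal(3)

// one.divide(three) would throw
// ArithmeticException: Non-terminating decimal expansion.

// Supplying a MathContext (precision + rounding mode) makes division total.
println(one.divide(three, MathContext.DECIMAL128)) // 0.3333...3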
[info] CodeBlockSuite:
[info] - Block interpolates string and ExprValue inputs (0 milliseconds)
[info] - Literals are folded into string code parts instead of block inputs (0 milliseconds)
[info] - Block.stripMargin (0 milliseconds)
[info] - Block can capture input expr values (1 millisecond)
[info] - concatenate blocks (2 milliseconds)
[info] - Throws exception when interpolating unexcepted object in code block (1 millisecond)
[info] - transform expr in code block (1 millisecond)
[info] - transform expr in nested blocks (3 milliseconds)
[info] BasicStatsEstimationSuite:
[info] - BroadcastHint estimation (2 milliseconds)
[info] - range (1 millisecond)
[info] - windows (0 milliseconds)
[info] - limit estimation: limit < child's rowCount (1 millisecond)
[info] - limit estimation: limit > child's rowCount (1 millisecond)
[info] - limit estimation: limit = 0 (0 milliseconds)
[info] - sample estimation (2 milliseconds)
[info] - estimate statistics when the conf changes (0 milliseconds)
[info] CodegenExpressionCachingSuite:
[info] - GenerateUnsafeProjection should initialize expressions (8 milliseconds)
[info] - GenerateMutableProjection should initialize expressions (7 milliseconds)
[info] - GeneratePredicate should initialize expressions (6 milliseconds)
[info] - GenerateUnsafeProjection should not share expression instances (8 milliseconds)
[info] - GenerateMutableProjection should not share expression instances (6 milliseconds)
[info] - GeneratePredicate should not share expression instances (5 milliseconds)
[info] RandomSuite:
[info] - random (82 milliseconds)
[info] - SPARK-9127 codegen with long seed (40 milliseconds)
[info] JoinEstimationSuite:
[info] - equi-height histograms: a bin is contained by another one (5 milliseconds)
[info] - equi-height histograms: a bin has only one value after trimming (1 millisecond)
[info] - equi-height histograms: skew distribution (some bins have only one value) (2 milliseconds)
[info] - equi-height histograms: skew distribution (histograms have different skewed values (1 millisecond)
[info] - equi-height histograms: skew distribution (both histograms have the same skewed value (1 millisecond)
[info] - cross join (1 millisecond)
[info] - disjoint inner join (1 millisecond)
[info] - disjoint left outer join (2 milliseconds)
[info] - disjoint right outer join (2 milliseconds)
[info] - disjoint full outer join (1 millisecond)
[info] - inner join (1 millisecond)
[info] - inner join with multiple equi-join keys (1 millisecond)
[info] - left outer join (1 millisecond)
[info] - right outer join (1 millisecond)
[info] - full outer join (0 milliseconds)
[info] - left semi/anti join (1 millisecond)
[info] - test join keys of different types (8 milliseconds)
[info] - join with null column (1 millisecond)
[info] UnsupportedOperationsSuite:
[info] - batch plan - local relation: supported (1 millisecond)
[info] - batch plan - streaming source: not supported (2 milliseconds)
[info] - batch plan - select on streaming source: not supported (0 milliseconds)
[info] - streaming plan - no streaming source (0 milliseconds)
[info] - streaming plan - commmands: not supported (4 milliseconds)
[info] - streaming plan - aggregate - multiple batch aggregations: supported (4 milliseconds)
[info] - streaming plan - aggregate - multiple aggregations but only one streaming aggregation: supported (0 milliseconds)
[info] - streaming plan - aggregate - multiple streaming aggregations: not supported (2 milliseconds)
[info] - streaming plan - aggregate - streaming aggregations in update mode: supported (0 milliseconds)
[info] - streaming plan - aggregate - streaming aggregations in complete mode: supported (0 milliseconds)
[info] - streaming plan - aggregate - streaming aggregations with watermark in append mode: supported (1 millisecond)
[info] - streaming plan - aggregate - streaming aggregations without watermark in append mode: not supported (1 millisecond)
[info] - streaming plan - distinct aggregate - aggregate on batch relation: supported (0 milliseconds)
[info] - streaming plan - distinct aggregate - aggregate on streaming relation: not supported (1 millisecond)
[info] - batch plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on batch relation: supported (0 milliseconds)
[info] - batch plan - flatMapGroupsWithState - multiple flatMapGroupsWithState(Append)s on batch relation: supported (0 milliseconds)
[info] - batch plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on batch relation: supported (0 milliseconds)
[info] - batch plan - flatMapGroupsWithState - multiple flatMapGroupsWithState(Update)s on batch relation: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in update mode: supported (1 millisecond)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in append mode: not supported (1 millisecond)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation without aggregation in complete mode: not supported (1 millisecond)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Append mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Update mode: not supported (1 millisecond)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Update) on streaming relation with aggregation in Complete mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation without aggregation in append mode: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation without aggregation in update mode: not supported (2 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Append mode: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Update mode: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation before aggregation in Complete mode: supported (0 milliseconds)
[info] - streaming plan - flatMapGroupsWithState - flatMapGroupsWithState(Append) on streaming relation after aggregation in Append mode: not supported (2 milliseconds)
[
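The append-mode entries above encode a Structured Streaming analysis rule: a streaming aggregation can only run in Append output mode if an event-time watermark bounds its state. A hedged sketch of the supported shape (the rate source and console sink are illustrative; a SparkSession `spark` is assumed):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.window

val spark = SparkSession.builder().master("local[*]").appName("wm-demo").getOrCreate()
import spark.implicits._

val counts = spark.readStream
  .format("rate").option("rowsPerSecond", "5").load()
  .withWatermark("timestamp", "10 seconds")      // bounds the aggregation state
  .groupBy(window($"timestamp", "30 seconds"))
  .count()

// Without the withWatermark line, starting this query in Append mode would be
// rejected by the same analysis that UnsupportedOperationsSuite verifies.
counts.writeStream
  .format("console")
  .outputMode("append")
  .start()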