Started by remote host 35.243.23.62 [EnvInject] - Loading node environment variables. Building remotely on amp-jenkins-worker-04 (centos spark-test) in workspace /home/jenkins/workspace/NewSparkPullRequestBuilder [WS-CLEANUP] Deleting project workspace... [WS-CLEANUP] Done Cloning the remote Git repository Cloning repository https://github.com/apache/spark.git > /home/jenkins/git2/bin/git init /home/jenkins/workspace/NewSparkPullRequestBuilder # timeout=10 Fetching upstream changes from https://github.com/apache/spark.git > /home/jenkins/git2/bin/git --version # timeout=10 > /home/jenkins/git2/bin/git fetch --tags --progress https://github.com/apache/spark.git +refs/heads/*:refs/remotes/origin/* > /home/jenkins/git2/bin/git config remote.origin.url https://github.com/apache/spark.git # timeout=10 > /home/jenkins/git2/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 > /home/jenkins/git2/bin/git config remote.origin.url https://github.com/apache/spark.git # timeout=10 Fetching upstream changes from https://github.com/apache/spark.git > /home/jenkins/git2/bin/git fetch --tags --progress https://github.com/apache/spark.git +refs/pull/29117/*:refs/remotes/origin/pr/29117/* > /home/jenkins/git2/bin/git rev-parse refs/remotes/origin/pr/29117/merge^{commit} # timeout=10 > /home/jenkins/git2/bin/git rev-parse refs/remotes/origin/origin/pr/29117/merge^{commit} # timeout=10 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://wiki.jenkins-ci.org/display/JENKINS/Remove+Git+Plugin+BuildsByBranch+BuildData Checking out Revision 61dbfc0ae0384ea14b2be1f6f04846cf0e92566c (refs/remotes/origin/pr/29117/merge) > /home/jenkins/git2/bin/git config core.sparsecheckout # timeout=10 > /home/jenkins/git2/bin/git checkout -f 61dbfc0ae0384ea14b2be1f6f04846cf0e92566c > /home/jenkins/git2/bin/git rev-list 61dbfc0ae0384ea14b2be1f6f04846cf0e92566c # timeout=10 [EnvInject] - Executing scripts and injecting environment variables 
after the SCM step. [EnvInject] - Injecting as environment variables the properties content JENKINS_MASTER_HOSTNAME=amp-jenkins-master JAVA_HOME=/usr/java/jdk1.8.0_191 JAVA_7_HOME=/usr/java/jdk1.7.0_79 SPARK_TESTING=1 LANG=en_US.UTF-8 [EnvInject] - Variables injected successfully. [NewSparkPullRequestBuilder] $ /bin/bash /tmp/hudson2733530967906010898.sh + export AMPLAB_JENKINS=1 + AMPLAB_JENKINS=1 + export PATH=/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/android-sdk/:/usr/local/bin:/bin:/usr/bin:/home/anaconda/envs/py3k/bin + PATH=/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/android-sdk/:/usr/local/bin:/bin:/usr/bin:/home/anaconda/envs/py3k/bin + export PATH=/usr/java/jdk1.8.0_191/bin:/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/android-sdk/:/usr/local/bin:/bin:/usr/bin:/home/anaconda/envs/py3k/bin + PATH=/usr/java/jdk1.8.0_191/bin:/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/android-sdk/:/usr/local/bin:/bin:/usr/bin:/home/anaconda/envs/py3k/bin + export PATH=/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.3.9/bin/:/usr/java/jdk1.8.0_191/bin:/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/android-sdk/:/usr/local/bin:/bin:/usr/bin:/home/anaconda/envs/py3k/bin + PATH=/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.3.9/bin/:/usr/java/jdk1.8.0_191/bin:/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.6.3/bin/:/home/android-sdk/:/usr/local/bin:/bin:/usr/bin:/home/anaconda/envs/py3k/bin + export HOME=/home/sparkivy/per-executor-caches/4 + HOME=/home/sparkivy/per-executor-caches/4 + mkdir -p 
/home/sparkivy/per-executor-caches/4 + export 'SBT_OPTS=-Duser.home=/home/sparkivy/per-executor-caches/4 -Dsbt.ivy.home=/home/sparkivy/per-executor-caches/4/.ivy2' + SBT_OPTS='-Duser.home=/home/sparkivy/per-executor-caches/4 -Dsbt.ivy.home=/home/sparkivy/per-executor-caches/4/.ivy2' + export SPARK_VERSIONS_SUITE_IVY_PATH=/home/sparkivy/per-executor-caches/4/.ivy2 + SPARK_VERSIONS_SUITE_IVY_PATH=/home/sparkivy/per-executor-caches/4/.ivy2 + ./dev/run-tests-jenkins Attempting to post to Github... > Post successful. HEAD is now at 61dbfc0... Merge fe3bf10c32b31ea76dbbcd8e52ccdd28aabbf129 into 8c7d6f9733751503f80d5a1b2463904dfefd6843 HEAD is now at 61dbfc0... Merge fe3bf10c32b31ea76dbbcd8e52ccdd28aabbf129 into 8c7d6f9733751503f80d5a1b2463904dfefd6843 +++ dirname /home/jenkins/workspace/NewSparkPullRequestBuilder/R/install-dev.sh ++ cd /home/jenkins/workspace/NewSparkPullRequestBuilder/R ++ pwd + FWDIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/R + LIB_DIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib + mkdir -p /home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib + pushd /home/jenkins/workspace/NewSparkPullRequestBuilder/R + . /home/jenkins/workspace/NewSparkPullRequestBuilder/R/find-r.sh ++ '[' -z '' ']' ++ '[' '!' -z '' ']' +++ command -v R ++ '[' '!' /usr/bin/R ']' ++++ which R +++ dirname /usr/bin/R ++ R_SCRIPT_PATH=/usr/bin ++ echo 'Using R_SCRIPT_PATH = /usr/bin' Using R_SCRIPT_PATH = /usr/bin + . /home/jenkins/workspace/NewSparkPullRequestBuilder/R/create-rd.sh ++ set -o pipefail ++ set -e ++++ dirname /home/jenkins/workspace/NewSparkPullRequestBuilder/R/create-rd.sh +++ cd /home/jenkins/workspace/NewSparkPullRequestBuilder/R +++ pwd ++ FWDIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/R ++ pushd /home/jenkins/workspace/NewSparkPullRequestBuilder/R ++ . 
/home/jenkins/workspace/NewSparkPullRequestBuilder/R/find-r.sh +++ '[' -z /usr/bin ']' ++ /usr/bin/Rscript -e ' if(requireNamespace("devtools", quietly=TRUE)) { setwd("/home/jenkins/workspace/NewSparkPullRequestBuilder/R"); devtools::document(pkg="./pkg", roclets="rd") }' Updating SparkR documentation Loading SparkR Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’ Creating a new generic function for ‘colnames’ in package ‘SparkR’ Creating a new generic function for ‘colnames<-’ in package ‘SparkR’ Creating a new generic function for ‘cov’ in package ‘SparkR’ Creating a new generic function for ‘drop’ in package ‘SparkR’ Creating a new generic function for ‘na.omit’ in package ‘SparkR’ Creating a new generic function for ‘filter’ in package ‘SparkR’ Creating a new generic function for ‘intersect’ in package ‘SparkR’ Creating a new generic function for ‘sample’ in package ‘SparkR’ Creating a new generic function for ‘transform’ in package ‘SparkR’ Creating a new generic function for ‘subset’ in package ‘SparkR’ Creating a new generic function for ‘summary’ in package ‘SparkR’ Creating a new generic function for ‘union’ in package ‘SparkR’ Creating a new generic function for ‘endsWith’ in package ‘SparkR’ Creating a new generic function for ‘startsWith’ in package ‘SparkR’ Creating a new generic function for ‘lag’ in package ‘SparkR’ Creating a new generic function for ‘rank’ in package ‘SparkR’ Creating a new generic function for ‘sd’ in package ‘SparkR’ Creating a new generic function for ‘var’ in package ‘SparkR’ Creating a new generic function for ‘window’ in package ‘SparkR’ Creating a new generic function for ‘predict’ in package ‘SparkR’ Creating a new generic function for ‘rbind’ in package ‘SparkR’ Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’ First time using roxygen2. Upgrading automatically... Updating roxygen version in /home/jenkins/workspace/NewSparkPullRequestBuilder/R/pkg/DESCRIPTION Writing structType.Rd Writing print.structType.Rd Writing structField.Rd Writing print.structField.Rd Writing summarize.Rd Writing alias.Rd Writing arrange.Rd Writing as.data.frame.Rd Writing cache.Rd Writing checkpoint.Rd Writing coalesce.Rd Writing collect.Rd Writing columns.Rd Writing coltypes.Rd Writing count.Rd Writing cov.Rd Writing corr.Rd Writing createOrReplaceTempView.Rd Writing cube.Rd Writing dapply.Rd Writing dapplyCollect.Rd Writing gapply.Rd Writing gapplyCollect.Rd Writing describe.Rd Writing distinct.Rd Writing drop.Rd Writing dropDuplicates.Rd Writing nafunctions.Rd Writing dtypes.Rd Writing explain.Rd Writing except.Rd Writing exceptAll.Rd Writing filter.Rd Writing first.Rd Writing groupBy.Rd Writing hint.Rd Writing insertInto.Rd Writing intersect.Rd Writing intersectAll.Rd Writing isLocal.Rd Writing isStreaming.Rd Writing limit.Rd Writing localCheckpoint.Rd Writing merge.Rd Writing mutate.Rd Writing orderBy.Rd Writing persist.Rd Writing printSchema.Rd Writing registerTempTable-deprecated.Rd Writing rename.Rd Writing repartition.Rd
Writing repartitionByRange.Rd Writing sample.Rd Writing rollup.Rd Writing sampleBy.Rd Writing saveAsTable.Rd Writing take.Rd Writing write.df.Rd Writing write.jdbc.Rd Writing write.json.Rd Writing write.orc.Rd Writing write.parquet.Rd Writing write.stream.Rd Writing write.text.Rd Writing schema.Rd Writing select.Rd Writing selectExpr.Rd Writing showDF.Rd Writing subset.Rd Writing summary.Rd Writing union.Rd Writing unionAll.Rd Writing unionByName.Rd Writing unpersist.Rd Writing with.Rd Writing withColumn.Rd Writing withWatermark.Rd Writing randomSplit.Rd Writing broadcast.Rd Writing columnfunctions.Rd Writing between.Rd Writing cast.Rd Writing endsWith.Rd Writing startsWith.Rd Writing column_nonaggregate_functions.Rd Writing otherwise.Rd Writing over.Rd Writing eq_null_safe.Rd Writing partitionBy.Rd Writing rowsBetween.Rd Writing rangeBetween.Rd Writing windowPartitionBy.Rd Writing windowOrderBy.Rd Writing column_datetime_diff_functions.Rd Writing column_aggregate_functions.Rd Writing column_collection_functions.Rd Writing column_string_functions.Rd Writing avg.Rd Writing column_math_functions.Rd Writing column.Rd Writing column_misc_functions.Rd Writing column_window_functions.Rd Writing column_datetime_functions.Rd Writing last.Rd Writing not.Rd Writing fitted.Rd Writing predict.Rd Writing rbind.Rd Writing spark.als.Rd Writing spark.bisectingKmeans.Rd Writing spark.fmClassifier.Rd Writing spark.fmRegressor.Rd Writing spark.gaussianMixture.Rd Writing spark.gbt.Rd Writing spark.glm.Rd Writing spark.isoreg.Rd Writing spark.kmeans.Rd Writing spark.kstest.Rd Writing spark.lda.Rd Writing spark.logit.Rd Writing spark.mlp.Rd Writing spark.naiveBayes.Rd Writing spark.decisionTree.Rd Writing spark.randomForest.Rd Writing spark.survreg.Rd Writing spark.svmLinear.Rd Writing spark.fpGrowth.Rd Writing spark.prefixSpan.Rd Writing spark.powerIterationClustering.Rd Writing spark.lm.Rd Writing write.ml.Rd Writing awaitTermination.Rd Writing isActive.Rd Writing lastProgress.Rd 
Writing queryName.Rd Writing status.Rd Writing stopQuery.Rd Writing print.jobj.Rd Writing show.Rd Writing substr.Rd Writing match.Rd Writing GroupedData.Rd Writing pivot.Rd Writing SparkDataFrame.Rd Writing storageLevel.Rd Writing toJSON.Rd Writing nrow.Rd Writing ncol.Rd Writing dim.Rd Writing head.Rd Writing join.Rd Writing crossJoin.Rd Writing attach.Rd Writing str.Rd Writing histogram.Rd Writing getNumPartitions.Rd Writing sparkR.conf.Rd Writing sparkR.version.Rd Writing createDataFrame.Rd Writing read.json.Rd Writing read.orc.Rd Writing read.parquet.Rd Writing read.text.Rd Writing sql.Rd Writing tableToDF.Rd Writing read.df.Rd Writing read.jdbc.Rd Writing read.stream.Rd Writing WindowSpec.Rd Writing createExternalTable-deprecated.Rd Writing createTable.Rd Writing cacheTable.Rd Writing uncacheTable.Rd Writing clearCache.Rd Writing dropTempTable-deprecated.Rd Writing dropTempView.Rd Writing tables.Rd Writing tableNames.Rd Writing currentDatabase.Rd Writing setCurrentDatabase.Rd Writing listDatabases.Rd Writing listTables.Rd Writing listColumns.Rd Writing listFunctions.Rd Writing recoverPartitions.Rd Writing refreshTable.Rd Writing refreshByPath.Rd Writing spark.addFile.Rd Writing spark.getSparkFilesRootDirectory.Rd Writing spark.getSparkFiles.Rd Writing spark.lapply.Rd Writing setLogLevel.Rd Writing setCheckpointDir.Rd Writing unresolved_named_lambda_var.Rd Writing create_lambda.Rd Writing invoke_higher_order_function.Rd Writing install.spark.Rd Writing sparkR.callJMethod.Rd Writing sparkR.callJStatic.Rd Writing sparkR.newJObject.Rd Writing LinearSVCModel-class.Rd Writing LogisticRegressionModel-class.Rd Writing MultilayerPerceptronClassificationModel-class.Rd Writing NaiveBayesModel-class.Rd Writing FMClassificationModel-class.Rd Writing BisectingKMeansModel-class.Rd Writing GaussianMixtureModel-class.Rd Writing KMeansModel-class.Rd Writing LDAModel-class.Rd Writing PowerIterationClustering-class.Rd Writing FPGrowthModel-class.Rd Writing PrefixSpan-class.Rd 
Writing ALSModel-class.Rd Writing AFTSurvivalRegressionModel-class.Rd Writing GeneralizedLinearRegressionModel-class.Rd Writing IsotonicRegressionModel-class.Rd Writing LinearRegressionModel-class.Rd Writing FMRegressionModel-class.Rd Writing glm.Rd Writing KSTest-class.Rd Writing GBTRegressionModel-class.Rd Writing GBTClassificationModel-class.Rd Writing RandomForestRegressionModel-class.Rd Writing RandomForestClassificationModel-class.Rd Writing DecisionTreeRegressionModel-class.Rd Writing DecisionTreeClassificationModel-class.Rd Writing read.ml.Rd Writing sparkR.session.stop.Rd Writing sparkR.init-deprecated.Rd Writing sparkRSQL.init-deprecated.Rd Writing sparkRHive.init-deprecated.Rd Writing sparkR.session.Rd Writing sparkR.uiWebUrl.Rd Writing setJobGroup.Rd Writing clearJobGroup.Rd Writing cancelJobGroup.Rd Writing setJobDescription.Rd Writing setLocalProperty.Rd Writing getLocalProperty.Rd Writing crosstab.Rd Writing freqItems.Rd Writing approxQuantile.Rd Writing StreamingQuery.Rd Writing hashCode.Rd + /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib /home/jenkins/workspace/NewSparkPullRequestBuilder/R/pkg/ * installing *source* package ‘SparkR’ ... 
** R ** inst ** byte-compile and prepare package for lazy loading Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’ Creating a new generic function for ‘colnames’ in package ‘SparkR’ Creating a new generic function for ‘colnames<-’ in package ‘SparkR’ Creating a new generic function for ‘cov’ in package ‘SparkR’ Creating a new generic function for ‘drop’ in package ‘SparkR’ Creating a new generic function for ‘na.omit’ in package ‘SparkR’ Creating a new generic function for ‘filter’ in package ‘SparkR’ Creating a new generic function for ‘intersect’ in package ‘SparkR’ Creating a new generic function for ‘sample’ in package ‘SparkR’ Creating a new generic function for ‘transform’ in package ‘SparkR’ Creating a new generic function for ‘subset’ in package ‘SparkR’ Creating a new generic function for ‘summary’ in package ‘SparkR’ Creating a new generic function for ‘union’ in package ‘SparkR’ Creating a new generic function for ‘endsWith’ in package ‘SparkR’ Creating a new generic function for ‘startsWith’ in package ‘SparkR’ Creating a new generic function for ‘lag’ in package ‘SparkR’ Creating a new generic function for ‘rank’ in package ‘SparkR’ Creating a new generic function for ‘sd’ in package ‘SparkR’ Creating a new generic function for ‘var’ in package ‘SparkR’ Creating a new generic function for ‘window’ in package ‘SparkR’ Creating a new generic function for ‘predict’ in package ‘SparkR’ Creating a new generic function for ‘rbind’ in package ‘SparkR’ Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’ Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’ ** help *** installing help indices converting help for package ‘SparkR’ finding HTML links ... done AFTSurvivalRegressionModel-class html ALSModel-class html BisectingKMeansModel-class html DecisionTreeClassificationModel-class html DecisionTreeRegressionModel-class html FMClassificationModel-class html FMRegressionModel-class html FPGrowthModel-class html GBTClassificationModel-class html GBTRegressionModel-class html GaussianMixtureModel-class html GeneralizedLinearRegressionModel-class html GroupedData html IsotonicRegressionModel-class html KMeansModel-class html KSTest-class html LDAModel-class html LinearRegressionModel-class html LinearSVCModel-class html LogisticRegressionModel-class html MultilayerPerceptronClassificationModel-class html NaiveBayesModel-class html PowerIterationClustering-class html PrefixSpan-class html RandomForestClassificationModel-class html RandomForestRegressionModel-class html SparkDataFrame html StreamingQuery html WindowSpec html alias html approxQuantile html arrange html as.data.frame html attach html avg html awaitTermination html between html broadcast html cache html cacheTable html cancelJobGroup html cast html checkpoint html clearCache html clearJobGroup html coalesce html collect html coltypes html column html column_aggregate_functions html column_collection_functions html column_datetime_diff_functions html column_datetime_functions html column_math_functions html column_misc_functions html 
column_nonaggregate_functions html column_string_functions html column_window_functions html columnfunctions html columns html corr html count html cov html createDataFrame html createExternalTable-deprecated html createOrReplaceTempView html createTable html create_lambda html crossJoin html crosstab html cube html currentDatabase html dapply html dapplyCollect html describe html dim html distinct html drop html dropDuplicates html dropTempTable-deprecated html dropTempView html dtypes html endsWith html eq_null_safe html except html exceptAll html explain html filter html first html fitted html freqItems html gapply html gapplyCollect html getLocalProperty html getNumPartitions html glm html groupBy html hashCode html head html hint html histogram html insertInto html install.spark html intersect html intersectAll html invoke_higher_order_function html isActive html isLocal html isStreaming html join html last html lastProgress html limit html listColumns html listDatabases html listFunctions html listTables html localCheckpoint html match html merge html mutate html nafunctions html ncol html not html nrow html orderBy html otherwise html over html partitionBy html persist html pivot html predict html print.jobj html print.structField html print.structType html printSchema html queryName html randomSplit html rangeBetween html rbind html read.df html read.jdbc html read.json html read.ml html read.orc html read.parquet html read.stream html read.text html recoverPartitions html refreshByPath html refreshTable html registerTempTable-deprecated html rename html repartition html repartitionByRange html rollup html rowsBetween html sample html sampleBy html saveAsTable html schema html select html selectExpr html setCheckpointDir html setCurrentDatabase html setJobDescription html setJobGroup html setLocalProperty html setLogLevel html show html showDF html spark.addFile html spark.als html spark.bisectingKmeans html spark.decisionTree html spark.fmClassifier html 
spark.fmRegressor html spark.fpGrowth html spark.gaussianMixture html spark.gbt html spark.getSparkFiles html spark.getSparkFilesRootDirectory html spark.glm html spark.isoreg html spark.kmeans html spark.kstest html spark.lapply html spark.lda html spark.lm html spark.logit html spark.mlp html spark.naiveBayes html spark.powerIterationClustering html spark.prefixSpan html spark.randomForest html spark.survreg html spark.svmLinear html sparkR.callJMethod html sparkR.callJStatic html sparkR.conf html sparkR.init-deprecated html sparkR.newJObject html sparkR.session html sparkR.session.stop html sparkR.uiWebUrl html sparkR.version html sparkRHive.init-deprecated html sparkRSQL.init-deprecated html sql html startsWith html status html stopQuery html storageLevel html str html structField html structType html subset html substr html summarize html summary html tableNames html tableToDF html tables html take html toJSON html uncacheTable html union html unionAll html unionByName html unpersist html unresolved_named_lambda_var html windowOrderBy html windowPartitionBy html with html withColumn html withWatermark html write.df html write.jdbc html write.json html write.ml html write.orc html write.parquet html write.stream html write.text html ** building package indices ** installing vignettes ** testing if installed package can be loaded * DONE (SparkR) + cd /home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib + jar cfM /home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib/sparkr.zip SparkR + popd [info] Using build tool sbt with Hadoop profile hadoop3.2 and Hive profile hive2.3 under environment amplab_jenkins From https://github.com/apache/spark * [new branch] master -> master [info] Found the following changed modules: root [info] Setup the following environment variables for tests: ======================================================================== Building Spark ======================================================================== [info] Building 
Spark using SBT with these arguments: -Phadoop-3.2 -Phive-2.3 -Pmesos -Pspark-ganglia-lgpl -Pkubernetes -Pkinesis-asl -Phive-thriftserver -Phive -Pyarn -Phadoop-cloud test:package streaming-kinesis-asl-assembly/assembly Using /usr/java/jdk1.8.0_191 as default JAVA_HOME. Note, this will be overridden by -java-home if it is set. Attempting to fetch sbt Launching sbt from build/sbt-launch-0.13.18.jar [info] Loading project definition from /home/jenkins/workspace/NewSparkPullRequestBuilder/project [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/project/}newsparkpullrequestbuilder-build... [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * com.puppycrawl.tools:checkstyle:8.25 is selected over 6.15 [warn] +- default:newsparkpullrequestbuilder-build:0.1-SNAPSHOT (scalaVersion=2.10, sbtVersion=0.13) (depends on 8.25) [warn] +- com.etsy:sbt-checkstyle-plugin:3.1.1 (scalaVersion=2.10, sbtVersion=0.13) (depends on 6.15) [warn] [warn] * com.google.guava:guava:28.1-jre is selected over 23.0 [warn] +- com.puppycrawl.tools:checkstyle:8.25 (depends on 28.1-jre) [warn] +- default:newsparkpullrequestbuilder-build:0.1-SNAPSHOT (scalaVersion=2.10, sbtVersion=0.13) (depends on 23.0) [warn] [warn] * org.apache.maven.wagon:wagon-provider-api:2.2 is selected over 1.0-beta-6 [warn] +- org.apache.maven:maven-compat:3.0.4 (depends on 2.2) [warn] +- org.apache.maven.wagon:wagon-file:2.2 (depends on 2.2) [warn] +- org.spark-project:sbt-pom-reader:1.0.0-spark (scalaVersion=2.10, sbtVersion=0.13) (depends on 2.2) [warn] +- org.apache.maven.wagon:wagon-http-shared4:2.2 (depends on 2.2) [warn] +- org.apache.maven.wagon:wagon-http:2.2 (depends on 2.2) [warn] +- org.apache.maven.wagon:wagon-http-lightweight:2.2 (depends on 2.2) [warn] +- org.sonatype.aether:aether-connector-wagon:1.13.1 (depends on 1.0-beta-6) [warn] [warn] * org.codehaus.plexus:plexus-utils:3.0 is selected over 
{2.0.7, 2.0.6, 2.1, 1.5.5} [warn] +- org.apache.maven.wagon:wagon-provider-api:2.2 (depends on 3.0) [warn] +- org.apache.maven:maven-compat:3.0.4 (depends on 2.0.6) [warn] +- org.sonatype.sisu:sisu-inject-plexus:2.3.0 (depends on 2.0.6) [warn] +- org.apache.maven:maven-artifact:3.0.4 (depends on 2.0.6) [warn] +- org.apache.maven:maven-core:3.0.4 (depends on 2.0.6) [warn] +- org.sonatype.plexus:plexus-sec-dispatcher:1.3 (depends on 2.0.6) [warn] +- org.apache.maven:maven-embedder:3.0.4 (depends on 2.0.6) [warn] +- org.apache.maven:maven-settings:3.0.4 (depends on 2.0.6) [warn] +- org.apache.maven:maven-settings-builder:3.0.4 (depends on 2.0.6) [warn] +- org.apache.maven:maven-model-builder:3.0.4 (depends on 2.0.7) [warn] +- org.sonatype.aether:aether-connector-wagon:1.13.1 (depends on 2.0.7) [warn] +- org.sonatype.sisu:sisu-inject-plexus:2.2.3 (depends on 2.0.7) [warn] +- org.apache.maven:maven-model:3.0.4 (depends on 2.0.7) [warn] +- org.apache.maven:maven-aether-provider:3.0.4 (depends on 2.0.7) [warn] +- org.apache.maven:maven-repository-metadata:3.0.4 (depends on 2.0.7) [warn] [warn] * cglib:cglib is evicted completely [warn] +- org.sonatype.sisu:sisu-guice:3.0.3 (depends on 2.2.2) [warn] [warn] * asm:asm is evicted completely [warn] +- cglib:cglib:2.2.2 (depends on 3.3.1) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Compiling 3 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/project/target/scala-2.10/sbt-0.13/classes... [info] Set current project to spark-parent (in build file:/home/jenkins/workspace/NewSparkPullRequestBuilder/) [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}tags... [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}tools... [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}spark... 
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/assembly/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl-assembly/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/hadoop-cloud/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/spark-ganglia-lgpl/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-yarn/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/launcher/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-shuffle/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-yarn/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/unsafe/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/tools/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-common/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-assembly/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-common/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/kvstore/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/tags/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/hadoop-cloud/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/launcher/target [success] 
created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/examples/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl-assembly/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/kvstore/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-shuffle/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/assembly/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-assembly/target [info] Done updating. [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/tags/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/tools/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/spark-ganglia-lgpl/target [info] Compiling 1 Scala source to /home/jenkins/workspace/NewSparkPullRequestBuilder/tools/target/scala-2.12/classes... [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/unsafe/target [info] Done updating. [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/target [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/target/scala-2.12/spark-parent_2.12-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/target/scala-2.12/spark-parent_2.12-3.1.0-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Done updating. [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}launcher... [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}mllib-local... [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}network-common... [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}unsafe... 
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}kvstore... [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}sketch... [info] Compiling 2 Scala sources and 8 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/tags/target/classes... [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-token-provider/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/repl/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-token-provider/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/repl/target [info] Done updating. [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}network-shuffle... [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/target [info] Compiling 79 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-common/target/classes... [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/target [info] Done updating. [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/target [info] Done updating. [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib-local/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/graphx/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/target [success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive-thriftserver/target [info] Done updating. 
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/target
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/kubernetes/core/target
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/target
[info] Done updating.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-common/target/spark-network-common-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Done updating.
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive/target
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}network-yarn...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}core...
[info] Compiling 30 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-shuffle/target/classes...
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/target
[info] Done updating.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-shuffle/target/spark-network-shuffle-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 3 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-yarn/target/scala-2.12/classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/tags/target/spark-tags-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 20 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/launcher/target/classes...
[info] Compiling 18 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/unsafe/target/classes...
[info] Compiling 5 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib-local/target/classes...
[info] Compiling 9 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/target/classes...
[info] Compiling 6 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/tags/target/test-classes...
[info] Compiling 12 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/kvstore/target/classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/tags/target/spark-tags-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Compiling 23 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-common/target/test-classes...
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:22: Unsafe is internal proprietary API and may be removed in a future release
[warn] import sun.misc.Unsafe;
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:28: Unsafe is internal proprietary API and may be removed in a future release
[warn] private static final Unsafe _UNSAFE;
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:150: Unsafe is internal proprietary API and may be removed in a future release
[warn] sun.misc.Unsafe unsafe;
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:152: Unsafe is internal proprietary API and may be removed in a future release
[warn] Field unsafeField = Unsafe.class.getDeclaredField("theUnsafe");
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/src/main/java/org/apache/spark/util/sketch/Platform.java:154: Unsafe is internal proprietary API and may be removed in a future release
[warn] unsafe = (sun.misc.Unsafe) unsafeField.get(null);
[warn] ^
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-yarn/target/scala-2.12/spark-network-yarn_2.12-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-yarn/target/scala-2.12/spark-network-yarn_2.12-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/target/spark-sketch-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 3 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/target/test-classes...
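The sun.misc.Unsafe warnings above come from Platform.java's reflective lookup of the JDK-internal Unsafe singleton, which javac flags but still compiles. A minimal, self-contained sketch of that lookup pattern (the class name UnsafeLookup is hypothetical; only the `theUnsafe` field access mirrors the warned lines):

```java
import java.lang.reflect.Field;

public class UnsafeLookup {
    // Reflective grab of the JDK-internal singleton; this access is what
    // javac warns about with "Unsafe is internal proprietary API and may
    // be removed in a future release".
    static sun.misc.Unsafe loadUnsafe() {
        try {
            Field f = sun.misc.Unsafe.class.getDeclaredField("theUnsafe");
            f.setAccessible(true);
            return (sun.misc.Unsafe) f.get(null);
        } catch (ReflectiveOperationException e) {
            return null; // callers fall back to non-Unsafe code paths
        }
    }

    public static void main(String[] args) {
        System.out.println(loadUnsafe() != null);
    }
}
```

On JDK 9+ this works because sun.misc lives in the jdk.unsupported module, which remains open for reflection; the warning is about the API's unsupported status, not a compile error.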
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/tools/target/scala-2.12/spark-tools_2.12-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/tools/target/scala-2.12/spark-tools_2.12-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/unsafe/target/spark-unsafe-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 1 Scala source and 5 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/unsafe/target/test-classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/kvstore/target/spark-kvstore-3.1.0-SNAPSHOT.jar ...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/launcher/target/spark-launcher-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Done packaging.
[info] Compiling 11 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/kvstore/target/test-classes...
[info] Compiling 7 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/launcher/target/test-classes...
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-common/src/test/java/org/apache/spark/network/server/OneForOneStreamManagerSuite.java:105: [unchecked] unchecked conversion
[warn] Iterator<ManagedBuffer> buffers = Mockito.mock(Iterator.class);
[warn] ^
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-common/target/spark-network-common-3.1.0-SNAPSHOT-tests.jar ...
[info] Compiling 13 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-shuffle/target/test-classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/kvstore/target/spark-kvstore-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Done packaging.
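The [unchecked] warning above is javac noting a raw Mockito.mock(Iterator.class) result being assigned to Iterator<ManagedBuffer>; erasure leaves nothing to verify, so the usual remedy is a narrowly scoped @SuppressWarnings("unchecked"). A sketch of the same raw-to-parameterized conversion without Mockito (class and helper names are illustrative):

```java
import java.util.Iterator;
import java.util.List;

public class RawMock {
    // A raw/wildcard Iterator stands in for Mockito.mock(Iterator.class);
    // the cast is unchecked because the element type cannot be checked at
    // runtime, so the suppression is kept as close to the cast as possible.
    @SuppressWarnings("unchecked")
    static Iterator<String> asTyped(Iterator<?> raw) {
        return (Iterator<String>) raw;
    }

    public static void main(String[] args) {
        Iterator<?> raw = List.of("buffer-1", "buffer-2").iterator();
        Iterator<String> typed = asTyped(raw);
        System.out.println(typed.next());
    }
}
```

Scoping the annotation to a single method (rather than the class) keeps any genuinely new unchecked warnings visible elsewhere.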
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/launcher/target/spark-launcher-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/sketch/target/spark-sketch-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/network-shuffle/target/spark-network-shuffle-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1)
[warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10)
[warn]
[warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3]
[warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3])
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3])
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}token-provider-kafka-0-10...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}streaming...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}mesos...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}catalyst...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}graphx...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}yarn...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}ganglia-lgpl...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}kubernetes...
[info] Compiling 554 Scala sources and 99 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/core/target/classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/common/unsafe/target/spark-unsafe-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib-local/target/spark-mllib-local-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 10 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib-local/target/test-classes...
[info] Done updating.
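The eviction report is informational: sbt has already selected the jetty 9.4.28 line over Hadoop's transitive 9.3.24 requests (and 2.3 over json-smart's [1.3.1,2.3] range). If a build wants that choice to be deliberate rather than an accident of resolution, sbt's dependencyOverrides makes it explicit. A hypothetical build.sbt fragment, not taken from Spark's actual build:

```scala
// Pin the jetty artifacts named in the report so the eviction warning
// reflects an intentional decision. Versions copied from the log above.
dependencyOverrides ++= Seq(
  "org.eclipse.jetty" % "jetty-util"    % "9.4.28.v20200408",
  "org.eclipse.jetty" % "jetty-servlet" % "9.4.28.v20200408"
)
```

Running `evicted` in the sbt shell, as the log suggests, prints the full per-artifact resolution detail.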
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib/target
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib-local/target/spark-mllib-local-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Done updating.
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/target
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/target
[success] created output: /home/jenkins/workspace/NewSparkPullRequestBuilder/core/target
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * org.eclipse.jetty:jetty-server:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-security:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-yarn-server-web-proxy:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1)
[warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10)
[warn]
[warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3]
[warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3])
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3])
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[info] Done updating.
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}sql...
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/SparkContext.scala:1867: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isDirectory(hadoopPath)) {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala:168: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] val f = () => FileSystem.getAllStatistics.asScala.map(_.getThreadStatistics.getBytesRead).sum
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala:199: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] val threadStats = FileSystem.getAllStatistics.asScala.map(_.getThreadStatistics)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:806: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (!fs.isFile(new Path(inProgressLog))) {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala:33: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] FileSystem.getAllStatistics.asScala.find(s => s.getScheme.equals(scheme))
[warn] ^
[info] Done updating.
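The isFile/isDirectory deprecations above share one fix in Hadoop's API: fetch a FileStatus once via fs.getFileStatus(path) and query that object, instead of issuing separate predicate calls. Hadoop itself is not assumed here, so the same look-once pattern is sketched with the JDK's java.nio.file API (class name is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.BasicFileAttributes;

public class StatusOnce {
    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("status-once", ".tmp");
        // One metadata read, then cheap in-memory queries -- analogous to
        // replacing fs.isFile(path) / fs.isDirectory(path) with
        // fs.getFileStatus(path).isFile() / .isDirectory() in Hadoop code.
        BasicFileAttributes attrs = Files.readAttributes(p, BasicFileAttributes.class);
        System.out.println(attrs.isRegularFile() && !attrs.isDirectory());
        Files.delete(p);
    }
}
```

Besides silencing the warnings, the single status read avoids a second round trip to the (possibly remote) filesystem for each predicate.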
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/util/Utils.scala:763: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isFile(path)) {
[warn] ^
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}streaming-kinesis-asl...
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}streaming-kafka-0-10...
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}streaming-kinesis-asl-assembly...
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}hadoop-cloud...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}sql-kafka-0-10...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}avro...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}hive...
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}mllib...
[warn] 6 warnings found
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/util/Utils.scala:763: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isFile(path)) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala:168: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] val f = () => FileSystem.getAllStatistics.asScala.map(_.getThreadStatistics.getBytesRead).sum
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala:199: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] val threadStats = FileSystem.getAllStatistics.asScala.map(_.getThreadStatistics)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala:33: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] FileSystem.getAllStatistics.asScala.find(s => s.getScheme.equals(scheme))
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/SparkContext.scala:1867: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isDirectory(hadoopPath)) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:806: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (!fs.isFile(new Path(inProgressLog))) {
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/core/target/spark-core-3.1.0-SNAPSHOT.jar ...
[info] Done updating.
[info] Done packaging.
[info] Compiling 20 Scala sources and 1 Java source to /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/target/classes...
[info] Compiling 5 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-token-provider/target/classes...
[info] Compiling 1 Scala source and 1 Java source to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/spark-ganglia-lgpl/target/classes...
[info] Compiling 25 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/target/classes...
[info] Compiling 104 Scala sources and 6 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/target/classes...
[info] Compiling 39 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/kubernetes/core/target/classes...
[info] Compiling 38 Scala sources and 5 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/graphx/target/classes...
[info] Compiling 311 Scala sources and 98 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/target/classes...
[info] Compiling 291 Scala sources and 27 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/core/target/test-classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-token-provider/target/spark-token-provider-kafka-0-10-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/spark-ganglia-lgpl/target/spark-ganglia-lgpl-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/spark-ganglia-lgpl/target/spark-ganglia-lgpl-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:174: method getYarnUrlFromURI in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] amJarRsrc.setResource(ConverterUtils.getYarnUrlFromURI(uri))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:275: method setMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] capability.setMemory(amMemory + amMemoryOverhead)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:290: method setAMContainerResourceRequest in class ApplicationSubmissionContext is deprecated: see corresponding Javadoc for more information.
[warn] appContext.setAMContainerResourceRequest(amRequest)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:352: method getMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] val maxMem = newAppResponse.getMaximumResourceCapability().getMemory()
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ClientDistributedCacheManager.scala:76: method getYarnUrlFromPath in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] amJarRsrc.setResource(ConverterUtils.getYarnUrlFromPath(destPath))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala:479: method getMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] s"${resource.getMemory} MB memory (including $memoryOverhead MB of overhead)"
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala:707: method getMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] assert(container.getResource.getMemory >= yarnResourceForRpId.getMemory)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala:707: method getMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] assert(container.getResource.getMemory >= yarnResourceForRpId.getMemory)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala:183: method toContainerId in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] ConverterUtils.toContainerId(containerIdString)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/main/scala/org/apache/spark/util/YarnContainerInfoHelper.scala:67: method toString in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] "CONTAINER_ID" -> ConverterUtils.toString(getContainerId(container)),
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:88: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] fwInfoBuilder.setRole(role)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:226: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] role.foreach { r => builder.setRole(r) }
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:262: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] Option(r.getRole), reservation)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:265: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] Option(r.getRole), reservation)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:516: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] role.foreach { r => builder.setRole(r) }
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala:532: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] (RoleResourceInfo(resource.getRole, reservation),
[warn] ^
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/kubernetes/core/target/spark-kubernetes-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[warn] 6 warnings found
[warn] 10 warnings found
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/target/spark-yarn-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/target/spark-mesos-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/graphx/target/spark-graphx-3.1.0-SNAPSHOT.jar ...
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/main/scala/org/apache/spark/streaming/util/HdfsUtils.scala:33: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (dfs.isFile(dfsPath)) {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/main/scala/org/apache/spark/streaming/util/HdfsUtils.scala:61: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (!dfs.isFile(dfsPath)) null else throw e
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/main/scala/org/apache/spark/streaming/util/HdfsUtils.scala:95: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] fs.isFile(hdpPath)
[warn] ^
[info] Done packaging.
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}streaming-kafka-0-10-assembly...
[warn] three warnings found
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/target/spark-streaming-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 10 Scala sources and 1 Java source to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/target/classes...
[info] Compiling 11 Scala sources and 2 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/target/classes...
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:108: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:162: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/DirectKafkaInputDStream.scala:172: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] val msgs = c.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(credentials)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpointUrl)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain())
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpoint)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:142: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] private val client = new AmazonKinesisClient(credentials)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:152: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:221: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getRecordsRequest.setRequestCredentials(credentials)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:242: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getShardIteratorRequest.setRequestCredentials(credentials)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:192: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information.
[warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials())
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information.
[warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain())
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName)) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:74: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .withLongLivedCredentialsProvider(longLivedCreds.provider) [warn] ^ [warn] 14 warnings found [warn] three warnings found [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected 
over [1.3.1,2.3] [warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction warnings [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:108: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/ConsumerStrategy.scala:162: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information. [warn] consumer.poll(0) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/DirectKafkaInputDStream.scala:172: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information. [warn] val msgs = c.poll(0) [warn] [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/target/spark-streaming-kafka-0-10-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java:157: [unchecked] unchecked method invocation: method union in class JavaStreamingContext is applied to given types [warn] unionStreams = jssc.union(streamsList.toArray(new JavaDStream[0])); [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:192: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information. 
[warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:74: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information. [warn] .withLongLivedCredentialsProvider(longLivedCreds.provider) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:142: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] private val client = new AmazonKinesisClient(credentials) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:152: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] client.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:221: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. [warn] getRecordsRequest.setRequestCredentials(credentials) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:242: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information. 
[warn] getShardIteratorRequest.setRequestCredentials(credentials) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials()) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] client.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information. [warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain()) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName)) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. 
[warn] val kinesisClient = new AmazonKinesisClient(credentials) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpointUrl) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information. [warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain()) [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information. [warn] kinesisClient.setEndpoint(endpoint) [warn] [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/target/spark-streaming-kinesis-asl-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl-assembly/target/scala-2.12/spark-streaming-kinesis-asl-assembly_2.12-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl-assembly/target/scala-2.12/spark-streaming-kinesis-asl-assembly_2.12-3.1.0-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Done updating. 
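The deprecation warnings above come from `KafkaConsumer.poll(long)` and the AWS SDK v1 client constructors/`setEndpoint`. A minimal sketch of the non-deprecated equivalents — assuming kafka-clients 2.x and aws-java-sdk on the classpath; `consumer`, `credentialsProvider`, `endpointUrl`, and `regionName` are placeholders standing in for the values at the warned call sites:

```scala
import java.time.Duration
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder

// KafkaConsumer.poll(timeoutMs: Long) is deprecated; use the Duration overload,
// so consumer.poll(0) becomes:
consumer.poll(Duration.ZERO)

// The AmazonKinesisClient constructor and setEndpoint are deprecated in favor
// of the builder, which pairs the endpoint with its signing region:
val kinesisClient = AmazonKinesisClientBuilder.standard()
  .withCredentials(credentialsProvider)
  .withEndpointConfiguration(new EndpointConfiguration(endpointUrl, regionName))
  .build()
```

The builder form also removes the mutable `setEndpoint`/`setRegion` calls flagged in KinesisBackedBlockRDD.scala and KinesisTestUtils.scala above.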
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.eclipse.jetty:jetty-util-ajax:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-hadoop-cloud:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-azure:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-util-ajax:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-hadoop-cloud:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3] [warn] +- 
com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3] [warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Updating 
{file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}repl... [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.apache.thrift:libthrift:0.12.0 is selected over 0.9.3 [warn] +- org.apache.spark:spark-hive:3.1.0-SNAPSHOT (depends on 0.9.3) [warn] +- org.apache.thrift:libfb303:0.9.3 (depends on 0.9.3) [warn] [warn] * io.dropwizard.metrics:metrics-json:4.1.1 is selected over 3.1.0 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0) [warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0) [warn] [warn] * io.dropwizard.metrics:metrics-jvm:4.1.1 is selected over 3.1.0 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0) [warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0) [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * 
io.dropwizard.metrics:metrics-core:4.1.1 is selected over {3.1.2, 3.1.0} [warn] +- io.dropwizard.metrics:metrics-json:4.1.1 (depends on 4.1.1) [warn] +- io.dropwizard.metrics:metrics-jmx:4.1.1 (depends on 4.1.1) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 4.1.1) [warn] +- org.apache.spark:spark-network-shuffle:3.1.0-SNAPSHOT (depends on 4.1.1) [warn] +- org.apache.spark:spark-network-common:3.1.0-SNAPSHOT (depends on 4.1.1) [warn] +- io.dropwizard.metrics:metrics-graphite:4.1.1 (depends on 4.1.1) [warn] +- io.dropwizard.metrics:metrics-jvm:4.1.1 (depends on 4.1.1) [warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0) [warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 3.1.2) [warn] [warn] * org.apache.hadoop:hadoop-common:3.2.0 is selected over 2.6.0 [warn] +- org.apache.hadoop:hadoop-client:3.2.0 (depends on 3.2.0) [warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 2.6.0) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3] [warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}hive-thriftserver... [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala:1069: method readFileToString in class FileUtils is deprecated: see corresponding Javadoc for more information. 
[warn] assert(FileUtils.readFileToString(new File(outputUri.getPath)) === [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala:1070: method readFileToString in class FileUtils is deprecated: see corresponding Javadoc for more information. [warn] FileUtils.readFileToString(new File(sourceUri.getPath))) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala:1115: method write in class FileUtils is deprecated: see corresponding Javadoc for more information. [warn] FileUtils.write(jarFile, content) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala:1131: method write in class FileUtils is deprecated: see corresponding Javadoc for more information. [warn] FileUtils.write(jarFile, content) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala:1521: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information. [warn] override def isFile(path: Path): Boolean = super.isFile(local(path)) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/history/EventLogFileWritersSuite.scala:216: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information. [warn] assert(fileSystem.exists(finalLogPath) && fileSystem.isFile(finalLogPath)) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/history/EventLogFileWritersSuite.scala:360: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information. 
[warn] assert(fileSystem.exists(logDirPath) && fileSystem.isDirectory(logDirPath)) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/history/EventLogFileWritersSuite.scala:363: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information. [warn] assert(fileSystem.exists(appStatusFile) && fileSystem.isFile(appStatusFile)) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala:196: method toString in class IOUtils is deprecated: see corresponding Javadoc for more information. [warn] val exp = IOUtils.toString(new FileInputStream( [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala:687: method toString in class IOUtils is deprecated: see corresponding Javadoc for more information. [warn] val inString = in.map(IOUtils.toString) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala:703: method toString in class IOUtils is deprecated: see corresponding Javadoc for more information. 
[warn] err.map(IOUtils.toString) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala:223: value rpcEnv does nothing other than call itself recursively [warn] override val rpcEnv: RpcEnv = this.rpcEnv [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala:237: value rpcEnv does nothing other than call itself recursively [warn] override val rpcEnv: RpcEnv = this.rpcEnv [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/util/FileAppenderSuite.scala:348: method closeQuietly in class IOUtils is deprecated: see corresponding Javadoc for more information. [warn] IOUtils.closeQuietly(inputStream) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/util/UtilsSuite.scala:248: method closeQuietly in class IOUtils is deprecated: see corresponding Javadoc for more information. [warn] IOUtils.closeQuietly(mergedStream) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/test/scala/org/apache/spark/util/UtilsSuite.scala:249: method closeQuietly in class IOUtils is deprecated: see corresponding Javadoc for more information. [warn] IOUtils.closeQuietly(in) [warn] ^ [info] Done updating. 
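The commons-io deprecations above (`readFileToString`/`write` without a charset, `IOUtils.toString`, `IOUtils.closeQuietly`) all have straightforward replacements: pass an explicit charset, and scope streams instead of closing them quietly. A sketch, assuming commons-io 2.6+; `file`, `jarFile`, and `content` mirror the warned call sites:

```scala
import java.io.FileInputStream
import java.nio.charset.StandardCharsets
import org.apache.commons.io.{FileUtils, IOUtils}

// The charset-less overloads are deprecated; name the charset explicitly:
FileUtils.write(jarFile, content, StandardCharsets.UTF_8)
val text = FileUtils.readFileToString(jarFile, StandardCharsets.UTF_8)

// IOUtils.closeQuietly(stream) is deprecated; close in a finally block instead:
val in = new FileInputStream(file)
try {
  IOUtils.toString(in, StandardCharsets.UTF_8)
} finally {
  in.close()
}
```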
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3] [warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}examples... [info] Done updating. 
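The repeated "Found version conflict(s)" blocks are sbt's eviction reports: Hadoop 3.2.0 pulls older Jetty/JWT artifacts that Spark's newer picks evict. When the selected versions are known-good, they can be pinned so the choice is explicit. A hedged `build.sbt` sketch, with versions copied from the log above:

```scala
// Pin the evicted artifacts to the versions the build already selects,
// so sbt's eviction check treats them as deliberate overrides.
dependencyOverrides ++= Seq(
  "org.eclipse.jetty" % "jetty-util"      % "9.4.28.v20200408",
  "org.eclipse.jetty" % "jetty-servlet"   % "9.4.28.v20200408",
  "com.nimbusds"      % "nimbus-jose-jwt" % "4.41.1",
  "net.minidev"       % "json-smart"      % "2.3"
)
```

Running `evicted` in the sbt shell, as the log suggests, prints the same trees with the override status applied.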
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.apache.thrift:libthrift:0.12.0 is selected over 0.9.3 [warn] +- org.apache.spark:spark-hive:3.1.0-SNAPSHOT (depends on 0.9.3) [warn] +- org.apache.thrift:libfb303:0.9.3 (depends on 0.9.3) [warn] [warn] * io.dropwizard.metrics:metrics-json:4.1.1 is selected over 3.1.0 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0) [warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0) [warn] [warn] * io.dropwizard.metrics:metrics-jvm:4.1.1 is selected over 3.1.0 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0) [warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0) [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * io.dropwizard.metrics:metrics-core:4.1.1 is selected over {3.1.2, 3.1.0} [warn] +- 
io.dropwizard.metrics:metrics-json:4.1.1 (depends on 4.1.1) [warn] +- io.dropwizard.metrics:metrics-jmx:4.1.1 (depends on 4.1.1) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 4.1.1) [warn] +- org.apache.spark:spark-network-shuffle:3.1.0-SNAPSHOT (depends on 4.1.1) [warn] +- org.apache.spark:spark-network-common:3.1.0-SNAPSHOT (depends on 4.1.1) [warn] +- io.dropwizard.metrics:metrics-graphite:4.1.1 (depends on 4.1.1) [warn] +- io.dropwizard.metrics:metrics-jvm:4.1.1 (depends on 4.1.1) [warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0) [warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 3.1.2) [warn] [warn] * org.apache.hadoop:hadoop-common:3.2.0 is selected over 2.6.0 [warn] +- org.apache.hadoop:hadoop-client:3.2.0 (depends on 3.2.0) [warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 2.6.0) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3] [warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction warnings [info] Done updating. 
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.scala-lang.modules:scala-xml_2.12:1.2.0 is selected over 1.0.6 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 1.2.0) [warn] +- org.scala-lang:scala-compiler:2.12.10 (depends on 1.0.6) [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3] [warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction warnings [warn] 16 warnings found [info] Packaging 
/home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/target/spark-catalyst-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 467 Scala sources and 59 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/target/classes... [info] Done updating. [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible: [warn] [warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605 [warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408) [warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605) [warn] [warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10 [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1) [warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10) [warn] [warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3] [warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3]) [warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3]) [warn] [warn] Run 'evicted' to see detailed eviction 
warnings [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-assembly/target/scala-2.12/spark-streaming-kafka-0-10-assembly_2.12-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-assembly/target/scala-2.12/spark-streaming-kafka-0-10-assembly_2.12-3.1.0-SNAPSHOT-tests.jar ... [info] Done packaging. [warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list [info] Compiling 34 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/kubernetes/core/target/test-classes... [info] Compiling 21 Scala sources and 3 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/target/test-classes... [info] Compiling 19 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/graphx/target/test-classes... [info] Compiling 41 Scala sources and 9 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/target/test-classes... [info] Compiling 6 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-token-provider/target/test-classes... [info] Compiling 11 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/target/test-classes... [info] Compiling 255 Scala sources and 6 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/target/test-classes... [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/core/target/spark-core-3.1.0-SNAPSHOT-tests.jar ... [info] Done packaging. [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-token-provider/target/spark-token-provider-kafka-0-10-3.1.0-SNAPSHOT-tests.jar ... [info] Done packaging. 
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:114: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("*") [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:117: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("*") [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:122: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("role2") [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:125: method setRole in class Builder is deprecated: see corresponding Javadoc for more information. [warn] Resource.newBuilder().setRole("role2") [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:139: method valueOf in Java enum Status is deprecated: see corresponding Javadoc for more information. [warn] ).thenReturn(Status.valueOf(1)) [warn] ^ [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:152: method getRole in class Resource is deprecated: see corresponding Javadoc for more information. 
[warn] assert(cpus.exists(_.getRole() == "role2"))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:153: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] assert(cpus.exists(_.getRole() == "*"))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:156: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] assert(mem.exists(_.getRole() == "role2"))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:157: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] assert(mem.exists(_.getRole() == "*"))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:419: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] Resource.newBuilder().setRole("*")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterSchedulerSuite.scala:422: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] Resource.newBuilder().setRole("*")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:281: method valueOf in Java enum Status is deprecated: see corresponding Javadoc for more information.
[warn] ).thenReturn(Status.valueOf(1))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:282: method valueOf in Java enum Status is deprecated: see corresponding Javadoc for more information.
[warn] when(driver.declineOffer(mesosOffers.get(1).getId)).thenReturn(Status.valueOf(1))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:283: method valueOf in Java enum Status is deprecated: see corresponding Javadoc for more information.
[warn] when(driver.declineOffer(mesosOffers.get(2).getId)).thenReturn(Status.valueOf(1))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:310: method valueOf in Java enum Status is deprecated: see corresponding Javadoc for more information.
[warn] when(driver.declineOffer(mesosOffers2.get(0).getId)).thenReturn(Status.valueOf(1))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:337: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] .setRole("prod")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:341: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] .setRole("prod")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:346: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] .setRole("dev")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:351: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] .setRole("dev")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:394: method valueOf in Java enum Status is deprecated: see corresponding Javadoc for more information.
[warn] ).thenReturn(Status.valueOf(1))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:411: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] assert(cpusDev.getRole.equals("dev"))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:414: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] r.getName.equals("mem") && r.getScalar.getValue.equals(484.0) && r.getRole.equals("prod")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosFineGrainedSchedulerBackendSuite.scala:417: method getRole in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] r.getName.equals("cpus") && r.getScalar.getValue.equals(1.0) && r.getRole.equals("prod")
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtilsSuite.scala:55: method setRole in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] role.foreach { r => builder.setRole(r) }
[warn] ^
[warn] 24 warnings found
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/mesos/target/spark-mesos-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientDistributedCacheManagerSuite.scala:83: method getPathFromYarnURL in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] assert(ConverterUtils.getPathFromYarnURL(resource.getResource()) === destPath)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientDistributedCacheManagerSuite.scala:105: method getPathFromYarnURL in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] assert(ConverterUtils.getPathFromYarnURL(resource2.getResource()) === destPath2)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientDistributedCacheManagerSuite.scala:161: method getPathFromYarnURL in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] assert(ConverterUtils.getPathFromYarnURL(resource.getResource()) === destPath)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientDistributedCacheManagerSuite.scala:190: method getPathFromYarnURL in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] assert(ConverterUtils.getPathFromYarnURL(resource.getResource()) === destPath)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ResourceRequestHelperSuite.scala:180: method setMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] resource.setMemory(512)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:273: method getMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] val expectedResources = Resource.newInstance(handler.defaultResource.getMemory(),
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:656: method getMemory in class Resource is deprecated: see corresponding Javadoc for more information.
[warn] val memory = handler.defaultResource.getMemory
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala:482: method toString in class ConverterUtils is deprecated: see corresponding Javadoc for more information.
[warn] "CONTAINER_ID" -> ConverterUtils.toString(containerId),
[warn] ^
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/graphx/target/spark-graphx-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/kubernetes/core/target/spark-kubernetes-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] 8 warnings found
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/test/scala/org/apache/spark/streaming/InputStreamsSuite.scala:267: method write in class IOUtils is deprecated: see corresponding Javadoc for more information.
[warn] IOUtils.write(text, out)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/test/scala/org/apache/spark/streaming/StreamingContextSuite.scala:913: method write in class FileUtils is deprecated: see corresponding Javadoc for more information.
[warn] FileUtils.write(new File(fakeCheckpointFile.toString()), "blablabla")
[warn] ^
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/resource-managers/yarn/target/spark-yarn-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Done updating.
[info] Updating {file:/home/jenkins/workspace/NewSparkPullRequestBuilder/}assembly...
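Aside: the IOUtils.write / FileUtils.write warnings above come from the charset-less commons-io overloads, which are deprecated in favor of variants taking an explicit Charset. A minimal self-contained sketch of the usual fix, written against the plain JDK java.nio.file API rather than commons-io so it runs standalone (the temp-file name is illustrative):

```scala
import java.nio.charset.StandardCharsets
import java.nio.file.Files

object WriteWithCharset {
  def main(args: Array[String]): Unit = {
    // Write text with an explicit charset instead of relying on the platform
    // default -- the pitfall shared by the deprecated charset-less overloads.
    val path = Files.createTempFile("fake-checkpoint", ".tmp")
    Files.write(path, "blablabla".getBytes(StandardCharsets.UTF_8))
    val readBack = new String(Files.readAllBytes(path), StandardCharsets.UTF_8)
    assert(readBack == "blablabla")
    Files.delete(path)
  }
}
```

With commons-io itself, the analogous non-deprecated call would pass a Charset (or charset name) as the extra argument.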
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * org.apache.thrift:libthrift:0.12.0 is selected over 0.9.3
[warn] +- org.apache.spark:spark-hive:3.1.0-SNAPSHOT (depends on 0.9.3)
[warn] +- org.apache.thrift:libfb303:0.9.3 (depends on 0.9.3)
[warn]
[warn] * io.dropwizard.metrics:metrics-json:4.1.1 is selected over 3.1.0
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0)
[warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0)
[warn]
[warn] * io.dropwizard.metrics:metrics-jvm:4.1.1 is selected over 3.1.0
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0)
[warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0)
[warn]
[warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * io.dropwizard.metrics:metrics-core:4.1.1 is selected over {3.1.2, 3.1.0}
[warn] +- io.dropwizard.metrics:metrics-json:4.1.1 (depends on 4.1.1)
[warn] +- io.dropwizard.metrics:metrics-jmx:4.1.1 (depends on 4.1.1)
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 4.1.1)
[warn] +- org.apache.spark:spark-network-shuffle:3.1.0-SNAPSHOT (depends on 4.1.1)
[warn] +- org.apache.spark:spark-network-common:3.1.0-SNAPSHOT (depends on 4.1.1)
[warn] +- io.dropwizard.metrics:metrics-graphite:4.1.1 (depends on 4.1.1)
[warn] +- io.dropwizard.metrics:metrics-jvm:4.1.1 (depends on 4.1.1)
[warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0)
[warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 3.1.2)
[warn]
[warn] * org.apache.hadoop:hadoop-common:3.2.0 is selected over 2.6.0
[warn] +- org.apache.hadoop:hadoop-client:3.2.0 (depends on 3.2.0)
[warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 2.6.0)
[warn]
[warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1)
[warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10)
[warn]
[warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3]
[warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3])
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3])
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:127: value ENABLE_JOB_SUMMARY in class ParquetOutputFormat is deprecated: see corresponding Javadoc for more information.
[warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) {
[warn] ^
[warn] two warnings found
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetWriteBuilder.scala:90: value ENABLE_JOB_SUMMARY in class ParquetOutputFormat is deprecated: see corresponding Javadoc for more information.
[warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) {
[warn] ^
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/target/spark-streaming-3.1.0-SNAPSHOT-tests.jar ...
[info] Compiling 8 Scala sources and 1 Java source to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/target/test-classes...
[info] Compiling 6 Scala sources and 4 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/target/test-classes...
[info] Done packaging.
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]
[warn] * org.eclipse.jetty:jetty-util:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.eclipse.jetty:jetty-xml:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-jndi:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-io:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.apache.spark:spark-assembly_2.12:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-servlets:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-util-ajax:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-proxy:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.spark:spark-hadoop-cloud:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-http:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-yarn-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * org.apache.thrift:libthrift:0.12.0 is selected over 0.9.3
[warn] +- org.apache.spark:spark-hive:3.1.0-SNAPSHOT (depends on 0.9.3)
[warn] +- org.apache.thrift:libfb303:0.9.3 (depends on 0.9.3)
[warn]
[warn] * io.dropwizard.metrics:metrics-json:4.1.1 is selected over 3.1.0
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0)
[warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0)
[warn]
[warn] * io.dropwizard.metrics:metrics-jvm:4.1.1 is selected over 3.1.0
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 3.1.0)
[warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0)
[warn]
[warn] * org.scala-lang.modules:scala-xml_2.12:1.2.0 is selected over 1.0.6
[warn] +- org.scala-lang:scala-compiler:2.12.10 (depends on 1.0.6)
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 1.0.6)
[warn]
[warn] * org.eclipse.jetty:jetty-servlet:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-webapp:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-common:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * org.eclipse.jetty:jetty-server:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.eclipse.jetty:jetty-security:9.4.28.v20200408 (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-yarn-server-web-proxy:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * org.eclipse.jetty:jetty-util-ajax:9.4.28.v20200408 is selected over 9.3.24.v20180605
[warn] +- org.apache.spark:spark-hadoop-cloud:3.1.0-SNAPSHOT (depends on 9.4.28.v20200408)
[warn] +- org.apache.hadoop:hadoop-azure:3.2.0 (depends on 9.3.24.v20180605)
[warn]
[warn] * io.dropwizard.metrics:metrics-core:4.1.1 is selected over {3.1.2, 3.1.0}
[warn] +- io.dropwizard.metrics:metrics-json:4.1.1 (depends on 4.1.1)
[warn] +- io.dropwizard.metrics:metrics-jmx:4.1.1 (depends on 4.1.1)
[warn] +- org.apache.spark:spark-core:3.1.0-SNAPSHOT (depends on 4.1.1)
[warn] +- org.apache.spark:spark-network-shuffle:3.1.0-SNAPSHOT (depends on 4.1.1)
[warn] +- org.apache.spark:spark-network-common:3.1.0-SNAPSHOT (depends on 4.1.1)
[warn] +- io.dropwizard.metrics:metrics-graphite:4.1.1 (depends on 4.1.1)
[warn] +- io.dropwizard.metrics:metrics-jvm:4.1.1 (depends on 4.1.1)
[warn] +- org.apache.hive:hive-common:2.3.7 (depends on 3.1.0)
[warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 3.1.2)
[warn]
[warn] * org.apache.hadoop:hadoop-common:3.2.0 is selected over 2.6.0
[warn] +- org.apache.hadoop:hadoop-client:3.2.0 (depends on 3.2.0)
[warn] +- org.apache.hadoop:hadoop-yarn-registry:3.2.0 (depends on 3.2.0)
[warn] +- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:0.1.2 (depends on 2.6.0)
[warn]
[warn] * com.nimbusds:nimbus-jose-jwt:4.41.1 is selected over 3.10
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on 4.41.1)
[warn] +- org.apache.kerby:token-provider:1.0.1 (depends on 3.10)
[warn]
[warn] * net.minidev:json-smart:2.3 is selected over [1.3.1,2.3]
[warn] +- com.nimbusds:nimbus-jose-jwt:4.41.1 (depends on [1.3.1,2.3])
[warn] +- org.apache.hadoop:hadoop-auth:3.2.0 (depends on [1.3.1,2.3])
[warn]
[warn] Run 'evicted' to see detailed eviction warnings
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/DirectKafkaStreamSuite.scala:258: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] s.consumer.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/DirectKafkaStreamSuite.scala:314: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn] s.consumer.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/DirectKafkaStreamSuite.scala:478: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] consumer.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/test/scala/org/apache/spark/streaming/kinesis/KinesisInputDStreamBuilderSuite.scala:182: method initialPositionInStream in class Builder is deprecated (since 2.3.0): use initialPosition(initialPosition: KinesisInitialPosition)
[warn] .initialPositionInStream(InitialPositionInStream.AT_TIMESTAMP)
[warn] ^
[warn] one warning found
[warn] three warnings found
[info] Note: /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/test/java/org/apache/spark/streaming/kinesis/JavaKinesisInputDStreamBuilderSuite.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/target/spark-streaming-kinesis-asl-3.1.0-SNAPSHOT-tests.jar ...
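Aside: the "Found version conflict(s)" blocks above are sbt eviction reports. When two dependencies pull different versions of the same artifact, sbt selects one (usually the highest) and warns that the losing version may be binary-incompatible; `evicted` in the sbt shell prints the detailed report. If a specific version must win, it can be pinned with `dependencyOverrides`. A hedged build.sbt fragment, with coordinates copied from the report above purely for illustration (whether pinning is safe depends on binary compatibility of the artifacts involved):

```scala
// build.sbt -- pin evicted artifacts to the versions the build actually needs.
// Sketch only: these two coordinates are taken from the eviction report above.
dependencyOverrides ++= Seq(
  "io.dropwizard.metrics" % "metrics-core" % "4.1.1",
  "org.eclipse.jetty"     % "jetty-util"   % "9.4.28.v20200408"
)
```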
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/UnsafeArraySuite.scala:77: abstract type T is unchecked since it is eliminated by erasure
[warn] assert(converted.isInstanceOf[T])
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/UnsafeArraySuite.scala:77: abstract type T is unchecked since it is eliminated by erasure
[warn] assert(converted.isInstanceOf[T])
[warn] ^
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10/target/spark-streaming-kafka-0-10-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/connector/InMemoryTable.scala:103: match may not be exhaustive.
[warn] It would fail on the following inputs: ((_ : Int), TimestampType), ((_ : Int), _), ((_ : Long), DateType), ((_ : Long), _), (??, _), (_, DataType()), (_, DateType), (_, TimestampType), (_, _)
[warn] extractor(ref.fieldNames, schema, row) match {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/connector/InMemoryTable.scala:111: match may not be exhaustive.
[warn] It would fail on the following inputs: ((_ : Int), TimestampType), ((_ : Int), _), ((_ : Long), DateType), ((_ : Long), _), (??, _), (_, DataType()), (_, DateType), (_, TimestampType), (_, _)
[warn] extractor(ref.fieldNames, schema, row) match {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/connector/InMemoryTable.scala:119: match may not be exhaustive.
[warn] It would fail on the following inputs: ((_ : Long), _), (_, TimestampType), (_, _)
[warn] extractor(ref.fieldNames, schema, row) match {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/connector/InMemoryTable.scala:126: match may not be exhaustive.
[warn] It would fail on the following inputs: ((_ : Long), _), (_, TimestampType), (_, _)
[warn] extractor(ref.fieldNames, schema, row) match {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/UDFRegistration.scala:742: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] val udaf = clazz.getConstructor().newInstance().asInstanceOf[UserDefinedAggregateFunction]
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/UDFRegistration.scala:743: method register in class UDFRegistration is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] register(name, udaf)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:330: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] udaf: UserDefinedAggregateFunction,
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:328: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] case class ScalaUDAF(
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DaysWritable.scala:41: class DateWritable in package io is deprecated: see corresponding Javadoc for more information.
[warn] extends DateWritable {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DaysWritable.scala:46: class DateWritable in package io is deprecated: see corresponding Javadoc for more information.
[warn] def this(dateWritable: DateWritable) = {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/InMemoryFileIndex.scala:408: constructor LocatedFileStatus in class LocatedFileStatus is deprecated: see corresponding Javadoc for more information.
[warn] val lfs = new LocatedFileStatus(f.getLen, f.isDirectory, f.getReplication, f.getBlockSize,
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PartitioningAwareFileIndex.scala:224: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (!fs.isDirectory(userDefinedBasePath)) {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:261: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] new org.apache.parquet.hadoop.ParquetInputSplit(
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:272: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:450: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:122: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] LegacyBehaviorPolicy.Value) => RecordReader[Void, T]): RecordReader[Void, T] = {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:127: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] new org.apache.parquet.hadoop.ParquetInputSplit(
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:136: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(conf, filePath, SKIP_ROW_GROUPS).getFileMetaData
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:188: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] split: ParquetInputSplit,
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:219: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] split: ParquetInputSplit,
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/FileStreamSink.scala:47: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isDirectory(hdfsPath)) {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/WriteToMicroBatchDataSource.scala:36: class WriteToDataSourceV2 in package v2 is deprecated (since 2.4.0): Use specific logical plans like AppendData instead
[warn] def createPlan(batchId: Long): WriteToDataSourceV2 = {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/WriteToMicroBatchDataSource.scala:37: class WriteToDataSourceV2 in package v2 is deprecated (since 2.4.0): Use specific logical plans like AppendData instead
[warn] WriteToDataSourceV2(new MicroBatchWrite(batchId, write), query)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/v2.3/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcShimUtils.scala:41: class DateWritable in package io is deprecated: see corresponding Javadoc for more information.
[warn] new DaysWritable(value.asInstanceOf[DateWritable]).gregorianDays
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/v2.3/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcShimUtils.scala:49: class DateWritable in package io is deprecated: see corresponding Javadoc for more information.
[warn] def getDateWritable(reuseObj: Boolean): (SpecializedGetters, Int) => DateWritable = {
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/xml/UDFXPathUtilSuite.scala:88: method writeStringToFile in class FileUtils is deprecated: see corresponding Javadoc for more information.
[warn] FileUtils.writeStringToFile(tempFile, secretValue)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/RebaseDateTimeSuite.scala:332: constructor Date in class Date is deprecated: see corresponding Javadoc for more information.
[warn] val julianDays = fromJavaDateLegacy(new Date(year - 1900, month - 1, day))
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/RebaseDateTimeSuite.scala:350: constructor Timestamp in class Timestamp is deprecated: see corresponding Javadoc for more information.
[warn] val julianMicros = toJulianMicros(new Timestamp(
[warn] ^
[warn] 23 warnings found
[info] Note: /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
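Aside: the RebaseDateTimeSuite warnings above flag the year/month/day constructors of java.sql.Date and java.sql.Timestamp, which have been deprecated since JDK 1.1 because of their error-prone offsets (year - 1900, month - 1). The java.time API takes actual calendar values; a minimal self-contained sketch of the equivalence (sample date values are illustrative):

```scala
import java.sql.{Date, Timestamp}
import java.time.{LocalDate, LocalDateTime}

object DateMigration {
  def main(args: Array[String]): Unit = {
    val (year, month, day) = (2020, 7, 15)
    // Deprecated form: the -1900 / -1 offsets are easy to get wrong.
    val legacy: Date = new Date(year - 1900, month - 1, day)
    // Preferred form: java.time uses the real calendar values directly.
    val modern: Date = Date.valueOf(LocalDate.of(year, month, day))
    assert(legacy.toLocalDate == modern.toLocalDate)
    // Timestamps migrate the same way, via LocalDateTime.
    val ts = Timestamp.valueOf(LocalDateTime.of(year, month, day, 12, 30, 0))
    assert(ts.toLocalDateTime.getHour == 12)
  }
}
```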
[warn] 9 warnings found
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/WriteToMicroBatchDataSource.scala:36: class WriteToDataSourceV2 in package v2 is deprecated (since 2.4.0): Use specific logical plans like AppendData instead
[warn] def createPlan(batchId: Long): WriteToDataSourceV2 = {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/WriteToMicroBatchDataSource.scala:37: class WriteToDataSourceV2 in package v2 is deprecated (since 2.4.0): Use specific logical plans like AppendData instead
[warn] WriteToDataSourceV2(new MicroBatchWrite(batchId, write), query)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PartitioningAwareFileIndex.scala:224: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (!fs.isDirectory(userDefinedBasePath)) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/InMemoryFileIndex.scala:408: constructor LocatedFileStatus in class LocatedFileStatus is deprecated: see corresponding Javadoc for more information.
[warn] val lfs = new LocatedFileStatus(f.getLen, f.isDirectory, f.getReplication, f.getBlockSize,
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetWriteBuilder.scala:90: value ENABLE_JOB_SUMMARY in class ParquetOutputFormat is deprecated: see corresponding Javadoc for more information.
[warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:330: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] udaf: UserDefinedAggregateFunction,
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:330: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] udaf: UserDefinedAggregateFunction,
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:330: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] udaf: UserDefinedAggregateFunction,
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:330: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] udaf: UserDefinedAggregateFunction,
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:328: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] case class ScalaUDAF(
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:330: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] udaf: UserDefinedAggregateFunction,
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/udaf.scala:328: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn] case class ScalaUDAF(
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:127: value ENABLE_JOB_SUMMARY in class ParquetOutputFormat is deprecated: see corresponding Javadoc for more information.
[warn] && conf.get(ParquetOutputFormat.ENABLE_JOB_SUMMARY) == null) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:261: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] new org.apache.parquet.hadoop.ParquetInputSplit(
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:272: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:450: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn] ParquetFileReader.readFooter(
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/v2.3/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcShimUtils.scala:41: class DateWritable in package io is deprecated: see corresponding Javadoc for more information.
[warn] new DaysWritable(value.asInstanceOf[DateWritable]).gregorianDays
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/v2.3/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcShimUtils.scala:49: class DateWritable in package io is deprecated: see corresponding Javadoc for more information.
[warn] def getDateWritable(reuseObj: Boolean): (SpecializedGetters, Int) => DateWritable = {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/FileStreamSink.scala:47: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isDirectory(hdfsPath)) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:122: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] LegacyBehaviorPolicy.Value) => RecordReader[Void, T]): RecordReader[Void, T] = { [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:127: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information. [warn] new org.apache.parquet.hadoop.ParquetInputSplit( [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:136: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information. [warn] ParquetFileReader.readFooter(conf, filePath, SKIP_ROW_GROUPS).getFileMetaData [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:188: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information. [warn] split: ParquetInputSplit, [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetPartitionReaderFactory.scala:219: class ParquetInputSplit in package hadoop is deprecated: see corresponding Javadoc for more information. [warn] split: ParquetInputSplit, [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DaysWritable.scala:41: class DateWritable in package io is deprecated: see corresponding Javadoc for more information. [warn] extends DateWritable { [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DaysWritable.scala:46: class DateWritable in package io is deprecated: see corresponding Javadoc for more information. 
[warn] def this(dateWritable: DateWritable) = { [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/UDFRegistration.scala:742: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method. [warn] val udaf = clazz.getConstructor().newInstance().asInstanceOf[UserDefinedAggregateFunction] [warn] [warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/main/scala/org/apache/spark/sql/UDFRegistration.scala:743: method register in class UDFRegistration is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method. [warn] register(name, udaf) [warn] [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/target/spark-sql-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 28 Scala sources and 1 Java source to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/target/classes... [info] Compiling 18 Scala sources and 1 Java source to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/target/classes... [info] Compiling 2 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/hadoop-cloud/target/classes... [info] Compiling 30 Scala sources and 2 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive/target/classes... [info] Compiling 325 Scala sources and 5 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib/target/classes... [info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/hadoop-cloud/target/spark-hadoop-cloud-3.1.0-SNAPSHOT.jar ... [info] Done packaging. [info] Compiling 2 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/hadoop-cloud/target/test-classes... 
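The repeated `UserDefinedAggregateFunction` deprecations above all point at the same Spark 3.0 replacement: implement an `Aggregator[IN, BUF, OUT]` and register it via `functions.udaf(agg)`. A minimal sketch of that migration (the `SumLong` aggregator and the `sum_long` name are illustrative, not from this build):

```scala
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
import org.apache.spark.sql.expressions.Aggregator
import org.apache.spark.sql.functions.udaf

// Typed replacement for a hypothetical deprecated UDAF that summed longs.
object SumLong extends Aggregator[Long, Long, Long] {
  def zero: Long = 0L                                  // initial buffer
  def reduce(buf: Long, v: Long): Long = buf + v       // fold one input row
  def merge(b1: Long, b2: Long): Long = b1 + b2        // combine partial buffers
  def finish(buf: Long): Long = buf                    // final result
  def bufferEncoder: Encoder[Long] = Encoders.scalaLong
  def outputEncoder: Encoder[Long] = Encoders.scalaLong
}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
// Registration via functions.udaf(agg), as the deprecation message recommends.
spark.udf.register("sum_long", udaf(SumLong))
```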
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/scala/org/apache/spark/sql/avro/AvroFileFormat.scala:99: value ignoreExtension in class AvroOptions is deprecated (since 3.0): Use the general data source option pathGlobFilter for filtering file names
[warn]   if (parsedOptions.ignoreExtension || file.filePath.endsWith(".avro")) {
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/scala/org/apache/spark/sql/avro/AvroUtils.scala:56: value ignoreExtension in class AvroOptions is deprecated (since 3.0): Use the general data source option pathGlobFilter for filtering file names
[warn]   inferAvroSchemaFromFiles(files, conf, parsedOptions.ignoreExtension,
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/scala/org/apache/spark/sql/v2/avro/AvroPartitionReaderFactory.scala:63: value ignoreExtension in class AvroOptions is deprecated (since 3.0): Use the general data source option pathGlobFilter for filtering file names
[warn]   if (parsedOptions.ignoreExtension || partitionedFile.filePath.endsWith(".avro")) {
[warn]   ^
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/hadoop-cloud/target/spark-hadoop-cloud-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:139: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn]   consumer.poll(0)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:538: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn]   consumer.poll(0)
[warn]   ^
[warn] three warnings found
[warn] two warnings found
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:55: [unchecked] unchecked call to SparkAvroKeyRecordWriter(Schema,GenericData,CodecFactory,OutputStream,int,Map<String,String>) as a member of the raw type SparkAvroKeyRecordWriter
[warn]   return new SparkAvroKeyRecordWriter(
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/java/org/apache/spark/sql/avro/SparkAvroKeyOutputFormat.java:55: [unchecked] unchecked conversion
[warn]   return new SparkAvroKeyRecordWriter(
[warn]   ^
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Compiling 371 Scala sources and 40 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/target/test-classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/catalyst/target/spark-catalyst-3.1.0-SNAPSHOT-tests.jar ...
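The `consumer.poll(0)` deprecations above refer to the `poll(long)` overload that Kafka 2.0 replaced with `poll(java.time.Duration)`. A hedched sketch of the newer call; the bootstrap server, group id, and topic are placeholders, and a real broker would be needed to actually fetch records:

```scala
import java.time.Duration
import java.util.Properties
import scala.jdk.CollectionConverters._
import org.apache.kafka.clients.consumer.KafkaConsumer

val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")                       // placeholder broker
props.put("group.id", "example-group")                                 // placeholder group
props.put("key.deserializer",
  "org.apache.kafka.common.serialization.StringDeserializer")
props.put("value.deserializer",
  "org.apache.kafka.common.serialization.StringDeserializer")

val consumer = new KafkaConsumer[String, String](props)
consumer.subscribe(List("example-topic").asJava)
// poll(Duration) replaces the deprecated poll(long); Duration.ZERO mirrors poll(0).
val records = consumer.poll(Duration.ZERO)
```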
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/scala/org/apache/spark/sql/v2/avro/AvroPartitionReaderFactory.scala:63: value ignoreExtension in class AvroOptions is deprecated (since 3.0): Use the general data source option pathGlobFilter for filtering file names
[warn]   if (parsedOptions.ignoreExtension || partitionedFile.filePath.endsWith(".avro")) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/scala/org/apache/spark/sql/avro/AvroFileFormat.scala:99: value ignoreExtension in class AvroOptions is deprecated (since 3.0): Use the general data source option pathGlobFilter for filtering file names
[warn]   if (parsedOptions.ignoreExtension || file.filePath.endsWith(".avro")) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/src/main/scala/org/apache/spark/sql/avro/AvroUtils.scala:56: value ignoreExtension in class AvroOptions is deprecated (since 3.0): Use the general data source option pathGlobFilter for filtering file names
[warn]   inferAvroSchemaFromFiles(files, conf, parsedOptions.ignoreExtension,
[warn]
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/target/spark-avro-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:139: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn]   consumer.poll(0)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala:538: method poll in trait Consumer is deprecated: see corresponding Javadoc for more information.
[warn]   consumer.poll(0)
[warn]
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/target/spark-sql-kafka-0-10-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[warn] there were 7 deprecation warnings; re-run with -deprecation for details
[warn] one warning found
[info] Note: /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive/src/main/java/org/apache/hadoop/hive/ql/io/orc/SparkOrcNewRecordReader.java uses or overrides a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive/target/spark-hive-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 26 Scala sources and 175 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive-thriftserver/target/classes...
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive-thriftserver/target/spark-hive-thriftserver-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib/target/spark-mllib-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 4 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/repl/target/classes...
[info] Compiling 200 Scala sources and 137 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/examples/target/scala-2.12/classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/repl/target/spark-repl-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 3 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/repl/target/test-classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/assembly/target/scala-2.12/jars/spark-assembly_2.12-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/assembly/target/scala-2.12/jars/spark-assembly_2.12-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/repl/target/spark-repl-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/examples/target/scala-2.12/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/examples/target/scala-2.12/jars/spark-examples_2.12-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamingQuerySuite.scala:705: a pure expression does nothing in statement position; multiline expressions might require enclosing parentheses
[warn]   q1
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:119: method explode in class Dataset is deprecated (since 2.0.0): use flatMap() or select() with functions.explode() instead
[warn]   df.explode("words", "word") { word: String => word.split(" ").toSeq }.select('word),
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:127: method explode in class Dataset is deprecated (since 2.0.0): use flatMap() or select() with functions.explode() instead
[warn]   df.explode('letters) {
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:309: method explode in class Dataset is deprecated (since 2.0.0): use flatMap() or select() with functions.explode() instead
[warn]   df.explode($"*") { case Row(prefix: String, csv: String) =>
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:316: method explode in class Dataset is deprecated (since 2.0.0): use flatMap() or select() with functions.explode() instead
[warn]   df.explode('prefix, 'csv) { case Row(prefix: String, csv: String) =>
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DataFrameWindowFunctionsSuite.scala:367: class UserDefinedAggregateFunction in package expressions is deprecated (since 3.0.0): Aggregator[IN, BUF, OUT] should now be registered as a UDF via the functions.udaf(agg) method.
[warn]   val udaf = new UserDefinedAggregateFunction {
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DatasetBenchmark.scala:242: object typed in package scalalang is deprecated (since 3.0.0): please use untyped builtin aggregate functions.
[warn]   df.as[Data].select(typed.sumLong((d: Data) => d.l)).queryExecution.toRdd.foreach(_ => ())
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:52: method toDegrees in object functions is deprecated (since 2.1.0): Use degrees
[warn]   testOneToOneMathFunction(toDegrees, math.toDegrees)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:59: method toDegrees in object functions is deprecated (since 2.1.0): Use degrees
[warn]   Seq(0).toDF().select(toDegrees(lit(0)), toDegrees(lit(1)), toDegrees(lit(1.5)))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:59: method toDegrees in object functions is deprecated (since 2.1.0): Use degrees
[warn]   Seq(0).toDF().select(toDegrees(lit(0)), toDegrees(lit(1)), toDegrees(lit(1.5)))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:59: method toDegrees in object functions is deprecated (since 2.1.0): Use degrees
[warn]   Seq(0).toDF().select(toDegrees(lit(0)), toDegrees(lit(1)), toDegrees(lit(1.5)))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:63: method toDegrees in object functions is deprecated (since 2.1.0): Use degrees
[warn]   df.select(toDegrees("a"))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:69: method toRadians in object functions is deprecated (since 2.1.0): Use radians
[warn]   testOneToOneMathFunction(toRadians, math.toRadians)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:76: method toRadians in object functions is deprecated (since 2.1.0): Use radians
[warn]   Seq(0).toDF().select(toRadians(lit(0)), toRadians(lit(1)), toRadians(lit(1.5)))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:76: method toRadians in object functions is deprecated (since 2.1.0): Use radians
[warn]   Seq(0).toDF().select(toRadians(lit(0)), toRadians(lit(1)), toRadians(lit(1.5)))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:76: method toRadians in object functions is deprecated (since 2.1.0): Use radians
[warn]   Seq(0).toDF().select(toRadians(lit(0)), toRadians(lit(1)), toRadians(lit(1.5)))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:80: method toRadians in object functions is deprecated (since 2.1.0): Use radians
[warn]   df.select(toRadians("a"))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:91: method approxCountDistinct in object functions is deprecated (since 2.1.0): Use approx_count_distinct
[warn]   df.select(approxCountDistinct("a")))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:101: method monotonicallyIncreasingId in object functions is deprecated (since 2.0.0): Use monotonically_increasing_id()
[warn]   df.select(monotonicallyIncreasingId(), expr("monotonically_increasing_id()")),
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:113: method !== in class Column is deprecated (since 2.0.0): !== does not have the same precedence as ===, use =!= instead
[warn]   nullData.filter($"b" !== 1),
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:116: method !== in class Column is deprecated (since 2.0.0): !== does not have the same precedence as ===, use =!= instead
[warn]   checkAnswer(nullData.filter($"b" !== null), Nil)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:119: method !== in class Column is deprecated (since 2.0.0): !== does not have the same precedence as ===, use =!= instead
[warn]   nullData.filter($"a" !== $"b"),
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:125: method registerTempTable in class Dataset is deprecated (since 2.0.0): Use createOrReplaceTempView(viewName) instead.
[warn]   Seq(1).toDF().registerTempTable("t")
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:132: constructor SQLContext in class SQLContext is deprecated (since 2.0.0): Use SparkSession.builder instead
[warn]   val sqlContext = new SQLContext(sc)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:133: method setActive in object SQLContext is deprecated (since 2.0.0): Use SparkSession.setActiveSession instead
[warn]   SQLContext.setActive(sqlContext)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:135: method clearActive in object SQLContext is deprecated (since 2.0.0): Use SparkSession.clearActiveSession instead
[warn]   SQLContext.clearActive()
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:144: method applySchema in class SQLContext is deprecated (since 1.3.0): Use createDataFrame instead.
[warn]   checkAnswer(sqlContext.applySchema(rowRdd, schema), Row("Jack", 20) :: Row("Marry", 18) :: Nil)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:145: method applySchema in class SQLContext is deprecated (since 1.3.0): Use createDataFrame instead.
[warn]   checkAnswer(sqlContext.applySchema(rowRdd.toJavaRDD(), schema),
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:155: method parquetFile in class SQLContext is deprecated (since 1.4.0): Use read.parquet() instead.
[warn]   val parquetDF = sqlContext.parquetFile(parquetFile)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:166: method jsonFile in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   var jsonDF = sqlContext.jsonFile(jsonFile)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:171: method jsonFile in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonFile(jsonFile, schema)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:175: method jsonFile in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonFile(jsonFile, 0.9)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:180: method jsonRDD in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonRDD(jsonRDD)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:182: method jsonRDD in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonRDD(jsonRDD.toJavaRDD())
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:187: method jsonRDD in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonRDD(jsonRDD, schema)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:189: method jsonRDD in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonRDD(jsonRDD.toJavaRDD(), schema)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:193: method jsonRDD in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonRDD(jsonRDD, 0.9)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:195: method jsonRDD in class SQLContext is deprecated (since 1.4.0): Use read.json() instead.
[warn]   jsonDF = sqlContext.jsonRDD(jsonRDD.toJavaRDD(), 0.9)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:207: method load in class SQLContext is deprecated (since 1.4.0): Use read.load(path) instead.
[warn]   var loadDF = sqlContext.load(path)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:210: method load in class SQLContext is deprecated (since 1.4.0): Use read.format(source).load(path) instead.
[warn]   loadDF = sqlContext.load(path, "parquet")
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:213: method load in class SQLContext is deprecated (since 1.4.0): Use read.format(source).options(options).load() instead.
[warn]   loadDF = sqlContext.load("parquet", Map("path" -> path))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/DeprecatedAPISuite.scala:216: method load in class SQLContext is deprecated (since 1.4.0): Use read.format(source).schema(schema).options(options).load() instead.
[warn]   loadDF = sqlContext.load("parquet", expectDF.schema, Map("path" -> path))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/UDFSuite.scala:145: method udf in object functions is deprecated (since 3.0.0): Scala `udf` method with return type parameter is deprecated. Please use Scala `udf` method without return type parameter.
[warn]   val bar = udf(() => Math.random(), DataTypes.DoubleType).asNondeterministic()
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/UDFSuite.scala:454: method udf in object functions is deprecated (since 3.0.0): Scala `udf` method with return type parameter is deprecated. Please use Scala `udf` method without return type parameter.
[warn]   val f = udf((x: Int) => x, IntegerType)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/UDFSuite.scala:459: method udf in object functions is deprecated (since 3.0.0): Scala `udf` method with return type parameter is deprecated. Please use Scala `udf` method without return type parameter.
[warn]   val f2 = udf((x: Double) => x, DoubleType)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/UDFSuite.scala:468: method udf in object functions is deprecated (since 3.0.0): Scala `udf` method with return type parameter is deprecated. Please use Scala `udf` method without return type parameter.
[warn]   val e = intercept[AnalysisException](udf((x: Int) => x, IntegerType))
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetCompatibilityTest.scala:49: method readAllFootersInParallel in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn]   ParquetFileReader.readAllFootersInParallel(hadoopConf, parquetFiles, true)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetInteroperabilitySuite.scala:177: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn]   ParquetFileReader.readFooter(hadoopConf, part.getPath, NO_FILTER)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:113: method writeMetadataFile in class ParquetFileWriter is deprecated: see corresponding Javadoc for more information.
[warn]   ParquetFileWriter.writeMetadataFile(configuration, path, Seq(footer).asJava)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:128: method writeMetadataFile in class ParquetFileWriter is deprecated: see corresponding Javadoc for more information.
[warn]   ParquetFileWriter.writeMetadataFile(configuration, path, Seq(footer).asJava)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:134: method readAllFootersInParallel in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn]   ParquetFileReader.readAllFootersInParallel(configuration, fs.getFileStatus(path)).asScala.toSeq
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:138: method readFooter in class ParquetFileReader is deprecated: see corresponding Javadoc for more information.
[warn]   ParquetFileReader.readFooter(
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/internal/DeprecatedCreateExternalTableSuite.scala:33: method createExternalTable in class Catalog is deprecated (since 2.2.0): use createTable instead.
[warn]   spark.catalog.createExternalTable(
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/internal/DeprecatedCreateExternalTableSuite.scala:52: method createExternalTable in class Catalog is deprecated (since 2.2.0): use createTable instead.
[warn]   spark.catalog.createExternalTable(
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/internal/DeprecatedCreateExternalTableSuite.scala:71: method createExternalTable in class Catalog is deprecated (since 2.2.0): use createTable instead.
[warn]   spark.catalog.createExternalTable(
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCSuite.scala:1707: method jdbc in class SQLContext is deprecated (since 1.4.0): Use read.jdbc() instead.
[warn]   var jdbcDF = sqlContext.jdbc(urlWithUserAndPass, "TEST.PEOPLE")
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCSuite.scala:1710: method jdbc in class SQLContext is deprecated (since 1.4.0): Use read.jdbc() instead.
[warn]   jdbcDF = sqlContext.jdbc(urlWithUserAndPass, "TEST.PEOPLE", "THEID", 0, 4, 3)
[warn]   ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCSuite.scala:1715: method jdbc in class SQLContext is deprecated (since 1.4.0): Use read.jdbc() instead.
[warn] jdbcDF = sqlContext.jdbc(urlWithUserAndPass, "TEST.PEOPLE", parts)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/src/test/scala/org/apache/spark/status/api/v1/sql/SqlResourceSuite.scala:58: Reference to uninitialized value edges
[warn] SparkPlanGraph(nodes, edges).allNodes.filterNot(_.name == WHOLE_STAGE_CODEGEN_1)
[warn] ^
[warn] 59 warnings found
[info] Note: Some input files use or override a deprecated API.
[info] Note: Recompile with -Xlint:deprecation for details.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Compiling 8 Scala sources and 1 Java source to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/target/test-classes...
[info] Compiling 101 Scala sources and 17 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive/target/test-classes...
[info] Compiling 20 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/target/test-classes...
[info] Compiling 205 Scala sources and 66 Java sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib/target/test-classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/core/target/spark-sql-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:469: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] kc.poll(0)
[warn] ^
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala:483: method poll in class KafkaConsumer is deprecated: see corresponding Javadoc for more information.
[warn] kc.poll(0)
[warn] ^
[warn] Multiple main classes detected.
Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/avro/target/spark-avro-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] two warnings found
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kafka-0-10-sql/target/spark-sql-kafka-0-10-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[warn] there were 14 deprecation warnings
[warn] there were two deprecation warnings (since 2.0.0)
[warn] there were 7 deprecation warnings (since 3.0.0)
[warn] there were 23 deprecation warnings in total; re-run with -deprecation for details
[warn] four warnings found
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:464: [unchecked] unchecked cast
[warn] setLint((List<Integer>)value);
[warn] ^
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Compiling 18 Scala sources to /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive-thriftserver/target/test-classes...
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive/target/spark-hive-3.1.0-SNAPSHOT-tests.jar ...
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/CliSuite.scala:101: match may not be exhaustive.
[warn] It would fail on the following input: Nil
[warn] val queryEcho = query.split("\n").toList match {
[warn] ^
[info] Done packaging.
[warn] one warning found
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/sql/hive-thriftserver/target/spark-hive-thriftserver-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/mllib/target/spark-mllib-3.1.0-SNAPSHOT-tests.jar ...
[info] Done packaging.
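The CliSuite warning above ("match may not be exhaustive … It would fail on the following input: Nil") comes from pattern matching on `query.split("\n").toList` without a `Nil` case. A minimal standalone sketch of the pattern and one way to silence the warning (hypothetical names; the actual suite code differs):

```scala
// Hypothetical reconstruction of the warned-about pattern: matching a List
// produced by split("\n") without covering the empty-list case.
def queryEcho(query: String): Seq[String] =
  query.split("\n").toList match {
    case Nil => Seq.empty            // covering Nil makes the match exhaustive
    case head :: tail =>
      head +: tail.map(l => s"> $l") // echo continuation lines with a prompt
  }
```

In practice `split("\n")` never yields an empty array (even `"".split("\n")` returns `Array("")`), so the `Nil` branch is unreachable and the warning is benign, but adding the case keeps the compiler satisfied without changing behavior.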
[success] Total time: 395 s, completed Jul 21, 2020 6:39:33 AM
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/util/Utils.scala:763: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isFile(path)) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala:168: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] val f = () => FileSystem.getAllStatistics.asScala.map(_.getThreadStatistics.getBytesRead).sum
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala:199: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] val threadStats = FileSystem.getAllStatistics.asScala.map(_.getThreadStatistics)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala:33: method getAllStatistics in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] FileSystem.getAllStatistics.asScala.find(s => s.getScheme.equals(scheme))
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/SparkContext.scala:1867: method isDirectory in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (fs.isDirectory(hadoopPath)) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:806: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (!fs.isFile(new Path(inProgressLog))) {
[warn]
[warn] Multiple main classes detected.
Run 'show discoveredMainClasses' to see the list
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/core/target/spark-core-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/main/scala/org/apache/spark/streaming/util/HdfsUtils.scala:33: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (dfs.isFile(dfsPath)) {
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/main/scala/org/apache/spark/streaming/util/HdfsUtils.scala:61: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] if (!dfs.isFile(dfsPath)) null else throw e
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/streaming/src/main/scala/org/apache/spark/streaming/util/HdfsUtils.scala:95: method isFile in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn] fs.isFile(hdpPath)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala:192: constructor Worker in class Worker is deprecated: see corresponding Javadoc for more information.
[warn] worker = new Worker(recordProcessorFactory, kinesisClientLibConfiguration)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/SparkAWSCredentials.scala:74: method withLongLivedCredentialsProvider in class Builder is deprecated: see corresponding Javadoc for more information.
[warn] .withLongLivedCredentialsProvider(longLivedCreds.provider)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:142: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] private val client = new AmazonKinesisClient(credentials)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:152: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:221: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getRecordsRequest.setRequestCredentials(credentials)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisBackedBlockRDD.scala:242: method setRequestCredentials in class AmazonWebServiceRequest is deprecated: see corresponding Javadoc for more information.
[warn] getShardIteratorRequest.setRequestCredentials(credentials)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:58: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val client = new AmazonKinesisClient(KinesisTestUtils.getAWSCredentials())
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:59: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] client.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:64: constructor AmazonDynamoDBClient in class AmazonDynamoDBClient is deprecated: see corresponding Javadoc for more information.
[warn] val dynamoDBClient = new AmazonDynamoDBClient(new DefaultAWSCredentialsProviderChain())
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisTestUtils.scala:65: method setRegion in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] dynamoDBClient.setRegion(RegionUtils.getRegion(regionName))
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:107: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(credentials)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:108: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpointUrl)
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:223: constructor AmazonKinesisClient in class AmazonKinesisClient is deprecated: see corresponding Javadoc for more information.
[warn] val kinesisClient = new AmazonKinesisClient(new DefaultAWSCredentialsProviderChain())
[warn]
[warn] /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala:224: method setEndpoint in class AmazonWebServiceClient is deprecated: see corresponding Javadoc for more information.
[warn] kinesisClient.setEndpoint(endpoint)
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Checking every *.class/*.jar file's SHA-1.
[warn] Strategy 'discard' was applied to 2 files
[warn] Strategy 'filterDistinctLines' was applied to 11 files
[warn] Strategy 'first' was applied to 50 files
[info] SHA-1: bc70edc1d07f65e5ad2155ccc3cbc462d4ebbfd8
[info] Packaging /home/jenkins/workspace/NewSparkPullRequestBuilder/external/kinesis-asl-assembly/target/scala-2.12/spark-streaming-kinesis-asl-assembly-3.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 18 s, completed Jul 21, 2020 6:39:51 AM
========================================================================
Running PySpark packaging tests
========================================================================
Constructing virtual env for testing
Using conda virtual environments
Testing pip installation with python 3.6
Using /tmp/tmp.7spGqFp6sH for virtualenv
Fetching package metadata ...........
Solving package specifications: .
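Most of the kinesis-asl deprecation warnings above point at AWS SDK v1 client constructors (`new AmazonKinesisClient(...)`, `new AmazonDynamoDBClient(...)`) and post-construction mutators (`setEndpoint`, `setRegion`), which the SDK replaced with fluent client builders. A rough sketch of the builder-style equivalent, assuming `endpointUrl` and `regionName` parameters as in the warned-about files (an illustration, not a drop-in patch for the Spark sources):

```scala
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder
import com.amazonaws.services.kinesis.{AmazonKinesis, AmazonKinesisClientBuilder}

// Deprecated style (what the warnings flag):
//   val client = new AmazonKinesisClient(credentials)
//   client.setEndpoint(endpointUrl)
// Builder style: endpoint and region are fixed at construction time.
def kinesisClient(endpointUrl: String, regionName: String): AmazonKinesis =
  AmazonKinesisClientBuilder.standard()
    .withCredentials(new DefaultAWSCredentialsProviderChain())
    .withEndpointConfiguration(new EndpointConfiguration(endpointUrl, regionName))
    .build()

// The same pattern replaces the deprecated AmazonDynamoDBClient + setRegion pair.
def dynamoClient(regionName: String) =
  AmazonDynamoDBClientBuilder.standard()
    .withCredentials(new DefaultAWSCredentialsProviderChain())
    .withRegion(regionName)
    .build()
```

Because the builder returns an immutable client, this migration also removes the thread-safety caveats that `setEndpoint`/`setRegion` carried.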
Package plan for installation in environment /tmp/tmp.7spGqFp6sH/3.6:

The following NEW packages will be INSTALLED:

    _libgcc_mutex:    0.1-main
    blas:             1.0-mkl
    ca-certificates:  2020.6.24-0
    certifi:          2020.6.20-py36_0
    intel-openmp:     2020.1-217
    ld_impl_linux-64: 2.33.1-h53a641e_7
    libedit:          3.1.20191231-h14c3975_1
    libffi:           3.3-he6710b0_2
    libgcc-ng:        9.1.0-hdf63c60_0
    libgfortran-ng:   7.3.0-hdf63c60_0
    libstdcxx-ng:     9.1.0-hdf63c60_0
    mkl:              2020.1-217
    mkl-service:      2.3.0-py36he904b0f_0
    mkl_fft:          1.1.0-py36h23d657b_0
    mkl_random:       1.1.1-py36h0573a6f_0
    ncurses:          6.2-he6710b0_1
    numpy:            1.18.5-py36ha1c710e_0
    numpy-base:       1.18.5-py36hde5b4d6_0
    openssl:          1.1.1g-h7b6447c_0
    pandas:           1.0.5-py36h0573a6f_0
    pip:              20.1.1-py36_1
    python:           3.6.10-h7579374_2
    python-dateutil:  2.8.1-py_0
    pytz:             2020.1-py_0
    readline:         8.0-h7b6447c_0
    setuptools:       49.2.0-py36_0
    six:              1.15.0-py_0
    sqlite:           3.32.3-h62c20be_0
    tk:               8.6.10-hbc83047_0
    wheel:            0.34.2-py36_0
    xz:               5.2.5-h7b6447c_0
    zlib:             1.2.11-h7b6447c_3

#
# To activate this environment, use:
# > source activate /tmp/tmp.7spGqFp6sH/3.6
#
# To deactivate an active environment, use:
# > source deactivate
#
Creating pip installable source dist
running sdist
running egg_info
creating pyspark.egg-info
writing pyspark.egg-info/PKG-INFO
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing requirements to pyspark.egg-info/requires.txt
writing top-level names to pyspark.egg-info/top_level.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-3.1.0.dev0
creating pyspark-3.1.0.dev0/deps
creating pyspark-3.1.0.dev0/deps/bin
creating pyspark-3.1.0.dev0/deps/data
creating pyspark-3.1.0.dev0/deps/data/graphx
creating pyspark-3.1.0.dev0/deps/data/mllib
creating pyspark-3.1.0.dev0/deps/data/mllib/als
creating pyspark-3.1.0.dev0/deps/data/mllib/images
creating pyspark-3.1.0.dev0/deps/data/mllib/images/origin
creating pyspark-3.1.0.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned
creating pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-3.1.0.dev0/deps/data/mllib/ridge-data
creating pyspark-3.1.0.dev0/deps/data/streaming
creating pyspark-3.1.0.dev0/deps/examples
creating pyspark-3.1.0.dev0/deps/examples/ml
creating pyspark-3.1.0.dev0/deps/examples/mllib
creating pyspark-3.1.0.dev0/deps/examples/sql
creating pyspark-3.1.0.dev0/deps/examples/sql/streaming
creating pyspark-3.1.0.dev0/deps/examples/streaming
creating pyspark-3.1.0.dev0/deps/jars
creating pyspark-3.1.0.dev0/deps/licenses
creating pyspark-3.1.0.dev0/deps/sbin
creating pyspark-3.1.0.dev0/lib
creating pyspark-3.1.0.dev0/pyspark
creating pyspark-3.1.0.dev0/pyspark.egg-info
creating pyspark-3.1.0.dev0/pyspark/cloudpickle
creating pyspark-3.1.0.dev0/pyspark/ml
creating pyspark-3.1.0.dev0/pyspark/ml/linalg
creating pyspark-3.1.0.dev0/pyspark/ml/param
creating pyspark-3.1.0.dev0/pyspark/mllib creating pyspark-3.1.0.dev0/pyspark/mllib/linalg creating pyspark-3.1.0.dev0/pyspark/mllib/stat creating pyspark-3.1.0.dev0/pyspark/python creating pyspark-3.1.0.dev0/pyspark/python/pyspark creating pyspark-3.1.0.dev0/pyspark/resource creating pyspark-3.1.0.dev0/pyspark/sql creating pyspark-3.1.0.dev0/pyspark/sql/avro creating pyspark-3.1.0.dev0/pyspark/sql/pandas creating pyspark-3.1.0.dev0/pyspark/streaming copying files to pyspark-3.1.0.dev0... copying MANIFEST.in -> pyspark-3.1.0.dev0 copying README.md -> pyspark-3.1.0.dev0 copying setup.cfg -> pyspark-3.1.0.dev0 copying setup.py -> pyspark-3.1.0.dev0 copying deps/bin/beeline -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/beeline.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/docker-image-tool.sh -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/find-spark-home -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/find-spark-home.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/load-spark-env.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/load-spark-env.sh -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/pyspark -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/pyspark.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/pyspark2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/run-example -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/run-example.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-class -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-class.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-class2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-shell -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-shell.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-shell2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-sql -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-sql.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-sql2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-submit -> 
pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-submit.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-submit2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/sparkR -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/sparkR.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/sparkR2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/data/graphx/followers.txt -> pyspark-3.1.0.dev0/deps/data/graphx copying deps/data/graphx/users.txt -> pyspark-3.1.0.dev0/deps/data/graphx copying deps/data/mllib/gmm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/kmeans_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/pagerank_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/pic_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.1.0.dev0/deps/data/mllib copying 
deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.1.0.dev0/deps/data/mllib/als copying deps/data/mllib/als/test.data -> pyspark-3.1.0.dev0/deps/data/mllib/als copying deps/data/mllib/images/license.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images copying deps/data/mllib/images/origin/license.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images/origin copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images/origin/kittens copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01 copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.1.0.dev0/deps/data/mllib/ridge-data copying deps/data/streaming/AFINN-111.txt -> pyspark-3.1.0.dev0/deps/data/streaming copying deps/examples/als.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/avro_inputformat.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/kmeans.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/logistic_regression.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/pagerank.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/parquet_inputformat.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/pi.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/sort.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/status_api_demo.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/transitive_closure.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/wordcount.py -> pyspark-3.1.0.dev0/deps/examples copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/als_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/anova_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/anova_test_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying 
deps/examples/ml/binarizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/bucketizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/correlation_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/cross_validator.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/dataframe_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/dct_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/fm_classifier_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/fm_regressor_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/fvalue_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/fvalue_test_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying 
deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/imputer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/index_to_string_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/interaction_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/kmeans_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/lda_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/linearsvc.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/n_gram_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/normalizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying 
deps/examples/ml/one_vs_rest_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/pca_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/pipeline_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/prefixspan_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/rformula_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/robust_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/sql_transformer.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/string_indexer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/summarizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/tf_idf_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/tokenizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/train_validation_split.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/variance_threshold_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml copying 
deps/examples/ml/vector_indexer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.1.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/jars/HikariCP-2.5.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/JLargeArrays-1.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/JTransforms-3.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/accessors-smart-1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/algebra_2.12-2.0.0-M2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aliyun-sdk-oss-2.8.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.5.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arrow-format-0.15.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.15.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.15.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/audience-annotations-0.5.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aws-java-sdk-bundle-1.11.375.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/azure-data-lake-store-sdk-2.2.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/azure-keyvault-core-1.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/azure-storage-7.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/breeze_2.12-1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/cats-kernel_2.12-2.0.0-M4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.9.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-codec-1.11.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-compress-1.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-configuration2-2.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-daemon-1.0.13.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-io-2.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-net-3.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-text-1.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/curator-client-2.12.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/curator-framework-2.13.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.13.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-4.2.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/datanucleus-core-4.1.17.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-4.1.19.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/dnsjava-2.1.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/ehcache-3.3.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/generex-1.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/geronimo-jcache_1.0_spec-1.0-alpha-1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-aliyun-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-auth-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-aws-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-azure-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-azure-datalake-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-client-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-cloud-storage-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-client-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-openstack-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-registry-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-beeline-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-cli-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-common-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-exec-2.3.7-core.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-jdbc-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-llap-client-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-llap-common-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-metastore-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-serde-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-0.23-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-common-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-scheduler-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-storage-api-2.7.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-vector-code-gen-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hk2-api-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/htrace-core4-4.1.0-incubating.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.12.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/istack-commons-runtime-3.0.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-core-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-cbor-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-datatype-jsr310-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-base-2.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-json-provider-2.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.activation-api-1.2.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.annotation-api-1.3.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.inject-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.validation-api-2.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.ws.rs-api-2.1.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.xml.bind-api-2.3.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/janino-3.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javassist-3.25.0-GA.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javax.jdo-3.2.0-m3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.11.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jaxb-runtime-2.3.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jcip-annotations-1.0-1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jdom-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-client-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-common-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-hk2-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-server-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-util-ajax-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/joda-time-2.10.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json-1.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json-smart-2.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-admin-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-client-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-common-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-core-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-crypto-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-identity-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-server-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-simplekdc-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-util-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-asn1-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-config-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-pkix-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-util-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-xdr-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.9.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.9.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.9.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/lz4-java-1.7.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-core-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-graphite-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-jmx-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-json-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-jvm-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/netty-3.10.6.Final.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.47.Final.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/nimbus-jose-jwt-4.41.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/okhttp-2.7.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/py4j-0.10.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/pyrolite-4.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/re2j-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/remotetea-oncrpc-1.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-collection-compat_2.12-2.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.24.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.1.0-SNAPSHOT-tests.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-catalyst-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-core-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-graphx-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-hive-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-kubernetes-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-kvstore-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-launcher-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-mesos-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-mllib-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-mllib-local-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-network-common-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-repl-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-sketch-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-sql-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-streaming-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-tags-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-unsafe-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-yarn-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire-platform_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire-util_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spotbugs-annotations-3.1.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/stax2-api-3.1.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/threeten-extra-1.5.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/token-provider-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/transaction-api-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.8.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/velocity-1.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/wildfly-openssl-1.0.4.Final.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/woodstox-core-5.0.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.15.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.14.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.5-4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis-timeline.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.1.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.1.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.1.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.1.0.dev0/deps/sbin
copying lib/py4j-0.10.9-src.zip -> pyspark-3.1.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.1.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.1.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.1.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.1.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.1.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.1.0.dev0/pyspark.egg-info
copying pyspark/cloudpickle/__init__.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/cloudpickle.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/cloudpickle_fast.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/compat.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle
copying pyspark/ml/__init__.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/functions.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/tree.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.1.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.1.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.1.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.1.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.1.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.1.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.1.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.1.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-3.1.0.dev0/pyspark/python/pyspark
copying pyspark/resource/__init__.py -> pyspark-3.1.0.dev0/pyspark/resource
copying pyspark/resource/information.py -> pyspark-3.1.0.dev0/pyspark/resource
copying pyspark/resource/profile.py -> pyspark-3.1.0.dev0/pyspark/resource
copying pyspark/resource/requests.py -> pyspark-3.1.0.dev0/pyspark/resource
copying pyspark/sql/__init__.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.1.0.dev0/pyspark/sql
copying pyspark/sql/avro/__init__.py -> pyspark-3.1.0.dev0/pyspark/sql/avro
copying pyspark/sql/avro/functions.py -> pyspark-3.1.0.dev0/pyspark/sql/avro
copying pyspark/sql/pandas/__init__.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/conversion.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/functions.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/group_ops.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/map_ops.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/serializers.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/typehints.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/types.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/utils.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/streaming/__init__.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.1.0.dev0/pyspark/streaming
Writing pyspark-3.1.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'pyspark-3.1.0.dev0' (and everything under it)
Installing dist into virtual env
Processing ./python/dist/pyspark-3.1.0.dev0.tar.gz
Collecting py4j==0.10.9
  Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
Building wheels for collected packages: pyspark
  Building wheel for pyspark (setup.py): started
  Building wheel for pyspark (setup.py): finished with status 'done'
  Created wheel for pyspark: filename=pyspark-3.1.0.dev0-py2.py3-none-any.whl size=301437724 sha256=2aca23e221aee1e35676bed917e4bfd44814fbc68b7036dd32cd7912c3fcf722
  Stored in directory: /tmp/pip-ephem-wheel-cache-n61xeibw/wheels/f8/53/33/6dc460dea6d3686f375d2f53a4b8451a5ef26af3ac62212b8c
Successfully built pyspark
Installing collected packages: py4j, pyspark
Successfully installed py4j-0.10.9 pyspark-3.1.0.dev0
Run basic sanity check on pip installed version with spark-submit
20/07/21 06:41:19 WARN NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes where applicable Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 20/07/21 06:41:20 INFO SparkContext: Running Spark version 3.1.0-SNAPSHOT 20/07/21 06:41:20 INFO ResourceUtils: ============================================================== 20/07/21 06:41:20 INFO ResourceUtils: No custom resources configured for spark.driver. 20/07/21 06:41:20 INFO ResourceUtils: ============================================================== 20/07/21 06:41:20 INFO SparkContext: Submitted application: PipSanityCheck 20/07/21 06:41:20 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0) 20/07/21 06:41:20 INFO ResourceProfile: Limiting resource is cpu 20/07/21 06:41:20 INFO ResourceProfileManager: Added ResourceProfile id: 0 20/07/21 06:41:20 INFO SecurityManager: Changing view acls to: jenkins 20/07/21 06:41:20 INFO SecurityManager: Changing modify acls to: jenkins 20/07/21 06:41:20 INFO SecurityManager: Changing view acls groups to: 20/07/21 06:41:20 INFO SecurityManager: Changing modify acls groups to: 20/07/21 06:41:20 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jenkins); groups with view permissions: Set(); users with modify permissions: Set(jenkins); groups with modify permissions: Set() 20/07/21 06:41:21 INFO Utils: Successfully started service 'sparkDriver' on port 35170. 
20/07/21 06:41:21 INFO SparkEnv: Registering MapOutputTracker 20/07/21 06:41:21 INFO SparkEnv: Registering BlockManagerMaster 20/07/21 06:41:21 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 20/07/21 06:41:21 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 20/07/21 06:41:21 INFO SparkEnv: Registering BlockManagerMasterHeartbeat 20/07/21 06:41:21 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-4e528af6-8c18-4320-ae21-41e198fb1756 20/07/21 06:41:21 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB 20/07/21 06:41:21 INFO SparkEnv: Registering OutputCommitCoordinator 20/07/21 06:41:21 INFO log: Logging initialized @3533ms to org.eclipse.jetty.util.log.Slf4jLog 20/07/21 06:41:21 INFO Server: jetty-9.4.28.v20200408; built: 2020-04-08T17:49:39.557Z; git: ab228fde9e55e9164c738d7fa121f8ac5acd51c9; jvm 1.8.0_191-b12 20/07/21 06:41:21 INFO Server: Started @3647ms 20/07/21 06:41:21 INFO AbstractConnector: Started ServerConnector@11d72247{HTTP/1.1, (http/1.1)}{0.0.0.0:4040} 20/07/21 06:41:21 INFO Utils: Successfully started service 'SparkUI' on port 4040. 
20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@20888b78{/jobs,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3551d293{/jobs/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@505e7816{/jobs/job,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@35f15da5{/jobs/job/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7a3ffddb{/stages,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@33eb02ee{/stages/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3841630c{/stages/stage,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3e6864e0{/stages/stage/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@69ab1d2a{/stages/pool,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@87d8a3a{/stages/pool/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@19a7e8d7{/storage,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@75d03fa0{/storage/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5cd96c46{/storage/rdd,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7b94f08b{/storage/rdd/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@603ece41{/environment,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5cc0a302{/environment/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: 
Started o.e.j.s.ServletContextHandler@2cd3759a{/executors,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1567dc86{/executors/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5791dbca{/executors/threadDump,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5534ec0{/executors/threadDump/json,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@74db5491{/static,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1393b046{/,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3908ed88{/api,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1df9f164{/jobs/job/kill,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7a22c531{/stages/stage/kill,null,AVAILABLE,@Spark} 20/07/21 06:41:21 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://amp-jenkins-worker-04.amp:4040 20/07/21 06:41:21 INFO Executor: Starting executor ID driver on host amp-jenkins-worker-04.amp 20/07/21 06:41:21 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37075. 
20/07/21 06:41:21 INFO NettyBlockTransferService: Server created on amp-jenkins-worker-04.amp:37075 20/07/21 06:41:21 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 20/07/21 06:41:21 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-04.amp, 37075, None) 20/07/21 06:41:21 INFO BlockManagerMasterEndpoint: Registering block manager amp-jenkins-worker-04.amp:37075 with 366.3 MiB RAM, BlockManagerId(driver, amp-jenkins-worker-04.amp, 37075, None) 20/07/21 06:41:21 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-04.amp, 37075, None) 20/07/21 06:41:21 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-04.amp, 37075, None) 20/07/21 06:41:21 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@72f5ac3b{/metrics/json,null,AVAILABLE,@Spark} 20/07/21 06:41:22 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse'). 20/07/21 06:41:22 INFO SharedState: Warehouse path is 'file:/spark-warehouse'. 
20/07/21 06:41:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5f6c3552{/SQL,null,AVAILABLE,@Spark} 20/07/21 06:41:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@a5137b9{/SQL/json,null,AVAILABLE,@Spark} 20/07/21 06:41:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@64bfdb12{/SQL/execution,null,AVAILABLE,@Spark} 20/07/21 06:41:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@614e42ea{/SQL/execution/json,null,AVAILABLE,@Spark} 20/07/21 06:41:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@492319a4{/static/sql,null,AVAILABLE,@Spark} 20/07/21 06:41:23 INFO SparkContext: Starting job: reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29 20/07/21 06:41:23 INFO DAGScheduler: Got job 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29) with 10 output partitions 20/07/21 06:41:23 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29) 20/07/21 06:41:23 INFO DAGScheduler: Parents of final stage: List() 20/07/21 06:41:23 INFO DAGScheduler: Missing parents: List() 20/07/21 06:41:23 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29), which has no missing parents 20/07/21 06:41:23 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 6.5 KiB, free 366.3 MiB) 20/07/21 06:41:23 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.1 KiB, free 366.3 MiB) 20/07/21 06:41:23 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on amp-jenkins-worker-04.amp:37075 (size: 4.1 KiB, free: 366.3 MiB) 20/07/21 06:41:23 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1293 20/07/21 06:41:23 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at 
/home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9)) 20/07/21 06:41:23 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks resource profile 0 20/07/21 06:41:23 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, amp-jenkins-worker-04.amp, executor driver, partition 0, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, amp-jenkins-worker-04.amp, executor driver, partition 1, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, amp-jenkins-worker-04.amp, executor driver, partition 2, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, amp-jenkins-worker-04.amp, executor driver, partition 3, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, amp-jenkins-worker-04.amp, executor driver, partition 4, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, amp-jenkins-worker-04.amp, executor driver, partition 5, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, amp-jenkins-worker-04.amp, executor driver, partition 6, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, amp-jenkins-worker-04.amp, executor driver, partition 7, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, amp-jenkins-worker-04.amp, executor driver, partition 8, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO 
TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, amp-jenkins-worker-04.amp, executor driver, partition 9, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map() 20/07/21 06:41:23 INFO Executor: Running task 2.0 in stage 0.0 (TID 2) 20/07/21 06:41:23 INFO Executor: Running task 5.0 in stage 0.0 (TID 5) 20/07/21 06:41:23 INFO Executor: Running task 9.0 in stage 0.0 (TID 9) 20/07/21 06:41:23 INFO Executor: Running task 8.0 in stage 0.0 (TID 8) 20/07/21 06:41:23 INFO Executor: Running task 3.0 in stage 0.0 (TID 3) 20/07/21 06:41:23 INFO Executor: Running task 1.0 in stage 0.0 (TID 1) 20/07/21 06:41:23 INFO Executor: Running task 6.0 in stage 0.0 (TID 6) 20/07/21 06:41:23 INFO Executor: Running task 4.0 in stage 0.0 (TID 4) 20/07/21 06:41:23 INFO Executor: Running task 7.0 in stage 0.0 (TID 7) 20/07/21 06:41:23 INFO Executor: Running task 0.0 in stage 0.0 (TID 0) 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1194, boot = 1146, init = 48, finish = 0 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1194, boot = 1140, init = 54, finish = 0 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1196, boot = 1154, init = 41, finish = 1 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1194, boot = 1150, init = 44, finish = 0 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1197, boot = 1157, init = 40, finish = 0 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1200, boot = 1159, init = 41, finish = 0 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1203, boot = 1162, init = 41, finish = 0 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1207, boot = 1165, init = 41, finish = 1 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1210, boot = 1168, init = 41, finish = 1 20/07/21 06:41:25 INFO PythonRunner: Times: total = 1213, boot = 1171, init = 41, finish = 1 20/07/21 06:41:25 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 1550 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 
1550 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1550 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1549 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1550 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1550 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1549 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1550 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1549 bytes result sent to driver 20/07/21 06:41:25 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 1550 bytes result sent to driver 20/07/21 06:41:25 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 1783 ms on amp-jenkins-worker-04.amp (executor driver) (1/10) 20/07/21 06:41:25 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 1787 ms on amp-jenkins-worker-04.amp (executor driver) (2/10) 20/07/21 06:41:25 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 1784 ms on amp-jenkins-worker-04.amp (executor driver) (3/10) 20/07/21 06:41:25 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1811 ms on amp-jenkins-worker-04.amp (executor driver) (4/10) 20/07/21 06:41:25 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 1788 ms on amp-jenkins-worker-04.amp (executor driver) (5/10) 20/07/21 06:41:25 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 1789 ms on amp-jenkins-worker-04.amp (executor driver) (6/10) 20/07/21 06:41:25 INFO PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 37707 20/07/21 06:41:25 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1796 ms on amp-jenkins-worker-04.amp (executor driver) (7/10) 20/07/21 
06:41:25 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 1790 ms on amp-jenkins-worker-04.amp (executor driver) (8/10)
20/07/21 06:41:25 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 1794 ms on amp-jenkins-worker-04.amp (executor driver) (9/10)
20/07/21 06:41:25 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 1791 ms on amp-jenkins-worker-04.amp (executor driver) (10/10)
20/07/21 06:41:25 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/07/21 06:41:25 INFO DAGScheduler: ResultStage 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29) finished in 2.200 s
20/07/21 06:41:25 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
20/07/21 06:41:25 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
20/07/21 06:41:25 INFO DAGScheduler: Job 0 finished: reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29, took 2.256970 s
Successfully ran pip sanity check
20/07/21 06:41:25 INFO AbstractConnector: Stopped Spark@11d72247{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
20/07/21 06:41:25 INFO SparkUI: Stopped Spark web UI at http://amp-jenkins-worker-04.amp:4040
20/07/21 06:41:25 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/07/21 06:41:25 INFO MemoryStore: MemoryStore cleared
20/07/21 06:41:25 INFO BlockManager: BlockManager stopped
20/07/21 06:41:25 INFO BlockManagerMaster: BlockManagerMaster stopped
20/07/21 06:41:25 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/07/21 06:41:25 INFO SparkContext: Successfully stopped SparkContext
20/07/21 06:41:26 INFO ShutdownHookManager: Shutdown hook called
20/07/21 06:41:26 INFO ShutdownHookManager: Deleting directory /tmp/spark-1fb2dbce-f0e5-47bf-8275-425f5db24a65
20/07/21 06:41:26 INFO ShutdownHookManager: Deleting directory /tmp/spark-59f67f8e-0abc-4aa8-89de-cff7ba1bd0b7/pyspark-7e5686ff-0864-49dc-b02b-95d80588534b
20/07/21 06:41:26 INFO ShutdownHookManager: Deleting directory /tmp/spark-59f67f8e-0abc-4aa8-89de-cff7ba1bd0b7
Run basic sanity check with import based
20/07/21 06:41:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[Stage 0:> (0 + 10) / 10]
Successfully ran pip sanity check
Run the tests for context.py
20/07/21 06:41:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/07/21 06:41:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
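[Editor's note] The `PythonRunner: Times` entries earlier in this run (e.g. `total = 1194, boot = 1146, init = 48, finish = 0`) split each task's wall-clock milliseconds into worker boot, init, and finish phases; in every row shown the three phases sum to the reported total, with boot (cold Python-worker startup on this first stage) dominating. A stdlib-only check over three rows copied from the log above — the reading of the field names is taken from the log itself, not from Spark documentation:

```python
# (total, boot, init, finish) in milliseconds, copied verbatim from the
# "PythonRunner: Times" log entries above. `boot` is the time spent
# starting the Python worker process, which dominates on a cold first stage.
rows = [
    (1194, 1146, 48, 0),
    (1196, 1154, 41, 1),
    (1213, 1171, 41, 1),
]
for total, boot, init, finish in rows:
    # The reported total is exactly the sum of the three phases.
    assert total == boot + init + finish
    # Worker startup accounts for well over 90% of each task here.
    assert boot > 0.9 * total
```

This is why the whole 10-task stage takes ~1.8 s per task: almost all of it is interpreter startup, not the reduce itself.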
[Stage 0:> (0 + 4) / 4]
[Stage 10:> (0 + 4) / 4]
[Stage 10:> (0 + 4) / 4][Stage 11:> (0 + 0) / 2]
20/07/21 06:41:54 WARN PythonRunner: Incomplete task 3.0 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker
20/07/21 06:41:54 WARN PythonRunner: Incomplete task 0.0 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
20/07/21 06:41:54 WARN PythonRunner: Incomplete task 2.0 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
20/07/21 06:41:54 WARN PythonRunner: Incomplete task 1.0 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
20/07/21 06:41:54 WARN TaskSetManager: Lost task 1.0 in stage 10.0 (TID 40, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
20/07/21 06:41:54 WARN TaskSetManager: Lost task 3.0 in stage 10.0 (TID 42, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
20/07/21 06:41:54 WARN TaskSetManager: Lost task 2.0 in stage 10.0 (TID 41, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
20/07/21 06:41:54 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 39, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
Testing pip installation with python 3.6
Using /tmp/tmp.7spGqFp6sH for virtualenv
Fetching package metadata ...........
Solving package specifications: .
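[Editor's note] The INFO/WARN entries throughout this build follow the console layout of the `log4j-defaults.properties` profile named above — `yy/MM/dd HH:mm:ss LEVEL Source: message`. When triaging a long run like this (for example, pulling out just the `WARN TaskSetManager` lines), a small parser helps; this regex is a sketch keyed to the lines visible in this log, not an official Spark log-format guarantee:

```python
import re

# Matches the console layout seen in this log, e.g.:
#   20/07/21 06:41:25 INFO DAGScheduler: Job 0 finished: ...
LOG_RE = re.compile(
    r"^(?P<ts>\d{2}/\d{2}/\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) (?P<source>\S+): (?P<msg>.*)$"
)

def parse(line):
    """Return a dict of ts/level/source/msg, or None for non-log lines."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else None

rec = parse("20/07/21 06:41:25 INFO DAGScheduler: Job 0 finished: reduce, took 2.256970 s")
# rec["level"] == "INFO", rec["source"] == "DAGScheduler"
```

Progress-bar fragments like `[Stage 10:> (0 + 4) / 4]` and bare script output (`Successfully ran pip sanity check`) do not match the pattern and come back as `None`, which is exactly what makes the filter useful.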
Package plan for installation in environment /tmp/tmp.7spGqFp6sH/3.6:

The following NEW packages will be INSTALLED:

    _libgcc_mutex:    0.1-main
    blas:             1.0-mkl
    ca-certificates:  2020.6.24-0
    certifi:          2020.6.20-py36_0
    intel-openmp:     2020.1-217
    ld_impl_linux-64: 2.33.1-h53a641e_7
    libedit:          3.1.20191231-h14c3975_1
    libffi:           3.3-he6710b0_2
    libgcc-ng:        9.1.0-hdf63c60_0
    libgfortran-ng:   7.3.0-hdf63c60_0
    libstdcxx-ng:     9.1.0-hdf63c60_0
    mkl:              2020.1-217
    mkl-service:      2.3.0-py36he904b0f_0
    mkl_fft:          1.1.0-py36h23d657b_0
    mkl_random:       1.1.1-py36h0573a6f_0
    ncurses:          6.2-he6710b0_1
    numpy:            1.18.5-py36ha1c710e_0
    numpy-base:       1.18.5-py36hde5b4d6_0
    openssl:          1.1.1g-h7b6447c_0
    pandas:           1.0.5-py36h0573a6f_0
    pip:              20.1.1-py36_1
    python:           3.6.10-h7579374_2
    python-dateutil:  2.8.1-py_0
    pytz:             2020.1-py_0
    readline:         8.0-h7b6447c_0
    setuptools:       49.2.0-py36_0
    six:              1.15.0-py_0
    sqlite:           3.32.3-h62c20be_0
    tk:               8.6.10-hbc83047_0
    wheel:            0.34.2-py36_0
    xz:               5.2.5-h7b6447c_0
    zlib:             1.2.11-h7b6447c_3

#
# To activate this environment, use:
# > source activate /tmp/tmp.7spGqFp6sH/3.6
#
# To deactivate an active environment, use:
# > source deactivate
#
Creating pip installable source dist
running sdist
running egg_info
creating pyspark.egg-info
writing pyspark.egg-info/PKG-INFO
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing requirements to pyspark.egg-info/requires.txt
writing top-level names to pyspark.egg-info/top_level.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular
file) package init file 'deps/examples/__init__.py' not found (or not a regular file) reading manifest file 'pyspark.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*.py[cod]' found anywhere in distribution warning: no previously-included files matching '__pycache__' found anywhere in distribution warning: no previously-included files matching '.DS_Store' found anywhere in distribution writing manifest file 'pyspark.egg-info/SOURCES.txt' running check creating pyspark-3.1.0.dev0 creating pyspark-3.1.0.dev0/deps creating pyspark-3.1.0.dev0/deps/bin creating pyspark-3.1.0.dev0/deps/data creating pyspark-3.1.0.dev0/deps/data/graphx creating pyspark-3.1.0.dev0/deps/data/mllib creating pyspark-3.1.0.dev0/deps/data/mllib/als creating pyspark-3.1.0.dev0/deps/data/mllib/images creating pyspark-3.1.0.dev0/deps/data/mllib/images/origin creating pyspark-3.1.0.dev0/deps/data/mllib/images/origin/kittens creating pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned creating pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned/cls=kittens creating pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01 creating pyspark-3.1.0.dev0/deps/data/mllib/ridge-data creating pyspark-3.1.0.dev0/deps/data/streaming creating pyspark-3.1.0.dev0/deps/examples creating pyspark-3.1.0.dev0/deps/examples/ml creating pyspark-3.1.0.dev0/deps/examples/mllib creating pyspark-3.1.0.dev0/deps/examples/sql creating pyspark-3.1.0.dev0/deps/examples/sql/streaming creating pyspark-3.1.0.dev0/deps/examples/streaming creating pyspark-3.1.0.dev0/deps/jars creating pyspark-3.1.0.dev0/deps/licenses creating pyspark-3.1.0.dev0/deps/sbin creating pyspark-3.1.0.dev0/lib creating pyspark-3.1.0.dev0/pyspark creating pyspark-3.1.0.dev0/pyspark.egg-info creating pyspark-3.1.0.dev0/pyspark/cloudpickle creating pyspark-3.1.0.dev0/pyspark/ml creating pyspark-3.1.0.dev0/pyspark/ml/linalg creating pyspark-3.1.0.dev0/pyspark/ml/param 
creating pyspark-3.1.0.dev0/pyspark/mllib creating pyspark-3.1.0.dev0/pyspark/mllib/linalg creating pyspark-3.1.0.dev0/pyspark/mllib/stat creating pyspark-3.1.0.dev0/pyspark/python creating pyspark-3.1.0.dev0/pyspark/python/pyspark creating pyspark-3.1.0.dev0/pyspark/resource creating pyspark-3.1.0.dev0/pyspark/sql creating pyspark-3.1.0.dev0/pyspark/sql/avro creating pyspark-3.1.0.dev0/pyspark/sql/pandas creating pyspark-3.1.0.dev0/pyspark/streaming copying files to pyspark-3.1.0.dev0... copying MANIFEST.in -> pyspark-3.1.0.dev0 copying README.md -> pyspark-3.1.0.dev0 copying setup.cfg -> pyspark-3.1.0.dev0 copying setup.py -> pyspark-3.1.0.dev0 copying deps/bin/beeline -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/beeline.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/docker-image-tool.sh -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/find-spark-home -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/find-spark-home.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/load-spark-env.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/load-spark-env.sh -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/pyspark -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/pyspark.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/pyspark2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/run-example -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/run-example.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-class -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-class.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-class2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-shell -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-shell.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-shell2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-sql -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-sql.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-sql2.cmd -> pyspark-3.1.0.dev0/deps/bin copying deps/bin/spark-submit -> 
pyspark-3.1.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-3.1.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-3.1.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-3.1.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-3.1.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-3.1.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-3.1.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-3.1.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.1.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.1.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-3.1.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.1.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.1.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-3.1.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-3.1.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/anova_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/anova_test_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/fm_classifier_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/fm_regressor_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/fvalue_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/fvalue_test_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/interaction_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/robust_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/variance_threshold_selector_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.1.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.1.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.1.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.1.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.1.0.dev0/deps/examples/streaming
copying deps/jars/HikariCP-2.5.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/JLargeArrays-1.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/JTransforms-3.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/accessors-smart-1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/algebra_2.12-2.0.0-M2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aliyun-sdk-oss-2.8.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.5.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arrow-format-0.15.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.15.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.15.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/audience-annotations-0.5.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/aws-java-sdk-bundle-1.11.375.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/azure-data-lake-store-sdk-2.2.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/azure-keyvault-core-1.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/azure-storage-7.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/breeze_2.12-1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/cats-kernel_2.12-2.0.0-M4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.9.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-codec-1.11.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-compress-1.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-configuration2-2.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-daemon-1.0.13.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-io-2.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-net-3.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/commons-text-1.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/curator-client-2.12.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/curator-framework-2.13.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.13.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-4.2.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/datanucleus-core-4.1.17.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-4.1.19.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/dnsjava-2.1.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/ehcache-3.3.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/generex-1.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/geronimo-jcache_1.0_spec-1.0-alpha-1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-aliyun-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-auth-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-aws-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-azure-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-azure-datalake-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-client-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-cloud-storage-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-client-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-openstack-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-registry-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-3.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-beeline-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-cli-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-common-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-exec-2.3.7-core.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-jdbc-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-llap-client-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-llap-common-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-metastore-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-serde-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-0.23-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-common-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-shims-scheduler-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-storage-api-2.7.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hive-vector-code-gen-2.3.7.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hk2-api-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/htrace-core4-4.1.0-incubating.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.12.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/istack-commons-runtime-3.0.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-core-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-cbor-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-datatype-jsr310-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-base-2.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-json-provider-2.9.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.10.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.10.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.activation-api-1.2.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.annotation-api-1.3.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.inject-2.6.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.validation-api-2.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.ws.rs-api-2.1.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jakarta.xml.bind-api-2.3.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/janino-3.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javassist-3.25.0-GA.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javax.jdo-3.2.0-m3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.11.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jaxb-runtime-2.3.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jcip-annotations-1.0-1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jdom-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-client-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-common-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-hk2-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jersey-server-2.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-util-ajax-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.28.v20200408.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/joda-time-2.10.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json-1.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json-smart-2.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.6.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-admin-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-client-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-common-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-core-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-crypto-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-identity-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-server-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-simplekdc-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerb-util-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-asn1-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-config-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-pkix-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-util-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kerby-xdr-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.9.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.9.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.9.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/lz4-java-1.7.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-core-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-graphite-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-jmx-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-json-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/metrics-jvm-4.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/netty-3.10.6.Final.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.47.Final.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/nimbus-jose-jwt-4.41.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/okhttp-2.7.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/py4j-0.10.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/pyrolite-4.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/re2j-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/remotetea-oncrpc-1.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-collection-compat_2.12-2.1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.2.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.10.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.30.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.24.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.1.0-SNAPSHOT-tests.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-catalyst-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-core-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-graphx-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-hive-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-kubernetes-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-kvstore-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-launcher-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-mesos-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-mllib-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-mllib-local-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-network-common-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-repl-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-sketch-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-sql-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-streaming-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-tags-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-unsafe-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spark-yarn-3.1.0-SNAPSHOT.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire-platform_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire-util_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.17.0-M1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/spotbugs-annotations-3.1.9.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/stax2-api-3.1.4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/threeten-extra-1.5.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/token-provider-1.0.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/transaction-api-1.1.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.8.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/velocity-1.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/wildfly-openssl-1.0.4.Final.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/woodstox-core-5.0.3.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.15.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.14.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.5-4.jar -> pyspark-3.1.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis-timeline.txt -> pyspark-3.1.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.1.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.1.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.1.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.1.0.dev0/deps/sbin
copying lib/py4j-0.10.9-src.zip -> pyspark-3.1.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.1.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.1.0.dev0/pyspark
copying pyspark/resultiterable.py ->
pyspark-3.1.0.dev0/pyspark copying pyspark/serializers.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/shell.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/shuffle.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/statcounter.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/status.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/storagelevel.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/taskcontext.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/traceback_utils.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/util.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/version.py -> pyspark-3.1.0.dev0/pyspark copying pyspark/worker.py -> pyspark-3.1.0.dev0/pyspark copying pyspark.egg-info/PKG-INFO -> pyspark-3.1.0.dev0/pyspark.egg-info copying pyspark.egg-info/SOURCES.txt -> pyspark-3.1.0.dev0/pyspark.egg-info copying pyspark.egg-info/dependency_links.txt -> pyspark-3.1.0.dev0/pyspark.egg-info copying pyspark.egg-info/requires.txt -> pyspark-3.1.0.dev0/pyspark.egg-info copying pyspark.egg-info/top_level.txt -> pyspark-3.1.0.dev0/pyspark.egg-info copying pyspark/cloudpickle/__init__.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle copying pyspark/cloudpickle/cloudpickle.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle copying pyspark/cloudpickle/cloudpickle_fast.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle copying pyspark/cloudpickle/compat.py -> pyspark-3.1.0.dev0/pyspark/cloudpickle copying pyspark/ml/__init__.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/base.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/classification.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/clustering.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/common.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/evaluation.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/feature.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/fpm.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/functions.py -> pyspark-3.1.0.dev0/pyspark/ml copying 
pyspark/ml/image.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/pipeline.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/recommendation.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/regression.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/stat.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/tree.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/tuning.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/util.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/wrapper.py -> pyspark-3.1.0.dev0/pyspark/ml copying pyspark/ml/linalg/__init__.py -> pyspark-3.1.0.dev0/pyspark/ml/linalg copying pyspark/ml/param/__init__.py -> pyspark-3.1.0.dev0/pyspark/ml/param copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.1.0.dev0/pyspark/ml/param copying pyspark/ml/param/shared.py -> pyspark-3.1.0.dev0/pyspark/ml/param copying pyspark/mllib/__init__.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/classification.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/clustering.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/common.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/evaluation.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/feature.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/fpm.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/random.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/recommendation.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/regression.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/tree.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/util.py -> pyspark-3.1.0.dev0/pyspark/mllib copying pyspark/mllib/linalg/__init__.py -> pyspark-3.1.0.dev0/pyspark/mllib/linalg copying pyspark/mllib/linalg/distributed.py -> pyspark-3.1.0.dev0/pyspark/mllib/linalg copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat copying 
pyspark/mllib/stat/__init__.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat copying pyspark/mllib/stat/_statistics.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat copying pyspark/mllib/stat/distribution.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat copying pyspark/mllib/stat/test.py -> pyspark-3.1.0.dev0/pyspark/mllib/stat copying pyspark/python/pyspark/shell.py -> pyspark-3.1.0.dev0/pyspark/python/pyspark copying pyspark/resource/__init__.py -> pyspark-3.1.0.dev0/pyspark/resource copying pyspark/resource/information.py -> pyspark-3.1.0.dev0/pyspark/resource copying pyspark/resource/profile.py -> pyspark-3.1.0.dev0/pyspark/resource copying pyspark/resource/requests.py -> pyspark-3.1.0.dev0/pyspark/resource copying pyspark/sql/__init__.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/catalog.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/column.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/conf.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/context.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/dataframe.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/functions.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/group.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/readwriter.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/session.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/streaming.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/types.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/udf.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/utils.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/window.py -> pyspark-3.1.0.dev0/pyspark/sql copying pyspark/sql/avro/__init__.py -> pyspark-3.1.0.dev0/pyspark/sql/avro copying pyspark/sql/avro/functions.py -> pyspark-3.1.0.dev0/pyspark/sql/avro copying pyspark/sql/pandas/__init__.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas copying pyspark/sql/pandas/conversion.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas copying 
pyspark/sql/pandas/functions.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/group_ops.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/map_ops.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/serializers.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/typehints.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/types.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/utils.py -> pyspark-3.1.0.dev0/pyspark/sql/pandas
copying pyspark/streaming/__init__.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.1.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.1.0.dev0/pyspark/streaming
Writing pyspark-3.1.0.dev0/setup.cfg
Creating tar archive
removing 'pyspark-3.1.0.dev0' (and everything under it)
Installing dist into virtual env
Obtaining file:///home/jenkins/workspace/NewSparkPullRequestBuilder/python
Collecting py4j==0.10.9
Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
Installing collected packages: py4j, pyspark
Running setup.py develop for pyspark
Successfully installed py4j-0.10.9 pyspark
Run basic sanity check on pip installed version with spark-submit
20/07/21 06:42:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/07/21 06:42:42 INFO SparkContext: Running Spark version 3.1.0-SNAPSHOT
20/07/21 06:42:42 INFO ResourceUtils: ==============================================================
20/07/21 06:42:42 INFO ResourceUtils: No custom resources configured for spark.driver.
20/07/21 06:42:42 INFO ResourceUtils: ==============================================================
20/07/21 06:42:42 INFO SparkContext: Submitted application: PipSanityCheck
20/07/21 06:42:42 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
20/07/21 06:42:42 INFO ResourceProfile: Limiting resource is cpu
20/07/21 06:42:42 INFO ResourceProfileManager: Added ResourceProfile id: 0
20/07/21 06:42:42 INFO SecurityManager: Changing view acls to: jenkins
20/07/21 06:42:42 INFO SecurityManager: Changing modify acls to: jenkins
20/07/21 06:42:42 INFO SecurityManager: Changing view acls groups to:
20/07/21 06:42:42 INFO SecurityManager: Changing modify acls groups to:
20/07/21 06:42:42 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jenkins); groups with view permissions: Set(); users with modify permissions: Set(jenkins); groups with modify permissions: Set()
20/07/21 06:42:43 INFO Utils: Successfully started service 'sparkDriver' on port 35490.
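The PipSanityCheck application submitted above comes from dev/pip-sanity-check.py, whose core (per the job lines later in this log) is a reduce over a 10-partition RDD. A locally runnable mirror of that computation, sketched without a SparkContext; the squared-sum payload and chunking are illustrative assumptions, not a verbatim copy of the script:

```python
from functools import reduce
from operator import add

def local_sanity_check(n=100, partitions=10):
    """Locally mirrors a PySpark check shaped like:
    sc.parallelize(range(n), partitions).map(lambda x: x * x).reduce(add)
    computed chunk by chunk, then combined at the "driver"."""
    # Split range(n) into `partitions` roughly equal chunks, as parallelize() would.
    chunks = [range(i * n // partitions, (i + 1) * n // partitions)
              for i in range(partitions)]
    # map + per-partition reduce, producing one partial result per "task".
    partials = [reduce(add, (x * x for x in chunk), 0) for chunk in chunks]
    # Combine the partial results, as the driver-side reduce() would.
    return reduce(add, partials, 0)

print(local_sanity_check())  # -> 328350, the sum of squares 0..99
```

The distributed run in this log does the same work as 10 tasks of ResultStage 0 and ships each partial result back to the driver.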
20/07/21 06:42:43 INFO SparkEnv: Registering MapOutputTracker
20/07/21 06:42:43 INFO SparkEnv: Registering BlockManagerMaster
20/07/21 06:42:43 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/07/21 06:42:43 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/07/21 06:42:43 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
20/07/21 06:42:43 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-bce45073-778d-4e98-a749-977aa9cac355
20/07/21 06:42:43 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
20/07/21 06:42:43 INFO SparkEnv: Registering OutputCommitCoordinator
20/07/21 06:42:43 INFO log: Logging initialized @3884ms to org.eclipse.jetty.util.log.Slf4jLog
20/07/21 06:42:43 INFO Server: jetty-9.4.28.v20200408; built: 2020-04-08T17:49:39.557Z; git: ab228fde9e55e9164c738d7fa121f8ac5acd51c9; jvm 1.8.0_191-b12
20/07/21 06:42:43 INFO Server: Started @3989ms
20/07/21 06:42:43 INFO AbstractConnector: Started ServerConnector@e39a39{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
20/07/21 06:42:43 INFO Utils: Successfully started service 'SparkUI' on port 4040.
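The 366.3 MiB MemoryStore capacity reported above is derived, not configured directly: Spark's UnifiedMemoryManager takes the JVM's usable max heap, subtracts a fixed 300 MiB reserved slice, and multiplies by spark.memory.fraction (0.6 by default). The ~910.5 MiB usable-heap figure below is an assumption about this particular driver JVM (typical for a default 1 GiB heap), not something stated in the log:

```python
RESERVED_SYSTEM_MEMORY_MIB = 300.0  # Spark's fixed reserved slice
MEMORY_FRACTION = 0.6               # default spark.memory.fraction

def unified_memory_capacity(max_heap_mib):
    """Approximate the storage+execution capacity Spark logs at startup."""
    return (max_heap_mib - RESERVED_SYSTEM_MEMORY_MIB) * MEMORY_FRACTION

# With ~910.5 MiB of usable heap (assumed for this JVM at -Xmx1g):
print(round(unified_memory_capacity(910.5), 1))  # -> 366.3
```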
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@62824b78{/jobs,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@37dbd918{/jobs/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@58eaf55{/jobs/job,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@283cde23{/jobs/job/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@11bbc128{/stages,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@462ac40f{/stages/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@56c09618{/stages/stage,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@261d67f0{/stages/stage/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6367b6d9{/stages/pool,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@20c023b9{/stages/pool/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1e05b912{/storage,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2d9cc924{/storage/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@e0d6e08{/storage/rdd,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4e8feae2{/storage/rdd/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@a29571b{/environment,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@30ee7f55{/environment/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@17a7d338{/executors,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6fa74e88{/executors/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@271de403{/executors/threadDump,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5a0730aa{/executors/threadDump/json,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4a67b72e{/static,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@263ee6a8{/,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@881a5e6{/api,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1758e0c4{/jobs/job/kill,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5a573583{/stages/stage/kill,null,AVAILABLE,@Spark}
20/07/21 06:42:43 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://amp-jenkins-worker-04.amp:4040
20/07/21 06:42:43 INFO Executor: Starting executor ID driver on host amp-jenkins-worker-04.amp
20/07/21 06:42:43 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34553.
20/07/21 06:42:43 INFO NettyBlockTransferService: Server created on amp-jenkins-worker-04.amp:34553
20/07/21 06:42:43 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/07/21 06:42:43 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-04.amp, 34553, None)
20/07/21 06:42:43 INFO BlockManagerMasterEndpoint: Registering block manager amp-jenkins-worker-04.amp:34553 with 366.3 MiB RAM, BlockManagerId(driver, amp-jenkins-worker-04.amp, 34553, None)
20/07/21 06:42:43 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-04.amp, 34553, None)
20/07/21 06:42:43 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-04.amp, 34553, None)
20/07/21 06:42:44 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3e336fb3{/metrics/json,null,AVAILABLE,@Spark}
20/07/21 06:42:44 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
20/07/21 06:42:44 INFO SharedState: Warehouse path is 'file:/spark-warehouse'.
20/07/21 06:42:44 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@42bd35e7{/SQL,null,AVAILABLE,@Spark}
20/07/21 06:42:44 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6561112a{/SQL/json,null,AVAILABLE,@Spark}
20/07/21 06:42:44 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6ff3a529{/SQL/execution,null,AVAILABLE,@Spark}
20/07/21 06:42:44 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@10fbeb8c{/SQL/execution/json,null,AVAILABLE,@Spark}
20/07/21 06:42:44 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6032c4e1{/static/sql,null,AVAILABLE,@Spark}
20/07/21 06:42:47 INFO SparkContext: Starting job: reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29
20/07/21 06:42:47 INFO DAGScheduler: Got job 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29) with 10 output partitions
20/07/21 06:42:47 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29)
20/07/21 06:42:47 INFO DAGScheduler: Parents of final stage: List()
20/07/21 06:42:47 INFO DAGScheduler: Missing parents: List()
20/07/21 06:42:47 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29), which has no missing parents
20/07/21 06:42:47 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 6.5 KiB, free 366.3 MiB)
20/07/21 06:42:48 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.0 KiB, free 366.3 MiB)
20/07/21 06:42:48 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on amp-jenkins-worker-04.amp:34553 (size: 4.0 KiB, free: 366.3 MiB)
20/07/21 06:42:48 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1293
20/07/21 06:42:48 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
20/07/21 06:42:48 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks resource profile 0
20/07/21 06:42:48 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, amp-jenkins-worker-04.amp, executor driver, partition 0, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, amp-jenkins-worker-04.amp, executor driver, partition 1, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, amp-jenkins-worker-04.amp, executor driver, partition 2, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, amp-jenkins-worker-04.amp, executor driver, partition 3, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, amp-jenkins-worker-04.amp, executor driver, partition 4, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, amp-jenkins-worker-04.amp, executor driver, partition 5, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, amp-jenkins-worker-04.amp, executor driver, partition 6, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, amp-jenkins-worker-04.amp, executor driver, partition 7, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, amp-jenkins-worker-04.amp, executor driver, partition 8, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, amp-jenkins-worker-04.amp, executor driver, partition 9, PROCESS_LOCAL, 7333 bytes) taskResourceAssignments Map()
20/07/21 06:42:48 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
20/07/21 06:42:48 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
20/07/21 06:42:48 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
20/07/21 06:42:48 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
20/07/21 06:42:48 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
20/07/21 06:42:48 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
20/07/21 06:42:48 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
20/07/21 06:42:48 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
20/07/21 06:42:48 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
20/07/21 06:42:48 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
20/07/21 06:42:49 INFO PythonRunner: Times: total = 566, boot = 525, init = 41, finish = 0
20/07/21 06:42:49 INFO PythonRunner: Times: total = 565, boot = 520, init = 45, finish = 0
20/07/21 06:42:49 INFO PythonRunner: Times: total = 565, boot = 510, init = 55, finish = 0
20/07/21 06:42:49 INFO PythonRunner: Times: total = 565, boot = 516, init = 49, finish = 0
20/07/21 06:42:49 INFO PythonRunner: Times: total = 571, boot = 529, init = 41, finish = 1
20/07/21 06:42:49 INFO PythonRunner: Times: total = 575, boot = 532, init = 42, finish = 1
20/07/21 06:42:49 INFO PythonRunner: Times: total = 578, boot = 536, init = 42, finish = 0
20/07/21 06:42:49 INFO PythonRunner: Times: total = 583, boot = 540, init = 42, finish = 1
20/07/21 06:42:49 INFO PythonRunner: Times: total = 585, boot = 544, init = 41, finish = 0
20/07/21 06:42:49 INFO PythonRunner: Times: total = 590, boot = 547, init = 43, finish = 0
20/07/21 06:42:49 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 1550 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1549 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1549 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1549 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1593 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1550 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1550 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1550 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 1550 bytes result sent to driver
20/07/21 06:42:49 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1550 bytes result sent to driver
20/07/21 06:42:49 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 994 ms on amp-jenkins-worker-04.amp (executor driver) (1/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1025 ms on amp-jenkins-worker-04.amp (executor driver) (2/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1005 ms on amp-jenkins-worker-04.amp (executor driver) (3/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 1005 ms on amp-jenkins-worker-04.amp (executor driver) (4/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 1007 ms on amp-jenkins-worker-04.amp (executor driver) (5/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 1002 ms on amp-jenkins-worker-04.amp (executor driver) (6/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 1006 ms on amp-jenkins-worker-04.amp (executor driver) (7/10)
20/07/21 06:42:49 INFO PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 50520
20/07/21 06:42:49 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 1005 ms on amp-jenkins-worker-04.amp (executor driver) (8/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 1004 ms on amp-jenkins-worker-04.amp (executor driver) (9/10)
20/07/21 06:42:49 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 1005 ms on amp-jenkins-worker-04.amp (executor driver) (10/10)
20/07/21 06:42:49 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/07/21 06:42:49 INFO DAGScheduler: ResultStage 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29) finished in 2.015 s
20/07/21 06:42:49 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
20/07/21 06:42:49 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
20/07/21 06:42:49 INFO DAGScheduler: Job 0 finished: reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:29, took 2.073157 s
Successfully ran pip sanity check
20/07/21 06:42:49 INFO AbstractConnector: Stopped Spark@e39a39{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
20/07/21 06:42:49 INFO SparkUI: Stopped Spark web UI at http://amp-jenkins-worker-04.amp:4040
20/07/21 06:42:49 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/07/21 06:42:49 INFO MemoryStore: MemoryStore cleared
20/07/21 06:42:49 INFO BlockManager: BlockManager stopped
20/07/21 06:42:49 INFO BlockManagerMaster: BlockManagerMaster stopped
20/07/21 06:42:49 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/07/21 06:42:49 INFO SparkContext: Successfully stopped SparkContext
20/07/21 06:42:50 INFO ShutdownHookManager: Shutdown hook called
20/07/21 06:42:50 INFO ShutdownHookManager: Deleting directory /tmp/spark-e4006587-1f95-4e53-9033-a58afb8ca72b
20/07/21 06:42:50 INFO ShutdownHookManager: Deleting directory /tmp/spark-e4006587-1f95-4e53-9033-a58afb8ca72b/pyspark-b3c0545e-98e0-4f15-bd2a-5892a74fcbd9
20/07/21 06:42:50 INFO ShutdownHookManager: Deleting directory /tmp/spark-75025691-e465-46c4-953b-6bf5652f9534
Run basic sanity check with import based
20/07/21 06:42:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[Stage 0:> (0 + 10) / 10]
[Stage 0:=====> (1 + 9) / 10]
Successfully ran pip sanity check
Run the tests for context.py
20/07/21 06:43:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/07/21 06:43:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[Stage 0:> (0 + 4) / 4]
[Stage 10:> (0 + 4) / 4]
[Stage 10:> (0 + 4) / 4][Stage 11:> (0 + 0) / 2]
20/07/21 06:43:17 WARN PythonRunner: Incomplete task 1.0 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
20/07/21 06:43:17 WARN PythonRunner: Incomplete task 2.0 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
20/07/21 06:43:17 WARN PythonRunner: Incomplete task 3.0 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker
20/07/21 06:43:17 WARN PythonRunner: Incomplete task 0.0 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
20/07/21 06:43:17 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 39, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
20/07/21 06:43:17 WARN TaskSetManager: Lost task 3.0 in stage 10.0 (TID 42, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
20/07/21 06:43:17 WARN TaskSetManager: Lost task 2.0 in stage 10.0 (TID 41, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
20/07/21 06:43:17 WARN TaskSetManager: Lost task 1.0 in stage 10.0 (TID 40, amp-jenkins-worker-04.amp, executor driver): TaskKilled (Stage cancelled)
Cleaning up temporary directory - /tmp/tmp.7spGqFp6sH
Attempting to post to Github...
 > Post successful.
+ ./build/sbt unsafe/test
Archiving artifacts
WARN: No artifacts found that match the file pattern "**/target/unit-tests.log,python/unit-tests.log". Configuration error?
WARN: java.lang.InterruptedException: no matches found within 10000
Recording test results
Finished: SUCCESS
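The PythonRunner "Times" lines earlier in this log (total/boot/init/finish, all in milliseconds) show that almost all of each task's runtime went to booting the Python worker rather than to the computation itself. A small, hypothetical helper for extracting those fields from a console log like this one:

```python
import re

# Matches e.g. "PythonRunner: Times: total = 566, boot = 525, init = 41, finish = 0"
TIMES_RE = re.compile(
    r"PythonRunner: Times: total = (\d+), boot = (\d+), init = (\d+), finish = (\d+)"
)

def parse_python_runner_times(log_text):
    """Return a list of {total, boot, init, finish} dicts (milliseconds)."""
    return [
        dict(zip(("total", "boot", "init", "finish"), map(int, m.groups())))
        for m in TIMES_RE.finditer(log_text)
    ]

sample = ("20/07/21 06:42:49 INFO PythonRunner: Times: "
          "total = 566, boot = 525, init = 41, finish = 0")
print(parse_python_runner_times(sample))
# -> [{'total': 566, 'boot': 525, 'init': 41, 'finish': 0}]
```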