Failed
Console Output

Started by an SCM change
[EnvInject] - Loading node environment variables.
[EnvInject] - Preparing an environment for the build.
[EnvInject] - Keeping Jenkins system variables.
[EnvInject] - Keeping Jenkins build variables.
[EnvInject] - Injecting as environment variables the properties content 
PATH=/home/anaconda/bin:/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.1.1/bin/:/home/android-sdk/:/home/jenkins/.cargo/bin:/home/anaconda/bin:/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.1.1/bin/:/home/android-sdk/:/usr/local/bin:/bin:/usr/bin
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
SPARK_BRANCH=master
AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.6
AMPLAB_JENKINS="true"
SPARK_TESTING=1

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting contributions.
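The injected variables above (e.g. `AMPLAB_JENKINS_BUILD_PROFILE`, `SPARK_TESTING`) steer the downstream test harness. A minimal sketch of how a harness might read them with a fallback default (hypothetical helper, not Spark's actual dev/run-tests code):

```python
import os

def read_build_profile(env=None):
    """Return the Hadoop build profile requested by CI, falling back to a
    known-good profile when the variable is unset."""
    env = env if env is not None else os.environ
    return env.get("AMPLAB_JENKINS_BUILD_PROFILE", "hadoop2.7")

# The environment injected in this build requests hadoop2.6:
profile = read_build_profile({"AMPLAB_JENKINS_BUILD_PROFILE": "hadoop2.6"})
```

Note that a stale value injected here propagates unchecked until a later step validates it, which is exactly what happens at the end of this log.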
Building remotely on amp-jenkins-worker-02 (spark-compile centos spark-test) in workspace /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
 > /home/jenkins/git2/bin/git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > /home/jenkins/git2/bin/git config remote.origin.url https://github.com/apache/spark.git # timeout=10
Fetching upstream changes from https://github.com/apache/spark.git
 > /home/jenkins/git2/bin/git --version # timeout=10
 > /home/jenkins/git2/bin/git fetch --tags --progress https://github.com/apache/spark.git +refs/heads/*:refs/remotes/origin/*
 > /home/jenkins/git2/bin/git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 52f9f66d545093a76d53ebc68104a132a594952f (origin/master)
 > /home/jenkins/git2/bin/git config core.sparsecheckout # timeout=10
 > /home/jenkins/git2/bin/git checkout -f 52f9f66d545093a76d53ebc68104a132a594952f
 > /home/jenkins/git2/bin/git rev-list d47a25f681abd761fb16b6a861d892f1b399daf2 # timeout=10
[spark-master-test-sbt-hadoop-2.6] $ /bin/bash /tmp/hudson1518654860880248065.sh
Removing R/lib/
Removing R/pkg/man/
Removing build/apache-maven-3.5.4/
Removing build/sbt-launch-0.13.17.jar
Removing build/scala-2.11.12/
Removing build/zinc-0.3.15/
Removing dev/__pycache__/
Removing dev/create-release/__pycache__/
Removing dev/lint-r-report.log
Removing dev/pr-deps/
Removing dev/pycodestyle-2.4.0.py
Removing dev/sparktestsupport/__init__.pyc
Removing dev/sparktestsupport/__pycache__/
Removing dev/sparktestsupport/modules.pyc
Removing dev/sparktestsupport/shellutils.pyc
Removing dev/sparktestsupport/toposort.pyc
Removing examples/src/main/python/__pycache__/
Removing examples/src/main/python/ml/__pycache__/
Removing examples/src/main/python/mllib/__pycache__/
Removing examples/src/main/python/sql/__pycache__/
Removing examples/src/main/python/sql/streaming/__pycache__/
Removing examples/src/main/python/streaming/__pycache__/
Removing external/kinesis-asl-assembly/target/
Removing external/kinesis-asl/src/main/python/examples/streaming/__pycache__/
Removing external/kinesis-asl/target/
Removing external/spark-ganglia-lgpl/target/
Removing lib/
Removing project/project/
Removing project/target/
Removing python/__pycache__/
Removing python/docs/__pycache__/
Removing python/docs/_build/
Removing python/pyspark/__pycache__/
Removing python/pyspark/ml/__pycache__/
Removing python/pyspark/ml/linalg/__pycache__/
Removing python/pyspark/ml/param/__pycache__/
Removing python/pyspark/mllib/__pycache__/
Removing python/pyspark/mllib/linalg/__pycache__/
Removing python/pyspark/mllib/stat/__pycache__/
Removing python/pyspark/sql/__pycache__/
Removing python/pyspark/streaming/__pycache__/
Removing python/test_coverage/__pycache__/
Removing python/test_support/__pycache__/
Removing resource-managers/kubernetes/integration-tests/tests/__pycache__/
Removing scalastyle-on-compile.generated.xml
Removing sql/__pycache__/
Removing sql/hive/src/test/resources/data/scripts/__pycache__/
+++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/install-dev.sh
++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
++ pwd
+ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
+ LIB_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib
+ mkdir -p /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib
+ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/find-r.sh
++ '[' -z '' ']'
++ '[' '!' -z '' ']'
+++ command -v R
++ '[' '!' /usr/bin/R ']'
++++ which R
+++ dirname /usr/bin/R
++ R_SCRIPT_PATH=/usr/bin
++ echo 'Using R_SCRIPT_PATH = /usr/bin'
Using R_SCRIPT_PATH = /usr/bin
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/create-rd.sh
++ set -o pipefail
++ set -e
++++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/create-rd.sh
+++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
++ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ /usr/bin/Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
Updating SparkR documentation
Loading SparkR
Updating collate directive in  /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/pkg/DESCRIPTION 
Loading required package: methods
Creating a new generic function for 'as.data.frame' in package 'SparkR'
Creating a new generic function for 'colnames' in package 'SparkR'
Creating a new generic function for 'colnames<-' in package 'SparkR'
Creating a new generic function for 'cov' in package 'SparkR'
Creating a new generic function for 'drop' in package 'SparkR'
Creating a new generic function for 'na.omit' in package 'SparkR'
Creating a new generic function for 'filter' in package 'SparkR'
Creating a new generic function for 'intersect' in package 'SparkR'
Creating a new generic function for 'sample' in package 'SparkR'
Creating a new generic function for 'transform' in package 'SparkR'
Creating a new generic function for 'subset' in package 'SparkR'
Creating a new generic function for 'summary' in package 'SparkR'
Creating a new generic function for 'union' in package 'SparkR'
Creating a new generic function for 'lag' in package 'SparkR'
Creating a new generic function for 'rank' in package 'SparkR'
Creating a new generic function for 'sd' in package 'SparkR'
Creating a new generic function for 'var' in package 'SparkR'
Creating a new generic function for 'window' in package 'SparkR'
Creating a new generic function for 'predict' in package 'SparkR'
Creating a new generic function for 'rbind' in package 'SparkR'
Creating a generic function for 'lapply' from package 'base' in package 'SparkR'
Creating a generic function for 'Filter' from package 'base' in package 'SparkR'
Creating a generic function for 'alias' from package 'stats' in package 'SparkR'
Creating a generic function for 'substr' from package 'base' in package 'SparkR'
Creating a generic function for '%in%' from package 'base' in package 'SparkR'
Creating a generic function for 'mean' from package 'base' in package 'SparkR'
Creating a generic function for 'unique' from package 'base' in package 'SparkR'
Creating a generic function for 'nrow' from package 'base' in package 'SparkR'
Creating a generic function for 'ncol' from package 'base' in package 'SparkR'
Creating a generic function for 'head' from package 'utils' in package 'SparkR'
Creating a generic function for 'factorial' from package 'base' in package 'SparkR'
Creating a generic function for 'atan2' from package 'base' in package 'SparkR'
Creating a generic function for 'ifelse' from package 'base' in package 'SparkR'
First time using roxygen2 4.0. Upgrading automatically...
Writing SparkDataFrame.Rd
Writing printSchema.Rd
Writing schema.Rd
Writing explain.Rd
Writing isLocal.Rd
Writing showDF.Rd
Writing show.Rd
Writing dtypes.Rd
Writing columns.Rd
Writing coltypes.Rd
Writing createOrReplaceTempView.Rd
Writing registerTempTable-deprecated.Rd
Writing insertInto.Rd
Writing cache.Rd
Writing persist.Rd
Writing unpersist.Rd
Writing storageLevel.Rd
Writing coalesce.Rd
Writing repartition.Rd
Writing repartitionByRange.Rd
Writing toJSON.Rd
Writing write.json.Rd
Writing write.orc.Rd
Writing write.parquet.Rd
Writing write.text.Rd
Writing distinct.Rd
Writing sample.Rd
Writing nrow.Rd
Writing ncol.Rd
Writing dim.Rd
Writing collect.Rd
Writing limit.Rd
Writing take.Rd
Writing head.Rd
Writing first.Rd
Writing groupBy.Rd
Writing summarize.Rd
Writing dapply.Rd
Writing dapplyCollect.Rd
Writing gapply.Rd
Writing gapplyCollect.Rd
Writing select.Rd
Writing subset.Rd
Writing selectExpr.Rd
Writing withColumn.Rd
Writing mutate.Rd
Writing rename.Rd
Writing arrange.Rd
Writing filter.Rd
Writing dropDuplicates.Rd
Writing join.Rd
Writing crossJoin.Rd
Writing merge.Rd
Writing union.Rd
Writing unionByName.Rd
Writing rbind.Rd
Writing intersect.Rd
Writing intersectAll.Rd
Writing except.Rd
Writing exceptAll.Rd
Writing write.df.Rd
Writing saveAsTable.Rd
Writing describe.Rd
Writing summary.Rd
Writing nafunctions.Rd
Writing as.data.frame.Rd
Writing attach.Rd
Writing with.Rd
Writing str.Rd
Writing drop.Rd
Writing histogram.Rd
Writing write.jdbc.Rd
Writing randomSplit.Rd
Writing getNumPartitions.Rd
Writing isStreaming.Rd
Writing write.stream.Rd
Writing checkpoint.Rd
Writing localCheckpoint.Rd
Writing cube.Rd
Writing rollup.Rd
Writing hint.Rd
Writing alias.Rd
Writing broadcast.Rd
Writing withWatermark.Rd
Writing sparkR.conf.Rd
Writing sparkR.version.Rd
Writing createDataFrame.Rd
Writing read.json.Rd
Writing read.orc.Rd
Writing read.parquet.Rd
Writing read.text.Rd
Writing sql.Rd
Writing tableToDF.Rd
Writing read.df.Rd
Writing read.jdbc.Rd
Writing read.stream.Rd
Writing WindowSpec.Rd
Writing partitionBy.Rd
Writing orderBy.Rd
Writing rowsBetween.Rd
Writing rangeBetween.Rd
Writing over.Rd
Writing createExternalTable-deprecated.Rd
Writing createTable.Rd
Writing cacheTable.Rd
Writing uncacheTable.Rd
Writing clearCache.Rd
Writing dropTempTable-deprecated.Rd
Writing dropTempView.Rd
Writing tables.Rd
Writing tableNames.Rd
Writing currentDatabase.Rd
Writing setCurrentDatabase.Rd
Writing listDatabases.Rd
Writing listTables.Rd
Writing listColumns.Rd
Writing listFunctions.Rd
Writing recoverPartitions.Rd
Writing refreshTable.Rd
Writing refreshByPath.Rd
Writing column.Rd
Writing columnfunctions.Rd
Writing substr.Rd
Writing startsWith.Rd
Writing endsWith.Rd
Writing between.Rd
Writing cast.Rd
Writing match.Rd
Writing otherwise.Rd
Writing eq_null_safe.Rd
Writing not.Rd
Writing spark.addFile.Rd
Writing spark.getSparkFilesRootDirectory.Rd
Writing spark.getSparkFiles.Rd
Writing spark.lapply.Rd
Writing setLogLevel.Rd
Writing setCheckpointDir.Rd
Writing column_aggregate_functions.Rd
Writing column_datetime_functions.Rd
Writing column_datetime_diff_functions.Rd
Writing column_math_functions.Rd
Writing column_string_functions.Rd
Writing column_nonaggregate_functions.Rd
Writing column_misc_functions.Rd
Writing column_collection_functions.Rd
Writing column_window_functions.Rd
Writing avg.Rd
Writing corr.Rd
Writing cov.Rd
Writing count.Rd
Writing last.Rd
Writing sampleBy.Rd
Writing windowPartitionBy.Rd
Writing windowOrderBy.Rd
Writing fitted.Rd
Writing predict.Rd
Writing spark.als.Rd
Writing spark.bisectingKmeans.Rd
Writing spark.gaussianMixture.Rd
Writing spark.gbt.Rd
Writing spark.glm.Rd
Writing spark.isoreg.Rd
Writing spark.kmeans.Rd
Writing spark.kstest.Rd
Writing spark.lda.Rd
Writing spark.logit.Rd
Writing spark.mlp.Rd
Writing spark.naiveBayes.Rd
Writing spark.decisionTree.Rd
Writing spark.randomForest.Rd
Writing spark.survreg.Rd
Writing spark.svmLinear.Rd
Writing spark.fpGrowth.Rd
Writing write.ml.Rd
Writing awaitTermination.Rd
Writing isActive.Rd
Writing lastProgress.Rd
Writing queryName.Rd
Writing status.Rd
Writing stopQuery.Rd
Writing GroupedData.Rd
Writing pivot.Rd
Writing install.spark.Rd
Writing print.jobj.Rd
Writing sparkR.callJMethod.Rd
Writing sparkR.callJStatic.Rd
Writing sparkR.newJObject.Rd
Writing LinearSVCModel-class.Rd
Writing LogisticRegressionModel-class.Rd
Writing MultilayerPerceptronClassificationModel-class.Rd
Writing NaiveBayesModel-class.Rd
Writing BisectingKMeansModel-class.Rd
Writing GaussianMixtureModel-class.Rd
Writing KMeansModel-class.Rd
Writing LDAModel-class.Rd
Writing FPGrowthModel-class.Rd
Writing ALSModel-class.Rd
Writing AFTSurvivalRegressionModel-class.Rd
Writing GeneralizedLinearRegressionModel-class.Rd
Writing IsotonicRegressionModel-class.Rd
Writing glm.Rd
Writing KSTest-class.Rd
Writing GBTRegressionModel-class.Rd
Writing GBTClassificationModel-class.Rd
Writing RandomForestRegressionModel-class.Rd
Writing RandomForestClassificationModel-class.Rd
Writing DecisionTreeRegressionModel-class.Rd
Writing DecisionTreeClassificationModel-class.Rd
Writing read.ml.Rd
Writing structType.Rd
Writing print.structType.Rd
Writing structField.Rd
Writing print.structField.Rd
Writing sparkR.session.stop.Rd
Writing sparkR.init-deprecated.Rd
Writing sparkRSQL.init-deprecated.Rd
Writing sparkRHive.init-deprecated.Rd
Writing sparkR.session.Rd
Writing sparkR.uiWebUrl.Rd
Writing setJobGroup.Rd
Writing clearJobGroup.Rd
Writing cancelJobGroup.Rd
Writing setJobDescription.Rd
Writing setLocalProperty.Rd
Writing getLocalProperty.Rd
Writing crosstab.Rd
Writing freqItems.Rd
Writing approxQuantile.Rd
Writing StreamingQuery.Rd
Writing hashCode.Rd
Warning messages:
1: In check_dep_version(pkg, version, compare) :
  Need roxygen2 >= 5.0.0 but loaded version is 4.1.1
2: In check_dep_version(pkg, version, compare) :
  Need roxygen2 >= 5.0.0 but loaded version is 4.1.1
+ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/pkg/
* installing *source* package 'SparkR' ...
** R
** inst
** preparing package for lazy loading
Creating a new generic function for 'as.data.frame' in package 'SparkR'
Creating a new generic function for 'colnames' in package 'SparkR'
Creating a new generic function for 'colnames<-' in package 'SparkR'
Creating a new generic function for 'cov' in package 'SparkR'
Creating a new generic function for 'drop' in package 'SparkR'
Creating a new generic function for 'na.omit' in package 'SparkR'
Creating a new generic function for 'filter' in package 'SparkR'
Creating a new generic function for 'intersect' in package 'SparkR'
Creating a new generic function for 'sample' in package 'SparkR'
Creating a new generic function for 'transform' in package 'SparkR'
Creating a new generic function for 'subset' in package 'SparkR'
Creating a new generic function for 'summary' in package 'SparkR'
Creating a new generic function for 'union' in package 'SparkR'
Creating a new generic function for 'lag' in package 'SparkR'
Creating a new generic function for 'rank' in package 'SparkR'
Creating a new generic function for 'sd' in package 'SparkR'
Creating a new generic function for 'var' in package 'SparkR'
Creating a new generic function for 'window' in package 'SparkR'
Creating a new generic function for 'predict' in package 'SparkR'
Creating a new generic function for 'rbind' in package 'SparkR'
Creating a generic function for 'lapply' from package 'base' in package 'SparkR'
Creating a generic function for 'Filter' from package 'base' in package 'SparkR'
Creating a generic function for 'alias' from package 'stats' in package 'SparkR'
Creating a generic function for 'substr' from package 'base' in package 'SparkR'
Creating a generic function for '%in%' from package 'base' in package 'SparkR'
Creating a generic function for 'mean' from package 'base' in package 'SparkR'
Creating a generic function for 'unique' from package 'base' in package 'SparkR'
Creating a generic function for 'nrow' from package 'base' in package 'SparkR'
Creating a generic function for 'ncol' from package 'base' in package 'SparkR'
Creating a generic function for 'head' from package 'utils' in package 'SparkR'
Creating a generic function for 'factorial' from package 'base' in package 'SparkR'
Creating a generic function for 'atan2' from package 'base' in package 'SparkR'
Creating a generic function for 'ifelse' from package 'base' in package 'SparkR'
** help
*** installing help indices
  converting help for package 'SparkR'
    finding HTML links ... done
    AFTSurvivalRegressionModel-class        html  
    ALSModel-class                          html  
    BisectingKMeansModel-class              html  
    DecisionTreeClassificationModel-class   html  
    DecisionTreeRegressionModel-class       html  
    FPGrowthModel-class                     html  
    GBTClassificationModel-class            html  
    GBTRegressionModel-class                html  
    GaussianMixtureModel-class              html  
    GeneralizedLinearRegressionModel-class
                                            html  
    GroupedData                             html  
    IsotonicRegressionModel-class           html  
    KMeansModel-class                       html  
    KSTest-class                            html  
    LDAModel-class                          html  
    LinearSVCModel-class                    html  
    LogisticRegressionModel-class           html  
    MultilayerPerceptronClassificationModel-class
                                            html  
    NaiveBayesModel-class                   html  
    RandomForestClassificationModel-class   html  
    RandomForestRegressionModel-class       html  
    SparkDataFrame                          html  
    StreamingQuery                          html  
    WindowSpec                              html  
    alias                                   html  
    approxQuantile                          html  
    arrange                                 html  
    as.data.frame                           html  
    attach                                  html  
    avg                                     html  
    awaitTermination                        html  
    between                                 html  
    broadcast                               html  
    cache                                   html  
    cacheTable                              html  
    cancelJobGroup                          html  
    cast                                    html  
    checkpoint                              html  
    clearCache                              html  
    clearJobGroup                           html  
    coalesce                                html  
    collect                                 html  
    coltypes                                html  
    column                                  html  
    column_aggregate_functions              html  
    column_collection_functions             html  
    column_datetime_diff_functions          html  
    column_datetime_functions               html  
    column_math_functions                   html  
    column_misc_functions                   html  
    column_nonaggregate_functions           html  
    column_string_functions                 html  
    column_window_functions                 html  
    columnfunctions                         html  
    columns                                 html  
    corr                                    html  
    count                                   html  
    cov                                     html  
    createDataFrame                         html  
    createExternalTable-deprecated          html  
    createOrReplaceTempView                 html  
    createTable                             html  
    crossJoin                               html  
    crosstab                                html  
    cube                                    html  
    currentDatabase                         html  
    dapply                                  html  
    dapplyCollect                           html  
    describe                                html  
    dim                                     html  
    distinct                                html  
    drop                                    html  
    dropDuplicates                          html  
    dropTempTable-deprecated                html  
    dropTempView                            html  
    dtypes                                  html  
    endsWith                                html  
    eq_null_safe                            html  
    except                                  html  
    exceptAll                               html  
    explain                                 html  
    filter                                  html  
    first                                   html  
    fitted                                  html  
    freqItems                               html  
    gapply                                  html  
    gapplyCollect                           html  
    getLocalProperty                        html  
    getNumPartitions                        html  
    glm                                     html  
    groupBy                                 html  
    hashCode                                html  
    head                                    html  
    hint                                    html  
    histogram                               html  
    insertInto                              html  
    install.spark                           html  
    intersect                               html  
    intersectAll                            html  
    isActive                                html  
    isLocal                                 html  
    isStreaming                             html  
    join                                    html  
    last                                    html  
    lastProgress                            html  
    limit                                   html  
    listColumns                             html  
    listDatabases                           html  
    listFunctions                           html  
    listTables                              html  
    localCheckpoint                         html  
    match                                   html  
    merge                                   html  
    mutate                                  html  
    nafunctions                             html  
    ncol                                    html  
    not                                     html  
    nrow                                    html  
    orderBy                                 html  
    otherwise                               html  
    over                                    html  
    partitionBy                             html  
    persist                                 html  
    pivot                                   html  
    predict                                 html  
    print.jobj                              html  
    print.structField                       html  
    print.structType                        html  
    printSchema                             html  
    queryName                               html  
    randomSplit                             html  
    rangeBetween                            html  
    rbind                                   html  
    read.df                                 html  
    read.jdbc                               html  
    read.json                               html  
    read.ml                                 html  
    read.orc                                html  
    read.parquet                            html  
    read.stream                             html  
    read.text                               html  
    recoverPartitions                       html  
    refreshByPath                           html  
    refreshTable                            html  
    registerTempTable-deprecated            html  
    rename                                  html  
    repartition                             html  
    repartitionByRange                      html  
    rollup                                  html  
    rowsBetween                             html  
    sample                                  html  
    sampleBy                                html  
    saveAsTable                             html  
    schema                                  html  
    select                                  html  
    selectExpr                              html  
    setCheckpointDir                        html  
    setCurrentDatabase                      html  
    setJobDescription                       html  
    setJobGroup                             html  
    setLocalProperty                        html  
    setLogLevel                             html  
    show                                    html  
    showDF                                  html  
    spark.addFile                           html  
    spark.als                               html  
    spark.bisectingKmeans                   html  
    spark.decisionTree                      html  
    spark.fpGrowth                          html  
    spark.gaussianMixture                   html  
    spark.gbt                               html  
    spark.getSparkFiles                     html  
    spark.getSparkFilesRootDirectory        html  
    spark.glm                               html  
    spark.isoreg                            html  
    spark.kmeans                            html  
    spark.kstest                            html  
    spark.lapply                            html  
    spark.lda                               html  
    spark.logit                             html  
    spark.mlp                               html  
    spark.naiveBayes                        html  
    spark.randomForest                      html  
    spark.survreg                           html  
    spark.svmLinear                         html  
    sparkR.callJMethod                      html  
    sparkR.callJStatic                      html  
    sparkR.conf                             html  
    sparkR.init-deprecated                  html  
    sparkR.newJObject                       html  
    sparkR.session                          html  
    sparkR.session.stop                     html  
    sparkR.uiWebUrl                         html  
    sparkR.version                          html  
    sparkRHive.init-deprecated              html  
    sparkRSQL.init-deprecated               html  
    sql                                     html  
    startsWith                              html  
    status                                  html  
    stopQuery                               html  
    storageLevel                            html  
    str                                     html  
    structField                             html  
    structType                              html  
    subset                                  html  
    substr                                  html  
    summarize                               html  
    summary                                 html  
    tableNames                              html  
    tableToDF                               html  
    tables                                  html  
    take                                    html  
    toJSON                                  html  
    uncacheTable                            html  
    union                                   html  
    unionByName                             html  
    unpersist                               html  
    windowOrderBy                           html  
    windowPartitionBy                       html  
    with                                    html  
    withColumn                              html  
    withWatermark                           html  
    write.df                                html  
    write.jdbc                              html  
    write.json                              html  
    write.ml                                html  
    write.orc                               html  
    write.parquet                           html  
    write.stream                            html  
    write.text                              html  
** building package indices
** installing vignettes
** testing if installed package can be loaded
* DONE (SparkR)
+ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib
+ jar cfM /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib/sparkr.zip SparkR
+ popd
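The `jar cfM ... sparkr.zip SparkR` step packs the freshly installed `SparkR/` library directory into a plain zip (no manifest). The equivalent effect can be sketched with Python's `zipfile` (illustrative only; paths are hypothetical):

```python
import os
import zipfile

def zip_dir(src_dir, zip_path):
    """Recursively add src_dir to zip_path, keeping the top-level directory
    name in archive entries (mirroring `jar cfM out.zip SparkR`)."""
    base = os.path.dirname(os.path.abspath(src_dir))
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # arcname is relative to the parent, so entries start "SparkR/..."
                zf.write(full, os.path.relpath(full, base))
```

The resulting `sparkr.zip` is what gets shipped to executors so worker-side R processes can load the package.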
[info] Using build tool sbt with Hadoop profile hadoop2.6 under environment amplab_jenkins
[info] Found the following changed modules: root
[info] Setup the following environment variables for tests: 

========================================================================
Running Apache RAT checks
========================================================================
Attempting to fetch rat
RAT checks passed.

========================================================================
Running Scala style checks
========================================================================
Scalastyle checks passed.

========================================================================
Running Python style checks
========================================================================
pycodestyle checks passed.
0
flake8 checks passed.
rm -rf _build/*
pydoc checks passed.

========================================================================
Running R style checks
========================================================================
Loading required package: methods

Attaching package: 'SparkR'

The following objects are masked from 'package:stats':

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from 'package:base':

    as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
    sample, subset, summary, transform, union


Attaching package: 'testthat'

The following objects are masked from 'package:SparkR':

    describe, not

Error: StartTag: invalid element name [68]
Execution halted
lintr checks passed.
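Note the oddity above: the R process dies with `Error: StartTag: invalid element name [68]` and `Execution halted`, yet the step still reports `lintr checks passed.`. One plausible (hypothetical) cause is a wrapper that scans captured output for lint findings instead of checking the child's exit status, sketched below as an anti-pattern:

```python
import subprocess
import sys

def lintr_passed(cmd):
    """Naive check: report success whenever no lint findings appear in
    stdout -- the exit code is never consulted, so a linter that crashes
    before printing any findings still 'passes'. (Illustrative anti-pattern,
    not Spark's actual dev/lint-r logic.)"""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return "lint" not in proc.stdout  # bug: ignores proc.returncode

# A child that halts with an error but emits no findings on stdout:
ok = lintr_passed([sys.executable, "-c", "import sys; sys.exit('Execution halted')"])
```

The fix would be to also require `proc.returncode == 0` before declaring the check passed.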

========================================================================
Running build tests
========================================================================
exec: curl -s -L https://downloads.lightbend.com/zinc/0.3.15/zinc-0.3.15.tgz
exec: curl -s -L https://downloads.lightbend.com/scala/2.11.12/scala-2.11.12.tgz
exec: curl -s -L https://www.apache.org/dyn/closer.lua?action=download&filename=/maven/maven-3/3.5.4/binaries/apache-maven-3.5.4-bin.tar.gz
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Performing Maven install for hadoop-2.7
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Performing Maven validate for hadoop-2.7
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Generating dependency manifest for hadoop-2.7
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Performing Maven install for hadoop-3.1
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Performing Maven validate for hadoop-3.1
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Generating dependency manifest for hadoop-3.1
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn
Using `mvn` from path: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/build/apache-maven-3.5.4/bin/mvn

========================================================================
Building Spark
========================================================================
[error] Could not find hadoop2.6 in the list. Valid options are ['hadoop2.7']
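This is the fatal step: the job's injected `AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.6` no longer matches any profile the build script supports (only `hadoop2.7` remains). That kind of fail-fast lookup can be sketched as follows (hypothetical names, not Spark's actual dev/run-tests code):

```python
SUPPORTED_PROFILES = ["hadoop2.7"]  # hadoop2.6 has evidently been dropped

def select_profile(requested):
    """Validate the CI-requested Hadoop profile, raising with a message in
    the same style the log shows when the profile is unknown."""
    if requested not in SUPPORTED_PROFILES:
        raise ValueError(
            "Could not find %s in the list. Valid options are %s"
            % (requested, SUPPORTED_PROFILES))
    return requested
```

The remedy is on the job-configuration side: update the injected `AMPLAB_JENKINS_BUILD_PROFILE` to a supported value rather than patching the validation.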
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Finished: FAILURE