
[...log truncated: skipping 2,065 KB of earlier console output...]
- subtract mutable pairs
- sort with Java non serializable class - Kryo
- sort with Java non serializable class - Java
- shuffle with different compression settings (SPARK-3426)
- [SPARK-4085] rerun map stage if reduce stage cannot find its local shuffle file
- metrics for shuffle without aggregation
- metrics for shuffle with aggregation
- multiple simultaneous attempts for one task (SPARK-8029)
CountEvaluatorSuite:
- test count 0
- test count >= 1
KryoSerializerSuite:
- SPARK-7392 configuration limits
- basic types
- pairs
- Scala data structures
- Bug: SPARK-10251
- ranges
- asJavaIterable
- custom registrator
- kryo with collect
- kryo with parallelize
- kryo with parallelize for specialized tuples
- kryo with parallelize for primitive arrays
- kryo with collect for specialized tuples
- kryo with SerializableHyperLogLog
- kryo with reduce
- kryo with fold
- kryo with nonexistent custom registrator should fail
- default class loader can be set by a different thread
- registration of HighlyCompressedMapStatus
- serialization buffer overflow reporting
- SPARK-12222: deserialize RoaringBitmap throw Buffer underflow exception
- KryoOutputObjectOutputBridge.writeObject and KryoInputObjectInputBridge.readObject
- getAutoReset
- instance reuse with autoReset = true, referenceTracking = true
- instance reuse with autoReset = false, referenceTracking = true
- instance reuse with autoReset = true, referenceTracking = false
- instance reuse with autoReset = false, referenceTracking = false
BlacklistTrackerSuite:
- blacklist still respects legacy configs
- check blacklist configuration invariants
FailureSuite:
- failure in a single-stage job
- failure in a two-stage job
- failure in a map stage
- failure because task results are not serializable
- failure because task closure is not serializable
- managed memory leak error should not mask other failures (SPARK-9266
- last failure cause is sent back to driver
- failure cause stacktrace is sent back to driver if exception is not serializable
- failure cause stacktrace is sent back to driver if exception is not deserializable
- failure in tasks in a submitMapStage
- failure because cached RDD partitions are missing from DiskStore (SPARK-15736)
- SPARK-16304: Link error should not crash executor
PartitionwiseSampledRDDSuite:
- seed distribution
- concurrency
JdbcRDDSuite:
- basic functionality
- large id overflow
FileSuite:
- text files
- text files (compressed)
- SequenceFiles
- SequenceFile (compressed)
- SequenceFile with writable key
- SequenceFile with writable value
- SequenceFile with writable key and value
- implicit conversions in reading SequenceFiles
- object files of ints
- object files of complex types
- object files of classes from a JAR
- write SequenceFile using new Hadoop API
- read SequenceFile using new Hadoop API
- binary file input as byte array
- portabledatastream caching tests
- portabledatastream persist disk storage
- portabledatastream flatmap tests
- fixed record length binary file as byte array
- negative binary record length should raise an exception
- file caching
- prevent user from overwriting the empty directory (old Hadoop API)
- prevent user from overwriting the non-empty directory (old Hadoop API)
- allow user to disable the output directory existence checking (old Hadoop API
- prevent user from overwriting the empty directory (new Hadoop API)
- prevent user from overwriting the non-empty directory (new Hadoop API)
- allow user to disable the output directory existence checking (new Hadoop API
- save Hadoop Dataset through old Hadoop API
- save Hadoop Dataset through new Hadoop API
- Get input files via old Hadoop API
- Get input files via new Hadoop API
- spark.files.ignoreCorruptFiles should work both HadoopRDD and NewHadoopRDD
SparkContextSuite:
- Only one SparkContext may be active at a time
- Can still construct a new SparkContext after failing to construct a previous one
- Check for multiple SparkContexts can be disabled via undocumented debug option
- Test getOrCreate
- BytesWritable implicit conversion is correct
- basic case for addFile and listFiles
- add and list jar files
- SPARK-17650: malformed url's throw exceptions before bricking Executors
- addFile recursive works
- addFile recursive can't add directories by default
- cannot call addFile with different paths that have the same filename
- addJar can be called twice with same file in local-mode (SPARK-16787)
- addFile can be called twice with same file in local-mode (SPARK-16787)
- addJar can be called twice with same file in non-local-mode (SPARK-16787)
- addFile can be called twice with same file in non-local-mode (SPARK-16787)
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project ML Local Library 2.1.4-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-mllib-local_2.11 ---
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-mllib-local_2.11 ---
[INFO] Add Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/mllib-local/src/main/scala
[INFO] Add Test Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/mllib-local/src/test/scala
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-mllib-local_2.11 ---
[INFO] Dependencies classpath:
/home/jenkins/.m2/repository/com/chuusai/shapeless_2.11/2.0.0/shapeless_2.11-2.0.0.jar:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/tags/target/scala-2.11/classes:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.11.8/scala-library-2.11.8.jar:/home/jenkins/.m2/repository/org/spire-math/spire-macros_2.11/0.7.4/spire-macros_2.11-0.7.4.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/jenkins/.m2/repository/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1.jar:/home/jenkins/.m2/repository/org/scalanlp/breeze_2.11/0.12/breeze_2.11-0.12.jar:/home/jenkins/.m2/repository/net/sf/opencsv/opencsv/2.3/opencsv-2.3.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-reflect/2.11.8/scala-reflect-2.11.8.jar:/home/jenkins/.m2/repository/org/scalanlp/breeze-macros_2.11/0.12/breeze-macros_2.11-0.12.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.16/slf4j-api-1.7.16.jar:/home/jenkins/.m2/repository/com/github/rwl/jtransforms/2.4.0/jtransforms-2.4.0.jar:/home/jenkins/.m2/repository/org/spire-math/spire_2.11/0.7.4/spire_2.11-0.7.4.jar:/home/jenkins/.m2/repository/com/github/fommil/netlib/core/1.1.2/core-1.1.2.jar
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-mllib-local_2.11 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-mllib-local_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/mllib-local/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-mllib-local_2.11 ---
[INFO] Using zinc server for incremental compilation
[info] Compiling 5 Scala sources to /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/mllib-local/target/scala-2.11/classes...
[info] Compile success at Aug 27, 2018 4:59:25 PM [2.738s]
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ spark-mllib-local_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-mllib-local_2.11 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spark-mllib-local_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/mllib-local/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-mllib-local_2.11 ---
[INFO] Using zinc server for incremental compilation
[info] Compile success at Aug 27, 2018 4:59:25 PM [0.091s]
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:testCompile (default-testCompile) @ spark-mllib-local_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (generate-test-classpath) @ spark-mllib-local_2.11 ---
[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ spark-mllib-local_2.11 ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (test) @ spark-mllib-local_2.11 ---
[INFO] Skipping execution of surefire because it has already been run for this configuration
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-mllib-local_2.11 ---
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Discovery starting.
Discovery completed in 284 milliseconds.
Run starting. Expected test count is: 79
BLASSuite:
- copy
Aug 27, 2018 4:59:27 PM com.github.fommil.netlib.BLAS <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
Aug 27, 2018 4:59:27 PM com.github.fommil.netlib.BLAS <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
- scal
- axpy
- dot
- spr
- syr
- gemm
- gemv
- spmv
UtilsSuite:
- EPSILON
TestingUtilsSuite:
- Comparing doubles using relative error.
- Comparing doubles using absolute error.
- Comparing vectors using relative error.
- Comparing vectors using absolute error.
- Comparing Matrices using absolute error.
- Comparing Matrices using relative error.
BreezeMatrixConversionSuite:
- dense matrix to breeze
- dense breeze matrix to matrix
- sparse matrix to breeze
- sparse breeze matrix to sparse matrix
BreezeVectorConversionSuite:
- dense to breeze
Aug 27, 2018 4:59:28 PM com.github.fommil.netlib.LAPACK <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
Aug 27, 2018 4:59:28 PM com.github.fommil.netlib.LAPACK <clinit>
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
- sparse to breeze
- dense breeze to vector
- sparse breeze to vector
- sparse breeze with partially-used arrays to vector
MultivariateGaussianSuite:
- univariate
- multivariate
- multivariate degenerate
- SPARK-11302
MatricesSuite:
- dense matrix construction
- dense matrix construction with wrong dimension
- sparse matrix construction
- sparse matrix construction with wrong number of elements
- index in matrices incorrect input
- equals
- matrix copies are deep copies
- matrix indexing and updating
- toSparse, toDense
- map, update
- transpose
- foreachActive
- horzcat, vertcat, eye, speye
- zeros
- ones
- eye
- rand
- randn
- diag
- sprand
- sprandn
- toString
- numNonzeros and numActives
- fromBreeze with sparse matrix
- row/col iterator
VectorsSuite:
- dense vector construction with varargs
- dense vector construction from a double array
- sparse vector construction
- sparse vector construction with unordered elements
- sparse vector construction with mismatched indices/values array
- sparse vector construction with too many indices vs size
- sparse vector construction with negative indices
- dense to array
- dense argmax
- sparse to array
- sparse argmax
- vector equals
- vectors equals with explicit 0
- indexing dense vectors
- indexing sparse vectors
- zeros
- Vector.copy
- fromBreeze
- sqdist
- foreachActive
- vector p-norm
- Vector numActive and numNonzeros
- Vector toSparse and toDense
- Vector.compressed
- SparseVector.slice
Run completed in 2 seconds, 239 milliseconds.
Total number of tests run: 79
Suites: completed 9, aborted 0
Tests: succeeded 79, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project GraphX
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Streaming
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Catalyst
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project SQL
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project ML Library
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Tools 2.1.4-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-tools_2.11 ---
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-tools_2.11 ---
[INFO] Add Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/tools/src/main/scala
[INFO] Add Test Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/tools/src/test/scala
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-tools_2.11 ---
[INFO] Dependencies classpath:
/home/jenkins/.m2/repository/org/clapper/classutil_2.11/1.0.6/classutil_2.11-1.0.6.jar:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/.m2/repository/org/clapper/grizzled-slf4j_2.11/1.0.2/grizzled-slf4j_2.11-1.0.2.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.11.8/scala-library-2.11.8.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-async_2.11/0.9.1/scala-async_2.11-0.9.1.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-reflect/2.11.8/scala-reflect-2.11.8.jar:/home/jenkins/.m2/repository/org/clapper/grizzled-scala_2.11/1.4.0/grizzled-scala_2.11-1.4.0.jar:/home/jenkins/.m2/repository/org/ow2/asm/asm-commons/5.0.2/asm-commons-5.0.2.jar:/home/jenkins/.m2/repository/org/ow2/asm/asm-tree/5.0.2/asm-tree-5.0.2.jar:/home/jenkins/.m2/repository/org/ow2/asm/asm-util/5.0.2/asm-util-5.0.2.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.16/slf4j-api-1.7.16.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-compiler/2.11.8/scala-compiler-2.11.8.jar:/home/jenkins/.m2/repository/jline/jline/2.12.1/jline-2.12.1.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.4/scala-xml_2.11-1.0.4.jar:/home/jenkins/.m2/repository/org/ow2/asm/asm/5.0.2/asm-5.0.2.jar
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-tools_2.11 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-tools_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/tools/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-tools_2.11 ---
[INFO] Using zinc server for incremental compilation
[info] Compile success at Aug 27, 2018 4:59:29 PM [0.040s]
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ spark-tools_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-tools_2.11 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spark-tools_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/tools/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-tools_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:testCompile (default-testCompile) @ spark-tools_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (generate-test-classpath) @ spark-tools_2.11 ---
[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ spark-tools_2.11 ---
[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (test) @ spark-tools_2.11 ---
[INFO] Skipping execution of surefire because it has already been run for this configuration
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-tools_2.11 ---
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Discovery starting.
Discovery completed in 77 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 144 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Hive
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project REPL
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project YARN Shuffle Service 2.1.4-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-network-yarn_2.11 ---
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-network-yarn_2.11 ---
[INFO] Add Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/network-yarn/src/main/scala
[INFO] Add Test Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/network-yarn/src/test/scala
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-network-yarn_2.11 ---
[INFO] Dependencies classpath:
/home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/network-common/target/scala-2.11/classes:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.2/metrics-core-3.1.2.jar:/home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/tags/target/scala-2.11/classes:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.11.8/scala-library-2.11.8.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-lang3/3.5/commons-lang3-3.5.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.5/jackson-databind-2.6.5.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.6.5/jackson-core-2.6.5.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.6.5/jackson-annotations-2.6.5.jar:/home/jenkins/.m2/repository/io/netty/netty-all/4.0.43.Final/netty-all-4.0.43.Final.jar:/home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/network-shuffle/target/scala-2.11/classes:/home/jenkins/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-network-yarn_2.11 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-network-yarn_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/network-yarn/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-network-yarn_2.11 ---
[INFO] Using zinc server for incremental compilation
[info] Compiling 2 Java sources to /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/network-yarn/target/scala-2.11/classes...
[warn] warning: [options] bootstrap class path not set in conjunction with -source 1.7
[warn] 1 warning
[info] Compile success at Aug 27, 2018 4:59:33 PM [1.456s]
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ spark-network-yarn_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-network-yarn_2.11 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spark-network-yarn_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/network-yarn/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-network-yarn_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:testCompile (default-testCompile) @ spark-network-yarn_2.11 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (generate-test-classpath) @ spark-network-yarn_2.11 ---
[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ spark-network-yarn_2.11 ---
[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (test) @ spark-network-yarn_2.11 ---
[INFO] Skipping execution of surefire because it has already been run for this configuration
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-network-yarn_2.11 ---
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Discovery starting.
Discovery completed in 113 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 198 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project YARN
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Mesos
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Hive Thrift Server
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Assembly
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project External Flume Sink 2.1.4-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-streaming-flume-sink_2.11 ---
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-streaming-flume-sink_2.11 ---
[INFO] Add Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/external/flume-sink/src/main/scala
[INFO] Add Test Source directory: /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/external/flume-sink/src/test/scala
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-streaming-flume-sink_2.11 ---
[INFO] Dependencies classpath:
/home/jenkins/.m2/repository/joda-time/joda-time/2.9.3/joda-time-2.9.3.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.11.8/scala-library-2.11.8.jar:/home/jenkins/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/jenkins/.m2/repository/org/apache/mina/mina-core/2.0.4/mina-core-2.0.4.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/home/jenkins/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.16/slf4j-api-1.7.16.jar:/home/jenkins/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.1.2.6/snappy-java-1.1.2.6.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/jenkins/.m2/repository/com/google/code/gson/gson/2.2.2/gson-2.2.2.jar:/home/jenkins/.m2/repository/org/apache/flume/flume-ng-sdk/1.6.0/flume-ng-sdk-1.6.0.jar:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/common/tags/target/scala-2.11/classes:/home/jenkins/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.7.7/avro-1.7.7.jar:/home/jenkins/.m2/repository/org/apache/flume/flume-ng-core/1.6.0/flume-ng-core-1.6.0.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/home/jenkins/.m2/repository/org/apache/flume/flume-ng-configuration/1.6.0/flume-ng-configuration-1.6.0.jar
[INFO] 
[INFO] --- avro-maven-plugin:1.7.7:idl-protocol (default) @ spark-streaming-flume-sink_2.11 ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-streaming-flume-sink_2.11 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-streaming-flume-sink_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/external/flume-sink/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.11 ---
[INFO] Using zinc server for incremental compilation
[info] Compiling 3 Java sources to /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/external/flume-sink/target/scala-2.11/classes...
[warn] warning: [options] bootstrap class path not set in conjunction with -source 1.7
[warn] /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/external/flume-sink/target/scala-2.11/src_managed/main/compiled_avro/org/apache/spark/streaming/flume/sink/EventBatch.java:243: warning: [unchecked] unchecked cast
[warn]         record.events = fieldSetFlags()[2] ? this.events : (java.util.List<org.apache.spark.streaming.flume.sink.SparkSinkEvent>) defaultValue(fields()[2]);
[warn]                                                                                                                                               ^
[warn]   required: List<SparkSinkEvent>
[warn]   found:    Object
[warn] /home/jenkins/workspace/spark-branch-2.1-test-maven-hadoop-2.6/external/flume-sink/target/scala-2.11/src_managed/main/compiled_avro/org/apache/spark/streaming/flume/sink/SparkSinkEvent.java:188: warning: [unchecked] unchecked cast
[warn]         record.headers = fieldSetFlags()[0] ? this.headers : (java.util.Map<java.lang.CharSequence,java.lang.CharSequence>) defaultValue(fields()[0]);
[warn]                                                                                                                                         ^
[warn]   required: Map<CharSequence,CharSequence>
[warn]   found:    Object
[warn] 3 warnings
[info] Compile success at Aug 27, 2018 4:59:38 PM [1.042s]
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ spark-streaming-flume-sink_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-streaming-flume-sink_2.11 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spark-streaming-flume-sink_2.11 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-streaming-flume-sink_2.11 ---
[INFO] Using zinc server for incremental compilation
[info] Compile success at Aug 27, 2018 4:59:38 PM [0.051s]
[INFO] 
[INFO] --- maven-compiler-plugin:3.5.1:testCompile (default-testCompile) @ spark-streaming-flume-sink_2.11 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (generate-test-classpath) @ spark-streaming-flume-sink_2.11 ---
[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ spark-streaming-flume-sink_2.11 ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-surefire-plugin:2.19.1:test (test) @ spark-streaming-flume-sink_2.11 ---
[INFO] Skipping execution of surefire because it has already been run for this configuration
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-streaming-flume-sink_2.11 ---
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Discovery starting.
Discovery completed in 130 milliseconds.
Run starting. Expected test count is: 5
SparkSinkSuite:
- Success with ack
- Failure with nack
- Failure with timeout
- Multiple consumers
- Multiple consumers with some failures
Run completed in 6 seconds, 108 milliseconds.
Total number of tests run: 5
Suites: completed 2, aborted 0
Tests: succeeded 5, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project External Flume
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project External Flume Assembly
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Integration for Kafka 0.8
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Kafka 0.10 Source for Structured Streaming
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Kinesis Integration
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Examples
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project External Kafka Assembly
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Integration for Kafka 0.10
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Integration for Kafka 0.10 Assembly
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Kinesis Assembly
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Spark Project Java 8 Tests
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [  9.548 s]
[INFO] Spark Project Tags ................................. SUCCESS [  6.145 s]
[INFO] Spark Project Sketch ............................... SUCCESS [ 21.114 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 56.463 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 27.768 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 11.712 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 15.576 s]
[INFO] Spark Project Core ................................. FAILURE [16:23 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [  6.624 s]
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SUCCESS [  1.101 s]
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [  5.043 s]
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Mesos ................................ SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 10.389 s]
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED
[INFO] Spark Kinesis Integration .......................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] Spark Project Kinesis Assembly ..................... SKIPPED
[INFO] Spark Project Java 8 Tests ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:15 min
[INFO] Finished at: 2018-08-27T16:59:45-07:00
[INFO] Final Memory: 67M/1998M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project spark-core_2.11: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-core_2.11
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Finished: FAILURE
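
The [ERROR] block near the end is Maven's standard guidance for a failed reactor build: the spark-core_2.11 ScalaTest run reported failures, so every downstream module was skipped. A minimal sketch of how one might act on that guidance follows; the goals, module path, and suite name below are illustrative assumptions and are not taken from this log (the truncated output does not show which suite actually failed or which goals were originally invoked).

  # Resume the reactor from the failed module with the same goals as the original run
  # ("test" is assumed here; substitute the goals actually used by the job)
  mvn test -rf :spark-core_2.11

  # Or re-run a single suspect suite in the core module only, using the ScalaTest
  # Maven plugin's suite filter (org.apache.spark.ShuffleSuite is a hypothetical example)
  mvn test -pl core -am -DwildcardSuites=org.apache.spark.ShuffleSuite -Dtest=none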